The ‘Scientific Consensus’ Is Often Wrong

A statue of Nicolaus Copernicus in Toruń, Poland. (Mateusz_foto/Pixabay.com)
Bob Zeidman
12/1/2020
Commentary
Dr. Scott Atlas is a senior fellow at Stanford University’s Hoover Institution who studies “the impact of government and the private sector on access, quality, and pricing in health care, global trends in health care innovation, and key economic issues related to the future of technology-based medical advances.” He’s written two books on these subjects. From 1998 to 2012, he served as professor and chief of neuroradiology at Stanford University Medical Center. In August, he was appointed to President Donald Trump’s Coronavirus Task Force.
Based on research and statistics, Atlas believes that the lockdowns have done more harm than good, and he recommended on Twitter that people use their voices and their votes to be heard. He may not have stated this clearly, as he explained afterwards, but Stanford’s Faculty Senate immediately condemned Atlas for “promot[ing] a view of COVID-19 that contradicts medical science.” There have also been calls for him to be fired from Stanford and for his medical license to be revoked.
Is Atlas always correct? Certainly not. But is he a scientist? Absolutely. So why are people claiming he isn’t, or that he doesn’t understand “The Science”? As I’ve written previously, the practice of science has become biased. Science has become a cult, with followers of scientists rather than of science. This cult uses its influence to argue for climate change, gender fluidity, and the killing of unborn children while also arguing for masks and lockdowns.
The biggest misconception among the public, and among many scientists, is that there exists something omniscient called “The Science.” Science is the study of nature through theory, experimentation, and the organization of results. It is ongoing and, until mankind understands all of nature, the process will continue to reveal new information and to refine, and sometimes refute, previously accepted explanations. There is no such thing as The Science, and to prove it, here are 10 times that a majority of scientists reached consensus about how the universe worked and were significantly wrong.

Phlogiston

In the early years of chemical theory, scientists believed that every combustible substance contained something called phlogiston. While this may sound silly to us now, the theory was proposed by the respected chemist Johann Joachim Becher, who believed that all matter contained three kinds of “earth,” which he called “the vitrifiable,” “the mercurial,” and “the combustible.” The name phlogiston was coined by another respected scientist, Georg Ernst Stahl.

These were not wizards or magicians but respected scientists of their day. Becher lived in England around the same time that Isaac Newton was revolutionizing mathematics and physics with the publication of his “Principia.” Stahl was the first royal physician and court counselor to Frederick William I of Prussia, head of Berlin’s Medical Board, and founder of the Berlin Medical-Surgical College.

Phlogiston and Becher’s theories of matter were accepted by chemists in Europe for about 100 years until the “Chemical Revolution” initiated by Antoine-Laurent Lavoisier at the end of the 18th century, which laid the groundwork for the modern molecular and atomic theories of matter.

Artificial Sweeteners and Cancer

Over the last half-century, many products have been accepted by a consensus of medical scientists as causing cancer, some even banned by government regulation, only to be shown later to be acceptable and not dangerous.
In the 1970s, the popular sugar substitute saccharin was found in several scientific studies to cause bladder cancer in rats, and in 1981 it was listed in the National Toxicology Program’s Report on Carcinogens as an anticipated human carcinogen. It was only delisted in 2000, along with eight other “known carcinogens,” when further studies showed it had no such effect on humans. Cyclamate, implicated as a carcinogen in similar rat studies, fared even worse: it was banned outright by the government in 1969. Later studies concluded that cyclamate is not in fact a carcinogen, but it is still banned by the Food and Drug Administration (FDA).
Two other popular sugar substitutes, aspartame (marketed as NutraSweet® and Equal®) and sucralose (marketed as Splenda®), have each at some point been identified in scientific studies as possible human carcinogens, only to be cleared later by other studies.

Ulcers and Bacteria

From the time scientists began researching them, stomach ulcers were “understood” to be the result of stress. In the late 19th century, doctors treated ulcers by cutting out the bottom of the stomach and reconnecting the intestine. With the advent of X-ray machines, doctors could see the painful inflammations and found a correlation with stressful lifestyles. They could induce ulcers in rats by putting them in straitjackets and submerging them in ice water. They could show that excess acid was produced in the stomach and that antacids reduced the inflammation. Without a single appropriate, double-blind test, scientists mistook correlation for causation. They “knew” that stress caused ulcers, so there was no reason for tests.
In 1981, internist Barry Marshall began working with pathologist Robin Warren at the Royal Perth Hospital. Two years earlier, Warren had discovered that a corkscrew-shaped bacterium called Helicobacter pylori could survive in the harsh, acid-filled bottom of the stomach.

Gastroenterologists were uniformly dismissive. They already “knew” that ulcers were caused by stress and that the cure was psychotherapy. In an interview with Discover magazine, Marshall explained it this way:

“I presented that work at the annual meeting of the Royal Australasian College of Physicians in Perth. That was my first experience of people being totally skeptical. To gastroenterologists, the concept of a germ causing ulcers was like saying that the Earth is flat. After that I realized my paper was going to have difficulty being accepted. You think, ‘It’s science; it’s got to be accepted.’ But it’s not an absolute given. The idea was too weird.”

While the established medical community mostly ignored their findings, publications like Reader’s Digest and the National Enquirer began writing stories about them. Eventually the National Institutes of Health and the FDA fast-tracked larger studies and lent their credibility to spreading the word. It took a decade for the medical community to recognize this bacterial cause of ulcers, for which Marshall and Warren received the 2005 Nobel Prize in Physiology or Medicine.

The Law of Electronics

If you’ve studied electronics, you know about Ohm’s Law. It’s the very first law of electronics that electrical engineers study, and resistance is measured in ohms. It’s named after Georg Simon Ohm, a high school teacher in 19th-century Germany. Through extensive experimentation, he determined that the current flowing through a wire is proportional to the voltage applied across it and to the area of its cross section, and inversely proportional to its length. This fundamental relationship between voltage, current, and resistance forms the basis for the design of every electrical device in existence. The consensus of scientists at the time was that this high school teacher was a loony.
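To put numbers on that relationship, here is a minimal sketch in Python (my own illustration, not anything from Ohm’s paper); the copper resistivity figure and the wire dimensions are assumptions chosen for the example.

```python
# Ohm's observations in two formulas: R = rho * L / A and I = V / R.
RHO_COPPER = 1.68e-8  # resistivity of copper in ohm-meters (approximate)

def wire_resistance(length_m: float, area_m2: float) -> float:
    """Resistance grows with the wire's length and shrinks with its cross section."""
    return RHO_COPPER * length_m / area_m2

def current(voltage_v: float, resistance_ohm: float) -> float:
    """Ohm's Law: current is proportional to voltage for a fixed resistance."""
    return voltage_v / resistance_ohm

r = wire_resistance(length_m=10.0, area_m2=1e-6)  # 10 m of 1 mm^2 copper wire
print(f"R = {r:.3f} ohm; at 1 V, I = {current(1.0, r):.1f} A")  # R = 0.168 ohm; I = 6.0 A
```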
After he published his groundbreaking paper in 1827, he was forced to resign from his high school. Influential scientists such as Johannes Schultz and Georg Friedrich Pohl were certain that Ohm’s approach was wrong. It was only in the last years of his life that Ohm received the recognition he deserved from the scientific community.

DC vs. AC Electrical Systems

We’ve all used batteries. Those are direct current (DC) sources of electricity: the voltage is constant. The electricity that comes to our homes from power plants is alternating current (AC), in which the voltage cycles through different values 60 times every second (in the United States). We take this for granted now, but that wasn’t the case in the late 19th century.
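To make “cycling 60 times a second” concrete, here is a minimal sketch (again my own illustration); the 60 Hz frequency and 120-volt household figure are the usual U.S. values, assumed here for the example.

```python
import math

F_HZ = 60.0                    # U.S. grid frequency
V_RMS = 120.0                  # nominal U.S. household voltage
V_PEAK = V_RMS * math.sqrt(2)  # peak of the sine wave, about 170 V

def ac_voltage(t_seconds: float) -> float:
    """Instantaneous AC voltage: a sine wave completing 60 full cycles per second."""
    return V_PEAK * math.sin(2 * math.pi * F_HZ * t_seconds)

# Sample one full cycle (1/60 of a second) in quarter-cycle steps: unlike a
# battery's constant output, the voltage swings positive and negative.
for i in range(5):
    t = i / (4 * F_HZ)
    print(f"t = {t * 1000:5.2f} ms  v = {ac_voltage(t):7.1f} V")
```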
Thomas Edison, the “Wizard of Menlo Park,” was arguably the greatest inventor of his era. He had worked with DC electricity all of his long, successful career. But understanding AC electricity required advanced mathematics that was beyond Edison’s grasp; he was a brilliant experimenter, not a mathematician. In 1884, Edison hired Nikola Tesla, a gifted engineer and physicist, a Serb born in what is now Croatia, who had recently immigrated to the United States. Edison reportedly took advantage of Tesla’s genius with false promises of financial compensation.
Tesla left Edison Electric and began researching and patenting AC motors. He eventually went to work for Edison’s competitor George Westinghouse, who had been inventing and installing AC electrical systems. Edison could have designed his own AC systems, but instead spent years performing stunts to show the dangers of AC electricity, including electrocuting cows, horses, elephants, and even a convicted murderer. Despite these gruesome demonstrations and Edison’s propaganda, the more efficient AC technology eventually won out.

Astronomy and John Harrison’s Clock

If you saw the inspiring and outstanding film “Longitude,” you know about John Harrison. In the 18th century, European countries were competing for dominance of the oceans. Precisely measuring longitude was critical to navigation, to bringing ships safely back to port rather than wrecking them on unexpected shores. In 1714, the British Parliament passed the Longitude Act, which offered a prize of 20,000 pounds (a fortune worth millions in today’s dollars) for a method that could determine longitude to within half a degree.
Astronomers were convinced that the solution was to map out the positions of the stars and the moon. But Harrison, a carpenter, decided that what was needed was a clock that kept accurate time at sea. Harrison was mocked by the experts of the Board of Longitude, who denied him the prize even though his clock passed numerous rigorous tests at sea. Clockmakers also ridiculed him; after all, he was a carpenter, not a clockmaker. Only after years of appeals to King George III and Parliament was Harrison recognized and awarded the balance of the prize money in 1773, three years before his death.
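The principle behind Harrison’s clock is simple arithmetic, and a minimal sketch (my own illustration, not from the historical record) shows why an accurate timepiece solves the problem: the Earth turns 360 degrees in 24 hours, so every hour of difference between local solar time and the time kept at a reference port is 15 degrees of longitude.

```python
DEGREES_PER_HOUR = 360.0 / 24.0  # the Earth turns 15 degrees of longitude per hour

def longitude_from_clocks(local_solar_hour: float, reference_hour: float) -> float:
    """Longitude in degrees; positive means east of the reference meridian."""
    return (local_solar_hour - reference_hour) * DEGREES_PER_HOUR

# The sun is at its highest (local noon) while the chronometer carried from
# home reads 2:00 p.m.: the ship is two hours behind, i.e., 30 degrees west.
print(longitude_from_clocks(12.0, 14.0))  # -30.0

# The prize's half-degree requirement works out to a two-minute clock error:
print(0.5 / DEGREES_PER_HOUR * 60)  # 2.0 minutes of time per half degree
```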

The Ether

Since the time of the ancient Greeks, scientists could not comprehend the nothingness of a vacuum or of empty space. How could light travel from one place to another, or gravity affect objects at a distance, if there was nothing in between? So the concept of the ether was advanced: an invisible substance that existed everywhere, even where nothing else was.
The notion of ether was accepted for millennia and was only seriously questioned when the scientists Albert Michelson and Edward Morley performed their famous experiment in 1887. They split a beam of light, sent the halves in two perpendicular directions using mirrors, and compared the beams’ travel times from source to destination. Since the Earth moved through the stationary ether, they expected the light traveling along the ether to move at a different speed than the light traveling across it. To their surprise, the measurements showed both beams moving at the same speed. This single experiment destroyed the millennia-long scientific consensus about ether. It led to Albert Einstein’s Theory of Relativity and much of our modern understanding of light, radiation, space, time, and the universe.
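To see how sensitive their apparatus had to be, here is a back-of-the-envelope sketch (my own illustration); the arm length, wavelength, and speeds are commonly cited figures for the 1887 experiment, assumed here for the calculation.

```python
C = 3.0e8            # speed of light, m/s
V = 3.0e4            # Earth's orbital speed through the supposed ether, m/s
L = 11.0             # effective optical path length of each interferometer arm, m
WAVELENGTH = 5.5e-7  # wavelength of the light used, m (approximate)

# To first order, light traveling along the ether "wind" should lag light
# traveling across it by a path difference of roughly L * (v/c)^2.
path_difference = L * (V / C) ** 2

# Rotating the apparatus 90 degrees swaps the two arms, doubling the effect.
fringe_shift = 2 * path_difference / WAVELENGTH

print(f"expected shift: about {fringe_shift:.2f} of a fringe")  # ~0.40; they measured far less
```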

Albert Einstein and Quantum Mechanics

Albert Einstein is considered one of the most brilliant physicists of all time. He changed the way we understand space and time with his Theory of Relativity, which paved the way for the development of nuclear power and explained much about the origins of the universe and the strange objects within it, like quasars and black holes. And yet Einstein never accepted Quantum Mechanics, the theory that led to the development of the transistors and semiconductors upon which all modern electronics is based.
The formulation of Quantum Mechanics was led by physicist Niels Bohr. In 1927, the Fifth Solvay Conference on Electrons and Photons was held in Brussels. Einstein attended the conference of 29 physicists, 17 of whom had already been or would eventually be awarded a Nobel Prize. But Einstein was a lone dissenter at this conference, which formulated the basic principles of quantum theory: that matter, like energy, behaves not as definite particles but as waves representing probabilities rather than certainties. Einstein famously stated, “God does not play dice.” Bohr allegedly replied, “Einstein, stop telling God what to do.”

Celestial Spheres

Starting at least as early as the 6th century BCE, scientists believed that the planets were carried on spherical layers of material that rotated around one another. While this model of the solar system seems absurd to us in modern times, it was the accepted consensus for more than two millennia.
Scientists throughout these centuries accepted the model, including such giants as Plato, Aristotle, Ptolemy, and Copernicus. Early cosmological models placed the Earth at the center of the spheres. Nicolaus Copernicus placed the sun at the center but maintained the model of the spheres. Even Johannes Kepler, who developed his laws of planetary motion in the early 1600s, began his career trying to explain the planets’ orbits in terms of nested spheres.
It was only in 1687, when Isaac Newton published his monumental “Philosophiæ Naturalis Principia Mathematica,” that a scientific consensus more than two millennia old was overturned, by showing that gravitation, not spheres, kept the planets in orbit. Even this work, perhaps the most influential publication in the history of physics, remained controversial and was not fully accepted by the scientific community for decades.

The Big Bang Theory

Like millions of people, you’ve probably watched the hit TV show “The Big Bang Theory” (one of my all-time favorite shows), and you’ve probably heard of the well-accepted cosmological theory of the same name, which describes the beginning of the universe as an infinitesimal speck of matter that began expanding to become the universe. As the universe expands, it cools down but continues its expansion indefinitely.
At least that’s the current thinking. But when I was studying physics back in the eighties, I was taught the Oscillating Universe Theory. According to this theory, the Big Bang is followed by a Big Crunch, in which the universe’s expansion slows to the point that gravitation from all the matter within it begins to pull the universe back in on itself. This contraction eventually returns the universe to an infinitesimal point, at which moment a new Big Bang occurs and the whole cycle repeats indefinitely.

About 10 years ago at my Stanford reunion, I attended a physics lecture by a renowned cosmologist who gave a layman’s seminar about the universe. She explained the continuous expansion of the universe. Someone in the audience, about my age, raised his hand. “When I was in school,” he said, “we were taught about a cycling universe. Was that wrong?”

“Oh yes,” the professor laughed. “That’s what we used to think but now we know the truth.”

When a scientist tells you she knows the truth, that’s when you need to worry. And I won’t even get into String Theory, which has attracted more physics students and absorbed more government funding than any other area of theoretical physics. This “theory” can’t answer any questions about the universe because, after nearly half a century, physicists haven’t even figured out what questions to ask. Some physicists have begun questioning the usefulness of studying a theory that makes no testable predictions about the world. For now, though, the consensus is that this complex mathematical formulation will prove useful at some indeterminate time in the future.

Science Means Not Relying on Blind Trust

When you hear scientists state, “the truth is …” they are wrong. Scientists come closer and closer to the truth, which will probably never be fully known. When someone says “the consensus among scientists is …” you can be sure it’s hyperbole. True science is not a democracy determined by a vote; it’s driven by argument, debate, experimentation, and perpetual refinement, but never by consensus.
Bob Zeidman is the creator of the field of software forensics and the founder of successful high-tech Silicon Valley firms, including Zeidman Consulting and Software Analysis and Forensic Engineering. He is the author of textbooks on engineering and intellectual property as well as screenplays and novels. His latest novel is the political satire “Good Intentions.”
Views expressed in this article are opinions of the author and do not necessarily reflect the views of The Epoch Times.