A Dubious Contribution to Climate Change Alarmism Literature

A demonstrator holds a placard reading "time is running out" during a protest in Barcelona on Sept. 27, 2019. (Josep Lago/AFP via Getty Images)
Mark Hendrickson
2/10/2023
Commentary
On Jan. 30, an article titled “Critical climate thresholds may be nearer than thought: study finds” was posted on Axios.com. It reported on a study by researchers from Stanford and Colorado State universities that was published in Proceedings of the National Academy of Sciences.

Although the study contributes nothing new to the climate change/global warming issue, it does reveal some of the weaknesses in the alarmists’ position. A tip of the hat to the author of the Axios report for neatly assembling those weaknesses.

The first sentence in the Axios article deftly sets the pattern: “A new study relying on machine learning methods finds the climate thresholds enshrined in the Paris agreement may be coming up faster than previously anticipated.”

Changes may happen faster than previously thought—or maybe not.
Similarly, check out this wishy-washy statement: “... passing 1.5°C or 2°C above preindustrial levels could [emphasis added] dramatically increase the risks to society and ecosystems.”

Here, the writer implicitly admits that there’s no guarantee of dire consequences. (Actually, the likelihood of cataclysm seems quite remote, since Earth has experienced such temperatures twice before in the past 4,000 years—the Roman and Minoan periods—and human civilization flourished both times.)

The weasel language continues with the statement, “the machine learning techniques may [emphasis added] be biased by the computer models they were trained on.”

“May”? If the machines were crunching numbers from specific computer models, how could they not be “biased” by those models? Aren’t the conclusions reached by artificial intelligence predetermined by the data and rules of the game fed into them?

The Axios article displays a pronounced bias of its own: “The world is already suffering the impacts of 1.1°C (1.98°F) to 1.2°C (2.16°F) of warming to date ...”

The word “suffering” is employed to lead the reader to believe that if the effects of warming so far have caused “suffering,” then further increases would likely be even more harmful. The problem with applying that word to the warming of the past century and a half or so is that it’s a warped mischaracterization of historical reality.

In the mid-1800s, the world began to climb out of the Little Ice Age—a bitterly cold period of several centuries’ duration that, in fact, inflicted much suffering and hardship on the human race. In addition to the bone-chilling winters that humans living north of the Tropic of Cancer suffered through (in this context, the verb “suffer” is literally true), the concentration of CO2 in Earth’s atmosphere (~290 parts per million, ppm) was only about 100–150 ppm above what is necessary for plant life (hence, animal and human life) to survive on Earth.

Since the mid-1800s (i.e., the end of the Little Ice Age), temperatures have risen a couple of degrees and the concentration of CO2 in the atmosphere has risen to ~412 ppm. The result has been a notable greening of the planet. Agricultural production has received a significant boost from the CO2 enrichment and the longer growing seasons. The incidence of weather-related deaths has shrunk dramatically. Far more human beings are living at far higher standards of living than ever before. If this is “suffering,” then please, let us have more of it!

The researchers who prepared the study used neural networks “trained on climate model simulations.” Those networks agreed with the models’ predictions about how much and how fast global temperatures will be rising. I don’t see how anybody can regard such predictions as having any validity whatsoever, since the computer models on which these simulations were based have never matched actual temperature-related data. The models are highly flawed theoretical constructs that don’t conform to empirical reality.
Another point made in the Axios article is that the researchers employed a “data-driven approach.” That sounds reassuring, and it’s intended to confer some authority on the study’s findings, but actually, scientific research on temperatures is severely handicapped by defects in measuring, collecting, and reporting data. Here are a few of the difficulties:
  • Historically, there was a lack of quality control to ensure standardization of thermometers used to measure temperatures in different parts of the world.
  • There were also vast expanses of land and even vaster expanses of seas where data collection was sporadic, unreliable, and often nonexistent.
  • The well-known urban heat island effect has warped many measurements.
  • Official temperature-measuring installations frequently are compromised by poor siting. In 2009, on-site investigations found that many U.S. weather stations were yielding corrupted data because thermometers were placed close to localized heat sources—everything from sunbaked asphalt tennis courts to running engines. A repeat investigation last year found that the problem has gotten even worse, with fully 96 percent of the U.S. weather stations sampled providing corrupted temperature measurements.
  • Other times, temperature measurements are affected by where the thermometers are located. For example, as recently as 1991, the National Oceanic and Atmospheric Administration (NOAA) collected data from nearly 600 weather stations in Canada. By 2010, the number of such stations in our vast northern neighbor had dropped to 35, with only one north of the Arctic Circle. Can you guess what happens to an average temperature when you include fewer measurements from colder regions? NOAA, by the way, was given an explicit exemption from the 2004 Data Quality Act, a federal law that requires the use of sound data in policymaking. (See the Wall Street Journal, Sept. 29, 2004, p. A18.)
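The averaging effect described in that last bullet is simple arithmetic, and a brief sketch can make it concrete. The numbers below are invented for illustration only (they are not actual NOAA station data): dropping colder stations from a network raises the computed average even though no individual station has warmed at all.

```python
# Hypothetical illustration with invented numbers (not real NOAA data):
# removing cold-region stations from an average raises the "average
# temperature" without any station actually getting warmer.

warm_stations = [12.0, 14.0, 15.0]    # mean annual temps, deg C (invented)
cold_stations = [-8.0, -10.0, -12.0]  # e.g., Arctic-region stations (invented)

full_network = warm_stations + cold_stations
reduced_network = warm_stations  # cold stations dropped from the record

avg_full = sum(full_network) / len(full_network)
avg_reduced = sum(reduced_network) / len(reduced_network)

print(round(avg_full, 2))     # 1.83
print(round(avg_reduced, 2))  # 13.67
```

The individual readings are identical in both cases; only the mix of stations changed, yet the network average jumps by roughly 12 degrees.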
NASA, too, has gained something of a reputation for revising historical temperature data to create an illusion of rapidly rising temperatures. See the fantastic video “Everywhere Warming Twice As Fast As Everywhere Else” by Tony Heller, in which he provides examples of data manipulation and of a popular misleading rhetorical trick employed by journalists to hype climate change.
A far more reliable temperature-gathering effort has been that provided by satellites (data that only goes back to 1979). It turns out that the satellites contradict the insistent claims of alarmists that we have been experiencing record heat in recent years. The satellites, in fact, show the opposite—a cooling trend during the years 2015 through 2022.
Predicting future climatological conditions is hard enough as it is (in fact, the UN’s IPCC has stated that it’s impossible), but scientists don’t even have an unambiguous understanding of the past. Even the present is often distorted in the pursuit of ambitious political goals. Nobody can say what the future has in store for us. But, as I’ve written before, we’re making astounding progress in coping with Mother Nature’s meteorological convulsions, and as long as we keep growing economically, we’ll continue to make progress in preserving and protecting human life from those events. That’s the challenge we face, regardless of whether we meet the arbitrary temperature thresholds that the climate change cabal has conjured up for us.
Views expressed in this article are opinions of the author and do not necessarily reflect the views of The Epoch Times.
Mark Hendrickson is an economist who retired from the faculty of Grove City College in Pennsylvania, where he remains fellow for economic and social policy at the Institute for Faith and Freedom. He is the author of several books on topics as varied as American economic history, anonymous characters in the Bible, the wealth inequality issue, and climate change, among others.