You Can’t Have ‘Settled Science’ Based on Unsettled Data
Mar 18, 2025 6:57 PM

During his most recent State of the Union address, President Obama talked about climate change and claimed, “2014 was the planet’s warmest year on record.”

Obama was basing his statement on a press release by NASA’s Goddard Institute for Space Studies (GISS). According to the NASA data collected from more than 3,000 weather stations around the globe, “The year 2014 ranks as Earth’s warmest since 1880.” Climate change skeptics pushed back by questioning the accuracy of the report (more on that below), which invariably led to pushback against the skeptics’ claims.

For instance, Marcelo Gleiser, a theoretical physicist and cosmologist, wrote for NPR that “Clearly, the scientists in charge know what they are doing.”

Dr. Gleiser is a scientist, not a journalist, so such a silly appeal to expertise can be excused.* But many journalists, like everyone else, seem to have the same “experts must know” reaction to such claims. The problem is that there isn’t much evidence the experts even know what true global temperatures are—or that they can even acquire such data with any precision.

Before you dismiss me as a “skeptic” let me clarify what sort of skeptic I am so that you can dismiss my viewpoint for the right reasons.

I’m not an anthropogenic climate change skeptic; I’m agnostic on the question of whether mankind is heating up the planet (though I’d be surprised if we didn’t have some effect). What I am skeptical about—closer to an outright “denialist”—is the idea that global surface temperatures can be measured with any precision.

Let me explain the reasons why and then I’ll discuss why it matters.

Scientific investigation requires the collection and interpretation of data. Consensus on matters of science therefore requires that there be no significant dispute on either the data (i.e., its relevance or accuracy) or its interpretation. The debate over whether there is a “consensus” about anthropogenic climate change has tended to focus on the interpretation of the data. But what if the data is fatally flawed?

The reason I became skeptical about temperature data is because of my familiarity with metrology (the science of measurement) of electronic devices, specifically avionics equipment. The ability to obtain accurate measurements is a result of numerous factors, three of the most important being observer bias, equipment calibration, and acceptable margin of error.

Let’s start with observer bias on old-fashioned liquid-in-glass thermometers, which were the only type of tool used until about 20 years ago and are still used in some land-based temperature stations.

Here is NOAA’s description of how thermometers work in their stations:

Thermometers used in a [Cotton Region Shelter] are two basic types: Alcohol and Mercury. Alcohol thermometers are used to record the minimum temperatures. Minimum thermometers have a small bar embedded in the liquid which is pulled down the tube as the temperature falls. As the temperature warms again and the liquid moves back up the tube, the bar remains at the “minimum” which allows the observer to read the lowest temperature.

Mercury thermometers are used to record the maximum temperature. Maximum thermometers have a small break near the base of the well of liquid at the bottom of the thermometer. So as the temperature falls from the high, this break in the liquid keeps the liquid in place at its high point. The observer then twirls the thermometers in a rack which rejoins the mercury or sends the bar back to the top of the liquid, resetting them for another day’s recording.

This is what they look like:

The first problems with thermometers, which make them less than reliable, are limited resolution and parallax error.

Let’s assume the scale is marked in 1°C steps (as in the picture above). Since you cannot interpolate between the scale markers, this particular thermometer’s resolution is 1°C, which is normally stated as plus or minus 0.5°C (±0.5°C). As Mark Sonter explains,

This example of resolution assumes you are observing the temperature under perfect conditions and have been properly trained to read a thermometer. In reality you might glance at the thermometer, or you might have to use a flashlight to look at it, or it may be covered in a dusting of snow, rain, etc. Mercury forms a pronounced meniscus in a thermometer that can exceed 1°C, and many observers incorrectly read the temperature at the base of the meniscus rather than its peak. (This picture shows an alcohol meniscus; a mercury meniscus bulges upward rather than down.)

The most common error in reading a thermometer is parallax error. This is where refraction of light through the glass thermometer exaggerates any error caused by the eye not being level with the surface of the fluid in the thermometer.
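The resolution point can be sketched numerically. The following is a toy illustration with invented temperatures, not real station data: each reading snaps to the nearest 1°C scale mark, so a single observation can be off by up to half the resolution.

```python
# Toy illustration (invented numbers): a thermometer marked in 1-degree
# steps snaps every reading to the nearest scale mark, so any single
# observation can be off by up to 0.5 degrees.

def read_thermometer(true_temp_c: float, resolution_c: float = 1.0) -> float:
    """Round the true temperature to the nearest scale marking."""
    return round(true_temp_c / resolution_c) * resolution_c

true_temps = [20.2, 20.49, 20.6, 21.4]
readings = [read_thermometer(t) for t in true_temps]
errors = [r - t for r, t in zip(readings, true_temps)]

print(readings)  # [20.0, 20.0, 21.0, 21.0]
print(round(max(abs(e) for e in errors), 2))  # 0.49 -- never more than half the resolution
```

And that is the error under ideal conditions, before meniscus and parallax mistakes are layered on top.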

Parallax error can be affected by the eye’s level in relation to the thermometer. While most weather station thermometers are about the same height (5’ from the ground) the people taking the measurement are not. If they are taller or shorter, you have to account for that bias. That has, of course, never been taken into account in climate change data.

As Sonter notes,

If you are using data from hundreds of thermometers scattered over a wide area, with data being recorded by hand by dozens of different people, the observational resolution should be reduced. In the oil industry, for example, it is common to accept an error margin of 2-4% when using manually acquired data.

Calibration of such thermometers is also difficult, if not impossible, and aging causes the glass to shrink and the alcohol to evaporate, both of which affect accuracy. This can affect readings by up to 5°C.

(Another form of bias is “fudging the numbers.” Weather stations with thermometers require a human to make twice-daily recordings and reset the gauges. As anyone who has ever worked at a job (such as security) where frequent sign-offs are required will tell you, backdating data is a common practice. But we’ll ignore that fact for now and assume that for the past hundred years nobody was doing this when it came to recording temperatures around the globe and that the recording of data from glass thermometers is perfect.)

Nowadays, it is much more common to use thermistors. These instruments, like the one you use to take your own temperature, use a resistor whose resistance changes with temperature. Because of the known dependence of resistance on temperature, the resistor can be used as a temperature sensor. But this leads to another sort of accuracy problem:

Electronic sensors suffer from drift and hysteresis and must be calibrated annually to be accurate, yet most weather station temperature sensors are NEVER calibrated after they have been installed. Drift is where the recorded temperature increases steadily or decreases steadily even when the real temperature is static, and is a fundamental characteristic of all electronic devices.

Drift is where a recording error gradually gets larger and larger over time. This is a quantum mechanics effect in the metal parts of the temperature sensor that cannot be compensated for. Typical drift of a -100°C to +100°C electronic thermometer is about 1°C per year, and the sensor must be recalibrated annually to fix this error.

Hysteresis is a common problem as well. This is where increasing temperature has a different mechanical effect on the sensor compared to decreasing temperature; so, for example, if the ambient temperature increases by 1.05°C, the thermometer reads an increase of 1°C, but when the ambient temperature drops by 1.05°C, the same thermometer records a drop of 1.1°C. (This is a common problem in metrology.)
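To make the resistance-to-temperature conversion and the drift problem concrete, here is a hedged sketch. The beta-model constants (10 kΩ at 25°C, beta = 3950) are generic NTC-thermistor datasheet values I am assuming, and the 1°C-per-year drift rate is the figure quoted above; none of this comes from a real station.

```python
import math

# Sketch (assumed generic values): a thermistor converts resistance to
# temperature via the beta-parameter model, a simplified Steinhart-Hart
# equation; an uncalibrated sensor then adds accumulated drift on top.

R0_OHM = 10_000.0       # assumed resistance at the reference temperature
T0_K = 298.15           # reference temperature (25 C) in kelvin
BETA = 3950.0           # assumed datasheet beta constant
DRIFT_PER_YEAR_C = 1.0  # drift rate quoted in the passage above

def thermistor_temp_c(resistance_ohm: float) -> float:
    """Beta-model conversion from resistance to degrees Celsius."""
    inv_t = 1.0 / T0_K + math.log(resistance_ohm / R0_OHM) / BETA
    return 1.0 / inv_t - 273.15

def recorded_temp_c(resistance_ohm: float, years_since_calibration: float) -> float:
    """What an uncalibrated station logs: conversion plus accumulated drift."""
    return thermistor_temp_c(resistance_ohm) + DRIFT_PER_YEAR_C * years_since_calibration

print(round(thermistor_temp_c(10_000.0), 2))     # 25.0 at the reference point
print(round(recorded_temp_c(10_000.0, 5.0), 2))  # 30.0 after five uncalibrated years
```

In this toy model the real temperature never changes, yet after five years without calibration the logged value has risen by 5°C.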

Not only are the instruments not calibrated as required, their estimated average error rate has been incorrectly calculated at ±0.2°C. But when scientists tested three ideally suited weather stations, they found a representative lower-limit uncertainty of ±0.46°C. In the published journal article, the conclusion was:

This ±0.46°C reveals that the global surface air temperature anomaly trend from 1880 through 2000 is statistically indistinguishable from 0°C, and represents a lower limit of calibration uncertainty for climate models and for any prospective physically justifiable proxy reconstruction of paleo-temperature. The rate and magnitude of 20th century warming are thus unknowable, and suggestions of an unprecedented trend in 20th century global air temperature are unsustainable.

In other words, the data would be inaccurate even if all weather stations were ideally located. But they are not.
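A hedged back-of-envelope, not the paper’s exact method, shows why an uncertainty of that size matters: the difference between two anomaly values, each carrying ±0.46°C, has a combined (quadrature) uncertainty of about 0.65°C. The 0.7°C century-scale warming figure used here is my illustrative round number, not a value from this article.

```python
import math

# Back-of-envelope sketch (illustrative numbers, not the paper's
# calculation): two anomaly values, each uncertain by +/-0.46 C,
# differ with a combined quadrature uncertainty of sqrt(2) * 0.46 C.

PER_MEASUREMENT_UNCERTAINTY_C = 0.46
combined_c = math.hypot(PER_MEASUREMENT_UNCERTAINTY_C, PER_MEASUREMENT_UNCERTAINTY_C)

illustrative_century_trend_c = 0.7  # assumed round number for illustration
print(round(combined_c, 2))                       # 0.65
print(illustrative_century_trend_c > combined_c)  # True, but only barely
```

On these assumed numbers the century trend barely clears the noise floor of the measurements themselves.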

NOAA requires that the sensor in their recording stations be at least 100 feet from any paved or concrete surface. Yet neither the National Climatic Data Center (NCDC) nor NASA’s Goddard Institute for Space Studies (GISS), the main collectors, analyzers, and modelers of climatic data, has ever done a site-by-site, hands-on photographic survey to determine whether microsite influences exist near the thermometers. As Anthony Watts says, “To date all such studies conducted have been data analysis and data manipulations used to spot and/or minimize data inconsistencies.”

Volunteers have been collecting pictures of sites for SurfaceStations.org. Here’s an example from Lovelock, Nevada:

And here’s one from Marysville, California.

As Steven F. Hayward says,

The Marysville temperature station is located at the city’s fire department, next to an asphalt parking lot and a cell phone tower, and only a few feet away from two air compressors that spew out considerable heat. These sources of heat amplification mean that the temperature readings from the Marysville station are useless for determining accurate temperatures for the Marysville area.

Indeed, the Marysville station violates the quality control standards of the National Oceanic and Atmospheric Administration (NOAA). NOAA admits that stations like Marysville, sited close to artificial heat sources such as parking lots, can produce errors as large as 5 degrees Celsius. That is not the only shortcoming of the Marysville data; it turns out that daily data were missing for as many as half the days of any given month. Either the device failed to self-record, or no one recorded the daily data as procedure requires. NASA simply filled in the gaps in the data by “interpolating.”

This leads to the final problem with the data: interpolating and guessing.
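The kind of gap-filling described above can be sketched in miniature. This is generic linear interpolation over invented daily readings, not NASA’s actual procedure: the filled values look plausible, but they are manufactured rather than measured.

```python
# Hypothetical sketch: missing daily readings (None) replaced by linear
# interpolation between the nearest recorded neighbors. The numbers are
# invented; this is not NASA's actual gap-filling method.

def interpolate_gaps(readings):
    out = list(readings)
    for i, v in enumerate(out):
        if v is None:
            lo = next(j for j in range(i - 1, -1, -1) if out[j] is not None)
            hi = next(j for j in range(i + 1, len(out)) if out[j] is not None)
            frac = (i - lo) / (hi - lo)
            out[i] = out[lo] + frac * (out[hi] - out[lo])
    return out

daily = [14.0, None, None, 20.0]
print(interpolate_gaps(daily))  # [14.0, 16.0, 18.0, 20.0]
```

The two middle values are pure inference: if the real days were unusually hot or cold, that information is simply gone.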

When pressed, NOAA and other agencies will admit that there are problems of “bias” in the data. So how do they account for such grievous errors? By inserting their own bias. Take, for instance, this claim by NOAA:

Q. What are some of the temperature discrepancies you found in the climate record and how have you compensated for them?

NOAA is deploying a new network of stations called the U.S. Historical Climatology Network – Modernization. These stations maintain the same level of climate science quality measurements as the USCRN, but are spaced more closely and focus solely on temperature and precipitation.

Over time, the thousands of weather stations around the world have undergone changes that often result in sudden or unrealistic discrepancies in observed temperatures requiring a correction. For the U.S.-based stations, we have access to detailed station history that helps us identify and correct discrepancies. Some of these differences have simple corrections.

The most important difference globally was the modification in measured sea surface temperatures. In the past, ship measurements were taken by throwing a bucket over the side, bringing some ocean water on deck and putting a thermometer in it. Today, temperatures are recorded by reading thermometers in the engine coolant water intake — this is considered a more accurate measure of ocean temperature. The bucket readings used early in the record were cooler than engine intake observations, so the early data have been adjusted warmer to account for that difference. This makes global temperatures indicate less warming than the raw data does.

The most important difference in the U.S. temperature record occurred with the systematic change in observing times from the afternoon (when it is warm) to morning (when it is cooler). This shift has resulted in a well-documented and increasing cool discrepancy over the last several decades and is addressed by applying a correction to the data.

Notice the admission of “applying a correction to the data.” While this may be a common practice, it leads to a shift from “empirical data-driven science” to “speculative and ideology-driven guesswork.” The only way you can “apply a correction” to temperature data is to assume that you know what the data should have been and then, with a few adjustments, just make it so. That’s not data collection; that’s fudging the numbers. Because of such “adjustments,” we should be very, very skeptical of the accuracy of any data associated with surface temperature readings.

But the problem is not just the accuracy but the illusion of certainty. The NASA press release said that 2014 was the “hottest year on record” yet failed to mention that the alleged increase over 2010, the previous “warmest year,” was just two-hundredths of a degree, or 0.02°C. The margin of error is said by scientists to be approximately 0.1°C, several times as much. As the Daily Mail notes,

As a result, GISS’s director Gavin Schmidt has now admitted NASA thinks the likelihood that 2014 was the warmest year since 1880 is just 38 per cent. However, when asked by this newspaper whether he regretted that the news release did not mention this, he did not respond. Another analysis, from the Berkeley Earth Surface Temperature (BEST) project, drawn from ten times as many measuring stations as GISS, concluded that if 2014 was a record year, it was by an even tinier amount.

In other words, NASA chose to give information that would result in a scary headline (“Hottest Year Ever Recorded!”) rather than a more accurate story (“Temperature About the Same as in 2005 and 2010”).
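The arithmetic behind such a low confidence figure can be gestured at with a toy model. Assuming, purely for illustration, that the measured 0.02°C lead over 2010 is normally distributed with a 0.1°C standard error, the chance 2014 truly beat 2010 alone is barely better than a coin flip, and requiring it to beat every other candidate year would push the probability lower still. This is not NASA’s actual calculation.

```python
import math

# Toy model (assumed numbers, not NASA's method): probability that the
# true 2014-vs-2010 difference is positive, given a measured lead of
# 0.02 C and a normal measurement error with sigma = 0.1 C.

measured_lead_c = 0.02
sigma_c = 0.10

# Standard normal CDF evaluated at measured_lead_c / sigma_c.
prob_2014_warmer = 0.5 * (1 + math.erf(measured_lead_c / (sigma_c * math.sqrt(2))))
print(round(prob_2014_warmer, 2))  # 0.58 -- barely better than a coin flip
```

When the claimed difference is a fraction of the stated margin of error, a headline of certainty is doing rhetorical, not statistical, work.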

None of this would matter much if it were simply a dispute over the collection and accuracy of temperature data. But this is the type of flawed data being used to justify public policies that will hurt all Americans—the poor most of all. It is also used to justify “geoengineering” solutions, such as shooting millions of tons of sulfur dioxide into the atmosphere. And when you question the data, you’re called a “denier” comparable to “Holocaust deniers.”

The reality is, contra Gleiser, that the scientists in charge don’t really appear to know what they are doing. If you claim world-changing confidence in a claim based on shoddy, inaccurate, or unknowable data then you aren’t really doing science at all. You’re just an activist in a lab coat.

*One part of his article that can’t be excused is his failure to apply basic mathematics. He says, “Third, and most important, the data ties 2005, 2010 and 2014 as the three hottest years on record. Even within the margin of error and the occasional yearly fluctuation, our planet is getting steadily warmer.” Let’s assume that the claim is accurate and 2005, 2010, and 2014 are all tied for the hottest year. For the sake of simplicity, let’s say the hottest year on record is 100°. If 2005, 2010, and 2014 were all 100°, then the top temperature was the same over roughly ten years. That is not, as Gleiser says, “steady warming.” We could say that warming continued below the peak, but the peak temperature hasn’t increased for about a decade. In other words, there is no evidence of “steady warming” based on peak-temperature data.
