You Can’t Have ‘Settled Science’ Based on Unsettled Data

During his most recent State of the Union address, President Obama talked about climate change and claimed, “2014 was the planet’s warmest year on record.”

Obama was basing his statement on a press release by NASA’s Goddard Institute for Space Studies (GISS). According to the NASA data collected from more than 3,000 weather stations around the globe, “The year 2014 ranks as Earth’s warmest since 1880.” Climate change skeptics pushed back by questioning the accuracy of the report (more on that below), which invariably led to pushback against the skeptics themselves.

For instance, Marcelo Gleiser, a theoretical physicist and cosmologist, wrote for NPR that “Clearly, the scientists in charge know what they are doing.”

Dr. Gleiser is a scientist, not a journalist, so such a silly appeal to expertise can be excused.* But many journalists, like everyone else, seem to have the same “experts must know” reaction to such claims. The problem is that there isn’t much evidence the experts even know what true global temperatures are—or that they can even acquire such data with any precision.

Before you dismiss me as a “skeptic” let me clarify what sort of skeptic I am so that you can dismiss my viewpoint for the right reasons.

I’m not an anthropogenic climate change skeptic; I’m agnostic on the question of whether mankind is heating up the planet (though I’d be surprised if we didn’t have some effect). What I am skeptical about—closer to an outright “denialist”—is the idea that global surface temperatures can be measured with any precision.

Let me explain the reasons why and then I’ll discuss why it matters.

Scientific investigation requires the collection and interpretation of data. Consensus on matters of science therefore requires that there be no significant dispute on either the data (i.e., its relevance or accuracy) or its interpretation. The debate over whether there is a “consensus” about anthropogenic climate change has tended to focus on the interpretation of the data. But what if the data is fatally flawed?

The reason I became skeptical about temperature data is my familiarity with the metrology (the science of measurement) of electronic devices, specifically avionics equipment. The ability to obtain accurate measurements depends on numerous factors, three of the most important being observer bias, equipment calibration, and acceptable margin of error.

Let’s start with observer bias on old-fashioned liquid-in-glass thermometers, which were the only tool used until about 20 years ago and are still used in some land-based temperature sensors.

Here is NOAA’s description of how thermometers work in their stations:

Thermometers used in a [Cotton Region Shelter] are two basic types: Alcohol and Mercury. Alcohol thermometers are used to record the minimum temperatures. Minimum thermometers have a small bar embedded in the liquid which is pulled down the tube as the temperature falls. As the temperature warms again and the liquid moves back up the tube, the bar remains at the “minimum” which allows the observer to read the lowest temperature.

Mercury thermometers are used to record the maximum temperature. Maximum thermometers have a small break near the base of the well of liquid at the bottom of the thermometer. So as the temperature falls from the high, this break in the liquid keeps the liquid in place at its high point. The observer then twirls the thermometers in a rack which rejoins the mercury or sends the bar back to the top of the liquid, resetting them for another day’s recording.

This is what they look like:

The first problems with thermometers, which make them less than reliable, are limited resolution and parallax error.

Let’s assume the scale is marked in 1°C steps (as in the picture above). Since you cannot interpolate between the scale markers, this particular thermometer’s resolution is 1°C, which is normally stated as plus or minus 0.5°C (±0.5°C). As Mark Sonter explains,

This example of resolution assumes you are observing the temperature under perfect conditions and have been properly trained to read a thermometer. In reality you might glance at the thermometer, or you might have to use a flashlight to look at it, or it may be covered in a dusting of snow, rain, etc. Mercury forms a pronounced meniscus in a thermometer that can exceed 1°C, and many observers incorrectly read the temperature at the base of the meniscus rather than its peak. (This picture shows an alcohol meniscus; a mercury meniscus bulges upward rather than down.)

The common error in reading a thermometer is the parallax error. This is where refraction of light through the glass thermometer exaggerates any error caused by the eye not being level with the surface of the fluid in the thermometer.

Parallax error is affected by the eye’s level in relation to the thermometer. While most weather station thermometers are mounted at about the same height (5’ from the ground), the people taking the measurements are not. If an observer is taller or shorter, that bias has to be accounted for. It has, of course, never been taken into account in climate change data.
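To make the resolution problem concrete, here is a minimal sketch (in Python, with illustrative numbers of my own rather than anything from a weather agency) of what a 1°C-graduated scale does to a true temperature once observer error is added:

```python
# A minimal sketch of resolution error: a scale marked in 1°C steps forces
# every reading to a whole degree. The observer-bias range below is an
# illustrative assumption, not a measured value.
import random

def read_thermometer(true_temp_c: float, observer_bias_c: float = 0.0) -> int:
    """Simulate reading a liquid-in-glass thermometer marked in 1°C steps."""
    # The observer can only report the nearest scale marker; parallax or
    # meniscus error shifts the apparent position before rounding.
    return round(true_temp_c + observer_bias_c)

true_temp = 21.4
readings = [read_thermometer(true_temp, random.uniform(-0.5, 0.5))
            for _ in range(10)]
print(readings)                        # whole degrees only, e.g. [21, 22, 21, ...]
print(max(readings) - min(readings))   # the spread alone can reach a full 1°C
```

Every reading is forced to a whole degree, so any claim of tenth-of-a-degree precision built on such records is statistical reconstruction, not measurement.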

As Sonter notes,

If you are using data from hundreds of thermometers scattered over a wide area, with data being recorded by hand by dozens of different people, the observational resolution should be reduced. In the oil industry it is common to accept an error margin of 2–4% when using manually acquired data, for example.

Calibration of such thermometers ranges from difficult to impossible, even though it is sorely needed: over time the repeated effect of temperature causes the glass to shrink and the alcohol to evaporate, both of which degrade accuracy. This can skew readings by up to 5°C.

(Another form of bias is “fudging the numbers.” Weather stations with thermometers require a human to make twice-daily recordings and reset the gauges. As anyone who has ever worked at a job (such as security) where frequent sign-offs are required will tell you, backdating data is a common practice. But we’ll ignore that for now and assume that for the past hundred years nobody did this when recording temperatures around the globe, and that the recording of data from glass thermometers is perfect.)

Nowadays, it is much more common to use thermistors. These instruments, like the one you use to take your own temperature, use a resistor whose resistance changes with temperature. Because the dependence of resistance on temperature is known, the resistor can be used as a temperature sensor. But this leads to another sort of accuracy problem:
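For the curious, here is roughly how a thermistor reading becomes a temperature, using the common beta-parameter model. The constants below (R0, T0, BETA) are illustrative values of the sort found on an NTC thermistor datasheet, not figures from this article; the point is that the conversion leans entirely on constants that are only as trustworthy as the last calibration:

```python
# A minimal sketch of thermistor temperature conversion via the common
# beta-parameter model: 1/T = 1/T0 + ln(R/R0)/B. Constants are illustrative.
import math

R0_OHMS = 10_000.0    # resistance at the reference temperature (datasheet value)
T0_KELVIN = 298.15    # reference temperature, 25°C
BETA = 3950.0         # beta constant for a typical NTC thermistor

def thermistor_temp_c(resistance_ohms: float) -> float:
    """Convert a measured NTC thermistor resistance to degrees Celsius."""
    inv_t = 1.0 / T0_KELVIN + math.log(resistance_ohms / R0_OHMS) / BETA
    return 1.0 / inv_t - 273.15

print(round(thermistor_temp_c(10_000.0), 2))   # 25.0 at the reference point
print(round(thermistor_temp_c(8_000.0), 2))    # ~30.1, a warmer reading
```

If the effective beta or reference resistance drifts and no one recalibrates, every temperature computed from the sensor drifts with it.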

Electronic sensors suffer from drift and hysteresis and must be calibrated annually to be accurate, yet most weather station temp sensors are NEVER calibrated after they have been installed. Drift is where the recorded temp increases steadily or decreases steadily even when the real temp is static, and it is a fundamental characteristic of all electronic devices.

Drift is where a recording error gradually gets larger and larger over time. This is a quantum mechanical effect in the metal parts of the temperature sensor that cannot be compensated for. Typical drift of a −100°C to +100°C electronic thermometer is about 1°C per year, and the sensor must be recalibrated annually to fix this error.

Hysteresis is a common problem as well. This is where increasing temperature has a different mechanical effect on the sensor compared to decreasing temperature. So, for example, if the ambient temperature increases by 1.05°C, the thermometer reads an increase of 1°C; but when the ambient temperature drops by 1.05°C, the same thermometer records a drop of 1.1°C. (This is a common problem in metrology.)
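A toy simulation makes the scale of these two error modes easy to see. The 1°C-per-year drift rate comes from the figure quoted above; the hysteresis asymmetry is a small illustrative number of my own:

```python
# A minimal sketch (not the cited source's model) of uncorrected sensor drift
# plus asymmetric hysteresis on rising vs. falling temperatures.
DRIFT_PER_YEAR_C = 1.0        # ~1°C/year uncorrected drift, per the quote above
HYSTERESIS_ERROR_C = 0.05     # illustrative up/down asymmetry

def recorded_temp(true_temp_c: float, years_since_calibration: float,
                  rising: bool) -> float:
    drift = DRIFT_PER_YEAR_C * years_since_calibration
    hysteresis = 0.0 if rising else -HYSTERESIS_ERROR_C
    return true_temp_c + drift + hysteresis

# The same 15°C day, read three years after the last calibration:
print(recorded_temp(15.0, years_since_calibration=3, rising=True))    # 18.0
print(recorded_temp(15.0, years_since_calibration=3, rising=False))   # 17.95
```

Three years without recalibration and the extra three degrees in the record are an artifact of the sensor, not the weather.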

Not only are the instruments not calibrated as required, their estimated average error rate has been incorrectly calculated at ±0.2°C. When scientists tested three ideally sited weather stations, they found a representative lower-limit uncertainty of ±0.46°C. In the published journal article the conclusion was:

This ±0.46°C reveals that the global surface air temperature anomaly trend from 1880 through 2000 is statistically indistinguishable from 0°C, and represents a lower limit of calibration uncertainty for climate models and for any prospective physically justifiable proxy reconstruction of paleo-temperature. The rate and magnitude of 20th century warming are thus unknowable, and suggestions of an unprecedented trend in 20th century global air temperature are unsustainable.
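The arithmetic behind that conclusion is simple enough to sketch. If each endpoint anomaly carries the quoted ±0.46°C uncertainty (a simplification of the paper’s argument, for illustration), the uncertainty on the century-scale change exceeds the roughly 0.6°C of reported 20th-century warming (a commonly cited figure, not one from this article):

```python
# A minimal sketch of why ±0.46°C per reading swamps the reported trend.
# The 0.6°C warming figure is illustrative, not taken from the article.
REPORTED_WARMING_C = 0.6      # illustrative 1880-2000 anomaly change
UNCERTAINTY_C = 0.46          # lower-limit per-reading uncertainty, quoted above

# Two uncertain endpoints: the uncertainty of their difference adds in quadrature.
diff_uncertainty = (UNCERTAINTY_C**2 + UNCERTAINTY_C**2) ** 0.5
print(round(diff_uncertainty, 2))              # ~0.65°C
print(REPORTED_WARMING_C > diff_uncertainty)   # False: trend < its own error bar
```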

In other words, the data would be inaccurate even if all weather stations were ideally located. But they are not.

NOAA requires that the sensor in their recording stations be at least 100 feet from any paved or concrete surface. Yet neither the National Climatic Data Center (NCDC) nor NASA’s Goddard Institute for Space Studies (GISS), the main collectors, analyzers, and modelers of climatic data, has ever done a site-by-site, hands-on photographic survey to determine whether microsite influences exist near the thermometers. As Anthony Watts says, “To date all such studies conducted have been data analysis and data manipulations used to spot and/or minimize data inconsistencies.”

Volunteers have been collecting pictures of sites for SurfaceStations.org. Here’s an example from Lovelock, Nevada:

And here’s one from Marysville, California.

As Steven F. Hayward says,

The Marysville temperature station is located at the city’s fire department, next to an asphalt parking lot and a cell phone tower, and only a few feet away from two air compressors that spew out considerable heat. These sources of heat amplification mean that the temperature readings from the Marysville station are useless for determining accurate temperatures for the Marysville area.

Indeed, the Marysville station violates the quality control standards of the National Oceanic and Atmospheric Administration (NOAA). NOAA admits that stations like Marysville, sited close to artificial heat sources such as parking lots, can produce errors as large as 5 degrees Celsius. That is not the only shortcoming of the Marysville data; it turns out that daily data were missing for as many as half the days of any given month. Either the device failed to self-record, or no one recorded the daily data as procedure requires. NASA simply filled in the gaps in the data by “interpolating.”

This leads to the final problem with the data: interpolating and guessing.

When pressed, NOAA and other agencies will admit that there are problems of “bias” in the data. So how do they account for such grievous errors? By inserting their own bias. Take, for instance, this claim by NOAA:

Q. What are some of the temperature discrepancies you found in the climate record and how have you compensated for them?

NOAA is deploying a new network of stations called the U.S. Historical Climatology Network – Modernization. These stations maintain the same level of climate science quality measurements as the USCRN, but are spaced more closely and focus solely on temperature and precipitation.

Over time, the thousands of weather stations around the world have undergone changes that often result in sudden or unrealistic discrepancies in observed temperatures requiring a correction. For the U.S.-based stations, we have access to detailed station history that helps us identify and correct discrepancies. Some of these differences have simple corrections.

The most important difference globally was the modification in measured sea surface temperatures. In the past, ship measurements were taken by throwing a bucket over the side, bringing some ocean water on deck and putting a thermometer in it. Today, temperatures are recorded by reading thermometers in the engine coolant water intake — this is considered a more accurate measure of ocean temperature. The bucket readings used early in the record were cooler than engine intake observations, so the early data have been adjusted warmer to account for that difference. This makes global temperatures indicate less warming than the raw data does.

The most important difference in the U.S. temperature record occurred with the systematic change in observing times from the afternoon (when it is warm) to morning (when it is cooler). This shift has resulted in a well-documented and increasing cool discrepancy over the last several decades and is addressed by applying a correction to the data.

Notice the admission of “applying a correction to the data.” While this may be common practice, it marks a shift from “empirical, data-driven science” to “speculative, ideology-driven guesswork.” The only way you can “apply a correction” to temperature data is to assume that you know what the data should have been and then, with a few adjustments, just make it so. That’s not data collection; that’s fudging the numbers. Because of such “adjustments” we should be very, very skeptical of the accuracy of any data associated with surface temperature readings.
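Mechanically, “applying a correction” is nothing more than adding an offset to part of the record. The sketch below uses hypothetical station values, a hypothetical cutoff year, and a hypothetical offset; real homogenization procedures are far more elaborate, but the logical structure is the same: someone decides what the old readings should have been.

```python
# A deliberately simplified sketch of a record adjustment. The years,
# temperatures, cutoff, and offset are all hypothetical.
raw_record = {1930: 14.1, 1950: 14.3, 1970: 14.2, 1990: 14.6}  # °C, hypothetical
BUCKET_ERA_END = 1941      # hypothetical cutoff for bucket-based readings
BUCKET_OFFSET_C = 0.3      # hypothetical warm adjustment applied to that era

adjusted = {year: temp + (BUCKET_OFFSET_C if year <= BUCKET_ERA_END else 0.0)
            for year, temp in raw_record.items()}
print(adjusted)   # the bucket-era year is shifted warmer; the rest are untouched
```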

But the problem is not just the accuracy; it is the illusion of certainty. The NASA press release said that 2014 was the “hottest year on record” yet failed to mention that the alleged increase over 2010, the previous “warmest year,” was just two-hundredths of a degree, or 0.02°C. The margin of error is said by scientists to be approximately 0.1°C, several times as much. As the Daily Mail notes,

As a result, GISS’s director Gavin Schmidt has now admitted NASA thinks the likelihood that 2014 was the warmest year since 1880 is just 38 per cent. However, when asked by this newspaper whether he regretted that the news release did not mention this, he did not respond. Another analysis, from the Berkeley Earth Surface Temperature (BEST) project, drawn from ten times as many measuring stations as GISS, concluded that if 2014 was a record year, it was by an even tinier amount.
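NASA’s 38 percent figure refers to the chance that 2014 beat every other year in the record. A simpler, related calculation shows why even the head-to-head comparison with 2010 is close to a coin flip. Assume, purely for illustration, that each year’s anomaly is measured with independent Gaussian error of 0.1°C:

```python
# A minimal sketch: probability that 2014 truly exceeded 2010 given a 0.02°C
# gap and ~0.1°C measurement error per year (Gaussian errors assumed).
import math

GAP_C = 0.02       # 2014 minus 2010, per the article
SIGMA_C = 0.1      # stated margin of error for each year's anomaly

sigma_diff = math.sqrt(2) * SIGMA_C              # error of the difference
p = 0.5 * (1 + math.erf(GAP_C / (sigma_diff * math.sqrt(2))))
print(round(p, 2))                               # ~0.56
```

About a 56 percent chance, under these assumptions: barely better than a coin flip, and hardly the stuff of a “hottest year ever” headline.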

In other words, NASA chose to give information that would result in a scary headline (“Hottest Year Ever Recorded!”) rather than a more accurate story (“Temperature About the Same as 2005 and 2010”).

None of this would matter much if it were simply a dispute over the collection and accuracy of temperature data. But this is the type of flawed data being used to justify public policies that will hurt all Americans—the poor most of all. It is also used to justify “geoengineering” solutions, such as shooting millions of tons of sulfur dioxide into the atmosphere. And when you question the data, you’re called a “denier” and compared to “Holocaust deniers.”

The reality is, contra Gleiser, that the scientists in charge don’t really appear to know what they are doing. If you assert world-changing confidence in a claim based on shoddy, inaccurate, or unknowable data, then you aren’t really doing science at all. You’re just an activist in a lab coat.

*One part of his article that can’t be excused is his failure to apply basic mathematics. He says, “Third, and most important, the data ties 2005, 2010 and 2014 as the three hottest years on record. Even within the margin of error and the occasional yearly fluctuation, our planet is getting steadily warmer.” Let’s assume the claim is accurate and 2005, 2010, and 2014 are all tied for the hottest year. For the sake of simplicity, let’s say the hottest year on record is 100°. If 2005, 2010, and 2014 were all 100°, then the top temperature was the same over roughly ten years. That is not, as Gleiser says, “steady warming.” We could say that temperatures below the peak have been rising, but the peak temperature itself hasn’t increased for about a decade. In other words, there is no evidence of “steady warming” based on peak-temperature data.
