H. Sterling Burnett: NASA and NOAA’s Latest Climate Warning Is a Result of Purposefully Flawed Data

Because science is the pursuit of knowledge, and because political actions almost necessarily restrict personal freedom, science, laws, and regulations should all rest on the best available data. Using bad data undermines both the pursuit of truth and the legitimate justification of laws and regulations.

Everyone, from the far left to the far right on the political spectrum, should be able to agree about this.

Sadly, in the fields of climate research and climate policy, good data, when not ignored entirely, is increasingly twisted to fit the narrative that humans are causing a climate crisis. Climate action partisans, in pursuit of political power and ever-increasing resources, force the data to fit their delusion that humans must forgo modern, industrial civilization to save humanity and the earth from climate doom.

This problem is all too evident in a recent report from the National Aeronautics and Space Administration (NASA) and the National Oceanic and Atmospheric Administration (NOAA) on global temperature trends. Between them, the two agencies operate the most accurate, comprehensive system of temperature-measuring instruments in the world. But rather than cite data from their best sources when they reported global temperatures on January 15, NASA and NOAA chose to use severely compromised data: temperature readings that they and others gathered from biased monitoring stations and then adjusted in a process called “homogenization.”

NASA and NOAA announced that 2019 was the second warmest year since modern record keeping began in 1880, helping to make the 2010s the “warmest decade on record.”

These claims are based on the utterly unreliable, adjusted temperature measurements recorded by surface temperature stations scattered across the globe. These measurements, or at least the raw data behind them, are usually accurate enough to tell local residents about the temperature and weather anomalies in their area on a particular day. But as measures of long-term trends that might tell us something important about whether humans are causing global warming, most of them are virtually worthless.

As has been hammered home repeatedly over the years by meteorologist Anthony Watts (who is also a Senior Fellow with The Heartland Institute), many of the monitoring stations throughout the United States fail to meet the standards the agencies themselves established for reliable data measurement. Watts documented hundreds of stations sited on pavement, at airports where they pick up jet exhaust, or next to artificial sources of heat and cold, such as air-conditioning units and commercial grill exhausts. Many of these stations were once located in rural areas but are now surrounded by development, and others are rural stations where data are not recorded or monitored regularly.

After Watts’ 2014 revelations, the U.S. Office of the Inspector General issued a scathing report, almost entirely ignored by the media, that found a lack of oversight, non-compliance, and a lax review process in the climate recording network, leading it to conclude that program data “cannot be consistently relied upon by decision-makers.” In a panic, during the investigative process that resulted in the Inspector General’s report, NOAA closed approximately 600 of its most problematic weather stations.

Numerous reports have shown data manipulation is not limited to the United States but is common across the globe. Temperatures recorded at pristine rural monitoring stations in far-flung locations such as Australia, Paraguay, and Switzerland have been inexplicably homogenized so that past temperatures are now reported as cooler, and recent temperatures as warmer, than what was actually recorded, necessarily making the temperature rise at these locations over the past century appear steeper and larger than the unadjusted data indicate.

NOAA violated its own rules when it undertook a similar adjustment process for its ocean temperature records, beginning in 2015. As David Rose wrote for the Daily Mail, “[NOAA scientists] took reliable readings from [ocean] buoys but then ‘adjusted’ them upwards—using readings from seawater intakes on ships … even though readings from the ships have long been known to be too hot.” Mixing bad data with good produces reliable results no more than adding muddy river water to purified bottled water produces safe drinking water.

NASA and NOAA’s new report is another instance of “garbage in, garbage out,” in which their use of bad data produces flawed results, which, based on experience, will be used to push bad policies.

NASA and NOAA jointly or separately operate the U.S. Climate Reference Network, the gold standard of surface temperature data, as well as global satellites and weather balloons. The temperature data recorded by these three independent, unbiased temperature-measuring networks show minimal warming over the past 40 years. Yet the agencies ignored these data sets in their recent report, proving their dogmatic belief in a human-caused climate catastrophe.

NASA and NOAA are like toddlers trying to fit square pegs into round holes, and just as likely as toddlers to throw fits when their efforts are stymied by reality.

The Trump administration should steeply cut NASA and NOAA’s climate budgets until agency heads and career staff get the message that they will not be rewarded for repeatedly telling “sky is falling” climate scare stories when the truth about temperature and climate trends is, in fact, far from alarming.

H. Sterling Burnett, Ph.D. (hburnett@heartland.org) is a senior fellow on energy and the environment at The Heartland Institute, a nonpartisan, nonprofit research center headquartered in Arlington Heights, Illinois.
