Science Validates Common Sense
Purposefully isolated in Wisconsin’s Necedah National Wildlife Refuge, station 0423E2 of the U.S. Climate Reference Network (CRN) has continuously measured air temperature and the assorted other conditions that characterize the local climate since 2005. Managed by the National Oceanic and Atmospheric Administration (NOAA), the Necedah station and the rest of the state-of-the-art CRN pre-empt error with three independent temperature sensors per site and with locations remote enough that for most of the coming century, no nearby human activity is likely to influence the measurements.
Installation of the 114-station CRN was completed in 2008. NOAA says its purpose is to be able “with the highest degree of confidence” 50 years from now to answer the question, “How has the climate of the nation changed over the past 50 years?”
But although the CRN undoubtedly produces the highest-quality climatological data gathered by any such system on Earth, its measurements are not what make up the official U.S. temperature record.
Why Not the Best?
A scientific paper published earlier this year validates the concerns that motivated creation of the CRN, and yet there are defensible explanations for why the nation’s temperature statistics aren’t based on CRN data that’s obviously the best available. In climatology, an observational record spanning 30 years is considered necessary to support any claim that a trend is afoot—as distinct from routine short-term variability—and the full Climate Reference Network has been up and running for only a little more than a decade.
On the other hand, there are problems aplenty with the network of monitoring stations that do furnish the data used in compiling the official history of U.S. temperatures across the more than 120 years since the start of what are considered representative nationwide thermometer records.
NOAA seeks to cope with these problems by “adjusting” data collected at the 1,219 monitoring stations across the contiguous United States that make up the U.S. Historical Climatology Network (USHCN). The 1,219 stations were selected in the mid-1980s from an even larger network, so while quality of measurement is presumably improved, there will inevitably be differences compared with data supplied by the earlier network.
Moreover, the datasets have been revised at least three times from the 1990s through the first decade of this century. And those revised datasets themselves—to again use NOAA’s words—“contained adjustments to the monthly mean, maximum, minimum, and average temperature data that addressed potential changes in biases (inhomogeneities) in data from USHCN stations documented in NCEI’s station history archives.”
In plain language, the numbers processed to depict historical temperature trends are not necessarily the same ones written down at the moment in history when a volunteer observer checked a thermometer or, more recently, an automated system recorded the data.
The “inhomogeneities” NOAA seeks to resolve include stations being relocated, changes in the time of day observations are made, and equipment changes, for instance an electronic sensor replacing a bulb thermometer. All these things can subtly influence the data, and NOAA does its best to compensate with revisions intended to ensure continuity across changing circumstances.
There’s been plenty to compensate for, as direct inspection of most of the USHCN sites revealed beginning a dozen years ago.
A System Found Wanting
Twelve years ago this month, Wisconsin Energy Cooperative News (WECN) published an article detailing substandard conditions at many USHCN sites. Widespread deficiencies raised pointed questions for a citizenry hearing talk of new taxes targeting energy use and of fundamental lifestyle changes said to be urgent because of an estimated increase in the worldwide average temperature totaling a bit less than one and a half degrees (Fahrenheit) over about 160 years.
Doubts about the quality of temperature monitoring had already been raised without this magazine’s help. The October 2007 article quoted a then-10-year-old United Nations report calling climate-monitoring capabilities “inadequate and deteriorating worldwide,” warning that absent remedial action, “the ability to characterize climate change and variations over the next 25 years will be even less than during the past quarter century.”
These faltering capacities were attributed partly to great numbers of weather stations in northern latitudes ceasing operations after 1990 as the Soviet Union dissolved itself.
But WECN was among the first publications, and at the time the largest, to report that much of the world’s best monitoring system, the USHCN, was in surprisingly poor condition.
Weather stations were sited in close proximity to air-conditioner exhausts; on flat, dark-colored downtown rooftops; adjacent to and sometimes directly over asphalt pavement: conditions that expose temperature sensors to artificial heat sources or concentrated natural heat. Mixed with readings from hundreds of other stations to calculate a nationwide average, the effects of bad siting would be limited to tenths of degrees or less. But differences between years headlined as “the hottest ever” and years they’re said to displace are also limited to tenths of degrees or less.
Volunteers Pitch In
Among the earliest to examine siting conditions of USHCN weather stations—something the federal government had never done—California meteorologist Anthony Watts in 2007 launched the “SurfaceStations Project,” revealing overwhelmingly slipshod compliance with NOAA’s published standards for ensuring accurate climatological data. That summer, Wisconsin Energy Cooperative News interviewed Watts. Still in its early stages, the SurfaceStations Project was turning up large numbers of weather stations ill-suited to contribute accuracy to the record.
So abundant were badly sited stations that we asked whether volunteer observers might simply be homing in on absurd examples; human nature being what it is, where’s the fun in reporting that something’s working as intended?
Watts replied that the network’s overall quality couldn’t be fairly assessed until a broader sample was obtained. Over the following months the sample grew, and the picture changed little.
Ultimately, more than 82 percent (1,007) of USHCN sites were photographically documented. Only 8 percent met NOAA’s standards for measuring temperature within an error range of 1 degree Celsius (1.8 degrees Fahrenheit) or less. More than nine in 10 stations surveyed had an error margin greater than all the warming believed to have occurred since the mid-19th century.
NOAA backhandedly acknowledges the SurfaceStations Project by dismissing its findings. The agency cites a 2010 paper that compared trends from well- and badly sited USHCN stations and recognized bias in unadjusted data but called it “consistent with previously documented changes associated with the widespread conversion to electronic sensors in the USHCN during the [preceding] 25 years.”
NOAA further maintained the bias was “counterintuitive to photographic documentation of poor exposure,” arguing that instrument changes had artificially lowered the high temperatures. The agency also maintains that USHCN data tracks closely with data from the CRN.
However, a paper published this spring under NOAA’s auspices drew conclusions consistent with SurfaceStations Project findings and WECN’s reporting in 2007.
“Impacts of Small-Scale Urban Encroachment on Air Temperature Observations” was published in May by the American Meteorological Society’s Journal of Applied Meteorology and Climatology. Its less-than-stupendous conclusion was that warming is exaggerated when monitoring equipment is exposed to artificial heat sources, a proposition nevertheless hotly contested by many climate researchers as to its applicability to large-network or global trends.
Temperature sensors were placed at graduated distances from buildings and paved surfaces built for the experiment at Oak Ridge, Tennessee. Observations from November 2012 through April 2014 were compared with observations from a Rhode Island CRN site with known urban encroachment, and a nearby site free of encroachment.
The biggest effect at the experiment site was a narrowed difference between high and low temperatures over a given 24-hour period: the closer sensors were to the built environment, the higher were the nighttime lows. Seemingly tiny (almost certainly less than any change humans can sense), the 0.7- to 0.86-degree (Fahrenheit) rise in overnight lows could be meaningful.
Recent warming of global average temperatures, seen mainly during the late 20th century, is attributed primarily to milder overnight lows, with comparatively little observed warming of daytime highs. And the total increase of global temperatures since about 1850 is generally agreed to be about 1.5 degrees Fahrenheit. NOAA’s experiment suggests a not-inconsequential share of this could reflect artificial urban influences.
“Look, Up in the Sky…”
The 114 CRN stations that may someday supplant the USHCN aren’t the only data sources unaffected by hot pavement or air-conditioner exhausts.
In September the University of Alabama-Huntsville (UAH), which interprets temperature data from NASA satellites, reported that the global average temperature for August made it the fourth-warmest August (behind 1998, 2016, and 2017) in the satellite record, which began in 1979.
The UAH-identified global warming trend works out to 1.3 degrees Celsius per century.