Report Explodes Global-warming Alarmism — “Adjusted” Data Accounts for All the Hyped Temperatures

WILLIAM F. JASPER

A new blockbuster study that examines the most relied-upon global temperature data sets could start a chain-reaction demolition of global-warming alarmism. The peer-reviewed study by two climate scientists and a statistician challenges the “adjustment” process used to produce the global average surface temperature (GAST) datasets that have dominated media headlines and political debate for the past two decades. The new report could also upend the multi-billion-dollar climate lobby, as well as the Obama-era EPA “endangerment finding” declaring carbon dioxide to be a dangerous pollutant.

The new study, “On the Validity of NOAA, NASA and Hadley CRU Global Average Surface Temperature Data & The Validity of EPA’s CO2 Endangerment Finding,” co-authored by Drs. James P. Wallace III, Craig D. Idso, and Joseph S. D’Aleo, was released on June 27. It was peer-reviewed by a distinguished group of seven scientists, including Dr. Alan Carlin (retired senior analyst and manager, U.S. Environmental Protection Agency), Professor Anthony R. Lupo (expert reviewer for the UN’s Intergovernmental Panel on Climate Change), and Dr. George T. Wolff (former chair of the EPA’s Clean Air Scientific Advisory Committee).

The “conclusive findings” of their research, say the study co-authors, are that inappropriate “adjustments” have been made to the temperature record, with the result that “the three GAST data sets are not a valid representation of reality.” “In fact, the magnitude of their historical data adjustments, that removed their cyclical temperature patterns, are totally inconsistent with published and credible U.S. and other temperature data. Thus, it is impossible to conclude from the three published GAST data sets that recent years have been the warmest ever — despite current claims of record setting warming.”

The researchers found that “each new version of GAST has nearly always exhibited a steeper warming linear trend over its entire history. And, it was nearly always accomplished by systematically removing the previously existing cyclical temperature pattern.” If the invalid GAST “adjustments” are discarded, there is virtually no warming trend to speak of. That, of course, is what the “pause” or “hiatus” in global temperatures of the past 20 years is all about. Despite the monkeying with GAST, even many of the top alarmists have admitted that the pause is real and that they have no satisfactory explanation for it.
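To make the “warming linear trend” language concrete, here is a minimal sketch of how such a trend is computed: an ordinary least-squares slope fitted to a series of annual temperature anomalies. The numbers below are synthetic, invented purely for illustration; they are not drawn from any agency’s actual record.

```python
# Illustrative only: how a linear warming trend (deg C per decade) is
# computed from annual temperature anomalies. Synthetic data, not any
# agency's actual record.

def linear_trend(years, anomalies):
    """Ordinary least-squares slope of anomalies versus years (deg C/yr)."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(anomalies) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, anomalies))
    den = sum((x - mean_x) ** 2 for x in years)
    return num / den

years = list(range(1980, 2020))
# Synthetic series: flat for 20 years, then a 0.3 deg step upward.
anomalies = [0.0] * 20 + [0.3] * 20
slope = linear_trend(years, anomalies)
print(f"trend: {slope * 10:.3f} deg C per decade")
```

The point of the sketch is that the fitted slope depends entirely on the input series: alter the early or late values of the series and the reported “trend” moves with them, which is why the adjustment history matters.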

The three datasets that make up the GAST record come from NOAA (National Oceanic and Atmospheric Administration), NASA (National Aeronautics and Space Administration), and Hadley CRU (Climatic Research Unit in England). Critics have long protested the non-transparent “adjustment” methods used by the GAST data managers, given that the adjustments invariably have driven the reported global temperatures in an upward direction, dramatically steepening the trend and suspiciously aiding the claim that anthropogenic (man-made) CO2 is causing a dangerous rise in global temperatures. However, the data managers have all too often refused to release the raw data for genuine peer (and public) review. Scientists may adjust data to correct for “biases” that creep into the record due to any number of causes (changes in instruments, technology, and procedures, for instance), but in real science those “corrections” and the raw data on which they are based must be available for checking by others. For many years now, skeptical scientists have criticized the climate alarmists for “secret science” — for constantly “adjusting,” “correcting,” “smoothing,” “normalizing,” “infilling,” and “homogenizing” the temperature data while refusing to yield up their raw data and methods to peer scrutiny.

As we have reported in The New American many times, computer models used by the alarmists have invariably run hot, which is to say that they always show hotter temperatures than the real, observed temperatures recorded by instruments (satellites, weather balloons, weather stations, ocean buoys).

In fact, of the 73 computer models used by the UN’s IPCC, all 73 — without exception — predicted far higher temperatures than actually occurred. This is not the disputed claim of climate change “deniers”; the IPCC itself has admitted this and did so in graphic detail (most likely inadvertently) with its infamous “Spaghetti Graph” showing its falsified computer model projections juxtaposed with the actual record, which appeared in the IPCC’s 2013 Fifth Assessment Report (IPCC AR5).

The response from the AGW (anthropogenic global warming) alarmists has been not to go back to the drawing board and correct their erroneous computer models, but instead to circle the wagons and “adjust” the recorded measurements to conform to their models! This has led to outrageous frauds that amount to the equivalent of changing the weights and measures, to wit: dropping thousands of weather stations in the colder, higher latitudes and higher elevations; “losing” winter-month cold temperatures; and failing to account for the urban heat island (UHI) effect and documented siting violations that have been causing most land-based weather stations to report a wildly exaggerated warming trend.

Orwellian Memory Hole: “Adjusting” Facts Out of Existence

As The New American has previously reported:

In their report, “Surface Temperature Records: Policy Driven Deception?,” published in August 2010 by the Science & Public Policy Institute, meteorologists Joseph D’Aleo and Anthony Watts write that “only about 3% [of the U.S. weather stations] met the ideal specification for siting.” The volunteers “found stations located next to the exhaust fans of air conditioning units, surrounded by asphalt parking lots and roads, on blistering-hot rooftops, and near sidewalks and buildings that absorb and radiate heat,” Watts and D’Aleo wrote. They documented these findings with photographs that appear both in the report and on the Surface Stations Project website. These horrendous siting problems might be attributed to government incompetence, negligence, and/or laziness. But there’s much more that can only be explained as intentional duplicity. Interestingly, beginning in 1990, NOAA, NASA, and the Global Historical Climatology Network (GHCN), managed by the National Climatic Data Center (NCDC), began a massive and radical series of “adjustments” that invariably injected a dramatic warming bias into the temperature data. Those changes included: 1) dropping thousands of stations globally, overwhelmingly from cooler regions (northern latitudes, higher elevations, and rural areas); 2) dropping cold months from the annual records; and 3) switching to new, automated thermometers that have a proven warming bias.

The magnitude of the fraud is astounding. We’re not talking here about some minor tinkering and tampering. As we noted last year, “Globally, the number of surface temperature stations dropped from 6,000 to just over 1,000. ‘The Russian station count dropped from 476 to 121 so over 40% of Russian territory was not included in global temperature calculations,’ note D’Aleo and Watts. ‘In Canada, the number of stations dropped from 600 to less than 50.’ Less than 50 for all of Canada!” Those are spectacular adjustments, yes? But there is much more. Our report continued: “At the same time, more mid-latitude and lower-elevation stations were added, along with more populated centers, adding more urban heat island (UHI) effect. D’Aleo and Watts point out: ‘Forty percent of GHCN v2 stations have at least one missing month. This is concentrated in the winter months.’ No problem; the NOAA/NASA/GHCN folks simply ‘infill’ with ‘adjusted’ data, always biasing in the warming direction, of course.”
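The arithmetic behind the “infilling” objection is simple to demonstrate. In this hypothetical sketch with made-up monthly temperatures, two missing winter months are infilled with estimates 2°C warmer than the actual values, and the annual mean rises accordingly:

```python
# Illustrative arithmetic with synthetic numbers: infilling missing
# winter months with warmer-than-actual estimates raises the annual mean.

# Hypothetical monthly mean temperatures (deg C), January through December.
actual = [-5, -3, 2, 8, 14, 18, 20, 19, 15, 9, 3, -4]
true_mean = sum(actual) / 12

# Suppose January and December are missing and each is infilled
# 2 deg C too warm.
infilled = actual[:]
infilled[0] += 2
infilled[11] += 2
infilled_mean = sum(infilled) / 12

# Two months each biased +2 deg C shifts the annual mean by 4/12 = 1/3 deg C.
print(true_mean, infilled_mean)
```

The sketch shows only the mechanism: whether real-world infilling is biased in this way is precisely what is in dispute, but the direction of any infill error propagates directly into the annual average.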

If the continuous temperature adjustments were legitimate, one would expect that at least some of them — probably around half — would be adjustments downward in temperature, if only to correct for the pervasive UHI bias that falsely projects ever-increasing global warming. But, as we have seen, that is not the case. The fact that the adjustments persistently cool the past, while always warming recent years and the present to create a steepening upward trend, defies both probability and the empirical evidence. It shows the adjustments are driven by a political agenda, by duplicity, not reality.
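The statistical expectation behind this argument can be sketched with a simulation. If corrections were unbiased — random errors with no preferred direction — their signs should split roughly 50/50. The simulation below uses synthetic, randomly generated "adjustments" purely to illustrate that baseline expectation:

```python
# Illustrative baseline (synthetic data): unbiased corrections should be
# downward roughly half the time.
import random

random.seed(42)  # fixed seed for reproducibility
# 10,000 hypothetical adjustments drawn from a zero-mean distribution,
# i.e., corrections with no systematic warming or cooling bias.
adjustments = [random.gauss(0.0, 0.1) for _ in range(10_000)]

downward = sum(1 for a in adjustments if a < 0)
print(f"{downward / len(adjustments):.1%} of unbiased adjustments are downward")
```

Against that baseline, a pattern in which nearly all adjustments push in one direction would be extraordinarily unlikely to arise from unbiased error correction alone.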

Obama’s EPA “Endangerment Finding” for CO2 Now Endangered

The new Wallace/D’Aleo/Idso study is likely to provide additional fuel to President Donald Trump’s efforts to undo the Obama administration’s EPA “endangerment finding,” which made the incredible claim that carbon dioxide (CO2), the ubiquitous, beneficial “gas of life,” is a “pollutant” that must be severely regulated by the EPA to prevent catastrophic global warming.

How did the EPA arrive at this conclusion despite the fact that all plants need CO2 and all animals exhale CO2, not to mention the acknowledged contradictory evidence that global temperatures have remained stagnant, “on pause,” for the past 20 years, even as anthropogenic CO2 has risen dramatically? The answer is that the EPA relied on the three doctored (i.e., “adjusted”) GAST “lines of evidence” mentioned above. And it is using this newly acquired authority to regulate CO2 and other greenhouse gases (GHG) to wreak havoc all across our economy.

The study authors explain the purpose of their investigation:

The objective of this research was to test the hypothesis that Global Average Surface Temperature (GAST) data, produced by NOAA, NASA, and HADLEY, are sufficiently credible estimates of global average temperatures such that they can be relied upon for climate modeling and policy analysis purposes. The relevance of this research is that the validity of all three of the so-called Lines of Evidence in EPA’s GHG/CO2 Endangerment Finding require GAST data to be a valid representation of reality.

After noting the three GAST data sets “are not a valid representation of reality,” the authors state that “since GAST data set validity is a necessary condition for EPA’s GHG/CO2 Endangerment Finding, it too is invalidated by these research findings.”

In an excellent piece in The Daily Caller, Michael Bastasch notes, “Since President Donald Trump ordered EPA Administrator Scott Pruitt to review the Clean Power Plan, there’s been speculation the administration would reopen the endangerment finding to new scrutiny.” Bastasch interviewed Sam Kazman, an attorney with the Competitive Enterprise Institute (CEI), one of the organizations petitioning the EPA to reopen the CO2 endangerment finding. Kazman views the Wallace/D’Aleo/Idso study as an “important new piece of evidence to this debate.” If the Trump administration effectively uses this powerful new evidence, it may be able to undo much of the damage inflicted by the Obama administration with EPA’s endangerment finding. That would be a potent follow-up to President Trump’s decision to cancel President Obama’s unconstitutional commitment of the United States to the UN’s dangerous Paris climate accord.
