Article Tags: Doug L. Hoffman, Headline Story

It has come to light that a number of climate scientists have been less than truthful with regard to climate data. As shocking and embarrassing as this has been to the scientific community, it serves only to emphasize the huge blind spot that scientists have for their computer models. Knowingly falsifying data is a career-ending offense, yet the entire climate science community engages in even worse deception without a second thought. This is because lies are generated for them wholesale by their faithful yet duplicitous servants: computer climate models.
In a guest post on Roger Pielke Sr.’s web site, Hiroshi L. Tanaka of the University of Tsukuba in Japan reported on the results of a new paper he published with his student, Masahiro Ohashi (see “Data Analysis of Recent Warming Pattern in the Arctic”). In it they state “it is shown that both of decadal variabilities before and after 1989 in the Arctic can be mostly explained by the natural variability of the AO not by the external response due to the human activity.” While this is an important finding in and of itself, that is not what caught my attention.
The implications of Ohashi and Tanaka’s finding for climate modeling are even more dramatic, and even more damaging, to the climate alarmist cause. Why this new finding is so damaging goes to the heart of how modeling is done and how models are calibrated to reflect previously “known” conditions. Here is how Dr. Tanaka stated the implications of their research:
Source: theresilientearth.com