What did ExxonMobil Know and when did they know it? (Part 3, Exxon: The Fork Not Taken)

October 23, 2015


This just keeps getting more hilarious…

Exxon Confirmed Global Warming Consensus in 1982 with In-House Climate Models
The company chairman would later mock climate models as unreliable while he campaigned to stop global action to reduce fossil fuel emissions.
Lisa Song, Neela Banerjee, David Hasemyer
Sep 22, 2015

Steve Knisely was an intern at Exxon Research and Engineering in the summer of 1979 when a vice president asked him to analyze how global warming might affect fuel use.

“I think this guy was looking for validation that the greenhouse effect should spur some investment in alternative energy that’s not bad for the environment,” Knisely, now 58 and a partner in a management consulting company, recalled in a recent interview.

Knisely projected that unless fossil fuel use was constrained, there would be “noticeable temperature changes” and 400 parts per million of carbon dioxide (CO2) in the air by 2010, up from about 280 ppm before the Industrial Revolution.


Through much of the 1980s, Exxon researchers worked alongside university and government scientists to generate objective climate models that yielded papers published in peer-reviewed journals. Their work confirmed the emerging scientific consensus on global warming’s risks.

Yet starting in 1989, Exxon leaders went down a different road. They repeatedly argued that the uncertainty inherent in computer models makes them useless for important policy decisions. Even as the models grew more powerful and reliable, Exxon publicly derided the type of work its own scientists had done. The company continued its involvement with climate research, but its reputation for objectivity began to erode as it campaigned internationally to cast doubt on the science.


Climate ‘Catastrophe’ Foreseen

By 1981, Exxon scientists were no longer questioning whether the buildup of CO2 would cause the world to heat up. Through their own studies and their participation in government-sponsored conferences, company researchers had concluded that rising CO2 levels could create catastrophic impacts within the first half of the 21st century if the burning of oil, gas and coal wasn’t contained.


Unanimous Agreement

“Over the past several years a clear scientific consensus has emerged regarding the expected climatic effects of increased atmospheric CO2,” Cohen wrote to A.M. Natkin of Exxon Corporation’s Science and Technology Office in 1982. “The consensus is that a doubling of atmospheric CO2 from its pre-industrial revolution value would result in an average global temperature rise of 3.0 ± 1.5°C.” (Equal to 5.4 ± 2.7°F).

“There is unanimous agreement in the scientific community that a temperature increase of this magnitude would bring about significant changes in the earth’s climate, including rainfall distribution and alterations in the biosphere.”

Exxon’s own modeling research confirmed this and the company’s results were later published in at least three peer-reviewed science articles. Two of them were co-authored by Hoffert, and a third was written entirely by Flannery.

Exxon’s modeling experts also explained away the less-dire predictions of a 1979 study led by Reginald Newell, a prominent atmospheric scientist at the Massachusetts Institute of Technology. Newell’s model projected that the effects of climate change would not be as severe as most scientists were predicting.

Specifically, Newell and a co-author from the Air Force named Thomas Dopplick challenged the prevailing view that a doubling of the earth’s CO2 blanket would raise temperatures about 3°C (5°F) – a measure known as climate sensitivity. Instead, they said the earth’s true climate sensitivity was roughly less than 1°C (2°F).
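The Fahrenheit figures in parentheses are conversions of temperature *differences*, which scale by 9/5 with no 32° offset; the quoted “5°F” and “2°F” are loose roundings. A quick illustrative check (not from the source):

```python
def delta_c_to_f(delta_c):
    """Convert a temperature *difference* from Celsius to Fahrenheit (factor 9/5, no offset)."""
    return delta_c * 9.0 / 5.0

print(delta_c_to_f(3.0))  # 5.4 -- central value of the "3.0 +/- 1.5 C" consensus estimate
print(delta_c_to_f(1.5))  # 2.7 -- its stated uncertainty
print(delta_c_to_f(1.0))  # 1.8 -- Newell and Dopplick's ~1 C figure, loosely rounded to "2 F" above
```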



I have yet to find any Exxon models… much less any that confirmed a “Global Warming Consensus” or a “Climate ‘Catastrophe.’” What I have found are reports which cite other people’s models and quite a few “cartoons” derived from them.

Same as it ever was.

“When you come to a fork in the road, take it!” (Lawrence “Yogi” Berra)

Half wrong.

Exxon: The Fork Not Taken

It’s notable that Exxon was made aware of the so-called consensus…

“The consensus is that a doubling of atmospheric CO2 from its pre-industrial revolution value would result in an average global temperature rise of 3.0 ± 1.5°C.”

And they were also made aware of reality…

“Newell and a co-author from the Air Force named Thomas Dopplick … said the earth’s true climate sensitivity was roughly less than 1°C.”

Inside Climate likes to make a big deal out of this…

Exxon’s former chairman and CEO, Lee Raymond, took an even tougher line against climate science. Speaking before the World Petroleum Congress in Beijing in 1997, Raymond mocked climate models in an effort to stop the imminent adoption of the Kyoto Protocol, an international accord to reduce emissions.

“They are notoriously inaccurate,” Raymond said. “1990’s models were predicting temperature increases of two to five degrees Celsius by the year 2100,” he said, without explaining the source of those numbers. “Last year’s models say one to three degrees. Where to next year?”

Mr. Raymond was correct. The models have been “notoriously inaccurate.” However, they have been very precise in their inaccuracies…

What did ExxonMobil Know and when did they know it? (Part Deux) “Same as it ever was.”

October 23, 2015

If you thought Part 1 was a doozy, “you ain’t seen nothing yet”…

Exxon Believed Deep Dive Into Climate Research Would Protect Its Business
Outfitting its biggest supertanker to measure the ocean’s absorption of carbon dioxide was a crown jewel in Exxon’s research program.

Neela Banerjee, Lisa Song, David Hasemyer
Sep 21, 2015

In 1981, 12-year-old Laura Shaw won her seventh-grade science fair at the Solomon Schechter Day School in Cranford, N.J. with a project on the greenhouse effect.

For her experiment, Laura used two souvenir miniatures of the Washington Monument, each with a thermometer attached to one side. She placed them in glass bowls and covered one with plastic wrap – her model of how a blanket of carbon dioxide traps the reflected heat of the sun and warms the Earth. When she turned a lamp on them, the thermometer in the plastic-covered bowl showed a higher temperature than the one in the uncovered bowl.

If Laura and her two younger siblings were unusually well-versed in the emerging science of the greenhouse effect, as global warming was known, it was because their father, Henry Shaw, had been busily tracking it for Exxon Corporation.


Henry Shaw was part of an accomplished group at Exxon tasked with studying the greenhouse effect. In the mid-70s, documents show that Shaw was responsible for seeking out new projects that were “of national significance,” and that could win federal funding. Others included Edward E. David, Jr., a former science advisor to President Richard Nixon, and James F. Black, who worked on hydrogen bomb research at Oak Ridge National Laboratory in the 1950s.

Black, who died in 1988, was among the first Exxon scientists to become acquainted with the greenhouse effect. Esso, as Exxon was known when he started, allowed him to pursue personal scientific interests. Black was fascinated by the idea of intentionally modifying weather to improve agriculture in arid countries, said his daughter, Claudia Black-Kalinsky.

“He believed that big science could save the world,” she said. In the early 1960s, Black helped draft a National Academy of Sciences report on weather and climate modification. Published in 1966, it said the buildup of carbon dioxide in the atmosphere “agrees quite well with the rate of its production by man’s consumption of fossil fuels.”

In the same period, a report for President Lyndon Johnson from the President’s Science Advisory Council in 1965 said the burning of fossil fuels “may be sufficient to produce measurable and perhaps marked changes in climate” by the year 2000.

By 1977, Black had become a top technical expert at Exxon Research & Engineering, a research hub based in Linden, N.J., and a science advisor to Exxon’s top management. That year he made a presentation to the company’s leading executives warning that carbon dioxide accumulating in the upper atmosphere would warm the planet, and that if the CO2 concentration continued to rise, it could harm the environment and humankind.



Firstly, the Earth’s atmosphere is not air in a jar.

Secondly, the Black presentation was dated 1978.

Thirdly, the Black presentation was just another survey of government and academic publications on the so-called greenhouse effect.

Here’s what Exxon knew in 1978…

Exxon knew that most government and academic scientists wanted more research money.


“Same as it ever was…”



In 1978, Exxon knew that the effects on sea level and the polar ice caps would likely be negligible, that models were useless, and that more effort should be directed at paleoclimatology.


“Same as it ever was…”

In 1978, Exxon knew that the models were useless.


“Same as it ever was…”

Inside Climate then bemoaned the fact that Exxon management scrubbed a science project…

Exxon’s enthusiasm for the project flagged in the early ’80s when federal funds fell through. Exxon Research cancelled the tanker project in 1982, but not before Garvey, Shaw and other company engineers published an initial paper in a highly specialized journal on the project’s methodology.

“We were anxious to get the word out that we were doing this study,” Garvey said of the paper, which did not reach sweeping conclusions. “The paper was the first of what we hoped to be many papers from the work,” he said in a recent email. But the other publications never materialized.

I never worked for “big oil,” however, “little oil” tries to avoid spending money on science projects.

What did ExxonMobil Know and when did they know it? Part 1

October 22, 2015

Maybe ExxonMobil should file a RICO lawsuit against the “Shukla 20” and this gentleman…

Exxon Knew Everything There Was to Know About Climate Change by the Mid-1980s—and Denied It
And thanks to their willingness to sucker the world, the world is now a chaotic mess.

By Bill McKibben

A few weeks before the last great international climate conference—2009, in Copenhagen—the e-mail accounts of a few climate scientists were hacked and reviewed for incriminating evidence suggesting that global warming was a charade. Eight separate investigations later concluded that there was literally nothing to “Climategate,” save a few sentences taken completely out of context—but by that time, endless, breathless media accounts about the “scandal” had damaged the prospects for any progress at the conference.

Now, on the eve of the next global gathering in Paris this December, there’s a new scandal. But this one doesn’t come from an anonymous hacker taking a few sentences out of context. This one comes from months of careful reporting by two separate teams, one at the Pulitzer Prize–winning website Inside Climate News, and the other at the Los Angeles Times (with an assist from the Columbia Journalism School). Following separate lines of evidence and document trails, they’ve reached the same bombshell conclusion: ExxonMobil, the world’s largest and most powerful oil company, knew everything there was to know about climate change by the mid-1980s, and then spent the next few decades systematically funding climate denial and lying about the state of the science.



These folks are so desperate to create a tobacco company analogy that they will resort to bald-faced lies.


Shell to quit US Arctic due to “unpredictable federal regulatory environment”

September 29, 2015

Guest post by David Middleton

Disappointing results from an initial rank wildcat can’t kill a play.

The cyclical ups and downs of product prices can’t kill a play.

High operating costs can’t kill a play.

Only massively incompetent government can kill a play.

“This is a clearly disappointing exploration outcome,” Marvin Odum, director of Shell’s Upstream Americas unit, said in a statement. While indications of oil and gas were present in the Burger J well in Alaska’s Chukchi Sea, they weren’t sufficient to warrant further exploration, the company said. Shell will now plug and abandon the well.

Shell had planned a two-year drilling program starting this July. The company was seeking to resume work halted in 2012 when its main drilling rig ran aground and was lost. It was also fined for air pollution breaches. The Anglo-Dutch company first discovered oil and gas in the region in the late 1980s.

The company continues to see potential in the region and the decision not to explore further in Alaskan waters “reflects both the Burger J well result, the high costs associated with the project, and the challenging and unpredictable federal regulatory environment in offshore Alaska,” according to the statement.


The potential of the Alaska OCS is nearly as large as the Central Gulf of Mexico…

Product prices and exploration results are, by their nature, unpredictable. Operating costs are tied to product prices and regulatory requirements. Regulatory requirements must be predictable in order for any business to function.

Answer to Jon Chambers

September 8, 2015

Jonathan Chambers

Hey Mid. Received the below from a friend of a friend. Went on line to refute this but all the articles say a 1.4 degree change in temperature is a bad thing. Starts a Little Ice Age.

Anything you can offer to concretely snuff this guy?

Climate change is cyclical. I agree 100%. Ice ages come and go. If you follow this thought process though, the data since the industrial revolution has us coming out of an ice age faster and more drastically than ever. Climate deniers are my favorite kinds of Republicans

He already snuffed himself better than I could.  His ignorance appears to be manifest.

We are still in an “ice age.” We are just fortunate enough to be living in an interglacial stage known as the Holocene.  Geologically speaking, this is an “ice age”… We’ve been in an ice age since the beginning of the Oligocene (~30 MYA)…

(Older is to the right.)

This particular ice age period (the Quaternary) has been particularly cold. Over the most recent ~2 million years, glacial cycles have had periods of ~100,000 years, punctuated by brief (15-40 KY) interglacial stages. The last glacial maximum occurred about 20,000 years ago.

We are currently in an interglacial stage known as the Holocene. It began about 10,000 years ago and peaked during the Holocene Climatic Optimum (4,000 to 7,000 years ago). Based on analyses of the Milankovitch cycles, some people think the Holocene will be a long interglacial stage. In which case, we have about 30,000 years of balmy weather remaining. If this is a typical interglacial, we only have about 5,000 years left. Even then, it will take thousands of years for the ice to build up and advance across Canada. Places like Chicago might still have 50,000 years remaining to plan on how to deal with a mile-high sheet of ice bulldozing its way south across the Midwest.

Like all other significant biomass features on Earth, humans do affect natural processes. We amplify some processes and attenuate others. Some studies have suggested that AGW has already staved off a decline into the next glacial stage (Ruddiman). Others have suggested that AGW will delay or even prevent the next glacial stage. I think it might delay it by a few centuries, maybe even more than 1,000 years; but it can’t prevent it. The greenhouse effect can’t trap heat that never arrived from the Sun. Atmospheric CO2 remained elevated long after the Sangamonian (Eemian, MIS 5) interglacial stage began cooling toward the last glacial maximum.

When insolation declines, less solar energy reaches the Earth and there is less heat for CO2 and other GHG’s to trap. Holocene insolation peaked during the Holocene Climatic Optimum…

(Older is to the left.)

Insolation has been declining since the Holocene Climatic Optimum. While we still can benefit from episodic periods of millennial scale warming, the long-term trend is toward the inevitable next deep freeze.

We came out of the most recent glacial stage at the end of the Pleistocene (~10,000 yrs ago).

The Holocene has been a heck of a lot more stable than the preceding Pleistocene glacial episode (the x-axes of the next three graphs are denominated in calendar years – Today is to the right)…

But it has been far from stable…

The end-Pleistocene deglaciation was far more drastic than any climatic change in the Holocene, including the industrial revolution (AKA Anthropocene).  The Earth’s climate has generally been warming since 1600 AD, the coldest part of a period known as the Little Ice Age (LIA).  The LIA is part of a naturally occurring millennial scale climate oscillation which has dominated Holocene climate change and was quite possibly the coldest phase of the Holocene since the 8.2 KYA Cooling Event. The LIA was characterized by maximum glacial advances and the most extensive sea ice coverage since the onset of the Neoglaciation (end of the Holocene Climatic Optimum).

The Little Ice Age may have been the coldest climatic period of the past 8,200 years.

While volcanic forcing may have played a role in the coldness of the LIA, it was clearly a cyclical cooling event. Much, if not all, of the warming since the late 16th century is clearly part of a millennial climate cycle.

All of the “global warming” from ~1600 AD through 2000 AD barely brought the climate back to “normal.”

There is nothing anomalous about recent warming unless you fraudulently splice high frequency instrumental data onto low frequency proxy reconstructions (AKA Hockey Sticks).

The following is a composite of several different posts I have written on the subject.  Hopefully it’s not too disjointed.

The rate of warming since ~1978 was no different than the rate of warming from ~1910 to 1945…

The recent warming was totally nonanomalous relative to changes of the prior 2,000 years…

Which were unrelated to atmospheric CO2 concentrations…

The Late Holocene climate has been characterized by a millennial scale cycle with a period of ~1,000 years and an amplitude of ~0.5 °C.

Figures 7 & 8. Both Moberg and Ljungqvist clearly demonstrate the millennial scale climate cycle.
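As a purely illustrative caricature (not the author’s or Ljungqvist’s model), the cycle described above can be sketched as a sinusoid, taking the ~0.5 °C as peak-to-peak amplitude and pinning a peak near the Medieval Warm Period:

```python
import math

PERIOD_YR = 1000.0       # approximate cycle length
HALF_AMPLITUDE_C = 0.25  # ~0.5 C peak-to-peak

def millennial_anomaly(year_ad):
    """Toy sinusoidal temperature anomaly (deg C), peak pinned near AD 1000."""
    return HALF_AMPLITUDE_C * math.cos(2.0 * math.pi * (year_ad - 1000.0) / PERIOD_YR)

print(round(millennial_anomaly(1000), 2))  # 0.25  (Medieval Warm Period peak)
print(round(millennial_anomaly(1500), 2))  # -0.25 (Little Ice Age trough)
print(round(millennial_anomaly(2000), 2))  # 0.25  (modern warm phase)
```

The period, amplitude, and phase here are illustrative round numbers taken from the paragraph above, nothing more.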

These cycles even have names…

Figure 9. Ljungqvist with climatic period nomenclature.

These cycles have been long recognized by Quaternary geologists…

Figure 10. The millennial scale climate cycle can clearly be traced back to the end of the Holocene Climatic Optimum and the onset of the Neoglaciation.

Fourier analysis of the GISP2 ice core clearly demonstrates that the millennial scale climate cycle is the dominant signal in the Holocene (Davis & Bohling, 2001). It is pervasive throughout the Holocene (Bond et al., 1997).

Figure 11. The Holocene climate has been dominated by a millennial scale climate cycle.

The industrial era climate has not changed in any manner inconsistent with the well-established natural millennial scale cycle. Assuming that the ice core CO2 is reliable, the modern rise in CO2 has had little, if any, effect on climate…

Figure 12. Why would CO2 suddenly start driving climate change in the 19th century?

While the climate may have warmed by 0.2 to 0.4 °C more than what might be expected to occur in a 100% natural warming phase of the millennial cycle, all of the apparent excess warming may very well be due to resolution differences between the instrumental and proxy data…

Figure 13. Ljungqvist demonstrates that the modern warming has not unambiguously exceeded the range of natural variability. The bold black dashed line is the instrumental record. I added the red lines to highlight the margin of error.

According to Ljungqvist…

The amplitude of the reconstructed temperature variability on centennial time-scales exceeds 0.6°C. This reconstruction is the first to show a distinct Roman Warm Period c. AD 1-300, reaching up to the 1961-1990 mean temperature level, followed by the Dark Age Cold Period c. AD 300-800. The Medieval Warm Period is seen c. AD 800–1300 and the Little Ice Age is clearly visible c. AD 1300-1900, followed by a rapid temperature increase in the twentieth century. The highest average temperatures in the reconstruction are encountered in the mid to late tenth century and the lowest in the late seventeenth century. Decadal mean temperatures seem to have reached or exceeded the 1961-1990 mean temperature level during substantial parts of the Roman Warm Period and the Medieval Warm Period. The temperature of the last two decades, however, is possibly higher than during any previous time in the past two millennia, although this is only seen in the instrumental temperature data and not in the multi-proxy reconstruction itself.


The proxy reconstruction itself does not show such an unprecedented warming but we must consider that only a few records used in the reconstruction extend into the 1990s. Nevertheless, a very cautious interpretation of the level of warmth since AD 1990 compared to that of the peak warming during the Roman Warm Period and the Medieval Warm Period is strongly suggested.


The amplitude of the temperature variability on multi-decadal to centennial time-scales reconstructed here should presumably be considered to be the minimum of the true variability on those time-scales.

Ljungqvist is recommending caution in comparing the modern instrumental record to the older proxy reconstructions because the proxy data are of much lower resolution. The proxy data are showing the “minimum of the true variability on those time-scales.” The instrumental data are depicting something closer to actual variability. Even then, the instrumental record doesn’t exceed the margin of error for the proxy data during the peak of the Medieval Warm Period. With a great deal of confidence, perhaps even 67%, it can be concluded that at least half, perhaps all, of the modern warming is the result of quasi-periodic natural climate fluctuations (AKA cycles).

Ljungqvist said that “the temperature since AD 1990 is, however, possibly higher than during any previous time in the past two millennia if we look at the instrumental temperature data spliced to the proxy reconstruction.” The dashed red curve is the instrumental data set…

The instrumental data always show faster (higher amplitude) temperature variations than the proxy reconstructions…

Many multi-decadal periods have warmed at 7-13 times the background rate.  Many 100-yr periods of warming and cooling exceeded the 10,000 year average.

The pronounced spike at 62-65 years on this power spectrum means almost any 100-yr period has a far greater rate and magnitude of warming or cooling than the 10,000-yr average…
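The kind of Fourier analysis cited above (Davis & Bohling, 2001) can be illustrated on synthetic data: build an annual series with a hidden 64-year cycle plus noise and confirm that a periodogram recovers a spike in the 62-65-year band. All numbers below are made up for the demonstration:

```python
import numpy as np

rng = np.random.default_rng(0)
n_years = 1024                      # power of two for a clean FFT
t = np.arange(n_years)
# hidden 64-year cycle plus white noise
series = np.sin(2 * np.pi * t / 64.0) + 0.3 * rng.standard_normal(n_years)

spectrum = np.abs(np.fft.rfft(series - series.mean())) ** 2
freqs = np.fft.rfftfreq(n_years, d=1.0)   # cycles per year
peak = np.argmax(spectrum[1:]) + 1        # skip the zero-frequency bin
print(1.0 / freqs[peak])                  # dominant period in years -> 64.0
```

The same approach applied to a real record (e.g., GISP2) would of course need the data and careful detrending; this only demonstrates the mechanics.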

“Science Isn’t Broken” (Except when it is)

August 28, 2015

From FiveThirtyEight Science…


If you follow the headlines, your confidence in science may have taken a hit lately.

Peer review? More like self-review. An investigation in November uncovered a scam in which researchers were rubber-stamping their own work, circumventing peer review at five high-profile publishers.

Scientific journals? Not exactly a badge of legitimacy, given that the International Journal of Advanced Computer Technology recently accepted for publication a paper titled “Get Me Off Your (Fracking) Mailing List,” whose text was nothing more than those seven words, repeated over and over for 10 pages. Two other journals allowed an engineer posing as Maggie Simpson and Edna Krabappel to publish a paper, “Fuzzy, Homogeneous Configurations.”

Revolutionary findings? Possibly fabricated. In May, a couple of University of California, Berkeley, grad students discovered irregularities in Michael LaCour’s influential paper suggesting that an in-person conversation with a gay person could change how people felt about same-sex marriage. The journal Science retracted the paper shortly after, when LaCour’s co-author could find no record of the data.

Taken together, headlines like these might suggest that science is a shady enterprise that spits out a bunch of dressed-up nonsense…



While there are a lot of problems with the peer-review process and a population explosion of journals which will readily publish abject bullschist, I think the bigger, more serious scientific breakdown is in journalism and press releases, as it pertains to the public. Most people never read the abstracts of the papers, much less the actual papers.


Here’s an example…

Green Business | Wed Aug 26, 2015 4:10pm EDT
Global sea levels climbed 3 inches since 1992, NASA research shows

Sea levels worldwide rose an average of nearly 3 inches (8 cm) since 1992, the result of warming waters and melting ice, a panel of NASA scientists said on Wednesday.

In 2013, a United Nations panel predicted sea levels would rise from 1 to 3 feet (0.3 to 0.9 meters) by the end of the century. The new research shows that sea level rise most likely will be at the high end of that range, said University of Colorado geophysicist Steve Nerem.

Sea levels are rising faster than they did 50 years ago and “it’s very likely to get worse in the future,” Nerem said.




First, the unbroken science…

Nerem, R. S., D. Chambers, C. Choe, and G. T. Mitchum. “Estimating Mean Sea Level Change from the TOPEX and Jason Altimeter Missions.” Marine Geodesy 33, no. 1 supp 1 (2010): 435.

Dr. Nerem’s science does support 3 inches of sea level rise since 1992.

Now for the broken science…

In 2013, a United Nations panel predicted sea levels would rise from 1 to 3 feet (0.3 to 0.9 meters) by the end of the century. The new research shows that sea level rise most likely will be at the high end of that range, said University of Colorado geophysicist Steve Nerem.

Sea levels are rising faster than they did 50 years ago and “it’s very likely to get worse in the future,” Nerem said.

Sea level has been rising at a rate of about 3 mm per year since the Jason/Topex missions started flying.

The IPCC says that sea level will rise by 300 to 900 mm by the end of this century. Dr. Nerem says that his work indicates that the sea level rise will be at the high end of that range. Since we are 15 years into this century with about 45 mm of sea level rise “in the bank,” sea level would have to rise by 855 mm over the next 85 years to hit the high end. That is 10 mm per year. The end-Pleistocene deglaciation caused sea level to rise by ~10 mm/yr for about 10,000 years…

The animation above is of the end-Pleistocene deglaciation (AKA Holocene Transgression)…

All of the sea level rise since 1700 AD is circled at the right hand side of the graph.
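The back-of-the-envelope arithmetic above (900 mm high end, ~45 mm already “in the bank,” 85 years remaining) can be checked directly:

```python
HIGH_END_MM = 900.0   # IPCC high-end rise for the 21st century
BANKED_MM = 45.0      # ~15 years at ~3 mm/yr already observed
YEARS_LEFT = 85.0     # 2015 through 2100

remaining_mm = HIGH_END_MM - BANKED_MM
required_rate = remaining_mm / YEARS_LEFT
print(remaining_mm)             # 855.0 mm still needed
print(round(required_rate, 1))  # 10.1 mm/yr required, vs ~3 mm/yr observed
```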

The only way sea level rise could approach the high end of the IPCC range is if it exponentially accelerates…

The rate from 2081-2100 would have to average 20 mm per year, twice that of the Holocene Transgression. This is only possible in bad science fiction movies.

Broken science, part deux…

Sea levels are rising faster than they did 50 years ago…

They are rising faster than they were 50 years ago. However, they are rising at the same rate that they were 80, 70 and 60 years ago…



There is nothing abnormal about sea level rising by 3 inches over a 23-yr period, nor is a 3 mm/yr rise over a multi-decade period unusual. There is simply no anomaly requiring an explanation. The claim that the 3 inches of sea level rise from 1992-2015 is in line with 3 feet of sea level rise in the 21st century is patently false. The accurate statement that sea level is rising faster now than it was 50 years ago is cherry-picking of the highest order. Warning that “it’s very likely to get worse in the future” is the scientific equivalent of shouting “Fire!” in a crowded movie theater because you constructed a model which predicts that the projection system will burst into flames if it malfunctions at some point in the future.

Miocene (~20 MYA) lizards, preserved in amber, “identical to their modern cousins.”

July 28, 2015

Truly amazing…

Ancient lizards in amber amaze scientists
Tuesday, 28 July 2015 Stuart Gary

A community of lizards from the Caribbean, preserved for 20 million years in amber, have been found to be identical to their modern cousins, say researchers.

This suggests the different niches inhabited by the lizards have – incredibly – changed little over the past 20 million years, report the team, in this week’s Proceedings of the National Academy of Sciences.



I guess the Gorebots are wrong about climate change endangering lizards…

(Older is to the right.)

Gavin says the funniest things…

June 5, 2015

NOAA temperature record updates and the ‘hiatus’


— gavin @ 4 June 2015

In a new paper in Science Express, Karl et al. describe the impacts of two significant updates to the NOAA NCEI (née NCDC) global temperature series. The two updates are: 1) the adoption of ERSST v4 for the ocean temperatures (incorporating a number of corrections for biases for different methods), and 2) the use of the larger International Surface Temperature Initiative (ISTI) weather station database, instead of GHCN. This kind of update happens all the time as datasets expand through data-recovery efforts and increasing digitization, and as biases in the raw measurements are better understood. However, this update is going to be bigger news than normal because of the claim that the ‘hiatus’ is no more. To understand why this is perhaps less dramatic than it might seem, it’s worth stepping back to see a little context…

Global temperature anomaly estimates are a product, not a measurement

The first thing to remember is that an estimate of how much warmer one year is than another in the global mean is just that, an estimate. We do not have direct measurements of the global mean anomaly, rather we have a large database of raw measurements at individual locations over a long period of time, but with an uneven spatial distribution, many missing data points, and a large number of non-climatic biases varying in time and space. To convert that into a useful time-varying global mean needs a statistical model, good understanding of the data problems and enough redundancy to characterise the uncertainties. Fortunately, there have been multiple approaches to this in recent years (GISTEMP, HadCRUT4, Cowtan & Way, Berkeley Earth, and NOAA NCEI), all of which basically give the same picture.


The ‘hiatus’ is so fragile that even those small changes make it disappear


– See more at: http://www.realclimate.org/index.php/archives/2015/06/noaa-temperature-record-updates-and-the-hiatus/

If “the ‘hiatus’ is so fragile that even those small changes make it disappear,” the underlying trend must also be fragile, if small data fudges are the difference between hiatus and the road to AGW calamity.

In the meantime…

Anatomy of a Collapsing Paradigm

March 18, 2015


Paradigm:

A framework containing the basic assumptions, ways of thinking, and methodology that are commonly accepted by members of a scientific community.

Paradigm Shift:

These examples point to the third and most fundamental aspect of the incommensurability of competing paradigms. In a sense that I am unable to explicate further, the proponents of competing paradigms practice their trades in different worlds. One contains constrained bodies that fall slowly, the other pendulums that repeat their motions again and again. In one, solutions are compounds, in the other mixtures. One is embedded in a flat, the other in a curved, matrix of space. Practicing in different worlds, the two groups of scientists see different things when they look from the same point in the same direction. Again, that is not to say that they can see anything they please. Both are looking at the world, and what they look at has not changed. But in some areas they see different things, and they see them in different relations one to the other. That is why a law that cannot even be demonstrated to one group of scientists may occasionally seem intuitively obvious to another. Equally, it is why, before they can hope to communicate fully, one group or the other must experience the conversion that we have been calling a paradigm shift. Just because it is a transition between incommensurables, the transition between competing paradigms cannot be made a step at a time, forced by logic and neutral experience. Like the gestalt switch, it must occur all at once (though not necessarily in an instant) or not at all.

–Thomas Kuhn, 1962. The Structure of Scientific Revolutions. Vol. II, No. 2 p. 150

What is the current paradigm?

  • Human activities, primarily carbon dioxide emissions, have been the primary cause of the observed global warming over the past 50 to 150 years.
  • The atmospheric carbon dioxide concentration had stabilized between 270 and 280 ppmv early in the Holocene and had remained in that range prior to the mid-19th century when fossil fuels became the primary energy source of the Industrial Revolution.
  • Anthropogenic carbon dioxide emissions are causing the atmospheric concentration to rise at a dangerously rapid pace to levels not seen in hundreds of thousands to millions of years.
  • The climate sensitivity to a doubling of pre-industrial carbon dioxide concentration “is likely to be in the range of 2 to 4.5°C with a best estimate of about 3°C, and is very unlikely to be less than 1.5°C,” possibly even much higher than 4.5°C.
  • Immediate, deep reductions in greenhouse gas emissions are necessary in order to stave off catastrophic climate change.
  • The scientific consensus regarding this paradigm is overwhelming (~97%).

Why is the paradigm collapsing?

  • There has been no increase in the Earth’s average surface temperature since the late 20th century.
  • Every measure of pre-industrial carbon dioxide, not derived from Antarctic ice cores, indicates a higher and more variable atmospheric concentration.
  • The total lack of predictive skill in AGW climate models.
  • An ever-growing body of observation-based studies indicating that the climate sensitivity is in the range of 0.5 to 2.5°C with a best estimate of 1.5 to 2°C, and is very unlikely to be more than 2°C.
  • Clear evidence that the dogmatic insistence of scientific unanimity is at best highly contrived and at worst fraudulent.

The paradigm is collapsing primarily due to the fact that the climate appears to be far less sensitive to changes in atmospheric carbon dioxide concentrations than the so-called scientific consensus had assumed.

One group of scientists has steadfastly resisted the carbon dioxide-driven paradigm: Geologists, particularly petroleum geologists. As Kuhn wrote,

“Practicing in different worlds, the two groups of scientists see different things when they look from the same point in the same direction. Again, that is not to say that they can see anything they please. Both are looking at the world, and what they look at has not changed. But in some areas they see different things, and they see them in different relations one to the other. That is why a law that cannot even be demonstrated to one group of scientists may occasionally seem intuitively obvious to another.”

Petroleum geologists tend to be sedimentary geologists and sedimentary geology is essentially a combination of paleogeography and paleoclimatology. Depositional environments are defined by physical geography and climate. We literally do practice in a different world, the past. Geologists intuitively see Earth processes as cyclical and also tend to look at things from the perspective of “deep time.” For those of us working the Gulf of Mexico, we “go to work” in a world defined by glacioeustatic and halokinetic processes and, quite frankly, most of us don’t see anything anomalous in recent climate changes.

So, it should come as little surprise that geoscientists have consistently been far more likely to think that modern climate changes have been driven by overwhelmingly natural processes…

APEGA is the organization responsible for certifying and licensing professional geoscientists and engineers in Alberta, Canada.

This study is very interesting because it analyzes the frames of reference (Kuhn’s “different worlds”) in which opinions are formed. Skeptical geologists are most likely to view climate change as overwhelmingly natural. Skeptical engineers are more likely to view it as a matter of economics or fatalism. The cost of decarbonization would far outweigh any benefits and/or would have no measurable effect on climate change.

The Obsession With Consensus

In nearly 40 years as an Earth Scientist (counting college), I have never seen such an obsession with consensus. In geology, there are many areas in which there are competing hypotheses; yet there is no obsession with conformance to a consensus.

The acceptance of plate tectonics was relatively new when I was a student; geology had only recently shifted from the geosynclinal theory to plate tectonics. We still learned the geosynclinal theory in Historical Geology, and it still has value today. However, I don’t ever recall papers being published claiming a consensus regarding either theory.

Most geologists think that granite is an igneous rock and that petroleum is of organic origin. Yet, the theories of granitization and abiogenic hydrocarbon formation are not ridiculed; nor are the adherents subjected to “witch hunts.”

One of the most frequent methods of attempting to quantify and justify the so-called consensus on climate change has been the abstract search (second hand opinions). I will only bother to review one of these exercises in logical fallacy, Cook et al., 2013.

Second Hand Opinions.

These sorts of papers consist of abstract reviews. The authors then tabulate their opinions regarding whether or not the abstracts support the AGW paradigm. As Legates et al., 2013 pointed out, Cook defined the consensus as “most warming since 1950 is anthropogenic.” Cook then relied on three different levels of “endorsement” of that consensus and excluded 67% of the abstracts reviewed because they neither endorsed nor rejected the consensus.
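For reference, the arithmetic behind Cook et al.’s headline number can be sketched from the paper’s published category shares. The rounded percentages used below are an approximation of the published figures, quoted here as assumptions rather than re-derived:

```python
# Approximate category shares from Cook et al. (2013): of 11,944 abstracts,
# ~66.4% took no position, ~32.6% endorsed, ~0.7% rejected, ~0.3% were uncertain.
total = 11944
endorse   = round(0.326 * total)
reject    = round(0.007 * total)
uncertain = round(0.003 * total)

# The headline ~97% only appears after the no-position abstracts are excluded:
stated = endorse + reject + uncertain
pct_of_stated = 100 * endorse / stated
pct_of_all    = 100 * endorse / total

print(f"Endorsing, as % of abstracts stating a position: {pct_of_stated:.1f}%")
print(f"Endorsing, as % of all abstracts reviewed:       {pct_of_all:.1f}%")
```

The same tally yields either a ~97% or a ~33% figure depending solely on whether the two-thirds of abstracts that took no position are counted in the denominator.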

The largest endorsement group was categorized as “implicitly endorses AGW without minimizing it.” They provided this example of an implied endorsement:

‘…carbon sequestration in soil is important for mitigating global climate change’

Carbon sequestration in soil, lime muds, trees, seawater, marine calcifers and a whole lot of other things have always been important for mitigating a wide range of natural processes. I have no doubt that I have implicitly endorsed the so-called consensus based on this example.

The second largest endorsement group was categorized as “explicitly endorses but does not quantify or minimize.” Pardon my obtuseness, but how in the heck can one explicitly endorse the notion that “most warming since 1950 is anthropogenic” without quantification? This is the example Cook provided:

‘Emissions of a broad range of greenhouse gases of varying lifetimes contribute to global climate change’

Wow! I contributed to Romney for President… Yet most of his campaign war chest didn’t come from me. By this subjective standard, I have probably explicitly endorsed AGW a few times.

No Schist, Sherlock.

One of the most frequent refrains is the assertion that “climate scientists” endorse the so-called consensus more than other disciplines and that the level of endorsement is proportional to the volume of publications by those climate scientists. Well… No schist, Sherlock! I would bet a good bottle of wine that the most voluminous publishers on UFOs are disproportionately more likely to endorse Close Encounters of the Third Kind as a documentary. A cursory search for “abiogenic hydrocarbons” in AAPG’s Datapages could lead me to conclude that there is a higher level of endorsement of abiogenic oil among those who publish on the subject than among non-publishing petroleum geologists.

These exercises in expertise cherry-picking are quite common. A classic example was Doran and Kendall Zimmerman, 2009. This survey sample was limited to academic and government Earth Scientists. It excluded all Earth Scientists working in private sector businesses. The two key questions were:

1. When compared with pre-1800s levels, do you think that mean global temperatures have generally risen, fallen, or remained relatively constant?

2. Do you think human activity is a significant contributing factor in changing mean global temperatures?

I would answer yes to #1 and my answer to #2 would depend on the meaning of “human activity is a significant contributing factor.” If I realized it was a “push poll,” I would answer “no.”

Interestingly, economic geologists and meteorologists were the most likely to answer “no” to question #2…

The two areas of expertise in the survey with the smallest percentage of participants answering yes to question 2 were economic geology with 47% (48 of 103) and meteorology with 64% (23 of 36).

The authors derisively dismissed the opinions of geologists and meteorologists…

It seems that the debate on the authenticity of global warming and the role played by human activity is largely nonexistent among those who understand the nuances and scientific basis of long-term climate processes.

No discipline has a better understanding of the “nuances” than meteorologists, and no discipline has a better understanding of the “scientific basis of long-term climate processes” than geologists.

The authors close with a “no schist, Sherlock” bar chart:

The most recent example of expertise cherry-picking was Stenhouse et al., 2014.

The 52% consensus among the membership of the American Meteorological Society was explained away as being due to “perceived scientific consensus,” “political ideology,” and a lack of “expertise” among non-publishing meteorologists and atmospheric scientists…

While we found that higher expertise was associated with a greater likelihood of viewing global warming as real and harmful, this relationship was less strong than for political ideology and perceived consensus. At least for the measure of expertise that we used, climate science expertise may be a less important influence on global warming views than political ideology or social consensus norms. More than any other result of the study, this would be strong evidence against the idea that expert scientists’ views on politically controversial topics can be completely objective.

Finally, we found that perceiving conflict at AMS was associated with lower certainty of global warming views, lower likelihood of viewing global warming as human caused, and lower ratings of predicted harm caused by global warming.

So… Clearly, 97% of AMS membership would endorse the so-called consensus if they were more liberal, more accepting of unanimity and published more papers defending failed climate models.  No schist, Sherlock!

What, exactly, is a “climate scientist”?

35 years ago, climatology was a branch of physical geography. Today’s climate scientists can be anything from atmospheric physicists and chemists, mathematicians, computer scientists, astronomers, astrophysicists, oceanographers, biologists, environmental scientists, ecologists, meteorologists, geologists, geophysicists and geochemists, to economists, agronomists, sociologists and/or public policy-ologists.

NASA’s top climate scientist for most of the past 35 years, James Hansen, is an astronomer. The current one, Gavin Schmidt, is a mathematician.

It seems to me that climate science is currently dominated by computer modelers, with little comprehension of the natural climate cycles which have driven climate change throughout the Holocene.

Climate scientist seems to be as nebulous as Cook’s definition of consensus.

What is the actual consensus?

The preliminary results of the AMS survey tell us all we need to know about the actual consensus…

89% × 59% = 52%… A far cry from the oft claimed 97% consensus.
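The arithmetic behind that figure can be laid out explicitly, using the survey shares quoted above (the exact product is 52.5%, which the post rounds to 52%):

```python
# Share of AMS respondents who said global warming is happening (~89%),
# and the share of those who attributed it mostly to humans (~59%).
# Both percentages are taken from the survey figures quoted in the post.
happening = 0.89
mostly_human_given_happening = 0.59

# Combined share endorsing "happening and mostly human-caused":
consensus = happening * mostly_human_given_happening
print(f"{consensus:.1%}")  # ≈52%, far below the oft-claimed 97%
```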

Based on the BAMS definition, global warming is happening. So I would be among the 89% who answered “yes” to question #1 and among the 5% who said the cause was mostly natural.

When self-described “climate scientists” and meteorologists/atmospheric scientists are segregated the results become even more interesting…

Only 45% of meteorologists and atmospheric scientists endorse the so-called consensus. When compared to the 2009 American Geophysical Union survey, the collapsing paradigm sticks out like a polar vortex…

In reality, about half of relevant scientists would probably agree that humans have been responsible for >50% of recent climate changes.  And there might even be a 97% consensus that human activities have contributed to recent climate changes.

However, there really isn’t any scientific consensus if it is defined this way:

So… Why is there such an obsession with a 97% consensus?  My guess is that it enables such demagoguery.

Science Lessons for Secretary of State John F. Kerry

March 16, 2015

Secretary of State John F. Kerry’s recent remarks on climate change at the Atlantic Council were so scientifically illiterate that I find it difficult to believe that he managed to barely get a D in geology at Yale University.  As a US citizen and geoscientist, I feel it is my patriotic and professional duty to provide Secretary Kerry with a few complimentary science lessons.

Let’s start with some basics

So stop for a minute and just think about the basics. When an apple falls from a tree, it will drop toward the ground. We know that because of the basic laws of physics. Science tells us that gravity exists, and no one disputes that. Science also tells us that when the water temperature drops below 32 degrees Fahrenheit, it turns to ice. No one disputes that.

So when science tells us that our climate is changing and humans beings are largely causing that change, by what right do people stand up and just say, “Well, I dispute that” or “I deny that elementary truth?” And yet, there are those who do so.


Well, Mr. Secretary… The Theory of Gravity can be empirically tested with such repeatability that it has become a Law and can be expressed with a simple equation… It can even be tested and confirmed on the Moon…

The freezing point of water (a phase transition) can also be empirically tested and demonstrated with ample repeatability. However, the freezing point of water is not always 32°F: it varies with pressure (and with dissolved solutes)…
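The pressure dependence can be quantified with the Clausius-Clapeyron relation along the ice-water melting curve, dT/dP = T·Δv/L. The property values below are standard handbook figures, not taken from the post:

```python
# Clausius-Clapeyron slope of water's melting curve at 1 atm.
T = 273.15            # melting point at 1 atm, K
v_water = 1.000e-3    # specific volume of liquid water, m^3/kg
v_ice   = 1.091e-3    # specific volume of ice Ih, m^3/kg
L = 3.34e5            # latent heat of fusion, J/kg

# Negative slope because ice is less dense than water (ice floats):
dT_dP = T * (v_water - v_ice) / L      # K per Pa
per_atm = dT_dP * 101325               # K per atmosphere of added pressure

print(f"Melting point shifts {per_atm * 1000:.1f} mK per atm")
```

The shift is tiny (~ -7.5 mK/atm) but real, which is the point: "32°F" is a statement about one particular pressure, not a universal constant.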

We do not accept gravity and phase transition because science tells us to. We accept these things because they can be empirically tested, repeatedly confirmed and form the bases of solid scientific theories.

Science tells us that climate has always changed and always will change. While the radiative forcing effect of CO2 is in roughly the same empirical ballpark as the freezing point of water, the notion that humans are the primary cause of recent climate changes is nothing but a hypothesis, and one which has failed almost every empirical test. This is why many scientists do not accept that this is “settled science.”

From Remote Sensing Systems [with my commentary]:

Over the past decade, we have been collaborating with Ben Santer at LLNL (along with numerous other investigators) to compare our tropospheric results with the predictions of climate models. Our results can be summarized as follows:

  • Over the past 35 years, the troposphere has warmed significantly. The global average temperature has risen at an average rate of about 0.13 degrees Kelvin per decade (0.23 degrees F per decade). [All of the warming occurred in one step-shift in the late 1990s.]
  • Climate models cannot explain this warming if human-caused increases in greenhouse gases are not included as input to the model simulation. [Only because climate models are programmed to do so. The models are programmed with very high sensitivities to CO2, then parameterized (fudged) with assumptions about the albedo effects of past anthropogenic aerosol emissions in order to retrocast past temperature changes. The climate models almost totally fail to incorporate cloud albedo effects and natural climate oscillations. This is why they lack predictive skill.]
  • The spatial pattern of warming is consistent with human-induced warming. See Santer et al. 2008, 2009, 2011, and 2012 for more about the detection and attribution of human-induced changes in atmospheric temperature using MSU/AMSU data. [Yep. Most of the warming is occurring at night and in the coldest air masses in the Northern Hemisphere.]


  • The climate has not warmed as fast as almost all climate models predict. [Because the models lack predictive skill.]

To illustrate this last problem, we show several plots below. Each of these plots has a time series of TLT temperature anomalies using a reference period of 1979-2008. In each plot, the thick black line is the measured data from RSS V3.3 MSU/AMSU Temperatures. The yellow band shows the 5% to 95% envelope for the results of 33 CMIP-5 model simulations (19 different models, many with multiple realizations) that are intended to simulate Earth’s Climate over the 20th Century.




RSS shows no warming since 1997…

In fairness, the models have demonstrated precision. They precisely miss the mark to the high side…

The first modern AGW model from 1988 has essentially proven that the climate is relatively insensitive to increasing atmospheric CO2. Subsequent models have confirmed that the hypothesis is wrong…

James Hansen, formerly of NASA-GISS, first proved that AGW was wrong 27 years ago. 

Back in 1988, he published a climate model that, when compared to his own temperature data, substantially disproves AGW…

GISTEMP has tracked the Hansen scenario in which a green utopia was achieved more than a decade ago.

Hansen’s model used an equilibrium climate sensitivity (ECS) of 4.2°C per doubling of pre-industrial CO2. The IPCC “consensus” is 3.0°C. Numerous recent papers have demonstrated that the ECS is less than 2.0°C.

“Scenario B” might be the most relevant prediction because CH4 and CFCs have followed closest to the “C” trajectory, while CO2 has tracked “A”.

Since CO2 tracked “A”, while CH4 and CFCs tracked “C” and temperature tracked below “C”… the atmosphere is far less sensitive to CO2 than Hansen modeled. The atmosphere was essentially insensitive to the ~50 ppmv rise in CO2 over the last 24 years.
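The sensitivity arithmetic in this passage can be sketched with the standard logarithmic forcing relation, ΔT = ECS × log2(C/C0). This is an equilibrium-only illustration: the CO2 concentrations are round illustrative numbers for a ~50 ppmv rise, and transient ocean-lag effects are deliberately ignored:

```python
import math

def equilibrium_warming(ecs, c0_ppm, c_ppm):
    """Equilibrium temperature response for a CO2 change under the
    standard logarithmic forcing relation: dT = ECS * log2(C/C0)."""
    return ecs * math.log2(c_ppm / c0_ppm)

# Illustrative concentrations spanning a ~50 ppmv rise (assumed values):
c0, c = 355.0, 405.0
for ecs in (4.2, 3.0, 1.5):   # Hansen's ECS, the IPCC best estimate, a low estimate
    print(f"ECS {ecs}°C -> equilibrium dT ~ {equilibrium_warming(ecs, c0, c):.2f}°C")
```

Because the response is logarithmic in concentration, the same 50 ppmv rise implies roughly 0.8°C of eventual warming at Hansen’s ECS of 4.2°C but under 0.3°C at an ECS of 1.5°C, which is why the assumed sensitivity dominates these comparisons.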

The following CMIP5 model was parameterized (fudged) to accurately retrocast HadCRUT4 from 1950-2004.

Within eight years, the observed temperature is on the verge of dropping out of the lower error band.

This model from Kaufmann et al., 2011 simulated natural and anthropogenic (primarily CO2) forcing mechanisms from 1999-2008. Natural forcing won by a score of 3-1.

A plot of the Wood for Trees Temperature Index against 12 years’ worth of IPCC model suites demonstrates that the AGW hypothesis has shown no predictive skill in nearly 30 years of testing…

On to basic math and reading skills

Now folks, we literally do not have the time to waste debating whether we can say “climate change.” We have to talk about how we solve climate change. Because no matter how much people want to bury their heads in the sand, it will not alter the fact that 97 percent of peer-reviewed climate studies confirm that climate change is happening and that human activity is largely responsible.

Well Mr. Secretary… The SkepSci bloggers who claimed the bogus 97% consensus don’t even assert that “97 percent of peer-reviewed climate studies confirm that climate change is happening and that human activity is largely responsible.”

The fact is that less than 1% “of peer-reviewed climate studies confirm that climate change is happening and that human activity is largely responsible.”

Now for a bit of history

Just look around you. Fourteen of the fifteen warmest years on record in all of history have occurred since 2000, in all of recorded history. Last year was the warmest of all. And I think if you stop and think about it, it seems that almost every next year becomes one of the hottest on record.

Wrong again, Mr. Secretary. While it is possible that “fourteen of the fifteen warmest years on record (since about 1850) have occurred since 2000”… “all of recorded history” goes back a bit farther than that. Recorded history goes back long before this 2,000-year climate reconstruction…


And now back to basic science

It’s not particularly complicated. I don’t mean to sound haughty, but think about it for a minute. Life on Earth would not exist without a greenhouse effect. That is what has kept the average temperature up, until recently, at 57 degrees Fahrenheit, because there is this greenhouse effect. And it was called the greenhouse effect because it does exactly what a greenhouse does. When the sun pours in and bounces off at a different angle, it goes back up at a different angle. That can’t escape, and that warms things – a very simple proposition.

Don’t worry Mr. Secretary… You don’t sound haughty. You sound like a guy who got a D in geology and would have gotten an F in physics.

Neither a greenhouse nor the greenhouse effect relies on the Law of Reflection or Snell’s Law… Nor does the greenhouse effect even behave like a greenhouse…

The greenhouse effect refers to circumstances where the short wavelengths of visible light from the sun pass through a transparent medium and are absorbed, but the longer wavelengths of the infrared re-radiation from the heated objects are unable to pass through that medium. The trapping of the long wavelength radiation leads to more heating and a higher resultant temperature.


A major part of the efficiency of the heating of an actual greenhouse is the trapping of the air so that the energy is not lost by convection. Keeping the hot air from escaping out the top is part of the practical “greenhouse effect”, but it is common usage to refer to the infrared trapping as the “greenhouse effect” in atmospheric applications where the air trapping is not applicable.
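As a check on the 57°F (≈287 K) surface temperature in Kerry’s remarks, the size of the greenhouse effect can be estimated with the Stefan-Boltzmann law. The solar constant and Bond albedo below are standard textbook values, assumed for this sketch:

```python
# Earth's radiative equilibrium temperature without an atmosphere,
# via the Stefan-Boltzmann law; the gap to the observed surface
# temperature is the net greenhouse effect.
S = 1361.0        # solar constant, W/m^2
albedo = 0.30     # Bond albedo
sigma = 5.67e-8   # Stefan-Boltzmann constant, W/m^2/K^4

absorbed = S * (1 - albedo) / 4.0     # absorbed flux averaged over the sphere
T_eff = (absorbed / sigma) ** 0.25    # radiative equilibrium temperature, K

T_surface = 288.0                     # approximate observed mean surface temp, K
print(f"Effective temperature: {T_eff:.0f} K; greenhouse warming: {T_surface - T_eff:.0f} K")
```

The ~255 K effective temperature versus the ~288 K surface gives the familiar ~33 K greenhouse effect, and nothing in that calculation involves reflection angles, which is the physical point being made here.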



The laws and theories of gravity and phase transition are not even remotely analogous to the fatally flawed AGW hypothesis.

97% of peer-reviewed climate studies do not conclude that humans are largely to blame for recent climate changes.

There is no evidence that 14 of the last 15 years are the warmest in all of recorded history.

A greenhouse works by retarding convective cooling.  The greenhouse effect works by retarding radiative cooling.  Secretary Kerry’s lack of scientific literacy will work by retarding our economy.

