April 25, 2017


The push to save U.S. nuclear plants for the sake of fighting climate change is threatening support for the bread and butter of clean power: wind and solar.

New York and Illinois have already approved as much as $10 billion in subsidies to keep struggling reactors open for the next decade as part of a plan to limit fossil fuel consumption. Lawmakers in Ohio, Connecticut and New Jersey are debating whether to do the same.

The reactors, which are being squeezed by low natural gas prices, offer a singular advantage in the fight against global warming because they produce round-the-clock electricity without emitting greenhouse gases. Yet renewable energy operators including NRG Energy Inc. and Invenergy LLC say keeping nuclear plants open will leave grids awash with excess power, leaving little demand for new wind and solar farms.

“It’s the wrong policy — and whether it proliferates or not is going to be a really big factor,” Invenergy Chief Operating Officer Jim Murphy said during a panel discussion at the Bloomberg New Energy Finance conference in New York Monday.



“Renewable energy operators say keeping nuclear plants open will leave grids awash with excess power, leaving little demand for new wind and solar farms.”


Keeping the “grids awash with excess power” is the only way to ride through extreme demand events without resorting to brownouts and blackouts.  Solar and wind can provide neither base load nor the flexible response such events require.  Increasing reliance on these resources makes it all the more imperative that we keep the “grids awash with excess power.”

Getting back to the Bloomberg article, there appears to be a lot of whining about subsidies for nuclear power, with the renewables crowd doing all of the whining:


Nuclear’s economic woes come as wind and solar are starting to show they’re cheap enough to compete with traditional generators, after years of help from subsidies. The push to aid reactors began last year after Exelon Corp. successfully argued in New York and Illinois that since nuclear does not contribute to global warming, its plants should receive a premium to help level the playing field with wind and solar.

“The fossil generators sell electricity with air pollution,” Joseph Dominguez, an Exelon executive vice president, said in an interview. “We sell electricity without air pollution — and that’s a different product.”

There are key differences between wind and solar subsidies and those for nuclear, according to clean-energy developers. Renewable energy credits have spurred an emerging industry, whereas nuclear subsidies are to preserve aging plants. And while wind and solar developers compete against each other for subsidies, those for nuclear benefit a single technology.

Market Rules

“The renewables industry has been playing by competitive market rules that have helped to produce good prices,” Amy Francetic, an Invenergy senior vice president, said in an interview. “This is picking winners and losers in a way that’s troubling.”


Nuclear power absolutely is the leader of the pack at reducing so-called “greenhouse” gas emissions:


Figure 1. Nuclear and gas kick @$$, wind breaks even and solar is a loser.


“The renewables industry has been playing by competitive market rules that have helped to produce good prices,” Amy Francetic, an Invenergy senior vice president, said in an interview. “This is picking winners and losers in a way that’s troubling.”

Really?  Ms. Francetic, *government* always picks “winners and losers in a way that’s troubling.”

As far as the renewables industry “playing by competitive market rules that have helped to produce good prices”…


Figure 2. Ms. Francetic, Data is laughing at you.

The most recent U.S. Energy Information Administration report on energy subsidies reveals the following:


Solar and wind power are insignificant sources of energy.


Figure 3a. U.S. Energy production by source 2010 & 2013 (trillion Btu), U.S. Energy Information Administration.


Figure 3b. U.S. primary energy production 1981-2015 (million tonnes of oil equivalent), BP 2016 Statistical Review of World Energy.


Solar and wind power receive massive Federal subsidies.


Figure 4. Federal subsidies by energy source 2010 and 2013 (million 2013 US dollars), U.S. Energy Information Administration.


The solar and wind subsidies are truly massive in $/Btu.


Figure 5. Subsidies per unit of energy by source ($/mmBtu), U.S. Energy Information Administration.


The true folly of solar power is most apparent in subsidies per kilowatt-hour of electricity generation.  At 23¢/kWh, the solar subsidies in 2013 were nearly twice the average U.S. residential retail electricity rate.


Figure 6. Subsidies per kilowatt-hour of electricity generation, U.S. Energy Information Administration.


Solar and wind subsidies are weighted toward direct expenditures of tax dollars.


Figure 7. Subsidies by type for wind, solar, nuclear, coal and natural gas & petroleum liquids, U.S. Energy Information Administration.  Table ES2.


Federal solar and wind subsidies were 3-4 times those of nuclear power in 2013.  Only 2% of the nuclear power subsidies consisted of direct expenditures, compared to 72% and 56% for solar and wind power respectively… And the renewables industry has the gall to complain about New York and Illinois kicking in $500 million and $235 million per year in extra subsidies to keep nuclear power plants running in their states.  Really?

Most of the Federal subsidies for oil & gas (96%), coal (71%) and nuclear power (67%) consist of tax breaks.  The subsidies for oil & gas aren’t really even subsidies.  These are standard tax deductions and depreciation of assets.

Solar power simply can’t work without massive subsidies.  While the economics of wind power are improving, renewables are still extremely expensive relative to existing coal and nuclear power plants.







“Hard Lessons From the Great Algae Biofuel Bubble”

April 20, 2017

Guest post by David Middleton


From 2005 to 2012, dozens of companies managed to extract hundreds of millions in cash from VCs in hopes of ultimately extracting fuel oil from algae.

CEOs, entrepreneurs and investors were making huge claims about the promise of algae-based biofuels; the U.S. Department of Energy was also making big bets through its bioenergy technologies office; industry advocates claimed that commercial algae fuels were within near-term reach.

Jim Lane of Biofuels Digest authored what was possibly history’s least accurate market forecast, projecting that algal biofuel capacity would reach 1 billion gallons by 2014. In 2009, Solazyme promised competitively priced fuel from algae by 2012. Algenol planned to make 100 million gallons of ethanol annually in Mexico’s Sonoran Desert by the end of 2009 and 1 billion gallons by the end of 2012 at a production rate of 10,000 gallons per acre. PetroSun looked to develop an algae farm network of 1,100 acres of saltwater ponds that could produce 4.4 million gallons of algal oil and 110 million pounds of biomass per year.

Nothing close to 1 billion (or even 1 million) gallons has yet been achieved — nor has competitive pricing.


The promise of algae is tantalizing. Some algal species contain up to 40 percent lipids by weight, a figure that could be boosted further through selective breeding and genetic modification. That basic lipid can be converted into diesel, synthetic petroleum, butanol or industrial chemicals.

According to some sources, an acre of algae could yield 5,000 to 10,000 gallons of oil a year, making algae far more productive than soy (50 gallons per acre), rapeseed (110 to 145 gallons), jatropha (175 gallons), palm (650 gallons), or cellulosic ethanol from poplars (2,700 gallons).


Green Tech Media

“VC” refers to venture capitalists.  I had to look it up because I didn’t think the Viet Cong were still in business.

The problem with algal biofuel is this:

According to some sources, an acre of algae could yield 5,000 to 10,000 gallons of oil a year, making algae far more productive than… 

10,000 gallons is 238 barrels per acre.  A typical oil well in the Gulf of Mexico yields 300-500 barrels per acre-foot, and a typical reservoir is 50-100′ thick.  This works out to 15,000 to 50,000 barrels per acre over the life of the well.  Assuming the well produces for 10 years, this works out to 1,500 to 5,000 barrels per acre per year.

Gallons of Oil per Acre per Year      Min        Max
Algae                               5,000     10,000
Typical GOM Oil Field              63,000    210,000
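The arithmetic behind these figures can be sketched in a few lines of Python, using the post's numbers; the 10-year producing life is the post's assumption:

```python
# Rough per-acre yield comparison using the post's figures.
GAL_PER_BBL = 42  # gallons per barrel of oil

# Algae: quoted yield of up to 10,000 gallons per acre per year
algae_max_bbl = 10_000 / GAL_PER_BBL  # ~238 bbl/acre/yr

def gom_bbl_per_acre_year(bbl_per_acre_ft, pay_ft, life_yr=10):
    """Annual per-acre yield: recovery per acre-foot times pay
    thickness, spread over an assumed producing life."""
    return bbl_per_acre_ft * pay_ft / life_yr

low = gom_bbl_per_acre_year(300, 50)    # 1,500 bbl/acre/yr
high = gom_bbl_per_acre_year(500, 100)  # 5,000 bbl/acre/yr

print(round(algae_max_bbl))                    # 238
print(low * GAL_PER_BBL, high * GAL_PER_BBL)   # 63000.0 210000.0
```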

Granted, there are a lot of differences between crude oil and algal oil… And, hypothetically, the acre of algae is “renewable”… However, 1 acre of algae requires 1 acre of land.  An oil well only requires the acreage that its production facility covers.  Oil reservoirs can cover hundreds or thousands of acres, can be well over 100′ thick and often occur in stacked sequences.

Shell’s Mars oil field (Mississippi Canyon 807) has produced about 1.3 billion barrels of oil and 1.7 trillion cubic feet of natural gas since 1996.  This works out to about 1.6 billion barrels of oil equivalent (BOE).  The “footprint” of the field (platform + outline of directional wells) covers about 11,000 acres.  The field has averaged over 6,700 BOE (over 280,000 gallons) per acre per year from 1996-2016.

Gallons of Oil per Acre per Year        Max
Algae                                10,000
Mars Oil Field                      282,360

After 20 full years of production Mars is still going strong.  In 2016, it produced over 6,000 BOE per acre.
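A back-of-envelope Python check of the Mars per-acre figures, using the cumulative production, footprint and period quoted above; the exact per-acre result depends on whether 20 or 21 producing years are counted, but either way it is consistent with "over 6,700 BOE":

```python
# Back-of-envelope check of the Mars field per-acre figures
# (cumulative production, footprint and period are the post's numbers).
GAL_PER_BBL = 42
cum_boe = 1.6e9           # cumulative production, barrels of oil equivalent
footprint_acres = 11_000  # platform plus outline of directional wells
years = 21                # 1996-2016 inclusive

boe_per_acre_yr = cum_boe / footprint_acres / years
print(round(boe_per_acre_yr))                 # ~6,900 BOE/acre/yr
print(round(boe_per_acre_yr * GAL_PER_BBL))   # ~290,000 gallons/acre/yr
```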

It’s refreshing to see that some of the green energy herd is capable of learning lessons… But I doubt they learned this lesson.

Clean Coal: Carbon Capture and Enhanced Oil Recovery

April 18, 2017



WASHINGTON, D.C. — Secretary of Energy Rick Perry took part in a ribbon-cutting ceremony today to mark the opening of Petra Nova, the world’s largest post-combustion carbon capture project, which was completed on-schedule and on-budget. The large-scale demonstration project, located at the W.A. Parish power plant in Thompsons, Texas, is a joint venture between NRG Energy (NRG) and JX Nippon Oil & Gas Exploration Corporation (JX).

“I commend all those who contributed to this major achievement,” said Secretary Perry. “While the Petra Nova project will certainly benefit Texas, it also demonstrates that clean coal technologies can have a meaningful and positive impact on the Nation’s energy security and economic growth.”

Funded in part by the U.S. Department of Energy (DOE) and originally conceived as a 60-megawatt electric (MWe) capture project, the project sponsors expanded the design to capture emissions from 240 MWe of generation at the Houston-area power plant, quadrupling the size of the capture project without additional federal investment. During performance testing, the system demonstrated a carbon capture rate of more than 90 percent.

At its current level of operation, Petra Nova will capture more than 5,000 tons of carbon dioxide (CO2) per day, which will be used for enhanced oil recovery (EOR) at the West Ranch Oil Field. The project is expected to boost production at West Ranch from 500 barrels per day to approximately 15,000 barrels per day. It is estimated that the field holds 60 million barrels of oil recoverable from EOR operations.

The successful commencement of Petra Nova operations also represents an important step in advancing the commercialization of technologies that capture CO2 from the flue gas of existing power plants. Its success could become the model for future coal-fired power generation facilities. The addition of CO2 capture capability to the existing fleet of power plants could support CO2 pipeline infrastructure development and drive domestic EOR opportunities.

U.S. Department of Energy

The Petra Nova carbon capture system was installed at the W.A. Parish Generation Station, the largest and cleanest fossil fuel generation station in the United States:

W.A. Parish Electric Generation Station, Thompson, Texas

Owner/operator: Texas Genco Holdings Inc.

Texas Genco has invested heavily in upgrading its W.A. Parish coal- and gas-fired plant southwest of Houston. Although this nine-unit, 3,653-MW plant is the largest fossil-fueled plant in America, its NOx emissions have been reduced to microscopic levels. Based on those levels, W.A. Parish could rightly claim that it is among the cleanest coal plants in the U.S.

Texas Genco’s W.A. Parish Electric Generation Station (WAP) is the largest coal- and gas-fired power facility in the U.S. based on total net generating capacity. It and its owner, Texas Genco Holdings Inc., operate in the Electric Reliability Council of Texas (ERCOT), one of the largest electric power markets in the nation. Over the past few years, the majority-owned subsidiary of Houston-based CenterPoint Energy Inc. has met the challenge of adding emissions control equipment to these baseload units while maintaining the availability and reliability required by ERCOT’s competitive market.

In the process, Texas Genco has emerged as an industry leader at reducing emissions and demonstrating new NOx-control technologies. The company’s fleet of plants operates at one of the lowest NOx emission rates in the country, and WAP likely emits less NOx on a lb/MMBtu basis than any coal-fired plant of any size in the U.S. Cleanliness is costly; the company has spent more than $700 million on new emission controls since 1999.

With the commissioning of another round of emissions-control equipment this year, NOx emissions from Texas Genco’s Houston-area power plants—including WAP—will be 88% lower than 1998 levels. These actions play a major role in the Houston/Galveston Area Ozone State Implementation Plan and are helping to clean the air in the greater Houston area. To honor the accomplishment, the W.A. Parish plant was recently given the Facility Award by the Power Industry Division of the Instrumentation, Systems, and Automation Society (Research Triangle Park, N.C.) for installing equipment to reduce emissions and improve reliability while minimizing operational costs.



The W.A. Parish Generation Station has a generating capacity of about 3,660 MW (2,740 MW of coal and 1,190 MW of natural gas capacity).  Its total capacity is approximately the same as the ten largest solar PV plants in the U.S. combined (3,713 MW).  From 2002-2009, W.A. Parish operated at 85% of capacity.  The war on coal gradually reduced its operations to 57% of capacity in 2016.

The Petra Nova carbon capture system will enable the plant to capture about 90% of the CO2 from 240 MW of its coal capacity.  It is expected to capture about 1.6 million tons of CO2 per year.  The cost of the carbon capture system was approximately $1 billion, with the taxpayers picking up 19% of the tab.  Normally, I would call this a pointless waste of money.  It won’t have any effect on atmospheric CO2 or the weather.  However, this carbon capture system actually serves a useful purpose:


NRG Petra Nova Fact Sheet

The captured CO2 will be used for Enhanced Oil Recovery to enhance production at the West Ranch oil field, which is operated by Hilcorp Energy Company. It is expected that oil production will be boosted from around 300 barrels per day today to up to 15,000 barrels per day while also sequestering CO2 underground. This field is currently estimated to hold approximately 60 million barrels of oil recoverable from EOR operations.



The West Ranch oil field has produced about 390 million barrels of oil since 1938. CO2 injection will boost the production from 300 to as much as 15,000 barrels of oil per day.  The EOR could lead to the recovery of 60 million barrels of oil that would otherwise be “left in the ground.”  Irony is such a beautiful thing!  

And the really cool thing about this project: It makes money!


NRG’s Petra Nova Plant Captures Carbon, Boosts Bottom Line

An interview with David Greeson, Vice President of Development, NRG Energy Inc.

by Brian Wellborn

NRG Energy Inc. (NRG) and JX Nippon Oil & Gas Exploration jointly operate the Petra Nova Carbon Capture project, the world’s largest retrofit post-combustion carbon capture system, at the W.A. Parish Generating Station southwest of Houston.

Fiscal Notes recently spoke with NRG Vice President of Development David Greeson to discuss the Petra Nova project and learn what makes its capture system unique, environmentally sound and profitable.

Fiscal Notes: What are Petra Nova’s broad environmental goals?

David Greeson: The goal of the Petra Nova project is to capture more than 90 percent of the carbon dioxide (CO2) in the exhaust flue gas from an existing coal-fired unit at the W.A. Parish power plant. We want to prove it’s feasible to build a carbon capture system on schedule and on budget. Demonstrating the system working at full commercial scale will provide a path forward to address CO2 emissions from existing coal-fired plants, both in the U.S. and around the world.

In addition, we’re looking to create a commercial structure that couples power generation with oil recovery for potential long-term viability — not only to pay for the carbon capture and storage system but also to provide an economic return for investors.


Fiscal Notes: How economically viable is Petra Nova’s carbon capture process?

Greeson: As long as oil is priced at around $50 per barrel or above, sales of the oil from the West Ranch field will pay for the Petra Nova project.



The price of CO2 for EOR projects is generally pegged to the price of oil.  At >$50/bbl, the sale of the CO2 to Hilcorp will pay for the carbon capture system.  Projects like this do not need subsidies.

This will enable the coal-fired plants to operate at a higher capacity and prevent 60 million barrels of oil from becoming “stranded assets.”  I just love irony!

The Good, the Bad and the Null Hypothesis

April 4, 2017


When debating the merits of the CAGW (catastrophic anthropogenic global warming) hypothesis, I often encounter this sort of straw man fallacy:

All that stuff is a distraction. Disprove the science of the greenhouse effect. Win a nobel prize get a million bucks. Forget the models and look at the facts. Global temperatures are year after year reaching record temperatures. Or do you want to deny that.


This is akin to arguing that one would have to disprove convection in order to falsify plate tectonics or genetics in order to falsify evolution.  Plate tectonics and evolution are extremely robust scientific theories which rely on a combination of empirical and correlative evidence.  Neither theory can be directly tested through controlled experimentation.  However, both theories have been tested through decades of observations.  Subsequent observations have largely conformed to these theories.

Note: I will not engage in debates about the validity of the scientific theories of plate tectonics or evolution.

The power of such scientific theories is demonstrated through their predictive skill: Theories are predictive of subsequent observations.  This is why a robust scientific theory is even more powerful than facts (AKA observations).

CAGW is a similar type of hypothesis.  It relies on empirical science (the “good”) and correlative “science” (the “bad”).

The Good

Carbon dioxide is a so-called “greenhouse” gas.  It retards radiative cooling.  All other factors held equal, increasing the atmospheric concentration of CO2 will lead to a somewhat higher atmospheric temperature.  However, all other things are never held equal.


Figure 1. “Greenhouse” gas spectra.

Atmospheric CO2 has risen since the 19th century.


Figure 2. Atmospheric CO2 from instrumental records, Antarctic ice cores and plant stomata.

Humans are responsible for at least half of this rise in atmospheric CO2.


Figure 3. Natural sources probably account for ~50% of the rise in atmospheric CO2 since 1750.

While anthropogenic sources are a tiny fraction of the total, we are removing carbon from geologic sequestration and returning it to the active carbon cycle.

The average temperature of Earth’s surface and troposphere has generally risen over the past 150 years.


Figure 5. Surface temperature anomalies: BEST (land only), HadCRUT4 & GISTEMP. Satellite lower troposphere: UAH & RSS.

The Bad

The modern warming began long before the recent rise in atmospheric CO2, and prior to the 19th century, temperature and CO2 were decoupled:


Figure 6. Temperature reconstruction (Moberg et al., 2005) and Law Dome CO2 (MacFarling Meure et al., 2006)

The recent rise in temperature is no more anomalous than the Medieval Warm Period or the Little Ice Age:


Figure 7. Temperature reconstruction (Ljungqvist, 2010), northern hemisphere instrumental temperature (HadCRUT4) and Law Dome CO2 (MacFarling Meure et al., 2006). Temperatures are 30-yr averages to reflect changing climatology.

Over the past 2,000 years, the average temperature of the Northern Hemisphere has exceeded natural variability (defined as two standard deviations from the pre-1865 mean) three times: 1) the peak of the Medieval Warm Period, 2) the nadir of the Little Ice Age and 3) since 1998.  Human activities clearly were not the cause of the first two deviations.  70% of the warming since the early 1600s clearly falls within the range of natural variability.
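The "two standard deviations from the pre-1865 mean" test can be sketched as follows. The anomaly values here are made up for illustration; a real test would use the Ljungqvist reconstruction data:

```python
# Sketch of a 2-sigma natural-variability test: flag years whose anomaly
# falls outside mean +/- 2*sd of the pre-1865 baseline. Toy data only.
import statistics

def exceeds_natural_variability(series, years, cutoff_year=1865, n_sigma=2):
    """Return the years whose values lie outside the n_sigma band
    computed from the pre-cutoff baseline."""
    baseline = [v for y, v in zip(years, series) if y < cutoff_year]
    mu = statistics.mean(baseline)
    sd = statistics.stdev(baseline)
    lo, hi = mu - n_sigma * sd, mu + n_sigma * sd
    return [y for y, v in zip(years, series) if not (lo <= v <= hi)]

years = list(range(1800, 1880, 10))
anoms = [-0.2, -0.1, -0.3, -0.2, -0.1, -0.2, -0.15, 0.9]  # last value a spike
print(exceeds_natural_variability(anoms, years))  # [1870]
```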

While it is possible that the current warm period is about 0.2 °C warmer than the peak of the Medieval Warm Period, this could be due to the differing resolutions of the proxy reconstruction and instrumental data:


Figure 8. The instrumental data demonstrate higher frequency and higher amplitude temperature variations than the proxy reconstructions.

The amplitude of the reconstructed temperature variability on centennial time-scales exceeds 0.6°C. This reconstruction is the first to show a distinct Roman Warm Period c. AD 1-300, reaching up to the 1961-1990 mean temperature level, followed by the Dark Age Cold Period c. AD 300-800. The Medieval Warm Period is seen c. AD 800–1300 and the Little Ice Age is clearly visible c. AD 1300-1900, followed by a rapid temperature increase in the twentieth century. The highest average temperatures in the reconstruction are encountered in the mid to late tenth century and the lowest in the late seventeenth century. Decadal mean temperatures seem to have reached or exceeded the 1961-1990 mean temperature level during substantial parts of the Roman Warm Period and the Medieval Warm Period. The temperature of the last two decades, however, is possibly higher than during any previous time in the past two millennia, although this is only seen in the instrumental temperature data and not in the multi-proxy reconstruction itself.


The proxy reconstruction itself does not show such an unprecedented warming but we must consider that only a few records used in the reconstruction extend into the 1990s. Nevertheless, a very cautious interpretation of the level of warmth since AD 1990 compared to that of the peak warming during the Roman Warm Period and the Medieval Warm Period is strongly suggested.


The amplitude of the temperature variability on multi-decadal to centennial time-scales reconstructed here should presumably be considered to be the minimum of the true variability on those time-scales.


Ljungqvist, 2010


Figure 9. Ljungqvist demonstrates that the modern warming has not unambiguously exceeded the range of natural variability. The bold black dashed line is the instrumental record. I added the red lines to highlight the margin of error.

The climate of the Holocene has been characterized by a roughly millennial cycle of warming and cooling (for those who don’t like the word “cycle,” pretend that I typed “quasi-periodic fluctuation”):


Figure 10. Millennial cycle apparent on Ljungqvist reconstruction.


Figure 11. Millennial scale cycle apparent on Moberg reconstruction.

These cycles (quasi-periodic fluctuations) even have names:


Figure 12. Late Holocene climate cycles (quasi-periodic fluctuations).

These cycles have been long recognized by Quaternary geologists:


Figure 12. The millennial scale climate cycle can clearly be traced back to the end of the Holocene Climatic Optimum and the onset of the Neoglaciation.

Fourier analysis of the GISP2 ice core clearly demonstrates that the millennial scale climate cycle is the dominant signal in the Holocene (Davis & Bohling, 2001).


Figure 13. The Holocene climate has been dominated by a millennial scale climate cycle.

The industrial era climate has not changed in any manner inconsistent with the well-established natural millennial scale cycle. Assuming that the ice core CO2 record is reliable, the modern rise in CO2 has had little, if any, effect on climate.

The Null Hypothesis

What is a ‘Null Hypothesis’

A null hypothesis is a type of hypothesis used in statistics that proposes that no statistical significance exists in a set of given observations. The null hypothesis attempts to show that no variation exists between variables or that a single variable is no different than its mean. It is presumed to be true until statistical evidence nullifies it for an alternative hypothesis.

Investopedia

Since it is impossible to run a controlled experiment on Earth’s climate (there is no control planet), the only way to “test” the CAGW hypothesis is through models.  If the CAGW hypothesis is valid, the models should demonstrate predictive skill.  The models have utterly failed:


Figure 14. “95% of Climate Models Agree: The Observations Must be Wrong.”


Figure 15. “Climate models versus climate reality.” Michaels & Knappenberger.

The models have failed because they result in a climate sensitivity that is 2-3 times that supported by observations:


Figure 15. Equilibrium climate sensitivity: Reality vs. Models.

From Hansen et al. 1988 through every IPCC assessment report, the observed temperatures have consistently tracked the strong mitigation scenarios in which the rise in atmospheric CO2 has been slowed and/or halted.

Apart from the strong El Niño events of 1998 and 2015-16, GISTEMP has tracked Scenario C, in which CO2 levels stopped rising in 2000, holding at 368 ppm.


Figure 16. Hansen’s 1988 model and GISTEMP.

The utter failure of this model is most apparent on the more climate-relevant 5-yr running mean:


Figure 17. Hansen’s 1988 model and GISTEMP, 5-yr running mean.

This is from IPCC’s First Assessment Report:


Figure 18.  IPCC First Assessment Report (FAR).  Model vs. HadCRUT4.

HadCRUT4 has tracked below Scenario D.


Figure 19. IPCC FAR scenarios.

This is from the IPCC’s Third Assessment Report (TAR):


Figure 20. IPCC TAR model vs. HadCRUT4.

HadCRUT4 has tracked the strong mitigation scenarios, despite a general lack of mitigation.

The climate models have never demonstrated any predictive skill.

And the models aren’t getting better. Even when they start the model run in 2006, the observed temperatures consistently track at or below the low end of the 5-95% range (P05-P95).  Observed temperatures only approach the model mean (P50) in 2006, 2015 and 2016.


Figure 21.  Climate Lab Book. Comparing CMIP5 & observations.

The ensemble consists of 138 model runs using a range of representative concentration pathways (RCP), from a worst-case scenario (RCP 8.5), often referred to as “business as usual,” to varying grades of mitigation scenarios (RCP 2.6, 4.5 and 6.0).


Figure 22. Figure 21 with individual model runs displayed.


When we drill wells, we run probability distributions to estimate the oil and gas reserves we will add if the well is successful.  The model inputs consist of a range of estimates of reservoir thickness, area and petrophysical characteristics.  The model output consists of a probability distribution from P10 to P90.

  • P10 = Maximum Case.  There is a 10% probability that the well will produce at least this much oil and/or gas.
  • P50 = Median Case.  There is a 50% probability that the well will produce at least this much oil and/or gas.  Probable reserves are >P50.
  • P90 = Minimum Case.  There is a 90% probability that the well will produce at least this much oil and/or gas.  Proved reserves are P90.

Over time, a drilling program should track near P50.  If your drilling results track close to P10 or P90, your model input is seriously flawed.
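A minimal Monte Carlo sketch of how such a P10/P50/P90 distribution is produced. The input ranges below are hypothetical, not from any actual prospect:

```python
# Monte Carlo sketch of a probabilistic reserve estimate.
# Reserves = area * net pay * recovery per acre-foot (ranges hypothetical).
import random

random.seed(42)

def simulate_reserves(n=100_000):
    """Draw n reserve outcomes from uniform input ranges and sort them."""
    results = []
    for _ in range(n):
        area_acres = random.uniform(500, 2_000)            # productive area
        net_pay_ft = random.uniform(50, 100)               # reservoir thickness
        recovery_bbl_per_acre_ft = random.uniform(300, 500)
        results.append(area_acres * net_pay_ft * recovery_bbl_per_acre_ft)
    return sorted(results)

res = simulate_reserves()
# "At least this much" convention: P10 is the high case (90th percentile
# of sorted outcomes), P90 the low case.
p10 = res[int(0.90 * len(res))]
p50 = res[int(0.50 * len(res))]
p90 = res[int(0.10 * len(res))]
print(f"P90 {p90:,.0f}  P50 {p50:,.0f}  P10 {p10:,.0f} bbl")
```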

If the CMIP5 model ensemble had predictive skill, the observations should track around P50: half the runs should predict more warming and half less than is actually observed. During the predictive run of the model, HadCRUT4.5 has not *tracked* anywhere near P50…


Figure 23. Figure 21 zoomed in on model run period with probability distributions annotated.

I “eyeballed” the instrumental observations to estimate a probability distribution for the predictive run of the model.

Prediction Run    Approximate Distribution
2006              P60 (60% of the models predicted a warmer temperature)
2007              P75
2008              P95
2009              P80
2010              P70
2011-2013         >P95
2014              P90
2015-2016         P55

Note that during the 1998-99 El Niño, the observations spiked above P05 (less than 5% of the models predicted this). During the 2015-16 El Niño, HadCRUT only spiked to P55.  El Niño events are not P50 conditions. Strong El Niño and La Niña events should spike toward the P05 and P95 boundaries.
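The percentile convention used above can be illustrated with a toy function. The ensemble values here are invented; a real comparison would use the CMIP5 runs and HadCRUT4:

```python
# Illustrative percentile tracking: what fraction of an ensemble of
# model runs predicted MORE warming than was observed in a given year?
def ensemble_percentile(runs, observed):
    """Percent of runs warmer than observed, matching the post's
    convention (P60 = 60% of the models predicted a warmer temperature)."""
    warmer = sum(1 for r in runs if r > observed)
    return 100 * warmer / len(runs)

# Toy ensemble of anomaly predictions for one year (invented values)
runs_2011 = [0.55, 0.60, 0.62, 0.58, 0.65, 0.70, 0.50, 0.72, 0.68, 0.61]
print(f"P{ensemble_percentile(runs_2011, 0.40):.0f}")  # P100: cooler than every run
```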

The temperature observations are clearly tracking much closer to strong mitigation scenarios rather than RCP 8.5, the bogus “business as usual” scenario.

The red hachured trapezoid indicates that HadCRUT4.5 will likely continue to track between P50 and P100. This is indicative of a miserable failure of the models and a pretty good clue that the models need to be adjusted downward.

In any other field of science CAGW would be a long-discarded falsified hypothesis.


Claims that AGW or CAGW have earned an exemption from the Null Hypothesis principle are patently ridiculous.

In science, a broad, natural explanation for a wide range of phenomena. Theories are concise, coherent, systematic, predictive, and broadly applicable, often integrating and generalizing many hypotheses. Theories accepted by the scientific community are generally strongly supported by many different lines of evidence, but even theories may be modified or overturned if warranted by new evidence and perspectives.

UC Berkeley

This is not a scientific hypothesis.  It is arm waving:

More CO2 will cause some warming.

This is a scientific hypothesis:

A doubling of atmospheric CO2 will cause the lower troposphere to warm by ___ °C.

Thirty-plus years of failed climate models have never been able to fill in the blank.  The IPCC’s Fifth Assessment Report essentially stated that it was no longer necessary to fill in the blank.

While it is very likely that human activities are the cause of at least some of the warming over the past 150 years, there is no robust statistical correlation.  The failure of the climate models clearly demonstrates that the null hypothesis still holds true for atmospheric CO2 and temperature.

Selected References

Davis, J. C., and G. C. Bohling, The search for patterns in ice-core temperature curves, 2001, in L. C. Gerhard, W. E. Harrison, and B. M. Hanson, eds., Geological perspectives of global climate change, p. 213–229.

Finsinger, W. and F. Wagner-Cremer. Stomatal-based inference models for reconstruction of atmospheric CO2 concentration: a method assessment using a calibration and validation approach. The Holocene 19,5 (2009) pp. 757–764

Grosjean, M., Suter, P. J., Trachsel, M. and Wanner, H. 2007. Ice-borne prehistoric finds in the Swiss Alps reflect Holocene glacier fluctuations. J. Quaternary Sci.,Vol. 22 pp. 203–207. ISSN 0267-8179.

Hansen, J., I. Fung, A. Lacis, D. Rind, S. Lebedeff, R. Ruedy, G. Russell, and P. Stone, 1988: Global climate changes as forecast by Goddard Institute for Space Studies three-dimensional model. J. Geophys. Res., 93, 9341-9364, doi:10.1029/88JD00231.

Kouwenberg, LLR, Wagner F, Kurschner WM, Visscher H (2005) Atmospheric CO2 fluctuations during the last millennium reconstructed by stomatal frequency analysis of Tsuga heterophylla needles. Geology 33:33–36

Ljungqvist, F.C. 2009. N. Hemisphere Extra-Tropics 2,000yr Decadal Temperature Reconstruction. IGBP PAGES/World Data Center for Paleoclimatology Data Contribution Series # 2010-089. NOAA/NCDC Paleoclimatology Program, Boulder CO, USA.

Ljungqvist, F.C. 2010. A new reconstruction of temperature variability in the extra-tropical Northern Hemisphere during the last two millennia. Geografiska Annaler: Physical Geography, Vol. 92 A(3), pp. 339-351, September 2010. DOI: 10.1111/j.1468-0459.2010.00399.x

MacFarling Meure, C., D. Etheridge, C. Trudinger, P. Steele, R. Langenfelds, T. van Ommen, A. Smith, and J. Elkins. 2006. The Law Dome CO2, CH4 and N2O Ice Core Records Extended to 2000 years BP. Geophysical Research Letters, Vol. 33, No. 14, L14810 10.1029/2006GL026152.

Moberg, A., D.M. Sonechkin, K. Holmgren, N.M. Datsenko and W. Karlén. 2005. Highly variable Northern Hemisphere temperatures reconstructed from low-and high-resolution proxy data. Nature, Vol. 433, No. 7026, pp. 613-617, 10 February 2005.

Instrumental Temperature Data from Hadley Centre / UEA CRU, NASA Goddard Institute for Space Studies and Berkeley Earth Surface Temperature Project via Wood for Trees.

China Coal

April 4, 2017

Source: 2016 BP Statistical Review of World Energy


March 21, 2017


Goebel et al., 2008

March 14, 2017

The Late Pleistocene dispersal of modern humans in the Americas (Goebel et al., 2008)



Hubbert 1956

March 13, 2017


Hasegawa 1/72 scale Douglas A-1 Skyraider

December 24, 2016


Vox’s David Roberts: “Consilience” or just plain silliness?

December 14, 2015

True climate science denier David Roberts (formerly with Grist) has authored another truly vapid article for Vox…

The 2 key points that climate skeptics miss

Updated by David Roberts on December 11, 2015

Arguments between climate skeptics (or whatever the hell we’re calling them now) and their opponents very frequently devolve into hypertechnical squabbles over particular scientific issues like sea surface temperatures or Milankovitch cycles (don’t ask).

Generally speaking, this is a Bad Thing. Technical scientific disputes are of limited interest to the general public — especially technical disputes litigated with great partisan venom. A discussion dominated by such disputes just causes most people to tune out entirely. What’s more, it creates the illusion that the validity of climate science hinges on how these squabbles are resolved. It doesn’t.


Vox (whatever the hell that is)


“Define Irony”

Mr. Roberts actually denied the scientific method prior to making his first point. Science is the process of formulating systematic explanations for observations (hypotheses) and then testing and upholding or falsifying those hypotheses.  Science is not about formulating slogans that non-scientists can “tune in” to.
