Archive for the ‘Uncategorized’ Category

October 20, 2017

The movie Deepwater Horizon is probably the only movie ever made that actually tried to realistically depict oil drilling operations.  While it didn’t get every detail right, it was compellingly realistic (too realistic for me, watching it on IMAX) and told the story of how ordinary people, just doing their jobs, can become heroes when everything goes wrong.  I won’t go into detail about everything that went wrong leading up to the terrible disaster on April 20, 2010; BP’s Deepwater Horizon Accident Investigation Report is fairly comprehensive.  Ultimately it boiled down to the normalization of deviance.  The 1986 Challenger space shuttle disaster has also been attributed to the normalization of deviance.  When dangerous jobs become routine, corners get cut, people become complacent, and a sense of impunity sets in.  The safety director for my first employer, Enserch Exploration, used to start almost every safety meeting with this question and answer:

What kills the most people in industrial accidents?  Impunity.

The Deepwater Horizon disaster caused the entire industry to recommit itself to rigorous adherence to safety procedures… Because no one wants to go to work and not come home.

Deepwater Horizon Myths

BP’s prospect was located in Mississippi Canyon Block 252 (MC 252).  It was called “Macondo.” At the time of the blowout and efforts to regain control of the well, a lot of myths were propagated.  People said things like, “The well encountered the highest pressures ever recorded”… “Macondo was the largest oil discovery in the world”… “BP was keeping the geological data secret – Not even revealing it to the government.”  In reality, there was nothing particularly anomalous about Macondo.


Figure 1. (Southeastern Geophysical Society) Left: Gamma ray log; sandstone deflects to the left of the shale baseline.  Right: Electrical resistivity log; oil/gas deflects to the right of the shale baseline, saltwater deflects to the left.  It’s a nice-looking log, but not a spectacular one. The really interesting thing to me was the pore pressures. The main pay sands from 18,075′ to 18,155′ are not abnormally pressured; 12.5 to 12.6 pounds per gallon is actually kind of low for that depth range. It also appears that they may have encountered a pressure inversion. Geopressure generally increases with depth, yet the pressure at 17,730′ was 14.1 ppg and only 13.0 ppg at 17,820′. They were drilling the well with 14.5 ppg mud and were having problems with losing mud into the formation.
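The pore pressures in the caption are quoted as equivalent mud weights in pounds per gallon (ppg).  A quick sketch of the standard conversion to downhole pressure, using the 0.052 psi/ft-per-ppg gradient (the depths and mud weights are from the caption; the function name is mine):

```python
# Convert equivalent mud weight (ppg) at a true vertical depth (ft)
# to hydrostatic pressure (psi) using the standard 0.052 conversion factor.

def emw_to_psi(mud_weight_ppg, depth_ft):
    """Hydrostatic pressure (psi) = 0.052 * mud weight (ppg) * depth (ft)."""
    return 0.052 * mud_weight_ppg * depth_ft

# Pore pressure of the main pay sand: 12.5 ppg at 18,075 ft
pore_psi = emw_to_psi(12.5, 18075)        # ~11,749 psi

# Pressure exerted by the 14.5 ppg drilling mud at the same depth
mud_psi = emw_to_psi(14.5, 18075)         # ~13,629 psi

# The overbalance pushing mud into the formation (lost circulation)
overbalance_psi = mud_psi - pore_psi      # ~1,880 psi
print(round(pore_psi), round(mud_psi), round(overbalance_psi))
```

That roughly 1,900 psi of overbalance is consistent with the lost-circulation problems described in the caption.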

Macondo was estimated by BP to be a 50 million barrel discovery.  Kind of small by major oil company standards… A “home run” by smaller independent oil company standards.

One blogger actually wrote this in June 2010:

“No one outside of BP knows the details of the geology under the well site because BP did the geological survey and refuses to release the information – classifying it as proprietary trade secrets.”

BP’s partners (Anadarko and Mitsui) knew exactly what BP knew about the geology. The MMS has all of the data that BP has; operators had to provide all data to the MMS (now BOEM), even on “tight holes” and proprietary geophysical surveys. All of the companies that bid against BP in OCS Lease Sale 206 on March 19, 2008 knew at least as much about the geology as BP did. BP’s high bid barely beat out smaller independent oil company LLOG Exploration…

  1. BP Exploration & Production Inc. $34,003,428.00
  2. LLOG Exploration Offshore, Inc. $33,625,000.00
  3. Noble Energy, Inc. $17,225,650.00
  4. Red Willow Offshore, LLC $14,075,000.00
  5. Eni Petroleum US LLC $4,577,115.00
  6. Anadarko E&P Company LP $2,145,950.00

Only one of BP’s competitors for the lease, Eni, was a major oil company. The rest were small, mid-sized and large independents. All of those companies knew enough about the geology to bid on the lease. I don’t work that particular area, but I knew enough about the geology to know the approximate size of the reservoir, thickness of the sands and that the sands are Middle Miocene age and trapped against a Cretaceous unconformity. Any company that is a member of the Offshore Oil Scouts Association (OOSA) also knew a great deal about the drilling procedures and hole conditions.

From April through July 2010, the blowout spilled an estimated 4.9 million barrels of oil into the Gulf of Mexico.  By mid-July, the well was capped.  By August, most of the oil was gone… Either recovered by cleanup operations, evaporated, burned or consumed by microbes.

Deepwater Horizon Perspective

Just prior to the Macondo blowout, this was on the MMS (now BOEM) website:

Since 1980, OCS operators have produced 4.7 billion barrels (bbl) of oil and spilled only 0.001 percent of this oil, or 1 bbl for every 81,000 bbl produced. In the last 15 years, there have been no spills greater than 1,000 bbl from an OCS platform or drilling rig. The spill risk related to a diesel spill from drilling operations is even less. During the 10-year period (1976-1985) in which data were collected, there were 80 reported diesel spills greater than one barrel associated with drilling activities, compared with 11,944 wells drilled, or a 0.7 percent probability of occurrence. For diesel spills greater than 50 bbls, only 15 spills have occurred, or a 0.1 percent probability.

Natural seepage of oil in the Gulf of Mexico (unrelated to natural gas and oil industry operations) is far more extensive. Researchers have estimated a natural seepage rate of about 120,000 bbl per year from one area (23,000 square kilometers) offshore of Louisiana.

U.S. Minerals Management Service ca July 2010

This passage disappeared from the website shortly after the blowout.

Of the nearly 53,000 wells drilled in the Federal waters of the Gulf of Mexico since 1947, there has been one Macondo.

Relative to the number of wells drilled and volume of hydrocarbons produced, the volume of oil spilled in the history of oil & gas drilling operations in the Gulf of Mexico has been minuscule.
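The MMS ratio quoted above can be sanity-checked in a few lines.  This is just a back-of-the-envelope sketch; the 4.7 billion bbl and 1-in-81,000 figures are from the quoted passage, and the 4.9 million bbl Macondo estimate is from this post:

```python
# Sanity check on the quoted MMS spill statistics (figures from the text).
produced_bbl = 4.7e9               # OCS oil produced since 1980 (per MMS)
spill_ratio = 1 / 81_000           # 1 bbl spilled per 81,000 bbl produced

spilled_bbl = produced_bbl * spill_ratio   # ~58,000 bbl over the period
spill_pct = spill_ratio * 100              # ~0.0012%, which MMS rounded to 0.001%

# For scale: the Macondo blowout alone released an estimated 4.9 million bbl,
# roughly 84 times the cumulative spill volume implied by the MMS ratio.
macondo_bbl = 4.9e6
print(round(spilled_bbl), round(spill_pct, 4), round(macondo_bbl / spilled_bbl))
```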


Figure 2. Gulf of Mexico Crude Oil Production 1981-2013 (US EIA), oil spills 1963-2013 (US BSEE), natural oil seeps (NAP).

Having trouble seeing the spills?  Here’s a plot of just the spills and natural seepage estimate:


Figure 3. Oil spills and natural seeps (note y-axis is logarithmic).

Putting this disaster into perspective is in no way meant to diminish this terrible tragedy.  Eleven men died in this disaster.  However, our government’s reaction to it was, in itself, a disaster.  Thirty-three rigs that were drilling in deepwater were forced to shut down and temporarily abandon the wells they were drilling.  This created an even greater accident risk than allowing them to complete the wells they were drilling.  The Obama administration’s unlawful drilling moratorium and subsequent “permitorium” led to the loss of over 200,000 bbl/day of oil production from 2010-2015:


Figure 4. Federal offshore Gulf of Mexico oil production. Hurricane damage vs. Obama damage.

Deepwater Horizon EpiLOGG

Did you ever wonder if anyone ever went back in and re-drilled Macondo?

10/12: LLOG Exploration is making waves in the Gulf


When the Obama administration announced a moratorium on deepwater drilling in the Gulf of Mexico in 2010, politicians and industry spokespeople howled. The explosion on the Deepwater Horizon rig at the Macondo well was tragic, some said, but there was no reason to shut down an entire industry sector that otherwise was following the rules and operating safely.

Conspiracy theories that President Barack Obama would try to kill the industry with onerous new regulations proliferated in some circles. Even if the moratorium eventually was lifted, the naysayers said, companies might take their rigs, jobs and tax money to new locales and not come back for many years.

But at LLOG Exploration, it was no time to panic.



LLOG was founded in 1977, primarily to develop prospects in south Louisiana. As the company grew, its focus expanded to include the depths of the Gulf of Mexico.

In 2004, LLOG purchased seismic data covering a portion of the Gulf known to the offshore industry as the Mississippi Canyon.


LLOG Exploration at a glance

  • Founded: 1977 in Metairie
  • President and CEO: Scott Gutterman, who joined the company in 1993 and became CEO in 2007
  • Headquartered in Covington, with offices in Scott and Houston
  • 170 employees
  • Ranked in 2014 as the top privately owned liquid producer in the United States
  • One of the top 20 exploration and production companies in the Gulf of Mexico, public or private
  • Has drilled more than 350 wells in the Gulf of Mexico and the Texas/Louisiana Gulf Coast since 2001
  • 2014 net production: 26,000 BOE (barrels of oil equivalent) per day
    Source: LLOG

This article was originally published in the Spring 2015 issue of 10/12 Industry Report.

Greater Baton Rouge Business Report

LLOG Exploration was the second-highest bidder on MC 252.  In the wake of Macondo, LLOG was able to purchase the lease from BP and put together a strong lease position in the area.

Drilling To Start at Macondo Reservoir

by The Associated Press|Cain Burdeau|Wednesday, May 13, 2015

NEW ORLEANS (AP) — Deep-water drilling is set to resume near the site of the catastrophic BP PLC well blowout that killed 11 workers and caused the largest U.S. offshore oil spill five years ago off the coast of Louisiana.

A Louisiana-based oil company, LLOG Exploration Offshore LLC, plans to drill into the Macondo reservoir, according to federal records reviewed by The Associated Press.


Richard Charter, a senior fellow with the Ocean Foundation and a longtime industry watchdog, said drilling into that reservoir has proved very dangerous and highly technical, and it raises questions about whether a small company like LLOG has the financial means to respond to a blowout similar to BP’s.

Eric Smith, associate director of the Tulane University Energy Institute in New Orleans, dismissed those concerns. He called LLOG “an extremely well-financed and well-organized” company.

“If I were to pick anyone to go into that field after so many problems, I would pick LLOG,” Smith said. “They have demonstrated their ability to drill in the area.”

Since 2010, LLOG has drilled eight wells in the area in “analogous reservoirs at similar depths and pressures,” Fowler said. The company has drilled more than 50 wells in the Gulf since 2002, he said.

He said the company has studied the investigations into the Macondo disaster and “ensured the lessons from those reports are accounted for in our design and well procedures.”



LLOG Exploration renamed the prospect “Niedermeyer”… part of an Animal House theme (we named our deepwater prospects after Caddyshack characters).  Niedermeyer was a nice discovery.

  • Four wells on MC 208, 209, 252 and 253.  Feb. 2015 through July 2017.
  • 21.7 million barrels of oil (mmbo) and 57.5 billion cubic feet (bcf) of natural gas.
  • MC 252 SS-1 Well:  6.1 mmbo & 15.6 bcf.  Oct. 2015 through July 2017.  Avg. 9,600 barrels of oil per day (BOPD) and 24 million cubic feet of natural gas per day (mmcf/d).

The Niedermeyer, Marmalard and Son of Bluto 2 fields were completed as subsea tiebacks to LLOG’s “Delta House” floating production system (FPS) on MC 254.


Figure 5. Delta House schematic diagram and map.

LLOG began drilling the Delta House prospects in 2011, and by 2015 had completed the Delta House FPS and commenced production.

LLOG eyes 2015 Delta House startup


Strategic partnership lets Louisiana operator ramp up deepwater E&P

Russell McCulley
Senior Technical Editor

Louisiana independent LLOG Exploration Co. is gearing up for an ambitious deepwater Gulf of Mexico drilling campaign at the Delta House project, scheduled for first oil in 2015. As of April 2013, the company had drilled two successful wells: one at the Son of Bluto 2 prospect, in Mississippi Canyon 387, and another at Marmalard, in MC 300, and had the Ensco 8502 semisubmersible drilling a third Delta House well at the Marmalard prospect. The three wells will supply initial production to the Delta House platform, to be moored in 4,500 ft (1,372 m) of water in MC 254.

Early this year, LLOG lined up two newbuild DP drilling rigs to carry out work at Delta House and the company’s other central Gulf prospects. A cylindrical Sevan Drilling rig under construction at Cosco Qidong shipyard in China, to be dubbed Sevan Louisiana, will start a three-year, $550-million charter with LLOG in January, 2014. In 3Q 2014, Seadrill’s West Neptune is scheduled to arrive at the Delta House complex to begin an initial three-year, $662-million term. The dual BOP drillship is under construction at Samsung Heavy Industries in South Korea.


Offshore Magazine


Figure 6. Delta House FPS and subsea tiebacks.

In less than five years, the Delta House development went from spudding the first well to 80,000 bbl/d of oil production.

LLOG Exploration is just one of many efficient and highly competent independent oil companies that most people have never heard of, developing deepwater prospects in the Gulf of Mexico and around the world.

Oil’s well that ends well!


DOE/FERC to Enact “Resiliency Pricing Rule” for Coal-fired and Nuclear Power Plants

October 19, 2017

DOE = United States Department of Energy

FERC = Federal Energy Regulatory Commission

OCT 2, 2017

Rick Perry Directs FERC To Complete Final Action On Resiliency Pricing Rule In 60 Days


One of the most sweeping changes to the U.S. electricity supply market in the past two decades may be implemented before the coming winter heating season. The brief bottom line of the change is that eligible power sources will be able to participate in a details-to-be-determined rate structure that allows the owner to recover its “fully allocated costs” plus a “fair return on equity.”

Eligible grid reliability and resiliency resource is any resource that:

  1. is an electric generation resource physically located within a Commission-approved independent system operator or regional transmission organization;
  2. is able to provide essential energy and ancillary reliability services, including but not limited to voltage support, frequency services, operating reserves, and reactive power;
  3. has a 90-day fuel supply on site enabling it to operate during an emergency, extreme weather conditions, or a natural or man-made disaster;
  4. is compliant with all applicable federal, state, and local environmental laws, rules, and regulations; and
  5. is not subject to cost of service rate regulation by any state or local regulatory authority

All licensed nuclear power plants and a significant portion of existing coal plants can meet those requirements today.


Along with his letter, Secretary Perry enclosed a Notice of Proposed Rulemaking (NOPR) that directs the Commission to either take final action within 60 days after publication of the NOPR in the Federal Register (which has not yet occurred) or to issue the proposed rule as an interim final rule. Any rules included in the final action will go into effect within 30 days after publication.

The Summary section of the NOPR includes statements with legal justification for the FERC’s authority to issue the proposed rule without an Environmental Assessment or an Environmental Impact Statement.

It also provides an analysis concluding that the proposed rule does not have any significant economic impact on small entities and thus does not need to meet certain description and analysis requirements of the Regulatory Flexibility Act of 1980 (RFA).


My Conclusion

Free market purists like R Street may have complained a bit about the hand on the scale in favor of renewable energy resources, but their expressed faith in the market decision-making process was really triggered by the idea that coal and nuclear plants might not be forced to retire.

As a lifelong fan of reliable electricity who has visited places where brownouts and rolling blackouts are an accepted fact of life, I do not support the Enron-conceived notion that electricity is just another tradable commodity with opportunities for fabulous rates of return in certain conditions.

Blackouts and brownouts impose a far greater cost on the overall economy than most people realize. Electricity is too important to be left to the vagaries of short term markets, especially since the markets have already been tipped heavily in favor of unreliables and natural gas.

Wind and solar power advocates are either confused or disingenuous when claiming that their power sources, which – by definition – cannot be well protected from the weather, are reliable and provide resilience.

The American Petroleum Institute and its partners in the natural gas industry are mad because their long running price war against coal and nuclear may be interrupted before it achieves its desired objective of driving out enough of its competitors to give it scarcity pricing power.

The rule makes sense. The urgency is justified. I fully expect that there will be numerous interests that will seek delays because those will help them achieve their objective of forcing permanent plant closures.

It is too bad the rule wasn’t implemented in time to save valuable, emission-free, fuel-secure assets like Vermont Yankee, Kewaunee, and Ft. Calhoun. Fortunately, it looks like it might have been issued in time to save a couple of dozen other nuclear plants that are at risk of prematurely closing in the next five years.


If enacted, this resiliency plan would lead to lower electricity rates than would occur if nuclear and coal-fired plants continued to shut down. The loss of base load capacity would lead to greater dependence on inefficient and expensive “peaker” power plants.

In a purely laissez-faire world, natural gas combined cycle (NGCC) would be the only type of power plant being built. No other type of new power plant has an honestly positive NPV (net present value) using a realistic discount rate. This would be great for my industry (oil & gas), but eventually bad for electricity consumers, because it would drive up natural gas prices. Unlike Australia, we have more than enough natural gas production to meet domestic demand and become a world leader in LNG exports. However, we can’t bet the farm on fracking and shale plays. All shale plays eventually peak. The Marcellus is huge… but it will eventually peak. As natural gas prices gradually creep up from $3/mcf to $5-8/mcf over the next ten years, existing coal and nuclear power plants will become very competitive with NGCC… but only if those power plants still exist.
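The NPV point can be illustrated with a toy discounting calculation.  The numbers here are hypothetical round figures chosen only to show the mechanics, not actual plant economics: a plant with modest capital cost and steady net cash flow clears a realistic discount rate, while a capital-heavy plant with the same revenue stream does not.

```python
# Toy NPV illustration. All dollar figures are hypothetical (in $ millions),
# chosen only to demonstrate how a realistic discount rate punishes
# capital-heavy projects.

def npv(rate, capex, annual_cash_flow, years):
    """Net present value: -capex plus discounted annual cash flows."""
    return -capex + sum(annual_cash_flow / (1 + rate) ** t
                        for t in range(1, years + 1))

rate = 0.10  # a "realistic" double-digit discount rate

# Hypothetical NGCC-like plant: $1.0B capex, $150M/yr net cash flow, 30 years
ngcc = npv(rate, 1_000, 150, 30)       # positive NPV

# Hypothetical capital-heavy plant: $6.0B capex, same cash flow and life
heavy = npv(rate, 6_000, 150, 30)      # deeply negative NPV

print(round(ngcc), round(heavy))
```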

Given a choice between subsidizing reliable base load power plants or unreliable non-dispatchable power plants, it would be foolish to choose the latter.

This isn’t a random pattern:

Electricity costs as a function of per capita installed renewable capacity. Wind and solar only, excludes hydropower. [Updated to add Australia and correct the units] – Source: Willis Eschenbach

Part of Australia already has the most expensive electricity in the world…

Although the causes of Australia’s high electricity prices are a bit more complicated than Germany’s or Denmark’s. The costs are skyrocketing due to a lack of investment in fossil fuel infrastructure and the inability of wind & solar to actually replace coal…

What caused the power price spike?

There are two main causes of the sudden jump in electricity prices. And despite the frothing anger from the shock jocks, it has nothing to do with the rise of renewables.

The main cause is a lack of investment. And the second is the soaring price of gas.

As old coal-fired plants have been retired, there has been insufficient investment to replace them because power companies have been left in policy limbo over carbon pricing.


How does gas affect electricity prices?

The second contributor to soaring electricity prices has been the sudden spike in gas prices.

In the absence of any political leadership, the power industry correctly figured renewables such as wind and solar eventually would be cheaper and more efficient than coal. That is because the fuel — wind and sun — is free and the maintenance costs of the plants is low.

The problem with renewables is their unreliability. Given Australia’s gas abundance, the idea was that gas would cut in whenever there was an energy shortfall.

Unlike coal plants that take weeks to fire up or shut down, gas turbines can be turned on and off at short notice, making them ideal to fill the breach when renewables are offline.

As the last player to enter the market, during power shortages, gas becomes the overall price setter. In case you have not noticed, gas prices have quadrupled because the exporters — many of which are the electricity generators — have sold more gas to offshore customers than their reserves. So, they pillaged local supplies, sending domestic gas prices through the roof.


ABC (Aus)

This sentence is 100% disingenuous:

And despite the frothing anger from the shock jocks, it has nothing to do with the rise of renewables.

The “rise of renewables” may not be the direct cause of Australia’s high electricity prices. However, the transition from coal to [fill in the blank] is the direct cause.

Unlike the US, Australia doesn’t produce enough natural gas to export large volumes of LNG without driving up domestic prices.   Australian LNG landed in Japan is actually 40% cheaper than natural gas produced and sold in Australia.

Australia’s coal-fired plants are shutting down and being replaced by unreliable wind & solar and expensive natural gas plants. Claims that wind & solar are now less expensive than coal per MWh are wholly irrelevant. Solar and wind have to be 1/4 to 1/2 the price of coal per MWh to compete. And, even then, they can’t actually provide base load, because they are non-dispatchable.

Even if we assume this is accurate, neither storage nor back-up is factored into the cost of new power generation.

Let’s assume that they can purchase battery backup for $140/kWh and that the Li-ion cells last 10 years. Cycled once per day for ten years, $140/kWh works out to about $38.50 (US) per MWh of generation.

Over the twenty year lifespan of the wind and solar power plants, the batteries would have to be replaced once.  That brings the storage cost up to $77 (US) per MWh of generation.  Convert to AUD and it’s $99/MWh.  Tack that on to the LCOE:

Energy supply                                        Cost of energy (AUD/MWh)    w/ Storage (AUD/MWh)
Solar                                                $177                        $239
Wind                                                 $160                        $217
Ultra supercritical coal (so-called “clean coal”)    $134                        $203
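The storage arithmetic can be sketched in a few lines.  The $140/kWh price and 10-year cell life are from the text above; the once-per-day full cycle and the 1.28 USD-to-AUD exchange rate are my assumptions (the post uses A$99/MWh, implying a slightly different rate):

```python
# Amortize battery storage cost over delivered energy.
# The $140/kWh price and 10-year life are from the text; the daily full
# cycle and the 1.28 USD->AUD rate are assumptions for this sketch.

battery_cost_usd_per_kwh = 140.0
cell_life_years = 10
cycles = 365 * cell_life_years             # one full cycle per day

usd_per_mwh = battery_cost_usd_per_kwh / cycles * 1000   # ~$38.4/MWh

# Over a 20-year wind/solar plant life, the cells are replaced once,
# doubling the per-MWh storage cost:
usd_per_mwh_20yr = usd_per_mwh * 2         # ~$77/MWh

aud_per_mwh = usd_per_mwh_20yr * 1.28      # ~A$98/MWh at the assumed rate
print(round(usd_per_mwh, 1), round(usd_per_mwh_20yr), round(aud_per_mwh))
```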

Wind & solar only appear to be affordable…

Until you add in the cost of storage…

(OCGT = open cycle gas turbine, a common type of “peaker” generator.)

Even with Australia’s high natural gas prices, combined cycle natural gas and coal are much cheaper than wind & solar, if you factor in the storage costs.

Australia’s electricity rates have skyrocketed because they didn’t maintain their base load capacity (mostly coal-fired generation). Perry’s resiliency pricing plan would prevent this from happening here.  While allowing natural gas to kill coal & nuclear would be very good for my industry, in the long run, it wouldn’t be good for grid resiliency, electricity consumers or our nation’s energy security.

“Depths, Times, and Velocities – Facts or Opinions?”

October 17, 2017


  • Wells are drilled in depth -truth
  • Seismic data is a function of two-way travel time – truth
  • Velocity in a highly complex earth – opinion
  • Consequences
  • Four major challenges


Stupid Poll

October 3, 2017


Seventy-two percent of Americans believe climate change is happening, including 85 percent of Democrats and 61 percent of Republicans. Nineteen percent remain unsure.

What was the percentage of respondents who could define the word “climate” and the phrase “climate change”?

Political party and belief in climate change are the main determinants of whether people are willing to pay a modest fee to combat climate change, as opposed to education, income, or geographic location. Democrats are consistently willing to pay more than Republicans.

That’s because Democrats never spend their own money.

Fifty-seven percent support actions taken by some mayors and governors to honor the goals of the Paris climate agreement despite U.S. withdrawal, and 55 percent think their state and local government should do more to address climate change. A third say they should stick to the status quo.

55% of respondents would also probably say that they think their state and local government should do more to address plate tectonics and entropy too.

Climate change and energy policy are very or extremely important to 48 percent and 54 percent of Americans, respectively, while at least two-thirds say health care, the economy, and terrorism are important policy priorities.

But, what do they fear the most?



Thirty-five percent oppose the direction of energy policy in the United States, while 45 percent lack an opinion and only 17 percent support the direction. Republicans are more likely than Democrats to favor the direction of energy policy, but they are most likely to lack an opinion.

45% of respondents were smart enough to know that they don’t have a fracking clue about energy.

Roughly equal shares of Americans favor, oppose, and neither favor nor oppose the construction of the Keystone and Dakota Access pipelines.

Who cares?

Forty percent of Americans oppose the repeal of the Clean Power Plan, which the Trump administration is reviewing. Thirty-seven percent lack an opinion, while just 20 percent favor its repeal.

How many of the 77% who oppose or have no opinion on the CPP are in favor of their lights actually coming on when they flip a switch and would prefer not to have skyrocketing electricity rates?

More Americans lack an opinion on the use of fracking in the United States than support it: 37 percent neither favor nor oppose fracking, 17 percent favor it, and 41 percent oppose it.

Are 41% of Americans really that stupid?

An equal number of Americans either support or lack an opinion on the withdrawal from the Paris Agreement, while the largest number opposes withdrawal: 42 percent oppose it, 28 percent support it, and 28 percent neither support nor oppose withdrawal. Half of those who support withdrawal say the agreement was too costly for the United States.

Ratification of a treaty requires a two-thirds majority in the Senate.  42% support for a treaty falls a bit short.  I wonder how much overlap there was between the 41% who oppose fracking and the 42% who opposed 86’ing the Paris deal?


October 3, 2017
Social Cost of CO2, 2015-2050 (in 2007 dollars per metric ton CO2)

Year    5% Average    3% Average    2.5% Average    High Impact (95th pct at 3%)    US per capita tCO2/yr
2015    $11           $36           $56             $105                            16
2020    $12           $42           $62             $123                            16
2025    $14           $46           $68             $138                            16
2030    $16           $50           $73             $152                            16
2035    $18           $55           $78             $168                            16
2040    $21           $60           $84             $183                            16
2045    $23           $64           $89             $197                            16
2050    $26           $69           $95             $212                            16

Per capita cost per month:

Per capita cost per month (2007 US dollars)

Year    5% Average    3% Average    2.5% Average    High Impact (95th pct at 3%)
2015    $15           $48           $75             $140
2020    $16           $56           $83             $164
2025    $19           $61           $91             $184
2030    $21           $67           $97             $203
2035    $24           $73           $104            $224
2040    $28           $80           $112            $244
2045    $31           $85           $119            $263
2050    $35           $92           $127            $283

Monthly cost to family of four:

Monthly cost to a family of 4 (5% discount rate)

2015    $58.67
2020    $64.00
2025    $74.67
2030    $85.33
2035    $96.00
2040    $112.00
2045    $122.67
2050    $138.67
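The monthly figures above follow from the SC-CO2 table and the roughly 16 metric tons of CO2 emitted per American per year; a minimal sketch of the conversion:

```python
# Convert a social cost of carbon ($/metric ton CO2) into a monthly
# per-person or per-household cost, using ~16 tCO2 per American per year
# (the per capita figure from the table above).

PER_CAPITA_TCO2_PER_YEAR = 16

def monthly_cost(scc_usd_per_ton, people=1):
    """Monthly cost = SCC * annual per capita emissions / 12 months * people."""
    return scc_usd_per_ton * PER_CAPITA_TCO2_PER_YEAR / 12 * people

# 2015 at a 5% discount rate: SCC = $11/ton
print(round(monthly_cost(11), 2))     # per capita: ~$14.67/month
print(round(monthly_cost(11, 4), 2))  # family of four: ~$58.67/month
```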


“Tesla is not a car, battery, or tech company; it is an experimental financial services company and should be regulated as such.”

October 3, 2017


Thousands USD

        Tesla                                    Amazon
Year    Op. Cash Flow    Long-Term Debt          Op. Cash Flow    Long-Term Debt
2014    ($57,337)        $1,818,785              $6,842,000       $8,265,000
2015    ($524,499)       $2,021,093              $11,920,000      $8,227,000
2016    ($123,829)       $5,969,500              $16,443,000      $7,694,000

Great Barrier Reef: 2016 Coral Cover Loss and Local Sea Level Fall

October 2, 2017



Climate change is internationally-recognised as one of the biggest threats to coral reefs around the world, including the Great Barrier Reef. For the last three years, coral bleaching, due to ocean warming associated with climate change, has impacted coral reefs worldwide. Mass coral bleaching events occur during extended periods of elevated sea surface temperatures and have the potential to result in significant and widespread loss of coral.

The current mass coral bleaching occurring in tropical regions across the world since 2014 is the longest mass bleaching event ever recorded. This is a global event triggered by record-breaking sea surface temperatures caused by climate change and amplified in 2016 by a strong El Niño. The ocean is warmer than at any time since the instrumental record began. For Australia’s Great Barrier Reef, this resulted in the worst ever coral bleaching in 2016.


Final report: 2016 coral bleaching event on the Great Barrier Reef

There’s no doubt that the 2016 coral bleaching event and associated coral mortality was the worst recorded since at least 1980.  However, no one was really paying much attention to coral bleaching before the early 1980s.

Insufficient sea temperature data exist from the Great Barrier Reef to indicate changes in the long-term means over recent times. However, an analysis of air temperature records from Townsville shows that mean January February air temperatures above 29°C occurred 6 times between 1980 and 1995, 5 of which coincided with bleaching events at nearby Magnetic Island. Prior to 1980 however, these conditions had occurred only 4 times in the 53 years since 1927, all occurring in the 1930s (Jones 1995; Jones et al. in press).

Hoegh-Guldberg, 1999

“This is a global event triggered by record-breaking sea surface temperatures caused by climate change and amplified in 2016 by a strong El Niño.”

Was it “triggered by record-breaking sea surface temperatures caused by climate change and amplified in 2016 by a strong El Niño”?  Or was it caused by a strong El Niño and amplified by a sharp local fall in sea level from 2014-2016?

2016 was the GBR’s second warmest year “on record”…


Figure 1. Annual sea surface temperature anomaly Great Barrier Reef (1900-2016). Australian Bureau of Meteorology/NOAA Extended Reconstructed Sea Surface Temperature Version 4 (ERSST v4). LINK

The SST data for the GBR are derived from NOAA’s ERSST v4… (for whatever that’s worth).

2016 was the second hottest “year on record” for the GBR… However, bleaching occurs in summer.


Figure 2. Summer SST for the GBR… About 0.5°C above average. LINK

At this point, I asked myself: “Self?  Why are they using NOAA’s ERSST v4?  Aren’t there any weather stations in or around the Great Barrier Reef?”

There are actually quite a few weather stations.


Figure 3. AIMS Weather Stations. LINK

Agincourt Reef #3 has fairly complete weather records going back to 1991 and it experienced “high” mortality rates… But it hasn’t participated in Gorebal Warming since at least 1993.


Figure 4. Agincourt Reef #3.


Figure 5. Agincourt Reef #3 air and water temperature record, spurious data points removed by author. LINK

Original data…

Figure 6. Agincourt Reef #3, with spurious data points included.

Air temperature only…


Figure 7. Agincourt Reef #3, air temperature, spurious data points removed by author.

I have only looked at Agincourt Reef #3 in detail.  However, my cursory review of Thursday Island, Lizard Island, Cape Bowling Green and Square Rocks didn’t support any significant warming over recent years in the GBR.

So… What is anomalous about 2014-2016?  A really strong El Niño and a sudden, rather sharp local fall in sea level.

Sea level fall and reef mortality

It’s a well-known fact that coral reefs react poorly to falling sea level… Subaerial exposure of the reef is generally fatal.  However, even a sea level fall on the order of 0.5 meters can literally shut reefs down.

A re-examination of 46 recently published U/Th reef flat ages from Heron and One Tree reefs in the southern Great Barrier Reef (GBR) identified several distinct Holocene reef growth phases with a clear 2.3-kyr hiatus in lateral reef accretion from 3.9 ka to 1.5 ka. An analysis of all available published radiocarbon reef flat ages (165) from 27 other mid-outer platform reefs revealed a similar hiatus between 3.6 ka and 1.6 ka for the northern, south-central and southern GBR. However, no hiatus in reef flat growth was observed in reefs from the central GBR with ages ranging from 7.6 ka to 0.9 ka. Increased upwelling, turbidity and cyclone activity in response to increased sea-surface temperature (SST’s), precipitation and El-Nino Southern Oscillation variability have been ruled out as possible mechanisms of reef turn-off for the mid-outer platform reefs. Rather, a fall (~ 0.5 m) in relative sea level at 4–3.5 ka is the most likely explanation for why reefs in the northern and southern regions turned off during this time.

Successive phases of Holocene reef flat development: Evidence from the mid- to outer Great Barrier Reef (PDF).  Accessed Oct 2, 2017.  LINK

Jim Steele has written about the effects of sea level fall on coral reefs in two excellent articles here and here.

Correlation of 2016 GBR coral cover loss to 2014-2016 sea level fall


Figure 8. Change in GBR coral cover from early to late 2016.

The coral cover loss was most severe in the Lizard Island Transect Area, just north of Agincourt Reef #3.  Using the color bar scale in Figure 8, I assigned a numerical value to each color, ranging from 1 (very low) to 8 (high), and calculated the change in rating from early to late 2016.  I then posted sea level profiles from the University of Colorado’s Interactive Sea Level Time Series Wizard along a north-south transect.
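The rating arithmetic can be sketched in a few lines of Python; the color keys below are placeholders standing in for the eight Figure 8 legend categories, not their actual names:

```python
# Hypothetical mapping of the eight Figure 8 color-bar categories to
# numerical ratings, 1 (very low) through 8 (high).
ratings = {f"color_{i}": i for i in range(1, 9)}

def rating_change(early_color, late_color):
    """Change in coral cover rating from early to late 2016 (negative = loss)."""
    return ratings[late_color] - ratings[early_color]

# Example: a reef that dropped from the top category to the third-lowest
print(rating_change("color_8", "color_3"))  # -5
```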


Figure 9. Coral cover loss and local sea surface height.  Note that the areas with most severe coral cover loss in 2016 experienced a sharp (~0.2 m) drop in sea surface height from 2014-2016.

To quantify the severity of the 2014-2016 sea level fall, I calculated the SSH slope since 2012 at each point along the transect.

                                2016 Change in Coral Cover Rating
Lat, Long   Slope Since 2012   Lagoon   Reef Front   Average
-11, 144        -0.8114          -5        -1          -3
-12, 144        -0.6106          -5        -1          -3
-13, 144        -1.6792          -3        -3          -3
-14, 145        -1.7470          -5        -3          -4
-15, 146        -1.2204          -2        -4          -3     (Lizard Island)
-16, 146        -0.6637           0        -2          -1     (Agincourt Reef #3)
-17, 147         0.0762           0        -2          -1
-18, 147         0.5102          -1        -3          -2
-19, 148         0.5055          -1        -3          -2
-19, 149         0.4738          -1         0          -0.5
-20, 150         0.1360           0         0           0
-21, 152         0.0594           0         0           0
-22, 152         0.4375           0         0           0
-23, 152         0.6094           0         0           0
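The “Slope Since 2012” values are just ordinary least-squares trends through each point’s sea surface height series. A minimal numpy sketch, using made-up SSH values rather than the actual University of Colorado data:

```python
import numpy as np

def ssh_slope(years, ssh, start_year=2012.0):
    """OLS slope (SSH units per year) fit to the samples from start_year onward."""
    years = np.asarray(years, dtype=float)
    ssh = np.asarray(ssh, dtype=float)
    mask = years >= start_year
    slope, _intercept = np.polyfit(years[mask], ssh[mask], 1)
    return slope

# Illustrative series: flat through 2012, then falling 0.1 m/yr
yrs = np.arange(2010.0, 2017.0, 0.5)
heights = np.where(yrs < 2012.0, 0.0, -(yrs - 2012.0) * 0.1)
print(round(ssh_slope(yrs, heights), 3))  # -0.1
```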

Figure 10.  Average, 2016 Δ Coral Cover vs SSH Slope.


Figure 11. Reef front, 2016 Δ Coral Cover vs SSH Slope.



Figure 12. Lagoon, 2016 Δ Coral Cover vs SSH Slope.


It is highly likely that a localized sea level fall played a greater role in the severity of the 2016 coral bleaching and mortality than Gorebal Warming did.

Gavin’s Twitter Trick

September 25, 2017

This is the nucleus of an upcoming post for WUWT.

Last week, Larry Kummer posted a very thoughtful article:

A climate science milestone: a successful 10-year forecast!

At first glance, this did look like “a successful 10-year forecast”:


Figure 1. A successful 10-year forecast?

The observations track closer to the model ensemble mean (P50) than most other models, and the 2016 El Niño spikes at least a little bit above P50.  Noticing that this was a CMIP3 model, Larry and others asked whether CMIP5 (the current generation of the Coupled Model Intercomparison Project) yielded the same results, to which Dr. Schmidt replied:

Figure 2. A failed 10-year forecast.

The CMIP5 model looks a lot like the CMIP3 model… But the observations bounce between the bottom of the 95% band (P97.5) and just below P50… Then spike to P50 during the 2016 El Niño.  When asked about the “estimate of effect of misspecified forcings,” Dr. Schmidt replied:

Basically, the model would look better if it was adjusted to match what actually happened.

The only major difference between the CMIP3 and CMIP5 model outputs was the lower boundary of the 95% band (P97.5).

Figure 3. Improving accuracy by increasing imprecision.

CMIP5 yielded a range of 0.4°C to 1.0°C in 2016, with a P50 of about 0.7°C. CMIP3 yielded a range of 0.2°C to 1.0°C in 2016, with a P50 of about 0.6°C.

They essentially went from 0.7 ±0.3°C to 0.6 ±0.4°C.

Progress shouldn’t consist of expanding the uncertainty… unless they are admitting that the uncertainty of the models has increased.

Larry then asked Dr. Schmidt about this:

Dr. Schmidt’s answer:

“Not sure”?  That instills confidence.  He seems to be saying that the CMIP5 model (the one that failed) may have had “more coherent forcing across the ensemble, more realistic ENSO variability, greater # of simulations.”

I’m not a Twitterer, but I do have a rarely used Twitter account, and I just couldn’t resist joining the conversation.  Within the thread, there was a discussion of Hansen et al., 1988, and Dr. Schmidt seemed to be defending that model as successful because the 2016 El Niño spiked the observations to “business-as-usual.”

Figure 4.  Hansen et al., 1988 is still an epic fail.  The monster El Niño of 2016 is not “business-as-usual.”

I asked the following question:

No answer.  Dr. Schmidt is a very busy person and probably doesn’t have much time for Twitter and blogging.  So, I don’t really expect an answer.

In his post, Larry Kummer also mentioned a model by Zeke Hausfather, posted on Carbon Brief…


Figure 5.  Another failed climate model.  The 2016 El Niño is not P50 weather.

El Niño events like 1998 and 2016 are not high probability events.  On the HadCRUT4 plot below, I have labeled several probability bands:

Standard Deviation   Probability Band   % Samples w/ Higher Values
+2σ                  P02.5                2.5%
+1σ                  P16                 15.9%
Mean                 P50                 50.0%
-1σ                  P84                 84.1%
-2σ                  P97.5               97.5%
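Under a normal-distribution assumption, these exceedance probabilities follow directly from the standard normal CDF. Note that only about 16% of samples exceed +1σ (32% is the fraction outside ±1σ, counting both tails). A quick check in Python:

```python
import math

def exceedance(z):
    """P(X > mean + z*sigma) for a normally distributed variable."""
    return 0.5 * math.erfc(z / math.sqrt(2.0))

for z in (2, 1, 0, -1, -2):
    print(f"{z:+d} sigma: {100 * exceedance(z):5.1f}% of samples higher")
```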

I am assuming that HadCRUT4 is reasonably accurate and not totally from the Adjustocene.  I removed the linear trend, calculated the mean (P50) and one and two standard deviations (1σ & 2σ), then added the linear trend back in to get the following:



Figure 6.  HadCRUT4 (Wood for Trees) with probability bands.
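The detrend-and-band procedure described above is easy to sketch with numpy; the input series here is synthetic (a linear trend plus noise), not the actual HadCRUT4 data:

```python
import numpy as np

def trend_bands(t, temps):
    """Fit a linear trend, take the standard deviation of the detrended
    residuals, and return the trend (P50) with 1-sigma and 2-sigma bands."""
    t = np.asarray(t, dtype=float)
    temps = np.asarray(temps, dtype=float)
    slope, intercept = np.polyfit(t, temps, 1)
    trend = slope * t + intercept
    sigma = float(np.std(temps - trend))
    return {
        "P50": trend,                  # the trend line itself
        "P16": trend + sigma,          # ~16% of samples above this
        "P84": trend - sigma,          # ~84% of samples above this
        "P02.5": trend + 2.0 * sigma,  # top of the ~95% band
        "P97.5": trend - 2.0 * sigma,  # bottom of the ~95% band
        "sigma": sigma,
    }

# Synthetic example: 0.01 deg/yr warming trend plus 0.1 deg noise
rng = np.random.default_rng(0)
t = np.arange(1850, 2017)
temps = 0.01 * (t - 1850) + rng.normal(0.0, 0.1, t.size)
bands = trend_bands(t, temps)
print(round(bands["sigma"], 2))
```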

The 1998 El Niño spiked to P02.5.  The 2016 El Niño spiked pretty close to P0.01.  A strong El Niño should spike from P50 toward P02.5.

All of the models fail in this regard.  Even the Mears-ized RSS satellite data exhibit the same relationship to the CMIP5 models as the surface data do.

The RSS comparison was initialized to 1979-1984.  The 1998 El Niño spiked above P02.5.  The 2016 El Niño only spiked to just above P50… Just like the Schmidt and Hausfather models.  The Schmidt model was initialized in 2000.

This flurry of claims that the models don’t “run hot” because the 2016 El Niño pushed the observations toward P50 is being driven by an inconvenient paper recently published in Nature Geoscience (discussed here, here and here).

Factcheck: Climate models have not ‘exaggerated’ global warming

A new study published in the Nature Geosciences journal this week by largely UK-based climate scientists has led to claims in the media that climate models are “wrong” and have significantly overestimated the observed warming of the planet.

Here Carbon Brief shows why such claims are a misrepresentation of the paper’s main results.

Carbon Brief, 21 September 2017

All (95%) of the models run hot, including “Gavin’s Twitter Trick”.  From Hansen et al., 1988 to CMIP5 in 2017, the 2016 El Niño spikes toward the model ensemble mean (P50)… Despite the fact that it was an extremely low probability weather event (<2.5%).

Shrinking Climate Sensitivity

September 24, 2017


Updated climate sensitivity estimates

September 23, 2017

If you download any of the Mann 2008 reconstructions, you get a time series that does not differentiate the proxy from the instrumental data.

To see an example of “Mike’s Nature Trick,” go here… Proxy-based reconstructions of hemispheric and global surface temperature variations over the past two millennia

Click this… EIV Temperature Reconstructions

Open up any of the **cru_eiv_composite.csv or **had_eiv_composite.csv files. All of them splice the high-frequency instrumental data into the low-frequency proxy data. To Mann’s credit, unlike his previous “tricks,” he at least documents this one well enough to sort it out in the SI.
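The effect of that kind of splice is easy to demonstrate with a toy series: graft a full-variance “instrumental” segment onto a heavily smoothed “proxy” series, and the year-to-year variability jumps at the splice point. All data below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy "proxy" reconstruction: noise smoothed to leave only low-frequency signal
proxy = np.convolve(rng.normal(0.0, 0.2, 1000), np.ones(40) / 40.0, mode="same")

# Toy "instrumental" record: the full annual-scale variance survives
instrumental = rng.normal(0.0, 0.2, 150)

# Splice the instrumental segment over the last 150 years of the composite
composite = proxy.copy()
composite[-150:] = instrumental

# Year-to-year variability before vs. after the splice point
proxy_var = np.std(np.diff(composite[:-150]))
spliced_var = np.std(np.diff(composite[-150:]))
print(spliced_var / proxy_var > 3.0)  # the spliced segment is far noisier
```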

This statement from their PNAS paper is totally unsupported by proxy reconstructions… “Recent warmth appears anomalous for at least the past 1,300 years whether or not tree-ring data are used. If tree-ring data are used, the conclusion can be extended to at least the past 1,700 years.”

The anomalous nature of the “recent warmth” is entirely dependent on the “tricky” use of the instrumental data. He didn’t use any proxy data post-1855.

This image from Mann’s 2008 paper falsely implies that all of the reconstructions are in general agreement regarding the claim that the “recent warmth appears anomalous for at least the past 1,300 years”…

Because the image is cluttered with many reconstructions and the instrumental record is plastered onto the end of the graph, it’s impossible to see any details.

Here are Mann (Had_EIV), Moberg and Ljungqvist without the clutter…

Zoomed in on post-1800…

And Mike’s Nature Trick…

The Modern Warming appears anomalous because of the higher resolution of the instrumental record, its position at the tail end of the time series, and the negative deflection of the Little Ice Age trough (ca. 1600 AD)…

If the Modern Warming is directly compared to the Medieval Warm Period, it appears to be far less anomalous, despite the better resolution of the instrumental record…

Particularly if you clutter the image with multiple reconstructions…

The Modern Warming might be 0.2-0.4°C warmer than the Medieval Warm Period. This would be consistent with a climate sensitivity of 0.5-1.0°C per doubling of the pre-industrial atmospheric CO2 level (as opposed to ~3.5°C), although the difference between the Modern Warming and the MWP is well within the margins of error of the proxy and instrumental reconstructions and could easily be explained by the higher resolution of the instrumental record.
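A back-of-envelope check of that arithmetic, assuming ~280 ppm pre-industrial and ~400 ppm modern CO2 (values at the time of writing) and attributing all of the excess warming over the MWP to CO2, lands in the same ballpark:

```python
import math

# CO2 forcing scales with the logarithm of concentration, so count doublings
doublings = math.log2(400.0 / 280.0)  # ~0.51 doublings of CO2 so far

# Implied sensitivity if the Modern Warming exceeds the MWP by 0.2-0.4 deg C
for excess in (0.2, 0.4):
    print(f"{excess}°C excess -> ~{excess / doublings:.1f}°C per doubling")
```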