Alarmists Gone Wild: Greenland losing 400 one-cubic-kilometer ice cubes per year!!!

May 1, 2017



The thaw is happening far faster than once expected. Over the past three decades the area of sea ice in the Arctic has fallen by more than half and its volume has plummeted by three-quarters (see map). SWIPA estimates that the Arctic will be free of sea ice in the summer by 2040. Scientists previously suggested this would not occur until 2070. The thickness of ice in the central Arctic ocean declined by 65% between 1975 and 2012; record lows in the maximum extent of Arctic sea ice occurred in March.


The most worrying changes are happening in Greenland, which lost an average of 375bn tonnes of ice per year between 2011 and 2014—almost twice the rate at which it disappeared between 2003 and 2008 (see chart). This is the equivalent of over 400 massive icebergs measuring 1km on each side disappearing each year. The shrinkage is all the more perturbing because its dynamics are not well understood. Working out what is going on in, around and underneath a supposedly frigid ice sheet is crucial to understanding how it will respond to further warming and the implications of its demise for rising global sea levels (see article).


The Economist

375 billion tonnes per year… Oh my!

400 massive icebergs measuring 1km on each side disappearing each year… Oh no!!!

Wait a second… Those sound like big numbers… But how big are they compared to the Greenland ice sheet?

The USGS says that the volume of the Greenland ice sheet was 2,600,000 km3  at the beginning of the 21st century.

According to the “ice sheet goeth” graph, since 2001 Greenland has lost about 3,600 gigatonnes of ice, or about 3,840 km3… That equates to a 16 km x 16 km x 16 km cube of ice (∛3,840 ≈ 15.66).  That’s YUGE!  Right? Not really.

It’s not even a tiny nick when spread out over roughly 1.7 million square kilometers of ice surface.  That works out to a sheet of ice a bit over 2 meters thick… Not even a rounding error compared to the average thickness of the Greenland ice sheet.

  • 2,600,000 km3 / 1,700,000 km2 = 1.53 km

The average thickness of the Greenland ice sheet is approximately 1.5 km (1,500 meters).  About 2 meters is roughly 0.15% of that average thickness.
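The arithmetic above can be reproduced in a few lines. This is a sketch using only the figures quoted in the article; The Economist’s 375 Gt ≈ 400 km³ ratio stands in for an explicit ice-density assumption:

```python
# Back-of-envelope check of the Greenland ice-loss numbers, using only
# figures quoted in the article.
loss_gt = 3600                       # Gt lost since 2001 ("ice sheet goeth" graph)
loss_km3 = loss_gt * 400 / 375       # The Economist's ratio: 375 Gt ~ 400 km^3
cube_edge_km = loss_km3 ** (1 / 3)   # edge of one equivalent ice cube

area_km2 = 1_700_000                 # approximate ice-sheet surface area
thickness_m = loss_km3 / area_km2 * 1000       # loss spread over the whole sheet

avg_thickness_m = 2_600_000 / area_km2 * 1000  # total volume / area
pct_of_avg = thickness_m / avg_thickness_m * 100

print(f"{loss_km3:.0f} km^3 ~ a cube {cube_edge_km:.2f} km on a side")
print(f"spread out: {thickness_m:.2f} m thick, {pct_of_avg:.2f}% of the average")
```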


Isopach map of the Greenland ice sheet. The first contour inside the white area represents an ice thickness of 1,000 meters. Source: Eric Gaba (Wikimedia Commons user Sting) [CC BY-SA 3.0], via Wikimedia Commons

The thickness of the Greenland ice sheet is truly apparent on this radar cross section:


A really cool radar cross section of the Greenland ice sheet.   Source:


From a thickness perspective, 2 meters looks like this:


The top panel is zoomed in on the box in the lower panel.  Each square on the graph paper image represents 5 vertical meters.  


Using The Economist’s ratio of 400 km3 to 375 gigatonnes, 2,600,000 km3 works out to 2,437,500 gigatonnes.  When some actual perspective is applied, it is obvious that “the ice sheet goeth” nowhere:


The ice sheet goeth nowhere.

Despite all of the warming since the end of Neoglaciation, the Greenland ice sheet still retains more than 99% of its 1900 AD ice mass.
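A quick sketch of where that “more than 99%” figure comes from, using only the article’s own numbers. Pre-2001 losses are assumed small by comparison; this is a sketch, not a mass-balance reconstruction:

```python
# Total sheet mass from the article's conversion: 2,600,000 km^3 at
# The Economist's 375 Gt ~ 400 km^3 ratio.
total_gt = 2_600_000 * 375 / 400     # = 2,437,500 Gt
lost_gt = 3600                       # Gt lost since 2001
retained_pct = (total_gt - lost_gt) / total_gt * 100
print(f"total: {total_gt:,.0f} Gt; retained: {retained_pct:.2f}%")
```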



Can the U.S. Become the Saudi Arabia of Natural Gas?

April 28, 2017


The Department of Energy gave a Texas-based energy company permission Tuesday to export liquefied natural gas (LNG) to countries with which the U.S. does not have free trade agreements.

Golden Pass Products will build an LNG export terminal capable of shipping 2.21 billion cubic feet per day (Bcf/d) of natural gas around the world. It’s the first LNG export terminal approved by the Trump administration, adding to the already 19.2 Bcf/d of exports approved by the Obama administration.

The export facility will create an estimated 45,000 direct and indirect jobs over the next five years, according to Golden Pass. The company estimates the construction and operation of the facility will generate up to $3.6 billion in federal and state tax revenues.

The Trump administration said the terminal’s approval would help make the U.S. a “dominant” energy force in the world.

“This announcement is another example of President Trump’s leadership in making the United States an energy dominant force,” Energy Secretary Rick Perry said in a press statement. “This is not only good for our economy and American jobs but also assists other countries with their energy security.”

U.S. energy ascendancy will have political implications in Europe where about half the continent’s natural gas supply comes from state-owned Russian companies. Foreign policy experts see U.S. gas exports as a way to undermine Russia’s energy dominance in the region.


U.S. consumers would deal with minimal costs to export LNG and it would lead to huge economic benefits, according to a study published in December 2015 by the DOE. The study found exporting American LNG would provide huge environmental benefits as well. The report states exporting LNG will help “address a variety of environmental concerns in the power‐generation sector.”

Exporting natural gas is likely to be a growth industry, as global demand for natural gas is expected to be 50 percent higher by 2035 than it is now, according to the International Energy Agency. Demand for imports of LNG increased 27 percent in the United Kingdom last year alone.



The shale revolution enabled U.S. natural gas production to surge out of a 30-year doldrum:



Figure 1.  U.S. natural gas production (Bcf).  Source: EIA


The surge in production has been accompanied by a surge in exports:


Figure 2. U.S. Natural gas exports (mmcf).  Source: EIA

LNG exports are expected to be the driving force in the U.S. natural gas business over the next 30 years.

EIA: LNG exports expected to drive growth in U.S. natural gas trade


WASHINGTON, DC — The United States is expected to become a net exporter of natural gas on an average annual basis by 2018, according to the recently released Annual Energy Outlook 2017 (AEO2017) Reference case. The transition to net exporter is driven by declining pipeline imports, growing pipeline exports, and increasing exports of liquefied natural gas (LNG). In most AEO2017 cases, the United States is also projected to become a net exporter of total energy in the 2020s in large part because of increasing natural gas exports.


The growth of natural gas exports, especially from new LNG terminals, sustains continued growth in U.S. natural gas production. In the Reference case, natural gas production is projected to grow through 2020 at about the same rate (3.6% annual average) as it has since 2005, when production of natural gas from shale formations began to grow rapidly. After 2020, natural gas production grows at a lower rate (1.0% annual average) in the Reference case as net export growth moderates, energy efficiencies increase, and natural gas prices slowly rise.

Natural gas production and trade vary with different assumptions for resources and technology, macroeconomic growth, and world oil prices. In the High Oil and Gas Resource and Technology case, larger natural gas resource estimates and improved drilling technology lead to higher domestic natural gas production, lower U.S. natural gas prices, and therefore, greater natural gas exports. Most of the increase in natural gas trade is from LNG exports, which grow to 8.4 Tcf (23 Bcf/d) in 2040.

However, LNG exports are highest in a case with high world oil prices. In the High Oil Price case, when consumers move away from petroleum products as other energy sources become economically favorable, global LNG demand increases and U.S. LNG exports reach 9.2 Tcf, or 25 Bcf/d. Compared with other LNG suppliers, U.S. LNG has the advantage of domestic spot prices that are less sensitive to global oil prices.

Conversely, in a scenario with more pessimistic assumptions for oil and gas resources and technology or a scenario with low world oil prices, LNG exports still increase, but remain below Reference case levels through 2040.

World Oil

The EIA’s base case projection would make the U.S. a dominant player in the global LNG market.


Figure 3.  EIA forecast of U.S. natural gas exports.  7 Tcf/yr = 19.2 Bcf/d          Source: EIA via World Oil

  • 19.2 Bcf/d = 3.4 mmBOE/d
  • Canada exported  3.2 million bbl/d of crude oil in 2013.

The EIA reference case would make the U.S. the “Canada” of natural gas exports.  It would also make our natural gas exports comparable to Russia’s at ~7 Tcf/yr.
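The Bcf/d-to-mmBOE/d conversions in the bullets above imply a gas-to-oil-equivalence factor of roughly 5.65 Mcf per BOE (the conventional figure is closer to 5.8 Mcf/BOE). A minimal sketch, assuming that implied factor:

```python
def bcfd_to_mmboed(bcf_per_day: float, mcf_per_boe: float = 5.65) -> float:
    """Convert a gas rate in Bcf/d to million barrels of oil equivalent per day.

    5.65 Mcf/BOE is the factor implied by this post's own conversions;
    the conventional figure is closer to 5.8 Mcf/BOE.
    """
    # 1 Bcf = 1,000,000 Mcf, so Bcf/d divided by Mcf/BOE gives mmBOE/d directly
    return bcf_per_day / mcf_per_boe

print(f"{bcfd_to_mmboed(19.2):.1f} mmBOE/d")  # EIA base case -> 3.4
print(f"{bcfd_to_mmboed(26):.1f} mmBOE/d")    # High Oil Price case -> 4.6
```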


Figure 4.  EIA forecast of U.S. natural gas production and consumption.           Source: EIA via World Oil


Figure 5.  EIA forecast range of U.S. LNG exports.   Source: EIA via World Oil

  • 26 Bcf/d = 4.6 mmBOE/d
  • Russia exported  4.9 million bbl/d of crude oil in 2013.

The “High Oil Price” scenario would make the U.S. the “Russia” of natural gas exports.

U.S. exports of LNG are ramping up rapidly.

100th LNG Cargo Shipped from Sabine Pass Liquefaction Facility

Cheniere Energy announced today the 100th cargo of liquefied natural gas left the company’s Sabine Pass liquefaction facility on Saturday, April 1st, 2017. Including the 100th cargo, Cheniere has delivered cargoes to 18 countries on five continents since the first shipment on February 24, 2016.  “This milestone for Cheniere is a testament to the global demand for American LNG, the hard work and dedication of Cheniere’s workforce, and our unique business model that enables customers large and small to access this fuel,” said Jack Fusco, Cheniere’s President and CEO. “Our entire workforce shares in this milestone and in Cheniere’s future success.”

In February 2016, Cheniere became the first company to ship LNG from the contiguous United States in over 50 years.


LNG Global

By the end of 2018, the U.S. will be a net exporter of natural gas:

COMMODITIES | Wed Mar 29, 2017 | 6:38am EDT

After six decades, U.S. set to turn natgas exporter amid LNG boom

By Scott DiSavino

The last time the United States was a net exporter of natural gas was in 1957, when Dwight Eisenhower was president. That should change in 2018 when the country is expected to become the world’s third-largest exporter of liquefied natural gas (LNG).

By the end of next year, U.S. LNG export capacity in the lower 48 states will top 6 billion cubic feet per day (bcfd), or 8 percent of the country’s domestic consumption, up from zero at the beginning of 2016. Six bcfd of gas can fuel about 30 million U.S. homes, or almost every house in California, Texas and Florida combined.

That growth in U.S. LNG exports is set to transform world energy markets. Just a decade ago, before the shale revolution, the United States was expected to become a growing LNG importer, not an exporter, likely dependent on Russian, Middle East and North African gas, much as it has for decades depended on foreign crude.



There are currently two LNG export terminals operating in the U.S.  Kenai, AK (ConocoPhillips) has been in operation since 1969; it has a capacity of 0.2 Bcf/d.  Sabine Pass, LA (Cheniere) has only been in operation for about a year; it has a capacity of 1.4 Bcf/d.


Figure 6. Existing LNG terminals.  Source: FERC



Figure 7.  Approved LNG export terminals.  Source: FERC



Figure 8.  Proposed LNG terminals.  Source: FERC



Figure 9.  U.S. natural gas production, imports, exports and LNG export capacity.  Source: EIA and FERC

  • 16,111 Bcf/yr = 44 Bcf/d
  • 44 Bcf/d = 7.8 mmBOE/d
  • Saudi Arabia exported  7.4 million bbl/d of crude oil in 2013.
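The capacity arithmetic behind those bullets, as a sketch (the 5.65 Mcf/BOE factor is the one implied by the post’s own conversions, not an official figure):

```python
capacity_bcf_yr = 16_111        # approved + proposed LNG export capacity (Bcf/yr)
bcf_d = capacity_bcf_yr / 365   # -> ~44 Bcf/d
mmboe_d = bcf_d / 5.65          # Mcf/BOE factor implied by the post's figures
print(f"{bcf_d:.0f} Bcf/d = {mmboe_d:.1f} mmBOE/d "
      f"(Saudi crude exports, 2013: ~7.4 million bbl/d)")
```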

If all of the approved and proposed LNG export terminals are built and operate at full capacity, the U.S. would become the “Saudi Arabia” of natural gas exports.

Can we “get there from here”?  Is there enough natural gas in the ground for the U.S. to become the “Saudi Arabia” of natural gas?


Reserves and Resources

To answer that question, we have to get a handle on the size of the resource base.

Proved Reserves

Proved reserves are those quantities of petroleum which, by analysis of geological and engineering data, can be estimated with reasonable certainty to be commercially recoverable, from a given date forward, from known reservoirs and under current economic conditions, operating methods, and government regulations.

If deterministic methods are used, the term reasonable certainty is intended to express a high degree of confidence that the quantities will be recovered. If probabilistic methods are used, there should be at least a 90% probability that the quantities actually recovered will equal or exceed the estimate.


Proved reserves are often referred to as “P90” or “1P”:  there is a 90% probability that at least this much gas will be produced.  People often make the mistake of thinking that proved reserves are a fixed number, which will only go down with production.  In fact, proved reserves represent the minimum volume that is expected to be produced.  Proved reserves can move up and down simply due to changes in product prices.

Probable Reserves

Probable reserves are those unproved reserves which analysis of geological and engineering data suggests are more likely than not to be recoverable. In this context, when probabilistic methods are used, there should be at least a 50% probability that the quantities actually recovered will equal or exceed the sum of estimated proved plus probable reserves.

In general, probable reserves may include (1) reserves anticipated to be proved by normal step-out drilling where sub-surface control is inadequate to classify these reserves as proved, (2) reserves in formations that appear to be productive based on well log characteristics but lack core data or definitive tests and which are not analogous to producing or proved reservoirs in the area, (3) incremental reserves attributable to infill drilling that could have been classified as proved if closer statutory spacing had been approved at the time of the estimate, (4) reserves attributable to improved recovery methods that have been established by repeated commercially successful applications when (a) a project or pilot is planned but not in operation and (b) rock, fluid, and reservoir characteristics appear favorable for commercial application, (5) reserves in an area of the formation that appears to be separated from the proved area by faulting and the geologic interpretation indicates the subject area is structurally higher than the proved area, (6) reserves attributable to a future workover, treatment, re-treatment, change of equipment, or other mechanical procedures, where such procedure has not been proved successful in wells which exhibit similar behavior in analogous reservoirs, and (7) incremental reserves in proved reservoirs where an alternative interpretation of performance or volumetric data indicates more reserves than can be classified as proved.


Probable reserves are often referred to as “P50” or “2P”:  there is a 50% probability that at least this much gas will be produced.  “2P” is also used as shorthand for “proved plus probable.”  Probable reserves are one of the mechanisms by which proved reserves can grow without additional drilling.  2P reflects the most likely volume that will be produced.

Possible Reserves

Possible reserves are those unproved reserves which analysis of geological and engineering data suggests are less likely to be recoverable than probable reserves. In this context, when probabilistic methods are used, there should be at least a 10% probability that the quantities actually recovered will equal or exceed the sum of estimated proved plus probable plus possible reserves.


Technically these are resources, not reserves.  Referred to as “P10” or “3P,” this represents the maximum volume that will be produced.  3P generally refers to “proved plus probable plus possible.”
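The P90/P50/P10 convention is easiest to see with a toy example. The sketch below takes percentiles of a hypothetical lognormal recoverable-volume distribution; the median and spread are invented for illustration and do not describe any actual field:

```python
from math import exp, log
from statistics import NormalDist

# Hypothetical lognormal distribution of recoverable gas for one prospect.
median_bcf = 100    # assumed P50 (median recoverable volume), Bcf
sigma = 0.6         # assumed standard deviation in log space

dist = NormalDist(mu=log(median_bcf), sigma=sigma)

# P90 = the volume with a 90% chance of being equaled or exceeded,
# i.e. the 10th percentile of the distribution; P50 and P10 follow suit.
p90 = exp(dist.inv_cdf(0.10))
p50 = exp(dist.inv_cdf(0.50))
p10 = exp(dist.inv_cdf(0.90))

print(f"P90 = {p90:.0f} Bcf <= P50 = {p50:.0f} Bcf <= P10 = {p10:.0f} Bcf")
```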

SEC regulations only require publicly traded oil & gas companies to report proved reserves, although probable reserves may also be reported for valuation purposes.

The U.S. Energy Information Administration compiles proved reserve data, which can be accessed through their website.

In order to become the “Saudi Arabia” of natural gas, the U.S. would need to produce about 70% more gas than it currently does.  Do we have the gas resource base to “get there from here”?

The answer is: Yes.


Figure 10.  U.S. natural gas proved reserves, proved reserves – production and production (Bcf).  Source: EIA


Figure 11.  U.S. natural gas proved reserves in years of production.  Source: EIA

Proved reserves are only a fraction of the resource base.


Figure 12. U.S. natural gas resources.  Source: NGSA


P50 reserves (proved plus probable, or “2P”) are nearly three times P90 proved reserves.


Figure 13.  U.S. natural gas proved & probable reserves and resources (Bcf). Source: EIA and NGSA


Figure  14.  U.S. natural gas proved & probable reserves and possible resources in years of production.  Source: EIA and NGSA


What Stands in the Way of the U.S. Becoming the Saudi Arabia of Natural Gas?  Natural Gas Prices

Low natural gas prices in the U.S. are the primary reason that proved natural gas reserves declined in 2015.


Figure 15.  U.S. natural gas wellhead price ($/mcf).  Source: EIA



Figure 16.  “Breakeven” prices ($/mcf) for major U.S. shale gas plays.

With U.S. natural gas prices currently around $3.30/mcf, typical shale gas wells are not breaking even, much less yielding a decent return.

To yield a 10% unleveraged return, the shale plays require gas prices of $4.50 to $5.50/mcf.

For natural gas, the dry Marcellus would require a NYMEX gas price of at least $4.55 to generate a 10% unleveraged return, said KLR. The region’s low capital intensity is partly offset by a lower gas price realization.

KLR NYMEX Gas Breakevens

The Fayetteville requires a NYMEX-normalized natural gas price of about $5.50 per Mcfe to generate a 10% return, making it the most expensive natural gas play among those examined by KLR. The Fayetteville was the only basin in last week’s Baker Hughes rig count to show zero activity as low prices continue to make the region uneconomical for new production.

Oil 360

While low U.S. natural gas prices are currently a drag on production and reserve growth, they also provide an advantage to domestic gas producers.  U.S. natural gas is extremely competitive in the global market.

JAN 31, 2016

The U.S. and Australian Race to Export Liquefied Natural Gas

Jude Clemente , CONTRIBUTOR
I cover oil, gas, power, LNG markets, linking to human development

Free market economies Australia and the U.S. will be in competition for the export of Liquified Natural Gas (LNG). Since 2010, Australia’s gas demand has increased 10%, but its gas production has increased 35%, compared to an 8% increase for use and 38% gain in production for the U.S. Per BP data, Australia and the U.S. have netted 75% of the 260 Tcf gain in proven global gas reserves since 2005.

In fact, through 2020, the two countries are expected to account for 90% or more of new LNG exports. Overall, the global LNG market is set to increase by 50% between 2015 and 2020, nearly 20 Bcf/day. This year alone will see a 2.6 Bcf/day increase in LNG supply.

Australia could add six new LNG export terminals by 2020, tripling its liquefaction capacity to over 13 Bcf/day. Although Cheniere Energy’s U.S. LNG export facility at Sabine Pass, the first of its kind in the continental U.S., was delayed until late-February or so, the country could be exporting 10 Bcf/day by 2020, almost equaling current global leader Qatar.


This year’s expansion of the Panama Canal will up competition in the U.S. to ship LNG to Asia, where over 70% of the world’s LNG is consumed. The U.S. has lower production costs and lower capital costs for new infrastructure, namely liquefaction facilities. Bolstered by the “shale revolution,” for instance, the more difficult Gulf of Mexico now produces just 5% of U.S. natural gas, versus over 25% 20 years ago.

 This is in contrast to the expensive offshore gas projects in Australia, now responsible for over 50% of all floating liquefaction capacity under construction. Over 90% of Australia’s traditional gas resources reside in the harder-to-develop North West Shelf offshore.
Escalating labor costs have been a key factor in Australia’s drastic LNG cost overruns. In Australia, oil and gas workers can make $165,000, 30-35% more than in the U.S. and double the world’s average. One Harvard expert finds that “Australian LNG seems to be the worst business case globally,” with costs running 2-3 times higher than in the U.S. (see here).


Daniel Yergin just said that the Saudis “will not destroy the US shale industry…It takes $10bn and five to ten years to launch a deep-water project. It takes $10m and just 20 days to drill for shale.” U.S. gas production is rising by 1.5% per year, three times faster than consumption (projections here).

Thus, U.S. gas prices will remain lower than in other markets, and arbitrage opportunities for companies to ship LNG will remain. North America’s gas prices are mostly set at liquid trading hubs, more linked to supply and demand fundamentals.

The key importing nations are not expected to be producing much more gas, so the internationally traded market will increase its current share of 30% of total gas consumed, closer to the 60% of oil demand that is traded internationally. Making gas more of a global commodity like oil, LNG now accounts for about 33% of all traded gas and 10-12% of total gas demand. The LNG market is just another example of the obvious: the world continues to become more connected, not less.






U.S. gas producers can undercut the price of the competition.  Outside of North America, LNG is currently trading at $5-6/mcf, 50-60% above the breakeven price of the shale plays.


Figure 17. Global LNG prices ($/mmBtu).  Source: FERC

Natural gas demand is rising, particularly in non-OECD nations:


Figure 18. Projected world natural gas consumption (Tcf). Source: EIA


With most export markets currently paying more than $5.50/mmBtu, global natural gas demand on the rise and most of the rest of the world paralyzed by an irrational fear of fracking (no arguments about the spelling, please), the U.S. could easily become the “Canada” of natural gas and clearly has the potential to become the Saudi Arabia of natural gas.

U.S. LNG Export Potential

  • Proposed LNG capacity: 7,799,490 BOE/d
  • EIA High Oil Price case: 4,594,200 BOE/d
  • EIA Base case: 3,388,767 BOE/d


Top 10 Crude Oil Exporters (bbl/d)

1 SAUDI ARABIA 7,416,000 2013 EST.
2 RUSSIA 4,888,000 2013 EST.
3 CANADA 3,210,000 2015 EST.
4 UNITED ARAB EMIRATES 2,637,000 2013 EST.
5 IRAQ 2,462,000 2013 EST.
6 NIGERIA 2,231,000 2013 EST.
7 ANGOLA 1,745,000 2013 EST.
8 KUWAIT 1,711,000 2013 EST.
9 VENEZUELA 1,548,000 2013 EST.
10 KAZAKHSTAN 1,466,000 2013 EST.

Source: CIA World Fact Book


Further Reading

EIA Annual Energy Outlook 2017

Featured Image

Cheniere Energy


Mammoth Steppe

April 28, 2017


April 25, 2017


The push to save U.S. nuclear plants for the sake of fighting climate change is threatening support for the bread and butter of clean power: wind and solar.

New York and Illinois have already approved as much as $10 billion in subsidies to keep struggling reactors open for the next decade as part of a plan to limit fossil fuel consumption. Lawmakers in Ohio, Connecticut and New Jersey are debating whether to do the same.

The reactors, which are being squeezed by low natural gas prices, offer a singular advantage in the fight against global warming because they produce round-the-clock electricity without emitting greenhouse gases. Yet renewable energy operators including NRG Energy Inc. and Invenergy LLC say keeping nuclear plants open will leave grids awash with excess power, leaving little demand for new wind and solar farms.

“It’s the wrong policy — and whether it proliferates or not is going to be a really big factor,” Invenergy Chief Operating Officer Jim Murphy said during a panel discussion at the Bloomberg New Energy Finance conference in New York Monday.



“Renewable energy operators say keeping nuclear plants open will leave grids awash with excess power, leaving little demand for new wind and solar farms.”


Keeping the “grids awash with excess power” is the only way to handle bellwether events without having to rely on brownouts and blackouts.  Solar and wind can provide neither base-load power nor a flexible response to bellwether events.  Increasing reliance on these resources makes it imperative that we keep the “grids awash with excess power.”

Getting back to the Bloomberg article, there appears to be a lot of whining about subsidies for nuclear power… With the renewables crowd doing all of the whining:


Nuclear’s economic woes come as wind and solar are starting to show they’re cheap enough to compete with traditional generators, after years of help from subsidies. The push to aid reactors began last year after Exelon Corp. successfully argued in New York and Illinois that since nuclear does not contribute to global warming, its plants should receive a premium to help level the playing field with wind and solar.

“The fossil generators sell electricity with air pollution,” Joseph Dominguez, an Exelon executive vice president, said in an interview. “We sell electricity without air pollution — and that’s a different product.”

There are key differences between wind and solar subsidies and those for nuclear, according to clean-energy developers. Renewable energy credits have spurred an emerging industry, whereas nuclear subsidies are to preserve aging plants. And while wind and solar developers compete against each other for subsidies, those for nuclear benefit a single technology.

Market Rules

“The renewables industry has been playing by competitive market rules that have helped to produce good prices,” Amy Francetic, an Invenergy senior vice president, said in an interview. “This is picking winners and losers in a way that’s troubling.”


Nuclear power absolutely is the leader of the pack at reducing so-called “greenhouse” gas emissions:


Figure 1. Nuclear and gas kick @$$, wind breaks even and solar is a loser.


“The renewables industry has been playing by competitive market rules that have helped to produce good prices,” Amy Francetic, an Invenergy senior vice president, said in an interview. “This is picking winners and losers in a way that’s troubling.”

Really?  Ms. Francetic, *government* always picks “winners and losers in a way that’s troubling.”

As far as the renewables industry “playing by competitive market rules that have helped to produce good prices”…


Figure 2. Ms. Francetic, Data is laughing at you.

The most recent U.S. Energy Information Administration report on energy subsidies reveals the following:


Solar and wind power are insignificant sources of energy.


Figure 3a. U.S. Energy production by source 2010 & 2013 (trillion Btu), U.S. Energy Information Administration.


Figure 3b. U.S. primary energy production 1981-2015 (million tonnes of oil equivalent), BP 2016 Statistical Review of World Energy.


Solar and wind power receive massive Federal subsidies.


Figure 4. Federal subsidies by energy source 2010 and 2013 (million 2013 US dollars), U.S. Energy Information Administration.


The solar and wind subsidies are truly massive in $/Btu.


Figure 5. Subsidies per unit of energy by source ($/mmBtu), U.S. Energy Information Administration.


The true folly of solar power is most apparent in subsidies per kilowatt-hour of electricity generation.  At 23¢/kWh, the solar subsidies in 2013 were nearly twice the average U.S. residential retail electricity rate.


Figure 6. Subsidies per kilowatt-hour of electricity generation, U.S. Energy Information Administration.


Solar and wind subsidies are weighted toward direct expenditures of tax dollars.


Figure 7. Subsidies by type for wind, solar, nuclear, coal and natural gas & petroleum liquids, U.S. Energy Information Administration.  Table ES2.


Federal solar and wind subsidies were 3-4 times that of nuclear power in 2013.  Only 2% of the nuclear power subsidies consisted of direct expenditures, compared to 72% and 56% for solar and wind power respectively… And the renewables industry has the gall to complain about New York and Illinois kicking in $500 million and $235 million per year in extra subsidies to keep nuclear power plants running in their states.  Really?

Most of the Federal subsidies for oil & gas (96%), coal (71%) and nuclear power (67%) consist of tax breaks.  The subsidies for oil & gas aren’t really even subsidies.  These are standard tax deductions and depreciation of assets.

Solar power simply can’t work without massive subsidies.  While the economics of wind power are improving, renewables are still extremely expensive relative to existing coal and nuclear power plants.








“Hard Lessons From the Great Algae Biofuel Bubble”

April 20, 2017

Guest post by David Middleton


From 2005 to 2012, dozens of companies managed to extract hundreds of millions in cash from VCs in hopes of ultimately extracting fuel oil from algae.

CEOs, entrepreneurs and investors were making huge claims about the promise of algae-based biofuels; the U.S. Department of Energy was also making big bets through its bioenergy technologies office; industry advocates claimed that commercial algae fuels were within near-term reach.

Jim Lane of Biofuels Digest authored what was possibly history’s least accurate market forecast, projecting that algal biofuel capacity would reach 1 billion gallons by 2014. In 2009, Solazyme promised competitively priced fuel from algae by 2012. Algenol planned to make 100 million gallons of ethanol annually in Mexico’s Sonoran Desert by the end of 2009 and 1 billion gallons by the end of 2012 at a production rate of 10,000 gallons per acre. PetroSun looked to develop an algae farm network of 1,100 acres of saltwater ponds that could produce 4.4 million gallons of algal oil and 110 million pounds of biomass per year.

Nothing close to 1 billion (or even 1 million) gallons has yet been achieved — nor has competitive pricing.


The promise of algae is tantalizing. Some algal species contain up to 40 percent lipids by weight, a figure that could be boosted further through selective breeding and genetic modification. That basic lipid can be converted into diesel, synthetic petroleum, butanol or industrial chemicals.

According to some sources, an acre of algae could yield 5,000 to 10,000 gallons of oil a year, making algae far more productive than soy (50 gallons per acre), rapeseed (110 to 145 gallons), jatropha (175 gallons), palm (650 gallons), or cellulosic ethanol from poplars (2,700 gallons).


Green Tech Media

“VC” refers to venture capitalists.  I had to look it up because I didn’t think the Viet Cong were still in business.

The problem with algal biofuel is this:

According to some sources, an acre of algae could yield 5,000 to 10,000 gallons of oil a year, making algae far more productive than… 

10,000 gallons is 238 barrels per acre.  A typical oil well in the Gulf of Mexico yields 300-500 barrels per acre-foot and a typical reservoir is 50-100′ thick.  This works out to 15,000 to 50,000 barrels per acre over the life of the well.  Assuming the well produces for 10 years, this works out to 1,500 to 5,000 barrels per acre per year.

Gallons of Oil per Acre per Year
  • Algae: 5,000 – 10,000
  • Typical GOM oil field: 63,000 – 210,000
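The comparison above can be sketched in a few lines, with every input taken from the article’s own figures:

```python
GAL_PER_BBL = 42

# Algae: 5,000-10,000 gal/acre/yr, per the article
algae_bbl_lo, algae_bbl_hi = 5000 / GAL_PER_BBL, 10000 / GAL_PER_BBL

# Typical GOM well: 300-500 bbl per acre-foot, 50-100 ft of pay, ~10-year life
bbl_acre_lo, bbl_acre_hi = 300 * 50, 500 * 100   # 15,000 - 50,000 bbl/acre
gal_yr_lo = bbl_acre_lo / 10 * GAL_PER_BBL       # gal/acre/yr, low end
gal_yr_hi = bbl_acre_hi / 10 * GAL_PER_BBL       # gal/acre/yr, high end

print(f"algae:    {algae_bbl_lo:.0f}-{algae_bbl_hi:.0f} bbl/acre/yr")
print(f"GOM well: {gal_yr_lo:,.0f}-{gal_yr_hi:,.0f} gal/acre/yr")
```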

Granted, there are a lot of differences between crude oil and algal oil… And, hypothetically, the acre of algae is “renewable”… However, 1 acre of algae requires 1 acre of land.  An oil well only requires the acreage that its production facility covers.  Oil reservoirs can cover hundreds or thousands of acres, can be well over 100′ thick and often occur in stacked sequences.

Shell’s Mars oil field (Mississippi Canyon 807) has produced about 1.3 billion barrels of oil and 1.7 trillion cubic feet of natural gas since 1996.  This works out to about 1.6 billion barrels of oil equivalent (BOE).  The “footprint” of the field (platform + outline of directional wells) covers about 11,000 acres.  The field has averaged over 6,700 BOE (over 280,000 gallons) per acre per year from 1996-2016.

Gallons of Oil per Acre per Year      Max
Algae                               10,000
Mars Oil Field                     282,360

After 20 full years of production, Mars is still going strong.  In 2016, it produced over 6,000 BOE per acre.
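The Mars per-acre figure can be reproduced from the numbers quoted above; this sketch assumes a 21-year window (1996-2016 inclusive) and the ~11,000-acre footprint:

```python
GALLONS_PER_BARREL = 42

cum_boe = 1.6e9           # ~1.6 billion BOE cumulative production
footprint_acres = 11_000  # platform plus outline of directional wells
years = 2016 - 1996 + 1   # 21 producing years, inclusive

boe_per_acre_yr = cum_boe / footprint_acres / years
gal_per_acre_yr = boe_per_acre_yr * GALLONS_PER_BARREL
# About 6,900 BOE (roughly 290,000 gallons) per acre per year,
# consistent with the "over 6,700 BOE / over 280,000 gallons" claim.
print(round(boe_per_acre_yr), round(gal_per_acre_yr))
```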

It’s refreshing to see that some of the green energy herd is capable of learning lessons… But I doubt they learned this lesson.

Clean Coal: Carbon Capture and Enhanced Oil Recovery

April 18, 2017



WASHINGTON, D.C. — Secretary of Energy Rick Perry took part in a ribbon-cutting ceremony today to mark the opening of Petra Nova, the world’s largest post-combustion carbon capture project, which was completed on-schedule and on-budget. The large-scale demonstration project, located at the W.A. Parish power plant in Thompsons, Texas, is a joint venture between NRG Energy (NRG) and JX Nippon Oil & Gas Exploration Corporation (JX).

“I commend all those who contributed to this major achievement,” said Secretary Perry. “While the Petra Nova project will certainly benefit Texas, it also demonstrates that clean coal technologies can have a meaningful and positive impact on the Nation’s energy security and economic growth.”

Funded in part by the U.S. Department of Energy (DOE) and originally conceived as a 60-megawatt electric (MWe) capture project, the project sponsors expanded the design to capture emissions from 240 MWe of generation at the Houston-area power plant, quadrupling the size of the capture project without additional federal investment. During performance testing, the system demonstrated a carbon capture rate of more than 90 percent.

At its current level of operation, Petra Nova will capture more than 5,000 tons of carbon dioxide (CO2) per day, which will be used for enhanced oil recovery (EOR) at the West Ranch Oil Field. The project is expected to boost production at West Ranch from 500 barrels per day to approximately 15,000 barrels per day. It is estimated that the field holds 60 million barrels of oil recoverable from EOR operations.

The successful commencement of Petra Nova operations also represents an important step in advancing the commercialization of technologies that capture CO2 from the flue gas of existing power plants. Its success could become the model for future coal-fired power generation facilities. The addition of CO2 capture capability to the existing fleet of power plants could support CO2 pipeline infrastructure development and drive domestic EOR opportunities.

U.S. Department of Energy

The Petra Nova carbon capture system was installed at the W.A. Parish generation station.  This is the largest and cleanest fossil fuel generation station in the United States:

W.A. Parish Electric Generation Station, Thompson, Texas

Owner/operator: Texas Genco Holdings Inc.

Texas Genco has invested heavily in upgrading its W.A. Parish coal- and gas-fired plant southwest of Houston. Although this nine-unit, 3,653-MW plant is the largest fossil-fueled plant in America, its NOx emissions have been reduced to microscopic levels. Based on those levels, W.A. Parish could rightly claim that it is among the cleanest coal plants in the U.S.

Texas Genco’s W.A. Parish Electric Generation Station (WAP) is the largest coal- and gas-fired power facility in the U.S. based on total net generating capacity. It and its owner, Texas Genco Holdings Inc., operate in the Electric Reliability Council of Texas (ERCOT), one of the largest electric power markets in the nation. Over the past few years, the majority-owned subsidiary of Houston-based CenterPoint Energy Inc. has met the challenge of adding emissions control equipment to these baseload units while maintaining the availability and reliability required by ERCOT’s competitive market.

In the process, Texas Genco has emerged as an industry leader at reducing emissions and demonstrating new NOx-control technologies. The company’s fleet of plants operates at one of the lowest NOx emission rates in the country, and WAP likely emits less NOx on a lb/MMBtu basis than any coal-fired plant of any size in the U.S. Cleanliness is costly; the company has spent more than $700 million on new emission controls since 1999.

With the commissioning of another round of emissions-control equipment this year, NOx emissions from Texas Genco’s Houston-area power plants—including WAP—will be 88% lower than 1998 levels. These actions play a major role in the Houston/Galveston Area Ozone State Implementation Plan and are helping to clean the air in the greater Houston area. To honor the accomplishment, the W.A. Parish plant was recently given the Facility Award by the Power Industry Division of the Instrumentation, Systems, and Automation Society (Research Triangle Park, N.C.) for installing equipment to reduce emissions and improve reliability while minimizing operational costs.



The W.A. Parish Generation Station has a generating capacity of about 3,660 MW (2,740 MW of coal and 1,190 MW of natural gas capacity).  Its total capacity is approximately the same as the ten largest solar PV plants in the U.S. combined (3,713 MW).  From 2002-2009, W.A. Parish operated at 85% of capacity.  The war on coal gradually reduced its operations to 57% of capacity in 2016.

The Petra Nova carbon capture system will enable the plant to capture about 90% of the CO2 from 240 MW of its coal capacity.  It is expected to capture about 1.6 million tons of CO2 per year.  The cost of the carbon capture system was approximately $1 billion, with the taxpayers picking up 19% of the tab.  Normally, I would call this a pointless waste of money.  It won’t have any effect on atmospheric CO2 or the weather.  However, this carbon capture system actually serves a useful purpose:
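The capture figures quoted above hang together: 5,000 tons per day would be about 1.8 million tons at full uptime, so 1.6 million tons per year implies the system runs a bit under 90% of the time. A quick sketch (the uptime figure is my inference, not an NRG number):

```python
tons_per_day = 5_000    # quoted daily capture rate
annual_quote = 1.6e6    # quoted annual capture, tons
cost = 1.0e9            # ~$1 billion capture system
taxpayer_share = 0.19   # federal share of the tab

max_annual = tons_per_day * 365     # 1,825,000 tons at full uptime
implied_uptime = annual_quote / max_annual
federal_dollars = cost * taxpayer_share  # ~$190 million
print(max_annual, round(implied_uptime, 2), federal_dollars)
```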


NRG Petra Nova Fact Sheet

The Captured CO2 will employ Enhanced Oil Recovery to enhance production at the West Ranch oil field, which is operated by Hilcorp Energy Company. It is expected that oil production will be boosted from around 300 barrels per day today to up to 15,000 barrels per day while also sequestering CO2 underground. This field is currently estimated to hold approximately 60 million barrels of oil recoverable from EOR operations



The West Ranch oil field has produced about 390 million barrels of oil since 1938. CO2 injection will boost production from 300 to as much as 15,000 barrels of oil per day.  The EOR could lead to the recovery of 60 million barrels of oil that would otherwise be “left in the ground.”  Irony is such a beautiful thing!

And the really cool thing about this project: It makes money!


NRG’s Petra Nova Plant Captures Carbon, Boosts Bottom Line

An interview with David Greeson, Vice President of Development, NRG Energy Inc.

by Brian Wellborn

NRG Energy Inc. (NRG) and JX Nippon Oil & Gas Exploration jointly operate the Petra Nova Carbon Capture project, the world’s largest retrofit post-combustion carbon capture system, at the W.A. Parish Generating Station southwest of Houston.

Fiscal Notes recently spoke with NRG Vice President of Development David Greeson to discuss the Petra Nova project and learn what makes its capture system unique, environmentally sound and profitable.

Fiscal Notes: What are Petra Nova’s broad environmental goals?

David Greeson: The goal of the Petra Nova project is to capture more than 90 percent of the carbon dioxide (CO2) in the exhaust flue gas from an existing coal-fired unit at the W.A. Parish power plant. We want to prove it’s feasible to build a carbon capture system on schedule and on budget. Demonstrating the system working at full commercial scale will provide a path forward to address CO2 emissions from existing coal-fired plants, both in the U.S. and around the world.

In addition, we’re looking to create a commercial structure that couples power generation with oil recovery for potential long-term viability — not only to pay for the carbon capture and storage system but also to provide an economic return for investors.


Fiscal Notes: How economically viable is Petra Nova’s carbon capture process?

Greeson: As long as oil is priced at around $50 per barrel or above, sales of the oil from the West Ranch field will pay for the Petra Nova project.



The price of CO2 for EOR projects is generally pegged to the price of oil.  At >$50/bbl, the sale of the CO2 to Hilcorp will pay for the carbon capture system.  Projects like this do not need subsidies.

This will enable the coal-fired plants to operate at a higher capacity and prevent 60 million barrels of oil from becoming “stranded assets.”  I just love irony!

The Good, the Bad and the Null Hypothesis

April 4, 2017

Dr Norman Page May 26, 2017 at 6:22 am

My main reason for showing the Akasofu curve was to show an example of an interpretation which at least honors the 60 year cycle which clearly exists over the last century.

The roughly 60-yr cycle has been present throughout the Holocene.  It’s the peak at 16 cycles per 1,000 years (a ~62-yr period) on the GISP2 power spectrum:

The dominant ~1,000-yr cycle is Akasofu’s “Recovery from Little Ice Age.”

When debating the merits of the CAGW (catastrophic anthropogenic global warming) hypothesis, I often encounter this sort of straw man fallacy:

All that stuff is a distraction. Disprove the science of the greenhouse effect. Win a nobel prize get a million bucks. Forget the models and look at the facts. Global temperatures are year after year reaching record temperatures. Or do you want to deny that.


This is akin to arguing that one would have to disprove convection in order to falsify plate tectonics or genetics in order to falsify evolution.  Plate tectonics and evolution are extremely robust scientific theories which rely on a combination of empirical and correlative evidence.  Neither theory can be directly tested through controlled experimentation.  However, both theories have been tested through decades of observations.  Subsequent observations have largely conformed to these theories.

Note: I will not engage in debates about the validity of the scientific theories of plate tectonics or evolution.

The power of such scientific theories is demonstrated through their predictive skill: Theories are predictive of subsequent observations.  This is why a robust scientific theory is even more powerful than facts (AKA observations).

CAGW is a similar type of hypothesis.  It relies on empirical science (the “good”) and correlative “science” (the “bad”).

The Good

Carbon dioxide is a so-called “greenhouse” gas.  It retards radiative cooling.  All other factors held equal, increasing the atmospheric concentration of CO2 will lead to a somewhat higher atmospheric temperature.  However, all other things are never held equal.


Figure 1. “Greenhouse” gas spectra.

Atmospheric CO2 has risen since the 19th century.


Figure 2. Atmospheric CO2 from instrumental records, Antarctic ice cores and plant stomata.

Humans are responsible for at least half of this rise in atmospheric CO2.


Figure 3. Natural sources probably account for ~50% of the rise in atmospheric CO2 since 1750.

While anthropogenic sources are a tiny fraction of the total, we are removing carbon from geologic sequestration and returning it to the active carbon cycle.

The average temperature of Earth’s surface and troposphere has generally risen over the past 150 years.


Figure 5. Surface temperature anomalies: BEST (land only), HadCRUT4 & GISTEMP. Satellite lower troposphere: UAH & RSS.

The Bad

The modern warming began long before the recent rise in atmospheric CO2 and, prior to the 19th century, temperature and CO2 were decoupled:


Figure 6. Temperature reconstruction (Moberg et al., 2005) and Law Dome CO2 (MacFarling Meure et al., 2006)

The recent rise in temperature is no more anomalous than the Medieval Warm Period or the Little Ice Age:


Figure 7. Temperature reconstruction (Ljungqvist, 2010), northern hemisphere instrumental temperature (HadCRUT4) and Law Dome CO2 (MacFarling Meure et al., 2006). Temperatures are 30-yr averages to reflect changing climatology.

Over the past 2,000 years, the average temperature of the Northern Hemisphere has exceeded natural variability (defined as two standard deviations from the pre-1865 mean) three times: 1) at the peak of the Medieval Warm Period, 2) at the nadir of the Little Ice Age, and 3) since 1998.  Human activities clearly were not the cause of the first two deviations.  70% of the warming since the early 1600s clearly falls within the range of natural variability.
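The natural-variability test described here — flag years that fall outside two standard deviations of the pre-1865 mean — is straightforward to implement. The series below is synthetic, standing in for a proxy reconstruction; it only illustrates the method:

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1, 2001)
# Synthetic stand-in for a NH temperature reconstruction (anomalies, deg C),
# with a crude millennial cycle plus noise.
temps = 0.2 * np.sin(2 * np.pi * years / 1000) + rng.normal(0, 0.1, years.size)

# Natural variability: +/- 2 sigma about the pre-1865 mean.
pre = temps[years < 1865]
mu, sigma = pre.mean(), pre.std()
lo, hi = mu - 2 * sigma, mu + 2 * sigma

outside = (temps < lo) | (temps > hi)
print(f"band [{lo:.2f}, {hi:.2f}] deg C; {outside.sum()} of 2000 years outside")
```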

While it is possible that the current warm period is about 0.2 °C warmer than the peak of the Medieval Warm Period, this could be due to the differing resolutions of the proxy reconstruction and instrumental data:


Figure 8. The instrumental data demonstrate higher frequency and higher amplitude temperature variations than the proxy reconstructions.

The amplitude of the reconstructed temperature variability on centennial time-scales exceeds 0.6°C. This reconstruction is the first to show a distinct Roman Warm Period c. AD 1-300, reaching up to the 1961-1990 mean temperature level, followed by the Dark Age Cold Period c. AD 300-800. The Medieval Warm Period is seen c. AD 800–1300 and the Little Ice Age is clearly visible c. AD 1300-1900, followed by a rapid temperature increase in the twentieth century. The highest average temperatures in the reconstruction are encountered in the mid to late tenth century and the lowest in the late seventeenth century. Decadal mean temperatures seem to have reached or exceeded the 1961-1990 mean temperature level during substantial parts of the Roman Warm Period and the Medieval Warm Period. The temperature of the last two decades, however, is possibly higher than during any previous time in the past two millennia, although this is only seen in the instrumental temperature data and not in the multi-proxy reconstruction itself.


The proxy reconstruction itself does not show such an unprecedented warming but we must consider that only a few records used in the reconstruction extend into the 1990s. Nevertheless, a very cautious interpretation of the level of warmth since AD 1990 compared to that of the peak warming during the Roman Warm Period and the Medieval Warm Period is strongly suggested.


The amplitude of the temperature variability on multi-decadal to centennial time-scales reconstructed here should presumably be considered to be the minimum of the true variability on those time-scales.


Ljungqvist, 2010


Figure 9. Ljungqvist demonstrates that the modern warming has not unambiguously exceeded the range of natural variability. The bold black dashed line is the instrumental record. I added the red lines to highlight the margin of error.

The climate of the Holocene has been characterized by a roughly millennial cycle of warming and cooling (for those who don’t like the word “cycle,” pretend that I typed “quasi-periodic fluctuation”):


Figure 10. Millennial cycle apparent on Ljungqvist reconstruction.


Figure 11. Millennial scale cycle apparent on Moberg reconstruction.

These cycles (quasi-periodic fluctuations) even have names:


Figure 12. Late Holocene climate cycles (quasi-periodic fluctuations).

These cycles have been long recognized by Quaternary geologists:


Figure 12. The millennial scale climate cycle can clearly be traced back to the end of the Holocene Climatic Optimum and the onset of the Neoglaciation.

Fourier analysis of the GISP2 ice core clearly demonstrates that the millennial scale climate cycle is the dominant signal in the Holocene (Davis & Bohling, 2001).


Figure 13. The Holocene climate has been dominated by a millennial scale climate cycle.

The industrial era climate has not changed in any manner inconsistent with the well-established natural millennial scale cycle. Assuming that the ice core CO2 is reliable, the modern rise in CO2 has had little, if any effect on climate.

The Null Hypothesis

What is a ‘Null Hypothesis’

A null hypothesis is a type of hypothesis used in statistics that proposes that no statistical significance exists in a set of given observations. The null hypothesis attempts to show that no variation exists between variables or that a single variable is no different than its mean. It is presumed to be true until statistical evidence nullifies it for an alternative hypothesis.

Investopedia

Since it is impossible to run a controlled experiment on Earth’s climate (there is no control planet), the only way to “test” the CAGW hypothesis is through models.  If the CAGW hypothesis is valid, the models should demonstrate predictive skill.  The models have utterly failed:


Figure 14. “95% of Climate Models Agree: The Observations Must be Wrong.”


Figure 15. “Climate models versus climate reality.” Michaels & Knappenberger.

The models have failed because they result in a climate sensitivity that is 2-3 times that supported by observations:


Figure 15. Equilibrium climate sensitivity: Reality vs. Models.

From Hansen et al. 1988 through every IPCC assessment report, the observed temperatures have consistently tracked the strong mitigation scenarios in which the rise in atmospheric CO2 has been slowed and/or halted.

Apart from the strong El Niño events of 1998 and 2015-16, GISTEMP has tracked Scenario C, in which CO2 levels stopped rising in 2000, holding at 368 ppm.


Figure 16. Hansen’s 1988 model and GISTEMP.

The utter failure of this model is most apparent on the more climate-relevant 5-yr running mean:


Figure 17. Hansen’s 1988 model and GISTEMP, 5-yr running mean.

This is from IPCC’s First Assessment Report:


Figure 18.  IPCC First Assessment Report (FAR).  Model vs. HadCRUT4.

HadCRUT4 has tracked below Scenario D.


Figure 19. IPCC FAR scenarios.

This is from the IPCC’s Third Assessment Report (TAR):


Figure 20. IPCC TAR model vs. HadCRUT4.

HadCRUT4 has tracked the strong mitigation scenarios, despite a general lack of mitigation.

The climate models have never demonstrated any predictive skill.

And the models aren’t getting better. Even when they start the model run in 2006, the observed temperatures consistently track at or below the low end of the 5-95% range (P05-P95).  Observed temperatures only approach the model median (P50) in 2006, 2015 and 2016.


Figure 21.  Climate Lab Book. Comparing CMIP5 & observations.

The ensemble consists of 138 model runs using a range of representative concentration pathways (RCP), from a worst case scenario RCP 8.5, often referred to as “business as usual,” to varying grades of mitigation scenarios (RCP 2.6, 4.5 and 6.0).


Figure 22. Figure 21 with individual model runs displayed.


When we drill wells, we run probability distributions to estimate the oil and gas reserves we will add if the well is successful.  The model inputs consist of a range of estimates of reservoir thickness, area and petrophysical characteristics.  The model output consists of a probability distribution from P10 to P90.

  • P10 = Maximum Case.  There is a 10% probability that the well will produce at least this much oil and/or gas.
  • P50 = Median Case.  There is a 50% probability that the well will produce at least this much oil and/or gas.  Probable reserves are >P50.
  • P90 = Minimum Case.  There is a 90% probability that the well will produce at least this much oil and/or gas.  Proved reserves are P90.

Over time, a drilling program should track near P50.  If your drilling results track close to P10 or P90, your model input is seriously flawed.
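The reserves workflow described above can be sketched as a small Monte Carlo model. The input ranges are purely illustrative (borrowed from the GOM rules of thumb earlier in the post), not a real prospect:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Illustrative input ranges; a real model would use calibrated distributions.
area_acres = rng.uniform(200, 800, n)
thickness_ft = rng.uniform(50, 100, n)
recovery_bbl_per_acre_ft = rng.uniform(300, 500, n)

reserves_bbl = area_acres * thickness_ft * recovery_bbl_per_acre_ft

# Exceedance convention used above: P10 = high case, P90 = low case.
p10 = np.percentile(reserves_bbl, 90)  # 10% chance of at least this much
p50 = np.percentile(reserves_bbl, 50)
p90 = np.percentile(reserves_bbl, 10)  # 90% chance of at least this much
print(f"P90 {p90/1e6:.1f}  P50 {p50/1e6:.1f}  P10 {p10/1e6:.1f} MMbbl")
```

A drilling program whose outcomes keep landing near P10 or P90 of such a model is telling you the inputs, not the wells, are wrong.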

If the CMIP5 model ensemble had predictive skill, the observations should track around P50, half the runs should predict more warming and half less than is actually observed. During the predictive run of the model, HadCRUT4.5 has not *tracked* anywhere near P50…


Figure 23. Figure 21 zoomed in on model run period with probability distributions annotated.

I “eyeballed” the instrumental observations to estimate a probability distribution for the predictive run of the model.

Prediction Run   Approximate Distribution

2006             P60 (60% of the models predicted a warmer temperature)
2007             P75
2008             P95
2009             P80
2010             P70
2011-2013        >P95
2014             P90
2015-2016        P55

Note that during the 1998-99 El Niño, the observations spiked above P05 (less than 5% of the models predicted this). During the 2015-16 El Niño, HadCRUT only spiked to P55.  El Niño events are not P50 conditions. Strong El Niño and La Niña events should spike toward the P05 and P95 boundaries.
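Placing an observation within an ensemble like this is a simple percentile-rank calculation. The numbers below are synthetic stand-ins for one year’s 138 model runs, not actual CMIP5 output:

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic stand-in for 138 model-run temperature anomalies (deg C).
ensemble = rng.normal(1.0, 0.15, 138)
observed = 0.78

# Exceedance convention used above: an observation "at P80" means
# 80% of the runs predicted a warmer temperature than was observed.
frac_warmer = (ensemble > observed).mean()
print(f"observation sits at about P{100 * frac_warmer:.0f}")
```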

The temperature observations are clearly tracking much closer to strong mitigation scenarios rather than RCP 8.5, the bogus “business as usual” scenario.

The red hachured trapezoid indicates my expectation that HadCRUT4.5 will continue to track between P50 and P100. This is indicative of a miserable failure of the models and a pretty good clue that the models need to be adjusted downward.

In any other field of science CAGW would be a long-discarded falsified hypothesis.



Claims that AGW or CAGW have earned an exemption from the Null Hypothesis principle are patently ridiculous.

In science, a broad, natural explanation for a wide range of phenomena. Theories are concise, coherent, systematic, predictive, and broadly applicable, often integrating and generalizing many hypotheses. Theories accepted by the scientific community are generally strongly supported by many different lines of evidence-but even theories may be modified or overturned if warranted by new evidence and perspectives.

UC Berkeley

This is not a scientific hypothesis.  It is arm waving:

More CO2 will cause some warming.

This is a scientific hypothesis:

A doubling of atmospheric CO2 will cause the lower troposphere to warm by ___ °C.

Thirty-plus years of failed climate models have never been able to fill in the blank.  The IPCC’s Fifth Assessment Report essentially stated that it was no longer necessary to fill in the blank.

While it is very likely that human activities are the cause of at least some of the warming over the past 150 years, there is no robust statistical correlation.  The failure of the climate models clearly demonstrates that the null hypothesis still holds true for atmospheric CO2 and temperature.

Selected References

Davis, J. C., and G. C. Bohling, The search for patterns in ice-core temperature curves, 2001, in L. C. Gerhard, W. E. Harrison, and B. M. Hanson, eds., Geological perspectives of global climate change, p. 213–229.

Finsinger, W. and F. Wagner-Cremer. Stomatal-based inference models for reconstruction of atmospheric CO2 concentration: a method assessment using a calibration and validation approach. The Holocene 19,5 (2009) pp. 757–764

Grosjean, M., Suter, P. J., Trachsel, M. and Wanner, H. 2007. Ice-borne prehistoric finds in the Swiss Alps reflect Holocene glacier fluctuations. J. Quaternary Sci.,Vol. 22 pp. 203–207. ISSN 0267-8179.

Hansen, J., I. Fung, A. Lacis, D. Rind, S. Lebedeff, R. Ruedy, G. Russell, and P. Stone, 1988: Global climate changes as forecast by Goddard Institute for Space Studies three-dimensional model. J. Geophys. Res., 93, 9341-9364, doi:10.1029/88JD00231.

Kouwenberg, LLR, Wagner F, Kurschner WM, Visscher H (2005) Atmospheric CO2 fluctuations during the last millennium reconstructed by stomatal frequency analysis of Tsuga heterophylla needles. Geology 33:33–36

Ljungqvist, F.C. 2009. N. Hemisphere Extra-Tropics 2,000yr Decadal Temperature Reconstruction. IGBP PAGES/World Data Center for Paleoclimatology Data Contribution Series # 2010-089. NOAA/NCDC Paleoclimatology Program, Boulder CO, USA.

Ljungqvist, F.C. 2010. A new reconstruction of temperature variability in the extra-tropical Northern Hemisphere during the last two millennia. Geografiska Annaler: Physical Geography, Vol. 92 A(3), pp. 339-351, September 2010. DOI: 10.1111/j.1468-459.2010.00399.x

MacFarling Meure, C., D. Etheridge, C. Trudinger, P. Steele, R. Langenfelds, T. van Ommen, A. Smith, and J. Elkins. 2006. The Law Dome CO2, CH4 and N2O Ice Core Records Extended to 2000 years BP. Geophysical Research Letters, Vol. 33, No. 14, L14810 10.1029/2006GL026152.

Moberg, A., D.M. Sonechkin, K. Holmgren, N.M. Datsenko and W. Karlén. 2005. Highly variable Northern Hemisphere temperatures reconstructed from low-and high-resolution proxy data. Nature, Vol. 433, No. 7026, pp. 613-617, 10 February 2005.

Instrumental Temperature Data from Hadley Centre / UEA CRU, NASA Goddard Institute for Space Studies and Berkeley Earth Surface Temperature Project via Wood for Trees.

China Coal

April 4, 2017

Source: 2016 BP Statistical Review of World Energy


March 21, 2017


Goebel et al., 2008

March 14, 2017

The Late Pleistocene Dispersal of Modern Humans in the Americas (Goebel et al., 2008)