
Musings On the Value of Old Textbooks

September 20, 2017

When I first started “debating” climate change on the Internet, I ran across this New York Times article:

From a Rapt Audience, a Call to Cool the Hype


Hollywood has a thing for Al Gore and his three-alarm film on global warming, “An Inconvenient Truth,” which won an Academy Award for best documentary. So do many environmentalists, who praise him as a visionary, and many scientists, who laud him for raising public awareness of climate change.

But part of his scientific audience is uneasy. In talks, articles and blog entries that have appeared since his film and accompanying book came out last year, these scientists argue that some of Mr. Gore’s central points are exaggerated and erroneous. They are alarmed, some say, at what they call his alarmism.

“I don’t want to pick on Al Gore,” Don J. Easterbrook, an emeritus professor of geology at Western Washington University, told hundreds of experts at the annual meeting of the Geological Society of America. “But there are a lot of inaccuracies in the statements we are seeing, and we have to temper that with real data.”


In October, Dr. Easterbrook made similar points at the geological society meeting in Philadelphia. He hotly disputed Mr. Gore’s claim that “our civilization has never experienced any environmental shift remotely similar to this” threatened change.

Nonsense, Dr. Easterbrook told the crowded session. He flashed a slide that showed temperature trends for the past 15,000 years. It highlighted 10 large swings, including the medieval warm period. These shifts, he said, were up to “20 times greater than the warming in the past century.”

Getting personal, he mocked Mr. Gore’s assertion that scientists agreed on global warming except those industry had corrupted. “I’ve never been paid a nickel by an oil company,” Dr. Easterbrook told the group. “And I’m not a Republican.”


NY Times

Don J. Easterbrook… Funny thing… I remember the names of the authors of many of my college (1976-1980) textbooks.

  • The Oceans by Sverdrup, Johnson & Fleming
  • Principles of Geology by Press & Siever
  • Principles of Sedimentology by Friedman & Sanders
  • Structural Geology by Billings
  • Manual of Field Geology by Compton
  • Evolution of the Earth by Dott & Batten
  • Mineralogy by Berry & Mason
  • Petrology of Igneous and Metamorphic Rocks by Hyndman
  • Meteorology by Donn
  • Principles of Geomorphology by Don J. Easterbrook

Who could have guessed that 30+ years later, I would be “fighting” alongside Dr. Easterbrook in the Internet climate change wars?

If that isn’t cool enough, my introduction to climatology occurred in my first semester of college, when I took a course in physical geography.


Physical Geography Today: A Portrait of a Planet by Muller & Oberlander

One day, I was curious as to what my physical geography textbook had to say about the greenhouse effect and global warming… So I dug it out of a box in the garage and opened it to find…


Reid Bryson was a contributor… How cool is that?

The late Reid Bryson was known as the “father of scientific climatology” and a prominent AGW skeptic.  This is what they had to say about the so-called greenhouse effect…


One mention of the greenhouse effect.


“Mostly harmless”… Douglas Adams

This was only 14 years before Al Gore and James Hansen “invented” Anthropogenic Gorebal Warming.

“As a planet, the Earth is not warming or cooling appreciably on average…”

The book was published in 1974, just before Earth was nearly plunged into an ice age.


The Ice Age Cometh? Science News, March 1, 1975


The rate of warming from 1975-2010 is almost identical to the rate of warming from 1910-1945, which sits smack in the middle of the “not warming or cooling appreciably on average” climate.

HadCRUT4 global temperature anomaly (° C). From Wood For Trees.
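The trend comparison above is easy to sketch in code. This is a minimal illustration of the method only: the anomaly series below is synthetic (a placeholder with similar trends in both windows), not HadCRUT4 data, which you would load from Wood For Trees or the Met Office instead.

```python
import numpy as np

# SYNTHETIC anomaly series for illustration -- NOT HadCRUT4 data.
rng = np.random.default_rng(42)
years = np.arange(1900, 2011)
anomaly = 0.013 * (years - 1900) + rng.normal(0, 0.05, years.size)

def trend_per_decade(years, series, start, end):
    """OLS slope over [start, end] inclusive, in °C per decade."""
    mask = (years >= start) & (years <= end)
    return 10.0 * np.polyfit(years[mask], series[mask], 1)[0]

early = trend_per_decade(years, anomaly, 1910, 1945)
late = trend_per_decade(years, anomaly, 1975, 2010)
print(f"1910-1945: {early:+.2f} °C/decade; 1975-2010: {late:+.2f} °C/decade")
```

With the real annual anomalies substituted in, the same two `polyfit` calls give the rates being compared in the text.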

This leads to the following equation:

  • Green = “not warming or cooling appreciably on average”
  • Red = Gorebal Warming crisis.

Green ≈ Red

Therefore, AGW is… “mostly harmless.”


Irma v Maria

September 20, 2017

LinkedIn “Debate”

September 20, 2017

LinkedIn posts…

How one can make the leap from:

David Middleton

The point is that the warming observed in the instrumental temperature record doesn’t significantly deviate from the pre-existing Holocene pattern of climate change…

Over the past 2,000 years, the average temperature of the northern hemisphere has exceeded natural variability (+/-2 std dev) 3 times: The Medieval Warm Period, the Little Ice Age and the modern warming. Humans didn’t cause at least two of the three and the current one only exceeds natural variability by about 0.2 C. And this is a maximum, because the instrumental data have much higher resolution than the proxy data.


Nigel Goodwin

David, what you have decided to ignore is basic physics and chemistry.

That leap defies any logical explanation… And it got worse from there:

David Middleton

No. I have not. All other things held equal, the radiative forcing effect of CO2 works out to about 1 C of warming per doubling of atmospheric CO2. In Earth Science, the “all other things” are never held equal.

The catastrophic global warming predictions invoke strong positive feedback mechanisms (e.g. increased water vapor content) to increase the climate sensitivity from 1 to 3-6 C per doubling. This is one of the reasons the climate models fail to demonstrate predictive skill 95-99% of the time. All of the recent observation-based estimates of climate sensitivity put the transient climate response at about 1.35 C per doubling (ranging from 0.5 to 2.4 C).

Updated climate sensitivity estimates

1.35 C per doubling won’t yield catastrophic warming. It will stay well below the so-called 2.0 C limit. It won’t even be significantly warmer than what would have happened if humans never discovered how to burn things.
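The arithmetic behind the comment above can be written down in a few lines. This is a sketch of the standard simplified relationship (warming proportional to the logarithm of the CO2 concentration ratio); the sensitivity values are the ones quoted in the comment, not derived here.

```python
import math

def warming(c_final_ppm, c_initial_ppm, sensitivity_per_doubling):
    """Warming (°C) for a CO2 change, given a sensitivity in °C per
    doubling, assuming the logarithmic CO2 response."""
    return sensitivity_per_doubling * math.log2(c_final_ppm / c_initial_ppm)

# One full doubling (280 -> 560 ppm) at the values quoted in the comment:
print(warming(560, 280, 1.0))    # no-feedback estimate
print(warming(560, 280, 1.35))   # observation-based transient response
print(warming(560, 280, 3.0))    # strong-feedback model value
```

The same function shows why a partial rise (e.g. 280 to 400 ppm) yields less than the full per-doubling figure.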

Which yielded an appeal to emotion fallacy:

Nigel Goodwin

Are you willing to bet your future, your children’s future, and your grandchildren’s future, on your speculation that it will be within 1.35C? You have absolutely no certainty about that. Do a proper risk analysis.

At this point, I just stopped replying to his comments.

Africa Coal

September 13, 2017


Source          Amount   Share
NG               4,315     59%
Hydro            1,030     14%
Wind               810     11%
Solar              628      9%
Fuel Oil           196      3%
Geothermal         158      2%
Biomass             89      1%
Diesel              35      0%

Category        Amount   Share
Fossil Fuels     4,546     63%
Solar + Wind     1,438     20%
Hydro            1,030     14%
Geothermal         158      2%
Biomass             89      1%



September 8, 2017



August 26, 2017


“No Moore’s Law for batteries”

August 24, 2017

No Moore’s Law for batteries

The public has become accustomed to rapid progress in mobile phone technology, computers, and access to information; tablet computers, smart phones, and other powerful new devices are familiar to most people on the planet.

These developments are due in part to the ongoing exponential increase in computer processing power, doubling approximately every 2 years for the past several decades. This pattern is usually called Moore’s Law and is named for Gordon Moore, a cofounder of Intel. The law is not a law like that for gravity; it is an empirical observation, which has become a self-fulfilling prophecy. Unfortunately, much of the public has come to expect that all technology does, will, or should follow such a law, which is not consistent with our everyday observations: For example, the maximum speed of cars, planes, or ships does not increase exponentially; maximum speed barely increases at all.

Cars require a portable fuel, preferably one that is widely available, low in cost, and with a high energy density. Gasoline is nature’s ideal fuel. A full tank of gasoline contains as much energy as 1,000 sticks of dynamite. However, cost, national security, global climate change, and pollution lead to a national need to wean ourselves from powering cars with gasoline. There are not many alternate candidates. Natural gas is still a fossil fuel, and hydrogen can presently be produced only at a high energy cost and has low energy density. And then there is electricity. We power our mobile phones and our laptops with lithium-ion batteries—why not power our cars this way? We already have an infrastructure for generating and distributing electricity. If only we had batteries that could store enough energy to power a car several hundred kilometers and that were not too heavy and would not cost a fortune.

Sadly, such batteries do not exist. There is no Moore’s Law for batteries. The reason there is a Moore’s Law for computer processors is that electrons are small and they do not take up space on a chip. Chip performance is limited by the lithography technology used to fabricate the chips; as lithography improves ever smaller features can be made on processors. Batteries are not like this. Ions, which transfer charge in batteries, are large, and they take up space, as do anodes, cathodes, and electrolytes. A D-cell battery stores more energy than an AA-cell. Potentials in a battery are dictated by the relevant chemical reactions, thus limiting eventual battery performance. Significant improvement in battery capacity can only be made by changing to a different chemistry.

Scientists and battery experts, who have been optimistic in the recent past about improving lithium-ion batteries and about developing new battery chemistries—lithium/air and lithium/sulfur are the leading candidates—are considerably less optimistic now. Improvement in energy storage density of lithium-ion batteries has been only incremental for the past decade. A large-scale research consortium (the Joint Center for Energy Storage Research) has been created with an ambitious goal of improving energy storage density by a factor of five and reducing cost by a factor of five in 5 years. This can only happen if there is a terrific, wonderful, and amazing breakthrough in battery technology. One can only hope.

In addition to increased performance and lower cost, batteries need to be safe. Of course gasoline is not safe; there are hundreds of thousands of car fires every year in the United States. Nonetheless, the public is more wary of electricity than of gasoline, and the recent safety issues of lithium-ion batteries on Boeing 787 aircraft have done little to reassure the public about the safety of such batteries. Consumers are questioning the practice of putting into cars batteries that can burst into flames.

Meanwhile, while waiting for a wonderful breakthrough in battery technology, we do have a valuable and underutilized resource: energy efficiency, which in many cases is free or even has a negative cost. Cars can be made more energy efficient by reducing size, weight, and power. Vehicle miles driven can be reduced by improving access to public transit. There are policy and financial incentives to drive less, ranging from higher taxes on gasoline to investments in public transportation infrastructure.

Improving the energy efficiency of cars is not a long-term solution to the problems related to combustion of fossil fuels, as cars will still be powered by gasoline. However, improved energy efficiency can happen and is happening. A good example of improved energy efficiency is hybrid cars, which can be considerably more energy efficient than traditional cars. We must take this pragmatic direction while awaiting that terrific breakthrough in battery technology we all so desire.
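The contrast the editorial draws can be put in rough numbers. These growth figures are illustrative assumptions only: a strict 2-year doubling for processors, and an assumed ~5%/yr incremental gain for battery energy density.

```python
# Illustrative comparison of exponential vs. incremental improvement.
def growth_factor(years, doubling_time_years):
    """Total growth multiple for a quantity that doubles periodically."""
    return 2 ** (years / doubling_time_years)

elapsed = 40                                # years
chips = growth_factor(elapsed, 2)           # Moore's Law pace
batteries = 1.05 ** elapsed                 # ASSUMED incremental pace
print(f"Over {elapsed} years: chips x{chips:,.0f}, batteries x{batteries:.0f}")
```

A factor of roughly a million versus a factor of roughly seven is the whole argument in one line.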


A Few Notes on the Fourth National Climate Assessment (NCA4)

August 16, 2017

The current draft of the report can be found here:

Fourth National Climate Assessment (NCA4), Fifth-Order Draft (5OD) 

After a cursory review of the document, a few items are worth noting.

Representative Concentration Pathways (RCPs)

A third difference between the RCPs and previous scenarios is that while none of the SRES scenarios included a scenario with explicit policies and measures to limit climate forcing, all of the three lower RCP scenarios (2.6, 4.5, and 6.0) are climate-policy scenarios. At the higher end of the range, the RCP8.5 scenario corresponds to a future where carbon and methane emissions continue to rise as a result of fossil fuel use, albeit with significant declines in emission growth rates over the second half of the century (Figure 4.1), significant reduction in aerosols, and modest improvements in energy intensity and technology (Riahi et al. 2011). Atmospheric carbon dioxide levels for RCP8.5 are similar to those of the SRES A1fi scenario: they rise from current-day levels of 400 up to 936 parts per million (ppm). CO2-equivalent levels (including emissions of other non-CO2 greenhouse gases, aerosols, and other substances that affect climate) reach more than 1200 ppm by 2100, and global temperature is projected to increase by 5.4°–9.9°F (3°–5.5°C) by 2100 relative to the 1986–2005 average. RCP8.5 reflects the upper range of the open literature on emissions, but is not intended to serve as an upper limit on possible emissions nor as a business-as-usual or reference scenario for the other three scenarios.

Page 190

First the good news:

“RCP8.5… is not intended to serve as… a business-as-usual or reference scenario.”

That said, a text search of the document returned the following:

Representative Concentration Pathways

Scenario            Occurrences
RCP2.6 / RCP 2.6             17
RCP4.5 / RCP 4.5             32
RCP8.5 / RCP 8.5             75

One might think that the business-as-usual or reference scenario would be the most commonly referenced scenario. Instead, RCP8.5 is referenced more than all of the other scenarios combined, while RCP6.0, “a mitigation scenario, meaning it includes explicit steps to combat greenhouse gas emissions (in this case, through a carbon tax),” which most closely matches business-as-usual, is referenced only once.
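The text search described above is straightforward to reproduce. The snippet below is a sketch of the method; `report_text` is a stand-in string, and with the actual NCA4 draft you would read in the document text instead.

```python
import re

# Stand-in text for illustration -- NOT the actual NCA4 draft.
report_text = """RCP8.5 projections ... under RCP 8.5 ... RCP4.5 shows ...
RCP2.6 assumes ... RCP8.5 again ..."""

def count_scenario(text, version):
    """Count both 'RCPx.y' and 'RCP x.y' occurrences of a scenario."""
    return len(re.findall(rf"RCP\s?{re.escape(version)}", text))

for version in ("2.6", "4.5", "6.0", "8.5"):
    print("RCP" + version, count_scenario(report_text, version))
```

The optional-space pattern is what lets a single count cover both spellings (e.g. “RCP8.5” and “RCP 8.5”) tallied in the table above.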

A Sampling of Key Findings

3. Detection and Attribution of Climate Change

Key Findings

1. The likely range of the human contribution to the global mean temperature increase over the period 1951–2010 is 1.1° to 1.4°F (0.6° to 0.8°C), and the central estimate of the observed warming of 1.2°F (0.65°C) lies within this range (high confidence). This translates to a likely human contribution of 93%–123% of the observed 1951–2010 change. It is extremely likely that more than half of the global mean temperature increase since 1951 was caused by human influence on climate (high confidence). The likely contributions of natural forcing and internal variability to global temperature change over that period are minor (high confidence).

Page 160

Over the past 2,000 years, the average temperature of the Northern Hemisphere has exceeded natural variability (defined as two standard deviations from the pre-1865 mean) three times: 1) the peak of the Medieval Warm Period, 2) the nadir of the Little Ice Age, and 3) since 1998. Human activities clearly were not the cause of the first two deviations, and 70% of the warming since the early 1600s clearly falls within the range of natural variability.


Figure 2. Temperature reconstruction (Ljungqvist, 2010), northern hemisphere instrumental temperature (HadCRUT4) and Law Dome CO2 (MacFarling Meure et al., 2006). Temperatures are 30-yr averages to reflect changing climatology. (The Good, the Bad and the Null Hypothesis)

On a climatology basis, the modern warming only exceeds Common Era pre-industrial natural variability by a maximum of 0.216° C.

Nature 3 Man 1

Figure 3. The modern warming only exceeds Common Era pre-industrial natural variability by a maximum of 0.216° C.  So, it is highly unlikely that the “range of the human contribution to the global mean temperature increase over the period 1951–2010 is 1.1° to 1.4°F (0.6° to 0.8°C).”
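The exceedance test described above can be sketched in a few lines. The series below is synthetic (annual values for simplicity), not a real reconstruction; an actual test would use the Ljungqvist (2010) proxy data spliced with HadCRUT4 and averaged to a 30-yr climatology.

```python
import numpy as np

# SYNTHETIC toy reconstruction, °C anomalies -- NOT Ljungqvist (2010) data.
rng = np.random.default_rng(0)
years = np.arange(1, 2001)
temps = rng.normal(0.0, 0.2, years.size)
temps[years >= 1950] += 0.6                # impose a modern warm excursion

# Natural variability band: +/- 2 standard deviations of the pre-1865 data.
pre_1865 = temps[years < 1865]
mu, sigma = pre_1865.mean(), pre_1865.std()
upper, lower = mu + 2 * sigma, mu - 2 * sigma

exceed = (temps > upper) | (temps < lower)
print(f"2-sigma band: [{lower:.2f}, {upper:.2f}] °C")
print("years outside the band since 1950:", int(exceed[years >= 1950].sum()))
```

The margin by which a value clears `upper` is the quantity quoted in the text as 0.216° C.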


6. Temperature Changes in the United States


1. Average annual temperature over the contiguous United States has increased by 1.2°F (0.7°C) for the period 1986–2016 relative to 1901–1960 and by 1.8°F (1.0°C) based on a linear regression for the period 1895–2016 (very high confidence). Surface and satellite data are consistent in their depiction of rapid warming since 1979 (high confidence). Paleo-temperature evidence shows that recent decades are the warmest of the past 1,500 years (medium confidence).

Page 267

“Medium confidence” is equivalent to a Scientific Wild-Ass Guess (SWAG).


Figure 4. NCA4 Confidence levels, page 10.

The lame-stream media took “suggestive evidence, limited consistency, models incomplete, methods emerging, competing schools of thought” and turned it into a statement of fact:

Just as troubling were draft findings destined for the quadrennial National Climate Assessment. Scientists from 13 federal agencies found that a rapid rise in temperatures since the 1980s in the United States represents the warmest period in 1,500 years.

USA Today

A “medium confidence” Mannian Hockey Stick became: “Scientists from 13 federal agencies found that a rapid rise in temperatures since the 1980s in the United States represents the warmest period in 1,500 years.”

They based this assertion on one hockey-stick climate reconstruction, Mann et al., 2008.


Figure 5.  NCA4 Figure 1.8  Mann et al., 2008.  Even with this Hockey Stick, the modern warming only exceeded pre-industrial natural variability by 0.5° F (0.3° C).  At least they had the decency to clearly identify where they spliced in the instrumental data.


Figure 6.  Same image as above.  When the uncertainty range of the proxy data is honored, it cannot be stated that the rate of recent warming is unprecedented.

12. Sea Level Rise


1. Global mean sea level (GMSL) has risen by about 7–8 inches (about 16–21 cm) since 1900, with about 3 of those inches (about 7 cm) occurring since 1993 (very high confidence). Human-caused climate change has made a substantial contribution to GMSL rise since 1900 (high confidence), contributing to a rate of rise that is greater than during any preceding century in at least 2,800 years (medium confidence).

2. Relative to the year 2000, GMSL is very likely to rise by 0.3–0.6 feet (9–18 cm) by 2030, 0.5–1.2 feet (15–38 cm) by 2050, and 1 to 4 feet (30–130 cm) by 2100 (very high confidence in lower bounds; medium confidence in upper bounds for 2030 and 2050; low confidence in upper bounds for 2100). Future emissions pathways have little effect on projected GMSL rise in the first half of the century, but significantly affect projections for the second half of the century (high confidence). Emerging science regarding Antarctic ice sheet stability suggests that, for high emission scenarios, a GMSL rise exceeding 8 feet (2.4 m) by 2100 is physically possible, although the probability of such an extreme outcome cannot currently be assessed. Regardless of emissions pathway, it is extremely likely that GMSL rise will continue beyond 2100 (high confidence).

Page 493

“Global mean sea level (GMSL) has risen by about 7–8 inches (about 16–21 cm) since 1900, with about 3 of those inches (about 7 cm) occurring since 1993 (very high confidence)”… And?

Sea level has been rising at a secular rate of about 1.9 mm/yr since the end of neoglaciation (~1860 AD).


Figure 7. Sea Level Reconstruction (Jevrejeva et al., 2014), More Fun With Sea Level

About 3 inches of sea level rise has occurred since 1993.  About 4 inches of sea level rise occurred from 1930-1950.  There was very little sea level rise from 1951-1992.


Figure 8. Same as above with multi-decadal fluctuations highlighted.
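The amounts quoted above convert to average rates with simple unit arithmetic. The period lengths are taken from the text (1993-2017 and 1930-1950); no new data is involved.

```python
# Unit arithmetic: total rise in inches -> average rate in mm/yr.
MM_PER_INCH = 25.4

def avg_rate_mm_per_yr(inches, n_years):
    """Average rate of rise in mm/yr for a given total rise in inches."""
    return inches * MM_PER_INCH / n_years

print(f"~3 in over 1993-2017: {avg_rate_mm_per_yr(3, 24):.1f} mm/yr")
print(f"~4 in over 1930-1950: {avg_rate_mm_per_yr(4, 20):.1f} mm/yr")
```

Both figures bracket the ~1.9 mm/yr secular rate, which is the point of the multi-decadal fluctuations highlighted in the figure.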

“Human-caused climate change has made a substantial contribution to GMSL rise since 1900 (high confidence)”… Based on what???

“Contributing to a rate of rise that is greater than during any preceding century in at least 2,800 years (medium confidence)”… Another SWAG based on another hockey stick (Kopp et al., 2016).


Figure 9. NCA4 Figure ES.8a, page 27. Sea level reconstruction (Kopp et al., 2016). A 12-in (30 cm) ruler has been overlaid on the image for scale. This hockey stick splices tide gauge data onto low-frequency proxy data. Even with the resolution discrepancy, modern sea level is only 2.5 inches higher than pre-industrial natural variability – during neoglaciation. When uncertainty is honored, the rate of instrumental-era sea level rise is not significantly different from that of pre-industrial times.


“Relative to the year 2000, GMSL is very likely to rise by 0.3–0.6 feet (9–18 cm) by 2030, 0.5–1.2 feet (15–38 cm) by 2050, and 1 to 4 feet (30–130 cm) by 2100 (very high confidence in lower bounds; medium confidence in upper bounds for 2030 and 2050; low confidence in upper bounds for 2100).”

At least they get this one somewhat correct.

“Relative to the year 2000, GMSL is very likely to rise by… 1 to 4 feet (30–130 cm) by 2100 (very high confidence in lower bounds…  low confidence in upper bounds for 2100)”

Sea level is very likely to rise a bit less than 1 foot over the remainder of this century.


Figure 10. Sea level is very likely to rise by an additional 7-11 inches over the remainder of this century. 3 additional feet of sea level rise would require an acceleration to a rate twice that of the Holocene Transgression. Oh say can you see modern sea level rise from a geological perspective?
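A back-of-the-envelope check on the 2100 bounds: the constant rate each would require from 2017 onward. This sketch assumes ~3 inches of the rise (per the text) was already realized by 2017; that framing is mine, not the report's.

```python
# Required constant rate of rise for each 2100 bound, from 2017 onward.
MM_PER_FOOT = 304.8
years_left = 2100 - 2017
realized_mm = 3 * 25.4   # ~3 inches since 1993, per the text (assumption)

required = {}
for feet in (1, 4):
    required[feet] = (feet * MM_PER_FOOT - realized_mm) / years_left
    print(f"{feet} ft by 2100 -> ~{required[feet]:.1f} mm/yr "
          f"for the remaining {years_left} years")
```

The low bound works out to roughly the recent satellite-era rate, while the high bound requires a sustained rate several times larger, which is why the upper curves imply the >20 mm/yr late-century rates discussed below.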

Figure ES.8b is simply bat schist crazy.


Figure 11. NCA4 Figure ES.8b with Figure 10 overlaid. Red is RCP 8.5. The blue curve, an intermediate RCP, would necessitate >20 mm/yr of SLR in the late 21st century.

And, of course, the lame-stream media turns this into…

The report, by more than 450 scientists from 60 nations, also found that greenhouse gases in the atmosphere and global sea levels are at their highest levels on record.

USA Today

It all depends on when you start the record.


Figure 12. Oh say can you see modern sea level rise from a geological perspective?

Life in the Adjustocene

Yesterday, I authored a post about NCA3’s model projections. I noted that the observations had falsified their models…


Figure 13. NCA3 models run hot… as always.

Well, it appears that NCA4 will address this issue by adjusting the observations to match the models…


Figure 14. NCA Figure ES.3, page 70 compared to NCA3.

They adjusted the observations to match the model in the current draft report.


Figure 15. Livin’ la vida Adjustocene!

NCA4 shows the observations tracking the model-mean prior to the 2016  El Niño and then spiking above the mean during it.  Nick Stokes provided the following image in one of his very astute comments:


Figure 16. Observations spiking to the CMIP5 mean during the 2015-16 El Niño (comment by Nick Stokes). Note: I am not implying that Nick Stokes agrees with anything I’ve ever posted, nor am I endorsing anything; I am simply crediting him with providing this image.


NCA4 paints a picture of the climate basically behaving within the bounds of Late Holocene natural variability, accompanied by lots of rhetoric about being certain that humans have caused at least half of whatever has happened since 1950…

“Same as it ever was”…

Null Hypothesis

August 15, 2017

From The Good, the Bad and the Null Hypothesis

Since it is impossible to run a controlled experiment on Earth’s climate (there is no control planet), the only way to “test” the CAGW hypothesis is through models.  If the CAGW hypothesis is valid, the models should demonstrate predictive skill.  The models have utterly failed:


Figure 14. “95% of Climate Models Agree: The Observations Must be Wrong.”


Figure 15. “Climate models versus climate reality.” Michaels & Knappenberger.

The models have failed because they result in a climate sensitivity that is 2-3 times that supported by observations:


Figure 15. Equilibrium climate sensitivity: Reality vs. Models.

From Hansen et al. 1988 through every IPCC assessment report, the observed temperatures have consistently tracked the strong mitigation scenarios in which the rise in atmospheric CO2 has been slowed and/or halted.

Apart from the strong El Niño events of 1998 and 2015-16, GISTEMP has tracked Scenario C, in which CO2 levels stopped rising in 2000, holding at 368 ppm.


Figure 16. Hansen’s 1988 model and GISTEMP.

The utter failure of this model is most apparent on the more climate-relevant 5-yr running mean:


Figure 17. Hansen’s 1988 model and GISTEMP, 5-yr running mean.

This is from IPCC’s First Assessment Report:


Figure 18.  IPCC First Assessment Report (FAR).  Model vs. HadCRUT4.

HadCRUT4 has tracked below Scenario D.


Figure 19. IPCC FAR scenarios.

This is from the IPCC’s Third Assessment Report (TAR):


Figure 20. IPCC TAR model vs. HadCRUT4.

HadCRUT4 has tracked the strong mitigation scenarios, despite a general lack of mitigation.

The climate models have never demonstrated any predictive skill.

And the models aren’t getting better. Even when the model runs start in 2006, the observed temperatures consistently track at or below the low end of the 5%–95% range. Observed temperatures only approach the model mean (P50) in 2006, 2015 and 2016.


Figure 21.  Climate Lab Book. Comparing CMIP5 & observations.

The ensemble consists of 138 model runs using a range of representative concentration pathways (RCP), from a worst case scenario RCP 8.5, often referred to as “business as usual,” to varying grades of mitigation scenarios (RCP 2.6, 4.5 and 6.0).


Figure 22. Figure 21 with individual model runs displayed.


When we drill wells, we run probability distributions to estimate the oil and gas reserves we will add if the well is successful.  The model inputs consist of a range of estimates of reservoir thickness, area and petrophysical characteristics.  The model output consists of a probability distribution from P10 to P90.

  • P10 = Maximum Case.  There is a 10% probability that the well will produce at least this much oil and/or gas.
  • P50 = Median Case.  There is a 50% probability that the well will produce at least this much oil and/or gas.  Probable reserves are >P50.
  • P90 = Minimum Case.  There is a 90% probability that the well will produce at least this much oil and/or gas.  Proved reserves are P90.

Over time, a drilling program should track near P50.  If your drilling results track close to P10 or P90, your model input is seriously flawed.
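The reserves workflow described above can be sketched as a small Monte Carlo simulation. The input distributions below are hypothetical placeholders, not a real prospect; the point is how P10/P50/P90 fall out of the simulated outcomes.

```python
import numpy as np

# HYPOTHETICAL input distributions for illustration -- not real data.
rng = np.random.default_rng(1)
n = 100_000
thickness_ft = rng.lognormal(np.log(50), 0.4, n)        # net pay
area_acres = rng.lognormal(np.log(640), 0.5, n)         # productive area
yield_bbl_per_af = rng.lognormal(np.log(150), 0.3, n)   # recovery yield

reserves_bbl = thickness_ft * area_acres * yield_bbl_per_af

# Industry convention: P10 is the HIGH case ("at least this much" with 10%
# probability), so it corresponds to the 90th percentile of outcomes.
p10 = np.percentile(reserves_bbl, 90)
p50 = np.percentile(reserves_bbl, 50)
p90 = np.percentile(reserves_bbl, 10)
print(f"P10 {p10:,.0f}  P50 {p50:,.0f}  P90 {p90:,.0f} bbl")
```

Note the inversion of conventions: the oil-industry P10 (high case) is the statistician's 90th percentile, which matters when comparing the drilling analogy to the CMIP5 percentiles below.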

If the CMIP5 model ensemble had predictive skill, the observations should track around P50: half of the runs should predict more warming than is actually observed, and half less. During the predictive run of the models, HadCRUT4.5 has not tracked anywhere near P50…


Figure 23. Figure 21 zoomed in on model run period with probability distributions annotated.

I “eyeballed” the instrumental observations to estimate a probability distribution over the predictive run of the models.

Prediction Run Approximate Distribution

2006 P60 (60% of the models predicted a warmer temperature)
2007 P75
2008 P95
2009 P80
2010 P70
2011-2013 >P95
2014 P90
2015-2016 P55

Note that during the 1998-99 El Niño, the observations spiked above P05 (less than 5% of the models predicted this). During the 2015-16 El Niño, HadCRUT only spiked to P55.  El Niño events are not P50 conditions. Strong El Niño and La Niña events should spike toward the P05 and P95 boundaries.
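The percentile ranks eyeballed above can be computed directly, given the ensemble values for a year. The 138 “model” anomalies below are synthetic placeholders, not actual CMIP5 output.

```python
import numpy as np

# SYNTHETIC stand-ins for one year of ensemble anomalies -- NOT CMIP5 data.
rng = np.random.default_rng(7)
ensemble = rng.normal(0.9, 0.15, 138)   # modeled anomalies, °C
observed = 0.7                          # hypothetical observed anomaly, °C

# In the convention used above, "P75" means 75% of runs predicted warmer.
frac_warmer = (ensemble > observed).mean()
print(f"P{frac_warmer * 100:.0f}: {frac_warmer:.0%} of runs ran warmer than observed")
```

Repeating this for each year of the predictive run gives the table above without any eyeballing.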

The temperature observations are clearly tracking much closer to strong mitigation scenarios rather than RCP 8.5, the bogus “business as usual” scenario.

The red hachured trapezoid indicates that HadCRUT4.5 will continue to track between P50 and P100. This is indicative of a miserable failure of the models and a pretty good clue that the models need to be adjusted downward.

In any other field of science, CAGW would be a long-discarded, falsified hypothesis.