Vox’s David Roberts: “Consilience” or just plain silliness?

December 14, 2015

True climate science denier David Roberts (formerly with Grist) has authored another truly vapid article for Vox…

The 2 key points that climate skeptics miss

Updated by David Roberts on December 11, 2015

Arguments between climate skeptics (or whatever the hell we’re calling them now) and their opponents very frequently devolve into hypertechnical squabbles over particular scientific issues like sea surface temperatures or Milankovitch cycles (don’t ask).

Generally speaking, this is a Bad Thing. Technical scientific disputes are of limited interest to the general public — especially technical disputes litigated with great partisan venom. A discussion dominated by such disputes just causes most people to tune out entirely. What’s more, it creates the illusion that the validity of climate science hinges on how these squabbles are resolved. It doesn’t.

[…]

Vox (whatever the hell that is)

 

“Define Irony”

Mr. Roberts actually denied the scientific method prior to making his first point. Science is the process of formulating systematic explanations for observations (hypotheses) and then testing those hypotheses, upholding or falsifying them. Science is not about formulating slogans that non-scientists can “tune in” to.

Read the rest of this entry »

A Novel Way to Test the Impact of Rising Sea Levels

December 8, 2015

Eric Worrall’s recent essay on the Prime Minister of Tuvalu and his reluctance to provide evidence that his island nation is being inundated by rising sea levels inspired me to devise a simple test to see whether an island is sinking, vanishing or being washed away:

Planimeter a recent map of the island and compare it to an older map of the island.
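For anyone without a planimeter, the same comparison can be scripted. Here is a minimal sketch using the shoelace formula, assuming the two coastlines have been digitized into coordinate lists (the elliptical outlines below are toy stand-ins, not actual map data):

```python
import numpy as np

def polygon_area(x, y):
    """Shoelace formula: area enclosed by a simple polygon given its vertices."""
    # np.roll pairs each vertex with its neighbor, wrapping around to close the loop
    return 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))

# Toy stand-ins for coastlines digitized from the 1962 and 2012 quadrangles;
# in real use these would be long coordinate lists traced from the maps.
theta = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
x_1962, y_1962 = np.cos(theta), 0.6 * np.sin(theta)
x_2012, y_2012 = 1.005 * np.cos(theta), 1.005 * 0.6 * np.sin(theta)  # ~1% larger area

a_1962 = polygon_area(x_1962, y_1962)
a_2012 = polygon_area(x_2012, y_2012)
print(f"area change 1962 -> 2012: {100.0 * (a_2012 - a_1962) / a_1962:+.1f}%")
```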

 

Since the USGS has a large historical inventory of topographic maps, this should be relatively easy for any islands in these United States.

For my test case, I chose Key Biscayne, Florida.  It’s just south of the perennially sinking Island of Miami, has a maximum elevation of about 5′, is relatively small and the USGS had several vintages of 7.5 minute quadrangles available.
Then…


Figure 1: Key Biscayne, 1962 (USGS)

And Now…


Figure 2: Key Biscayne, 2012 (USGS)

I planimetered the coastlines on each map and found no significant changes over the 50 years from 1962 to 2012…


Figure 3: 1962 vs. 2012.  Roughly a 1% difference in area.  The apparent slight increase is well within the margin of error of the planimetering tool.

I guess I’m going to have to deny climate change or at least doubt the climate, because I can’t see any effect of sea level rise on this puny, flat, little island. (/sarc).

What did ExxonMobil Know and when did they know it? (Part 3, Exxon: The Fork Not Taken)

October 23, 2015


This just keeps getting more hilarious…

Exxon Confirmed Global Warming Consensus in 1982 with In-House Climate Models
The company chairman would later mock climate models as unreliable while he campaigned to stop global action to reduce fossil fuel emissions.
Lisa Song, Neela Banerjee, David Hasemyer
Sep 22, 2015

Steve Knisely was an intern at Exxon Research and Engineering in the summer of 1979 when a vice president asked him to analyze how global warming might affect fuel use.

“I think this guy was looking for validation that the greenhouse effect should spur some investment in alternative energy that’s not bad for the environment,” Knisely, now 58 and a partner in a management consulting company, recalled in a recent interview.

Knisely projected that unless fossil fuel use was constrained, there would be “noticeable temperature changes” and 400 parts per million of carbon dioxide (CO2) in the air by 2010, up from about 280 ppm before the Industrial Revolution.

[…]

Through much of the 1980s, Exxon researchers worked alongside university and government scientists to generate objective climate models that yielded papers published in peer-reviewed journals. Their work confirmed the emerging scientific consensus on global warming’s risks.

Yet starting in 1989, Exxon leaders went down a different road. They repeatedly argued that the uncertainty inherent in computer models makes them useless for important policy decisions. Even as the models grew more powerful and reliable, Exxon publicly derided the type of work its own scientists had done. The company continued its involvement with climate research, but its reputation for objectivity began to erode as it campaigned internationally to cast doubt on the science.

[…]

Climate ‘Catastrophe’ Foreseen

By 1981, Exxon scientists were no longer questioning whether the buildup of CO2 would cause the world to heat up. Through their own studies and their participation in government-sponsored conferences, company researchers had concluded that rising CO2 levels could create catastrophic impacts within the first half of the 21st century if the burning of oil, gas and coal wasn’t contained.

[…]

Unanimous Agreement

“Over the past several years a clear scientific consensus has emerged regarding the expected climatic effects of increased atmospheric CO2,” Cohen wrote to A.M. Natkin of Exxon Corporation’s Science and Technology Office in 1982. “The consensus is that a doubling of atmospheric CO2 from its pre-industrial revolution value would result in an average global temperature rise of 3.0 ± 1.5°C.” (Equal to 5.4 ± 2.7°F).

“There is unanimous agreement in the scientific community that a temperature increase of this magnitude would bring about significant changes in the earth’s climate, including rainfall distribution and alterations in the biosphere.”

Exxon’s own modeling research confirmed this and the company’s results were later published in at least three peer-reviewed science articles. Two of them were co-authored by Hoffert, and a third was written entirely by Flannery.

Exxon’s modeling experts also explained away the less-dire predictions of a 1979 study led by Reginald Newell, a prominent atmospheric scientist at the Massachusetts Institute of Technology. Newell’s model projected that the effects of climate change would not be as severe as most scientists were predicting.

Specifically, Newell and a co-author from the Air Force named Thomas Dopplick challenged the prevailing view that a doubling of the earth’s CO2 blanket would raise temperatures about 3°C (5°F)– a measure known as climate sensitivity. Instead, they said the earth’s true climate sensitivity was roughly less than 1°C (2°F).

[…]

http://insideclimatenews.org/news/18092015/exxon-confirmed-global-warming-consensus-in-1982-with-in-house-climate-models

I have yet to find any Exxon models, much less any that confirmed a “Global Warming Consensus” or “Climate ‘Catastrophe.’”  What I have found are reports which cite other people’s models and quite a few “cartoons” derived from them.

Same as it ever was.

“When you come to a fork in the road, take it!” (Lawrence “Yogi” Berra)

Half wrong.

Exxon: The Fork Not Taken

It’s notable that Exxon was made aware of the so-called consensus…

“The consensus is that a doubling of atmospheric CO2 from its pre-industrial revolution value would result in an average global temperature rise of 3.0 ± 1.5°C.”

And they were also made aware of reality…

“Newell and a co-author from the Air Force named Thomas Dopplick … said the earth’s true climate sensitivity was roughly less than 1°C.”

Inside Climate likes to make a big deal out of this…

Exxon’s former chairman and CEO, Lee Raymond, took an even tougher line against climate science. Speaking before the World Petroleum Congress in Beijing in 1997, Raymond mocked climate models in an effort to stop the imminent adoption of the Kyoto Protocol, an international accord to reduce emissions.

“They are notoriously inaccurate,” Raymond said. “1990’s models were predicting temperature increases of two to five degrees Celsius by the year 2100,” he said, without explaining the source of those numbers. “Last year’s models say one to three degrees. Where to next year?”

Mr. Raymond was correct. The models have been “notoriously inaccurate.” However, they have been very precise in their inaccuracies…

What did ExxonMobil Know and when did they know it? (Part Deux) “Same as it ever was.”

October 23, 2015

If you thought Part 1 was a doozy, “you ain’t seen nothing yet”…

Exxon Believed Deep Dive Into Climate Research Would Protect Its Business
Outfitting its biggest supertanker to measure the ocean’s absorption of carbon dioxide was a crown jewel in Exxon’s research program.

Neela Banerjee, Lisa Song, David Hasemyer
Sep 21, 2015

In 1981, 12-year-old Laura Shaw won her seventh-grade science fair at the Solomon Schechter Day School in Cranford, N.J. with a project on the greenhouse effect.

For her experiment, Laura used two souvenir miniatures of the Washington Monument, each with a thermometer attached to one side. She placed them in glass bowls and covered one with plastic wrap – her model of how a blanket of carbon dioxide traps the reflected heat of the sun and warms the Earth. When she turned a lamp on them, the thermometer in the plastic-covered bowl showed a higher temperature than the one in the uncovered bowl.

If Laura and her two younger siblings were unusually well-versed in the emerging science of the greenhouse effect, as global warming was known, it was because their father, Henry Shaw, had been busily tracking it for Exxon Corporation.

[…]

Henry Shaw was part of an accomplished group at Exxon tasked with studying the greenhouse effect. In the mid-70s, documents show that Shaw was responsible for seeking out new projects that were “of national significance,” and that could win federal funding. Others included Edward E. David, Jr., a former science advisor to President Richard Nixon, and James F. Black, who worked on hydrogen bomb research at Oak Ridge National Laboratory in the 1950s.

Black, who died in 1988, was among the first Exxon scientists to become acquainted with the greenhouse effect. Esso, as Exxon was known when he started, allowed him to pursue personal scientific interests. Black was fascinated by the idea of intentionally modifying weather to improve agriculture in arid countries, said his daughter, Claudia Black-Kalinsky.

“He believed that big science could save the world,” she said. In the early 1960s, Black helped draft a National Academy of Sciences report on weather and climate modification. Published in 1966, it said the buildup of carbon dioxide in the atmosphere “agrees quite well with the rate of its production by man’s consumption of fossil fuels.”

In the same period, a report for President Lyndon Johnson from the President’s Science Advisory Council in 1965 said the burning of fossil fuels “may be sufficient to produce measurable and perhaps marked changes in climate” by the year 2000.

By 1977, Black had become a top technical expert at Exxon Research & Engineering, a research hub based in Linden, N.J., and a science advisor to Exxon’s top management.  That year he made a presentation to the company’s leading executives warning that carbon dioxide accumulating in the upper atmosphere would warm the planet and if the CO2 concentration continued to rise, it could harm the environment and humankind.

[…]

http://insideclimatenews.org/news/16092015/exxon-believed-deep-dive-into-climate-research-would-protect-its-business

Firstly, the Earth’s atmosphere is not air in a jar.

Secondly, the Black presentation was dated 1978, not 1977.

Thirdly, the Black presentation was just another survey of government and academic publications on the so-called greenhouse effect.

Here’s what Exxon knew in 1978…

Exxon knew that most government and academic scientists wanted more research money.

“Same as it ever was…”

In 1978, Exxon knew that the effects on sea level and the polar ice caps would likely be negligible, models were useless and more effort should be directed at paleoclimatology.

“Same as it ever was…”

In 1978, Exxon knew that the models were useless.

“Same as it ever was…”

Inside Climate then bemoaned the fact that Exxon management scrubbed a science project…

Exxon’s enthusiasm for the project flagged in the early ’80s when federal funds fell through. Exxon Research cancelled the tanker project in 1982, but not before Garvey, Shaw and other company engineers published an initial paper in a highly specialized journal on the project’s methodology.

“We were anxious to get the word out that we were doing this study,” Garvey said of the paper, which did not reach sweeping conclusions. “The paper was the first of what we hoped to be many papers from the work,” he said in a recent email. But the other publications never materialized.

I never worked for “big oil”; however, “little oil” tries to avoid spending money on science projects.

What did ExxonMobil Know and when did they know it? Part 1

October 22, 2015

Maybe ExxonMobil should file a RICO lawsuit against the “Shukla 20” and this gentleman…

Exxon Knew Everything There Was to Know About Climate Change by the Mid-1980s—and Denied It
And thanks to their willingness to sucker the world, the world is now a chaotic mess.

By Bill McKibben

A few weeks before the last great international climate conference—2009, in Copenhagen—the e-mail accounts of a few climate scientists were hacked and reviewed for incriminating evidence suggesting that global warming was a charade. Eight separate investigations later concluded that there was literally nothing to “Climategate,” save a few sentences taken completely out of context—but by that time, endless, breathless media accounts about the “scandal” had damaged the prospects for any progress at the conference.

Now, on the eve of the next global gathering in Paris this December, there’s a new scandal. But this one doesn’t come from an anonymous hacker taking a few sentences out of context. This one comes from months of careful reporting by two separate teams, one at the Pulitzer Prize–winning website Inside Climate News, and the other at the Los Angeles Times (with an assist from the Columbia Journalism School). Following separate lines of evidence and document trails, they’ve reached the same bombshell conclusion: ExxonMobil, the world’s largest and most powerful oil company, knew everything there was to know about climate change by the mid-1980s, and then spent the next few decades systematically funding climate denial and lying about the state of the science.

[…]

http://www.thenation.com/article/exx…and-denied-it/

These folks are so desperate to create a tobacco company analogy that they will resort to bald-faced lies.

Read the rest of this entry »

Shell to quit US Arctic due to “unpredictable federal regulatory environment”

September 29, 2015

Guest post by David Middleton

Disappointing results from an initial rank wildcat can’t kill a play.

The cyclical ups and downs of product prices can’t kill a play.

High operating costs can’t kill a play.

Only massively incompetent government can kill a play.

“This is a clearly disappointing exploration outcome,” Marvin Odum, director of Shell’s Upstream Americas unit, said in a statement. While indications of oil and gas were present in the Burger J well in Alaska’s Chukchi Sea, they weren’t sufficient to warrant further exploration, the company said. Shell will now plug and abandon the well.

Shell had planned a two-year drilling program starting this July. The company was seeking to resume work halted in 2012 when its main drilling rig ran aground and was lost. It was also fined for air pollution breaches. The Anglo-Dutch company first discovered oil and gas in the region in the late 1980s.

The company continues to see potential in the region and the decision not to explore further in Alaskan waters “reflects both the Burger J well result, the high costs associated with the project, and the challenging and unpredictable federal regulatory environment in offshore Alaska,” according to the statement.

http://www.bloomberg.com/news/articl…ulations-costs

The potential of the Alaska OCS is nearly as large as the Central Gulf of Mexico…

Product prices and exploration results are, by their nature, unpredictable. Operating costs are tied to product prices and regulatory requirements. Regulatory requirements must be predictable in order for any business to function.

Answer to Jon Chambers

September 8, 2015

Jonathan Chambers

Hey Mid. Received the below from a friend of a friend. Went on line to refute this but all the articles say a 1.4 degree change in temperature is a bad thing. Starts a Little Ice Age.

Anything you can offer to concretely snuff this guy?

Climate change is cyclical. I agree 100%. Ice ages come and go. If you follow this thought process though, the data since the industrial revolution has us coming out of an ice age faster and more drastically than ever. Climate deniers are my favorite kinds of Republicans.

He already snuffed himself better than I could.  His ignorance appears to be manifest.

We are still in an “ice age.”  We are just fortunate enough to be living in an interglacial stage.  Geologically speaking, the Earth has been in an ice age since the beginning of the Oligocene (~30 MYA)…


(Older is to the right.)

This ice age period (the Quaternary) has been particularly cold. Over the most recent ~2 million years, glacial maxima have recurred with a period of ~100,000 years, punctuated by brief (15-40 KY) interglacial stages. The last glacial maximum occurred about 20,000 years ago.

We are currently in an interglacial stage known as the Holocene. It began about 10,000 years ago and peaked during the Holocene Climatic Optimum (4,000 to 7,000 years ago). Based on analyses of the Milankovitch cycles, some people think the Holocene will be a long interglacial stage, in which case we have about 30,000 years of balmy weather remaining. If this is a typical interglacial, we only have about 5,000 years left. Even then, it will take thousands of years for the ice to build up and advance across Canada. Places like Chicago might still have 50,000 years to plan how to deal with a mile-high sheet of ice bulldozing its way south across the Midwest.

Like all other significant biomass features on Earth, humans do affect natural processes. We amplify some processes and attenuate others. Some studies have suggested that AGW has already staved off a decline into the next glacial stage (Ruddiman). Others have suggested that AGW will delay or even prevent the next glacial stage. I think it might delay it by a few centuries, maybe even more than 1,000 years; but it can’t prevent it. The greenhouse effect can’t trap heat that never arrived from the Sun. Atmospheric CO2 remained elevated long after the Sangamonian (Eemian, MIS 5) interglacial stage began cooling toward the last glacial maximum…

When insolation declines, less solar energy reaches the Earth and there is less heat for CO2 and other GHGs to trap. Holocene insolation peaked during the Holocene Climatic Optimum…


(Older is to the left.)

Insolation has been declining since the Holocene Climatic Optimum. While we still can benefit from episodic periods of millennial scale warming, the long-term trend is toward the inevitable next deep freeze.

We came out of the most recent glacial stage at the end of the Pleistocene (~10,000 yrs ago).

The Holocene has been a heck of a lot more stable than the preceding Pleistocene glacial episode (the x-axes of the next three graphs are denominated in calendar years – today is to the right)…

But it has been far from stable…

The end-Pleistocene deglaciation was far more drastic than any climatic change in the Holocene, including the industrial revolution (AKA Anthropocene).  The Earth’s climate has generally been warming since 1600 AD, the coldest part of a period known as the Little Ice Age (LIA).  The LIA is part of a naturally occurring millennial scale climate oscillation which has dominated Holocene climate change and was quite possibly the coldest phase of the Holocene since the 8.2 KYA Cooling Event. The LIA was characterized by maximum glacial advances and the most extensive sea ice coverage since the onset of the Neoglaciation (end of the Holocene Climatic Optimum).


The Little Ice Age may have been the coldest climatic period of the past 8,200 years.

While volcanic forcing may have played a role in the coldness of the LIA, it was clearly a cyclical cooling event. Much, if not all, of the warming since the late 16th century is part of that millennial climate cycle.

All of the “global warming” from ~1600 AD through 2000 AD barely brought the climate back to “normal.”

There is nothing anomalous about recent warming unless you fraudulently splice high frequency instrumental data onto low frequency proxy reconstructions (AKA Hockey Sticks).

The following is a composite of several different posts I have written on the subject.  Hopefully it’s not too disjointed.

The rate of warming since ~1978 was no different than the rate of warming from ~1910 to 1945…

The recent warming was totally nonanomalous relative to changes of the prior 2,000 years…

Which were unrelated to atmospheric CO2 concentrations…

The Late Holocene climate has been characterized by a millennial scale cycle with a period of ~1,000 years and an amplitude of ~0.5 °C.

Figures 7 & 8. Both Moberg and Ljungqvist clearly demonstrate the millennial scale climate cycle.

These cycles even have names…


Figure 9. Ljungqvist with climatic period nomenclature.

These cycles have been long recognized by Quaternary geologists…


Figure 10. The millennial scale climate cycle can clearly be traced back to the end of the Holocene Climatic Optimum and the onset of the Neoglaciation.

Fourier analysis of the GISP2 ice core clearly demonstrates that the millennial scale climate cycle is the dominant signal in the Holocene (Davis & Bohling, 2001). It is pervasive throughout the Holocene (Bond et al., 1997).


Figure 11. The Holocene climate has been dominated by a millennial scale climate cycle.
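For readers curious about how such a periodicity is picked out of a time series, here is a minimal periodogram sketch. The series below is synthetic (a ~1,000-year cycle of ~0.5 °C peak to trough plus noise, standing in for the actual GISP2 data):

```python
import numpy as np

# Synthetic stand-in for a Holocene temperature series: 10,000 years sampled
# every 10 years, with a ~1,000-yr cycle (~0.5 deg C peak to trough) plus noise.
dt = 10.0                                    # years per sample
t = np.arange(0.0, 10000.0, dt)
rng = np.random.default_rng(0)
temps = 0.25 * np.sin(2.0 * np.pi * t / 1000.0) + 0.1 * rng.standard_normal(t.size)

# Periodogram: power of each Fourier component of the mean-removed series
freqs = np.fft.rfftfreq(t.size, d=dt)        # cycles per year
power = np.abs(np.fft.rfft(temps - temps.mean())) ** 2

dominant = freqs[np.argmax(power[1:]) + 1]   # skip the zero-frequency bin
print(f"dominant period: ~{1.0 / dominant:,.0f} years")
```

Run on a series like this, the spectrum shows a single dominant peak near the 1,000-year period, which is the kind of result Davis & Bohling report for GISP2.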

The industrial era climate has not changed in any manner inconsistent with the well-established natural millennial scale cycle. Assuming that the ice core CO2 record is reliable, the modern rise in CO2 has had little, if any, effect on climate…


Figure 12. Why would CO2 suddenly start driving climate change in the 19th century?

While the climate may have warmed by 0.2 to 0.4 °C more than might be expected in a 100% natural warming phase of the millennial cycle, all of the apparent excess warming may very well be due to resolution differences between the instrumental and proxy data…


Figure 13. Ljungqvist demonstrates that the modern warming has not unambiguously exceeded the range of natural variability. The bold black dashed line is the instrumental record. I added the red lines to highlight the margin of error.

According to Ljungqvist…

The amplitude of the reconstructed temperature variability on centennial time-scales exceeds 0.6°C. This reconstruction is the first to show a distinct Roman Warm Period c. AD 1-300, reaching up to the 1961-1990 mean temperature level, followed by the Dark Age Cold Period c. AD 300-800. The Medieval Warm Period is seen c. AD 800–1300 and the Little Ice Age is clearly visible c. AD 1300-1900, followed by a rapid temperature increase in the twentieth century. The highest average temperatures in the reconstruction are encountered in the mid to late tenth century and the lowest in the late seventeenth century. Decadal mean temperatures seem to have reached or exceeded the 1961-1990 mean temperature level during substantial parts of the Roman Warm Period and the Medieval Warm Period. The temperature of the last two decades, however, is possibly higher than during any previous time in the past two millennia, although this is only seen in the instrumental temperature data and not in the multi-proxy reconstruction itself.

[…]

The proxy reconstruction itself does not show such an unprecedented warming but we must consider that only a few records used in the reconstruction extend into the 1990s. Nevertheless, a very cautious interpretation of the level of warmth since AD 1990 compared to that of the peak warming during the Roman Warm Period and the Medieval Warm Period is strongly suggested.

[…]

The amplitude of the temperature variability on multi-decadal to centennial time-scales reconstructed here should presumably be considered to be the minimum of the true variability on those time-scales.

Ljungqvist is recommending caution in comparing the modern instrumental record to the older proxy reconstructions because the proxy data are of much lower resolution. The proxy data are showing the “minimum of the true variability on those time-scales.” The instrumental data are depicting something closer to actual variability. Even then, the instrumental record doesn’t exceed the margin of error for the proxy data during the peak of the Medieval Warm Period. With a great deal of confidence, perhaps even 67%, it can be concluded that at least half, perhaps all, of the modern warming is the result of quasi-periodic natural climate fluctuations (AKA cycles).

Ljungqvist said that “the temperature since AD 1990 is, however, possibly higher than during any previous time in the past two millennia if we look at the instrumental temperature data spliced to the proxy reconstruction.” The dashed red curve is the instrumental data set…

The instrumental data always show faster (higher amplitude) temperature variations than the proxy reconstructions…
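The resolution effect is easy to demonstrate: smooth a high-frequency series down to proxy-like resolution and the rapid excursions are attenuated. A toy sketch (synthetic data; the 100-year moving average standing in for proxy smoothing is my assumption, not Ljungqvist’s method):

```python
import numpy as np

rng = np.random.default_rng(2)
years = np.arange(0, 2000)

# Synthetic annual series: a millennial cycle plus decadal-scale noise
temps = (0.25 * np.sin(2.0 * np.pi * years / 1000.0)
         + 0.15 * rng.standard_normal(years.size))

# "Proxy" version: the same series smoothed with a 100-year moving average
kernel = np.ones(100) / 100.0
proxy = np.convolve(temps, kernel, mode="valid")

def max_30yr_change(series):
    """Largest warming over any 30-sample window."""
    return np.max(series[30:] - series[:-30])

print(f"instrumental-like max 30-yr change: {max_30yr_change(temps):.2f} deg C")
print(f"proxy-like max 30-yr change:        {max_30yr_change(proxy):.2f} deg C")
```

The smoothed series shows only a fraction of the short-term variability of the annual series, even though both contain exactly the same underlying climate signal.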

Many multi-decadal periods have warmed at 7-13 times the background rate.  Many 100-yr periods of warming and cooling exceeded the 10,000 year average.

The pronounced spike at 62-65 years on this power spectrum means almost any 100-yr period has a far greater rate and magnitude of warming or cooling than the 10,000-yr average…

“Science Isn’t Broken” (Except when it is)

August 28, 2015

From FiveThirtyEight Science…

 

If you follow the headlines, your confidence in science may have taken a hit lately.

Peer review? More like self-review. An investigation in November uncovered a scam in which researchers were rubber-stamping their own work, circumventing peer review at five high-profile publishers.

Scientific journals? Not exactly a badge of legitimacy, given that the International Journal of Advanced Computer Technology recently accepted for publication a paper titled “Get Me Off Your (Fracking) Mailing List,” whose text was nothing more than those seven words, repeated over and over for 10 pages. Two other journals allowed an engineer posing as Maggie Simpson and Edna Krabappel to publish a paper, “Fuzzy, Homogeneous Configurations.”

Revolutionary findings? Possibly fabricated. In May, a couple of University of California, Berkeley, grad students discovered irregularities in Michael LaCour’s influential paper suggesting that an in-person conversation with a gay person could change how people felt about same-sex marriage. The journal Science retracted the paper shortly after, when LaCour’s co-author could find no record of the data.

Taken together, headlines like these might suggest that science is a shady enterprise that spits out a bunch of dressed-up nonsense…

[…]

http://fivethirtyeight.com/features/science-isnt-broken/

While there are a lot of problems with the peer-review process and a population explosion of journals which will readily publish abject bullschist, I think the bigger, more serious scientific breakdown, as it pertains to the public, is in journalism and press releases. Most people never read the abstracts of the papers, much less the actual papers.

 

Here’s an example…

Green Business | Wed Aug 26, 2015 4:10pm EDT Related: ENVIRONMENT
Global sea levels climbed 3 inches since 1992, NASA research shows
CAPE CANAVERAL, FL. | BY IRENE KLOTZ

Sea levels worldwide rose an average of nearly 3 inches (8 cm) since 1992, the result of warming waters and melting ice, a panel of NASA scientists said on Wednesday.

In 2013, a United Nations panel predicted sea levels would rise from 1 to 3 feet (0.3 to 0.9 meters) by the end of the century. The new research shows that sea level rise most likely will be at the high end of that range, said University of Colorado geophysicist Steve Nerem.

Sea levels are rising faster than they did 50 years ago and “it’s very likely to get worse in the future,” Nerem said.

[…]

http://www.reuters.com/article/2015/08/26/us-environment-sealevel-nasa-idUSKCN0QV2B020150826

 

First, the unbroken science…

Nerem, R. S., D. Chambers, C. Choe, and G. T. Mitchum. “Estimating Mean Sea Level Change from the TOPEX and Jason Altimeter Missions.” Marine Geodesy 33, no. 1 supp 1 (2010): 435.

Dr. Nerem’s science does support 3 inches of sea level rise since 1992.

Now for the broken science…

In 2013, a United Nations panel predicted sea levels would rise from 1 to 3 feet (0.3 to 0.9 meters) by the end of the century. The new research shows that sea level rise most likely will be at the high end of that range, said University of Colorado geophysicist Steve Nerem.

Sea levels are rising faster than they did 50 years ago and “it’s very likely to get worse in the future,” Nerem said.

Sea level has been rising at a rate of about 3 mm per year since the Jason/Topex missions started flying.

The IPCC says that sea level will rise by 300 to 900 mm by the end of this century. Dr. Nerem says that his work indicates that the sea level rise will be at the high end of that range. Since we are 15 years into this century with about 45 mm of sea level rise “in the bank,” sea level would have to rise by 855 mm over the next 85 years to hit the high end. That is 10 mm per year. The last time sea level sustained ~10 mm/yr of rise was during the end-Pleistocene deglaciation, when it did so for about 10,000 years…

The animation above is of the end-Pleistocene deglaciation (AKA Holocene Transgression)…

All of the sea level rise since 1700 AD is circled at the right hand side of the graph.
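The 10 mm/yr figure above is easy to verify. A back-of-the-envelope sketch of the arithmetic (the 45 mm “in the bank” assumes ~3 mm/yr sustained over 15 years):

```python
# Back-of-the-envelope check of the required average rate of rise
high_end_mm = 900.0        # IPCC high-end 21st century rise
rate_now = 3.0             # satellite-era rate, mm/yr
banked = rate_now * 15.0   # rise already realized, 2000-2015 (~45 mm)
years_left = 85.0          # 2015 through 2100

required = (high_end_mm - banked) / years_left
print(f"required average rate, 2015-2100: {required:.1f} mm/yr")   # ~10 mm/yr
```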

The only way sea level rise could approach the high end of the IPCC range is if it exponentially accelerates…

The rate from 2081-2100 would have to average 20 mm per year, twice that of the Holocene Transgression. This is only possible in bad science fiction movies.
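For what it’s worth, that figure checks out: if the rate of rise grew exponentially from today’s ~3 mm/yr and the cumulative rise still had to reach the 900 mm high end by 2100, the implied 2081-2100 average works out to roughly 20 mm/yr. A sketch of the calculation (exponential growth is my assumption; scipy’s brentq root-finder does the solving):

```python
import numpy as np
from scipy.optimize import brentq

RATE_2015 = 3.0    # current rate of rise, mm/yr
TARGET = 855.0     # rise still needed to hit 900 mm by 2100, mm
YEARS = 85.0       # 2015 through 2100

def total_rise(k):
    # Cumulative rise from integrating rate(t) = RATE_2015 * exp(k * t), t = 0..85
    return RATE_2015 * (np.exp(k * YEARS) - 1.0) / k

# Solve for the exponential growth constant that delivers 855 mm in 85 years
k = brentq(lambda g: total_rise(g) - TARGET, 1e-6, 0.1)

# Average rate over 2081-2100 (t = 66 to 85)
rise = RATE_2015 * (np.exp(k * 85.0) - np.exp(k * 66.0)) / k
print(f"growth constant: {k:.4f}/yr; 2081-2100 average: {rise / 19.0:.1f} mm/yr")
```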

Broken science, part deux…

Sea levels are rising faster than they did 50 years ago…

They are rising faster than they were 50 years ago. However, they are rising at the same rate that they were 80, 70 and 60 years ago…

 

 

There is nothing abnormal about sea level rising by 3 inches over a 23-yr period.  Nor is a 3 mm/yr sea level rise over a multi-decade period unusual.  There is simply no anomaly requiring an explanation.  The claim that the 3 inches of sea level rise from 1992-2015 is in line with 3 feet of sea level rise in the 21st century is patently false and demonstrably so.  The accurate statement that sea level is rising faster now than it was 50 years ago is cherry-picking of the highest order.  Warning that “it’s very likely to get worse in the future” is the scientific equivalent of shouting “Fire!” in a crowded movie theater because you constructed a model which predicts that the projection system will burst into flames if it malfunctions at some point in the future.

Miocene (~20 MYA) lizards, preserved in amber, “identical to their modern cousins.”

July 28, 2015

Truly amazing…

Ancient lizards in amber amaze scientists
Tuesday, 28 July 2015 Stuart Gary
ABC

A community of lizards from the Caribbean, preserved for 20 million years in amber, have been found to be identical to their modern cousins, say researchers.

This suggests the different niches inhabited by the lizards have – incredibly – changed little over the past 20 million years, report the team, in this week’s Proceedings of the National Academy of Sciences.

[…]

http://www.abc.net.au/science/articl…28/4279562.htm

I guess the Gorebots are wrong about climate change endangering lizards.

(Older is to the right.)

Gavin says the funniest things…

June 5, 2015

NOAA temperature record updates and the ‘hiatus’

— gavin @ 4 June 2015

In a new paper in Science Express, Karl et al. describe the impacts of two significant updates to the NOAA NCEI (née NCDC) global temperature series. The two updates are: 1) the adoption of ERSST v4 for the ocean temperatures (incorporating a number of corrections for biases for different methods), and 2) the use of the larger International Surface Temperature Initiative (ISTI) weather station database, instead of GHCN. This kind of update happens all the time as datasets expand through data-recovery efforts and increasing digitization, and as biases in the raw measurements are better understood. However, this update is going to be bigger news than normal because of the claim that the ‘hiatus’ is no more. To understand why this is perhaps less dramatic than it might seem, it’s worth stepping back to see a little context…

Global temperature anomaly estimates are a product, not a measurement

The first thing to remember is that an estimate of how much warmer one year is than another in the global mean is just that, an estimate. We do not have direct measurements of the global mean anomaly, rather we have a large database of raw measurements at individual locations over a long period of time, but with an uneven spatial distribution, many missing data points, and a large number of non-climatic biases varying in time and space. To convert that into a useful time-varying global mean needs a statistical model, good understanding of the data problems and enough redundancy to characterise the uncertainties. Fortunately, there have been multiple approaches to this in recent years (GISTEMP, HadCRUT4, Cowtan & Way, Berkeley Earth, and NOAA NCEI), all of which basically give the same picture.

[…]

The ‘hiatus’ is so fragile that even those small changes make it disappear

[…]

http://www.realclimate.org/index.php/archives/2015/06/noaa-temperature-record-updates-and-the-hiatus/

If “the ‘hiatus’ is so fragile that even those small changes make it disappear,” then the underlying trend must be equally fragile: small data fudges are all that separate a hiatus from the road to AGW calamity.
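That fragility is easy to illustrate: over a window this short, a bias adjustment of a few hundredths of a degree shifts the computed trend by an amount comparable to the trend itself. A toy sketch with synthetic anomalies and a hypothetical adjustment (not the actual NOAA data or corrections):

```python
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1998, 2015)               # a 17-year "hiatus" window

# Synthetic anomalies: essentially flat, with ~0.02 deg C of noise
anoms = 0.40 + 0.02 * rng.standard_normal(years.size)

# Hypothetical bias correction ramping to +0.06 deg C by the end of the window
adjusted = anoms + np.linspace(0.0, 0.06, years.size)

for label, series in (("raw", anoms), ("adjusted", adjusted)):
    slope = np.polyfit(years, series, 1)[0] * 10.0   # deg C per decade
    print(f"{label:9s} trend: {slope:+.3f} deg C/decade")
```

A ramped correction of that size adds roughly +0.04 °C/decade to the computed trend, enough to turn a flat short-window series into a warming one.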

In the meantime…