Blog Archive

Monday, November 23, 2015

WaPo: Why are so many Americans confused about climate change? Corporate money funding disinformation, duh!

Emissions spew from the smokestacks at Westar Energy’s Jeffrey Energy Center coal-fired power plant near St. Mary’s, Kansas (AP)

by Joby Warrick, The Washington Post, November 23, 2015

Climate change has long been a highly polarizing topic in the United States, with Americans lining up on opposite sides depending on their politics and worldview. Now a scientific study sheds new light on the role played by corporate money in creating that divide.
The report, a systematic review of 20 years’ worth of data, highlights the connection between corporate funding and messages that raise doubts about the science of climate change and whether humans are responsible for the warming of the planet. The analysis suggests that corporations have used their wealth to amplify contrarian views and create an impression of greater scientific uncertainty than actually exists.
“The contrarian efforts have been so effective for the fact that they have made it difficult for ordinary Americans to even know who to trust,” said Justin Farrell, a Yale University sociologist and author of the study, released on Monday in the peer-reviewed journal Proceedings of the National Academy of Sciences.
Numerous previous studies have examined how corporate-funded campaigns have helped shape individual views about global warming. But the Yale study takes what Farrell calls the “bird’s-eye view,” using computer analytics to systematically examine vast amounts of printed matter published by 164 groups—including think-tanks and lobbying firms—and more than 4,500 individuals who have been skeptical of mainstream scientific views on climate change.
The study analyzed the articles, policy papers and transcripts produced by these groups over a 20-year period. Then it separated the groups that received corporate funding from those that did not.
The results, Farrell said, revealed an “ecosystem of influence” within the corporate-backed groups. Those that received donations consistently promoted the same contrarian themes—casting doubt, for example, on whether higher levels of man-made carbon dioxide in the atmosphere were harmful to the planet. There was no evidence of such coordination among the non-funded groups.
The existence of corporate money “created a united network within which the contrarian messages could be strategically created” and spread, Farrell said.
“This counter-movement produced messages aimed, at the very least, at creating ideological polarization through politicized tactics, and at the very most, at overtly refuting current scientific consensus with scientific findings of their own,” he said.
The report did not examine the impact of outside money on the messages of groups that encourage activism on climate change. Farrell suggested that there were qualitative differences between such groups and those that sought to advance corporate interests by promoting skepticism about science.
“Funders looking to influence organizations who promote a consensus view are very different from funders looking to influence organizations who have the goal of creating polarization and controversy and delaying policy progress on a scientific issue that has nearly uniform consensus,” he said.
The publication of the report comes two weeks after New York prosecutors announced an investigation into whether Exxon Mobil misled the public and investors about the risks of climate change. The probe was prompted in part by reports first in the online publication InsideClimate News and then in the Los Angeles Times, alleging that Exxon researchers expressed concern about climate change from fossil fuel emissions decades ago, even as the company publicly raised doubts about whether climate change was scientifically valid.
Exxon has declined to comment on the investigation while acknowledging that its position on climate change has evolved in recent years. “Our company, beginning in the latter part of the 1970s and continuing to the present day, has been involved in serious scientific research, and we have been supporting since that time scientific understanding of the risk of climate change,” Exxon’s vice president of public and government affairs Ken Cohen told reporters after the New York probe was revealed.

[Latest news:  Exxon spent $10 million a year on promoting confusion.]

Siberia's thawing permafrost fuels climate change

Frost mounds in Reindalen, Svalbard, form as permafrost confines groundwater, which is pushed up to the surface under hydrostatic pressure and re-freezes [Alfred Wegener Institute/Jaroslav Obu].
by Lowana Veal, Al Jazeera, November 23, 2015

Reykjavik, Iceland - Over the past year, a number of giant, mysterious holes have emerged in Siberia, some as deep as 200 metres.
Scientists say the craters may be emerging because the frozen ground, or "permafrost," that covers much of Siberia has been thawing due to climate change, allowing methane gases trapped underground to build up and explode.
Permafrost is ground that is permanently frozen, where the ground temperature has remained below 0 °C (32 °F) for at least two years. It covers about a quarter of the northern hemisphere's land surface.
When permafrost thaws, microbes digest the plant and animal remains that were locked in the permafrost and release greenhouse gases, carbon dioxide and methane into the atmosphere.
The phenomenon is a self-feeding cycle, explained Sarah Chadburn, from the University of Exeter.
"Permafrost soils contain vast amounts of carbon, nearly twice as much as is currently in the atmosphere. As the permafrost thaws in a warming climate, the soil decomposes and releases carbon to the atmosphere as carbon dioxide and methane. These are greenhouse gases, and they warm the Earth even more. This leads to more permafrost thawing, more carbon release, and so the cycle continues," Chadburn said.
At the recent Arctic Circle Assembly in Reykjavik, Iceland, Max Holmes from the US-based Woods Hole Research Center (WHRC) said in a presentation that the Siberian sinkholes "are an additional indication that vast changes are under way in the Arctic."
"I don't worry about them too much in and of themselves," the researcher said. "But they do reinforce the notion that big changes are already happening, and that we are likely to have more unpleasant surprises in the future."
Recent research has found that a third greenhouse gas, nitrous oxide (N2O), is also emitted in some areas covered by thawing permafrost.
"We now know that a lot of nitrogen is released during permafrost thaw and that the microbes responsible for N2O production are present in virtually all Arctic and boreal systems," said Ben Abbott, a France-based scientist who studies permafrost in Alaska.
He added that it was unclear whether nitrous oxide emissions from thawing permafrost are significant compared with those of carbon dioxide and methane.
Despite scientists' concern that thawing permafrost could exacerbate global warming, Chadburn noted that "most climate models do not include the warming aspect of permafrost emissions," including the models used by the Intergovernmental Panel on Climate Change (IPCC).
Although the IPCC has acknowledged that permafrost contributes to global warming, a lack of data on the phenomenon has meant that they have not been able to include it in their reports.
Chadburn estimated that thawing permafrost would raise global temperatures by an average of 0.3 °C but could be as much as 0.7 °C.
Given predictions that permafrost thaw could cause warming, Hugues Lantuit from the Alfred Wegener Institute in Germany said that "the objective for the COP21 climate summit should really be a temperature increase of no more than 1.7 °C  to take account of emissions from permafrost," referring to the annual global conference on climate change to be held next month in Paris.
Walter Oechel from San Diego State University and the Open University and Donatella Zona from the University of Sheffield have been measuring methane fluxes in the Arctic for more than a decade. "We expect methane emissions from the Arctic to increase dramatically with warming of the Arctic," they said.
"And, the potential is there for this release to become catastrophic."
Meanwhile, the frequency of fires has been intensifying in Arctic areas, noted Scott Goetz from WHRC. More than two million hectares of land have burned in Alaska this year, he said in his presentation at the Arctic Circle Assembly.
"Climate warming and drying are intensifying the fire regime. These fires burn roots and the trees then fall over… Fire disturbance deepens thaw depth and mobilises permafrost carbon," Goetz said.
In addition to contributing to global warming, thawing permafrost also affects wildlife and indigenous populations in the Arctic.
Courtney Price, of the Arctic Council's Conservation of Arctic Flora and Fauna organisation, said continued thawing of permafrost is one factor endangering thermokarst lakes, which form when permafrost thaws and surface water accumulates in the resulting depressions.
But if permafrost continues to thaw, there is no structure to hold the water, and the lakes can drain completely, Price said.
"Thermokarst lakes act as 'hot spots' of biological activity in northern regions… Such biologically productive systems are important to Arctic peoples for supporting traditional lifestyles, and for providing water to rural/urban communities and development, especially where groundwater resources are unavailable," she explained.
The phenomenon also affects public safety: around 70% of the world's permafrost is found in Russia, and entire Siberian cities, the largest of which is Yakutsk, are built on permafrost. When permafrost thaws, buildings can tilt and become uninhabitable.
The solution? WHRC scientist Sue Natali said that "to save permafrost, we have to reduce fossil fuel use and manage forests globally to enhance carbon dioxide uptake by the biosphere."

Lamar Smith completely rebutted on his accusation that NOAA rushed to publication

An image obtained on November 16, 2015, from the National Oceanic and Atmospheric Administration (NOAA) shows the satellite sea surface temperature departure for the month of October 2015, where orange-red colors are above normal temperatures and are indicative of El Nino. (NOAA via Agence France-Presse)

by Lisa Rein, The Washington Post, November 23, 2015

The escalating struggle between an influential House Republican and government scientists over their pivotal study of global warming now turns on accusations that they rushed to publish their findings to advance President Obama’s agenda on climate change.
But a spokeswoman for Science, the prestigious peer-reviewed journal that in June published the paper by climate scientists at the National Oceanic and Atmospheric Administration, said in an interview that their research was subject to a longer, more intensive review than is customary.
“This paper went through as rigorous a review as it could have received,” said Ginger Pinholster, chief of communications for the American Association for the Advancement of Science, which publishes Science. “Any suggestion that the review was ‘rushed’ is baseless and without merit.”
She said the paper, submitted to the journal in December, went through two rounds of peer review by other scientists in the field before it was accepted in May. The number of outside reviewers was larger than usual, and the time from submission to online publication was about 50% longer than the journal’s average of 109 days, Pinholster said.
During the review, the research was sent back to NOAA for revision and clarification, she said. And because it was based on such an “intensive” examination of global temperature data, the review was handled by one of the journal’s senior editors, she said, “so it could be more carefully assessed.”
The study is widely considered to be a bombshell in the climate change world because it contradicted the notion of a “pause” in global warming and thus undercut the arguments of global warming skeptics. Among them is Rep. Lamar Smith (R-Tex.), who chairs the House Committee on Science, Space and Technology. NOAA’s data sets are used by climate scientists to take temperature measurements worldwide.
Smith has subpoenaed four top NOAA officials seeking internal e-mails and documents relating to the study, which he alleged last week in a letter to Commerce Secretary Penny Pritzker was “rushed to publication” and may have violated the agency’s scientific integrity standards. The chairman also has threatened to subpoena Pritzker, whose department includes NOAA, if she does not turn over the internal deliberations.
What makes this feud so difficult to referee is not just the complexity of climate science; it is also that Smith and his committee have yet to offer details of the allegations that the research was rushed.
Smith told Pritzker in his letter that his claim is based on information from whistleblowers who have told the committee that some employees at NOAA had concerns about the research. But committee staff members have declined to provide details about these concerns, saying that disclosing the specifics could jeopardize the panel’s sources and their anonymity.
Pressed for more details, committee aides last week pointed as an example to new temperature data that was made publicly available earlier this month and questioned how the scientists used it. The data came from a larger number of measuring stations around the world than previous data sets.
“NOAA should not be publishing headline-grabbing results based on data sets that have not been adequately vetted and were not available to the public,” an aide to the House committee said.
Smith has said the scientists were in a hurry to have their findings published because they wanted to influence Obama’s Clean Power Plan, which aims to cut carbon emissions from power plants, and upcoming international negotiations in Paris on climate change.
“This isn’t an easy high school science experiment where you do it and you get results and write them up,” another aide said. “There are huge data sets from all over the world. They need to be studied. Every time the sets are changed they have to be worked on to make sure the data set is now valid.”
The scientists, however, say their research was based on an earlier version of the data that had been made public and examined by other climate experts. The study published in Science was not based on the updated data released earlier this month, although the two versions are very similar, according to NOAA officials and one of the study’s authors.
That author, Thomas Peterson, described in an interview some of the internal tensions at NOAA between the scientists and computer engineers who were writing software code for the data and wanted more time to make sure it was reliable. The scientists felt confident using the data that had already been made public and were ultimately vindicated by the latest version.
The conclusions of the Science paper were based on corrections and adjustments to even earlier land and sea temperature measurements. These were intended to address what scientists described as measurement biases in readings taken of ocean temperatures and land temperatures that did not fully account for the rapidly warming Arctic.
NOAA published the first updates to the land temperature data set in October 2013 in the Geoscience Data Journal. The revised sea surface data were published in the Journal of Climate in October 2014. These updates were the basis of the study in Science, NOAA officials said.
That combined data set was available publicly in July 2014, officials said.
As NOAA scientists examined the data, they discovered that warming trends over the past few decades would be substantially larger than what the earlier data set indicated, recalled Peterson, who retired from NOAA as principal scientist in July.
“Was there a rush to get [the research] out? No,” he said. “Did we want to get this out to advance the science? Of course.”
Peterson acknowledged that tensions over timing developed between the scientists and a team of computer engineers — some contractors, some civil servants — who were rewriting the software code to process the data once it was collected from stations worldwide. The engineers wanted more time to test and retest the software to ensure its reliability, he recalled. The scientists argued that it was taking too long.
“We’re talking about a time lag of years between the science and when they thought the software testing would be ready because of this question of whether one piece of software might develop a glitch,” said Peterson, now president of the World Meteorological Organization’s Commission for Climatology.
To accommodate the engineers, he said, the submission to the Geoscience Data Journal was delayed by six months.
Still, the engineers were frustrated that the scientists were pressing ahead with research based on the data. “They viewed it as putting the cart before the horse,” Peterson said. “Those of us, like myself, who knew the importance of getting this work out literally years ago were very frustrated by the process … that slowed the release down.”
Peterson stressed that the scientific analysis itself was not rushed. “Indeed just the opposite is true,” he said.
Peterson and NOAA officials said the scientists decided to submit their research to Science before the most recent data set was ready for release this month. They said the new version, which contains even more data than the set used for the Science study, confirms its findings.

Sunday, November 22, 2015

Kevin Anderson, Nature: Duality in Climate Science

by Kevin Anderson, Nature Geoscience, (2015)

Delivery of palatable 2 °C mitigation scenarios depends on speculative negative emissions or changing the past. Scientists must make their assumptions transparent and defensible, however politically uncomfortable the conclusions.

In July, Paris hosted “Our Common Future Under Climate Change,” a key conference organized as a prelude to the political negotiations scheduled for December 2015, also in Paris. In the conference summary that immediately followed, the scientific committee noted that limiting “warming to less than 2 °C” is “economically feasible” and “cost effective” (1).

The statement chimed with the press release that accompanied the Synthesis Report published by the Intergovernmental Panel on Climate Change (IPCC) last November, in which IPCC representatives suggested that “to keep a good chance of staying below 2 °C, and at manageable costs, our emissions should drop by 40–70% globally between 2010 and 2050, falling to zero or below by 2100,” and that mitigation costs would be so low that “global economic growth would not be strongly affected” (2).

If these up-beat — and largely uncontested — headlines are to be believed, reducing emissions in line with a reasonable-to-good chance of meeting the 2 °C target requires an accelerated evolution away from fossil fuels; it does not, however, necessitate a revolutionary transition in how we use and produce energy. Such conclusions are forthcoming from many Integrated Assessment Models, which are key tools for informing policy makers of alternative climate change futures. In these models, prices, markets and human behaviour are brought together with the physics of climate change to generate “policy-relevant” and cost-optimized emission scenarios that typically offer highly optimistic views of the future. However, these positive outcomes are delivered through unrealistically early peaks in global emissions (3), or through the large-scale rollout of speculative technologies intended to remove CO2 from the atmosphere (3,4), yielding so-called negative emissions.

In stark contrast, I conclude that the carbon budgets associated with a 2 °C threshold demand profound and immediate changes to the consumption and production of energy. According to the IPCC’s Synthesis Report, no more than 1,000 billion tonnes (1,000 Gt) of CO2 can be emitted between 2011 and 2100 for a 66% chance (or better) of remaining below 2 °C of warming (over preindustrial times) (5).

Without resorting to “changing the past,” or making the leap of faith that substantial amounts of CO2 can be removed from the atmosphere in the coming decades, the IPCC’s 1,000-Gt budget requires an end to all carbon emissions from the energy system by 2050 — five decades earlier than the IPCC headline suggests.

Geo-engineering as systemic bias

In most Integrated Assessment Models, 2 °C carbon budgets are effectively increased through the adoption of negative-emission technologies. These technologies are currently at little more than a conceptual stage of development, yet are ubiquitous within 2 °C scenarios. Nowhere is this more evident than in the IPCC’s scenario database (6).

Of the 400 scenarios that have a 50% or better chance of no more than 2 °C warming (with three scenarios removed due to incomplete data), 344 assume the successful and large-scale uptake of negative-emission technologies. Even more worryingly, in all 56 scenarios without negative emissions, global emissions peak around 2010, which is contrary to available emissions data (7).

In plain language, the complete set of 400 IPCC scenarios for a 50% or better chance of meeting the 2 °C target work on the basis of either an ability to change the past, or the successful and large-scale uptake of negative-emission technologies. A significant proportion of the scenarios are dependent on both.

Reality check

Building on the concept of carbon budgets (8–10), I present an alternative line of reasoning that suggests a radically different challenge to that dominating the current discourse on climate change.

As the IPCC reiterates (in section 2.1 of ref. 5), it is cumulative emissions of CO2 that matter in determining the global mean surface warming out to 2100. Specifically, and as noted earlier, the IPCC’s Synthesis Report concludes that no more than 1,000 Gt of CO2 can be emitted between 2011 and 2100 for a 66% chance, or better, of remaining below a 2 °C rise (5).

However, between 2011 and 2014, CO2 emissions from energy production alone amounted to about 140 Gt of CO2 (ref. 7). To limit warming to no more than 2 °C, the remaining 860 Gt of CO2 (out to 2100) must be apportioned between the three principal emission sources: energy, deforestation and cement production (for cement, I count process CO2 only; energy-related cement emissions are accounted for in total energy CO2).

Assuming concerted efforts to reduce emissions from all three sources, I base deforestation and land-use change emissions for the period 2011–2100 on RCP4.5, the IPCC’s most ambitious deforestation pathway to exclude net-negative land-use emissions. I therefore adopt a highly optimistic total deforestation budget of about 60 Gt of CO2.

Process emissions from cement production must be considered separately. Industrialization throughout poorer nations and the construction of low-carbon infrastructures within industrialized nations will continue to drive rapid growth in process emissions, which are currently growing at about 7% per year (R. Andrew, personal communication, and ref. 11). Although lower-carbon alternatives such as carbon capture and storage and the prudent use of cement may reduce some of this early growth (R. Andrew, personal communication, and ref. 11), in the longer term these emissions must be eliminated entirely. A provisional analysis, building on the latest process-emission trends (personal communications from both K. West and R. Andrew, and refs 11,12), suggests process emissions from cement production could be constrained to around 150 Gt of CO2 from 2011 to their eradication later in the century.

Consequently, the remaining budget for energy-only emissions over the period 2015–2100, for a likely chance of staying below 2 °C, is about 650 Gt of CO2.
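The apportionment above is simple bookkeeping, and can be verified directly. A minimal sketch using only the figures quoted in the text (all values in Gt CO2):

```python
# Reconstructing the carbon-budget bookkeeping described in the text.
# All figures (in Gt CO2) are taken directly from the passage above.
total_budget_2011_2100 = 1000    # 66%-or-better chance of staying below 2 degrees C
energy_2011_2014 = 140           # energy-sector emissions already spent
deforestation_2011_2100 = 60     # "highly optimistic" land-use budget
cement_process_2011_2100 = 150   # cement process CO2 only

remaining_after_2014 = total_budget_2011_2100 - energy_2011_2014
energy_only_2015_2100 = (remaining_after_2014
                         - deforestation_2011_2100
                         - cement_process_2011_2100)

print(remaining_after_2014)    # 860
print(energy_only_2015_2100)   # 650
```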

Unpalatable repercussions

A carbon budget this tight suggests a profoundly more challenging time frame and rate of mitigation than that typically asserted by many within the scientific community. It demands a dramatic reversal of current trends in energy consumption and emissions growth: more than a fifth of the remaining budget has been emitted in just the past four years. To avoid exceeding 650 Gt, global mitigation rates must rapidly ratchet up to around 10% per year by 2025, continuing at such a rate towards the virtual elimination of CO2 from the energy system by 2050.
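A rough check that such a pathway fits the 650 Gt budget. The starting level of roughly 36 Gt CO2 per year of energy emissions in 2015 and the linear ramp-up of cuts are my illustrative assumptions, not a pathway published in the commentary:

```python
# Illustrative emission pathway: annual cuts ramp linearly from 1% in
# 2016 to 10% per year by 2025, then hold at 10% per year thereafter.
emissions = 36.0   # Gt CO2/yr from energy in 2015 (assumed starting level)
total = emissions  # cumulative energy emissions, 2015-2100

for year in range(2016, 2101):
    cut = min(0.10, 0.01 * (year - 2015))
    emissions *= 1 - cut
    total += emissions

print(round(total))  # cumulative Gt CO2; comes in under the 650 Gt budget
```

Under these assumptions the pathway spends roughly four-fifths of the budget, illustrating how little slack remains for a slower ramp-up.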

The severity of such cuts would probably exclude the use of fossil fuels, even with carbon capture and storage (CCS), as a dominant post-2050 energy source. Only if the life cycle carbon emissions of CCS could be reduced by an order of magnitude from those postulated for an efficiently operating gas-CCS power station (typically around 80 g CO2 per kilowatt-hour (13)), could fossil fuels play any significant role beyond 2050.

Delivering on such a 2 °C emission pathway cannot be reconciled with the repeated high-level claims that in transitioning to a low-carbon energy system “global economic growth would not be strongly affected” (2). Certainly it would be inappropriate to sacrifice improvements in the welfare of the global poor, including those within wealthier nations, for the sake of reducing carbon emissions.

But this only puts greater pressure on the lifestyles of the relatively small proportion of the globe’s population with higher emissions — pressure that cannot be massaged away through incremental escapism. With economic growth of 3% per year, the reduction in carbon intensity of global gross domestic product would need to be nearer 13% per year; higher still for wealthier industrialized nations, and higher yet again for those individuals with well above average carbon footprints (whether in industrial or industrializing nations).
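The "nearer 13%" figure follows from the identity emissions = GDP × carbon intensity: if emissions must fall about 10% per year while GDP grows 3% per year, intensity must fall fast enough to offset both. A quick check:

```python
gdp_growth = 0.03    # assumed annual GDP growth rate
emission_cut = 0.10  # required annual cut in emissions, per the text

# Emissions factor = GDP factor * intensity factor, so the intensity
# factor is the ratio of the two, and its shortfall from 1 is the
# required annual decline in carbon intensity.
intensity_factor = (1 - emission_cut) / (1 + gdp_growth)
intensity_decline = 1 - intensity_factor

print(f"{intensity_decline:.1%} per year")  # ~12.6%, i.e. "nearer 13%"
```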

A candid assessment

The IPCC’s Synthesis Report and the scientific framing of the mitigation challenge in terms of carbon budgets are important steps forward. As scientists, we must now leverage the clarity gained by the budget concept to combat the almost global-scale cognitive dissonance in acknowledging its quantitative implications. Yet, so far, we simply have not been prepared to accept the revolutionary implications of our own findings, and even when we do we are reluctant to voice such thoughts openly.

Instead, my long-standing engagement with many colleagues in science leaves me in no doubt that although they work diligently, often against a backdrop of organized skepticism, many are ultimately choosing to censor their own research. Explicit and quantitative carbon budgets provide a firm foundation on which policy makers and civil society can build a genuine low-carbon society. But the job of scientists remains pivotal. It is incumbent on our community to communicate our research clearly and candidly to those delivering on the climate goals established by civil society; to draw attention to inconsistencies, misunderstandings and deliberate abuse of the scientific research.

It is not our job to be politically expedient with our analysis or to curry favor with our funders. Whether our conclusions are liked or not is irrelevant. Yet, as we evoke a deus ex machina (such as speculative negative emissions or changing the past) to ensure our analyses conform with today’s political and economic hegemony, we do society a grave disservice — the repercussions of which will be irreversible. 

Kevin Anderson is at the Tyndall Centre for Climate Change Research, University of Manchester, Pariser Building, Sackville Street, Manchester M13 9PL, UK.


1. Our Common Future Under Climate Change: Outcome Statement (CFCC15 Scientific Committee, 2015).
2. Concluding Instalment of the Fifth Assessment Report, IPCC press release (2 November 2014).
3. The Emissions Gap Report 2014 (United Nations Environment Programme, 2014).
4. Fuss, S. et al. Nature Clim. Change 4, 850–853 (2014).
5. Climate Change 2014: Synthesis Report (eds Pachauri, R. K. et al.) (IPCC, 2014).
6. Climate Change 2014: Mitigation of Climate Change (eds Edenhofer, O. et al.) (Cambridge Univ. Press, 2014).
7. Global Carbon Atlas: Emissions (The Global Carbon Project).
8. Anderson, K. et al. Energy Policy 36, 3714–3722 (2008).
9. Anderson, K. & Bows, A. Phil. Trans. R. Soc. A 369, 20–44 (2011).
10. Frame, D. et al. Nature Geosci. 7, 692–693 (2014).
11. Cement Technology Roadmap 2009 (International Energy Agency, 2009).
12. Energy Technology Perspectives 2014 (International Energy Agency, 2014).
13. Hammond, G. et al. Energy Policy 52, 103–116 (2013).