Blog Archive

Wednesday, June 30, 2010

Andrew Freedman, WaPo: precipitation extremes already increasing in frequency and severity due to global climate change

String of floods raises climate change questions

Call it the spring of flash floods. Rare and deadly flash flooding events have struck several parts of the south-central U.S. from Tennessee to Oklahoma this spring, with two remarkable events occurring in just the past five days: the astounding six-to-ten-inch gully washer that resulted in numerous swift-water rescues in Oklahoma City yesterday morning, and the tragic deluge in rural Arkansas late last week.

Cars are stranded and submerged by flood water in Oklahoma City after heavy rain hit the area. (Alonzo Adams-AP)
With precipitation extremes already increasing in frequency and severity due to global climate change, it is especially important that journalists and the public learn how to discuss the climate context of such extreme weather events, provided such discussions stick to the scientific evidence.

Take the Arkansas flash flood for example, which toppled trees, lifted cars, and swept at least 20 people to their deaths in the wee hours of the morning on June 11. That flood joins the ranks of a long history of deadly flash floods in the United States.

One might think the rainfall was extremely unusual, even unprecedented. However, the approximately 6.83 inches of rain that fell in 24 hours in the Arkansas event was actually a 1-in-10-year event with respect to the 24-hour rainfall total (it may have been rarer if one were to examine historical six-hour precipitation data). A confluence of factors made the heavy rain so deadly -- most of it fell in just a few hours, in an area primed for rapid rises in small rivers, creeks and streams, while a large and vulnerable population of campers was sleeping in a remote area.
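Return periods like the "1-in-10-year" figure above come from the statistics of annual maxima: a 1-in-N-year event is one whose annual exceedance probability is 1/N. A minimal sketch of the empirical version, using made-up rainfall values rather than real Arkansas data:

```python
# Sketch: estimating the return period of a 24-hour rainfall total from a
# hypothetical record of annual maxima. The values below are invented for
# illustration; real estimates use long station records and fitted
# distributions (e.g., Gumbel), not a bare empirical count.

annual_max_24h_in = [3.1, 4.0, 2.8, 5.2, 3.6, 7.1, 4.4, 3.9, 6.9, 4.8,
                     5.5, 3.3, 7.4, 4.1, 2.9, 6.2, 5.0, 3.7, 8.0, 4.6]

def return_period(series, threshold):
    """Empirical return period in years: 1 / fraction of years >= threshold."""
    exceedances = sum(1 for x in series if x >= threshold)
    if exceedances == 0:
        return float("inf")
    return len(series) / exceedances

print(return_period(annual_max_24h_in, 6.83))  # years between comparable events
```

Note that a "1-in-500-year" figure, as in the Oklahoma City case, is a statement about probability at one location, not a claim that 500 years of records exist there.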

In some respects, the Arkansas flood bore a resemblance to the infamous 1976 Big Thompson Canyon flood in Colorado, which killed 145 people and also struck a popular campground and recreation area. The Arkansas rains caused an astonishingly rapid rise in the Little Missouri River and other small rivers and streams. A gauge maintained by the U.S. Geological Survey reveals that the Little Missouri River climbed almost 20 feet in just a few hours at the nearby town of Langley. At 2 a.m. central time the river was at 3.81 feet, and by 5:30 a.m. it reached a whopping 23.39 feet. This exceeded the previous highest flood in that location by about 10 feet!

A plot of the stream flow from a nearby gauge on the Caddo River also depicts the shockingly swift rise of the water, and shows why people in the campground had very little -- if any -- time to prepare for the surge of water.

Height of the Caddo River during the flash flood event on June 11, 2010, as shown on a USGS streamflow gauge. Credit: USGS.

The climate change context of the recent flood events is intriguing, and like most things climate change-related, not exactly cut and dry (so to speak). The Tennessee and Oklahoma rainstorms were far more unusual than the Arkansas flood. The Tennessee event, in which 13-19 inches of rain fell during a two-day period, flooding much of the state including downtown Nashville, was around a 1-in-1,000-year event. And the Oklahoma City floods yesterday were a 1-in-100-year event. Some areas in and around Oklahoma City eclipsed their monthly average rainfall totals for June in under 12 hours.

Doppler radar estimated precipitation totals for Oklahoma City on June 14. Image credit: Weather Underground.
On the one hand, these were all extreme events of the sort that scientists have found are already becoming more common, and are likely to become more frequent and severe in the next few decades as more water vapor is added to the atmosphere in response to a warming climate. (See an illuminating interview about this with climate researcher Kevin Trenberth, posted yesterday at Joe Romm's Climate Progress blog.)

On the other hand, it's impossible to say that a specific extreme event was caused by climate change, and flash floods have occurred throughout U.S. history, killing an average of about 100 people per year.
So, with this information in mind, how should journalists cover extreme events?

Responding to the Arkansas flood, Andy Revkin of the New York Times' DotEarth blog wrote that he is resistant to the idea -- put forward by climate blogger Joe Romm -- that journalists should refer to such precipitation extremes as "global warming-type" events.

"What is changing is the frequency -- and in many places the exposure to risk as people congregate and build in flood zones. But given the scope of this tragedy, more on quibbles over semantics can wait for another day," Revkin wrote on Saturday.

However, the question of whether to raise climate change in discussions of flash floods (and other extreme events) constitutes more than a quibble over semantics. The media has a responsibility to report what the science says, even in the context of a breaking news story, such as a flood event or heat wave. The science has become clearer, although by no means certain, that local precipitation extremes may be connected to climate change. Yet, to date, the mainstream media has shied away from raising climate change in extreme event coverage. This is unfortunate, because it constitutes a missed opportunity to make climate change relevant to people in the here and now, rather than an abstract concept in the distant future.

As I wrote last spring:
When an extreme event occurs, a reporter is often caught in a quandary. If we overplay the causal link between climate change and the event, then we can rightly be accused of being alarmist...
Yet, if journalists ignore the scientific studies that show that some types of extreme events are consistent with what is expected due to climate change, then we may be guilty of a sin of omission.
The media barely raised the topic of climate change in response to the record Tennessee floods last month, which were so unusual that they were crying out for a scientific discussion of the climate connection. In fact, a good argument can be made that -- by ignoring the subject of climate change -- the media coverage of those floods was actually inconsistent with the scientific evidence.

Reporters should take each event on a case-by-case basis, and speak with scientists who can shed light on whether an extreme event may be related to climate change or not. By failing to ask such questions, reporters risk continuing to provide incomplete coverage. More journalists should heed the advice given by meteorologist Stu Ostro of The Weather Channel, who wrote in response to record rains last year in his home state of Georgia and other weather extremes (although Ostro was addressing weather experts, his advice is relevant to the press):
It behooves us to understand not only theoretical expected increases in heavy precipitation (via relatively slow/linear changes in temperatures, evaporation, and atmospheric moisture) but also how changing circulation patterns are already squeezing out that moisture in extreme doses and affecting weather in other ways.
While it's important to consider what may happen in 50 or 100 or 200 years, and debate what should be done about that... we need to get a grip on what's happening *now*.

Petermann Glacier explained by Mauri Pelto of "From a Glacier's Perspective"

Petermann Glacier explained by "From a Glacier's Perspective"

from the blog by Mauri Pelto:  From a Glacier's Perspective, March 27, 2010

The Petermann Glacier in northwest Greenland is significantly different from the fast-flowing large outlet glaciers, such as the Jakobshavn and Helheim Glaciers, that we hear so much about. Petermann Glacier is much thinner at the calving front and moves much more slowly. A detailed review of some of the differences is explored in an article I wrote for RealClimate in 2008. In 2008, Petermann Glacier lost a substantial area, 29 km², due to calving, as noted by Jason Box at Ohio State, and a crack well back of the calving front indicates another 150 km² is in danger.

Petermann Glacier has a floating section 16 km wide and 80 km long -- 1,280 km² in all, the longest section of floating glacier in the Northern Hemisphere. The area of floating ice on the Jakobshavn, in contrast, has varied from year to year with retreat but has remained less than 40 km². The size of the floating tongue, combined with the glacier's slow velocity of about 1 km per year, provides the potential for longer exposure to, and greater melting at, the base of the glacier. The slow velocity results in a greater duration of surface and basal melting, which thins the glacier from 600-700 m at the grounding line to a mere 60-70 m at the calving front, primarily through basal melting. That is, 500-600 m of thickness -- about 90% -- is lost to melting before the ice reaches the calving front. The calving front protrudes a mere 5-10 m above sea level, not the towering ice cliff you generally envision for a large Greenland outlet glacier; this reflects the fact that the ice at the front is only 60-70 m thick (Higgins, 1990). The glacier velocity of close to 1 km per year, or about 3 m per day, is roughly 10% of the velocity of Jakobshavn (Rignot, 2000). The volume of ice lost is thus much less than that from the loss of a comparable area by Jakobshavn, because the ice is an order of magnitude thinner.

Radar assessment of glacier velocity by Howard Zebker of Stanford University indicates acceleration near the grounding line as the glacier narrows and friction is reduced, then deceleration as the glacier thins after becoming fully afloat.
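The comparison with Jakobshavn follows directly from the figures in the passage; as a quick check of the arithmetic (all inputs taken from the text -- this is bookkeeping, not a glaciological model):

```python
# Figures quoted in the passage above, mid-range where a range is given.
thickness_grounding_m = 650.0   # ~600-700 m at the grounding line
thickness_front_m = 65.0        # ~60-70 m at the calving front
velocity_m_per_yr = 1000.0      # ~1 km/yr
tongue_width_km, tongue_length_km = 16.0, 80.0

fraction_lost = 1 - thickness_front_m / thickness_grounding_m
velocity_m_per_day = velocity_m_per_yr / 365.0
tongue_area_km2 = tongue_width_km * tongue_length_km

print(f"thickness lost to melting: {fraction_lost:.0%}")   # ~90%
print(f"velocity: {velocity_m_per_day:.1f} m/day")         # ~2.7 m/day
print(f"floating tongue: {tongue_area_km2:.0f} km^2")      # 1280 km^2
```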

The key to this glacier’s second major ice loss this decade, after limited retreat in the last century, is thinning of the floating tongue. The thinning weakens the glacier and reduces the degree of anchoring to the fjord walls. The loss of this ice should then lead to acceleration, developing more crevassing and glacier retreat. The crack seen in the image of Petermann Glacier (ASTER image provided by Ian Howat of Ohio State) is more of a rift, like those on Larsen Ice Shelf, than a crevasse.

The transverse rift is further connected to longitudinal-marginal rifts, illustrating the poor connection of the Petermann Glacier to its margin and the lack of a stabilizing force that margin provides, even 15 km behind the calving front. This is not the only rift of its kind on the glacier. Also note that, as on the Larsen Ice Shelf, the rift crosscuts surface streams. These streams illustrate the significance of surface melting, which reached a record in 2008 (Box and others, 2009). The surface melting does not enhance flow on this section of the glacier at all, as it is already afloat.

The images below are a series of Landsat images provided by the USGS, from 2002, 2006 and 2007. These illustrate the shift in the terminus and in the position of key rifts A, B and C. The distance back from the terminus has diminished for A and B from 2002 to 2007. From 2006 to 2007, the shift in the position of C is also evident. The final image is at a larger scale, indicating the entire valley section of the Petermann Glacier. The darker blue hue indicates that this is bare ice in the ablation zone; this is true in each year examined.
The transition to the lighter hue indicates the snowline, which is a short distance above the valley tongue. Petermann Glacier is poised to lose a greater area as it retreats than Jakobshavn Glacier, but a smaller total volume.

Higgins, A. 1990. Northern Greenland glacier velocities and calf ice production. Polarforschung, 60, 1-23.


Sharon Begley: Newspapers retract 'Climategate' claims, but damage still done

Newspapers retract 'Climategate' claims, but damage still done

Vindicated too late? Penn State climatologist Michael Mann. Greg Rico / AP

by Sharon Begley, Newsweek, June 25, 2010

A lie can get halfway around the world while the truth is still putting its boots on, as Mark Twain said (or “before the truth gets a chance to put its pants on,” in Winston Churchill’s version), and nowhere has that been more true than in "climategate." In that highly orchestrated, manufactured scandal, e-mails hacked from computers at the University of East Anglia’s climate-research group were spread around the Web by activists who deny that human activity is altering the world’s climate in a dangerous way, and spun so as to suggest that the scientists had been lying, cheating, and generally cooking the books.
But not only did British investigators clear the East Anglia scientist at the center of it all, Phil Jones, of scientific impropriety and dishonesty in April; an investigation at Penn State also cleared PSU climatologist Michael Mann of “falsifying or suppressing data, intending to delete or conceal e-mails and information, and misusing privileged or confidential information” in February. In perhaps the biggest backpedaling, The Sunday Times of London, which led the media pack in charging that IPCC reports were full of egregious (and probably intentional) errors, retracted its central claim—namely, that the IPCC statement that up to 40% of the Amazonian rainforest could be vulnerable to climate change was “unsubstantiated.” The Times also admitted that it had totally twisted the remarks of one forest expert to make it sound as if he agreed that the IPCC had screwed up, when he said no such thing.
It’s worth quoting the retraction at some length:
The article "UN climate panel shamed by bogus rainforest claim" (News, Jan 31, 2010) stated that the 2007 Intergovernmental Panel on Climate Change (IPCC) report had included an “unsubstantiated claim” that up to 40% of the Amazon rainforest could be sensitive to future changes in rainfall. The IPCC had referenced the claim to a report prepared for the World Wildlife Fund (WWF) by Andrew Rowell and Peter Moore, whom the article described as “green campaigners” with “little scientific expertise.” The article also stated that the authors’ research had been based on a scientific paper that dealt with the impact of human activity rather than climate change.

In fact, the IPCC’s Amazon statement is supported by peer-reviewed scientific evidence. In the case of the WWF report, the figure . . . was based on research by the respected Amazon Environmental Research Institute (IPAM) which did relate to the impact of climate change. We also understand and accept that . . . Dr Moore is an expert in forest management, and apologise for any suggestion to the contrary.

The article also quoted criticism of the IPCC’s use of the WWF report by Dr Simon Lewis, a Royal Society research fellow at the University of Leeds and leading specialist in tropical forest ecology. We accept that, in his quoted remarks, Dr Lewis was making the general point that both the IPCC and WWF should have cited the appropriate peer-reviewed scientific research literature. As he made clear to us at the time, including by sending us some of the research literature, Dr Lewis does not dispute the scientific basis for both the IPCC and the WWF reports’ statements on the potential vulnerability of the Amazon rainforest to droughts caused by climate change. . . .  A version of our article that had been checked with Dr Lewis underwent significant late editing and so did not give a fair or accurate account of his views on these points. We apologise for this.
In another retraction you never heard of, a paper in Frankfurt took back (apologies; the article is available only in German) its reporting that the IPCC had erred in its assessment of climate impacts in Africa.

The Times's criticism of the IPCC—look, its reports are full of mistakes and shoddy scholarship!—was widely picked up at the time it ran, and has been an important factor in turning British public opinion sharply against the established science of climate change. Don’t expect the recent retractions and exonerations to change that. One of the strongest, most-repeated findings in the psychology of belief is that once people have been told X, especially if X is shocking, if they are later told, “No, we were wrong about X,” most people still believe X. As Twain and Churchill knew, sometimes the truth never catches up with the lie, let alone overtakes it. As I wrote last summer in a story about why people believe lies even when they’re later told the truth, sometimes people’s mental processes simply go off the rails. 

Tuesday, June 29, 2010

Tea Party denialists postpone Las Vegas convention due to heat wave

How hot is it? So hot that 8 countries in Africa and Asia set all-time high temperature records

And the Tea Party postponed their Las Vegas convention

by Joseph Romm, Climate Progress, June 29, 2010

Before getting to the irony of the anti-science Tea Partiers canceling their big convention because the weather is too hot, let’s look at some of the staggering extreme weather events around the globe.

In China, “The Southern Daily said over 600 millimetres (24 inches) of rain fell in Guangdong’s Huilai county over a six-hour period on Friday, a 500-year record.”  That’s two feet of rain in 6 hours!

As Dr. Kevin Trenberth, head of the Climate Analysis Section at the National Center for Atmospheric Research, told me earlier this month:
There is a systematic influence on all of these weather events now-a-days because of the fact that there is this extra water vapor lurking around in the atmosphere than there used to be say 30 years ago. It’s about a 4% extra amount, it invigorates the storms, it provides plenty of moisture for these storms and it’s unfortunate that the public is not associating these with the fact that this is one manifestation of climate change. And the prospects are that these kinds of things will only get bigger and worse in the future.
The latest record-smashing U.S. superstorm was two weeks ago in Oklahoma.  Now we know it was even more record-setting than initially thought — see Capital Climate’s update “Oklahoma City Paralyzed By Flash Floods“:
The final daily rainfall for Oklahoma City is 7.62″. This breaks the all-time daily rainfall record for any day in any month.
The Weather Channel reported:
Oklahoma City Micronet (OKCNET) reports that a rainfall observation of 10.21″ in OKC has exceeded the 1-in-500 year rainfall total for a 12 hour period.
Moreover, the 9 inches that fell in 6 hours meets the requirements for a 1 in 500 year flood event.
That’s almost as impressive as Tennessee’s 1000-year deluge.  As with Tennessee, New England, and Georgia, what makes OK’s deluge doubly remarkable is that it was not the remnant of a tropical storm (see “Weather Channel expert on Georgia’s record-smashing global-warming-type deluge“).

As for the heat, meteorologist Jeff Masters reports:
Extreme heat wave in Africa and Asia continues to set all-time high temperature records
A withering heat wave of unprecedented intensity and areal coverage continues to smash all-time high temperature records in Asia and Africa. As I reported earlier this week, Kuwait, Saudi Arabia, Iraq, Chad, Niger, Pakistan, and Myanmar have all set new records for their hottest temperatures of all time over the past six weeks. The remarkable heat continued over Africa and Asia late this week. The Asian portion of Russia recorded its highest temperature in history yesterday, when the mercury hit 42.3 °C (108.1 °F) at Belogorsk, near the Amur River border with China. The previous record was 41.7 °C (107.1 °F) at nearby Aksha on July 21, 2004. (The record for European Russia is 43.8 °C (110.8 °F), set on August 6, 1940, at Alexandrov Gaj near the border with Kazakhstan.) Also, on Thursday, Sudan recorded its hottest temperature in its history when the mercury rose to 49.6 °C (121.3 °F) at Dongola. The previous record was 49.5 °C (121.1 °F), set in July 1987 at Abu Hamed.
We’ve now had eight countries in Asia and Africa, plus the Asian portion of Russia, that have beaten their all-time hottest temperature record during the past two months. This includes Asia’s hottest temperature of all-time, the astonishing 53.5 °C (128.3 °F) mark set on May 26, 2010, in Pakistan. All of these records are unofficial, and will need to be certified by the World Meteorological Organization (WMO). According to Chris Burt, author of Extreme Weather, the only year which can compare is 2003, when six countries (the UK, France, Portugal, Germany, Switzerland, and Liechtenstein) all broke their all-time heat records during that year’s notorious summer heat wave. Fortunately, the residents of the countries affected by this summer’s heat wave in Asia and Africa are more adapted to extreme high temperatures, and we are not seeing the kind of death tolls experienced during the 2003 European heat wave (30,000 killed). This week’s heat wave in Africa and the Middle East is partially a consequence of the fact that Earth has now seen three straight months with its warmest temperatures on record, according to NOAA’s National Climatic Data Center. It will be interesting to see if the demise of El Niño in May will keep June from becoming the globe’s fourth straight warmest month on record.
Note:  Masters updated his post, so I have updated this one.

I know that, for the anti-science crowd, this is all a coincidence, but when you smash so many records in huge countries or entire continents, at the same time that NASA reports that globally it was easily the hottest spring — and Jan-May — in the temperature record (and NOAA, too), well, maybe some major media outlet somewhere will make the link.  No, it probably won’t be the Washington Post.

I previously discussed the record-breaking temperature sweeping the nation (see “Record heat sweeps DC, nation, and world“).
Total number of daily high and low temperature records set in the U.S., data from NOAA National Climatic Data Center, background image © Kevin Ambrose.  Includes historical daily observations archived in NCDC’s Cooperative Summary of the Day data set and preliminary reports from Cooperative Observers and First Order National Weather Service stations.  All stations have a Period of Record of at least 30 years.
Finally, the Tea Party story is just too ironic not to report.

The Tea Party crowd famously doesn’t believe in global warming (see “Virginia AG mocks dangers of CO2, telling Tea Partiers to hold their breath and make the EPA happy“).  They even invite the most extremist disinformers and purveyors of hate speech to their events (see “Irony-gate 2: Modern day Tea Partiers outsource denial to Lord Monckton — a British peer!“).

As TPM Cafe reported yesterday in their story, “Tea Party Convention Postponed — Vegas In July Is Too Hot!”
A planned “unity” convention for tea partiers is being pushed to the fall, with organizers scrapping a major gathering with just 19 days before it was scheduled to go off. They cited heat….
Tea Party Nation announced in an email to members this weekend that their “unity” convention, planned for July 15-17 in Las Vegas, would be delayed….
Full email below:
“This week, there were several meetings of the Executive Planning Committee for the convention. We concluded it would more advantageous to hold the convention in the middle of October just prior to the November elections….
“This was not a spur of the moment decision contrary to anyone’s opinion or thoughts….
“We were so excited about the tremendous success of the first convention, we jumped into this second convention without considering the timing. The heat in Las Vegas in July is keeping many who would like to participate from attending…..”
Seriously.  Nobody told the Tea Partiers that it’s hot in July in Vegas?  True, there have been two June mini-heat waves in Vegas.


But in fact, temperatures are only running “3 to 6 degrees above normal,” so this isn’t even the record smashing stuff that I normally write about.  No, this is merely the kind of hot weather any group capable of rational planning would have to anticipate.  Jumping into things without actually thinking them through at all is a perfect metaphor for the Tea Party crowd, though.

If this kind of warmth chases people away from Vegas now, I wonder how many people are going to show up in the summer down the line if we stay anywhere near our current emissions path. If you like it hot, you ain’t seen nothing yet (see “Our hellish future: Definitive NOAA-led report on U.S. climate impacts warns of scorching 9 to 11 °F warming over most of inland U.S. by 2090 with Kansas above 90 °F some 120 days a year — and that isn’t the worst case, it’s business as usual!”).

In a terrific March presentation, climate scientist Katharine Hayhoe has a figure of what business as usual (A1FI, or 1,000 ppm) would mean (derived from the NOAA-led report):


Who knows, maybe they’ll put a dome over the entire city and try to air condition that with, say, concentrated solar thermal power (see “Solar Baseload — a core climate solution”).  Absent that, one would expect a mass migration up to the north and northeast.  By century’s end, what happens in Vegas, in the summertime, may not be very much at all.

Note:  In response to a query, extreme weather expert Chris Burt e-mails me (via Jeff Masters):
The Chinese (and world) record for a 6-hour rainfall is 33.07″ at Muduocaidang, Inner Mongolia, on Aug. 1-2, 1977. I have never been able to locate this place, but being transliterated from Chinese it is hard to tell what the ‘real’ name might be or if the name has changed since 1977. The “500-year record” is a hydrological statement (such a rainfall has a once in 500-year return period for that location) not that it broke a 500-year old record.

BBC: Cryosat-2 sees floes, leads, depth of Antarctic, Arctic ice w/a SAR-interferometric radar altimeter (SIRAL)

Cryosat-2 focuses on ice target

BBC News, June 29, 2010
Infographic (BBC) 
Cryosat tracks over the Ross Ice Shelf and Ross Sea on 11 April 2010

The Cryosat-2 mission is delivering on its promise to make high-precision radar measurements of polar ice.
The first data from the European spacecraft has been presented at an Earth observation meeting in Norway.
The information clearly shows Cryosat has the required sensitivity to assess the state of Antarctic and Arctic ice, according to its lead scientist.

"All of the measurement concepts have been confirmed," UCL Professor Duncan Wingham told BBC News.

The European Space Agency's Cryosat-2 satellite was launched in April on a quest to map the thickness and shape of the Earth's polar ice cover.

It carries a single instrument -- a SAR/Interferometric Radar Altimeter (Siral) -- which has a capability that far exceeds the previous space-borne radar technology used for this purpose.

Siral has an along-track (straight ahead) resolution of about 250 m, which will allow it to see the gaps of open water between the protruding sea-ice floes of the Arctic.

With centimetre-scale accuracy, the altimeter will measure the difference in height between the two surfaces so scientists can work out the overall volume of the marine cover.


Infographic (BBC)
Cryosat's radar has the resolution to see the Arctic's floes and leads
Some 7/8ths of the ice tends to sit below the waterline -- the draft
The aim is to measure the freeboard -- the ice part above the waterline
Knowing this 1/8th figure allows Cryosat to work out sea ice thickness
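The 7/8-below, 1/8-above split follows from Archimedes' principle and the density contrast between sea ice and seawater, which is what turns a freeboard measurement into a thickness estimate. A rough sketch, with assumed typical densities rather than mission-specified values:

```python
# Floating ice obeys hydrostatic equilibrium:
#   rho_ice * thickness = rho_water * draft,  draft = thickness - freeboard
# so:
#   thickness = freeboard * rho_water / (rho_water - rho_ice)

RHO_WATER = 1025.0  # kg/m^3, seawater (assumed typical value)
RHO_ICE = 900.0     # kg/m^3, first-year sea ice (assumed typical value)

def thickness_from_freeboard(freeboard_m):
    return freeboard_m * RHO_WATER / (RHO_WATER - RHO_ICE)

print(thickness_from_freeboard(0.25))  # ~2.05 m of ice for 25 cm of freeboard
```

With these densities the freeboard is about 1/8 of the total thickness, matching the fact box above; snow loading and ice-density variations shift the ratio, which is one reason centimetre-scale altimetry accuracy matters.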

A second antenna on Siral, offset from the first by about a metre, will enable the instrument to sense the shape of the ice below, returning more reliable information on slopes and ridges.

This interferometric observing mode will be used to assess the edges of Greenland and Antarctica where some rapid thinning has been detected in recent years.

At Esa's Living Planet Symposium here in Bergen, Professor Wingham released radar data taken from a Cryosat pass over the Ross Ice Shelf in Antarctica.

It records the edge of the 400-metre-thick mass of ice, and the sudden drop to the seawater surface which is probably covered with a thin veneer of ice.

"It shows us coming off the shelf; it shows the scoop [or slumping] you often get at the edge as a result of melting underneath, and then our pass over the sea -- although there must be a lot of ice in the water. It's very still; there are no waves on it," explained Professor Wingham.

"There's a feature in there that's so sharp, it's probably a fracture."

Another radar echo track, acquired this time in the Arctic, illustrates Cryosat's ability to see the gaps, or leads, in the ice -- something it has to do to make an assessment of ice thickness. This only became possible last week after several weeks of calibration work on Siral.

Infographic (BBC) 
Cryosat has to be able to distinguish the floes from the leads
"It's all starting to come into focus," said Professor Wingham.

Esa's mission operations team has had to work the spacecraft to get it into the correct orbit to do its science.
The Dnepr rocket put the satellite initially into an elliptical orbit that took the platform to too high an altitude to make optimal ice measurements.
This meant Cryosat had to fire its thrusters to tighten the ellipse, bringing the highest altitude down from 770 km to under 760 km; and the instrument was then re-tuned for the changed circumstances.

"We budgeted 15 kg of fuel to acquire the initial orbit to allow for launch errors," said Dr Richard Francis, the Esa Cryosat project manager.

"What we actually used to achieve this [modified] orbit was 2.2 kg. So, it was a lot less than we budgeted; we've got a lot of fuel left. We're now using about 2 g a day in normal operations."
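From the quoted figures alone, the fuel saved during orbit acquisition goes a long way at the normal-operations burn rate. A back-of-envelope estimate (using only the numbers in the quote; the satellite's total fuel load is not stated):

```python
# Figures from Dr Francis's quote.
budgeted_kg = 15.0      # fuel budgeted for initial orbit acquisition
used_kg = 2.2           # fuel actually used
burn_g_per_day = 2.0    # normal-operations consumption

saved_g = (budgeted_kg - used_kg) * 1000
days = saved_g / burn_g_per_day
print(f"{days:.0f} days, or about {days / 365:.1f} years of normal operations")
```

The saved margin alone corresponds to well over a decade of station-keeping, consistent with Esa's expectation of at least five years of mission life.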

Esa expects to get at least five years of mission life out of the satellite. The spacecraft is mid-way through a six-month commissioning phase. Once this is complete, calibrated and validated data will be delivered to the scientific community.


BBC: GOCE satellite determines Earth's gravity geoid

GOCE satellite views Earth's gravity in high definition

The geoid defines the horizontal - put a ball on this "surface" and it would not roll anywhere even though it appears to have slopes
by Jonathan Amos, Science correspondent, BBC News, Bergen, June 28, 2010
It is one of the most exquisite views we have ever had of the Earth.

This colourful new map traces the subtle but all-pervasive influence the pull of gravity has across the globe.

Known as a geoid, it essentially defines where the level surface is on our planet; it tells us which way is "up" and which way is "down".

It is drawn from delicate measurements made by Europe's GOCE satellite, which flies so low it comes perilously close to falling out of the sky.

Scientists say the data gathered by the spacecraft will have numerous applications.

One key beneficiary will be climate studies because the geoid can help researchers understand better how the great mass of ocean water is moving heat around the world.

The new map was presented here in Norway's second city at a special Earth observation (EO) symposium dedicated to the data being acquired by GOCE and other European Space Agency (ESA) missions.

Europe is currently in the midst of a huge programme of EO development which will see it launch some 20 missions worth nearly eight billion euros before the decade's end.

The Gravity Field and Steady-State Ocean Circulation Explorer (GOCE) is at the front of this armada of scientific and environmental monitoring spacecraft.

Imaginary ball
Launched in 2009, the sleek satellite flies pole to pole at an altitude of just 254.9 km -- the lowest orbit of any research satellite in operation today.

The spacecraft carries three pairs of precision-built platinum blocks inside its gradiometer instrument that sense accelerations which are as small as 1 part in 10,000,000,000,000 of the gravity experienced on Earth.
The 'standard' acceleration due to gravity at the Earth's surface is 9.8 m per second squared; in reality the figure varies from 9.78 m/s² (minimum) at the equator to 9.83 m/s² (maximum) at the poles.
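The article quotes only the two endpoints. One standard way to see how gravity varies in between is the WGS84 (Somigliana) normal-gravity formula; the constants below are the published WGS84 ellipsoid values, not figures from this article, so treat this as an illustrative sketch:

```python
import math

def normal_gravity(lat_deg):
    """Theoretical gravity on the WGS84 ellipsoid, in m/s^2 (Somigliana formula)."""
    ge = 9.7803253359         # normal gravity at the equator (m/s^2)
    k = 0.00193185265241      # Somigliana's constant for WGS84
    e2 = 0.00669437999013     # first eccentricity squared of the ellipsoid
    s2 = math.sin(math.radians(lat_deg)) ** 2
    return ge * (1 + k * s2) / math.sqrt(1 - e2 * s2)

print(round(normal_gravity(0), 2))   # 9.78 at the equator
print(round(normal_gravity(90), 2))  # 9.83 at the poles
```

The flattened, rotating Earth is what produces the roughly 0.05 m/s² spread the caption describes; GOCE maps the much smaller residual departures from this smooth reference field.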

This has allowed it to map the almost imperceptible differences in the pull exerted by the mass of the planet from one place to the next -- from the great mountain ranges to the deepest ocean trenches.

Two months of observations have now been fashioned into what scientists call the geoid.

"I think everyone knows what a level is in relation to construction work, and a geoid is nothing but a level that extends over the entire Earth," explained Professor Reiner Rummel, the chairman of the GOCE scientific consortium.

"So with the geoid, I can take two arbitrary points on the globe and decide which one is 'up' and which one is 'down'," the Technische Universitaet Muenchen researcher told BBC News.

In other words, the map on this page defines the horizontal -- a surface on which, at any point, the pull of gravity is perpendicular to it.

Put a ball on this hypothetical surface and it will not roll -- even though it appears to have "slopes". These slopes can be seen in the colours which mark how the global level diverges from the generalised (an ellipsoid) shape of the Earth.

In the North Atlantic, around Iceland, the level sits about 80 m above the surface of the ellipsoid; in the Indian Ocean it sits about 100 m below.
1. Earth is a slightly flattened sphere - it is ellipsoidal in shape
2. GOCE senses tiny variations in the pull of gravity over Earth
3. The data is used to construct an idealised surface, or geoid
4. It traces gravity of equal 'potential'; balls won't roll on its 'slopes'
5. It is the shape the oceans would take without winds and currents
6. So, comparing sea level and geoid data reveals ocean behaviour
7. Gravity changes can betray magma movements under volcanoes
8. A precise geoid underpins a universal height system for the world
9. Gravity data can also reveal how much mass is lost by ice sheets

The geoid is of paramount interest to oceanographers because it is the shape the world's seas would adopt if there were no tides, no winds and no currents.

If researchers then subtract the geoid from the actual observed behaviour of the oceans, the scale of these other influences becomes apparent.

This is information critical to climate modellers who try to represent the way the oceans manage the transfer of energy around the planet.
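The subtraction described above (observed sea surface minus geoid) can be sketched with synthetic grids. The arrays below are random stand-ins, not real altimetry or GOCE data; the ~tens-of-metres geoid scale echoes the 80-100 m undulations quoted in the article, while the sub-metre dynamic signal is a typical oceanographic magnitude assumed for illustration:

```python
import numpy as np

# Hypothetical 1-degree global grids, in metres relative to the reference
# ellipsoid. In practice the sea surface comes from satellite altimetry and
# the geoid from a GOCE-derived gravity model.
rng = np.random.default_rng(0)
geoid = rng.normal(0.0, 30.0, size=(180, 360))               # geoid undulation
sea_surface = geoid + rng.normal(0.0, 0.5, size=(180, 360))  # geoid + ocean dynamics

# Dynamic topography: the small signal of currents and winds that remains
# once the much larger geoid signal is removed.
dynamic_topography = sea_surface - geoid

print(dynamic_topography.std() < geoid.std())  # True: residual is far smaller
```

The point of the sketch is the scale separation: the geoid dominates the raw sea-surface height, so an accurate geoid is what lets modellers isolate the metre-scale circulation signal underneath it.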

But a geoid has many other uses, too. Having a global level underpins a universal system to compare heights anywhere on Earth.
In construction, for example, it tells engineers which way a fluid would naturally want to flow through a pipeline.

Animated map of gravity around the world. Courtesy of Esa.

Geophysicists will also want to use the GOCE data to try to probe what's happening deep within the Earth, especially in those places that are prone to quakes and volcanic eruptions.

"The GOCE data is showing up new information in the Himalayas, central Africa, and the Andes, and in Antarctica," explained Dr Rune Floberghagen, ESA's GOCE mission manager.

"This is, in one sense, not so surprising. These are places that are fairly inaccessible. It is not easy to measure high frequency variations in the gravity field in Antarctica with an aeroplane because there are so few airfields from which to operate."
GOCE's extremely low operating altitude was expected to limit its mission to a couple of years at most. But ESA now thinks it may be able to continue flying the satellite until perhaps 2014.

Unusually quiet solar activity has produced very calm atmospheric conditions, meaning GOCE has used far less xenon "fuel" in its ion engine to maintain its orbit.

Ultimately, though, that fuel will run out and the residual air molecules at 255 km will slow the satellite, forcing it from the sky.
The 1,100 kg GOCE is built from rigid materials and carries fixed solar wings, so the gravity data stay clear of spacecraft 'noise'. The 5 x 1 m frame incorporates fins to stabilise the spacecraft as it flies through the residual air in the thermosphere. GOCE's accelerometers measure accelerations that are as small as 1 part in 10,000,000,000,000 of the gravity experienced on Earth, and the UK-built engine ejects xenon ions at velocities exceeding 40,000 m/s, throttling up and down to keep GOCE at a steady altitude.


PNAS, Economic aspects of global warming in a post-Copenhagen environment, W. D. Nordhaus

Proceedings of the National Academy of Sciences, Vol. 107, No. 26, pp. 11721-11726 (June 29, 2010); doi: 10.1073/pnas.1005985107

Economic aspects of global warming in a post-Copenhagen environment

William D. Nordhaus*


The science of global warming has reached a consensus on the high likelihood of substantial warming over the coming century. Nations have taken only limited steps to reduce greenhouse gas emissions since the first agreement in Kyoto in 1997, and little progress was made at the Copenhagen meeting in December 2009. The present study examines alternative outcomes for emissions, climate change, and damages under different policy scenarios. It uses an updated version of the regional integrated model of climate and the economy (RICE model). Recent projections suggest that substantial future warming will occur if no abatement policies are implemented. The model also calculates the path of carbon prices necessary to keep the increase in global mean temperature to 2 °C or less in an efficient manner. The carbon price for 2010 associated with that goal is estimated to be $59 per ton (at 2005 prices), compared with an effective global average price today of around $5 per ton. However, it is unlikely that the Copenhagen temperature goal will be attained even if countries meet their ambitious stated objectives under the Copenhagen Accord.



PNAS, Assessing the climatic benefits of black carbon mitigation, R. E. Kopp & D. L. Mauzerall

Proceedings of the National Academy of Sciences, Vol. 107, No. 26, pp. 11703-11708 (June 29, 2010); doi: 10.1073/pnas.0909605107

Assessing the climatic benefits of black carbon mitigation

Robert E. Kopp and Denise L. Mauzerall


To limit mean global warming to 2 °C, a goal supported by more than 100 countries, it will likely be necessary to reduce emissions not only of greenhouse gases but also of air pollutants with high radiative forcing (RF), particularly black carbon (BC). Although several recent research papers have attempted to quantify the effects of BC on climate, not all these analyses have incorporated all the mechanisms that contribute to its RF (including the effects of BC on cloud albedo, cloud coverage, and snow and ice albedo, and the optical consequences of aerosol mixing) and have reported their results in different units and with different ranges of uncertainty. Here we attempt to reconcile their results and present them in uniform units that include the same forcing factors. We use the best estimate of effective RF obtained from these results to analyze the benefits of mitigating BC emissions for achieving a specific equilibrium temperature target. For a 500 ppm CO2e (3.1 W m-2) effective RF target in 2100, which would offer about a 50% chance of limiting equilibrium warming to 2.5 °C above preindustrial temperatures, we estimate that failing to reduce carbonaceous aerosol emissions from contained combustion would require CO2 emission cuts about 8 years (range of 1–15 years) earlier than would be necessary with full mitigation of these emissions. 



Best site (and I have seen many) for daily satellite photo updates of the GrIS and the Arctic:


PNAS, Expert judgments about transient climate response to alternative future trajectories of radiative forcing, K. Zickfeld, M. G. Morgan, D. J. Frame & D. W. Keith

Proceedings of the National Academy of Sciences,

Expert judgments about transient climate response to alternative future trajectories of radiative forcing

Kirsten Zickfeld*, M. Granger Morgan, David J. Frame and David W. Keith


There is uncertainty about the response of the climate system to future trajectories of radiative forcing. To quantify this uncertainty we conducted face-to-face interviews with 14 leading climate scientists, using formal methods of expert elicitation. We structured the interviews around three scenarios of radiative forcing stabilizing at different levels. All experts ranked “cloud radiative feedbacks” as contributing most to their uncertainty about future global mean temperature change, irrespective of the specified level of radiative forcing. The experts disagreed about the relative contribution of other physical processes to their uncertainty about future temperature change. For a forcing trajectory that stabilized at 7 Wm-2 in 2200, 13 of the 14 experts judged the probability that the climate system would undergo, or be irrevocably committed to, a “basic state change” as ≥ 0.5. The width and median values of the probability distributions elicited from the different experts for future global mean temperature change under the specified forcing trajectories vary considerably. Even for a moderate increase in forcing by the year 2050, the medians of the elicited distributions of temperature change relative to 2000 range from 0.8–1.8 °C, and some of the interquartile ranges do not overlap. Ten of the 14 experts estimated that the probability that equilibrium climate sensitivity exceeds 4.5 °C is > 0.17, our interpretation of the upper limit of the “likely” range given by the Intergovernmental Panel on Climate Change. Finally, most experts anticipated that over the next 20 years research will be able to achieve only modest reductions in their degree of uncertainty.


Extreme negative Arctic Oscillation winds drove Gulf Stream off course to west Greenland current

Freak current takes Gulf Stream to Greenland

by FishOutofWater, DailyKos, January 6, 2010

An unprecedented extreme in the northern hemisphere atmospheric circulation has driven a strong direct connection between the Gulf Stream and the West Greenland current. The record-negative "Arctic Oscillation" and the resulting link between the two currents are exceptional events. More exceptional weather events are predicted under anthropogenic climate change, but this one could also be a natural variation of weather and currents.

The West Greenland current transports water originating from the Nordic seas that wraps around the southern tip of Greenland.
The West Greenland Current (WGC) flows north along the shelf and shelf break of the west coast of Greenland. It transports about 3 Sv of fresh (salinity < 34.5), cold (-1.8 °C) water from the Nordic seas (Clarke 1984; Cuny et al. 2002).
The West Greenland current normally does not connect directly with the Gulf Stream, but the most negative "Arctic Oscillation" in 50 years of measurement caused a strong wind field that temporarily drove the Gulf Stream toward the west Greenland current.
Negative phase of the Arctic Oscillation
These regional contrasts in temperature anomalies resulted from a strongly negative phase of the Arctic Oscillation (AO). The AO is a natural pattern of climate variability. It consists of opposing patterns of atmospheric pressure between the polar regions and middle latitudes. The positive phase of the AO exists when pressures are lower than normal over the Arctic, and higher than normal in the middle latitudes. In the negative phase, the opposite is true: pressures are higher than normal over the Arctic and lower than normal in the middle latitudes. The negative and positive phases of the AO set up opposing temperature patterns. With the AO in its negative phase this season, the Arctic is warmer than average, while parts of the middle latitudes are colder than normal. The phase of the AO also affects patterns of precipitation, especially over Europe.
The phase of the AO is described in terms of an index value. In December 2009 the AO index value was -3.41, the most negative value since at least 1950, according to data from the NOAA Climate Prediction Center.

The unprecedented atmospheric circulation pattern brought exceptionally warm air to Greenland and the Arctic ocean while dumping cold Arctic air into the United States, Europe and central Siberia.
Warm air keeps ice extent low
December air temperatures over the Arctic Ocean region, eastern Siberia, and northwestern North America were warmer than normal. In contrast, temperatures in Eurasia, the United States, and southwestern Canada were below average. The strongest anomalies (more than 7 degrees Celsius/13 degrees Fahrenheit) were over the Atlantic side of the Arctic, including Baffin Bay and Davis Strait, where ice extent was below average.
Anomaly in degrees Celsius.

The warm air over the Arctic ocean and Greenland slowed the formation of sea ice. Sea ice in the Labrador Sea decreased in late December with the strong current that brought warm water up the west coast of Greenland.

The past gives clues to what might be happening in the North Atlantic.
Recent research on paleoclimates is showing that very warm Gulf Stream water penetrated much further north in the seas between Greenland and Norway in the Pliocene, before the Pleistocene glaciations began.
The USGS Pliocene Research, Interpretation and Synoptic Mapping (PRISM) Project is charged with reconstructing global conditions during the ~3.3 to 3.0 Ma time interval (hereafter "the mid-Piacenzian") in an effort to better understand past and possible future climate dynamics. PRISM reconstructions of sea-surface temperature (SST), based largely on planktic foraminifer assemblage data, indicate that temperature differences between the mid-Piacenzian and modern increase with latitude in the North Atlantic (Dowsett et al. 1992). That is, mid-Piacenzian temperatures near the equator were similar to modern temperatures, but temperatures in the higher latitudes were several degrees warmer than at present. This reconstructed equator-to-pole gradient has been indeterminate at and above ~66° N, however, because temperature estimates from polar regions such as the Nordic Seas and Arctic Ocean have remained elusive due to the lack of geologic proxies yielding quantitative results as well as weak age control.

Water off of Spitsbergen was 65 °F in August. Northern Europe had a much warmer climate.
Summer temperatures between 10 °C and 18 °C and very low BFARs imply at least seasonally ice-free conditions in the subpolar North Atlantic and nearby Arctic Ocean during the mid-Piacenzian. Incorporation of these data into the PRISM SST reconstruction provides a possible analog to future climate. (The benthic foraminifer accumulation rate (BFAR), is calculated as the number of dead benthic foraminifer specimens in 10 cc of sediment. In the modern Arctic, the BFAR is lowest under seasonally ice-free areas and highest under permanent ice cover at any given water depth (Wollenberg and Kuhnt 2000).)
Paleoclimates were more sensitive to CO2 than previously recognized, according to new research on CO2 recorded in soils. The warmest paleoclimates happened at CO2 levels of 1000 ppm -- where we are headed if there are no limits on fossil fuel emissions. The soil research showed that paleo-CO2 levels had been calculated assuming average conditions, when soil carbonate actually forms under the hottest and driest conditions. This miscalculation caused ancient CO2 levels to be overestimated for warm periods like the Mesozoic and the Paleocene.
The carbon isotope compositions of modern soil and speleothem carbonate were studied in the southwestern USA in order to investigate their relationship to local plant cover. Our results suggest that δ13C values of both soil and speleothem carbonate tend to be biased toward seasonal extremes rather than "average" plant distributions. Evidence from modern soils in central New Mexico indicates that soil carbonate forms during the warmest and driest times of year. Reduced respiration rates under these conditions lower soil CO2 concentrations and help drive the precipitation of calcite. Soil carbonates are biased toward C4 plants because they record the annual maximum δ13C values of soil CO2 that occur during dry episodes. In addition, if pedogenic carbonate records minimum rather than mean growing season soil pCO2 values, then paleoatmospheric CO2 concentrations calculated using the paleosol barometer may be significantly overestimated.
The need for urgent action to rapidly cut emissions of CO2, other greenhouse gases, and black carbon soot is becoming more evident by the day.


Monday, June 28, 2010

EPA 10 years behind on setting guidelines for toxic air pollutants, budget gutted by Bush

E.P.A. Lags on Setting Some Air Standards, Report Finds

by Yeganeh June Torbati, The New York Times, June 26, 2010

WASHINGTON — The Environmental Protection Agency is 10 years behind schedule in setting guidelines for a host of toxic air pollutants, according to a report from the agency’s inspector general.

The report, which was released last week, found that the agency had failed to develop emissions standards, due in 2000, for some sources of hazardous air pollutants. These included smaller sites often located in urban areas, like dry cleaners and gas stations, but also some chemical manufacturers.

The inspector general also found that the agency had not met targets outlined in a 1999 planning document, the Integrated Urban Air Toxics Strategy, including tracking urban dwellers’ risk of developing health problems from exposure to pollutants.

Some experts said the failures were persisting largely because the E.P.A.’s Office of Air and Radiation, which is responsible for regulating air pollutants, lacked the money needed to meet its deadlines.

In a written response to the report, E.P.A. officials also said budget cuts had made it difficult to meet their deadlines, noting that “air toxics support has been cut over 70 percent” since 2001.

In the past, the Government Accountability Office has found that the low priority for the air toxics program and limited financing were in part to blame for the agency’s failure to stay on schedule.

Frank O’Donnell, the president of Clean Air Watch, an environmental watchdog group based in Washington, said the inspector general’s report made clear that “the issue of breathing cancer-causing chemicals in city air is something of an orphan issue.”

For example, the agency’s last assessment of the risk of toxic air pollutants is based on emissions data from 2002. That analysis found that 1 in 28,000 people, or 36 in 1 million, could develop cancer from lifetime exposure to air toxics from outdoor sources. That number is an average, however, and people living in densely populated cities may face a higher risk.

The people most exposed, Mr. O’Donnell said, “are probably not out in the wheat farms — they’re going to be people living near where the bus depots are.”

Jeffrey Holmstead, who was assistant administrator for air and radiation at the E.P.A. from 2001 to 2005, said that even though Congress increased the agency’s budget when it passed significant amendments to the Clean Air Act in 1990, the E.P.A. still did not have enough money to fulfill all its requirements.

“It’s fair to point out that the E.P.A. has not met its statutory deadlines,” Mr. Holmstead said. “But there are hundreds and hundreds of statutory deadlines that the E.P.A. hasn’t met. Even though E.P.A. has a fairly large budget, it’s not big enough to do everything the E.P.A. folks are supposed to do.”

In the past, Mr. Holmstead has represented semiconductor, aerospace and chemical companies as an environmental lawyer. He is now a partner at the law firm Bracewell & Giuliani, where his clients include oil companies and others in the energy sector.

James S. Pew, a lawyer with the environmental law group Earthjustice, said that the E.P.A. had the financing it needed, and that it undercut itself by moving money away from the division that specifically deals with air toxics. “This is a situation where the lack of resources is just not a valid excuse,” Mr. Pew said.

Some evidence suggests that there is now more attention being paid to this category of air pollutants within the E.P.A. The agency noted in its response to the report that for the first time in a decade, funds are shifting to the air toxics program this year to meet regulatory deadlines.


Arctic Sea ice melt 2010 thru day 174: satellite photo animation


Get current image at:


Past rates of climate change in the Arctic by J.W.C. White et al., Quart. Sci. Rev., 29 (2010)

Quaternary Science Reviews, 29(15-16) (July 2010) 1716-1727; doi: 10.1016/j.quascirev.2010.04.025

Past rates of climate change in the Arctic

James W.C. White (corresponding author), Richard B. Alley, Julie Brigham-Grette, Joan J. Fitzpatrick, Anne E. Jennings, Sigfus J. Johnsen, Gifford H. Miller, R. Steven Nerem and Leonid Polyak

Climate is continually changing on numerous time scales, driven by a range of factors. In general, longer-lived changes are somewhat larger, but much slower to occur, than shorter-lived changes. Processes linked with continental drift have affected atmospheric circulation, oceanic currents, and the composition of the atmosphere over tens of millions of years. A global cooling trend over the last 60 million years has altered conditions near sea level in the Arctic from ice-free year-round to completely ice covered. Variations in arctic insolation over tens of thousands of years in response to orbital forcing have caused regular cycles of warming and cooling that were roughly half the size of the continental-drift-linked changes. This “glacial-interglacial” cycling was amplified by the reduced greenhouse gases in colder times and by greater surface albedo from more-extensive ice cover. Glacial-interglacial cycling was punctuated by abrupt millennial oscillations, which near the North Atlantic were roughly half as large as the glacial-interglacial cycles, but which were much smaller Arctic-wide and beyond. The current interglaciation, the Holocene, has been influenced by brief cooling events from single volcanic eruptions, slower but longer lasting changes from random fluctuations in the frequency of volcanic eruptions, from weak solar variability, and perhaps by other classes of events. Human-forced climate changes appear similar in size and duration to the fastest natural changes of the past, but future changes may have no natural analog.


History of sea ice in the Arctic by L. Polyak et al., Quart. Sci. Rev., 29 (2010)

Quaternary Science Reviews, 29(15-16) (July 2010) 1757-1778; doi: 10.1016/j.quascirev.2010.02.010

History of sea ice in the Arctic

Leonid Polyak (corresponding author), Richard B. Alley, John T. Andrews, Julie Brigham-Grette, Thomas M. Cronin, Dennis A. Darby, Arthur S. Dyke, Joan J. Fitzpatrick, Svend Funder, Marika Holland, Anne E. Jennings, Gifford H. Miller, Matt O'Regan, James Savelle, Mark Serreze, Kristen St. John, James W.C. White and Eric Wolff


Arctic sea-ice extent and volume are declining rapidly. Several studies project that the Arctic Ocean may become seasonally ice-free by the year 2040 or even earlier. Putting this into perspective requires information on the history of Arctic sea-ice conditions through the geologic past. This information can be provided by proxy records from the Arctic Ocean floor and from the surrounding coasts. Although existing records are far from complete, they indicate that sea ice became a feature of the Arctic by 47 Ma, following a pronounced decline in atmospheric pCO2 after the Paleocene–Eocene Thermal Maximum, and consistently covered at least part of the Arctic Ocean for no less than the last 13–14 million years. Ice was apparently most widespread during the last 2–3 million years, in accordance with Earth's overall cooler climate. Nevertheless, episodes of considerably reduced sea ice or even seasonally ice-free conditions occurred during warmer periods linked to orbital variations. The last low-ice event related to orbital forcing (high insolation) was in the early Holocene, after which the northern high latitudes cooled overall, with some superimposed shorter-term (multidecadal to millennial-scale) and lower-magnitude variability. The current reduction in Arctic ice cover started in the late 19th century, consistent with the rapidly warming climate, and became very pronounced over the last three decades. This ice loss appears to be unmatched over at least the last few thousand years and unexplainable by any of the known natural variabilities.