European Efforts to Verify GHG Emissions Reporting
Global greenhouse gas emissions surged 5.8 percent in 2010 over the year before, according to the Washington, D.C.-based Worldwatch Institute. It was a remarkable coda to the failure of the Copenhagen climate negotiations at the close of 2009, which left the international community without mandatory emissions limits to follow those of the Kyoto Protocol. Yet this dangerous trend was invisible at the time. In fact, Worldwatch did not issue its 2010 global emissions figure until April 26, 2012, immediately following a similar report on domestic emissions from the U.S. Environmental Protection Agency.
Delayed reporting is the rule for greenhouse gases (GHGs), because they are nearly always inferred rather than measured. Methane leakage from a natural gas pipeline, for example, is estimated from the condition of the pipeline and the amount of methane flowing through it. Carbon dioxide (CO2) spewing from vehicle tailpipes is calculated based on total kilometers traveled and fuel consumed, rather than metered at the source. GHG reporting is indirect because the gas flows that are changing Earth’s climate are nearly as invisible to science as they are to the naked eye.
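The arithmetic behind such bottom-up estimates is simple: multiply an activity measure by an emission factor. A minimal Python sketch of the pattern follows; the 2.3 kg-CO2-per-litre gasoline factor is an approximate illustrative value, not an official inventory figure.

```python
# Bottom-up GHG accounting: emissions are inferred from activity data
# multiplied by an emission factor, never metered directly.
# The emission factor below is an illustrative approximation.

def bottom_up_co2(fuel_burned_litres: float, kg_co2_per_litre: float) -> float:
    """Estimate tailpipe CO2 from fuel consumed (activity) times an emission factor."""
    return fuel_burned_litres * kg_co2_per_litre

# Example: a fleet that burned 1 million litres of gasoline,
# using a factor of roughly 2.3 kg CO2 per litre.
fleet_emissions_kg = bottom_up_co2(1_000_000, 2.3)
print(fleet_emissions_kg)  # 2300000.0 kg CO2
```

The same multiply-activity-by-factor structure underlies pipeline-leak models and national inventories alike, which is why any error in the factor propagates directly into the reported total.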
The resulting ‘bottom-up’ GHG accounting employed today comes with several drawbacks, according to experts in emissions reporting. One is the reporting time-lag, which obscures the link between emissions and their sources. Systemic error is the other: In case studies, researchers have identified dramatic discrepancies between official GHG inventories and measured emissions. Engineering models such as those for pipeline leaks are not perfect, and such models do not account for every possible GHG source (including those as yet unknown to science). They are also susceptible to deliberate under-estimation.
Growing awareness of the deficiencies of today’s GHG inventories is inspiring innovation by scientists, policymakers and activists. Some are trying to render the bottom-up method more accurate and relevant. But there is also growing hope for “top-down” schemes based on rapid or even real-time measurement of atmospheric GHGs. Models are already available to convert GHG concentration data into predictions of flow. Now the push is on to install the instruments — both ground- and satellite-based — to feed them with data.
“In principle, the modeling methods are well-known and mature enough,” says Laurence Rouïl, Director for Environmental Modeling and Decision-making at France’s Institut National de l’Environnement Industriel et des Risques (INERIS) and the author of a December 2011 report on GHG “nowcasting” prepared for the European Commission. What is missing, says Rouïl, are “observations with an appropriate spatio-temporal resolution. This is … why there is so much interest in the development of GHG observation networks and satellites.”
Quicker books on greenhouse gases
One of the best places to look for innovative bottom-up reporting is Finland, where energy and environmental consulting firm Benviroc Oy has put its country’s GHG accounting scheme on steroids. Benviroc Oy uses a mix of accounting, measurement and modeling to generate weekly GHG stats by province and, increasingly, by city. The results appear on Benviroc’s CO2-raportti news portal.
Benviroc’s weekly reports cover those emissions most associated with individual citizens’ everyday behavior and lifestyle choices. Its system uses data from utilities and district heating suppliers and weather reports to estimate emissions from power consumption and heating. Emissions from driving are derived from actual traffic data.
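Benviroc’s exact formulas are not described in this article, but the general approach of turning weather data into weekly heating-emission estimates can be sketched with heating degree days. Every constant below — the 17 °C base temperature and the emission factor per degree day — is a hypothetical placeholder, not Benviroc’s methodology.

```python
# Sketch: weekly heating emissions estimated from weather data via
# heating degree days (HDD). All constants are hypothetical placeholders.

BASE_TEMP_C = 17.0  # indoor reference temperature (assumption)

def heating_degree_days(daily_mean_temps_c):
    """Sum of degrees below the base temperature over the period."""
    return sum(max(0.0, BASE_TEMP_C - t) for t in daily_mean_temps_c)

def weekly_heating_emissions(daily_mean_temps_c, kg_co2_per_hdd):
    """Scale HDD by a city-specific emission factor (placeholder value)."""
    return heating_degree_days(daily_mean_temps_c) * kg_co2_per_hdd

week = [-5.0, -3.0, 0.0, 2.0, 4.0, 1.0, -2.0]  # daily mean temps, deg C
print(weekly_heating_emissions(week, 1200.0))  # 146400.0
```

A colder week produces more degree days and thus a visibly higher weekly figure — the same mechanism that makes cold-snap spikes show up in Benviroc’s reports.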
Benviroc senior scientist Suvi Monni, a contributor to the Intergovernmental Panel on Climate Change (IPCC) emissions methodologies and scenarios, says the goal of the weekly figures is to motivate change. She notes, for example, that emissions figures that spike during winter cold snaps make consumers more aware of the impact of surging coal consumption.
Reports that are closer to real-time also motivate local officials and politicians, she says, informing their investments in public transport, issuance of building permits, and other actions that impact municipal emissions. The more real-time the reporting, the faster the payback in terms of public recognition for actions that cut emissions. “They get a lot of positive publicity,” says Monni.
Accelerated reports such as Benviroc’s can close the psychological gap associated with bottom-up accounting of GHG emissions, but cannot address sources of error. Concern over such errors is growing along with the degree of importance associated with GHG emissions, both within and between countries. There are, for one, growing incentives for polluters and countries to downplay their emissions, as Rouïl notes in his 2011 scoping paper: “The negative impact of GHG emissions on climate, and the financial value of emissions reductions in carbon-equivalent trading markets, both create incentives to under-report actual emissions, whether consciously or subconsciously.”
Last year, researchers at the Swiss Federal Laboratories for Materials Science and Technology in Duebendorf showed just how far under-reporting may go. The team measured levels of trifluoromethane (HFC-23), a gas whose 100-year warming impact is 15,000 times greater than that of CO2, from ground stations in Switzerland and Ireland. Models of the atmospheric circulation over Europe then enabled them to trace the chemical industry byproduct back to its source, generating estimates of the emissions rate for each country in Western Europe.
This process for tracking emissions sources from atmospheric concentration data is known as ‘inverse modeling’, and it is central to most top-down GHG reporting efforts. “Inverse modeling of GHGs is the best approach to develop an independent verification process of emission inventories,” says Rouïl.
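A toy version of inverse modeling makes the idea concrete: if a transport model gives the sensitivity of each monitoring station to each candidate source, observed concentrations become a linear system that can be solved for the unknown fluxes. The matrix values below are invented for illustration; real inversions must contend with measurement noise, transport-model error, and far more unknowns than observations.

```python
# Toy inverse model: observed concentrations y at monitoring stations are
# linear in the unknown source fluxes x through a transport matrix H
# (each entry: a station's sensitivity to a source, from an atmospheric
# circulation model). Recover x by least squares. Numbers are invented.
import numpy as np

# 3 stations x 2 sources: sensitivities (ppb per unit flux) -- hypothetical
H = np.array([[0.8, 0.1],
              [0.3, 0.6],
              [0.1, 0.9]])

true_flux = np.array([10.0, 5.0])  # unknown in reality
y = H @ true_flux                  # noise-free synthetic observations

est_flux, *_ = np.linalg.lstsq(H, y, rcond=None)
print(est_flux)  # recovers [10., 5.] in this noise-free toy case
```

In practice the system is noisy and underdetermined, so real inversions add prior estimates and uncertainty weighting — which is where the 30–50 percent country-level error bars discussed below come from.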
In the case of HFC-23, the Swiss scientists’ independent assessment raised serious doubts about reported emissions. Overall, the team estimated that emissions for Western Europe in 2009 were 60–140 percent higher than reported. They raised particular concern over Italy’s emissions, which appear to be at least 10 times more than was reported in Italy’s national GHG registry. The researchers identified a refrigerants factory near Milan as the most likely source.
Cases such as this leave no doubt that top-down verification will be critical to the “reliable realization of any GHG treaty,” according to Christoph Keller, who led the study and is now a postdoctoral fellow in atmospheric chemistry modeling at Harvard University. The challenge, he says, is to add measurement capabilities and thus reduce the uncertainty of top-down estimates which, in his study, were 30–50 percent for country-by-country predictions. Even such crude estimates, Keller says, are currently limited to Western Europe and to some parts of North America and Asia. “The observation network needs to be extended,” says Keller.
Expanding the CO2 network
Demand for a more extensive measurement network is greater still for those greenhouse gases that occur naturally, such as CO2 and methane. These compounds are exchanged between Earth’s atmosphere, biosphere, soils and oceans, and generated anew by geological processes such as volcanoes. The anthropogenic contribution is, for such gases, hard to spot against this “huge” background noise, explains Michael Gunson, program manager for Global Change and Energy at NASA’s Jet Propulsion Laboratory in Pasadena, California. “Human activity is a small number relatively speaking, but it’s important because it’s a steady increase against the background,” says Gunson.
Experts such as Gunson say that inverse modeling of CO2 emissions is possible, but will require both better models and more and better observations. One major effort to boost CO2 observations is Europe’s ground-based Integrated Carbon Observatory System, or ICOS, which began operating its first eight stations in February. ICOS’s goal is to increase the network’s measurement stations ten-fold within just a few years.
Gunson is one of those who believes that ground stations will not be enough. “If there’s to be a meaningful application of inverse modeling to confirm reported emissions, we will need satellite measurements,” says Gunson.
The proof-of-principle platforms for space-based CO2 measurement are Japan’s Greenhouse Gases Observing Satellite (GOSAT), launched in 2009, and NASA’s Orbiting Carbon Observatory-2 (OCO-2) satellite, which could launch as early as 2014. Gunson is project scientist for the NASA mission, which is a follow-up attempt to the original OCO launch that failed in 2009. Both GOSAT and OCO-2 are designed to measure atmospheric CO2 from space by analyzing sunlight reflected from Earth’s surface for the wavelengths absorbed by CO2 molecules in the atmosphere.
Gunson says GOSAT is delivering CO2 data of sufficient accuracy for inverse modeling thanks to a concerted three-year calibration effort, and he predicts that OCO-2 will be even more accurate. However, each satellite measures CO2 over a narrow track of Earth below. It is the missions now in planning, such as France’s MICROCARB, the European Space Agency’s CARBONSAT, and China’s TANSAT, that will begin to create the critical mass of measurements required for global estimation of CO2 emissions. “The European projects begin to fill out the scope to where there is enough spread of data to measure emissions country-by-country on a global basis,” says Gunson.
Could good enough be too much?
There is diminishing doubt regarding the technical feasibility of top-down GHG verification. That was the conclusion of a 2011 report from JASON, an independent group of scientists that advises the U.S. federal government on issues of science and technology (the research panel was organized by Mitre Corp., a McLean, Virginia-based nonprofit think tank).
The key to reducing uncertainty to a workable level of 20 percent, in the JASON panel’s view, is to combine satellite and ground-based data: “Coupled with sensor networks optimized to sample downwind of specific countries, the satellite data could provide a capability within 5 years for estimation of annually averaged net fluxes.”
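The statistical logic of merging satellite and ground-based estimates can be illustrated with inverse-variance weighting, a standard way to combine two independent measurements so that the combined uncertainty is smaller than either alone. The flux values and uncertainties below are made up for the example and do not come from the JASON report.

```python
# Sketch: combining two independent flux estimates (e.g. satellite and
# ground network) by inverse-variance weighting -- the combined
# uncertainty is always smaller than either input's. Illustrative only.

def combine(est_a: float, sigma_a: float, est_b: float, sigma_b: float):
    """Inverse-variance weighted mean and its standard deviation."""
    wa, wb = 1.0 / sigma_a**2, 1.0 / sigma_b**2
    est = (wa * est_a + wb * est_b) / (wa + wb)
    sigma = (wa + wb) ** -0.5
    return est, sigma

# e.g. satellite says 100 +/- 40 Mt; ground network says 120 +/- 30 Mt
est, sigma = combine(100.0, 40.0, 120.0, 30.0)
print(round(est, 1), round(sigma, 1))  # 112.8 24.0
```

Note how the combined uncertainty (24) beats both inputs (40 and 30) — the quantitative reason the JASON panel argues for coupling satellite data with downwind sensor networks rather than relying on either alone.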
JASON’s timeframe looks optimistic, however, because there is no five-year plan in place to make it happen. “That investment hasn’t been made,” says Gunson.
That lack of funding for top-down verification may represent an appropriate balancing of limited research dollars. The top-down system does not pass a cost-benefit test according to Kevin Gurney, an atmospheric scientist at Arizona State University in Tempe. As Gurney writes in a policy update published last year in the scholarly journal Carbon Management: “Conceptualizing a carbon monitoring system centered primarily on atmospheric concentration measurements, could lead to tremendous inefficiencies in the allocation of limited scientific resources.”
In addition, JASON’s report was written for a U.S. government anticipating a possible international agreement to follow up on the Kyoto Protocol. Gunson concedes that the failure of Copenhagen has diminished optimism for such a treaty, but he says that, ironically, this has only heightened the importance of top-down GHG verification.
Gunson believes that one of the most important contributions of a top-down observation capacity will be its ability to paint a more accurate picture of the carbon cycle, including the human contribution. That global picture, he asserts, could actually spur the international community to act, bringing them back to the treaty-bargaining tables.
“Reliable data and information like that is a good spur for everyone to come to grips with what’s really happening,” says Gunson. “Having data in real time gives you a stronger motivation to take action. There really is nothing like a picture.”
Peter Fairley is an independent journalist who writes about energy and the environment and has contributed to Earthzine since 2007. Fairley is a world traveler who divides his time between Vancouver Island and Paris.