
The Environment’s Best Friend: GM or Organic?

The debate over the benefits of genetically modified organisms (GMOs) versus organic methods rages within the realm of agriculture and nutrition. But what does the science say?

Published May 1, 2006

By Lee M. Silver

Image courtesy of gabort via stock.adobe.com.

Pigs raised on farms are dirty, smelly animals. Shunned by Jews and Muslims for millennia, they get no respect from most other people, either.

It’s not just our senses, though, that pig farms assault: it’s also the environment. Pig manure is loaded with phosphorus. A large sow can excrete 18 to 20 kg per year, which runs off with rainwater into streams and lakes, where it causes oxygen depletion, algal blooms, dead fish, and greenhouse gases. [11] Ecological degradation has reached crisis levels in heavily populated areas of northern Europe, China, Korea, and Japan.

The Cost of Dietary Protein

The problem is that—unlike cows, goats, and sheep—farmed pigs cannot extract sufficient phosphorus from the corn and other grains they are fed. Grains actually contain plenty of phosphorus, but it is mostly locked away in a large chemical called phytate, which is inaccessible to digestion by animal enzymes. Ruminants release the phosphorus during a lengthy digestive process in their four-chambered stomachs, with the help of symbiotic bacteria.

To survive, the pig’s wild ancestors depended on a varied diet, including worms, insects, lizards, roots, and eggs. But pig farming is most efficient with a simpler all-grain diet, supplemented with essential minerals. Although this feeding strategy works well for the animals, the inaccessible phosphorus in the grain passes entirely into the manure, which farmers use as crop fertilizer or otherwise distribute onto the land.

Today, in most rich and poor countries alike, pigs provide more dietary protein more cheaply and to more people than any other animal. Worldwide, pork accounts for 40% of total meat consumption. [11] While northern Europe still maintains the highest pig-to-human ratio in the world (2 to 1 in Denmark), the rapidly developing countries of east Asia are catching up. During the decade of the 1990s alone, pork production doubled in Vietnam and grew by 70% in China.

Along the densely populated coastlines of both countries, pig density exceeds 100 animals per square kilometer, and the resulting pollution is “threatening fragile coastal, marine habitats including mangroves, coral reefs, and sea grasses.” [7] As the spending power of people in developing Asian countries continues to rise, pig populations will almost certainly increase further.

Pig-caused Ecological Degradation

Pig-caused ecological degradation is a complex problem, and no single solution is in the offing. But any approach that allows even a partial reduction in pollution should be subject to serious consideration by policy makers and the public.

A prototypical example of what directed genetic modification (GM) technology can deliver is the transgenic Enviropig, developed by Canadian biologists Cecil Forsberg and John Phillips. Forsberg and Phillips used an understanding of mammalian gene regulation to construct a novel DNA molecule programmed for specific expression of the E. coli phosphorus-extraction gene (phytase) in pig saliva. They then inserted this DNA construct into the pig genome. [8]

The results obtained with the first generation of animals were dramatic: the newly christened Enviropigs no longer required any costly dietary supplements and the phosphorus content of their manure was reduced by up to 75%. Subtle genetic adjustments could yield even less-polluting pigs, and analogous genetic strategies can also be imagined for eliminating other animal-made pollutants, including the methane released in cow belches, which is responsible for 40% of total greenhouse gas emissions from New Zealand. [5]

Enzymes in Natural Bacteria

An added advantage with the Enviropig is that the single extra enzyme in its saliva is also present naturally in billions of bacteria inhabiting the digestive tract of every normal human being. As bacteria continuously die and break apart, the naked enzyme and its gene both float free inside us without any apparent consequences, suggesting that the Enviropig will be as safe for human consumption as non-GM pigs. If the enzyme happened to escape into meat from modified pigs, it would be totally inactivated by cooking.

Of course, empirical analysis is required to show that the modification does not make the meat any more harmful to human health than it would be otherwise. With modern technologies for analyzing genomes, transcriptomes, proteomes, and metabolomes, any newly constructed transgenic animal can be analyzed in great molecular detail. “Isn’t it ironic,” Phillips and Forsberg commented [13], “that new varieties of animals with extreme but natural mutations undergo no safety testing at all?”

Environmentally Friendly GM

Not all GM projects are aimed specifically at reducing the harmful effects of traditional agriculture on the environment. Other GM products approved to date, developed almost entirely in the private sphere, have aimed to reduce production costs on large-scale farms. But as molecular biology becomes ever more sophisticated, the universe of potential environmentally friendly GM applications will expand.

Scientists have begun research toward the goal of obtaining pigs modified to digest grasses and hay, much as cows and sheep do, reducing the land- and energy-intensive use of corn and soy as pig feed. Elsewhere, trees grown on plantations for paper production could be made amenable to much more efficient processing. This would reduce both energy use and the millions of tons of toxic bleaching chemicals in effluents from paper mills. [3] [14]

The most significant GM applications will be ones that address an undeniable fact: every plot of land dedicated to agriculture is denied to wild ecosystems and species habitats. And that already amounts to 38% of the world’s landmass. Genetic modifications that make crop production more efficient would give us opportunities to retire farmland and, in many cases, cede it back to forests and other forms of wilderness, provided that world population growth slows.

So why are many environmentally conscious people so opposed to all uses of GM technology? The answer comes from the philosophy of organic food promoters, whose fundamental principle is simply stated: natural is good; synthetic is bad. [16]

The Roots of Organic Farming

Before the 18th century, the material substance of living organisms was thought to be fundamentally different—in a vitalistic or spiritual sense—from that of non-living things. Organisms and their products were organic by definition, while non-living things were mineral or inorganic. But with the advent of modern chemistry, starting with Lavoisier’s work in the 1780s, it became clear that all material substances are constructed from the same set of chemical elements.

As all scientists know today, the special properties of living organic matter emerge from the interactions of a large variety of complex, carbon-based molecules. Chemists now use the word organic to describe all complex, carbon-based molecules—whether or not they are actually products of any organism.

Through the 19th and 20th centuries, increased scientific understanding, technological innovations, and social mobility changed the face of American agriculture. Large-scale farming became more industrialized and more efficient. In 1800, farmers made up 90% of the American labor force; by 1900, their proportion had decreased to 38%, and in 1990, it was only 2.6%.

A Return to Preindustrial Farming Methods

However, not everyone was happy with these societal changes, and there were calls in the United States and Europe for a return to the preindustrial farming methods of earlier times. This movement first acquired the moniker organic in 1942, when J. I. Rodale began publication in America of Organic Farming & Gardening, a magazine still in circulation today.

According to Rodale and his acolytes, products created by—and processes carried out by—living things are fundamentally different from lab-based processes and lab-created products. The resurrection of this prescientific, vitalistic notion of organic essentialism did not make sense to scientists who understood that every biological process is fundamentally a chemical process. In fact, all food, by definition, is composed of organic chemicals. As a result, the U.S. Department of Agriculture (USDA) refused to recognize organic food as distinguishable in any way from nonorganic food.

Legislating Meaning

In 1990, lobbyists for organic farmers and environmental groups convinced the U.S. Congress to pass the Organic Foods Production Act, which instructed the USDA to establish detailed regulations governing the certification of organic farms and foods. [2] After 12 years of work, the USDA gave organic farmers the certification standards they had wanted, to prevent supposed impostors from using the word organic on their products. [15] Similar organic standards have been implemented by the European Commission and by the Codex Alimentarius Commission of the United Nations. [4] [6]

In all instances, organic food is defined not by any material substance in the food itself, but instead by the so-called natural methods employed by organic farmers. The USDA defines organic in essentially negative terms when it says, “the [organic] product must be produced and handled without the use of synthetic substances” and without the use of synthetic processes. The Codex Commission explains in a more positive light that “organic agriculture is a holistic production management system.”

The physical attributes of organic products—and any effects they might have on the environment or health—are explicitly excluded from the definition. Nonetheless, the definitions implicitly assume that organic agriculture is by its very nature better for the environment than conventional farming.

Precision Genetic Modification

The European Commission states as a fact that “organic farmers use a range of techniques that help sustain ecosystems and reduce pollution.” Yet, according to self-imposed organic rules, precision genetic modification of any kind for any purpose is strictly forbidden, because it is a synthetic process. If conventional farmers begin to grow Enviropigs—or more sophisticated GM animals that reduce nearly all manure-based pollution—organic pig farmers will then blindly continue to cause much more pollution per animal, unless they are prevented from doing so by future EPA regulations.

Many organic advocates view genetic engineering as an unwarranted attack not just on the holistic integrity of organic farms, but on nature as a whole. On the other hand, spontaneous mutations caused by deep-space cosmic rays are always deemed acceptable since they occur “naturally.” In reality, laboratory scientists can make subtle and precise changes to an organism’s DNA, while high-energy cosmic rays can break chromosomes into pieces that reattach randomly and sometimes create genes that didn’t previously exist.

Regardless, organic enthusiasts maintain their faith in the beneficence and superiority of nature over any form of modern biotechnology. Charles Margulis, a spokesman for Greenpeace USA, calls the Enviropig “a Frankenpig in disguise.” [12]

Chemical Pesticides and Organic Farming

Although the market share held by organic products has yet to rise above single digits in any country, it is growing steadily in Europe, the United States, and Japan. Nearly all consumers assume that organic crops are, by definition, grown without chemical pesticides. However, this assumption is false.

Pyrethrin (C21H28O3), for example, is one of several common toxic chemicals sprayed onto fruit trees by organic farmers—even on the day of harvesting. Another allowed chemical, rotenone (C23H22O6), is a potent neurotoxin, long used to kill fish and recently linked to Parkinson’s disease. [1] [10]

How can organic farmers justify the use of these and other chemical pesticides? The answer comes from the delusion that substances produced by living organisms are not really chemicals, but rather organic constituents of nature. Since pyrethrin is produced naturally by chrysanthemums and rotenone comes from a native Indian vine, they are deemed organic and acceptable for use on organic farms.

However, the most potent toxins known to humankind are all natural and organic. They include ricin, abrin, botulinum, and strychnine—highly evolved chemical weapons used by organisms for self-defense and territorial expansion. Indeed, every plant and microbe carries a variety of mostly uncharacterized, more or less toxic attack chemicals, and synthetic chemicals are no more likely to be toxic than natural ones.

Less-allergenic GM Food

All currently used pesticides—both natural and synthetic—dissipate quickly and pose a minuscule risk to consumers. Nevertheless, faith in nature’s beneficence can still be fatal to some children. About 5% of children have severe allergic reactions to certain types of natural food. Every year, unintentional ingestion causes hundreds of thousands of cases of anaphylactic shock and hundreds of deaths.

The triggering agents are actually a tiny number of well-defined proteins that are resistant to digestive fluids. These proteins are found in such foods as peanuts, soybeans, tree nuts, eggs, milk, and shellfish. They linger in the intestines long enough to provoke an allergic immune response in susceptible people.

No society has been willing to ban the use of any allergenic ingredients in processed foods, even though this approach could save lives and reduce much human suffering. GM technology could offer a more palatable alternative: scientists could silence the specific genes that code for allergenic proteins. The subtly modified organisms would then be tested, in a direct comparison with unmodified organisms, for allergenicity as well as agronomic and nutritional attributes.

USDA-supported scientists have already created a less-allergenic soybean. Soy is an important crop used in the production of a variety of common foods, including baby formula, flour, cereals, and tofu. Eliot Herman and his colleagues embedded a transgene into the soy genome that takes advantage of the natural RNA interference system to turn off the soy gene responsible for 65% of allergic reactions. [9]

Promising Results

RNA interference can be made to work in a highly specific manner, targeting the regulation of just a single gene product. Not only was the modified soy less allergenic in direct tests, but the plants grew normally and retained a complex biochemical profile that was unaltered except for the absence of the major allergen. Further rounds of genetic surgery could eliminate additional allergenic soy proteins. Other scientists have reported promising results in their efforts to turn off allergy-causing genes in peanuts and shrimp.

Some day perhaps, conventional soy and peanut farmers will all switch production to low-allergenicity GM crop varieties. If that day arrives, organic food produced with GM-free organic soy or peanuts will be certifiably more dangerous to human health than comparable nonorganic products.

Unfortunately, conventional farmers have no incentive to plant reduced-allergy seeds when sales of their current crops are unrestricted, especially when the public has been led to believe that all genetic modifications create health risks. In the current social and economic climate, much of the critical research required to turn promising results into viable products is simply not pursued. Anti-GM advocates for organic food may be indirectly and unknowingly responsible for avoidable deaths in the future.

Vegetarian Meat

Only three decades have passed since genetic modification technology was first deployed in a rather primitive form on simple bacteria. The power of the technology continues to explode with no end in sight, leading to speculation about how agriculture could really be transformed in the more distant future.

Chicken meat is actually cooked muscle, and muscle is a type of tissue with a particular protein composition and a particular structure. At some future date, as the power of biotechnology continues to expand, our understanding of plant and animal genes could be combined with the tools of genetic modification to create a novel plant that grows appendages indistinguishable in molecular composition and structure from chicken muscles.

Vegetative chickens—or perhaps muscular vegetables—could be grown just like other crops. Eventually, there could be fields of chicken, beef, and pork plants. At harvest time, low-fat boneless meat would be picked like fruit from a tree.

The Advantages

The advantages of genetically engineered vegetative meat are numerous and diverse. Without farm animals, there could be no suffering from inhumane husbandry conditions and no pollution from manure. Since the sun’s rays would be used directly by the plant to make meat, without an inefficient animal intermediate, far less energy, land, and other resources would be required to feed people.

Up to 20% of the earth’s landmass currently used for grazing or growing animal feed might be ceded back to nature for the regrowth of dense forests. As a result, biodiversity would expand, the extinctions of many species might be halted, and a large sink for extracting greenhouse gases from the atmosphere might be created.

Of course, this scenario is wild biotech speculation. But current-day organic advocates would reject any technology of this kind out of hand, even if it was proven to be beneficial to people, animals, and the biosphere as a whole. This categorical rejection of all GM technologies is based on a religious faith in the beneficence of nature and her processes under all circumstances, even when science and rationality indicate otherwise.


References

1. Betarbet, R., T. B. Sherer, G. MacKenzie et al. 2000. Chronic systemic pesticide exposure reproduces features of Parkinson’s disease. Nature Neuroscience 3: 1301-1306.

2. Burros, M. 2002. A definition at last, but what does it all mean? New York Times (Oct 16).

3. Chiang, V. L. 2002. From rags to riches. Nature Biotechnology 20: 557-558.

4. Codex Alimentarius Commission. 1999. Guidelines for the Production, Processing, Labeling and Marketing of Organically Produced Foods. U.N. Food and Agriculture Organization, Rome.

5. Dennis, C. 2004. Vaccine targets gut reaction to calm livestock wind. Nature 429: 119.

6. EUROPA. 2000. What Is Organic Farming? European Commission, Belgium.

7. FAO. 2006. Livestock Policy Brief 02: Pollution from Industrialized Livestock Production. U.N. Food and Agriculture Organization, Rome.

8. Golovan, S. P., R. G. Meidinger, A. Ajakaiye et al. 2001. Pigs expressing salivary phytase produce low-phosphorus manure. Nature Biotechnology 19: 741-745.

9. Herman, E. M., R. M. Helm, R. Jung & A. J. Kinney. 2003. Genetic modification removes an immunodominant allergen from soybean. Plant Physiology 132: 36-43.

10. Isman, M. B. 2006. Botanical insecticides, deterrents, and repellents in modern agriculture and an increasingly regulated world. Annual Review of Entomology 51: 45-66.

11. OECD. 2003. Agriculture, trade and the environment: the pig sector. OECD Observer (Sep).

12. Osgood, C. 2003. Enviropigs may be essential to the future of hog production. Osgood File (Aug 1). CBS News.

13. Personal communication.

14. Pilate, G., E. Guiney, K. Holt et al. 2002. Field and pulping performances of transgenic trees with altered lignification. Nature Biotechnology 20: 607-612.

15. USDA. 2000. National Organic Program. U.S. Department of Agriculture.

16. Verhoog, H., M. Matze, E. Lammerts van Bueren & T. Baars. 2003. The role of the concept of the natural (naturalness) in organic farming. Journal of Agricultural and Environmental Ethics 16: 29-49.


About the Author

Lee M. Silver is professor of molecular biology in the Woodrow Wilson School of Public and International Affairs at Princeton University, and the author of Challenging Nature: The Clash of Science and Spirituality at the New Frontiers of Life (Ecco).

The Environmental Impact of ‘Silent Spring’

Exploring Rachel Carson and her magnum opus, which launched an environmental movement that remains strong today.

Published November 4, 2005

By Fred Moreno

Rachel Carson. Image via USFWS.

“Ecology” derives from the Greek word for “home.” It is defined, in a general sense, as the science that focuses on the interaction of all living things with their environment. Ecology actually owes a debt to Charles Darwin and his studies of the diversity and interdependence of species and their habitats, which laid the groundwork for a new understanding of the natural world. The term itself was coined by the 19th century German zoologist Ernst Haeckel, a staunch advocate of Darwin’s theories on the distribution and abundance of species. (Haeckel is well-known to artists for his meticulous drawings of life forms in his book Kunstformen der Natur.)

Ecology’s popularity as a movement has American roots, however, from its early days as the “nature study movement” and in literary traditions from such writers as Henry David Thoreau and Walt Whitman. As damage to the environment grew more apparent from logging, mining, and industrialization, interest in conservation grew and the establishment of such groups as the Audubon Society and the Sierra Club — as well as modest moves by the federal government to establish national parks and forests — heightened public consciousness.

Silent Spring and the Modern Environmental Movement

But the average American’s knowledge of ecology was minuscule, at best. It took a persistent young biologist and a book to raise the public’s awareness and change the way many people around the world looked at how we live on this planet. The biologist was Rachel Carson and the book was Silent Spring, often cited as one of the two most influential books in American history — the other being Uncle Tom’s Cabin. Historians are nearly unanimous in the belief that the modern environmental movement, which emphasizes pollution and other damage to the quality of life on earth, began with Silent Spring.

Alerting the public to the dangers of dichlorodiphenyltrichloroethane (DDT) and promoting initiatives to correct the problems it created, the book combined a warning about pesticide pollution with a lyrical celebration of the natural world. Carson outlined how chlorinated hydrocarbons and organic phosphorus insecticides changed the cellular processes of animals, plants, and perhaps even humans. She questioned the wisdom of allowing toxic chemicals to be used in the environment before the impact of their long-term consequences was known. One of the most basic human rights, she said, was the “right of the citizen to be secure in his own home against the intrusion of poisons applied by other persons.”

Benefits Overshadowed by Damage

Of course, the effects of pesticides and other pollutants on the environment and human health were not a new concern. In the 1860s, toxic compounds of lead and arsenic were used in agriculture despite the hazard to health. And DDT itself was not only successful in pest control but even had humanitarian applications during WWII, being used to kill lice, which spread typhus.

(In fact, the discoverer of DDT’s insecticidal action, Swiss chemist Paul Müller, won the Nobel Prize in 1948 “for his discovery of the high efficiency of DDT as a contact poison against several arthropods.”) But its benefits — and those of its other synthetic relatives such as PCBs and HCB — were soon overshadowed by the damage done to the environment, and to people.

Rachel Carson was not the first to recognize the dangers of DDT, but her talent for conducting research and for synthesizing complex information in accessible terms made her warnings stand out. Born in 1907 in Pennsylvania, she was fascinated by the natural world. She graduated with honors in biology from the Pennsylvania College for Women in 1929 and earned a master’s in zoology from Johns Hopkins, where she also taught.

She worked with the U.S. Bureau of Fisheries (later the Fish and Wildlife Service) and continued her interests in both writing and natural science. Her first book was a description of nature in the ocean, Under the Sea-Wind, followed by The Sea Around Us, which won the National Book Award. It took her four years to research and write Silent Spring, and it first saw print as a three-part series in the New Yorker in June 1962, three months before its official publication as a book.

An Overwhelming Reception

Silent Spring’s reception was overwhelming. Over 100,000 copies had sold by Christmas, and within the next 10 years it was translated into 16 languages. The book has been continuously in print and has sold more than 2 million copies. Despite the book’s resonance with the public and most scientists, Carson came under withering attack by elements of the chemical industry and its supporters in government and the media.

She was accused of being a communist and a crank, and her credentials were questioned. Carson’s biographer Linda Lear notes that she was attacked because she was an “outsider who had not been part of the scientific establishment, first because she was a woman but also because her chosen field, biology, was held in low esteem in the nuclear age.” Critics called her a “bird and bunny lover” who had stepped beyond the bounds of her sex and her science.

Carson responded with dignity and deliberation. She did television and magazine interviews, gave lectures and testified before a U.S. Senate committee. Finally, a report by President John Kennedy’s Science Advisory Committee vindicated the book and quieted the critics. In the years after Silent Spring, a flood of important environmental action followed, starting with the establishment of the Environmental Protection Agency in 1970 and the banning of DDT in 1972. This was followed by acts setting standards for clean air and clean water, as well as legislation to protect workers from toxins in the workplace and to safeguard processed foods from carcinogens.

A Valiant Fight Against Cancer

Tragically, all during the writing of Silent Spring and the turbulent aftermath of its publication, Carson’s health was deteriorating. She learned she had breast cancer in 1960 and endured several rounds of radiation therapy. Despite the treatments, the cancer spread. On April 14, 1964, not quite two years after her groundbreaking book was published, Rachel Carson died; she was 56 years old. In 1980, she was posthumously awarded the nation’s highest civilian honor, the Presidential Medal of Freedom. The citation reads:

Never silent herself in the face of destructive trends, Rachel Carson fed a spring of awareness across America and beyond. A biologist with a gentle, clear voice, she welcomed her audiences to her love of the sea, while with an equally clear voice she warned Americans of the dangers human beings themselves pose for their environment. Always concerned, always eloquent, she created a tide of environmental consciousness that has not ebbed.


Landfill Diversion: Created from Consumerism

Brian Jungen reconstructs everyday materials into cultural and natural wonders in a modern art show that doubles as anthropology. Or paleontology.

Published October 21, 2005

By Adelle Caravanos

Image courtesy of Seventyfour via stock.adobe.com.

Walk into the New Museum of Contemporary Art, and you might think you’ve mistakenly stumbled into a natural history museum. After all, the first things you’ll see are three huge whale skeletons, suspended from the ceiling. Then there’s the collection of Aboriginal masks. Upon closer inspection, however …

You’ll see that the masks are made from Nike sneakers.

And those aren’t whale bones. They’re lawn chairs.

In fact, almost everything at the Brian Jungen exhibit is made from new, mass-produced items that the Canadian artist has reconfigured into something that looks, well, old and unique. The comprehensive exhibit features 35 sculptures, drawings and installations created by Jungen, who is best known for his Northwest Coast native masks made from sliced-up Air Jordans. The complete collection of his masks, Prototypes for a New Understanding, is on display at this show for the first time.

With Prototypes, Jungen takes an everyday item from modern Western life – athletic sneakers – and reassembles it into a traditional indigenous item. The work is a comment on the commercialization of cultural heritage, as well as a comparison of the aesthetics of the two worlds. For instance, the trademark Air Jordans come in the same red, black and white color combination frequently used in Aboriginal masks.

Jungen’s reassembly of the sneakers — arguably the most sought-after consumer products of the ’90s — literally gives them a human face, and their man-made material is refashioned into lifelike ancient warriors: he renders the synthetic, organic.

The Natural Cycle of Materials

Jungen obtains a similar effect with the whale skeletons, composed of chopped-up patio chairs: the stackable white plastic variety loved by suburbanites. Jungen worked with an assistant, bolting together the plastic pieces to form vertebrae, ribs, skulls and fins until each work became indistinguishable from the skeletal remains of a whale.

How many chairs make a whale? Jungen recalls midnight runs to the local home goods store to acquire some 300 for the three installations: Shapeshifter, named for a mythical creature with the ability to morph its form; Cetology, whose title refers to the zoological study of whales; and Vienna, titled in honor of the city where it was created. Although they loom large overhead in the gallery, ranging from 21 to 42 feet in length, Jungen says they’re on scale with baby whales.

By using plastic, which is derived from petroleum, which in turn comes from large animal fossils, Jungen draws attention to the natural cycle of materials on our planet – his fake whale skeletons are built using a by-product of the material that real whales leave behind.

Also in the collection: A series of “lava rocks” made from deconstructed soccer balls; wooden baseball bats carved with loaded words and phrases; and a set of neatly stacked cafeteria trays, inspired by a similar configuration of trays used by a Canadian prisoner to escape confinement.


Hollywood Hysteria or Scientific Reality?

Much hype is made about the impact of climate change from both sides of the ideological spectrum. But what does the actual science say? These NASA researchers break it down.

Published March 1, 2005

By Sheri Fink

Image courtesy of PaulShlykov via stock.adobe.com.

From the cover stories of popular science magazines to the content of popular Hollywood movies, the possibility of abrupt, catastrophic climate change has stirred the public imagination. But how real is the threat? At NASA’s Goddard Institute for Space Studies, Gavin A. Schmidt and Ronald L. Miller are attempting to answer that question by creating climate models, testing them against evidence from historical climate records, and then using the models in an effort to predict the climate of the future.

The Greenland ice core offers clues to the history of climate change. Calcium content and methane levels correlate with the sharp temperature swings that mark abrupt climate changes. “You can count the layers in these ice cores. It’s like tree rings; you can see one year after another,” says Schmidt.

The idea that abrupt climate change is even a possibility in our relatively climatically stable Holocene era comes by way of a single example. Recorded in the Greenland ice core, it dates to the very end of the last ice age, roughly 12,000 years ago.

“This is the poster child for abrupt climate change,” says Schmidt, “extremely cold going to extremely warm very, very quickly. When this was first discussed, people had no idea that the climate could change so rapidly.”

The period was named the Younger Dryas because of its reflection in European Dryas flower pollen records. Various other climate records also show the event – from caves in East China to sediments in the Santa Barbara and Cariaco Basins to ice cores in the Andes to cave deposits around the Mediterranean.

Flow and Flux in the North Atlantic

“You can see a clear signature of this event almost everywhere in the globe,” says Schmidt. However, the effect is largest near the North Atlantic. “That kind of points you to something that’s happening in the North Atlantic as a possible cause or trigger for what’s going on,” he says.

The circulation of the ocean is driven not only by wind, but also by the water’s salt content and density. The two factors interact in a complex way.

Schmidt sums up the ocean’s overturning circulation – also known as thermohaline circulation – as “warm water that rises along the surface and cold salty water that remains underneath. That transport makes it much warmer in the North Atlantic than it is for instance in the North Pacific.”

The process is self-sustaining. “It’s warm in the North Atlantic because those currents also bring up salt. That salt is heavy, which causes water to sink, and this motion causes the water to release heat.”

He points out that the system also has the potential for different states, however. If for some reason the currents ceased, then the water would not be as salty. It would not sink, and the surroundings would stay cold.
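
Schmidt’s “different states” have a classic pencil-and-paper illustration: a Stommel-style two-box model, in which a single freshwater forcing can be consistent with more than one circulation strength. The sketch below is only a cartoon of that idea, with invented parameter values; it is not the model used at GISS.

```python
import math

# Stommel-style two-box sketch of the ocean's "different states."
# All parameter values here are invented for illustration.
alpha, beta = 0.2, 0.8   # thermal and haline density coefficients
k = 1.0                  # circulation constant
dT = 1.0                 # fixed temperature contrast between the boxes
F = 0.01                 # freshwater forcing

# A steady state requires k * (alpha*dT - beta*dS) * dS = F: a quadratic
# in the salinity contrast dS, so the same forcing F can coexist with
# two different circulation strengths.
a, b, c = beta, -alpha * dT, F / k
disc = math.sqrt(b * b - 4 * a * c)
for dS in ((-b - disc) / (2 * a), (-b + disc) / (2 * a)):
    q = k * (alpha * dT - beta * dS)  # overturning strength
    print(f"salinity contrast {dS:.3f} -> circulation strength {q:.3f}")
```

Push F high enough and the square root’s argument turns negative: no vigorous-circulation solution survives, anticipating Schmidt’s remark below that eventually “nothing’s being brought up anymore.”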

Researchers recently developed a paleoclimate measure that correlates with the residence time of water in the North Atlantic. “In the Younger Dryas,” says Schmidt, “there was a big dip in how much water was being exported – or the residence time of water in the North Atlantic.” This indicates that the North Atlantic overturning circulation was much reduced at the time of that rapid climate change.

An Explanation for Abrupt Climate Change?

The last ice age was characterized by many examples of rapid climate change. Changes in ocean circulation provide a possible explanation.

“We have reasons to believe that ice sheets aren’t particularly stable,” says Schmidt. “Every so often, if they get too big, they start melting at the base.”

Icebergs calving into the ocean and melting would produce a large freshwater pulse. “As you make the ocean fresher and fresher and fresher, then you get less and less formation of that deep water. As that reduces, then there’s less salt being brought up from the lower latitudes,” he explained.

“At one point it’s just too fresh, and then nothing’s being brought up anymore.” In that case, the only stable solution, Schmidt says, is the slowing of the thermohaline circulation.

Could a reduced overturning actually cause abrupt climate change? The answer isn’t clear yet, but there is a correlation. “When we have a weak circulation, it seems like the climate in a lot of cases is very cold,” says Ron Miller.

Why Worry Now?

On top of this instability, humans have dramatically changed the atmosphere’s composition over the past 150 years. And that’s cause for some concern. The energy absorbed by greenhouse gases is balanced by evaporation, which should lead to an increase in rainfall.

“It’s predicted by every climate model,” says Miller. That rainfall could be a source of just enough fresh water to tip the scales, stilling the ocean and, perhaps, making the atmosphere colder. Indeed, a study of ocean salinity shows that in the past decades, the ocean has gained extra fresh water. “The question is, how much cooling do we get?” asks Miller. “Where is this cooling happening? Is it global, and how important is it compared with the warming caused by greenhouse gases?”

Miller and Schmidt are using a general circulation model to predict the answers to these questions. First, the model was tested to see how well it could predict climatic occurrences of the past century. A rough grid was superimposed on the planet, and within each grid cell, the changes in water vapor, liquid water, momentum, energy, and other factors were observed.

Next, positive and negative forcings – the atmospheric conditions expected to warm or cool the planet, such as solar irradiance and tropospheric and stratospheric aerosols – were calculated or estimated and added to the model. “It’s tracking the observed global average temperature surprisingly well, and we’re really quite proud of this,” says Miller.
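
To give a feel for how such forcings enter a model, here is a zero-dimensional energy-balance cartoon. It is emphatically not the GISS general circulation model; the heat capacity, feedback parameter, and forcing magnitudes below are invented for illustration only.

```python
# Zero-dimensional energy-balance cartoon: net forcing drives a global
# mean temperature anomaly. All numbers are invented for illustration.

FORCINGS = {                      # watts per square meter
    "greenhouse_gases": +2.4,
    "solar_irradiance": +0.3,
    "tropospheric_aerosols": -1.0,
    "stratospheric_aerosols": -0.2,
}
LAMBDA = 1.2   # climate feedback parameter, W/m^2 per deg C (assumed)
C = 8.0        # effective heat capacity, W*yr/m^2 per deg C (assumed)

temp_anomaly = 0.0
net_forcing = sum(FORCINGS.values())
for year in range(100):
    # dT/dt = (net forcing - restoring feedback) / heat capacity
    temp_anomaly += (net_forcing - LAMBDA * temp_anomaly) / C

print(f"Temperature anomaly after 100 years: {temp_anomaly:.2f} deg C")
```

A real general circulation model does this bookkeeping in every grid cell, exchanging water vapor, momentum, and energy between cells, which is what allows regional as well as global predictions.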

Miller admits to a few kinks in regional predictions. Still, he says, “we have a lot of confidence that the model is good at reproducing 20th century climate trends. That gives us some confidence that we can actually make predictions in the future.”

No Cause for Alarm, Yet

After estimating the atmospheric conditions of the next century – no easy task in and of itself – the researchers took the model out for a spin to see what could be expected over the next 100 years. The results indeed predict a slowing of the thermohaline circulation corresponding to a cooling in some areas. “But it’s swamped globally by the warming expected from the greenhouse gases,” says Miller. “So clearly there’s no evidence for any sort of ice age.”

Other researchers have created their own models. These point to varying degrees of slowing in the ocean circulation as fresh water is added, but all agree with Miller and Schmidt’s conclusions. “The models give no indication we’re going to see any climate surprises or ice ages in the next 100 years or so,” says Miller.

The terrible tsunami of Dec. 26, 2004 left no one in doubt of the power of oceanic change, in this case due to an undersea earthquake. Still, those kept awake at night by the imagined catastrophic aftermath of thermohaline circulation slowing, as depicted in the film The Day After Tomorrow, may rest easier. Meanwhile, the scientists are continuing to refine their models and to study other factors that may have led to rapid climatic change in the past.



About the Author

Sheri Fink, M.D., Ph.D., is a freelance journalist. Her award-winning book War Hospital: A True Story of Surgery and Survival (PublicAffairs, 2003) was published in paperback in December 2004.

Climate Change: A Slow-Motion Tsunami

From reducing greenhouse gas emissions to developing reliable sources of renewable energy, scientists are planning for how to deal with the threats brought on by climate change.

Published March 1, 2005

By Christine Van Lenten

Image courtesy of Itxu via stock.adobe.com.

Images of the devastation wrought by the December 2004 tsunami in the Indian Ocean are still indelible in all our minds. Inevitably, that imagery suggests itself in the context of climate change, which poses the threat of natural forces altering our planet – albeit much more slowly – in ways that could be destructive to many forms of life, not least our own.

The tsunami was not, of course, caused by climate change. That imagery is facile; that imagery is lurid; we may hope it’s grossly overstated. But it would be a mistake to reject its core significance.

The reasons why, and steps we can take to mitigate and adapt to climate change, were presented by two of the world’s top climate scientists at an event sponsored by the Environmental Sciences Section in December. Dr. Rajendra K. Pachauri, who chairs the authoritative Intergovernmental Panel on Climate Change (IPCC), was the featured speaker.

Dr. James E. Hansen, who directs the NASA Goddard Institute for Space Studies (GISS), served as respondent. Kenneth A. Colburn of the Northeast States for Coordinated Air Use Management (NESCAUM) and Karl S. Michael of the New York State Energy Research and Development Authority (NYSERDA) lent regional and state perspectives on how governments and the marketplace are – and aren’t – responding to the challenge of reducing the greenhouse gas (GHG) emissions that are driving global warming.

Give Carbon a Market Value

This quartet of insiders spoke against a backdrop of major events. Pachauri and Colburn were en route to Buenos Aires for the tenth Conference of the Parties to the UN Framework Convention on Climate Change, which produced the Kyoto Protocol. On January 1, 2005, the EU market for trading carbon emissions would be launched. That, along with implementation of the Kyoto Protocol in February, would, Colburn predicted, give carbon a market value, and “that will change everything.”

But will change come soon enough to transform a destructively carbon-intensive world into one in which current generations can meet their basic needs without impairing future generations’ ability to meet theirs?

A Monster Problem

Created in 1988 by the UN Environment Programme and the World Meteorological Organization, the IPCC has enlisted scientists and other experts around the globe in shaping and advancing a new body of knowledge about a monster problem of unparalleled complexity. As climate change has evolved from an obscure topic to the mother of all fields, the IPCC has earned and maintained wide respect.

Central to its success are its methods: it proceeds by way of assessments of peer-reviewed, published literature, largely by consensus, and transparently. Central to its methods is the design of scenarios that project a range of future GHG emission levels and resulting physical effects. The scenarios rest on assumptions about energy use, population, and economic activity. Much turns on them: the higher the forecasts, the more urgent the problem, the greater the pressures to act.

Inevitably the scenarios are challenged. “One welcomes debate,” said Pachauri. “But there should be a healthy and objective debate.” The IPCC scenarios have been subjected to “a systematic attack” charging that forecasts are too high because methods are flawed. That attack is unfounded, Pachauri contended, and the IPCC has been responding to it. A natural resource economist, he presented the IPCC’s case. In short, the IPCC relies on capable economists who draw from mainstream sources and employ mainstream methods. Critics who say the IPCC exaggerates future emission levels are “not at all correct.”

Dangerous Trends

What has the IPCC learned? The evidence Pachauri cited that human-induced climate change is already under way is by now familiar and widely accepted: temperatures are rising, glaciers are retreating, snow cover is diminishing, droughts are more frequent and severe. The November 2004 Arctic Climate Impact Assessment reports that the Arctic is warming much faster than anticipated.

Of the GHGs driving climate change, CO2 is by far the worst offender. By 1999 its concentration in the atmosphere had increased 31% since the industrial revolution – to a level perhaps not exceeded during the past 420,000 years and likely not in the past 20 million. It’s expected that fossil fuel burning will continue to drive CO2 emissions during the 21st century. In IPCC model runs, the globally averaged surface temperature increases by 1.4 °C to 5.8 °C from 1990 to the end of the 21st century. Wintertime temperatures could rise, particularly in northern latitudes. Globally, average water vapor, evaporation, and precipitation could increase, though effects could vary regionally. Sea levels could rise by 0.09 to 0.88 meters.

Complicating the picture is the complexity of the interacting climate, ecological, and socioeconomic systems that are in play, with their feedback loops and indirect as well as direct effects. With systems of such complexity and temperature changes of such magnitude, effects won’t necessarily be linear: there will be discontinuities; there could be abrupt changes; impacts could be severe. Because of the inertia of the climate system, some impacts may only slowly become apparent. Some, Pachauri observed, could continue for decades, centuries, millennia. Some could be irreversible if certain thresholds, not yet understood, are crossed.

The Potential Damage

Large uncertainties are associated with IPCC projections; for example, assumptions about economic growth, technology, and substitutions among different forms of energy. But if by the end of this century we end up in the upper end of the ranges forecast, Pachauri cautioned, “the world is in trouble.”

In sketching the likely impacts of climate change, Pachauri drew from the IPCC’s 2001 Third Assessment Report. (The fourth is due in 2007.) Damage could result from changes in precipitation patterns that impact fresh water and food supplies, he said. Growth in crop yields has already slowed due to factors not related to climate; yields could fall. Demand for irrigation water will rise, aggravating scarcities. Water quality will decline. Competition for water is already growing.

Developing countries and the poor in all countries will be hardest hit. A half-billion people in the Indian subcontinent depend for the bulk of their water supply on snow melt from the Himalayas, now threatened. Two-thirds of India’s population lives in rural areas where much farming depends entirely on rain-fed agriculture. Without adequate rainwater, soil conditions will deteriorate; poverty will worsen. Food prices will rise for everyone; huge demand for agricultural products will threaten food security for the entire world. Other likely effects are no less alarming:


– Sea levels will rise as polar ice caps continue to melt and as heat expands water; coastal regions and islands will be inundated.

– Heat waves will kill people.

– Rates of infectious and heat-related respiratory diseases will rise.

– Economies will be dislocated; sustainable development, thwarted.

– All forms of life (so exquisitely temperature-dependent) will be affected, with far-reaching ecological consequences.

Energy Efficiency is Key

Even after levels of CO2 are stabilized, temperatures will continue to rise before they eventually stabilize. Sea levels will continue to rise for much longer. The longer we delay acting, the worse the impacts, the longer they’ll last, the harder it will be to mitigate them.

Stabilizing climate will require a broad range of actions that address not only CO2 but also other climate forcing agents including methane, trace gases, and black carbon (soot) aerosols, according to Jim Hansen, whose research into climate change goes back decades. His congressional testimony in the 1980s helped raise broad awareness of climate change issues.

IPCC business-as-usual scenarios are unrealistically pessimistic, in his opinion; he says a realistic description of CO2 emissions growth is 1.4% per year. However, “I’m more pessimistic than IPCC analyses” with regard to how fast large ice sheets may disintegrate. In order to avoid climate problems, he argues, CO2 emissions need to level off during the next few decades and then decline with the help of new technologies.

But the U.S. Department of Energy projects continued growth in CO2 emissions and energy needs over the next several decades. Who are the biggest U.S. energy consumers? Industrial uses have been fairly constant; the greatest growth is in transportation. Indeed, CO2 emissions from transportation now exceed those from industry. “If you want to flatten out the use of energy, you need to address transportation,” Hansen said.

Within the transportation sector, the lion’s share of emissions comes from automobiles and “light trucks,” a category that includes SUVs, pick-ups, and vans. Growth in emissions comes primarily from light trucks. With gasoline at $2 per gallon, a conservative 4.8 mpg increase in efficiency for light trucks and a 2.8 mpg increase for automobiles would pay for themselves through decreased fuel usage.
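
The arithmetic behind “pay for themselves” is easy to sketch. In the snippet below, the baseline fuel economies and annual mileage are illustrative assumptions, not figures from the talk; only the $2-per-gallon gasoline price comes from the text.

```python
# Back-of-the-envelope fuel savings from an mpg increase.
# Baseline mpg values and annual mileage are assumptions for the sketch;
# only the $2/gallon gasoline price is taken from the text.

GAS_PRICE = 2.00         # dollars per gallon
MILES_PER_YEAR = 12_000  # assumed annual mileage per vehicle

def annual_savings(base_mpg: float, mpg_gain: float) -> float:
    """Dollars saved per year by raising fuel economy from base_mpg."""
    gallons_before = MILES_PER_YEAR / base_mpg
    gallons_after = MILES_PER_YEAR / (base_mpg + mpg_gain)
    return (gallons_before - gallons_after) * GAS_PRICE

print(f"Light trucks (+4.8 mpg on an assumed 17 mpg base): "
      f"${annual_savings(17.0, 4.8):.0f}/year")
print(f"Automobiles (+2.8 mpg on an assumed 24 mpg base): "
      f"${annual_savings(24.0, 2.8):.0f}/year")
```

Savings on this order, accumulated over a vehicle’s lifetime, are what would offset the cost of the added efficiency technology.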

Other Measures; Other Means

Implementing this by 2015, Hansen advised, would save, by 2030, more than 3 million barrels of oil each day; integrated over 35 years, four times the amount of oil in the Arctic National Wildlife Refuge. A moderate scenario – more modest than what’s possible with existing technology – would save, by 2030, seven times what’s in the Arctic National Wildlife Refuge. “If you go further it would be possible to get an absolute decrease [in emissions]…So the possibilities of avoiding large climate change are there technologically. We just have to get serious” – and act.
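
Those multiples are easy to sanity-check. The sketch below assumes the refuge holds roughly 10 billion recoverable barrels, a round figure of our own choosing rather than one from the talk:

```python
# Consistency check of the "four times ANWR" figure.
# The ~10-billion-barrel estimate for the refuge is our assumption.
barrels_per_day = 3e6                # conservative-scenario savings by 2030
total = barrels_per_day * 365 * 35   # integrated over 35 years
print(f"Total saved: {total / 1e9:.0f} billion barrels")  # ~38 billion
print(f"Multiple of ANWR: {total / 1e10:.1f}x")           # ~3.8, about four times
```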

As Pachauri observed, climate change poses equity issues. To date, developed nations have added the largest share of human-generated GHGs to the atmosphere; they hold the most GHG “debt.” Should developing nations have a chance to “catch up?” Who should bear the burden of reducing GHG emissions?

Along with acting to mitigate climate change, we can adapt to it, Pachauri explained, by such measures as developing drought- and salt-tolerant crops; by developing better water-conservation practices and technologies for conserving and desalinating water; by reducing the enormous inefficiencies in the biomass cycle, the developing world’s major form of energy.

But technology isn’t a mere fix. “You have to create the policy framework, the social conditions by which technology will be developed and used. We need to redefine technology-related priorities within a global framework.” Toward this end, social scientists’ rigorous analyses of the impacts of climate change are essential. Political implications, too, which tend to be viewed only in terms of negotiating positions, should be examined within an objective academic framework, Pachauri urged.

What Will It Cost?

And economic issues must be squarely addressed: beliefs that slowing climate change will be costly are “fallacious,” Pachauri stated. The IPCC found that limiting CO2 concentrations to a reasonable 450 ppm by 2050 would reduce global GDP by only about 4%. Essentially, in a period of healthy economic growth, you merely postpone by a year or so the date by which you reach a certain level of prosperity. And historically, technologies have proved far more efficient and less costly than anticipated, he noted.
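
Pachauri’s “year or so” is a consequence of compound growth. A minimal sketch, assuming a steady 3% annual growth rate (our illustrative figure, not his):

```python
import math

# With steady growth g, a 4% lower GDP is reached about
# ln(1/(1-0.04)) / ln(1+g) years later. The 3% rate is assumed.
growth = 0.03
shortfall = 0.04
delay = math.log(1 / (1 - shortfall)) / math.log(1 + growth)
print(f"Prosperity delayed by about {delay:.1f} years")  # ~1.4 years
```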

Moreover, as Hansen remarked, the millions of barrels of oil you could easily save by 2030 are equivalent, at $40 a barrel, to $80 billion a year, “which would, independent of mitigating climate change, do a lot for our economic and national security.”

The view that it’s going to be terribly costly to mitigate climate change is shortsighted and irresponsible, Pachauri asserted. “You can’t see mitigation of climate change in isolation from other priorities.” He cited several benchmarks. Worldwide military expenditures for 2004 are estimated at $950 billion. In 2003, aid from donor nations totaled only $68.5 billion, 0.25% of their income; it’s generally believed aid should total around 0.7%. If the World Trade Organization reduces subsidies so that farmers in developing countries can compete on a level playing field, “altogether you’re providing a few hundred billion dollars a year” those countries could use to address climate change.

That is, the resources are there, but “there is unprecedented need for global vision and commitment. Groups like this – the scientific community, the thinkers of the world” – must bring it about.

Leaders and Laggards

The U.S. federal government hasn’t yet acted to regulate GHG emissions, although the United States is the largest GHG emitter in the world. Nor has it ratified the Kyoto Protocol. The United States is not leading technologically or intellectually, charged Ken Colburn, whose organization, NESCAUM, is an association of eight Northeastern states’ air quality regulators who are working to reduce GHG emissions.

China just adopted motor vehicle standards that may ultimately disadvantage the United States technologically, he reported. Australia’s government hasn’t ratified Kyoto, but its states are committed to capping GHG emissions, and they’ll meet the first Kyoto target for emission reductions.

Tony Blair has been progressive on climate. “He’s getting flak from the right for not going far enough.” Germany requires that solar energy be integrated into new buildings; New York City’s proposed building standards don’t. And while New York is the financial capital of the world, the intellectual capital for carbon markets is being built in London. The financial ramifications of the EU’s carbon-trading market will be significant, Colburn predicted.

While the federal government balks, U.S. states are pursuing initiatives. The California Air Resources Board announced fuel-efficiency standards that require a 30% reduction in emissions by 2016. The cost, about $1,000 per vehicle, will be recouped by lower fuel consumption. A coalition of auto manufacturers has mounted a legal challenge. “The worst problem in America relative to fuel efficiency standards? High standards would disadvantage domestic manufacturers…We evidently need to drag them kicking and screaming into technological survival.”

Coast to Coast

Like the Northeastern states, California, Oregon, and Washington – whose fossil-fuel GHG emissions account for 7% of the global total – are pursuing an initiative to develop a cap-and-trade program for CO2 emissions. Some “red states” are putting together climate action plans. With implementation of the Kyoto Protocol and the start of EU carbon trading, Colburn said, “pressure is especially building on the business community.”

Karl Michael, who coordinates New York State’s energy, environmental, and economic modeling and forecasting activities related to energy policy and planning, reported some bright spots on the state level. “Ten years ago…global warming wasn’t on the radar screen,” he recalled. But a state Climate Change Action Plan formulated five years ago rapidly evolved into a statewide GHG task force. “Suddenly, it was…the hottest issue in town.” Today, climate change issues are one of Governor George Pataki’s highest priorities.

New York adopted the goal of reducing its 1990 GHG emissions levels by 5% by 2010, and by 10% by 2020. It imposed a “system benefit charge” on electric bills to fund a variety of programs related to energy efficiency and renewable resources. It required that, by 2013, 25% of electricity used in the state come from renewable resources. “We expect that to mean a lot of windmills.” Roughly 16% now comes from renewable sources, largely hydro projects. “Moving to 25% over a decade is a big commitment,” Michael noted.

Just the Beginning

Another bold venture, initiated by Governor Pataki, is the Regional Greenhouse Gas Initiative (RGGI), a consortium of Northeastern states that’s developing a regional cap-and-trade program for carbon emissions from electricity generators. A model rule each state can use to fashion its own regulations is due out by April 2005.

The governors are saying, “We’re serious about this. Get this done. Come up with something that will work,” Michael said. And they want to help the economy by encouraging the development of new technologies. RGGI expects “to make some real changes” in how electricity is produced in the Northeast. The hope is that the federal government will follow suit, “because this is something the states are clearly way out ahead on.” After emissions from electricity generators are capped, RGGI will turn to other sectors. “We see this as just the beginning.”



About the Author

Christine Van Lenten is a freelance writer who has written about environmental subjects for the Academy, government agencies, and private sector firms.

Paul Ehrlich: Can We Avert a Global ‘Nineveh’?

Due to human impacts on the planet, our species and the broader ecosystem may be “racing toward a miserable future.” Paul Ehrlich says we shouldn’t over-rely on technology to correct this troubling trend.

Published August 1, 2004

By Christine Van Lenten

Our “triumphant” species may be partying on toward the first collapse of a global civilization. By accelerating depletion of our natural capital, the interrelated trends of population growth, rampaging consumption, and worsening political and economic inequality have put us on a collision course with nature and eroded our ability to create a sustainable future.

The sources of these trends and how they can be altered is the subject of Paul and Anne Ehrlich’s new book, One with Nineveh, which Paul Ehrlich discussed at The New York Academy of Sciences (the Academy) this spring, at the invitation of the Environmental Sciences Section and the Science Alliance.

That title refers to the seat of the ancient Assyrian empire, which, you may have noticed, is no longer flourishing. Its demise was hastened by self-inflicted environmental damage – a cautionary tale.

Today, Ehrlich’s name is more widely recognized than Nineveh’s. Author of the 1968 bestseller The Population Bomb, he is Bing Professor of Population Studies at Stanford and has published extensively, won many awards, and been a forceful scientist-citizen spokesman on vital issues for decades.

Grave and Worsening

The issues he’s grappling with now are grave and worsening, and Ehrlich did not disguise his frustration with the problem that dismays him most. The human race has radically reshaped the planet; scientists understand all too well that we’re racing toward a miserable future; what must be done is all too clear; for years, scientists have been urgently trying to make this understood. But the mass media carry little science news, and too many citizens and policymakers remain blithely unconcerned. Magical beliefs that technology will solve all problems, quickly, contribute to this syndrome. Leadership is essential, but, Ehrlich believes, the Bush administration is making matters worse.

Scientists must do a better job of getting their story out, he insisted. One with Nineveh is a heroic, plain-English attempt to do this.

The Ehrlichs’ agenda for achieving needed change is proportional to the problems: that’s to say, it’s staggering in scope. One initiative would squarely tackle the challenge of modifying nothing less than human behavior itself. “Remember,” Ehrlich said, “we’re a small-group species, both genetically and culturally. For most of our 5-million-year history…we lived in groups that averaged below 200 people, and almost everybody within those groups was related. Now, evolutionarily in an eye blink of time, we’re trying to live in a global civilization of 6.3 billion people.” We must figure out how to do this better. And individuals’ rights become part of environmental problems, because we can’t tackle problems “if we’re at each other’s throats.”

A millennium assessment of human behavior, he suggested, would examine issues on the “population-environment-resource-ethics-power” spectrum, including the fundamental question of “what people are for.” Ethical issues – including our obligations to the world’s poorest people, to future generations, and to nature – would be central.

Potential for Change

This initiative may seem fanciful, but a partial precedent is enjoying impressive success: the Intergovernmental Panel on Climate Change involves scientists from many countries and disciplines in tackling an unprecedented global problem. Its work is regarded as authoritative. The UN is a cosponsor, and while Ehrlich believes the UN must be radically restructured to reflect 21st century realities, he views it as “the only game in town.”

Another promising precedent is the Millennium Ecosystem Assessment, an international scientific collaboration that will support local, national, and international decision making about ecosystem management.

But can human behavior change, and change quickly enough? Ethical standards have been evolving, Ehrlich reflected. For example, it’s no longer OK to beat your horse to death in the street; becoming a despot is no longer considered a good career move. And societies can change dramatically and rapidly: after President Truman desegregated the military, race relations in the United States changed quickly, though not enough; the Soviet Union collapsed suddenly.

Ehrlich sees the potential for similar change in how we treat each other and the environment, and it is in this that he places his hope. “When the time is ripe, people will begin to realize that the only realistic solutions today are ones we thought were idealistic yesterday. What I hope all of you will do is everything you possibly can to ripen the time.”

Also read: Sustainable Development for a Better Tomorrow

Carbon Sequestration on the Great Plains

While the concept of carbon sequestration might seem like a magic trick, researchers continue to advance its environmental and financial feasibility.

Published June 1, 2004

By Christine Van Lenten

Image courtesy of Tom via stock.adobe.com.

Carbon dioxide emissions dwarf those of all other greenhouse gases (GHGs) in quantity and exacerbate the impacts of climate change. But CO2 emissions are difficult to reduce. Chemically scrubbing them from smokestacks isn’t generally practicable, and many sources are mobile, small, and/or dispersed. Instead, achieving reductions requires adopting energy-efficiency measures, converting to renewable energy sources or other sources that contain little or no carbon, or sequestering the CO2 after it’s been emitted – that is, removing it from the atmosphere and storing it.

Various carbon sequestration schemes are being pursued. Exploiting soil’s capacity to store carbon is the one advocated by Dr. Patrick Zimmerman, who directs the Institute of Atmospheric Sciences at the South Dakota School of Mines and Technology. And Zimmerman has a specific carbon storehouse in mind.

While the world struggles to devise policies, practices, and technologies that can slow global warming, the Great Plains region of the United States, says Zimmerman, continues to serve as a vast “carbon sink,” silently sucking CO2 out of the atmosphere. At a March 23, 2004, event co-sponsored by The New York Academy of Sciences’ (the Academy’s) Environmental Sciences Section and the Third Annual Green Trading Summit, Zimmerman, the featured speaker, contended that croplands and rangelands can store much more carbon. And, he stated, the science needed to quantify sequestered carbon and a system for bringing credits for it to market are available right now.

Markets are Emerging

Setting the stage for Zimmerman’s talk was Peter C. Fusaro, an organizer of the Trading Summit and chairman of Global Change Associates. Fusaro said the Kyoto Protocol, the international community’s attempt to reduce GHG emissions by specifying national targets and a timetable for meeting them – yet to be endorsed by the United States – is flawed and won’t work.

But markets for trading GHG credits are emerging, he reported. They’re modeled on existing pollution-trading markets, like the successful market for sulfur dioxide run by the U.S. Environmental Protection Agency’s Acid Rain Program. SO2 emissions are capped by federal regulation; parties reducing their emissions below regulatory limits can claim credits for reductions and sell them to parties that haven’t met their targets. The overall goal of reducing emissions is served.
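The credit arithmetic behind such a market is simple enough to sketch in code. All caps and tonnages below are invented for illustration; they are not EPA data.

```python
# Minimal sketch of cap-and-trade credit accounting, with invented numbers
# (tons of SO2 per year).
def credit_balance(cap: int, actual: int) -> int:
    """Positive: surplus credits to sell. Negative: credits that must be bought."""
    return cap - actual

plant_a = credit_balance(cap=10_000, actual=8_500)   # cut emissions below its cap
plant_b = credit_balance(cap=10_000, actual=11_000)  # exceeded its cap
print(plant_a, plant_b)  # 1500 -1500: plant A can sell 1,500 credits to plant B
# Combined emissions (19,500 tons) still respect the combined cap (20,000 tons).
```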

In the United States, CO2 emissions aren’t capped by the federal government, but various state and regional initiatives are under way, and leadership for forming GHG markets is coming from state policy makers, Fusaro observed, through bipartisan efforts that are creating the conditions necessary for markets to succeed: “simplicity, replication, and common standards.”

Why Buy Credits?

Why buy credits for reducing CO2 emissions? Anticipation of regulatory schemes is one reason; good corporate PR is another; simple good practice, yet another. And demand generates supply.

California’s Climate Action Registry for voluntary reporting of GHG emissions is the nation’s first. At the invitation of New York’s Governor Pataki, nine northeastern and mid-Atlantic states are collaborating to create a registry and formulate a model rule that states can adapt for capping and trading carbon emissions; two other states and several Canadian provinces are participating as observers. The Chicago Climate Exchange is a new, voluntary pilot market focused on North America. Some major corporations are already trading carbon credits.

Large, existing exchanges will enter the market soon, Fusaro predicted. New York City is the “environmental finance center,” and New York State will be “the template for world trade in carbon. We will have cap-and-trade markets in New York next year.”

CERCs, Cycles, and Soils

In the world of carbon trading, the unit of exchange is the carbon emission reduction credit (CERC), equivalent to one metric ton of CO2. Trading CERCs for CO2 that’s been snatched from the air and stored in the soil may sound like a magic trick. And indeed, scientists are only beginning to understand the intricate and complex feedback loops among climate, atmospheric composition, and terrestrial ecosystems that govern this form of sequestration.

Zimmerman framed the science by explaining that Earth’s carbon inventory cycles among reservoirs – the atmosphere, lithosphere, hydrosphere, and biosphere. The reservoirs vary dramatically in size; so do carbon fluxes between them. Because fluxes between terrestrial ecosystems and the atmosphere are large, over time small changes in them can produce large changes in how much carbon accumulates in the atmosphere.

Zooming in on the molecular level helps illuminate the huge potential of the carbon sequestration scheme that Zimmerman is advocating. During the growing season, green plants absorb solar energy and remove CO2 from the atmosphere, producing carbohydrates. Because these compounds contain less oxygen for each carbon atom than CO2 does, “surplus” oxygen is released; carbon is stored.

Plants also respire, taking in oxygen to metabolize carbon compounds and release energy needed for cellular processes, and producing water and CO2. When the growing season ends, photosynthesis and plant respiration cease, and organic matter, rich in carbon, decomposes, primarily because organisms in the soil feast on the carbon, metabolize it, and return CO2 to the atmosphere. When soil containing organic matter is broken up – for example, by tillage – more organic matter is exposed to this oxidizing process, accelerating release of CO2 and depletion of soil’s carbon bank.
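For reference, the net reactions behind this seasonal cycle are the standard textbook ones (the notation here is not from Zimmerman’s talk):

```latex
\text{Photosynthesis:}\quad 6\,\mathrm{CO_2} + 6\,\mathrm{H_2O}
  \xrightarrow{\text{light}} \mathrm{C_6H_{12}O_6} + 6\,\mathrm{O_2}
\qquad
\text{Respiration/decay:}\quad \mathrm{C_6H_{12}O_6} + 6\,\mathrm{O_2}
  \rightarrow 6\,\mathrm{CO_2} + 6\,\mathrm{H_2O} + \text{energy}
```

The glucose on the right stores one oxygen atom per carbon, versus two in CO2 – hence the “surplus” oxygen released while carbon stays behind.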

The “Missing Sink”

If you want to prolong sequestration of carbon in soil, the crucial question becomes this: What conditions hinder decomposition of organic matter in soil, slowing down the release of CO2 back into the atmosphere?

Vegetation growing at high latitudes, Zimmerman said, fits the bill. At high latitudes, plants grow quickly in the summer, and about half the growth is underground. In winter, freezing slows the metabolic processes that oxidize carbon, trapping it within the soil in the form of organic material that’s not completely decomposed. By contrast, Zimmerman noted, “it’s really tough to store organic matter in tropical soils.”

High-latitude efficiencies in storing carbon also offer one answer to the mystery of the “missing sink.” Scientists calculate that quantities of CO2 produced by burning fossil fuels and resulting from deforestation and other land-use practices are greater than quantities taken up by the atmosphere and oceans. Where’s the balance? Seasonal variations in rising concentrations of atmospheric CO2 point to a slight net carbon sink that’s land-based, in the northern hemisphere, in North America, said Zimmerman. And the Great Plains, a high-latitude region, appears to be that “missing sink.” Analysis of carbon and oxygen in atmospheric CO2 samples collected from air masses as they traverse North America appears to confirm this.
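In rough mass-balance terms (a standard formulation, not Zimmerman’s notation), the puzzle looks like this:

```latex
\underbrace{F_{\text{fossil fuels}} + F_{\text{land use}}}_{\text{known sources}}
\;=\;
\underbrace{\Delta C_{\text{atmosphere}} + F_{\text{oceans}}}_{\text{accounted for}}
\;+\;
\underbrace{F_{\text{missing}}}_{\text{residual land sink}}
```

The seasonal and isotopic data Zimmerman cited point to the Great Plains as a large share of that residual term.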

Adapting Land-Use Practices

The Great Plains, he added, can store still more carbon, through alteration of land-use practices – for example, converting high-tillage cropland to low-till or no-till, or to pasture or grassland. Zimmerman, who grew up on a wheat farm, pointed out that soil organic matter also increases ecosystem productivity and resilience to stress.

Sequestering carbon in the Great Plains can significantly offset emissions elsewhere, he contended. For the purpose of slowing global warming, what matters is that CO2 is sequestered, not where. Thus, cropland and rangeland in the Great Plains can serve as generators of tradable CERCs and a brake on global warming.

But how can you tell how many metric tons of CO2 an acre of land is sequestering? As described by Zimmerman, the science is fiercely complex. His team of meteorologists, ecologists, plant physiologists, GIS experts, analytical chemists, computer scientists, and remote-sensing specialists virtually swarms all over the landscape – working from the molecular level, to individual leaf, to grass canopy, to atmosphere – gathering data on a host of processes and factors.

Equipment ranges from plastic bags placed over plants, to towers, tether balloons, an airplane equipped with a digital imaging system, and satellites. Measurements include variations in temperature, humidity, rainfall, snowfall, and solar radiation; quantities of water vapor, volatile organics, and CO2 emitted from vegetation, and the fluxes to vegetation; and leaf area indexes. Data on land-use history, vegetation and crop dynamics, and feedbacks among the carbon, phosphorus, nitrogen, and sulfur cycles are gathered, too.

Models Built on Data

Data are used to build numerical models of physical, chemical, and biological processes; these models are then linked, to model ecosystem carbon cycling and atmospheric chemistry, and extrapolated to landscape and regional scales. Regional modeling is essential, Zimmerman emphasized, “because that’s where the impacts are felt – where you live.” He termed the Black Hills (in South Dakota and Wyoming) “a great outdoor laboratory” that lends itself to the measurements needed to constrain regional models. His team is now establishing a network of field monitoring stations.

Determining how to link models built from algorithms based on physics and chemistry, across orders of magnitude in spatial and temporal scale, is, Zimmerman observed, like trying to assemble an elephant from a box of molecules without the benefit of knowing what an elephant looks like. The work is iterative and time-consuming. And modeling rangeland to quantify incremental carbon storage poses special difficulties.

But while this science is still far from precise, it’s plenty good enough to get CERC markets going and “to make a difference,” he contended. Farmers can adapt their land use to sequester CO2 now, while we develop better technologies – and the socio-political will – to cut emissions.

And how can what Zimmerman termed “enhanced ecological carbon storage” be capitalized? For a market to be viable, he’s concluded, six conditions must be met:

(1) The business-as-usual baseline must be established.

(2) The additional CO2 each landowner sequesters must be quantified.

(3) How long CO2 will remain sequestered must be forecast.

(4) No unintended, offsetting releases can be generated. Converting cropland to pasture and introducing cows, for example, would release methane, every ton of which is equivalent to 20 tons of CO2 in its effect on global warming (see the sketch after this list).

(5) Ownership of CERCs must be documented.

(6) CO2 sequestration must be verified.
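Condition (4) matters because methane offsets accumulate quickly. Here is a minimal sketch of that accounting, using the article’s factor of 20; the parcel figures are invented for illustration and are not from Zimmerman’s talk.

```python
# Illustrative only: net CO2-equivalent benefit of converting cropland
# to pasture, per condition (4). Uses the article's factor of 20 tons
# CO2e per ton of methane; all parcel numbers are hypothetical.
CH4_TO_CO2E = 20.0

def net_co2e(co2_sequestered_t: float, ch4_emitted_t: float) -> float:
    """Creditable benefit after subtracting methane offsets (t CO2e)."""
    return co2_sequestered_t - CH4_TO_CO2E * ch4_emitted_t

# 250 t CO2 stored in the soil, but the new herd emits 5 t CH4 per year:
print(net_co2e(250.0, 5.0))  # prints 150.0; methane has eaten 40% of the gain
```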

Satisfying All Six Conditions

Zimmerman and his colleagues have designed a system, C-Lock (patent pending), that he said satisfies all six conditions. Internet- and GIS-based, it creates, certifies, standardizes, and verifies CERCs for specific land parcels, by integrating data on slope, climate, soil, historical land-use variables, and other factors. Farmers can access it directly online; no middlemen are required.

To create economies of scale, so benefits exceed transaction costs, C-Lock aggregates CERCs for many small landowners. Monte Carlo analysis is used to quantify uncertainty and normalize CERCs so they have universal currency. A reserve pool of CERCs with higher uncertainty values and correspondingly lower market values can be tapped to offset fluctuations in actual soil performance. The system’s transparency facilitates four levels of verification and “audits” that employ a variety of databases and scientific tools.
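The article does not detail the Monte Carlo step, but the idea of uncertainty-discounted crediting can be sketched. In this toy version every rate, spread, and percentile is invented for illustration; it is not C-Lock’s actual method.

```python
import random

def creditable_cercs(acres: float, mean_rate: float, sd_rate: float,
                     n_trials: int = 10_000, percentile: float = 0.10) -> float:
    """Simulate a parcel's annual sequestration (t CO2/acre) many times
    and credit a low percentile of the distribution, so parcels with
    wider uncertainty earn fewer marketable CERCs."""
    draws = sorted(
        acres * max(random.gauss(mean_rate, sd_rate), 0.0)
        for _ in range(n_trials)
    )
    return draws[int(percentile * n_trials)]

# A hypothetical 500-acre no-till parcel sequestering 0.5 +/- 0.2 t CO2/acre/yr:
print(round(creditable_cercs(500, 0.5, 0.2), 1), "creditable CERCs")
```

Crediting a low percentile rather than the mean is one way to build the reserve-pool discount directly into each parcel’s marketable credits.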

And because the carbon-storage capacity of tilled soil isn’t saturated, C-Lock quantifies only changes in amounts of soil carbon. Trying to quantify absolute amounts would pose daunting soil-sampling problems. Data on land-use history are key here. Quantification needn’t be precise for each individual land parcel; just reproducible and transparent. But it must be reasonably accurate in aggregate, and the uncertainty (financial risk) should be quantified to achieve maximum value.

Launching the System

C-Lock is now equipped with GIS for South Dakota, and a trade is in the works; trades in Idaho, Montana, Wyoming, and North Dakota will follow, Zimmerman predicted, adding that C-Lock can accommodate other GHG emissions and forms of sequestration, anywhere in the world.

What’s needed for it to succeed? A pilot phase, cap-and-trade policies, and policies that define soil sequestration’s role in the GHG reduction strategy, he said. But his biggest concern is that huge environmental advantages will be lost if USDA incorporates carbon sequestration into conventional farm subsidy programs. We have an obligation to make a difference, he insisted, and we can: markets can work, benefiting farmers, ranchers, and the environment.

As measures to slow global warming develop, what role is the Academy playing? Its Environmental Sciences Section is stepping up to the plate: Chair Michael Bobker says it’s creating “a dialogue around greenhouse gases and emission trading issues, as well as carbon reduction and sequestration projects” – an initiative squarely in keeping with the Academy’s historical role as a forum for exploring and debating the scientific issues that matter most, and advancing science for the public good.

Also read: The Promise and Limitations of Carbon Capture

Sprawling Cities Can Coexist with Thriving Ecosystems

Many major urban areas are constrained in the amount of green space they can provide to residents. Encouragingly, building rooftops have emerged as a solution to this shortfall of urban green space.

Published January 1, 2004

By Peter Coles

Jacob K. Javits Center – New York City. Image courtesy of demerzel21 via stock.adobe.com.

The common image of cities as hot spots of crime and grime may need updating. They also can be havens of natural and cultural diversity – and could hold the keys to sustainable development in the 21st century.

While some 3.2 billion people – half the world’s population – are now estimated to live in towns and cities, with a growing number of poor, “urban” is by no means incompatible with “nature,” even in a major city like New York. Once rare, peregrine falcons now nest on Manhattan bridges, while a survey carried out by the Brooklyn Botanic Garden found over 3,000 species of plants within a 30-mile radius of the city – far more than in the vast cornfields of the Midwest.

The Need for Preservation

And, while the presence of man has driven some species of plant and animal close to extinction, cities may now be the only places where they are still found. Paradoxically, these species will no longer survive without human intervention to preserve them.

These topsy-turvy ideas emerged during a meeting at The New York Academy of Sciences (the Academy) in October 2003, entitled “Urban Biosphere and Society: Partnership of Cities,” co-organized with CUBES (Columbia University-UNESCO Joint Program on Biosphere and Society) and UN Habitat.

For many people, the built-up environment is the antithesis of nature, as Rutherford Platt, of the Ecological Cities Project at the University of Massachusetts Amherst, pointed out. “Nature” is somewhere else, outside the city, in a national park or some remote wilderness. But, recalling Lewis Mumford, champion of the green belt, he emphasized that not only can nature be part of a city, but cities themselves can be as much a part of nature as an anthill or a beaver colony.

Creating New Types of Habitat

Ecologists are now appreciating that cities, as well as preserving rare patches of ancient flora and fauna in parks and settler cemeteries, also present challenging new habitats, with their own adapted plants. “We are creating types of habitat that have never been seen before,” said Charles Peters, Kate E. Tode Curator of Economic Botany at the New York Botanical Garden, “like a vacant lot with 35 minutes of sunlight a day. It’s an interesting niche.”

Peters, who has been studying a 40-acre swathe of ancient oak and hickory forest in the Botanical Garden for several years, also defended the invasive species that are settling there, filling niches left by native species that have failed to adapt to an urbanized habitat. “The most important thing is that these plants continue to function, whether they’re from China or Siberia. We can’t put the forest back the way it was 200 years ago. To do that, we’d have to put the Bronx back the way it was 200 years ago. Forests are continually changing. What’s important is that the new species are controlling erosion, providing nutrients for the soil, recycling the air.”

Others argue that intact, native ecosystems, like the remnants of oak woodlands and prairies in Chicago, have a far richer biodiversity than those colonized by invasive species, and are more sustainable. Since 1996, Chicago Wilderness, a loosely structured coalition that today comprises over 165 associations, institutions, and organizations, has been working to restore biodiversity in the Windy City, which is visited by some 6 million neo-tropical birds every year on their way to and from Canada.

Retaining Residents

Chicago’s city fathers, explained John Rogner, Chicago Field Supervisor of the Fish and Wildlife Service of the Department of the Interior, bought patches of oak woodland and prairie to prevent them from being developed. Their argument was that a beautiful environment, with access to nature, would stop residents – the city’s life force – from moving away.

After three years of research, including “bio blitzes” in which local residents and children help scientists count species, Chicago Wilderness established a “biodiversity recovery plan.” With a wide range of projects, such as ridding the oak woodlands of tenacious but non-native buckthorn, the consortium is also helping to restore brown-field sites, like Calumet, south of the city, which ironically harbors several endangered and threatened bird species, surviving amid the derelict steel plants and toxic waste dumps.

Mark Wigley, interim dean of the Graduate School of Architecture at Columbia University, suggested that for most people “old cities are the heroes, and new cities are the villains.” But this idealized image leaves out the crime, open sewers, disease, and overcrowding characteristic of city life in the Middle Ages.

For Robert Pirani, director of environmental programs for the New York Regional Plan Association, the “villain” today is not so much the post-industrial downtown as it is suburban sprawl. In the past 10 years, he said, developed land in the New York area has expanded by 100%, while population has increased by less than 10%.

Sprawling Cities

This means “thousands of homes surrounded by lawns, and shopping malls surrounded by parking lots,” he said. According to Rutherford Platt, this trend can be seen across the U.S., where the suburban population has increased fourfold since 1950, compared to an 85% increase in the overall population. And, he added, car ownership in the U.S. has risen by 100% since 1970, while population increased by 40%. In Atlanta, which has been dubbed “sprawl city,” drivers spend an average of 72 hours a year in gridlock, he said.

If sprawl is a middle-class phenomenon in developed countries, however, it is associated with poverty in much of the global South. While 82% of Brazil’s population lives in cities, said Rodrigo Victor, of the São Paulo Biosphere Reserve, some 23% of the population of São Paulo lives in shanty towns, mostly on the edge of the Green Belt Biosphere Reserve that surrounds the city, a part of the Atlantic Forest Biosphere Reserve.

With a global trend towards urban living – by 2030, two-thirds of the world’s population will live in cities – the challenge is to find sustainable solutions to urban growth. One approach, according to freelance journalist Helen Epstein, is through architecture itself.

A new generation of high-performance buildings attempts to behave more like natural systems, with on-site water management, passive solar energy production, and natural lighting and ventilation reducing their “footprint,” or impact on limited natural resources. An example is the Solaire housing development in Battery Park City, Manhattan. But, as architect Ernie Davis, mayor of Mount Vernon, New York, pointed out, these buildings are not usually for the poor, whereas project housing, which is designed to look as though it’s for the disadvantaged, does not have advanced design features.

Green Rooftops in South Korea

Green rooftops also offer a solution, as Kwi-Gon Kim, professor of landscape architecture at Seoul National University, South Korea, demonstrated. With 42% of Seoul covered by buildings, landscaping rooftops could add an estimated 200 square kilometers of green space to the city, about 30% of the Seoul area. In an experimental green roof project on top of UNESCO’s downtown Seoul office, just five months after its construction the 75 species of plant introduced at the outset had been joined by an additional 39 species, presumably from surrounding green belt areas, while 37 species of insect had colonized the site.
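That arithmetic is roughly consistent with Seoul’s land area of about 605 square kilometers – a figure assumed here, since the article does not give it:

```latex
0.42 \times 605~\mathrm{km^2} \approx 254~\mathrm{km^2}\ \text{of rooftop};
\qquad
\frac{200~\mathrm{km^2}}{605~\mathrm{km^2}} \approx 33\%
```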

Seoul was one of 11 cities invited by CUBES to prepare case studies to see whether, and how, the UNESCO “biosphere” model could be applied to urban areas. This model, designed 30 years ago, has since been applied in 440 UNESCO Man and the Biosphere (MAB) sites in 97 countries. These are areas of terrestrial and coastal ecosystems that reconcile the conservation of biodiversity with its sustainable use.

They are internationally recognized, nominated by governments, and remain under the sovereign jurisdiction of the states in which they are located. Usually, they consist of a “core” area that has minimum human impact, surrounded by a “buffer zone” and a “transition” area, with increasing levels of social and economic activity, respectively. But, while some of the sites adjoin cities (like São Paulo), to date there is no urban biosphere site as such.

A Future Urban Biosphere Site

Cape Town, South Africa, which is already surrounded by three natural biosphere reserves, gives some clues as to what a future urban biosphere site might look like, although it is just a theoretical case study at this stage. As Ruida Stanvliet, of the Western Cape Nature Conservation Board, illustrated, the nine provinces in the region house 3.5 million people, some of them affluent and white, living in suburbs, while much of the black population lives in extreme poverty, in temporary housing and with a high incidence of HIV/AIDS. Nonetheless, the area boasts a rich biodiversity, with some 9,000 plant species.

“Environment conservation is crucial for poverty alleviation,” said Stanvliet. “It connects people to their sustainable resource base.” And in Cape Flats, one area in the Cape Town urban biosphere case study, over 20% of the people live in sprawling, informal settlements. In some communities, 70% live on less than $1 a day, and only 36% of adults are in paid employment.

The windswept mosaic of dunes and wetlands of Cape Flats is where victims of apartheid were relocated. Now, in a pilot initiative, the City of Cape Town has joined with the Botanical Society of South Africa, the National Botanical Institute and the Table Mountain Fund, to form Cape Flats Nature. This project focuses on conservation and restoration of biodiversity in several sites, enlisting the participation of local people through educational programs.

The Cape Flats Nature project has a certain resonance with the Chicago Wilderness brown-field development in Calumet, halfway across the globe from Cape Town. This linking of cities, at least informally, was one of the ambitions of the Academy/CUBES meeting.

As Many Questions as Answers

The meeting raised as many questions as it answered, but that was another of its ambitions. In a city like New York, where would the “core” of a biosphere site be? For William Solecki, of the Department of Geography at Hunter College, City University of New York, it could be the harbor and estuary area, which is historically the focus of human activity in the city, while pockets of intact wetland survive in adjacent Jamaica Bay.

And the “buffer zone” might be the watershed in the Catskills that feeds the “core.” Indeed, as Christopher Ward, commissioner of the New York City Department of Environmental Protection, explained, New York was able to avoid spending millions of dollars on a new water treatment plant by investing in protection of the watershed.

This inclusion of more distant areas in the biosphere of a city like New York is a way to acknowledge that its “footprint,” unlike that of an equivalent natural area, can even extend thousands of miles. The coffee consumed in New York has a direct impact on plantations as far away as Bolivia, which, incidentally, is where some of the migrant warblers come from that feed in Central Park every May and October. Food for thought.

Also read: The Impact of Climate Change on Urban Environments


About the Author

Dr. Peter Coles is a freelance science writer and photographer living in Paris, France.

A New Blueprint for Effective Green Architecture

From local sourcing of materials to utilizing renewable energy, the sustainable building design revolution has transformed the way that architects and engineers approach construction.

Published November 1, 2003

By Jeffrey Penn

Image courtesy of ArLawKa via stock.adobe.com.

As environmental awareness spreads around the globe, the so-called “greening” of architecture has ignited a revolution in the design and construction of buildings, according to one of the nation’s leading experts in the field.

“The concept of sustainable building design has led to a new architectural vocabulary – known as ‘green buildings’ – that is transforming the way we act and think about the environment and the buildings we construct,” said Hillary Brown, in a talk titled “Visioning Green: Advances in High-Performance Sustainable Building Design,” delivered at an August 26, 2003, meeting cosponsored by The New York Academy of Sciences (the Academy) and the Bard Center for Environmental Policy.

Former director of Sustainable Design for the New York City Department of Design and Construction, Brown now heads her own firm, New Civic Works, which specializes in helping local government, universities and the nonprofit sector incorporate sustainable design practices into their policies, programs, and operations.

“These new practices are beginning to catalyze not only the construction industry, but also the wider society” as people learn about the issues at stake, Brown said. “All sectors are mobilizing around sustainable building design.”

Paying Attention to Nature

The context for this architectural revolution, Brown said, is “the increased recognition that buildings can contribute directly toward a healthy environment in which to live and work.”

Brown presented a blueprint for “green principles” in new buildings, including climate-responsive designs and an understanding of the relationship between the building and its location. “In this view, water, vegetation and climate are taken into account in the design of the building, with special attention paid to how the building’s infrastructure affects its surroundings,” she said.

“Nature and natural processes should be made visible in green buildings,” Brown added, noting that the form and shape of the building should take into account the interactions between the occupants and the building itself.

“Technology often displaces our connection to the natural world,” Brown contended. Green buildings, she pointed out, “help to improve a sense of health and well being as occupants are put in touch with their natural surroundings.”

According to Brown, studies show that “people are more comfortable in green buildings than conventional buildings.” She asserted that four factors have a substantive impact on performance and mood inside buildings: air quality, thermal comfort, amount of natural light, and appropriate acoustics.

Minimizing Waste of Resources

In addition to aesthetics and comfort, green buildings respond to ecological concerns by “minimizing the impact of human activity in lowering the levels of pollution during both the construction and maintenance of the building,” Brown said.

“Conventional methods of building design and construction lead to depletion of natural resources,” she added, “especially because carbon-based fuels are used extensively during construction and in the operation of the buildings’ infrastructure after completion. Green buildings attempt to minimize the waste of water, energy, and building materials,” Brown said. Within the construction industry, architects and builders have set goals to substantially reduce emissions of carbon dioxide during construction and operation of buildings.

Brown noted that green buildings employ the use of daylight in combination with high-efficiency lighting. Use of horizontal “light shelves” and other well-designed building apertures, for example, can reflect daylight deeper into buildings, displacing the need for artificial lighting. Other passive comfort-control techniques include the use of natural ventilation and an improved building envelope to reduce dependence on mechanical systems. Still other green buildings are cooled/heated by utilizing the constant ground temperatures of the earth as a heat source or heat sink.

Designers of green buildings also seek to reduce or eliminate construction materials that contain unstable chemical compounds that, as they cure over time, are released into the environment – such as adhesives, sealants and artificial surfaces. “We need to think about eliminating these noxious chemicals from the building palette,” Brown said.

In addition, Brown said that architects are paying more attention to recycled and local materials in construction. “The selection of local and regional materials means a lower consumption of transportation energy during construction,” she noted. Brown also encouraged the increased use of renewable materials – such as bamboo – and of wood products “certified” as grown in renewable forests.

Improving Public Spaces

Although architects and builders have been slow to integrate “green principles” into most residential blueprints, Brown cited their incorporation into public buildings such as courthouses, libraries, performance spaces, and schools.

She cited a study from California that revealed elementary students in classrooms with the most daylight showed a 21% improvement in learning rates when compared to students with the least amount of daylight in their classrooms.

For businesses, Brown said, improved air quality would likely reduce absenteeism from asthma and other respiratory diseases, lower other health-related costs, and generally improve productivity in the workplace. Although she acknowledged that the average well-designed green building might have a slightly higher initial construction cost – up to 3% – she stressed that the long-term savings in operating expenditures can be 33% or more.
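To see how those two percentages interact over a building’s life, here is a toy comparison; only the 3% premium and 33% savings come from Brown’s talk, and every dollar figure is invented.

```python
# Illustrative only: lifetime cost of a conventional vs. a green building,
# using Brown's figures (up to 3% higher first cost, ~33% lower operating cost).
def lifetime_cost(first_cost: float, annual_operating: float, years: int) -> float:
    return first_cost + annual_operating * years

conventional = lifetime_cost(10_000_000, 400_000, 30)         # hypothetical $10M building
green = lifetime_cost(10_300_000, 400_000 * (1 - 0.33), 30)   # +3% first cost, -33% opex
print(f"{conventional:,.0f} vs {green:,.0f}")  # 22,000,000 vs 18,340,000 over 30 years
```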

Brown also said urban streetscapes should employ sustainable design practices, including efforts to reduce the “heat-island effect” with increased planting of trees and use of light- or heat-reflective materials in sidewalks, streets, and roofing membranes. In addition, she cited opportunities for improved water resource management by recycling once-used tap water from sinks for irrigation and cleaning, and by installing green roofs or other systems that harvest usable storm water from the roofs of buildings.

‘Civic Environmentalism’

Brown said that although there are still some barriers to incorporating green principles in construction – such as increased costs, the difficulties of apportioning savings between tenant and developer, and various regulatory disincentives – the federal government, several states, and many municipalities are beginning to demand or incentivize green buildings. She predicted that building and zoning codes would eventually reflect more adequately the interest in green buildings as society embraces what she called “civic environmentalism.”

Also read: Green Buildings and Water Infrastructure

A Scientific Roadmap to the Hydrogen Economy

With advances in hydrogen technology, including hydrogen-powered vehicles, we can potentially lessen our reliance on carbon-based fossil fuels.

Published November 1, 2003

By Dan Van Atta

Image courtesy of Pongsakorn via stock.adobe.com.

Picture a world economy built around the profitable production of non-polluting and endlessly renewable energy supplies – a global society freed from the shackles of dependence on oil, coal and other carbon-based fossil fuels.

Such a scenario has long been the vision, or dream to skeptics, of Dr. Amory B. Lovins, co-founder and CEO of the Rocky Mountain Institute (RMI), whose widely published views on environmental and energy-related topics have gained him global recognition for more than three decades. Lovins described his “Roadmap to the Hydrogen Economy” to a crowded meeting room of both skeptics and believers at the Environmental Science Forum held September 4, 2003, at The New York Academy of Sciences (the Academy).

Hypercar® vehicles – ultralight, ultra-low-drag, and originally based on hybrid gasoline-electric designs – were invented at RMI in 1991 and are the most attention-getting route to energy efficiency on Lovins’s roadmap. At that time, hybrid-electric propulsion, invented by Dr. Ferdinand Porsche in 1900, was still thought to be decades away, but Honda introduced the hybrid Insight in the United States in 1999, and Toyota debuted its hybrid Prius in the U.S. in 2000. DaimlerChrysler, Ford Motor Company, and General Motors have all announced hybrid vehicles for release in the next year or two.

Eliminating the Need for Internal Combustion Engines


Today, Lovins told the gathering, hydrogen could be used in combination with advanced fuel-cell technology to eliminate the internal combustion engine altogether, powering a new generation of ultra-high-efficiency hypercar-class vehicles. And, he added, hydrogen-powered fuel cells that can provide economical on-site electricity to business and residential buildings can set the hydrogen economy in motion – greatly accelerating the hydrogen transition that has led Honda and Toyota already to market early (and correspondingly expensive) hydrogen-fuel-cell cars, with three more automakers set to follow suit by 2005 and another six by 2010.

“U.S. energy needs can be met from North American energy sources, including local ones,” he said, “providing greater security.” Hydrogen production just from available windy lands in the Dakotas, he said, could fuel all U.S. highway vehicles at hypercar-like levels of efficiency.

Along with a more secure domestic energy supply, moreover, Lovins said the transition from a fossil fuel-based to a hydrogen-based economy would offer a “cleaner, safer and cheaper fuel choice” that could be very profitable for both the oil and auto industries. “Hydrogen-ready vehicles can revitalize Detroit,” Lovins said.

Molecular hydrogen (H2) – a transparent, colorless, odorless and nontoxic gas – is the lightest-weight element and molecule. One kilogram of H2 packs the same energy content as a gallon of gasoline weighing almost three times as much. It’s far bulkier, too, but that may be acceptable in uses where weight matters more than bulk, such as efficient cars.
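The approximate numbers behind that comparison are standard heating values, not figures from the talk: hydrogen carries roughly 120 MJ/kg, gasoline about 43 MJ/kg, and a U.S. gallon of gasoline weighs about 2.8 kg.

```latex
1~\mathrm{kg~H_2} \approx 120~\mathrm{MJ}
\qquad\text{vs.}\qquad
1~\text{gal gasoline} \approx 2.8~\mathrm{kg} \times 43~\mathrm{MJ/kg} \approx 120~\mathrm{MJ}
```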

And hydrogen is in abundant supply: it can be readily derived from water, as well as from natural gas or other energy sources.

An Energy Carrier, Not an Energy Source

Unlike crude oil or coal, however, hydrogen is not an energy source. Rather, it is an energy carrier, like electricity and gasoline, which is derived from an energy source – and then can be transported.

“Hydrogen is the most versatile energy carrier,” Lovins said. “It can be made from practically anything and used for any service. And it can be readily stored in large amounts.”

Hydrogen is almost never found in isolation, however, but must be liberated – from water by electrolysis, which requires electricity; from hydrocarbons or carbohydrates using thermocatalytic reformers (which typically extract part of the hydrogen from water); or by other, currently experimental, methods.

About 8% of the natural gas produced in the U.S. is now used to make 95% of America’s industrial H2, Lovins said. Only 1% is made by electrolysis, because that’s uneconomic unless the electricity is extremely cheap. And less than 1% of hydrogen is delivered in super-cold liquid form, mainly for space rockets, because liquefaction too is very costly. But, Lovins noted, there’s already a major global H2 industry, making one-fourth as much annual volume of H2 gas as the natural-gas industry produces, and already demonstrating safe, economical production, distribution and use.

Proper Handling of a “Hazardous Material”

A highly concentrated energy carrier, hydrogen is by definition a hazardous material. But because H2 burns in “a turbulent diffusion flame – it won’t explode in free air,” Lovins said, the gas consumes itself rapidly when it ignites, rising up and away from people on the ground because it is extremely buoyant and diffusive. Its clear flame, unlike hydrocarbon flames, can’t sear victims at a distance by radiated infrared.

As a result, he said, nobody aboard the Hindenburg (a hydrogen dirigible whose 1937 flammable-canopy and diesel-oil fire killed 35% of those aboard) was killed by the hydrogen fire. The modern view, he reported, is that hydrogen is either comparable to or less hazardous than common existing fuels, such as gasoline, bottled gas and natural gas.

News media interest in the potential of hydrogen-fueled electric vehicles run by emission-free fuel cells was piqued after President George W. Bush mentioned the technology in his State of the Union address this year. But Lovins noted that evaluating the technology requires an understanding of unfamiliar terms and concepts that cut across disciplines, often confusing both supporters and critics.

To explain the fuel cell, Lovins referred to the common electrolysis experiment that many students remember from their high school chemistry class. An electric current is passed through water in a test tube, splitting the water into bubbles of hydrogen and oxygen.

The proton-exchange membrane (PEM) fuel cell does the same thing in reverse: it uses a platinum-dusted plastic membrane to combine oxygen (typically supplied as air) with hydrogen, generating electricity. The only by-product is pure hot water. The reaction is electrochemical, takes place at about 80 degrees Celsius, and involves no combustion.
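The electrode reactions underneath that description are standard electrochemistry (not notation from Lovins’s talk):

```latex
\text{Anode:}\quad \mathrm{H_2} \rightarrow 2\,\mathrm{H^+} + 2\,e^-
\qquad
\text{Cathode:}\quad \tfrac{1}{2}\,\mathrm{O_2} + 2\,\mathrm{H^+} + 2\,e^- \rightarrow \mathrm{H_2O}
```

Protons cross the membrane while electrons take the external circuit, doing electrical work before recombining with oxygen at the cathode.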

No Carbon Dioxide Emissions

Conventional electric generating plants make power by burning carbon-based fossil fuels (coal, oil or natural gas), or by means of costly nuclear fission, to heat water and turn large steam-turbine generators. (Hydroelectric plants use water to turn the turbines.) While fuel cells do not release carbon dioxide and other emissions, they are not yet economically competitive with fossil fuels for large, centralized electricity generation. However, Lovins said, at the point of actual use, such as the light or heat delivered in a building or the traction delivered to the wheels of an electrically propelled vehicle, mass-produced fuel cells can offer a highly competitive alternative to conventional technology.

“A fuel cell is two to three times as efficient as a gasoline engine in converting fuel energy into motion in a well-designed car,” Lovins said. “Therefore, even if hydrogen costs twice as much per unit of energy, it will still cost the same or less per mile – which is typically what you care about.”

“If you buy gasoline for $1 a gallon, pre-tax, and use it in a 20-mile-a-gallon vehicle, that’s a nickel a mile,” Lovins continued. “If you reform natural gas at a rather high cost of $6 per million BTU in a miniature natural gas reformer, you get $2.50 per kilogram of hydrogen, which has an energy content equivalent to gasoline at $2.50 a gallon.”

That sounds expensive. But used in an ultralight and hence quintupled-efficiency hydrogen-fuel-cell powered hypercar vehicle, he added, that translates to a cost of 2.5 cents a mile. Or more conventionally, Lovins reported, in Toyota’s target for a fuel-cell car – 3.5 times more fuel efficient than a standard gasoline car – the same hydrogen would yield an operating cost of 3.3 cents per mile, still well under today’s gasoline cost.
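Those figures check out under simple assumptions. In the sketch below, the miles-per-gallon-equivalent values are back-calculated from the article’s multipliers rather than taken from the talk; note that a straight 3.5x scaling gives about 3.6 cents per mile, so the 3.3-cent figure presumably folds in further Toyota-specific assumptions.

```python
# Reproducing Lovins's cents-per-mile arithmetic (illustrative check only).
def cents_per_mile(price_per_gallon_equiv: float, mpg_equiv: float) -> float:
    return 100.0 * price_per_gallon_equiv / mpg_equiv

print(cents_per_mile(1.00, 20))        # 5.0  -- $1/gal gasoline in a 20-mpg car
print(cents_per_mile(2.50, 5 * 20))    # 2.5  -- $2.50/kg H2 in a 5x-efficiency hypercar
print(cents_per_mile(2.50, 3.5 * 20))  # ~3.6 -- $2.50/kg H2 at Toyota's 3.5x target
```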

Peak Aerodynamic Efficiency

Designed for peak aerodynamic efficiency, cutting air drag by 40% to 60% from that of today’s vehicles, hypercar vehicles would be constructed using molded carbon-fiber composites that can be stronger than steel, but more than halve the car’s weight – the key to its efficiency. Such vehicles could use any fuel and propulsion system, but would need only one-third the normal amount of drive-power, making them especially well-suited for direct-hydrogen fuel cells.

That’s because the three-times-smaller fuel cell can tolerate three-times-higher initial prices (so fuel cells can be adopted many years sooner), and the three-times-smaller compressed-hydrogen fuel tanks can fit conveniently, leaving lots of room for people and cargo. Replacing internal combustion engines – and related transmissions, drive-shafts, exhaust systems, and so on – with a much lighter, simpler, and more efficient fuel cell amplifies the savings in weight and cost.

Carbon-fiber composite crush structures can absorb up to five times as much crash energy per pound as steel, Lovins said, as has been validated by industry-standard simulations and crash tests. The carbon-fiber composite bodies also make possible a much stiffer (hence sportier) vehicle, Lovins said, adding: “It doesn’t fatigue, doesn’t rust, and doesn’t dent in 6-mph collisions. So I guess we’ll have to rename fender-benders ‘fender-bouncers.’”

The main obstacle to making ordinary cars out of carbon-fiber composites – now confined to racecars and million-dollar handmade street-licensed versions – has so far been their high cost. But Lovins said Hypercar, Inc.’s Fiberforge™ process is expected to offset the costlier materials with cheap manufacturing “that eliminates the body shop and optionally the paint shop – the two biggest costs in automaking. This could make possible cost-competitive mid-volume production of carbon-composite auto-bodies, unlocking the potential of hypercar designs.”

Making the Transition

Some 156 fuel-cell concept cars have been announced. In mass production, Lovins added, investment requirements, assembly effort and space, and parts counts would be “perhaps an order-of-magnitude less” than conventional manufacturing. With aggressive investment and licensing, initial production of the first hypercar vehicles could “start ramping up as soon as 2007 or 2008.”

Lovins acknowledged that transitioning to a hydrogen economy creates something of a “chicken and egg” conundrum. How can you ramp up mass production of hydrogen-fueled cars in the absence of ubiquitous fuel supplies? And who will invest in building that refueling system before the market for it exists?

Fuel cells used to provide electricity for offices and residential buildings, Lovins said, can hold the answer. “You start with either gas or electricity, whichever is cheaper (usually gas), and use it to make hydrogen initially for fuel cells in buildings, where you can reuse the ‘waste’ heat for heating and cooling and where digital loads need the ultra-reliable power. Buildings use two-thirds of the electricity in the country,” he added, “so you don’t need to capture very much of this market to sell a lot of fuel cells.” Tellingly, the fuel-cell-powered police station in Central Park kept going right through the recent New York blackout, he noted.

Leasing hydrogen-fueled cars to people who already work or live in buildings that house fuel cells would create a perfect fit, Lovins suggested. For a modest extra investment, the excess hydrogen not needed for the building’s fuel-cell generators could be channeled to parking areas and used to re-fuel the fuel-cell cars. This would permit a novel value proposition for car owners, whose second-biggest household asset sits 96% idle: Lovins said the hydrogen-powered fuel-cell cars could constitute a fleet of “power plants on wheels.”

A Need for More Durable Fuel Cells

During working hours, when demand for electricity peaks, he said the fuel cells in parked cars could be plugged in, “selling power back to the electric grid at the time and place that they’re most valuable, thus earning back most or all of the cost of owning the car: the garage owner could even pay you to park there.”

While today’s PEM fuel cells can be “better than 60 percent efficient,” Lovins acknowledged that more durable fuel cells are needed, and that mass-production is needed to bring down their cost. Eventually, he added, efficient decentralized reformers could be placed conveniently around cities and towns, mainly at filling stations.

No technological breakthroughs are needed, Lovins said, to reach the hydrogen economy at the end of his roadmap. “The hydrogen economy is within reach” – if we do the right things in the right order, so the transition becomes profitable at each step, starting now.

“[Sir Winston] Churchill once said you can always rely on the Americans to do the right thing,” Lovins concluded, “once they’ve exhausted all the alternatives.” We’re certainly, he wryly added, “working our way well down the list. But, as Churchill also said, ‘Sometimes one must do what is necessary.’”


Adding fuel to the discussion, Dr. Klaus S. Lackner, the Ewing-Worzel Professor of Geophysics in the Department of Earth and Environmental Engineering and The Earth Institute at Columbia University, briefly responded with some thoughts on Lovins’s proposals.

Other Points of View

After agreeing that “things will have to change, business as usual will not work,” due mainly to the need to curb carbon dioxide emissions, Lackner raised a number of issues he believes proponents of the hydrogen economy should consider.

For example, Lackner said off-peak power costs should not be used to calculate the cost of producing hydrogen fuel from electricity, as the hydrogen-generation industry will “destroy” the structure of off-peak pricing. “There may be a benefit to the electricity market in that power generation profiles become flatter, but this will be a benefit to people running air conditioners at 4 p.m., not to the hydrogen economy.”

“Hydrogen will be made from fossil fuels,” Lackner stated, “because it is much cheaper than by any other route.” He also noted that fuel cells and hydrogen are not synonymous. “Hydrogen can work without fuel cells, and fuel cells can work without hydrogen.” Although Lovins’s vision emphasizes PEM fuel cells, Lackner added that “some fuel cells run on methane. You can use any hydrocarbon you like; we can debate which is the best fuel.”

Many Competing Options

It’s also important to remember that hydrogen is an energy carrier, like electricity, not an energy source. “One needs to compare the advantages of hydrogen as an energy carrier with those of other energy carriers,” Lackner said.

Regarding Lovins’s designs for ultralight hypercar vehicles, Lackner said there are many competing options for changing the internal combustion engine. “It’s not fair to compare old-fashioned conventional cars, on the one side, with the new, fancy cars on the other side. We need to compare each of the potential energy carriers side by side, and not assume that the competition stands still.”

Lovins largely agreed with these comments, but felt that they didn’t affect the validity of his recommendations.

Also read: Better Batteries for Electric Cars