A Neuroscientist’s Search for Memory

In celebration of his new memoir, the Nobel Prize-winning neuroscientist recounted many formative episodes from his life in science.

Published May 5, 2006

By Carl Zimmer

Sponsored by: The New York Academy of Sciences and W.W. Norton

Memory allows us to do more than just store telephone numbers and directions to the post office. It is a repository for lost worlds, which we can recreate years later. The Nobel Prize-winning neuroscientist Eric Kandel did just that during his lecture at The New York Academy of Sciences on March 2, 2006, as part of the Readers & Writers lecture series. Kandel, now 76, drew his audience back to his youth in Vienna in the 1930s. To convey the trauma of being a Jewish boy during the Nazi occupation of Austria, he recalled his ninth birthday.

“I’d gotten a number of toys; the most magical was a little shiny car I could control remotely,” Kandel said. But that joy turned to terror. It was 1938, the year the Nazis had invaded Austria. “Two days later, Nazi police officers came and told us we had to leave the house,” Kandel recalled. “They sent us to live with another family. When we came back, the apartment had been essentially emptied out. Everything was gone.”

Part Memoir, Part Intellectual History

Today Kandel understands a great deal about how he can manage to hold memories such as these. He escaped from Austria to the United States, where he trained as a neuroscientist. He went on to have a spectacular career probing the biological basis of the mind, winning the Nobel Prize in Physiology or Medicine in 2000. Kandel has woven together recollections of his life, his research, and the evolution of modern neuroscience into a memoir and intellectual history, In Search of Memory: The Emergence of a New Science of Mind.

Kandel attended Harvard, where he discovered psychoanalysis. He became convinced that it would allow him to understand both the rational and irrational sides of mankind. “This was the royal road to understanding the mind,” he said.

He entered medical school to be a psychoanalyst, but he had an unconventional idea about what his training should include. “I thought to be a psychoanalyst, it would be useful to know something about the brain,” he said. In the 1950s, most psychiatrists paid little heed to the actual structure and function of the brain. And neuroscience itself hardly existed as a unified discipline.

Eventually, Kandel ended up at the laboratory of Harry Grundfest at Columbia University. At first Kandel had the wild ambitions of someone who has not yet actually tried to study the brain. “I said, ‘I’d like to see if I can help localize where the ego, the superego, and the id are localized in the brain,'” Kandel recalled, laughing. “Grundfest looked at me like I was out of my mind. But rather than kicking me out, he said, ‘This is beyond the grasp of neuroscience today.'”

Cellular Psychoanalysis

Grundfest directed Kandel to more manageable experiments. In Grundfest’s lab he began recording the activity of neurons in crayfish. “In those days the output of the amplifier was hooked up to a loudspeaker so you could hear each action potential. Boom, boom, boom!” Kandel said. “It was fantastic. Here I was listening to the signals coming from the brain of the crayfish. This was true psychoanalysis at the cellular level.”

In place of his dream of finding Freud in the brain, Kandel decided to chase a dream that was only a little less grand. He would find the biological basis of memory. In the mid-1950s, neuroscientists recognized two different kinds of memory: short-term and long-term. Damage to the brain could harm short-term memory without affecting long-term memory. One small region of the brain, known as the hippocampus, appeared to be one of the key regions for allowing us to remember.

Kandel set out to investigate the hippocampus, hoping to find something distinctive about its cells. But nothing earth-shattering emerged. It was possible that what was important for memory was not individual neurons, but how they were arranged in a network and communicated with one another. The millions of neurons in the hippocampus would be too complex to analyze. So he needed to find a simpler system. “I thought, the way you solve a problem in biology is you solve its simplest representation.”

Aplysia’s 20,000 Neurons

The ideal system turned out to be the marine snail Aplysia. It had only 20,000 neurons, as opposed to the 100 billion neurons in the human brain. And its neurons were big—the biggest of any animal, in fact. Kandel was able to study memories in Aplysia by training it. He would nudge the snail before applying a jolt, and it would learn to associate the two sensations. Kandel could then compare the biochemistry of the neurons before and after it had recorded memories of this uncomfortable experience.

Working with Aplysia, Kandel and his colleagues demonstrated that short-term memory formed through the strengthening of connections between neurons. They even identified some of the molecules that made that strengthening possible. For long-term memories, it was necessary to switch on genes in neurons in order to make new proteins, and to make new connections. Once Kandel began to feel confident that he had figured out Aplysia, he moved back to the hippocampus in mice, discovering that many of the same genes and proteins also played an important role in their memories—and, by extension, human memories.

Kandel was recognized for this pioneering work with a Nobel Prize in 2000, but the award hasn’t slowed him down. He is writing a flurry of books, including the newest edition of his doorstop-sized textbook on neuroscience. While studying mice in the 1990s, Kandel began to investigate the molecular changes that occur as the animals get old. Insights from these experiments led to the founding of Memory Pharmaceuticals. The company is now conducting clinical trials on drugs that may boost the cognitive skills of people suffering from Alzheimer’s disease and age-related memory loss.

Rogue Proteins

Meanwhile, Kandel and his postdoctoral fellow, Kausik Si, have opened up an entirely new front in the search for memory: the possibility that it shares something in common with mad cow disease. Researchers have shown that mad cow disease is caused by rogue proteins that fold into abnormal shapes, known as prions. Once prions form, they acquire the ability to force other proteins to assume the same shape. Under some circumstances, this runaway shape-change can cause devastating diseases.

But prions may play a helpful role in organisms. Collaborating with Susan Lindquist at the Whitehead Institute for Biomedical Research, Kandel and Si have found that some of the molecules involved in forming long-term memories show signs of behaving like prions in yeast cells. Kandel and Si propose that memories may be stabilized by self-perpetuating proteins. Individual proteins in neurons may be short-lived, but prions might be able to pass on their functional state to other molecules for years.

It’s a hypothesis that demands more experiments, a prospect that delights Kandel. “This gives me unending pleasure,” said Kandel, “because I can’t think of anything else I’d rather do.”

About the Speaker

Eric Kandel, MD, is University Professor at Columbia University, Kavli Professor and director of the Kavli Institute for Brain Sciences, and senior investigator at the Howard Hughes Medical Institute. He received the Nobel Prize in Physiology or Medicine in 2000 and is a member of the President’s Council of the Academy. He is also the author of In Search of Memory: The Emergence of a New Science of Mind (W. W. Norton).

Race and Health Inequalities in Medicine

A recent conference brought together medical professionals and other researchers to explore the intersection of race, genomics, and health inequities.

Published April 14, 2006

By Alan Dove, PhD

Romulus. Cain. Mr. Hyde. Literature abounds with evil twins. Far less famous, but far more dangerous, are the malevolent siblings that haunt nearly every branch of science. Their fortunes wax and wane, but they never really die. Alchemy now tinkers quietly in chemistry’s attic, but astrology is a media darling that often eclipses her nerdy sister, astronomy. Meanwhile, physicists and mathematicians hear the constant pitter-patter of new generations of cranks, each busily perfecting perpetual motion or squaring circles in his own little corner. A whole gaggle of quackeries trails medical science.

For sheer Gothic horror, though, the doppelganger stalking genetics has no equal. Chiropractors may rip an occasional carotid artery, and a rogue mathematical hermit might send the odd mail bomb now and again, but only eugenics—the belief that the human race can be improved by selective breeding—has spawned continent-wide slaughters of millions of innocent people. Even while genome sequencers amass gigabytes of data supporting the unity of mankind, geneticists continue to look over their shoulders, worried that someone could pervert their findings to terrible new uses.

On December 9, 2005, biological and social scientists met at Hunter College for an interdisciplinary discussion of a particularly dangerous area: the intersection of minorities, genomics, and health inequities. The presentations ranged freely across this contentious triple border, exploring everything from drug development technology to racial profiling. The conference, the 19th Annual International Symposium of the Center for the Study of Gene Structure and Function, featured nearly a dozen excellent talks. A few highlights provide a representative sample of the breadth and depth of topics.

From Racial Profiling in Drug Development to the Genetics of Homosexuality

Conference organizers took a wide view of the term “minorities,” and the discussion spanned everything from the racially profiled drug BiDil to the genetics of homosexuality. Some common themes emerged from these diverse research projects, though, including an enduring division between biological and social scientists on the potential of genomics. While many biologists tend to view the progress of genomics as an unmitigated boon, social scientists remain wary of the new field’s potential for misuse.

Troy Duster of New York University brought this discussion to the biologists’ doorstep, arguing that the science of genomics is already on a collision course with entrenched social problems of race. The completion of the human genome sequence was originally hailed as a landmark demonstration that human races are more similar than they are different, but in recent years old racial concepts have started to creep back into the core of genomics.

As one example, Duster cited BiDil, the first drug with a race-specific prescription label. The original clinical trial of BiDil, a heart disease treatment that targets nitric oxide metabolism, was a failure. NitroMed, the drug’s manufacturer, reanalyzed the data and found that a subset of African American patients benefited from BiDil. The U.S. Food and Drug Administration found the reanalysis persuasive, but Duster systematically dismantled NitroMed’s underlying reasoning.

Should We Have Race-specific Drugs?

“The question for me is, is it nitric oxide deficiency? If this is the case it should be available to all those who have this deficiency; it should not be racialized. What I am opposed to…is the notion that we can [get] a molecular understanding of race with this kind of research,” says Duster.

More ominously, police forces have begun testing new genomic technologies that claim to provide racial profiles from DNA found at crime scenes. This DNA dragnet might catch more criminals, or it might merely amplify existing racial disparities in law enforcement.

David Williams of the University of Michigan underscored the depth of the racial divide with some well-known but still shocking statistics. African Americans lead white Americans in 12 of the top 15 causes of death, translating to about 97,000 excess deaths per year. How many of those 265 deaths a day are due to endemic racism?
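
The per-day figure follows from simple division; a one-line back-of-envelope check, assuming the quoted 97,000 annual excess deaths and a 365-day year:

```python
# Back-of-envelope check: annual excess deaths spread over a 365-day year.
excess_deaths_per_year = 97_000  # figure quoted by Williams
deaths_per_day = excess_deaths_per_year // 365  # whole deaths per day
print(deaths_per_day)  # 265
```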

On the evidence, quite a few. For example, one of the leading causes of death among blacks in the U.S. is heart disease. Some of that disparity could be genetic, but Williams points out that black Africans have lower rates of hypertension than black or white Americans, clearly pinpointing environmental factors as the main cause. Poor health correlates strongly with low socioeconomic status, but even controlling the data for status does not eliminate the black-white skew. Something other than biology or money is causing the problem.

Racism May Literally Make African Americans Sick

Citing data that show a strong link between perceived racism and cardiovascular disease, Williams argues that a daily grind of inferior treatment can literally make African Americans sick. Another, more pervasive factor is the stubbornly entrenched segregation of housing: even four decades after the Civil Rights Act, black Americans are overwhelmingly concentrated in neighborhoods with fewer jobs, less access to social services, and more crime. Black neighborhoods disproportionately feel the brunt of everything from economic downturns to natural disasters. “When a group is highly segregated, it’s easy to discriminate against that group,” says Williams.

Brian Mustanski of the University of Illinois at Chicago exemplified the breadth of the conference by presenting data on one of the most controversial questions in modern biology: what makes some people gay and others straight? The field of sex research got its first quantitative tools in the 1940s, through the pioneering work of Alfred Kinsey, but sophisticated methods for genome mapping now provide new ways of searching for the biological basis of sexual orientation.

Homosexuality appears to have an inherited component, and earlier pedigree studies suggested that the Xq28 region of the X chromosome might be involved. To probe the issue more deeply, Mustanski and his colleagues mapped genomes from 146 families with at least two gay brothers. The researchers used a technique called microsatellite mapping, which provides a broad but low-resolution view showing which parts of a given genome came from each parent.
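
Conceptually, this kind of scan asks whether affected brothers share parental alleles at a marker more often than the roughly 50% expected by chance; sustained excess sharing suggests the region co-segregates with the trait. Below is a toy sketch of that idea only, with invented numbers and a hypothetical `mean_sharing` helper, not the study’s actual data or statistical method:

```python
# Illustrative affected-sib-pair allele-sharing scan (toy example).
# Under no linkage, sib pairs share 0, 1, or 2 marker alleles
# identical-by-descent with mean sharing of 50%.

def mean_sharing(ibd_counts):
    """ibd_counts: alleles shared identical-by-descent (0, 1, or 2)
    for each sib pair at one marker. Returns mean proportion shared."""
    return sum(ibd_counts) / (2 * len(ibd_counts))

# Invented data: eight sib pairs scored at two hypothetical markers.
marker_a = [2, 1, 2, 1, 2, 2, 1, 2]  # elevated sharing
marker_b = [1, 0, 1, 2, 1, 0, 2, 1]  # near the 50% null

for name, counts in [("marker_a", marker_a), ("marker_b", marker_b)]:
    share = mean_sharing(counts)
    flag = "possible linkage" if share > 0.6 else "consistent with chance"
    print(f"{name}: sharing {share:.2f} -> {flag}")
```

A real analysis would use many markers, hundreds of families, and a proper likelihood-based test rather than a fixed threshold.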

Male Homosexuality Appears to be Partly Inherited

Specific regions of chromosomes 7, 8, and 10 co-segregated with male homosexuality in the families. Interestingly, the trait seems to have an imprinted component, meaning some of the genes that determine it behave differently depending on whether they are inherited from the mother or the father. The new work confirmed that the Xq28 region might also help determine sexual orientation, but only in some families.

To date, nearly all of the major studies on the genetics of homosexuality have focused on gay men. Sexual orientation may develop quite differently in women, and studies on female homosexuality will likely require more sophisticated techniques.

Charles Rotimi of Howard University, one of the architects of the Human Genome Project and its descendant, the International Haplotype Map (HapMap) Project, discussed his recent work on type 2 diabetes in African populations. Type 2 diabetes, which has reached epidemic proportions as obesity rates have risen worldwide, is especially prevalent among African Americans. The disease, a dysfunction in blood sugar regulation, can lead to blindness, limb loss, heart failure, and stroke. Type 2 diabetes apparently arises from a complex interaction of genetic and environmental factors, making it particularly challenging to study.

In sequencing the human genome and developing the HapMap, researchers used DNA from several populations around the world, including the Yoruba tribe of West Africa. The Africans who were sold into slavery in prior centuries came primarily from a few tribes, including the Yoruba, so a large proportion of modern African Americans are descended from this group. Because West Africans and African Americans share genes but live in radically different socioeconomic environments, they provide a natural experiment for studying type 2 diabetes.

West Africans and African Americans Share Genes, But Not Environments

In a coarse mapping analysis of West Africans, technically similar to the method Mustanski employed in gay men, Rotimi and his colleagues identified specific regions of chromosomes 5 and 19 that may contain genes predisposing some people to diabetes. The researchers now hope to map the genes more finely, using the closely spaced genetic markers that have become available through the HapMap Project.

Though the results are interesting, Rotimi stresses that “conditions like diabetes … are truly complex diseases that are multifactorial at the molecular and the genetic level and also at the environmental level.” Environmentally, the stark social disparities between white and black Americans undoubtedly explain much of the skew in U.S. diabetes rates.

Carlos Bustamante of Cornell University eventually touched on a racial difference in metabolism that has important social impacts, but the bulk of his presentation focused on the universal traits of humanity, and what distinguishes us from other animals. Like many human geneticists, Bustamante has a particular fondness for chimpanzees, since these apes and humans are each other’s closest relatives.

“With the completion of the chimp genome and patterns of variation within the human genome, we can begin to answer the philosophical question of what it means to be human, from a real biological perspective,” says Bustamante. Taking an initial stab at that question, Bustamante and his colleagues analyzed the genomes of the two species with new statistical algorithms. The results reveal protein-coding genes that probably underwent either positive or negative natural selection in the five million years since humans and chimpanzees diverged.

Genome Comparisons Reveal What Makes Us Human

Evolutionary biologists have long argued that mutations are more likely to be harmful than beneficial, and the new data agree: of more than a thousand genes the algorithm highlighted, deleterious mutations appear to have occurred about twice as commonly as beneficial ones during the journey from chimpanzee to human. Reassuringly, the search fingered many genes involved in known genetic diseases, suggesting that the scientists are on the right track to find new medically important genes as well.

The positively selected genes affect smell receptors, components of the immune response, and enzymes that metabolize alcohol. The last group is especially interesting; these genes apparently vary between different ethnic groups, possibly explaining some racial differences in alcoholism rates.

Cause for Optimism and Wariness

Also participating in the event via video conference was John Ruffin, director of the National Center on Minority Health and Health Disparities at the National Institutes of Health. Ruffin talked about the major efforts that the NIH was undertaking to coordinate existing national centers in restoring the research infrastructure after the Katrina debacle. He also discussed the strategies that the NIH is deploying to focus national research on understanding the causes of health disparities and addressing these inequities. Additionally, he described techniques for developing more culturally sensitive clinical researchers, and procedures to improve their impact. He saw this conference as an important component of the national effort.

As genomics forges ahead, biological and social scientists clearly have cause for both optimism and wariness. Genetics now proves that all people are fundamentally similar, but the history of eugenics belies our fundamental desire to be different, and the horrific results that can produce. We have met the evil twin, and he is us.


About the Author

Alan Dove earned his PhD in microbiology from Columbia University and is now a science writer and reporter for Nature Medicine, Nature Biotechnology, and the Journal of Cell Biology.

A New Perspective on the Future of Human Evolution

Frank Wilczek, the 2004 Nobel Prize winner and renowned theoretical physicist and mathematician, offers some provocative speculations on the future of human evolution.

Published September 1, 2006

By Frank Wilczek

Archaeopteryx could fly—but not very well. Human beings today can venture outside Earth’s airy envelope—but not very well. Our minds can penetrate into realms of thought far beyond the domain they were evolved to inhabit—but not very well. It seems clear that the present form of humanity is, like Archaeopteryx, a transitional stage. What will come next? I don’t know, of course, but it’s an entertaining, inspiring—and maybe important—question to think about.

Qualitative Evolution Based on Biology

In the past, evolution has been based on natural selection. Its results are impressive. Yet from an engineering perspective, natural selection is both haphazard and crude—haphazard because no meaningful goal is explicit; crude because it gathers feedback slowly and with much noise.

What we might call its “goal” is simply to keep going. Its “performance criterion” is production of fertile offspring: what Darwin called the struggle for existence. That “goal” is, of course, not a mindful goal, nor is the “performance criterion” a performance criterion in the conventional sense, where we judge how well some concrete task has been accomplished. Yet natural selection, by allowing information to flow from the environment to the replicating unit—the genes—results in effective adaptation and creative response to opportunities. Famously, it leads to what seem to be inspired designs to achieve what appear (to us) to be concrete goals.

Viewed analytically, evolution’s design methods look terribly inefficient. Feedback arrives once a generation, and its information content is just a few bits, to wit the number and genetic types of surviving offspring. Furthermore, that information content is dominated by unrelated noise, all the complex accidents that impact survival. By way of comparison, we routinely gather gigabytes of useful information every hour by using our eyes and brain to look out at the world. Evolution by natural selection produces impressive feats of creative engineering only because it plays out over very long spans of time (many generations) on a very large stage (many individuals).

The Failures of Classic Eugenics

In the past, eugenics—encouraging certain individuals to reproduce while discouraging others—has been proposed as a path to human improvement. Even leaving moral issues aside, classical eugenics was doomed to failure. Selecting human parents on the basis of a few superficial characteristics is inherently crude and inefficient, with the same drawbacks as natural selection.

Only recently, with increased understanding of genetics, development, and physiology at the molecular level, have truly powerful possibilities for directing evolution begun to arise.

Screening against catastrophic genetic diseases is widely practiced and accepted. But where are the boundaries between disease, substandard performance, and suboptimal performance? Is deafness a disease? Or is tone deafness? Is lack of perfect pitch? Any boundary is artificial, and arbitrary boundaries will be breached. What’s in store for the future? Some, if not all, parents will seek to produce the best children they can, according to their own view of “best.” Parental (or governmental?) selection will replace natural selection as an engine of human evolution. Selection by genetic screening will be much more efficient.

What goals will parents pursue? (Note: I say will, not should.)

The most obvious goal is to improve health, broadly defined to include both physical vitality and longevity. The popularity of performance-enhancing drugs for athletes, of diets and food supplements, and, of course, our vast investment in medical research, attest to our powerful drive toward that goal. In this area, the most fundamental issue is aging.

The Realm of Molecular Investigation

After a long exile at the fringes of biology, the question of why we age, and what can be done to combat that process, has now firmly entered the realm of molecular investigation. Decisive progress may or may not come within a few years, but in a few decades it is likely, and in a century almost certain. Future humans will be healthier and live much longer than we do. They may be effectively immortal—and they’ll all have perfect pitch.

A second goal is more powerful intelligence. It may not be obvious, especially if you pay attention to the American political scene, but the evidence of nature is that there is intense pressure toward the evolution of increased intelligence. In the six million years or so since protohumans separated from chimpanzees, even bumbling natural selection has systematically upgraded our brains and enlarged our skulls, despite the steep costs of difficult childbirth and prolonged infancy. I suspect an important part of the pressure for intelligence comes from sexual selection: finding a mate is a complicated business, and women in particular tend to be choosy.

The salient facts here are: first, that it was possible to come so far so fast (on an evolutionary timescale!), and second, that the limiting factor is plausibly the mechanics of childbirth. Together, these facts suggest that tuning up production of bigger and better brains may be simple, once we find the tuning mechanism. More generally, better understanding of the molecular mechanisms behind development and learning gives new hope for improving mental vitality, just as understanding molecular genetics and physiology does for physical vitality.

Qualitative Evolution Based on Technology

Biological evolution, whether based on natural or parental selection, is intrinsically limited. Early design decisions that may not have been optimal were locked in or forced by the physical nature of Earth-based life. Some of those decisions can be revisited through the addition of nonbiological enhancements (man-machine hybrids); others may require starting over from scratch.

The concept of a man-machine hybrid may sound exotic or even perverse, but the reality is commonplace. For example, humans do not have an accurate time sense, or absolute place sense, or the ability to communicate over long distances or extremely rapidly, or the ability to record sensory input accurately. To relieve these deficiencies, they have already become man-machine hybrids: by wearing a watch, using a GPS system, and carrying a cell phone, a BlackBerry, and a digital camera.

Of these devices, only the watch was common ten years ago (and today’s watches are more accurate and much cheaper). Many more capabilities, and more seamless integration of man and machine, are on the horizon. For better or worse, much of the cutting-edge research in this area is military.

In other cases, incremental addition of capability may not be feasible. To do justice to what is possible, radical breaks will be necessary. I’ll mention three such cases.

Hostility to Human Physiology

The vast bulk of the universe is extremely hostile to human physiology. We need air to breathe, water to drink, and a narrow range of temperatures to support our biochemistry; our genetic material is vulnerable to cosmic radiation; and we do not thrive in a weightless environment. As a practical matter, our major ventures into space will be by proxy. Our proxies will be either humans so modified as to clearly constitute a different species or, more likely, new species we design from scratch that will contain a large nonbiological component.

The fundamental design of human brains, based on ionic conduction and chemical signaling, is hopelessly slower and less compact than modern semiconductor microelectronics. Its competitive advantages, based on three-dimensionality, self-assembly, and fault tolerance, will fade as we learn how to incorporate those ideas into engineering practice. Within a century, the most capable information processors will not be human brains, but something quite different.

Recently, a new concept has emerged that could outstrip even these developments. Physicists have realized that quantum mechanics offers qualitatively new possibilities for information processing, and even for logic itself. At the moment, quantum computers are purely a theoretical concept lacking a technological realization, but research in this area is intense, and the situation could change soon. Quantum minds would be very powerful, but profoundly alien. We—and this “we” includes even highly trained, Nobel-Prize-winning physicists—have a hard time understanding the subtleties of quantum mechanical entanglement; but exactly that phenomenon would be the foundation of the thought processes of quantum minds!

Where Does it Lead?

A famous paradox led Enrico Fermi to ask, with genuine puzzlement, “Where are they?”

Simple considerations strongly suggest that technological civilizations whose works are readily visible throughout our Galaxy (that is, given current or imminent observation technology) ought to be common. If they were, I’d base my speculations about future directions of evolution on case studies! But they aren’t. Like Sherlock Holmes’s dog that did not bark in the nighttime, the absence of such advanced technological civilizations speaks through silence.

Main-sequence stars like our Sun provide energy at a stable rate for several billion years. There are billions of such stars in our Galaxy. Although our census of planets around other stars is still in its infancy, what we know already makes it highly probable that many millions of these stars host, within their so-called habitable zones, Earth-like planets. These bodies meet the minimal requirements for life in something close to the form we know it, notably including the possibility of liquid water.

On Earth, the first emergence of a species capable of technological civilization took place about one hundred thousand years ago. We can argue about defining the precise time when technological civilization itself emerged. Was it with the beginning of agriculture, of written language, or of modern science? But whatever definition we choose, the number will be significantly less than one hundred thousand years.

In any case, for Fermi’s question the most relevant time is not ten thousand years, but closer to one hundred. This marks the period of technological “breakout,” when our civilization began to release energies and radiations on a scale that may be visible throughout our Galaxy. Exactly what that visibility requires is an interesting and complicated question, whose answer depends on the means available to hypothetical observers.

Sophisticated Extraterrestrial Intelligence

We might already be visible to a sophisticated extraterrestrial intelligence through our radio broadcasts or our effects on the atmosphere. The precise answer hardly matters, however, if anything like the current trend of technological growth continues. Whether we’re barely visible to sophisticated though distant observers today, or not quite, after another hundred years of technological expansion we’ll be easily visible.

A hundred years is less than a part in ten million of the billion-year span over which complex life has been evolving on Earth. The exact placement of breakout within the multibillion-year timescale of evolution depends on historical accidents. With a different sequence of the impact events that led to mass extinctions, or an earlier occurrence of lucky symbioses and chromosome doublings, Earth’s breakout might have occurred one billion years ago instead of one hundred years ago.
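
The proportion is easy to verify; a minimal check, taking one billion years as a lower bound on the span of complex life (so the true ratio is at most this value):

```python
# Ratio of the ~100-year breakout window to the span of complex life.
breakout_years = 100
complex_life_years = 1e9  # at least a billion years
ratio = breakout_years / complex_life_years
print(ratio)  # 1e-07, i.e. one part in ten million
```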

The same considerations apply to those other Earth-like planets. Indeed, many such planets, orbiting older stars, came out of the starting gate billions of years before we did. Among the millions of experiments in evolution in our Galaxy, we should expect that many achieved breakout much earlier, and thus became visible long ago. So, where are they?

Several answers to that paradoxical question have been proposed. Perhaps our simple estimate of the number of life-friendly planets is for some subtle reason wildly overoptimistic. Or perhaps, even if life of some kind is widespread, technologically capable species are extremely rare. Perhaps breakout technology quickly ends in catastrophic warfare or exhaustion of resources. There are uncertainties at every stage of the argument. Even so, like Fermi, I remain perplexed.
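The order-of-magnitude reasoning above can be sketched as a toy Drake-style estimate. Every number below is an illustrative assumption chosen for the sake of the sketch, not a figure from the text; the point is only that, across a wide range of plausible inputs, many civilizations should have broken out long before ours did.

```python
# Toy Drake-style estimate of how many galactic civilizations might have
# reached technological "breakout" before us. All inputs are illustrative
# assumptions, not figures from the article.

n_stars = 2e11            # assumed main-sequence stars in the Galaxy
f_habitable = 0.05        # assumed fraction with an Earth-like planet in the habitable zone
f_life = 0.1              # assumed fraction of those where life arises
f_technology = 0.01       # assumed fraction of those producing a technological species

# Multiply the filters together, Drake-equation style.
candidates = n_stars * f_habitable * f_life * f_technology

# Breakout timing: suppose evolutionary accidents (impacts, symbioses,
# chromosome doublings) scatter breakout dates over a ~1-billion-year
# window, while our own breakout is only ~100 years old.
head_start_window = 1e9   # years (assumed spread of breakout dates)
our_breakout_age = 1e2    # years since our own breakout

# Under a uniform spread, nearly every candidate broke out before us:
# our 100-year head start is one part in ten million of the window.
fraction_earlier = 1 - our_breakout_age / head_start_window
earlier_civilizations = candidates * fraction_earlier

print(f"{candidates:.0f} candidate civilizations")
print(f"{earlier_civilizations:.0f} likely broke out before us")
```

Even shrinking each filter factor by an order of magnitude leaves thousands of earlier breakouts, which is why the "where are they?" question survives the uncertainties.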

The preceding discussion suggests another sort of possibility: they’re out there, but they’re hiding.

Quantum Quiet

If the ultimate information processing technology is deeply quantum-mechanical, it may not be energy-intensive. Excessive energy use brings heat in its wake, and heat is a deadly enemy of quantum coherence. More generally, quantum information processing is extremely delicate, and easily spoiled by outside disturbances. It is best done in the cold and the dark. Quantum minds might well be silent and isolated by necessity.

Silence and inner contemplation can also be a choice. The ultimate root of human drives remains what our selfish genes, in the struggle for existence, have imprinted in us. That root is apparent in our behavior’s most obvious priorities, which include fending off threats from a hostile environment, finding and attracting desirable mates, and caring for the young.

Those priorities involve active engagement with the external world. The products of deliberate biological or technological evolution, as opposed to natural selection, could have quite different motivations. They might, for example, seek to optimize their state according to some mathematical criterion (their utility function). Having found an optimum state, or several excellent ones, they could choose forever to relive selected moments of perfect bliss, perfectly reconstructed. This was the temptation of Faust:

If I say to the moment:
“Stay now! You are so beautiful!”
Then round my soul the fetters throw,
Then to perdition let me go!

Humans were not built to treasure a Magic Moment, nor could they reproduce such a moment reliably and in detail. For our evolutionary successors, that Faustian temptation will be much more realistic.

Also read: Resolving Evolution’s Greatest Paradox


About the Author

Frank Wilczek is the Herman Feshbach professor of physics at MIT. He received the Nobel Prize in Physics in 2004 for the discovery of asymptotic freedom in the theory of the strong interaction. He is the author, with Betsy Devine, of Longing for the Harmonies: Themes and Variations from Modern Physics (W. W. Norton) and the recently published Fantastic Realities (World Scientific).

How the Maillard Reaction is Linked to Disease

Various cooks prepare a dish consisting of eggs and vegetables.

Scientists who study the chemistry of how food is cooked are exploring promising therapies to treat an array of diseases, from diabetes to Alzheimer’s.

Published January 20, 2006

By Jill Pope

Image courtesy of bernardbodo via stock.adobe.com.

It’s a chemical reaction central to daily life: the Maillard reaction browns our toast and makes roasted coffee smell wonderful. Oh yes, and it’s going on in our bodies all the time.

The reaction that occurs when sugars and proteins are heated was first described in 1912, and it has intrigued food scientists for 50 years. Over the last 20 years, biomedical scientists have become fascinated as well. We now know that Maillard chemistry plays a role not just in normal aging, but also in a staggering array of age-related chronic conditions, among them atherosclerosis, diabetes, cardiovascular disease, and neurodegenerative diseases such as Alzheimer’s.

How are cooking and body processes related? Susan Thorpe, a prominent biochemist in the Maillard field who is based at the University of South Carolina, explains, “Much as we don’t like to think of our bodies this way, we are protein, sugar, and fat, and we are cooking at a low temperature.”

A Visionary’s Paper is Ignored

Louis Camille Maillard was a French physician and chemist who in 1912 wrote a paper, impressive in hindsight, describing a nonenzymatic browning reaction (that is, one not jump-started by enzymes) that occurred when he heated amino acids with sugars. His work suggested that the reaction might take place in the human body, and he even imagined the critical role we now know it plays in diabetes. At the time, the paper caused no stir.

It wasn’t until the late 1940s that food scientists became interested. For the next 25 years, they learned how the reaction improves the aroma, flavor, and texture of cooked foods. They also put some effort into finding ways to prevent this chemistry from causing undesirable changes in colors and flavors in foods that had to be stored a long time, such as powdered eggs and instant potatoes.

Then, in 1969, the reaction was recognized in the human body. Samuel Rahbar, now at the City of Hope National Medical Center and Beckman Research Institute, found while searching for a genetic marker for diabetes that his diabetic patients had glucose attached to their hemoglobin (the protein that carries oxygen). It had previously been assumed that the Maillard reaction required higher temperatures than those found in vivo.

Rahbar’s discovery of glycated hemoglobin had a major impact on diabetes management, giving doctors a better screening tool and patients a more reliable way to monitor blood sugar. It also opened dozens of research avenues. Once it was shown, in the late 1970s, that the reaction happened in all plasma proteins, biological research in this area took off.

Case in Point: A Lens Protein

The Maillard reaction is really a series of reactions. As an example, consider what happens when an eye protein encounters sugar. A long-lived protein, such as a lens protein, condenses with a sugar in a process called glycation. In subsequent reactions, the damaged lens protein is further abused by sugar as well as by oxidants (free radicals). When the chemistry is done, our lens protein has permanent glucose structures attached to it and appears brown under UV light. And it has a new name: advanced glycation endproduct (AGE).

AGEs accumulate with age and in age-related diseases. Many scientists believe they cause inflammation, loss of flexibility in tissues and organs, and ultimately, impaired function. In the case of our lens protein, the result could be cataracts.

Even the healthiest among us are accumulating AGEs in our tissues as we get older. But because of their elevated blood sugar, diabetic people accumulate AGEs much earlier in life than nondiabetic people. This buildup is seen in kidney disease, eye damage, and nerve damage—suggesting that AGEs are major contributors to diabetes complications. Tissues that depend on flexibility, such as the heart and blood vessels, are also affected.

Not everyone agrees with the theory that damaged protein accumulation causes aging and disease. It may turn out that AGEs simply correlate highly with life-threatening diseases in some other way. But debating that question is less important to many than stopping the damaging cycle.

Stop the Chemistry, I Want to Get Off

In light of the havoc Maillard chemistry can wreak in the body, there is considerable interest in finding ways to stop it, or at least slow it down.

Several Maillard inhibitors have been developed. One is Biostratum’s Pyridorin (pyridoxamine), a member of the vitamin B6 family that blocks AGE formation. Pyridorin is being tested in clinical trials for the treatment of diabetic kidney disease. Three Phase II clinical trials have been completed, and Phase III trials are planned. In the studies, scientists measured patients’ levels of serum creatinine, a widely accepted indicator of impaired kidney function. Treatment with Pyridorin significantly decreased the rate at which creatinine levels rose.

Another inhibitor now in preclinical (animal) trials at Biostratum, BST-4997, works by intervening at a different point, but appears to be even more effective. These drugs offer the potential to slow the progress of kidney disease, giving people more dialysis-free years.

Crosslinks: Reversing the Irreversible?

AGEs are notorious for forming protein crosslinks—becoming closely networked and resistant to being broken. Pimagedine (aminoguanidine, developed by Alteon) is a third kind of Maillard inhibitor for diabetic kidney disease that works by blocking the formation of protein crosslinks. The drug has been shown effective in clinical trials thus far, significantly reducing the amount of protein patients excreted in their urine.

Another substance moving through clinical trials may cause scientists to rethink AGEs entirely. Alagebrium (also by Alteon), the first AGE breaker, appears to work by cutting these protein crosslinks, and is being tested in patients with heart disease. Studies presented at the American Heart Association Scientific Session in November 2005 reported that it caused significant reduction in the mass of the left heart ventricle, a decrease in stiffening of the arteries, and improved function of the lining of the blood vessels. Alagebrium, and other crosslink breakers that may follow it, hold out a previously unimagined possibility—restoring function and flexibility to tissues and organs that have already sustained damage.

Treating Alzheimer’s by Blocking a Receptor

Alzheimer’s sufferers have been found to have three times the amount of AGEs in their brains as do healthy counterparts of the same age. But there is hope: a number of animal studies are looking at ways to treat Alzheimer’s by blocking the receptor for AGE (RAGE). Research suggests that the receptors that bind AGEs may also bind the proteins that accumulate in Alzheimer’s.

If the AGE receptor can be blocked, the accumulation of “senile plaques” in animal brains can also be limited. In one clever ploy, Yasuhiko Yamamoto of Kanazawa University, Japan, and coworkers created a decoy receptor, called sRAGE, which they found trapped AGEs and interfered with destructive RAGE-AGE signaling.

A Role for Diet

What impact do browned foods have on our health? Maillard reaction products are mainly absorbed in the small intestine, and about 10% of dietary AGEs are absorbed into the bloodstream. According to Jennifer Ames, professor of human nutrition and health at the School of Biological and Food Sciences, Queen’s University, Belfast, Northern Ireland, most of the work on AGEs in diet has looked at how they affect atherosclerosis. Results suggest that a low-AGE diet is better for health—”especially for people who have, or who are at risk of developing, diseases related to inflammatory processes,” she says.

In light of these and other findings, Helen Vlassara of the Mount Sinai School of Medicine suggests that people reconsider the AGE content of common foods. Foods higher in fat and protein, such as meat and cheese, will give higher AGE levels. And in general, cooking at a higher temperature creates higher levels of AGEs. Sautéing, steaming, and poaching create fewer Maillard products than frying, grilling, and broiling.

Because oxidants contribute to Maillard chemistry, a diet rich in antioxidants may protect against disease. Toshihiko Osawa of Nagoya University and Yoji Kato of the University of Hyogo have found that antioxidative foods, such as turmeric, can prevent diabetic complications in rats. They also examined the role of glutathione (GSH), an antioxidant found in broccoli and pork, and found that it prevented diabetic kidney and nerve disease.

Eat Less, Live More

Like aging, Maillard chemistry seems inevitable. Drugs may soon help counter the damage. And, to the extent that we can fight it, eating more antioxidant-rich foods, and fewer char-broiled steaks, may help. But, at least in animal studies, only one thing has been shown to extend life—eating less. Most of us in America are eating too much, and an epidemic of type II diabetes is part of the price we pay. The best advice may sound familiar: eat a balanced diet, with lots of fresh fruits and vegetables, and don’t overeat.

Learn more about the Academy’s Nutrition Science program.


About the Author

Jill Pope writes about science and policy issues. She served as Senior Editor for The Cutting Edge: An Encyclopedia of Advanced Technologies (Oxford University Press, 2000).

The Story of a 25-Year Collaboration

X-rays from a brain scan.

Scientific collaborators Torsten Wiesel and David Hubel made significant advances in our understanding of the brain and perception. Their achievements were a work in progress for roughly a quarter century.

Published August 23, 2005

By Dorian Devins

An air of camaraderie pervaded The New York Academy of Sciences (the Academy) on March 31, 2005 as scientific collaborators Torsten Wiesel and David Hubel were joined by fellow Nobelist Eric Kandel in celebration of Wiesel and Hubel’s recently published book, Brain and Visual Perception: The Story of a 25-Year Collaboration. The full-to-capacity house included several scientific luminaries and at least one other Nobel Prize winner in the audience.

Kandel kicked off the evening with a vivid description of the pair’s groundbreaking work, characterizing it as “the most important advance in understanding the brain since Ramón y Cajal” at the turn of the 20th century. Santiago Ramón y Cajal won the Nobel Prize in Physiology or Medicine in 1906 in recognition of his work on the structure of the nervous system. While Cajal’s work centered on the morphological aspects of interconnections between different parts of the brain, Wiesel and Hubel’s work used modern cellular physiological techniques to show how these connections filter and transform sensory information both within and on the way to the primary visual cortex.

According to Kandel, “using imagination in addition to methodology is the key to the Hubel and Wiesel success.”

Hubel and Wiesel made several major contributions to our understanding of the brain and perception, including new insights into how the cerebral cortex functions in transforming sensory information. They also did work on binocularity, cellular organization in orientation and ocular dominance, and visual sensory deprivation.

Processing Visual Information

Our dominant sensory experiences are visual, and Wiesel and Hubel’s work showed how visual information is processed in the first few stages after it reaches the brain. They found that the part of the cortex devoted to the early stages of visual processing is arranged in columns, within which the nerve cells have common response properties. An analysis of the image is compiled from this information, and results in what we see.

In other experiments the team also investigated how visual deprivation affects development, which they tested by unilateral lid closure. Hubel and Wiesel found that when one eyelid of a newborn kitten or monkey is sutured shut for several weeks or months, the animal is blind in that eye once it is reopened. When the eye closure is done in adult cats no such result is obtained. In both cats and monkeys there is thus a “critical period” of plasticity, after which sensitivity to deprivation declines and finally disappears.

Their work yielded profound findings, especially in the area of neural circuitry. Kandel described how the cerebral cortex carries out novel kinds of transformations of a visual image. Hubel and Wiesel realized that the image is decomposed and then reconstructed later, and their findings influenced not just neuroscience but also areas like cognitive psychology, where they allowed practitioners to develop the idea that the brain creates an internal representation of the outside world. For this work, Hubel and Wiesel were awarded the Nobel Prize in Physiology or Medicine in 1981.

The People Behind the Science

As Kandel pointed out, however, Hubel and Wiesel’s science itself was just one aspect of the evening’s program. It was also an occasion to celebrate their book, a collection of their major papers along with biographical and historical information. But perhaps most importantly, Kandel and the audience were assembled to honor the long and productive collaboration and friendship of these two very different people.

Kandel characterized David Hubel as “whimsical and anti-authoritarian,” someone who “probably couldn’t run a grocery store,” a creative and musical person perhaps not most at home as an administrator. Torsten Wiesel, on the other hand, was a “quiet, humble person who has emerged as really one of the great scientific leaders in the academic community,” in Kandel’s words. “You name it, he runs it!”

As the evening progressed, Hubel and Wiesel reminisced about their partnership. Hubel spoke of the difficulty of working while they were at the Salk Institute in La Jolla. The lull of the surf and the general air of relaxation there were not a great motivation to get to the lab. Of their time together overall, he said it was like a “half-century long trip on a roller coaster.”

A Brotherly Relationship

Working and travelling together in their younger days created a brotherly relationship between the two. “We didn’t want to tell people that this kind of work isn’t horribly tedious because we thought that would invite competition,” said Hubel. Much of their success was due to the luck of finding each other at just the right point in the science’s history and in their own careers, and in their hitting it off as they did.

Typically modest, Wiesel gave credit to Hubel for most of the work necessary to create their recent book. In terms of their work in science overall, he said, “You may have the technical skill and the imagination and so on, but you also need luck in life to really have success.”

Wiesel also attributed much of the success of their careers to Steve Kuffler, one of the leaders in the emerging field of neuroscience in the 1950s and ‘60s. Kuffler had been chairman of the department of neurobiology at Harvard while Wiesel and Hubel were there, and his respect was reserved for those who showed up in the lab, did the experiments, and wrote them up. Wiesel said that despite his many administrative accomplishments, “I feel Steve Kuffler would look at me [now] with some disdain and state, ‘Torsten, why did you leave the lab? You’re supposed to do experiments!’”

What Makes Scientists Tick

Wiesel reiterated that he and Hubel had worked very reclusively. From early morning until late at night, they performed every component of their own experiments, from preparation of the animals at the outset to washing the glassware afterwards. By maintaining this atmosphere of privacy, they were able to keep the “primacy of the thoughts and the ideas” from being diluted. Over the years they did not work with many graduate students and postdocs, but were fortunate in the quality of those they did have.

This strategy obviously paid off. Wiesel attributes their motivation and that of scientists in general to “random reinforcement,” like that of B.F. Skinner’s famous pigeons. Wiesel and Hubel’s early discoveries about the visual cortex a few months into their work made it seem to them that one thing naturally led to the next. They began without a hypothesis; rather, they had set out to use the new technology of the microelectrode to record the cells in different parts of the brain and try to understand how the cells cooperate. According to Wiesel, “We were explorers of unknown territory.”

Wiesel cited some other important factors aside from luck that lead to success in the sciences, including choosing the right problem to tackle, being observant, and having the right attitude and mentor. As a young scientist who came from Sweden to the U.S. to learn more about the brain, he was frustrated by the limited knowledge available in the area. If he and Hubel hadn’t met and formed such a productive relationship, he would have returned to Sweden. One thing Wiesel worries about is that the current academic system does nothing to redirect those who might be better suited to other careers.

The Evolution of Neuroscience

In a question-and-answer session, Wiesel explained that the initial phase of their work was explorative, followed by a period of asking questions. Hubel stated that there was a misconception that “to do proper science it should be done in the image of physics, or the way most people think of physics.” He and Wiesel got ideas and tested them, but “we never would’ve expressed it in such exalted terms of having a hypothesis. One shouldn’t make up rules as to how Science with the capital ‘S’ is done or should be done.”

What surprises might be ahead for neuroscience in the way that Hubel and Wiesel’s discovery of orientation-specific cells once was? According to Wiesel, insights come in discrete steps and are quite unpredictable. The study of olfaction has yielded some profound insights, but our understanding of hearing, for instance, is comparatively primitive. It is now an area of great interest to neuroscientists. The next frontier is unknown.

The field of neuroscience has grown exponentially over the years. David Hubel and Torsten Wiesel’s groundbreaking work has been in no small way responsible for this. As it was best put by Eric Kandel, when we celebrate Wiesel and Hubel, “we’re not only celebrating science at its best, we’re not only celebrating two extraordinary people and a wonderful collaboration,” but we’re also celebrating “the reason science is exciting. We’re really celebrating the whole scientific enterprise in celebrating the two of them.”

Also read: Discovering Cancer Therapies through Neuroscience


About the Author

Dorian Devins is a New York-based radio producer whose programs have aired for over 10 years on WFMU, 91.1 FM in the greater metropolitan New York area. For three years she produced and hosted The Green Room, a weekly science radio program which was carried both on the radio and the Web. She currently hosts The Speakeasy, a weekly arts and cultural interview program. She has also conducted an ongoing series of interviews for the National Academy of Sciences’ Web site, does freelance writing, and works as an acquisitions editor of technical physics books.

Devins’ background has been mostly in the arts and publishing. She was founder and executive director of Science Matters, Inc., a nonprofit organization dedicated to the public understanding of science.

The Physiology of the New York Schvitz

A wooden bucket with water inside a sauna.

A scientific explanation for an ancient Lower East Side indulgence that dates back to at least the 19th century.

Published July 26, 2005

By Ken Howard Wilan

Image courtesy of nyul via stock.adobe.com.

Some people like heat. Extreme heat. Hotter even than a NYC subway station in August. The kind of heat that gets your eccrine sweat glands pumping liquid through the skin for major evaporative cooling. Heat that bakes the mind and forces the aptly named “insensible perspiration” to bubble up from tissues through cells and blood. The kind of screaming hot temperatures that make your capillaries dilate to bring more blood circulating to your body’s surface to radiate away the heat. Triple-digit degrees that cause your heart to pound quicker to counter the initial drop in blood pressure caused by the dilation of blood vessels.

In other words, a schvitz.

The Finns have their saunas, but in New York you’ve got the schvitz. A place to relax, kick back, and massively sweat.

19th Century Origins

Perhaps the granddad of New York schvitzes is the East Village’s Russian and Turkish Bath House, around since 1892. Its subterranean sauna looks like a boiler room inside a cave, with seating installed as an afterthought. The dark sweat room reaches temperatures of 194 degrees Fahrenheit, enough to kick anyone’s body into high sweat.

“I’m not a scientist, but it gets impurities out of the system,” claims Dmitry Shapiro, a manager at the bathhouse.

He isn’t too far off. A 15-minute sweating session can lift one liter of water from your body, along with excess salt, lactic acid (which may cause stiff muscles and fatigue), urea (a waste normally excreted through the urine), and minute amounts of metals the body has taken in from the environment, such as copper, lead, zinc, and mercury. So for a workout for your skin, which, as your largest organ, comprises about 30% of your body, hang out on the subway platform or try a schvitz.

Also read: Scientists Can Help Prepare for Record Heatwaves

A New Look at an Ancient Pain Remedy

A closeup shot of the bud of an indoor cannabis plant.

Despite legal restrictions in some states, cannabis has reemerged for its medical benefits in recent years, though its history dates back centuries.

Published April 1, 2005

By Alan Dove, PhD

Image courtesy of aon168 via stock.adobe.com.

While some researchers are pursuing genomic strategies to understand the causes of chronic pain, others are reversing the problem, starting with an ancient painkiller and trying to understand how it works.

Cannabis sativa and its close cousin Cannabis indica, better known as marijuana, have been used as medicinal herbs for centuries, and many patients suffering from chronic pain still use this herbal remedy today, despite its obvious drawbacks. To provide the painkilling benefits of marijuana without the side effects and legal troubles, pharmaceutical companies are now searching for more selective drugs that will use the same molecular targets.

On Oct. 26, 2004, Roger Pertwee, professor of neuropharmacology at the University of Aberdeen and an expert on pharmaceutically useful cannabinoids, gave the Academy’s Biochemical Pharmacology Discussion Group a briefing on the state of the science in this field. Marijuana contains more than 60 different cannabinoid compounds, and most are still poorly understood. These cannabinoids tap into a natural signaling system involving the body’s own endocannabinoids, which appear to control a wide range of physiological and pathological processes.

Early studies focused on a single cannabinoid, delta-9-tetrahydrocannabinol (THC), the main psychotropic ingredient of marijuana. Simple THC preparations are now prescribed to suppress nausea and stimulate appetite in cancer and HIV patients, but they are only moderately effective. A major breakthrough came in the early 1990s, with the discovery of CB1 and CB2, the receptors that bind cannabinoids in humans.

Popular in New Drug Development

CB1 and CB2 proteins are woven into the cell membrane, leaving loops of receptor protein hanging into the cell and the extracellular space. The structure is typical of receptors that act through multipurpose signaling molecules, called G proteins. G protein-coupled receptors, including CB1 and CB2, are involved in a huge range of cellular responses. They also are among the most popular targets for new drug development.

CB1 is found on neurons, and stimulating it inhibits the release of neurotransmitters that communicate nerve impulses. In contrast, CB2 is seen primarily on cells of the immune system, and appears to modulate the release of cytokines that direct the immune response. Chemists have developed selective agonists that can stimulate either or both receptors.

Besides the agonists that stimulate CB1 and CB2, researchers have developed compounds that have the opposite effect. The most famous of these is Rimonabant, also known as Acomplia, a CB1-targeting drug currently being developed by Sanofi-Aventis for a variety of indications.

While the opposite of an agonist is usually called an antagonist, the story is more complicated in the cannabinoid system. A receptor antagonist simply blocks activation of the receptor. Rimonabant and related compounds go a step further, acting as inverse agonists.

“Their pharmacology is somewhat complicated,” says Pertwee. “They don’t just block. They produce effects themselves, and those effects are opposite to what you get with an agonist.”

For example, while CB1 receptor agonists inhibit neurotransmitter release, inverse agonists specifically stimulate neurotransmitter release from neurons. In animals, cannabinoid agonists act as painkillers, while Rimonabant actually amplifies pain responses. Rimonabant also exacerbates tremors and spasticity in a mouse model of multiple sclerosis (MS), whereas cannabinoid agonists reduce those symptoms.

Numerous Applications

Targeting the cannabinoid system could have numerous applications, as the investors buzzing about Rimonabant have already realized. Pertwee focuses on cannabinoid analogs’ potential uses as painkillers and as treatments for MS.

In animal models, CB1 agonists reduce acute and inflammatory pain, as well as the difficult-to-treat neuropathic pain that is untouched by traditional opioids. This aligns nicely with the patterns of CB1 expression in the nervous system, where it appears in areas of the brain and peripheral nerves involved in pain perception.

CB1 also is in the brain regions responsible for controlling movement. Satisfyingly, CB1 agonists reduce tremors and spasticity, and may even reverse the demyelination process in animal models of MS. CB2 agonists also reduce pain, including neuropathic pain. This is surprising, because CB2 is not known to be expressed on neurons.

Drug developers are now pursuing many strategies to improve the benefit-to-risk ratio for cannabinoid receptor activation in the clinic. These include targeting CB1 receptors outside the central nervous system, selectively activating CB2 receptors, and elevating endocannabinoid levels by delaying their removal from their sites of action.

Still another approach is to enhance the response of CB1 receptors to endogenously released endocannabinoids, by activating an allosteric site that Pertwee and his colleagues recently discovered on the CB1 receptor.

Meanwhile, patients suffering from chronic pain or MS continue to use marijuana and THC-containing extracts. Though this is less than ideal, Pertwee points out that when subjective reports from patients are taken into account, “My own view is that the benefits outweigh the risks.”

Also read: New Age Therapeutics: Cannabis and CBD

A Public Good: Accelerating AIDS Vaccine Development

A medical professional wearing rubber gloves and a facemask draws a liquid from a vial using a syringe.

Researchers are making strides in the research and drug development necessary to combat the deadly HIV/AIDS epidemic, but more needs to be done to achieve this goal.

Published January 1, 2005

By Marilynn Larkin

More than 20 years into the HIV/AIDS epidemic, there is still no end in sight to this dreaded disease. Worse, the number of new cases of HIV/AIDS continues to climb, particularly in the less developed world.

In a presentation this July sponsored jointly by The New York Academy of Sciences’ (the Academy’s) Science Alliance and Rockefeller University’s Postdoctoral Association, Seth Berkley, founder, president, and CEO of the International AIDS Vaccine Initiative (IAVI), painted a disturbing picture of the magnitude of the epidemic, underscoring the scientific and advocacy work that needs to be done to quell it.

IAVI is a public–private partnership dedicated to putting an end to the AIDS epidemic. The organization offers financial and technical support to preventive vaccine research and development, serving as an advocate for sound public policy and as a community educator about AIDS and the clinical studies necessary to halt the disease. Scientists working with IAVI are playing vital roles in these endeavors, from basic science to regulatory issues, product management, and communicating with the media and public health officials.

“You need skill sets,” Berkley said. “But, what we really want at IAVI are people who care about the preventive vaccine issue and who are willing to dedicate themselves to trying to drive it forward. Then we match those desires with the career opportunities that are out there.”

A Two-Pronged Approach

A two-pronged approach is needed to deal with the devastation, Berkley said. The first is to focus on the short-term emergency – preventing further spread of the virus, treating individuals who are infected, and mitigating the societal consequences. But he stressed that there also needs to be a long-term view, including creating the tools needed to end the epidemic entirely: female-controlled barrier methods and microbicides; diagnostics to improve treatment and control sexually transmitted diseases; and HIV vaccines. “A preventive vaccine is the only way we’re going to end the epidemic,” he said, “and we should settle for nothing less than ending the epidemic.”

AIDS vaccines are special in that their use would result in an international public good, Berkley emphasized. Simply put, that means the vaccine goes beyond individual protection. The message for policymakers, therefore, is that investing in HIV vaccines makes sense because it protects public health at large, not just the individuals who are vaccinated.

Several hurdles must be overcome before this potential global good becomes a global reality, however. For one thing, as drug candidates move from preclinical to phase I trials, success rates are low. Moreover, vaccines must be made available at low cost, which makes them less attractive as investments. The result: Today’s market for vaccines is only about 1-2% of the market for pharmaceuticals. For HIV, that market is mostly in the less developed world, where companies are least likely to realize profits.

Lack of Funding

Vaccine development also is hampered by a lack of research funding. Public sector organizations – such as the National Institutes of Health in the United States, the Medical Research Council in the UK, and the ANRS [Agence Nationale de Recherche sur le Sida] in France – usually are national in their outlook and are not necessarily able to take a global view, said Berkley. Hence, the mission of IAVI: Ensure the development of a safe, effective, accessible, preventive HIV vaccine for use throughout the world.

While more than 30 HIV products are moving into trials around the world, the pipeline is duplicative, Berkley noted. “Candidates are focused primarily on cell-mediated immunity, with little emphasis on neutralizing antibodies or mucosal vaccines. Also, the time from preclinical studies to market is far, far too slow.”

“So, here’s the take-home message,” said Berkley. “Twenty-three years into the worst viral infectious disease epidemic since the 14th century, only one vaccine candidate has been fully tested to see if it works. That is unbelievable. And with 14,000 new infections daily, speed is of the essence. We have to compress every aspect of vaccine development and access, without compromising safety.”

The challenge is to take the standard timeline, which is 35-plus years, and squeeze it down with parallel track approvals and deployment so that a safe, effective vaccine is licensed in most countries within 10 years after the start of preclinical research, with widespread access in less developed countries in less than 20 years.

Other Challenges

When a product is ultimately confirmed as efficacious, however, other challenges arise. One is pricing. What you want is to have the wealthiest nations pay the most, and the very poor pay as close to manufacturing cost as possible. “The problem here is getting wealthy country policymakers such as the United States Congress and European Union, which are focused on lowering their own domestic health care budgets, to accept that type of differential pricing.” Production poses yet another challenge. “We’ll want massive doses of the candidate produced in a timeline that allows us to immunize people at high risk, especially adolescents.”

“Advocacy, policy change, and scientific progress have to go hand in hand,” Berkley said. “Good science alone won’t get you products. Product development alone won’t get you there. You need to have it all together.”

This statement led to a discussion of how the IAVI has affected the global effort to develop an HIV vaccine. “We started out in 1994 to do something that seemed audacious in its magnitude, and yet there’s no question that we have, in fact, affected the worldwide effort. There are a lot more resources and a lot more attention being paid to a preventive vaccine. However,” he conceded, “the effort is still grossly inadequate.

“If you have a fatal disease, you’ll do anything to get treatment. But when it comes to prevention technology, there isn’t the same cry that there is for treatment.”

The Role of Advocacy

By contrast, Berkley continued, “there’s no movement for AIDS vaccines because when a mother has a child, she says, ‘my child’s not going to be bisexual. My child is never going to experiment with IV drugs.’ Nobody wants to sit there and say, ‘gee my kid may do this,’ and so there’s no advocacy for the creation of a vaccine for the next generation.”

To build that advocacy, IAVI forms partnerships with organizations around the world. These organizations help IAVI staff working in their countries to build relationships and to work with the media and the science community. “Thus, for example, when you’re in Germany, you have a German group that has links, speaks the language, understands the system.” The result has been successes in countries such as India, where leaders of two opposing parties both stood up during a conference and stressed the importance of AIDS vaccines.

Does this mean the advocacy effort is a success? Yes and no. On the one hand, global spending on vaccine product development has increased from $125 million in 1994 to about $650 million in 2002. But the five-fold increase in spending is still only a “very small sliver” of HIV/AIDS spending overall, and less than 1% of total health-related R&D spending.

A Global Laboratory Network

Nonetheless, IAVI continues to have a hand in many vaccines currently in development, with more than 25 principal R&D partners working on six major vaccine projects, the Neutralizing Antibody Consortium, and human and non-human core laboratories. IAVI has also created a global laboratory network to ensure standardization of results from lab to lab. Moreover, to further compress the development timeline, IAVI is conducting trials in parallel in different countries.

The result, said Berkley, “is that we were able to bring five vaccines into the clinic in five years. The only other group that was able to do that was the Merck Corporation, which is a huge pharmaceutical company. We’ve also done clinical trials in eight countries, and shown that developing countries can be full partners in this effort. We’ve built their capacity and their leaders. We have good laboratory practices across all of our sites. We’ve done it with relatively small amounts of money, and my hope is that we’re seeing the beginning of a political movement to try to move AIDS vaccines up on the agenda.”

Also read: Antibodies, Vaccines, and Public Health


About the Author

Marilynn Larkin is a contributing editor to The Lancet.

Are Lawyers the Problem with Flu Vaccines?

A medical professional gives a patient a shot/vaccine.

From early discoveries in vaccine development to the anti-vax movement, the industry has changed immensely, and attorneys are playing a more prominent role than ever.

Published January 1, 2005

By William Tucker

“British authorities certainly thought there was a problem with the Chiron Corporation manufacturing – although the company didn’t,” commented Paul A. Offit, MD, who until last year sat on the Centers for Disease Control and Prevention’s Advisory Committee on Immunization Practices. “And the FDA was certainly caught off guard by the British decision. That’s what brought us to the current crisis.

“But when you look at the whole 40-year history of the vaccine industry and how we got to where we are today, you realize that lawsuits and changes in civil law have been a big factor. Profit margins are very thin and if liability costs run too high it just doesn’t make sense to stay in the business.”

Dr. Offit isn’t just making this argument off the cuff. He has written a book, The Cutter Incident, which will be published by Yale University Press next year. The Cutter Incident tells the story of an error by a California company in producing the first round of Salk vaccinations in 1955, which led to the accidental exposure of thousands of children to live polio viruses. The parents of a child who contracted polio sued Cutter Laboratories in one of the nation’s earliest medical product liability cases.

“It was 1957 and the courts had just adopted the principle of liability without fault,” said Dr. Offit. “You no longer had to prove negligence. The jury found that Cutter was not negligent – the crisis atmosphere had created a rush and there were no clear standards. But they found Cutter liable anyway, on the principle of liability without fault, and awarded $150,000.” Melvin Belli, who represented the family, always said that this victory made Ralph Nader possible.

An On-Going Problem

Whether liability law would have proceeded without the Cutter incident is an open question. But Dr. Offit makes a strong case that lawsuits have reshaped the vaccine industry, leaving the U.S. in the vulnerable position of having only two manufacturers of flu vaccines, both with roots in other countries. “In 1980 there were 18 American companies making eight different vaccines for childhood diseases,” Dr. Offit said. “Today four companies – Aventis, GlaxoSmithKline, Merck, and Wyeth – make 12 vaccines. Of the 12, seven are made by only one company and only one is made by more than two. There’s no redundancy in the system. Whenever there’s a bump in the road – and it happens fairly frequently – there’s a new shortage.”

Dr. Offit sees the current flu vaccine shortage as only the latest in a long string of incidents. In 2003 there was another flu vaccine shortage and a mini-outbreak when the two vaccine makers and the Centers for Disease Control and Prevention made a wrong guess and decided to cultivate the A/Panama strain of the flu virus. Instead, a rogue A/Fujian strain emerged and vaccine makers were caught short.

Before that, there was a serious shortage of DTP (diphtheria, tetanus, and pertussis) vaccine in 2000, after the FDA decided to ban thimerosal, a mercury preservative in vaccines, because of rumors it caused autism. Over thimerosal, the vaccine industry is now facing more than 300 lawsuits claiming five times its net income from vaccines, even though there does not seem to be any scientific evidence to back up the claim.

“Fear of Litigation”

“There are some new vaccines that are not being made because of fear of litigation,” said Dr. Offit. “In 1998 the FDA approved a vaccine for Lyme Disease, which strikes 23,000 people a year. GlaxoSmithKline (GSK) manufactured it for three years, but withdrew it immediately after class actions were filed on rumors that it caused arthritis. Companies just don’t want to deal with the threat of lawsuits anymore.”

There is general consensus that the nation’s vaccine base has become too narrow, but divergent opinions about what has led to it. “Vaccines are really a very small business,” said Dr. Gregory A. Poland, director of vaccine research at the Mayo Clinic. “The global market is $6 billion, while the world drug business is $340 billion. Frankly, given the small profit margins, risks, and huge manufacturing costs, I marvel there are still companies in the business.”

One major problem is that flu vaccines are only good for a year. The flu virus circles the globe every 12 months, visiting Asia and the Southern Hemisphere before returning to the U.S. for the winter. On the way a few surface proteins change and the inoculations must be adjusted accordingly. Every February, the CDC sits down with the vaccine makers – now only Chiron and Aventis – and makes an educated guess at what strain will emerge the following year.

Then everybody makes the same vaccine. This lack of competitive diversity might seem like a weakness in itself, although Dr. Offit believes the system works. “By and large, the CDC has done a pretty good job of picking the right strain,” he said.

Painful Production

The difficulty comes if there’s a mild flu season or for some other reason people don’t want the vaccine. Then the manufacturers have to discard the vaccine and swallow their losses. “I’ve suggested that the government incentivize private industry by negotiating a fair price for a major portion of each year’s production and then promising to buy the last 10% as well,” said Poland. “That would reduce some of the risk.”

Antiquated cultivation techniques are also regarded as a handicap. Influenza vaccine manufacture hasn’t changed much since the mid-20th century, when companies such as Aventis set up production in Pennsylvania because that state has the nation’s largest production of eggs. Viruses are grown by injecting them one-by-one into raw eggs, with each egg producing three to five doses. From start to finish, manufacturing takes six months and any number of things can go wrong. A bacterial contamination prevented distribution of Chiron’s vaccine this year. There’s plenty of talk about modernizing the system – growing viruses in mammalian cell cultures, for example – but so far nothing has happened.

One problem is the exacting standards set by the FDA. It’s not that safety standards shouldn’t be upheld, but it takes a long time to figure out whether a new method will work. In 2002, Congress heard testimony from Wayne Pisano, executive vice president of Aventis Pasteur North America, the company that will be the nation’s sole supplier of flu vaccine this year.

At the time, Pisano was explaining the previous shortage of DTP vaccines. “The schedule for the removal of thimerosal from the vaccines was decided on without input from industry,” said Pisano. “If changes are required before we can make them and the FDA can approve, shortages will occur. Science, not manufacturing, is the limiting factor in developing new vaccines.”

“More Trouble Than It’s Worth”

Finally, as with any public good, there is a certain amount of free riding. If the flu season is mild, people tend to skip their shots. Just having a lot of other people vaccinated will slow the progress of a virus and provide protection to people who aren’t vaccinated. “People are prepared to spend thousands of dollars a year on a treatment once they contract a disease, but will balk at paying modest sums to prevent it from happening,” complained Pisano.

Yet at the same time, companies are reluctant to have Congress mandate flu vaccines for everyone, since that would probably lead to a low mandated price. “The Childhood Immunization Act of 1992 has led to government purchase of nearly 20% of every year’s output,” said Poland. “But the government’s price barely covers the costs. The only place vaccine companies make money is in the private market.”

All this led David Brown of the Washington Post to write a story claiming that drug companies have abandoned the flu vaccine because it’s “more trouble than it’s worth.”

Fears of Lawsuits

Historically, lawsuits also have played perhaps a key role in winnowing down the competitors. When “swine flu” broke out at Fort Dix, N.J., in 1976, Congress decided to inoculate the entire country. It was astonished to find the insurance companies would not participate. They claimed lawsuits from people who would inevitably experience bad reactions would wipe out any profit margin.

The Congressional Budget Office went to work and came up with a prediction that of 45 million Americans inoculated, 4,500 would file injury claims, resulting in 90 damage awards totaling $2 million. The insurance companies still refused to bite, so Congress provided the insurance instead.

As Peter Huber recounted in Liability: The Legal Revolution and Its Consequences, the first part of the CBO’s estimate proved to be uncannily accurate. A total of 4,169 damage claims were filed. However, 700 – not 90 – of these suits were successful and the total bill to Congress came to over $100 million, 50 times what the CBO had predicted. The insurance companies knew what they were doing.

Another episode Dr. Offit noted is the pertussis vaccine scare of the 1980s. In 1974, a British researcher published a paper claiming that the whooping cough vaccine had caused seizures in 36 children, leading to 22 cases of epilepsy or mental retardation. Subsequent studies proved the claim to be false, but in the meantime Japan canceled inoculations, resulting in 113 preventable whooping cough deaths. In the United States, 800 pertussis vaccine lawsuits asking a total of $21 million in damages were filed over the next decade. The cost of a vaccination rose from 21 cents to $11.

A Flawed Process

Every American drug company dropped pertussis vaccine except Lederle Laboratories. “The company was punished for its persistence,” Dr. Offit writes in his book. In 1980, Lederle lost a single liability suit for the paralysis of a three-month-old infant – even though there was little evidence implicating the vaccine. The damages were $1.1 million, more than half the company’s gross revenues for sale of the vaccine that year.

“These scares may have no scientific basis, but they tend to take on a life of their own in the courtroom,” said Dr. Offit. Peter Huber’s second book, Galileo’s Revenge, which coined the term “junk science,” had a tremendous impact in cleaning up evidentiary procedures. Plaintiffs no longer win damages on the “impact theory of cancer” or for “chemical AIDS.” But the general problem persists.

To protect the vaccine manufacturers, Congress set up the National Vaccine Injury Compensation Program (NVICP) in 1986. Like Workers’ Compensation, the bill created a national fund to compensate legitimate injuries. In exchange, the injured party gave up the right to sue the manufacturer. Unlike Workers’ Comp, however, the program was not mandatory. Injured parties and their lawyers retained the right to sue if they were not satisfied with the verdict of the National Vaccine Injury Board. “The result has been that the most obvious cases are compensated, while the most unlikely claims go back to court,” said Dr. Offit. “The manufacturers still have to defend themselves.”

The Anti-vax Movement

The thimerosal episode – currently the biggest sword hanging over the vaccine manufacturers – has completely bypassed the National Vaccine Injury process. Thimerosal is a preservative containing slight traces of mercury that has been added to vaccines since the 1930s. In the late 1990s, speculation arose that the mercury exposure to infants might be causing brain damage. This was soon linked to what was described as an “epidemic of autism.”

Lawyers quickly circumvented the NVICP by arguing that thimerosal was an additive and not the vaccine itself. At present there are 300 pending lawsuits asking several billion dollars in damages – more than the net worth of the entire vaccine industry. There are now numerous “Vaccine Liberation” organizations and several national directories of law firms looking for clients.

“There have been four large epidemiological studies that have looked for a connection between vaccines and autism and found nothing,” Dr. Offit said. “There’s absolutely no scientific evidence.”

“Back Where We Started”

Try telling that to angry parents saddled with the costs of raising autistic children. In 2001, U.S. Representative Dan Burton, chairman of the House Government Reform Committee, held hearings that widely publicized the claims.

With such passions afoot, there is a serious question of whether childhood vaccination programs can continue to be successful. “People forget that 100 years ago, bacterial and viral infections were the number one cause of death,” said Dr. Offit. “The reason the average lifespan was 45 in 1900 and nearly 80 today is because we’ve been successful in conquering infectious diseases. If people start refusing to take shots – or if the manufacturers will no longer supply them – we’re going to be right back where we started.”

*The views and opinions expressed in this article are those of the author and do not necessarily reflect the views or opinions of The New York Academy of Sciences.*

Also read: Law Experts Give Advice for Scientific Research

About the Author

William Tucker is a writer for The American Enterprise.

Merging Modern and Ancient Medicines

A wooden mortar and pestle with various herbs.

An Interview with Albert Y. Leung, a pharmacognosist who uses modern medical science to study the mechanisms—or active components—of herbs.

Published September 30, 2004

By Dan Van Atta

Image courtesy of iMarzi via stock.adobe.com.

To Albert Y. Leung, the benefits of Western medicine and those of medicinal herbs and other “natural” remedies are by no means mutually exclusive. Born and raised in Hong Kong, Leung grew up experiencing the power of traditional approaches to medicine used for centuries in China.

“My great grandfather on my mother’s side was a local doctor in his little village,” recalls Leung, a member of The New York Academy of Sciences since 1976. “While I never knew him, my grandmother knew a lot about herbs. I grew up taking herbs.”


For three decades, Leung has used the tools and knowledge of modern medical science to study the mechanisms—or active components—of herbs. He is helping to understand what makes them effective in reducing certain aches and pains, as well as alleviating other symptoms of illness.

“I knew that for certain problems herbs were effective,” Leung said, “but then no one really understood why they worked. Now we know that many herbs contain active ingredients that are antioxidant or anti-inflammatory agents.”

Leung obtained a BS degree in pharmacy at the National Taiwan University before coming to the United States in 1962. He earned his MS and PhD in pharmacognosy at the University of Michigan, in Ann Arbor.

Part Scientist, Part Entrepreneur

Moving to Glen Rock, New Jersey, in the late 1970s, Leung created AYSL Corp., an information company that, he says, “probably holds the most extensive collection of Chinese journals in a single location outside of China” covering traditional Chinese medicine. He also edited the Encyclopedia of Common Natural Ingredients Used in Food, Drugs, and Cosmetics. Hailed as the most authoritative reference for natural ingredients in commercial use, it is now entering its third edition.

In the past 30 years dietary supplements and “health foods” based on “natural” ingredients have become a major industry. Leung said he is concerned about the safety and efficacy of many products sold as herbal extracts.

“The major problem is that everyone claims their product is the best,” he said, “but there is no real science behind it, no real controls. To say that a product is standardized doesn’t mean much when, for many of these products, the active ingredient is not known.”

In 1996, Leung founded a second company, Phyto-Technologies, Inc., to specialize in herb research. Phyto-Technologies manufactures and custom formulates Chinese herbal products for private-label distribution. With facilities in Glen Rock, New Jersey, and Woodbine, Iowa, the company now has 20 employees. Leung serves as president and chief executive officer.

“My approach is to provide the quality control needed to make the extracts the way they are supposed to be made,” Leung explained. “Certain herbs have to be extracted by traditional methods, such as boiling in water or soaking in alcohol. In the past four or five years we’ve developed some more technical aspects, but our approach is to combine appropriate science with the traditional methods necessary to retain the total benefits of traditional Chinese herbs.”

A Major Headache

Leung is currently engaged in the third year of a research study of the herb feverfew (Tanacetum parthenium Schultz Bip.) for use in migraine prevention. His company has been awarded a Small Business Innovation Research grant by the National Center for Complementary and Alternative Medicine to conduct the study, for which he is the principal investigator.

The second year of the phase II grant, “Reproducible Feverfew Preparations for Migraine Trials,” is fully funded at $690,337. Dennis V. C. Awang, of MediPlant, Inc., an expert in the chemistry of feverfew, is the co-principal investigator. Funding for both phases of the three-year project comes to about $1.4 million.

Leung’s main objective is to characterize and to standardize feverfew preparations that have the greatest potential for use in human clinical trials for relief of migraine. During the past 20 years four clinical trials have yielded positive results in migraine prevention. Three of the trials used dried feverfew leaf powder, and one used a CO2 supercritical fluid extract (SFE). However, another trial—using a 90% ethanolic extract (by prolonged extraction), containing high levels of parthenolide (0.35%)—produced negative results.

“These results indicated that parthenolide is not the active principle of feverfew in migraine prevention, as previously assumed,” Leung said. The researchers then used chromatographic and spectrophotometric profiling and bioassay and gene expression assay techniques to define and isolate the potentially active components present in the dried leaf and the SFE, but absent in the prolonged extract.

Further studies are now in progress to characterize potential active components, Leung said. “Pilot batches of materials standardized to contents and physicochemical profiles of these components will be prepared and further subjected to activity verification by bioassay and gene expression assay,” he added.

The Researcher as Communicator

These materials “will then be subjected to clinical trials.” If all goes well, Leung said, the work would result in a safe, effective over-the-counter drug for migraine.

In the meantime, Leung continues to see his role as one of communicator as well as researcher. In 1995 he published another book, Better Health with (Mostly) Chinese Herbs & Foods. He also serves as an advisor to the Modernizing Chinese Medicine International Association, headquartered in Hong Kong. In addition to conducting research and writing books about herbal medicine, Leung produces a newsletter on the subject as well.

“There are a lot of aspects of modern medicine that are superior,” commented Leung, “but there are many common ailments that modern medicine still does not understand and is unable to treat. And there are herbs that work to reduce aches and pains—even though we may not know the active ingredients that make them work. I think the two forms of medicine should be used side by side.”

Also read: A New Look at an Ancient Pain Remedy