
A New Approach to Modern Nutrition using Dietetics

An array of food items including oranges, potatoes, broccoli, onions, tomatoes, garlic, salmon, and more.

Learn about the long history of knowing about our food, our bodies, and ourselves and how that applies to our contemporary diets and lifestyles.

Published March 14, 2011

By Stephanie B. H. Kelly

Image courtesy of kerdkanno via stock.adobe.com.

The authority to make claims about food and bodily knowledge was not always so squarely situated in the hands of professional physicians and nutrition experts. In fact, as Harvard historian of science Steven Shapin explained in his January 26, 2011 talk at The New York Academy of Sciences (the Academy) titled “You Are What You Eat: The Long History of Knowing about Our Food, Our Bodies, and Ourselves,” Europeans from antiquity to the early modern period, while seeking advice from experts in many aspects of their daily lives, remained very much their own physicians.

These Europeans lived in the era of a branch of Western medicine known as dietetics. Unlike its sibling disciplines of diagnosis, prognosis, and therapeutics, dietetics was “not just called into play when [one] was manifestly unwell.” Instead, this particular and, it should be noted, remarkably stable body of knowledge focused on maintaining as well as restoring health through balance and moderation of the elements at play. Intricately connected to daily routines, behaviors and practices, dietetics was associated with the concepts of “regimen” and “hygiene,” and it prescribed an ordered, balanced life for its adherents.

These values might resemble, for example, aspects of certain modern cultural or religious traditions, but they are a far cry from the realm of modern medicine and present-day nutritional science, according to Shapin. Whereas a modern nutrition expert might advise you to “reduce your fat intake,” and a modern physician might prescribe medications to help you reduce your cholesterol, a 17th century dietetics expert or a reader of dietetic texts (or “dietaries” as they were known) might suggest you eat foods that match your “temperament,” including high-fat foods as long as they taste good.

The History of Dietetics

Delving into the history of dietetics is a fascinating journey in its own right, but there are other motivations, as well, for studying a set of historical practices now so divorced from our own lives. If the history of dietetics provides us no useful framework for considering our present reality, why study it at all? The answer is relatively simple, as Shapin made clear.

Understanding the history of dietetics tells us a great deal about how people have thought of themselves in relation to their food and their surroundings. Furthermore, because of its unique blending of ethical and instrumental authority, dietetics and the course of history from that body of knowledge to modern medicine can help us comprehend how European societies came to segregate knowledge into the prescriptive and the descriptive, separating “what ought to be” from “what is.”

As scientific concepts of nutrition took over in the early 19th century, substantial authority was invested in these new scientific disciplines to speak about aspects of daily life that were formerly the domain of dietetics. How and why this change occurred are fundamental questions to the study of our own contemporary relationship to our food, to our bodies, and to the experts who study them. And so, as it turns out, the premise of our earlier question is flawed: the history of dietetics has much to say about our current condition.

What is (or was) Dietetics?

During his talk, Steven Shapin had the unenviable task of rendering familiar the vocabulary and the culture of dietetics, which was so pervasive as to be a “coordinating mechanism [for many aspects of life] in the late Middle Ages, even for ‘non-experts,’” but which today is a worldview entirely foreign to our own. He accomplished this feat by relating for his audience the fundamental principles of dietetics and by explaining the intuitive and perceptual ways individuals could access dietetics knowledge.

An 18th century illustration showing the different “complexions” (or temperaments: sanguine, phlegmatic, melancholic, and choleric) that result when each of the four humors dominates.

At its most basic, dietetics relied on the notion of four principal “elements” (air, water, earth, and fire) composing the physical world—everything from rocks to flies to humans to food. These four elements mapped onto four bodily “fluids,” or humors, respectively: the sanguine humor (blood), the phlegmatic (phlegm), the melancholic (black bile), and the choleric (yellow bile). To these fluids belonged certain qualities in different combinations.

For instance, yellow bile was believed to be both dry and hot, the qualities of the element fire. In the combinations of the four qualities (hot, cold, moist, and dry), one could find four personality types, one for each of the dominant humors. To this day, we can describe someone prone to anger as both “hot-headed” and “choleric,” reflecting the cultural legacy of these concepts.

Adjusted for Each Person’s Specific Temperament

These qualities and personality types (also called “temperaments” or “complexions”) were maintained in health through balance and moderation. One should, according to dietetic advice, consume “hot” and “dry” foods to restore health when sick with “cold” and “moist” illnesses. To maintain health, one should match the food one consumes with one’s temperament. For instance, those of choleric tendencies should eat mostly foods with “hot” and “dry” qualities.
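Purely as a modern illustration, and not anything presented in Shapin’s talk, the humoral scheme behaves like a small lookup table governed by two rules: counterbalance an illness’s qualities, and match a healthy person’s temperament. A minimal sketch in Python, with all names invented for this example:

```python
# Illustrative sketch of the humoral scheme described above; all structures
# and names are modern inventions for clarity.

HUMORS = {
    "sanguine":    {"element": "air",   "qualities": {"hot", "moist"}},
    "phlegmatic":  {"element": "water", "qualities": {"cold", "moist"}},
    "melancholic": {"element": "earth", "qualities": {"cold", "dry"}},
    "choleric":    {"element": "fire",  "qualities": {"hot", "dry"}},
}

OPPOSITES = {"hot": "cold", "cold": "hot", "moist": "dry", "dry": "moist"}

def foods_to_restore_health(illness_qualities):
    """Rule 1: counterbalance an illness with foods of the opposite qualities."""
    return {OPPOSITES[q] for q in illness_qualities}

def foods_to_maintain_health(temperament):
    """Rule 2: match the qualities of one's own temperament."""
    return HUMORS[temperament]["qualities"]

# A "cold" and "moist" illness calls for "hot" and "dry" foods...
assert foods_to_restore_health({"cold", "moist"}) == {"hot", "dry"}
# ...while a choleric person maintains health with "hot" and "dry" foods.
assert foods_to_maintain_health("choleric") == {"hot", "dry"}
```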

And because dietetics considered everything in terms of these same four qualities, it provided an integrative and coherent vocabulary that allowed for the description of relationships between diseases and their treatments, or people and their food, for example. This vocabulary and its corresponding culture of practice reached far into the fabric of everyday life; according to Shapin, “it belonged to you in ways that other science [does] not.” Dietetics advice could involve proscriptions against certain types of foods, but it could also venture into areas quite beyond the domain of modern nutritional science.

Shapin spelled out the areas on which a dietetics expert could advise: in effect, every aspect of someone’s life that could be altered in some way. These “six things non-natural” consisted of one’s airs (one’s surroundings), food and drink, motion and rest, sleep or waking, evacuations (including excretion, retention, and sexual release), and the passions of one’s soul (the emotions).

This list broadly encompassed many of the practices of everyday life and could include where someone should go on vacation or which direction his house should face, to name a few examples. Dietetic prescriptions were adjusted for each person’s specific temperament, so that a melancholic person (with a cold and dry temperament) was not constantly trying to counterbalance his or her temperament with sanguine treatments or vice versa.

The Good and the Good for You: The Instrumental and Ethical Authority of Dietetics

At this point it might seem that dietetics, insofar as it involved the treatment of disease through prescriptions for behavior changes, is not that different from aspects of modern primary care. However, dietetics can be distinguished on a number of fronts from the body of knowledge and the culture of practice belonging to modern medicine. Chief among these is “the way dietetics stood with respect to what was medical or scientific on the one hand and [to] what was moral on the other,” noted Shapin. For, as he explained, dietetics sat at the boundary of description and moral prescription.

Dietetics advice was therefore both instrumental and ethical in nature. The dietetic values of temperance and moderation, expressed in prescriptions such as “live to the golden mean” or “nothing too much,” did not just resemble the four cardinal virtues of prudence, justice, fortitude, and temperance; they affirmed them. Dietetics told people not just what was good for them (nutritionally, physically, mentally, etc.), but also what was good morally. Those who followed such direction would benefit materially, through the maintenance of their health, but also morally, through the affirmation of ethical truths.

“You Are What You Eat”

“You are what you eat,” the phrase that began Shapin’s talk, takes on a level of meaning in this dietetics context beyond what is available in modern nutrition science’s vocabulary of “constituents.” People were characterized in the same way as their food, and just as knowing the correct description of someone’s qualities would lead you to the right description of their nature, knowing the right descriptors (hot, cold, moist, dry) for their food would tell you its nature as well. Knowing the nature of your food would allow you to predict how it would interact with your digestive system and whether it would make you sick, among other aspects of its behavior.

Furthermore, this analysis could be used to understand not simply the nature of people in the present, but also how they retained that temperament over time and what practices might drastically disrupt that nature in the future. In this way, custom was thought of as a second nature because people believed that a lifetime of custom, of eating behaviors and the like, could remake someone’s nature. In this framework, one had good reason to fear radical changes to lifestyle or drastic imbalance of the humors, and maintenance of that balance through dietetics knowledge became all the more important.

Dietetics and Expert Knowledge

Given the implications of a humoral imbalance, early modern Europeans clearly needed a reliable source of dietetic knowledge, but that source could be found surprisingly close to home. Although the “dietaries” written by dietetic experts were one avenue of access to dietetics knowledge, self-knowledge and experiential knowledge were both extremely important parts of the culture of dietetics.

In an era when independent testing of blood pressure, heart rate, or any of the metrics so fundamental to modern primary care was unheard of, even the most expert physicians were very much dependent on patients’ accounts of their own behaviors and symptoms. Even the treatments for illness were largely in patients’ hands, as medications took a back seat to lifestyle adjustments.

But early modern Europeans could get by without seeking advice from experts at all, explained Shapin. In most instances, non-experts could reliably deploy dietetic knowledge simply by engaging in what Shapin called “analogical forms of reasoning” to relate the superficial (or obvious) characteristics of food to the qualities of its nature and consequently, to the food’s fate as it was digested, to whether it would make a particular person unwell. Most of these superficial qualities could be reliably ascertained through intuitive descriptions of the food’s texture, general appearance, and so on. A cantaloupe, for example, would be considered “cold” and “moist,” and a chili “hot.”

Quod Sapit Nutrit

Of the ways to determine whether and how a particular food would match one’s own qualities, taste was crucial. The Latin saying “quod sapit nutrit,” meaning “if it tastes good, it’s good for you,” was not a playfully insincere justification for indulgence, as it might be today, but was instead a rule of thumb for assessing the goodness of food. In general, an aliment agreed with someone (that is, it sat well on the stomach and made him well rather than ill) if its qualities matched his temperament, and the same principle applied in the mouth.

What tasted good on the tongue clearly matched the tongue’s, and therefore the person’s, nature. In this framework, the perceived qualities of a food and the nature of that food were identical; the categories of being and of experience were one and the same. And more importantly, as Shapin put it, “the tongue [was] a reliable philosophical probe into the nature of things.” Never since has the role of the tongue been quite as authoritative as it was in dietetic culture.

The Rise of “Constituent” Descriptions

Smell, taste, and other everyday experiential ways of knowing have become “philosophically devalued” with the disappearance of dietetics from formal medical training in the 1810s and 1830s and eventually from lay consciousness, according to Shapin. Although they are the vehicles for the study of art and beauty through connoisseurship, these senses are no longer thought to be reliable probes into the nature of things. Dietetics and its ontology of “qualities” gave way to the beginnings of modern nutritional science and, with it, to a world of “constituents,” such as proteins, carbohydrates, and fats.

Such a drastic change, first in the content of professional medical education and then in the public understanding of concepts as fundamental as which foods are beneficial, did not happen overnight. To begin to think of foods in terms of what they were made of, instead of in terms of their qualities, took a great deal of adjustment. Initially, medical doctors trained in the burgeoning discipline of chemistry began to describe foods in terms of their sweetness, alkalinity, acidity, and so forth. These descriptions were constituent-based but were nonetheless accessible in some respects to the lay public through personal observation and sensory experience.

Fuel versus Non-fuel

William Prout in the 1810s and Justus von Liebig in the 1830s, both chemists by training, helped move the authority to make claims about foods’ components further out of the reach of the untrained public. Liebig in particular thought of the body as a kind of engine in which digestion was one mechanical process among many. And, like an engine, the body required materials to construct it and “fuel” to run it.

The distinguishing feature of “non-fuel” for him was the presence of nitrogen—an assessment that could clearly only be made by chemical examination. Of course, today this category of “nutrients” includes much more than nitrogen-containing substances, but it is no less in need of chemical interpretation than it was in Liebig’s era. Ideas of fuel have not left public discourse, though from that century to this, calories have come to mean “the power of food” rather than something contained within it.

Evidence for the present-day centrality of scientific interpretation can be found in what Shapin called “one of the great artifacts of the role of the state and the relationship of expertise—the Nutrition Facts label.” This label, required on all processed, purchasable foods, indexes “the achievements of nutritional science,” and most of its details are known to the consumer only on the condition of expert knowledge.

Without the testing and re-testing of foods by scientific experts, we cannot know, for example, the iron content or carbohydrate content of the foods we consume. And this reliance on one set of experts cements our dependence on yet another set—we cannot be our own physicians in the same way as were early modern Europeans, so we must turn to nutrition experts and other medical professionals for advice.

The New Vernacular of Scientific Constituents Shows the Power of Modern Science

An example of a nutrition label showing the constituents of a package of pork rinds.

Despite our reliance on the authoritative accounts of modern medicine, we still use this scientific language, as in “I have to watch my cholesterol,” as though there were no expert intermediary to our knowledge of our foods and our bodies. In fact, as Shapin understands it, this scientific language is part of who we are and of how we understand ourselves, though that was not always the case. He explained that the rise of scientific expertise to dominance played a role in reconstituting how we understand ourselves. As he put it, “We are what scientists say. We didn’t used to be.” Without criticizing this reality, he went on to argue that this new vernacular of scientific constituents shows the power of modern science.

He clarified, however, that there are actually two ways to think about this power. One way is to consider the rise of constituent descriptions as a reflection of the medicalization of modern life—our lives as we understand them are more medical than they once were. But, Shapin qualified, it is also possible to see this “power” as an indication that the “reach of modern science is actually less than it once was.”

Under dietetics, food and drink composed just one of six aspects of daily life over which medical advice (expert or not) reigned. And, whereas modern medicine is adamantly instrumental, advancing physical but not moral well-being, dietetics advice had moral dimensions. Today, we might look to religion or philosophy to tell us what is morally right; dietetics shows how science (of a sort) once dominated these spheres, too.

Then and Now: What Has Changed?

Shapin concluded his talk by asking how we might begin to think about the big picture, the fate of dietetics and the rise of nutritional science to replace it. How, he asked, can we describe what has changed and where that leaves us in relation to our bodies, our food, and our physicians? Two historical trends can guide an answer to this question.

The first of these, the separation of expertise and lay knowledge, occurred concurrently with the increasing professionalization of medicine. Dietetics asked people to be their own physicians in many respects, requiring them to report on, to sense, and to manage daily routines through illness and health. In contrast to modern medicine, this system raised the degree to which each person monitored his own medical well-being on a daily basis.

Explored during Shapin’s talk as a key distinction between dietetics and nutritional science, the separation of ethical prescription from instrumental description was advancing in other areas of Western culture at the same time. As he explained, under dietetics “instrumental advice occupied much the same cultural terrain as moral advice,” and medical and moral assessments used the same sort of vocabulary and reached much the same conclusions as one another.

The division of the ethical from the instrumental in medicine belongs to the same historical move as the emergence of “the naturalistic fallacy” in philosophy around the turn of the 20th century. This fallacy says that, logically, one can’t move from an “is” statement to an “ought” statement—just because something is or has been a certain way, one should not assume it should (morally) be that way.

Balance and Moderation

Examples of dietetics-like advice found on fast food companies’ websites to advertise high-calorie foods.

Whereas the principles of balance and moderation once offered both the means to an end (health) and an end in and of themselves (moral health), modern science provides means, but not ends. For us, matching our consumption of healthy foods with an equal measure of unhealthy ones just for the sake of upholding balance would be nonsensical, and we would not dream of prescribing cantaloupe to someone simply because he was sluggish. Today we would not find, nor presumably welcome, a conversation with our nutritionist in which the concepts of sin, gluttony, and excess were deployed as medical advice.

But balance and moderation have not disappeared completely from public discourse about food. In fact, in a humorous turn, Shapin revealed that “what was once traditional dietetic counsel has become the property of agents in society with the least credibility” when it comes to food: fast food companies.

These companies use the idea of balancing healthy foods “with a little fun” to advertise the very foods nutrition experts would warn us against. Shapin’s parting remarks left the audience considering the role of dietetics principles in a world so very different from the 17th century world of their prominence: “Prudence, now an aid to profit, once the most authoritative dietary advice, now a cynical attempt to link the bad and the bad for you with the good and the good for you.”


Building a Bridge from Epidemiology to Nutrition

How does what I eat affect my long-term health and wellbeing? The bridge between epidemiology and nutrition provides a way to answer that question.

Published December 1, 2010

By Walter Willett, as told to Adrienne J. Burke

Image courtesy of yanadjan via stock.adobe.com.

I grew up in Wisconsin and Michigan in a family that has been dairy farming for generations. While studying at Michigan State University, I grew vegetables (sweet corn, tomatoes, squash) that I sold to local grocery stores to support my studies.

I started off there in physics and food science. Then, I went on to medical school but took several electives on nutrition-related topics. I spent one summer on a reservation in the Upper Peninsula of Michigan doing a health and nutrition survey. I was shocked that 50 percent of adults in our survey had type 2 diabetes, and the study demonstrated to me how it was possible to collect very interesting and useful data about people’s diets with a simple structured questionnaire.

My papers on how diet relates to long-term health and disease have made me the second-most-cited author in clinical medicine. Much of this work was conducted within the Nurses’ Health Study, which has provided a tremendous platform that continues to yield an expanding output of data as the subjects grow older.

The original focus of the study was breast cancer, but that allowed us to get funding to collect dietary data starting in 1980. The Nurses’ Health Study was the first large study to gather dietary data and follow a large number of people for many different outcomes. It’s also unique in having updated its dietary data every four years.

Diet and Cardiovascular Disease

Many of our findings flew in the face of conventional wisdom. In the 1970s, I was interested in the relationship between diet and cardiovascular disease, and people were being told very strongly, as though it were absolutely established truth, that we should avoid eggs and give up saturated fat to prevent heart disease.

When I dug into the literature supporting this, it was remarkably weak. In fact, there were no studies that showed that people who ate more eggs had higher risk of heart attacks, and the few small studies showed no relationship. It became apparent to me that a strong body of empirical evidence was needed if we were going to be giving guidance to individuals or the public.

During that time, several epidemiologists were documenting that rates of many cancers around the world varied tremendously. For example, the rates of breast cancer in post-menopausal women in Japan were only about one-eighth of those in the U.S.  That obviously provokes the question, why? When I was in medical school, no one really asked why these things were happening, why people get cancer. When I went to the Harvard School of Public Health, people in the Department of Epidemiology were asking those questions. 

The Department Chair at the time, Brian MacMahon, said there were some suggestions that diet might be important in the cause or prevention of cancer. That sounded like a pretty radical statement. The evidence was very scattered and not very strong, but the topic seemed worth investigating. What has unfolded has been surprising. Many findings were highly controversial at first, but they have been replicated repeatedly and are accepted now.

Research Leads to Regulation of Trans Fats

There had been a belief that the percentage of calories from fat in the diet was the main reason why breast cancer rates were higher in the U.S. than in Japan and in developing countries. That idea turned out to be not supported by the data. Trans fats appeared early on as a problem. Experts in the cardiovascular field had been telling people to replace butter with margarine and Crisco to reduce cholesterol and saturated fat. But it turned out that those foods were very high in trans fats and were even worse than the foods they were meant to be replacing. 

I was attacked, but most of these findings have become accepted with time. It took about 10 years to get the FDA to require that trans fats be included on food labels. Just a few weeks ago, The New England Journal of Medicine published a letter by one of our junior colleagues showing that in prepared foods, restaurant foods, and major national chains, the amounts of trans fats have been reduced by about 90 percent. There’s been a huge change in the last three or four years in the national food supply and probably in everybody’s body. If you actually stuck in a needle and analyzed your tissues, you’d find a big difference.

In the field of nutrition, the tools did not exist to answer the most important question: How does what I eat affect my long-term health and wellbeing? The bridge between epidemiology and nutrition provides a way to answer it.


A Non-invasive Way to Prevent the Return of Fear

A hand completes the final piece in a puzzle that is the shape of the human brain.

Four members of The New York Academy of Sciences (the Academy) in New York University’s psychology and neural science departments reported their newly developed technique in the journal Nature.

Published January 15, 2010

By Adrienne J. Burke

Researchers in the New York University Department of Psychology and Center for Neural Science, Daniela Schiller, Elizabeth Phelps, Marie Monfils, and Joseph LeDoux, have developed a noninvasive technique to block the return of fear memories in humans. The technique, reported in the December 9 edition of Nature, may change how memory storage processes are viewed and could lead to new ways to treat anxiety disorders.

The four Academy members and colleagues showed that reactivating fear memories in humans allows them to be updated with non-fearful information, a finding that was previously demonstrated in rodents. As a result, fear responses no longer return.

Finding the Reconsolidation Window

The experiment was conducted over three days: the memory was formed on the first day, rewritten on the second day, and tested for fear on the third day. To examine how enduring this effect is, however, a portion of the participants was tested again about a year later. Even after this period of time, the fear memory did not return in those subjects who had undergone extinction training during the reconsolidation window. These results suggest that the old fear memory was changed from its original form and that this change persists over time.

“Our research suggests that during the lifetime of a memory there are windows of opportunity where it becomes susceptible to be permanently changed,” says Schiller. “By understanding the dynamics of memory we might, in the long run, open new avenues of treatment for disorders that involve abnormal emotional memories.”

Phelps added, “Previous attempts to disrupt fear memories have relied on pharmacological interventions. Our results suggest such invasive techniques may not be necessary. Using a more natural intervention that captures the adaptive purpose of reconsolidation allows a safe way to prevent the return of fear.”


Vaccine Strategies for Developing Countries

A mosquito bites a person.

Exploring vaccines for dengue, meningococcal, and pneumococcal diseases, all of which pose a major public health challenge in regions like Southeast Asia, Africa, and Latin America.

Published August 11, 2009

By Theresa M. Wizemann

Despite the availability of life-saving vaccines, communicable diseases remain a major public health problem around the world. There is a gap between when vaccines and technologies become available in industrialized countries, and when they are implemented in countries that need them most, where the disease burden is the greatest. This problem persists because of challenges in research and development, prioritization, regulation, funding, infrastructure development, implementation, and surveillance and monitoring.

A May 20, 2009, symposium hosted by The New York Academy of Sciences (the Academy) looked closely at these issues, focusing on meningococcal and pneumococcal diseases, dengue, and dengue hemorrhagic fever. In addition to explaining the state of the art in vaccine development for these diseases, speakers offered several strategies for protecting populations from vaccine-preventable disease: achieve high immunization coverage rapidly and across age groups with the highest disease burden; assess waning immunity; and do not underestimate carriage reservoirs that can reintroduce the organism into the population.

They also discussed the importance of public–private partnerships to ensuring timely, equitable, and sustainable delivery of lifesaving vaccines to developing countries. Some aspects of successful strategies include: regional demand estimates and bulk purchasing, allowing for lowest price negotiation; purchase commitments and annual contracts, allowing manufacturers to plan for production, delivery, and financial investment; and lines of credit to countries, ensuring uninterrupted delivery of vaccines.

Closing the Gap

Despite advances in treatment and prevention, communicable diseases remain a major public health problem in much of the world. Thirty-two percent of all deaths worldwide are due to infectious diseases, and there is a stark inequality in the global disease burden of vaccine-preventable diseases. The developing world, including regions of Southeast Asia, Africa, and Latin America, bears the greatest burden, in part because life-saving vaccines that are part of routine care in industrialized countries still have not been broadly implemented in these regions.

On May 20, 2009, leading scientists from academia, industry, and nonprofit organizations gathered at the Academy to discuss the unique challenges of developing, evaluating, funding, and delivering vaccines for infectious diseases to those who need them most. The symposium, moderated by Albert Ko of Weill-Cornell Medical College, focused on global efforts to prevent meningococcal and pneumococcal diseases, dengue, and dengue hemorrhagic fever.

The challenges of providing these vaccines to the poorest of the world span research and development, prioritization, regulation, funding, infrastructure development, implementation, and surveillance and monitoring. To close the gap, all of these areas must be dealt with concomitantly.

Dengue, Once Nearly Eliminated, Re-emerges with a Vengeance

Dengue hemorrhagic fever is often classified as an emerging infectious disease, Scott Halstead of the Pediatric Dengue Vaccine Initiative (PDVI) said. The often fatal disease has been amplified by factors of the contemporary world. Global population growth, rural-to-urban migration, and deterioration of cities provide breeding space for the dengue vector, the Aedes aegypti mosquito. Jet travel by people unaware they are infected allows the virus to relocate rapidly. In addition, following eradication of A. aegypti from the American tropics in the 1960s in response to yellow fever, the number of vector control experts dwindled and Aedes has reestablished itself in an even wider range than before.

There are four dengue viruses, all endemic in tropical regions of the world. Certain populations, including Africans, appear to have a resistance gene and experience severe disease less frequently. Most observable disease in Africa is in expatriate residents. About half the world’s population, about 3 billion people, live in the tropical regions where the dengue vector thrives. Conservatively, 50–100 million people are infected with dengue each year and about 10% have overt symptoms ranging from mild disease to fatal hemorrhagic fever. The economic impact of dengue illness is considerable.

Countries in yellow are home to the dengue virus, and approximately 3 billion people are at risk.

The majority of severe disease results from secondary infections in those with circulating antibody from a previous infection. Primary infection in infants can also be severe, progressing as if it were a secondary infection due to the presence of maternal antibody. Antibodies interact with dengue virus to form immune complexes that are internalized by monocytic cells, resulting in a complicated phenomenon called intrinsic Antibody Dependent Enhancement (ADE).

In the presence of enhancing antibody (i.e. circulating antibody below protective levels) the number of cells that are infected is greater, and the viral output of each cell is increased. A recent study suggests that while the interferon system is generally effective in controlling infection, in the presence of dengue antibody, immune complexes suppress the interferon system, resulting in more severe disease.

Given the lack of effective vector control and the vast population at risk for contracting dengue, a vaccine is the best hope for controlling the disease. Halstead highlighted five products in development. ChimeriVax-Dengue is a tetravalent dengue vaccine engineered by Sanofi Pasteur by inserting structural genes from the four dengue viruses into an attenuated yellow fever virus. A second tetravalent vaccine, being codeveloped by GSK and the Walter Reed Army Institute for Research (WRAIR), is composed of native whole dengue viruses attenuated by serial passage.

Two other chimeric tetravalent vaccine candidates are being developed by NIH in collaboration with Johns Hopkins University, and by the CDC in collaboration with Inviragen, based on attenuated dengue 4 and dengue 2 viruses, respectively. Finally, Hawaii Biotech has expressed the four dengue envelope genes in yeast cells. One of the challenges of vaccine development, Halstead said, is targeting the age when maternal antibodies decline sufficiently to allow a live attenuated vaccine to be effective in infants. Achieving herd immunity through mass vaccination is another approach to better protect infants, and to avoid ADE.

The PDVI supports the development of dengue vaccines, prepares field sites for vaccine trials, works to improve and standardize dengue diagnostic tests, and helps prepare regulatory and manufacturing standards. PDVI is also working to better understand the burden of illness and strengthen surveillance, and upon approval of a vaccine, it will advocate for introduction and use, addressing issues of access.

Meningococcus: Preventing Outbreaks and Controlling Endemic Disease

While dengue virus infects only those who come in contact with its mosquito vector, nearly all humans are colonized with Neisseria meningitidis at some point. Emil Gotschlich of the Rockefeller University said that about 10% of the population harbors meningococcus asymptomatically in the nasopharynx in winter months, but in schools, prisons, and the military, the carrier rate can be as high as 60%. Since WWII, there has been a relatively low-level endemic pattern of meningococcal meningitis in the U.S., about 2 cases per 100,000 people, with occasional regional outbreaks.

With the essential elimination of Haemophilus influenzae meningitis as a result of vaccination, meningococcus is now the leading cause of bacterial meningitis in the U.S. Overseas, epidemic disease is a serious concern. Most prominent is the “meningitis belt” spanning sub-Saharan Africa. There, endemic disease is very common and major epidemics occur at frequent intervals. There is a rapid onset of cases in the winter months, as high as 500 per 100,000, which halts rapidly when the rainy season starts. This dependence on meteorological conditions is not fully understood.

The meningococcus is covered with a polysaccharide capsule which is required for virulence, and natural immunity is dependent on circulating antibody to the capsule. A relatively limited number of capsular serogroups cause invasive disease, including types A, B, C, Y, W-135, and X. The principle of vaccination with capsular polysaccharide was established by early studies of the pneumococcal vaccine, and trials of first-generation meningococcal polysaccharide vaccines in school-age children and adults established that the efficacy of vaccines against group A and C meningococci was 90%.

However, infants and young children under the age of 2, who are at the highest risk of meningococcal disease, do not respond immunologically to polysaccharide antigens. To address this, researchers focused on developing protein conjugate vaccines modeled after the current H. influenzae vaccines. The current meningococcal conjugate vaccine is tetravalent, covering groups A, C, Y, and W-135, and has proven highly effective in preventing outbreaks of meningitis and controlling endemic disease.

Conjugate vaccines are a marked improvement over polysaccharide alone, allowing for vaccination in the first years of life, achieving high levels of antibody at all ages, and effectively suppressing the carrier state. However, they do have shortcomings, including the need for boosters to maintain protective levels of antibody. Conjugate vaccines, if properly manufactured, are excellent. What is required, Gotschlich said, is leadership that devises and promotes programs to match the quality of the vaccine.

The African Meningitis Belt

Despite vaccination in response to epidemics, outbreaks of serogroup A meningococcal disease occur year after year across sub-Saharan Africa, explained Thomas Clark of the Centers for Disease Control and Prevention (CDC). Over 250 million people are at risk across the African meningitis belt, from childhood well into the fourth decade of life.

The Meningitis Vaccine Project (MVP) was created in 2001 to develop a meningitis vaccine that is effective and affordable for Africa, and ultimately, to eliminate epidemic meningitis as a public health problem in sub-Saharan Africa. A unique consortium of partners is contributing serogroup A polysaccharide, carrier protein, conjugation technology, and manufacturing capabilities. Clinical trials in Africans aged 2–29 years have shown that the novel meningococcal A conjugate vaccine, MenAfriVac, is immunologically superior to the existing tetravalent conjugate vaccine, Menactra. Comparable results were observed in children aged 12–23 months.

In the United States, infants under age 1 and adolescents aged 16–19 have the greatest disease burden. Since 2005, CDC has recommended meningitis vaccination at 11–12 years of age, but coverage in adolescents has grown more slowly than in infant programs, reaching only 30% of teens by 2007.

The U.K. experience is somewhat different. In 1999, in response to the increasing burden of serogroup C disease, the U.K. rapidly implemented immunization of infants, young children, and adolescents, achieving over 80% coverage across all ages by 2001. In 2008, for the first time, there were no deaths from serogroup C meningococcal disease in the U.K. A study of over 10,000 U.K. adolescents showed that vaccination reduced the carriage rate of serogroup C meningococci by two thirds within one year, resulting in herd immunity and a reduction in disease across all age groups, including in those who were not vaccinated.

The Meningitis Vaccine Project’s schedule for implementing mass vaccination across the African meningitis belt.

The MVP is modeled on the U.K. experience. Phase 1 is mass vaccination of the population under 30 years of age. Phase 2 includes a catch-up campaign every 5 years in children aged 1 to 5 years, and two infant doses as part of the regular childhood schedule. MenAfriVac has been submitted to the Indian regulatory agency for licensure and a decision is expected in late 2009. Upon approval, Burkina Faso will be the first country to implement vaccination, due in part to the burden of disease, and the ability to conduct surveillance and evaluation. Vaccination will then roll across the meningitis belt.

Even at a cost of 40¢ per dose and about $1 per person to administer, the 12 million doses needed in Burkina Faso are still prohibitively expensive for the country’s ministry of health. Fundraising is ongoing to be able to implement the mass vaccination phase before funding from GAVI is approved. Surveillance will be conducted for coverage levels, vaccine effectiveness, safety, carriage and transmission, outbreaks, and molecular epidemiology, including the emergence of other serogroups.
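For a rough sense of the sums involved, the figures above imply a campaign cost of roughly $17 million. A back-of-the-envelope sketch from the article’s numbers (assuming one dose per person; this is an illustration, not a figure quoted at the symposium):

```python
# Illustrative estimate from the figures above (assumes one dose per person).
doses = 12_000_000
cost_per_dose = 0.40        # vaccine price per dose, USD
delivery_per_person = 1.00  # approximate administration cost, USD

total = doses * (cost_per_dose + delivery_per_person)
print(f"~${total / 1e6:.1f} million")  # ~$16.8 million
```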

Pneumococcus: The Number One Cause of Vaccine-preventable Death

Like meningococcus, the primary virulence factor of Streptococcus pneumoniae is the polysaccharide capsule. Pneumococcus is carried asymptomatically in the nasopharynx, and causes clinical disease when it moves to other parts of the body. The most common manifestation is otitis media, infection of the middle ear. More invasive and severe outcomes include pneumonia, bacteremia, and meningitis. Conjugation of pneumococcal polysaccharide to a carrier protein elicits protective levels of functional opsonophagocytic antibody in infants and children.

There are over 90 pneumococcal serotypes, but not all are associated with disease, explained Emilio Emini of Wyeth Pharmaceuticals. The current conjugate vaccine, Prevnar, covers the seven serotypes that are responsible for 80% of invasive disease in children in the U.S.: 4, 9V, 6B, 14, 19F, 18C, and 23F. Manufacturing the 7-valent vaccine is extremely complex as it is, in essence, seven separate conjugate vaccines. In clinical trials, the vaccine was shown to be greater than 95% efficacious against invasive disease, and by 5 years after introduction, there was a 98% reduction in invasive pneumococcal disease due to covered serotypes in children. There was also a 76% reduction in disease in unvaccinated adults as a result of herd immunity.

Children in the U.S. have been routinely immunized since 2001; however, S. pneumoniae remains the number one cause of vaccine-preventable death in children worldwide. One million infants and a half million adults die every year, in large part because the vaccine is only now beginning to be introduced in developing countries.

The 7-valent vaccine was developed based on serotype prevalence in North America, and the two serotypes that cause epidemic outbreaks of disease in developing world countries, types 1 and 5, are not covered as they are not typically endemic in developed countries. Development of a 9-valent vaccine was undertaken to help address this, and two studies, in South Africa and Gambia, showed the vaccine to be highly effective. The 9-valent vaccine was not fully developed, however, as it became apparent that the epidemiology of the infection was shifting and coverage of more than nine serotypes was needed.

Efforts are now focused on developing a single second-generation vaccine that will be effective worldwide. A new 13-valent vaccine will include serotypes 1 and 5, as well as 3, 6A, 7F, and 19A. All clinical trials have been completed and regulatory approval is now being sought in the U.S. and the EU.

For a vaccine of this complexity, the biggest obstacle to effective delivery to the developing world is manufacturing and cost. The estimated number of pneumococcal vaccine doses required for worldwide use could be as high as 250 million per year. The burden is on biopharmaceutical scientists, Emini said, to improve processes and decrease costs. In addition, WHO requirements must be met before the vaccine is “pre-qualified” (i.e., eligible for purchase by agencies such as the UN that supply vaccines for developing countries). Cold chain requirements also add complexity to delivery and distribution. Emini also noted that political will in the countries in which the vaccine will be made available is required.

In 2009, Wyeth donated enough 7-valent vaccine to immunize every child in Rwanda, which has one of the highest incidences of pneumococcal disease and death. Given the significant unmet medical need, it was decided to begin an immunization program with the current 7-valent vaccine, and transition to the 13-valent as soon as it is licensed.

Breaking the Cycle of High Price, Uncertain Demand, and Limited Supply

Arthur Reingold of the University of California, Berkeley stressed the dramatic public health impact of the 7-valent pneumococcal vaccine, noting the decrease in disease due to antibiotic-resistant strains, the reduction in noninvasive syndromes such as otitis media, and the decrease in hospitalizations of children due to pneumonia. Children are the primary source of infection for adults, and childhood immunization has led to a significant decline in invasive disease due to the 7 vaccine serotypes in adults aged 65 and older. Remarkably, the indirect herd protection of older children and adults has prevented more disease, and saved more lives, than has direct protection of vaccinated children.

The 7-valent vaccine is administered as three infant doses and a fourth booster dose after age 1 year. But a three-dose schedule could result in a cost savings of 25%, free up global supply, and mean fewer injections for children. Immunogenicity studies have shown that three- and four-dose schedules produce similar antibody levels for most serotypes, and persistence of antibody is similar. The key issue, shown in a case-control study, seems to be that one of the doses must be a booster after the first year of life. Reingold noted that some countries in the EU are already using a three-dose schedule.

The WHO Strategic Advisory Group of Experts (SAGE) recommended introduction of the 7-valent vaccine in the developing world now, switching to the 13-valent when licensed. The challenge now is implementation. There is, in essence, a vicious cycle of high price, uncertain demand, and limited supply, Reingold said.

A cycle of structural pressures makes it difficult to get vaccines to where they are needed most.

PneumoADIP is working to break this cycle through the development of public–private partnerships. Eligible countries, those with annual per capita income levels of less than $1000, first commit to using the vaccine. Manufacturers, then assured of a market, commit to making and selling the vaccine at a lower price. GAVI and other funders commit to purchasing the vaccine at the guaranteed price for the countries, with a minimal per dose copayment from the country.

This approach, while assisting high-burden countries, does not assist mid-income level countries in the Americas that are not GAVI-eligible, yet cannot afford to pay full price for the vaccine doses they need to give. Some are considering cutting costs by administering fewer doses per child, the health impact of which remains to be seen.

Funding and Delivery: The High Cost of Inaction

Describing his work at the Pan American Health Organization (PAHO), Jon Andrus began by noting that the introduction of vaccines in the Americas (Latin America and the Caribbean) has eliminated or eradicated measles, neonatal tetanus, and polio, and significantly reduced the incidence of diphtheria and pertussis. There is still work to be done, however. In the Americas, two children die every hour of pneumococcal invasive disease. The 7-valent vaccine could prevent at least one of those deaths every hour, Andrus said.

The PAHO experience shows that new vaccines can be successfully introduced and sustained over time. The PAHO Revolving Fund was established 30 years ago to ensure a safe, effective, uninterrupted, sustained supply of vaccines. The Fund uses a bulk purchasing strategy, negotiating best price based on regional demand, rather than per country estimates. A credit system allows for delivery of vaccines in the absence of immediate funds. Countries reimburse PAHO within 60 days after vaccine is delivered, and pay 3% of the price of the vaccine as a service charge to be put into PAHO’s working capital.

This allows for stable and prompt supplies of safe, high-quality, affordable vaccines. For manufacturers, annual contracts facilitate the ability to plan for production, delivery, and financial investment, and the system offers reliable demand forecasts and transparent relationships. PAHO facilitates “one-stop shopping” for manufacturers, eliminating the need to negotiate individually with 36 member states.
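As a toy illustration of the Revolving Fund’s payment terms described above (all names and structure invented for this sketch, not PAHO’s actual accounting):

```python
# Toy model of the Revolving Fund terms described above.
SERVICE_CHARGE = 0.03    # 3% of the vaccine price, paid into working capital
REIMBURSEMENT_DAYS = 60  # countries reimburse within 60 days of delivery

def country_payment(vaccine_price: float) -> dict:
    """What a member state owes after delivery, under the terms above."""
    charge = vaccine_price * SERVICE_CHARGE
    return {
        "reimbursement": vaccine_price,    # due within REIMBURSEMENT_DAYS
        "working_capital_charge": charge,  # funds future bulk purchases
        "total_due": vaccine_price + charge,
    }

# e.g., a $1,000,000 vaccine order costs a country $1,030,000 in total
print(country_payment(1_000_000))
```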

In 2006, the Directing Council of PAHO passed a landmark resolution urging all member states to identify new revenue sources to sustainably finance the introduction of new vaccines against rotavirus, pneumococcus, influenza, and human papilloma virus (HPV). The Revolving Fund policy of single best market price, linked with the power of bulk purchasing, will be critical to accelerated and sustained uptake of these vaccines in participating Latin American countries, Andrus concluded.

If the introduction of pneumococcal and rotavirus vaccines is delayed, almost one million children could die from vaccine-preventable diseases over the next 30 years, and the world would not achieve MDG4, the Millennium Development Goal to reduce child mortality. Ensuring equitable and sustainable introduction of new vaccines into developing countries requires a strategic vision grounded in long-term goals, Andrus emphasized, not short-term fixes.

Moving Forward

In summary, to successfully protect populations from vaccine-preventable disease, initiatives should strive to achieve high immunization coverage rapidly across the age groups with the highest disease burden, and should monitor waning immunity. The impact of herd immunity (i.e., the indirect protection of unvaccinated individuals in a population where a sufficiently high percentage has been vaccinated) in controlling infectious diseases can be considerable. However, carriage reservoirs that can reintroduce the organism into the population should not be underestimated. Working as partners, the public and private sectors must develop creative funding approaches to ensure timely, equitable, and sustainable delivery of lifesaving vaccines to developing countries.
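The classic textbook threshold gives a sense of why rapid, high coverage matters; this is standard epidemiology background rather than a figure cited at the symposium:

```python
# Standard herd-immunity threshold (textbook epidemiology, not from the talk):
# with basic reproduction number R0 and vaccine effectiveness e against
# transmission, spread halts once a fraction (1 - 1/R0) / e is vaccinated.

def herd_immunity_threshold(r0: float, effectiveness: float = 1.0) -> float:
    """Fraction of the population that must be vaccinated to halt spread."""
    return (1 - 1 / r0) / effectiveness

# e.g., a pathogen with R0 = 5 and a 90%-effective vaccine:
print(f"{herd_immunity_threshold(5, 0.9):.0%}")  # ~89%
```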


Helping Physicians Better Understand Genomics

A DNA helix.

A new initiative will include an array of efforts, such as a series of ongoing symposia and online community-building tools.

Published June 18, 2009

By Adrienne J. Burke

Image courtesy of Anusorn via stock.adobe.com.

The Life Technologies Foundation has awarded The New York Academy of Sciences (the Academy) a two-year, $400,000 grant to help educate thousands of physicians and medical students in how to utilize new technologies and share cutting-edge research to improve healthcare and understand disease.

“There is a critical need to educate our physicians, both the current generation and the next, to use genomics to think about disease and treatments, and to apply the latest discoveries in this field to the understanding of human health,” said the Academy’s Scientific Director Stacie Grossman Bloom.

To do this, the Academy will build a live and virtual global community of physicians, medical students, and scientists focused on the pressing issues and challenges of using new technologies in medicine. Organizers hope this effort can serve as a model for improving physician education and health outcomes in urban centers throughout the nation.

The grant was among three that the Foundation made to organizations working to advance science. “We are proud to support these innovative organizations that advance scientific understanding,” said Greg Lucier, Chairman and CEO of Life Technologies. “The grantees we have chosen are working to demystify science to the public by providing training and access for using biology to make life even better.”

The Life Technologies Foundation is a non-profit arm of Life Technologies Corporation, a global biotechnology tools company dedicated to improving the human condition.


A Global Giver Lends Support from Japan

A shot of beautiful architecture and cherry blossoms in Japan.

With a successful medical career in obstetrics and gynecology, Kenichi Furuya also spends his time advancing science as a member of the Academy’s Darwin Society.

Published March 1, 2009

By Adelle C. Pelekanos

Image courtesy of ake1150 via stock.adobe.com.

At the core of The New York Academy of Sciences’ (the Academy’s) mission is a commitment to “creating a global community of science for the benefit of humanity.” It is a statement that deeply resonates with Academy members from 140 countries, including Darwin Society member Kenichi Furuya. For this Japanese researcher, Academy membership is one important way to bridge the distance between Tokyo, New York City, and other international hubs of science.

Furuya, a specialist in obstetrics and gynecology, holds both an MD and a PhD. He is a professor and Chairman of the Department of Obstetrics and Gynecology at Japan’s National Defense Medical College. In addition to his association with the Academy, Furuya is a fellow of the International College of Surgeons (headquartered in Chicago) as well as a number of Japanese medical societies. He was born in Tokyo in 1953, and still lives in a central area of the city, Bunkyo-ku.

A Proud Scientific Tradition

Furuya graduated from the School of Medicine at Japan’s Juntendo University in 1979. He recounts his alma mater’s history with pride: “Our medical school was founded as one of the oldest western-style private hospital/schools in Edo City (Tokyo), in 1838,” during a period of national isolation. Thirty years later, Japan’s Meiji Revolution opened the country’s doors to the West, Furuya explains. Juntendo’s third president, Susumu Sato, was the first Japanese student to study abroad officially, and since the late 19th century the school has encouraged international education and collaboration between researchers. Furuya is a product of this tradition, as evidenced by his active membership and generous support of the Academy.

In the almost 30 years since graduating from Juntendo, Furuya has worked in various areas within obstetrics and gynecology, including basic molecular research, reproductive immunology, clinical reproductive medicine (such as IVF-ET and laparoscopic surgery), and clinical pelvic surgery (such as for ovarian and uterine cancers).

In his current work, Furuya focuses on two areas of gynecological research. First, he is studying the mechanisms by which the fetal period of pregnancy (week 10 through birth) affects the development of metabolic disorders in children. In particular, Furuya is interested in diabetes, obesity, and hypertension as epigenetic outcomes of this period in pregnancies complicated by placental malfunctions such as gestational diabetes mellitus, nutritional deficiency, and pregnancy-induced hypertension.

Second, Furuya is working to clarify the basic mechanism of the relationship between ovarian endometriosis (EM) and ovarian cancer. Epidemiologic findings indicate a strong positive correlation between ovarian EM and ovarian clear cell carcinoma, a cancer characterized as “refractory,” or resistant to chemotherapy, he explains.

It Runs in the Family

Kenichi Furuya

Furuya’s family, past and present, shares the doctor’s dedication to medicine. Furuya’s wife is an anesthesiologist, his son is an obstetrics-gynecology resident, and his daughter is in dental school. His late father, Hiroshi Furuya, was a gynecologist and emeritus president of the Society of Tokyo Maternal Health. In the 1970s, the elder Furuya was a visiting professor at Germany’s Hamburg University, as well as Columbia University. Furuya not only inherited his father’s vocation, but also his passion for participation in the global science community. It was his father’s status as an Academy member during his time at Columbia that inspired Furuya to become a member 20 years later.

Support in an Important Time

Furuya’s proud support of the Academy conveys his passionate support for scientific collaborations across the globe, and in particular, between the US and Japan. With the new presidential administration, Furuya believes that American society may be undergoing its “fourth revolution”; he identifies the first as the American Revolution, the second as the Civil War, and the third as the end of World War II. “I have been impressed indeed that [the US is changing its] basic social, political, and historical foundations,” Furuya explains. He likens this period in American history to his own country’s Meiji Revolution, the time that ushered in new world views and sparked international dialogue between Japan and the world. Furuya’s long-distance membership is his vote of confidence in the current and future relationship between the US and Japan.

Although Furuya has traveled to New York a number of times, he has not been to the new Academy headquarters at 7 World Trade Center. He plans to visit in the near future, and to continue his support of the Academy. “It is my great honor to support the activities of The New York Academy of Sciences given its long history and many pure science traditions,” Furuya says.

Also read: Changing the Face of Molecular Medicine


About the Author

Adelle C. Pelekanos is a freelance science writer in New York City.

A New Approach to the Hippocratic Oath

A variety of different pills and other medicine in over-the-counter packaging.

For more than 40 years, public- and private-sector biochemical pharmacology experts have been sharing knowledge at Academy meetings.

Published September 1, 2007

By Jill Pope

Image courtesy of Artinun via stock.adobe.com.

It’s a rare occasion when scientists from competing pharmaceutical companies and academic laboratories come together to share their latest findings on human diseases and treatments. But since 1964, The New York Academy of Sciences (the Academy) has played host to a regular meeting of biochemists, molecular biologists, and biomedical researchers who do just that.

The members of the Biochemical Pharmacology Discussion Group (BPDG) hail from more than a dozen pharma and biotech companies, as well as top research universities and medical centers. Pfizer and Bristol-Myers Squibb provide major funding. The American Chemical Society, AstraZeneca, Boehringer Ingelheim, and Novartis also sponsor the group.

As the oldest of the Academy’s 14 discussion groups, the BPDG convenes eight times a year for half- and full-day symposia where experts address topics designated by the group’s steering committee. More than 70 scientists attend each meeting.

Good Career Move

Academy Fellow Martha Matteo began attending the group’s meetings nearly 25 years ago when she was a scientist for Boehringer Ingelheim. New to the pharma industry in 1983, she recalls she was pursuing a theory about how anti-inflammatory steroids affect protease levels. It ran counter to the conventional wisdom that “leukotrienes and prostaglandins modulate everything.”

She organized a BPDG meeting where other scientists presented evidence that steroids induce protease inhibitors. “The story was just unfolding and I got right in the thick of it,” she says. “I had the opportunity to test ideas with a broad range of industry and academic scientists, separate from long-held beliefs and prejudices.” Matteo, who eventually became director of knowledge management and R&D planning at Boehringer Ingelheim before retiring last year, adds, “The Academy has long provided neutral territory and instant feedback in the exploration of new ideas.”

For industry and academic researchers who work mostly in isolation from one another, BPDG events are opportunities to connect, says Charles Lunn, a research fellow at Schering-Plough Research Institute and the group’s current program coordinator. While attendees don’t disclose proprietary information, people do share their work. That’s essential because in industry, Lunn says, “Much high-quality science is accomplished that is never communicated to the academic community.”

From Theory to Therapy

An Alzheimer’s seminar that drew more than 100 participants last December is another example of a BPDG forum for cutting-edge research. Alzheimer’s researchers have focused on two main culprits in their search for the cause of this devastating disease: amyloid-β peptide (A-beta), which forms plaques in the brain, and tau, a rogue protein that forms tangles. A-beta is produced when a large protein is cut by two enzymes. Several leading experts on one of those enzymes, γ-secretase, shared their insights into how it might be targeted by Alzheimer’s therapies. Others discussed the role of tau: some showed how amyloid pathology may trigger changes in tau, and others examined how tau abnormalities lead to cell death.

Speakers included Mark Shearman, senior director of neuroscience drug discovery research at Merck in Boston; Thomas Lanz, a scientist in central nervous system biology at Pfizer Global Research & Development; Michael Wolfe, who in 2006 established the Laboratory for Experimental Alzheimer Drugs at Harvard Medical School; and David Holtzman, head of the Department of Neurology and associate director of the Alzheimer’s Disease Research Center at Washington University in St. Louis.

Matteo, who chaired the group from 1989 to 1994, says it’s not unusual to see theories presented at BPDG meetings turn into therapies years later. Around 1990, she remembers, the group held a meeting to discuss a potential approach to cardiovascular disease called angiotensin II receptor blockers. Today, ARBs such as losartan and valsartan are standard therapy for hypertension.

Setting the Agenda, Seeking Diversity

It’s easy to imagine how the BPDG will continue to benefit young scientists’ careers the way it did Martha Matteo’s.

Recent seminars have included another on Alzheimer’s research, “Immunotherapy for Neurodegenerative Diseases,” in which experts discussed ways to train the body’s immune response to attack the wayward proteins that plague patients with Alzheimer’s and other diseases of the brain and spinal cord.

In May 2007, the group hosted “The Future of Monoclonal Antibody Biotherapeutics.” Monoclonal antibodies are cloned proteins that modulate the activity of specific disease targets. In cancer treatment, they zero in on tumor promoters, leaving healthy tissue alone. The therapies are already benefiting patients, but they have limitations, including high production costs. Speakers discussed new approaches, such as optimizing cell culture processes, that promise to spur the therapies forward and make them more widely available. Also this past year, speakers at BPDG’s “Novel Strategies for Compound Identification from Compound Libraries: High-Throughput Screening” presented diverse approaches to drug screening, such as Biotrove’s RapidFire mass spectrometry, virtual screening, and the University of New Mexico’s high-throughput flow cytometry platform.

Diabetes to Stem Cells

The 2007-2008 meeting schedule will cover progress in treating diabetes and eating disorders, psychiatric illness, and atherosclerosis, as well as tools for drug discovery including adult stem cells and molecular imaging. Setting the year’s agenda is a labor-intensive process, requiring committee chairs to tally the votes of hundreds of discussion group members. But the result is worthwhile, says Ross Tracey, an associate research fellow at Pfizer who led the group from 2002 to 2006: “The programs that emerge have clearly passed the popularity and interest test.”

To ensure the continued relevance of BPDG meetings, Jose Perez, a senior principal scientist at Pfizer and a committee co-chair, is on a mission to recruit new members to the group. In the coming year, he’ll reach out to scientists at underrepresented pharma and biotech firms, as well as at New York City’s universities. “That’s the only way the organization is going to have a broad perspective,” he says. “We really strive for diversity of thought.”

Also read: Equivalence of Complex Drug Products: Scientific and Regulatory Challenges


About the Author

Jill Pope writes about science and policy issues. She served as Senior Editor for The Cutting Edge: An Encyclopedia of Advanced Technologies (Oxford University Press, 2000).

How Can Science Help in the Fight Against Poverty?

A straw hut.

A global scientific publishing initiative follows the philosophy of the Millennium Development Goals by tackling poverty from all angles.

Published September 1, 2007

By Leslie Taylor

For the last decade, a technological marvel has been saving lives in sub-Saharan Africa. It has no bells and whistles, no microprocessors or moving parts. It is a simple piece of insecticide-treated netting.

Bed nets made from this material remain effective deterrents against mosquitoes for three to five years. Donors, governments, and community leaders have embraced the low-tech tool as a valuable public health intervention and frequently hand out nets during immunization campaigns and at antenatal clinics. About $5 buys a net that will shield two children from mosquitoes as they sleep—an incredibly effective means of preventing malaria, a disease that kills more than 1 million people a year.

The nets are a great example of what can be achieved when the scientific and development communities work together to identify needs and implement new ideas, says John McArthur, who was deputy director of the United Nations Millennium Project and is now associate director of the Center for Globalization and Sustainable Development at Columbia University’s Earth Institute. To put life-saving technology in the hands of the people it is designed to benefit requires the cooperative efforts of scientists, policy makers, and the communities they hope to serve, he says.

A Different Publish-Perish Paradigm

That philosophy of partnership underpins the Millennium Development Goals, which aim to achieve target levels of worldwide nutrition, health, literacy, and environmental sustainability that were set at the Millennium Summit in September 2000. It is also at the heart of a new program called Scientists Without Borders that was co-conceived by The New York Academy of Sciences (the Academy) and the U.N. Millennium Project. And now a massive cooperative effort in the interest of global development is taking place among scientific publishers.

This year, halfway to the 2015 deadline that world leaders set for achieving the Millennium Development Goals, 230 science journals worldwide will simultaneously publish papers or special editions on the topic of poverty and human development. Publications participating in the Council of Science Editors initiative include wide-circulation journals such as Science and Nature and more specialized volumes such as the African Journal of Drug and Alcohol Studies, the Chinese Journal of Evidence-Based Medicine, and the Wisconsin Medical Journal.

The Annals of the New York Academy of Sciences will publish a volume titled Reducing the Impact of Poverty on Health and Human Development: Scientific Approaches.

A Multidisciplinary Approach

The Annals volume, edited by Stephen Kaler and Owen Rennert of the National Institute of Child Health and Human Development, takes a multidisciplinary look at the issues facing the world’s poor. Chapters address public health issues in the developing world as well as specific diseases associated with poverty, such as tuberculosis, malaria, HIV/AIDS, lymphatic filariasis, and hookworm. Other chapters discuss the poor’s access to health care services, education, proper nutrition, and housing.

The volume will highlight diverse areas of research. It will include a paper on measles by Samuel L. Katz, chairman emeritus of pediatrics at Duke University, who was awarded the 2007 Pollin Prize in recognition of his contributions to pediatric infectious disease research and vaccine development; a paper titled “Sustainable Transfer of Biotechnology to Developing Countries,” by Eva Harris, who used the money from her 1997 MacArthur “Genius” Fellowship to establish the Sustainable Sciences Institute, an organization that helps scientists around the world gain access to state-of-the-art training and equipment; and a paper by Nobel Laureate James J. Heckman, professor of economics at The University of Chicago, about the consequences of poverty for human skill formation.

Poverty Is a Many-Stranded Problem

Bashir Jama, author of “Agriculture in Developing Nations,” a paper in the upcoming Annals volume, spent 19 years with the International Centre for Research in Agroforestry before becoming a policy advisor to a U.N. Development Program group working on poverty and the Millennium Goals. He says it’s very difficult to tease apart the problems of poverty and address any single factor in isolation. Agriculture is inextricably linked to health, he says.

For instance, malaria and other tropical diseases can impede worker productivity in farming communities, resulting in reduced crop yields, followed by hunger and increased vulnerability to disease.

And illiteracy can be an obstacle to heartier harvests. Training in new farming techniques or agricultural technologies can’t be distributed in writing to farmers who can’t read, he notes. Instead, non-governmental organizations and governments must offer in-person training or demonstration farms.

“As scientists we have fairly good knowledge of the ecology and the technical issues that are slowing down progress or that can enhance production,” says Jama. “But giving people the skills they need when they live in remote areas—in areas with limited energy supplies, no electricity or clean water—is challenging.”

Within select communities known as Millennium Villages, networks of scientists with diverse areas of expertise work with residents to address the intertwining issues of agricultural productivity, health, education, and access to markets. Projects to increase food yields and improve access to education and health services coincide with initiatives to improve village infrastructure—roads, sanitation, communication technology, and energy. Villagers are also given advice on enterprise diversification and environmental management.

Leverage Existing Technologies

Residents of the 12 Millennium Villages in 10 African countries have seen tremendous improvements in quality of life since the project started, Jama says. “In one or two growing seasons we’ve seen incredible increases in agricultural productivity, phenomenal decreases in hunger, improved health with a reduction in malaria and waterborne diseases, and safe drinking water becoming available,” he says.

Successes at the Millennium Village sites were not the result of exclusive breakthrough technologies, but came about because experts in a variety of fields took action to supply villagers with a range of basic technologies, such as fertilizer, medication, and water purification systems. “We have the basic know-how,” says John McArthur. “The question in the immediate term is how to mobilize existing technologies.”

Frequently, technologies created for another purpose or discovered in the course of pure research can be greatly beneficial. “It’s a matter of adapting good technologies that may exist in other countries,” says Bashir Jama.

Seemingly uncomplicated technology can have a dramatic impact. For example, the treadle pump—an inexpensive, simple-to-operate, foot-powered pump that can draw water from a well or spring—has revolutionized farmers’ ability to grow food during the dry season. “It’s a good example of a situation where, if the investment is there, it could really increase irrigation, and improve income and nutrition,” says Jama.

Energy and Resource Use

Improved cook stove technologies have also done much to improve the lives of the poor, according to Daniel Kammen, a professor in the Energy and Resources Group at the University of California, Berkeley, who contributed a paper titled “Energy & Resource Use in Developing Countries” to the new Annals volume. Respiratory illnesses are one of the biggest health problems in the developing world, where most people typically cook using very simple fires—burning wood or dung on just a few stones. “Making stoves more efficient has actually cut down on one of the leading causes of illnesses worldwide,” he says.

Kammen, who is also founding director of the Renewable and Appropriate Energy Laboratory, an organization that focuses on designing, testing, and disseminating renewable and appropriate energy systems, has seen how the timely application of technology can transform communities. His group works on projects such as promoting sustainable biomass energy management in Zimbabwe, evaluating the performance of single-junction amorphous silicon modules used in photovoltaic systems in Kenya, and creating new technologies such as the UV-Tube—an inexpensive and easy-to-use household water disinfection device that uses ultraviolet light to inactivate pathogens.

While each country has slightly different needs, Kammen explains, in most parts of the developing world the basic issues are the same. “There’s a lack of access to clean water, a lack of electricity to do things like read at night or run a business, and a lack of access to education,” he says. “There are some constants, and those mean you can work pretty hard on a project in one country and it’s likely to be useful to people in many other parts of the world. It’s not like a solution you develop in Mozambique is only useful there.”

Create New Technologies

For problems of the poor that do not yet have technological solutions, scientists have found new ways to obtain funding to do the research they hope will ultimately alleviate suffering.

Peter Hotez, editor-in-chief of a soon-to-launch Public Library of Science journal called Neglected Tropical Diseases, wrote a paper about hookworm for the Annals volume. He is president of the Sabin Vaccine Institute, a nonprofit organization that works to provide the world’s poorest people with access to low-cost, safe vaccines and drug treatments for neglected tropical diseases—13 parasitic and bacterial infections that produce chronic and disabling conditions. Many people have not heard of the diseases—including ascariasis, hookworm infection, trichuriasis, lymphatic filariasis, onchocerciasis, schistosomiasis, and trachoma—but they are devastating.

“Neglected tropical diseases are one of the primary reasons why poor people remain poor. In some ways [what they do to a person] is worse than death,” says Hotez. “They destroy quality of life and are one of the major reasons we have poor economic development in Africa and elsewhere. These are the diseases that are keeping people mired in this horrible cycle of destitution and despair.”

Yet, until recently, little attention was paid to these scourges. While the private sector has been willing to invest money in research that might lead to an AIDS vaccine, for which there is still a substantial market in the U.S. and Europe, “There’s no way you could ever make a profit on a hookworm vaccine,” says Hotez.

Vaccines and Medication

Thankfully, the Human Hookworm Vaccine Initiative, a product development partnership sponsored by the Sabin Vaccine Institute with major funding from the Bill & Melinda Gates Foundation, is working to develop and disseminate an effective, safe, and low-cost vaccine. “It’s a unique model for making a product for people who can’t afford to pay for it,” Hotez says.

While the vaccine is not yet ready to be distributed, the Global Network for Tropical Disease Control, a program of the Sabin Institute, distributes a “rapid impact” package of medication that includes four anti-parasitic drugs to treat seven neglected diseases. The health kit, which costs only 50 cents per person per year, greatly reduces rates of morbidity, blindness, and skin disease. Yet it is only a short-term solution because diseases such as hookworm have high rates of transmission and re-infection, Hotez explains.

“Millennium Development Goal number six is ‘to control and fight HIV/AIDS, malaria, and other diseases.’ We feel we can make an impact right now in the ‘other diseases’ category,” he says.

Questions of Investment: Time and Money

While sufficient will and technologies are available to raise the standard of living in the developing world, funding is a primary barrier to success. Too little money is devoted to the cause, and there is no consensus about how the money that is devoted should be spent, experts say.

“A rule of thumb, which varies a little by country and by need, is that it takes a basic investment of about $110 per person per year to achieve the goals outlined in the Millennium Development Project,” says John McArthur. “Right now there is, on average, $25 per person in foreign aid going into these places. That needs to be scaled up two- or three-fold by 2015. There’s not enough money getting to where it needs to go, and a greater share needs to go to practical technologies, like long-lasting insecticide-treated bed nets, fertilizer, or drilling bore wells.”

The Need for Collaboration

Bashir Jama worries that, too frequently, what scientists have discovered about issues of development is not being incorporated into national, regional, and global programs. “Decisions are made in a vacuum as though science doesn’t exist,” he says. “Donors, international governments, the policy makers need to take advantage of this knowledge and to link up better with scientists in designing systems that work.”

At the same time, it is important for scientists to make the effort to collaborate with policy makers and with one another in the fight against poverty, suggests Hotez, sharing this quote from Dr. Albert Sabin, the developer of the oral polio vaccine, after whom the Sabin Institute is named: “A scientist who is also a human being cannot rest while knowledge which might reduce suffering rests on the shelf.”

Also read: Scientists Step into New Roles to End Poverty


About the Author

Leslie Taylor is associate editor of Update and of the Academy’s online public gateway, Science & the City.

Organic Morality: Our Intuitive Inheritance

A graphic in the shape of a human brain.

In a new book, the Harvard evolutionary psychologist argues that all humans share an innate sense of right and wrong.

Published March 1, 2007

By Laura Buchholz

You are in control of a switch at a railroad station. An empty out-of-control train is racing toward five people walking on the tracks. It will hit and kill them unless you pull a lever to switch the train to another track—but there it will kill one person standing on the track. Do you pull the lever? Why? Or why not?

You are an emergency room doctor. Five of your patients urgently need organ transplants in order to live. In the waiting room is a healthy young man with all of the organs necessary to save these five people. Would you sacrifice the life of the man to save your five patients? Why? Or why not?

If you answered “yes” in one case and “no” in the other, what is the difference between the two cases?

Marc D. Hauser, professor of psychology, organismic and evolutionary biology, and biological anthropology at Harvard University, explores how we answer questions like these in his new book, Moral Minds: How Nature Designed Our Universal Sense of Right and Wrong. On January 11, 2007, as part of The New York Academy of Sciences’ (the Academy’s) Readers & Writers series, Hauser explained that moral decision-making may not flow entirely from experience and education, but instead may have a significant biological aspect that has been shaped, like all human traits, by the forces of evolution.

Instinctive Morality

“We are endowed with a moral faculty evolved to generate intuitive judgments of right and wrong,” says Hauser, adding that the principles underlying those intuitive judgments are unconscious and therefore may be immune to cultural influence. In other words, Hauser suggests that the influence of Sunday school may pale in comparison to the effect of thousands of years of genetic programming.

Hauser, who directs the Cognitive Evolution Laboratory at Harvard, collects some of his evidence from the Moral Sense Test—a Web site his lab developed that presents visitors with “artificial dilemmas” designed to test their moral instincts. Working with a data set of responses from 250,000 subjects in 120 countries, ranging in age from 13 to 70 and representing a wide range of religions, Hauser’s lab finds some patterns emerging.

Hauser identifies three principles of automatic moral reasoning that transcend religion, geography, age, and culture. The first is the Intention Principle: most people judge harm intended as a means to an end to be morally worse than an equivalent harm that is foreseen but occurs as a side effect. When Joe intentionally hits John, we tend to hold him more responsible than when Joe strikes an object with the foreseen consequence that the object will fall and hit John. According to Hauser, the Intention Principle operates at an unconscious level: when people judge based on this principle, they are not able to say why they made the judgment.

The second principle Hauser calls the Action Principle, and it states that harm caused by action is worse than exactly the same harm caused by omission.

Consider two scenarios:

#1: A man intends to kill his young nephew, who stands to inherit all the family wealth. The uncle goes up to the bathroom where the boy is taking a bath, and drowns the boy in the tub.

#2: A man intends to kill his young nephew, who stands to inherit all the family wealth. The uncle goes up to the bathroom where the boy is taking a bath, and finds the nephew drowning face-down in the tub. The man does not intervene, and lets the boy drown.

The effect is the same, but would a jury find the uncle guilty of murder in the second scenario? Probably not. This principle is available at a conscious level, says Hauser, and may explain why societies generally find active euthanasia more morally troubling than passive euthanasia.

Third is the Contact Principle, which states that harm caused by contact is morally worse than equivalent harm caused by non-contact (e.g., when we hit someone vs. seeing an object fly across a room and hit somebody—or the difference between the two introductory scenarios). This third principle is partially available to human consciousness—about half and half, says Hauser.

Remarkably, Hauser notes that subjects who described themselves as highly religious delivered the same judgments as those who said they were not at all religious. These observations suggest that the system that unconsciously generates moral judgments is immune to religious doctrine. But what does this have to do with biology? Hauser draws a parallel between what he calls our “universal moral grammar” and Noam Chomsky’s linguistic theory of universal grammar.

Judgment and Emotions

In Chomsky’s concept, a child knows, in an unconscious sense, the set of principles for all the world’s languages, and the environment feeds her the sound patterns of the native language. Hauser contends that morality is similarly innate. But what are the neural underpinnings of moral judgment? Is there a dissociation between how we judge and how we act? And how did this system evolve?

Hauser points out that people with brain damage in the ventromedial prefrontal cortex (vmPFC) have some problems with moral judgments, suggesting that this area may play a part in our evolved moral machinery. People with damage in this area, says Hauser, tend to judge in a more utilitarian manner when faced with personal moral dilemmas involving conflict between aversive actions (hitting someone) and positive gains (saving the lives of many). When faced with less personal or nonmoral dilemmas, their judgments are similar to those of people in control groups.

This suggests that people with damage to the vmPFC have largely preserved capacities to judge in both non-moral and moral situations, but for a selective class of moral dilemmas, they are strict utilitarians. As this region of prefrontal cortex is known to be involved in mediating the relationship between emotional processing and decision making, it seems possible that morality may have evolved in tandem with the emotions, perhaps a fortuitous advance for those who would reap the protective benefits of life in a group.

We Can’t Help It

“Understanding the biology of moral judgment will not dictate what we ought to do,” concedes Hauser, pointing to a split between a description of our judgments and a prescription of how we should act or how we actually act. (Go ahead, have another cookie, says a small invisible voice. And your hand reaches out …) But what it can do is to help societies craft policies that do not violate this universal, intuitive code. “If a law is not sensitive to our intuitive psychology,” says Hauser, “it will never go anywhere.”

How different societies deal with euthanasia illustrates how our intuitive principles interact—and sometimes conflict—with policy. In the case of euthanasia, most medical boards agree that it is better simply to withhold treatment than to be an active participant in the death of a patient. However, says Hauser, Belgium and the Netherlands no longer support a distinction between active and passive euthanasia. Nevertheless, there still exists in those countries a bias towards passive rather than active euthanasia.

In this case, says Hauser, “the law does not penetrate intuitive psychology, even though permission is explicit in the culture.” Hauser is hopeful that his findings will do more than help us craft better laws. “Appreciating the fact that we share a universal moral grammar, and that at birth we could have acquired any of the world’s moral systems, should provide us with a sense of comfort, a sense that perhaps we can understand each other. Deep in our past we might find some hints to our moral state and perhaps to our future.”

About Marc D. Hauser

Marc D. Hauser is professor of psychology, organismic and evolutionary biology, and biological anthropology at Harvard University, and is co-director of Harvard’s Mind, Brain, and Behavior program. His previous books include The Evolution of Communication (MIT); Wild Minds: What Animals Really Think (Henry Holt); and The Design of Animal Communication (with Mark Konishi) (MIT). His new book, Moral Minds: How Nature Designed Our Universal Sense of Right and Wrong, is published by HarperCollins.

Also read: National Security, Neuroscience and Bioethics

The Role of Nucleic Acids in Plasma, Serum

A colorful illustration of a DNA strand.

Nucleic acids circulating in plasma and serum can be screened for a variety of conditions. Testing fetal DNA found in maternal plasma may become a noninvasive diagnostic approach.

Published February 21, 2007

By Jill Pope

Most ninth-grade biology students can tell us that DNA and RNA are found within cells. But in both healthy and sick people, these nucleic acids can also be found circulating freely in plasma (the fluid in which blood cells are suspended) and serum. Scientists don’t yet understand exactly how and why nucleic acids are released into circulation, but these nucleic acids are proving to be useful as diagnostic tools in prenatal and cancer care.

Today, researchers are working toward noninvasive prenatal diagnosis of several disorders by analyzing fetal DNA in maternal blood. DNA markers can also aid in the diagnosis of cancer or tell doctors whether a person is responding to chemotherapy. Analysis of circulating RNA may also yield tumor markers and ways to detect fetal abnormalities and pregnancy complications.

No Conclusive Proof

At the same time, basic questions remain. Are DNA and RNA deliberately released into body fluids, or are they a byproduct of some other process? How do they enter the circulation? Peter Gahan of the University of London, along with Maurice Stroun and Philippe Anker, two of the field’s pioneers, has shown that there is a spontaneous release of both DNA and RNA from living cells, including tumor cells. This does not preclude other sources for nucleic acids in plasma and serum, however, such as apoptosis (programmed cell death).

“There are theories, and some evidence, but still no conclusive proof” as to the role circulating nucleic acids play in the body, says Ramasamyiyer Swaminathan, who served as editor, along with Peter Gahan and Asif Butt, of Annals of the New York Academy of Sciences Volume 1075, Circulating Nucleic Acids in Plasma and Serum IV. He organized and hosted the most recent conference on the subject, held at King’s College, University of London, in September 2005. More than 200 experts in the field attended, and this volume provides a record of the meeting.

Much of the research is geared toward developing better diagnostic tools. “I think that for things like lung cancer, where early detection is important, and conventional methods are unable to detect it, this will be very useful,” Swaminathan says. He also believes research will soon translate into maternal blood tests to diagnose prenatal disorders.

What Can Fetal DNA Tell Us?

Since the 1990s, scientists have been able to detect fetal DNA in the bloodstream of pregnant women. Circulating fetal DNA can be used diagnostically in two ways. Its quantity can be measured to aid in the detection of preeclampsia (pregnancy-related high blood pressure), risk of early delivery, and Down syndrome. Scientists can also examine the DNA qualitatively to look for certain genetic factors, such as those indicating blood disorders like β-thalassemia (severe anemia) or Rh disease.

Before doctors can tell if a pregnant woman has levels of fetal DNA in her bloodstream that are cause for concern, scientists need to establish a baseline for normal levels of fetal DNA in maternal blood. To do that, Diana Bianchi and her colleagues at Tufts-New England Medical Center investigated whether factors such as maternal age, weight, smoking, ethnic background, and type of conception affected circulating fetal DNA levels in normal pregnancies. They found that maternal weight in the second trimester was the only relevant factor—and that fetal DNA levels were lower in mothers who were heavier, which may have to do with the larger volume of body fluids in the heavier women.

Increased levels of fetal DNA in the mother’s bloodstream can be used to monitor pregnancy complications and may, in the not too distant future, help predict them. Bianchi’s group has found that among women at risk for delivering early, those with high concentrations of fetal DNA in their blood were significantly more likely to deliver before 30 weeks than those with lower levels.

The Role of Preeclampsia

Dennis Lo and colleagues at Prince of Wales Hospital in Hong Kong have found that preeclampsia is associated with a five-fold increase in fetal DNA levels. Both Lo’s group and Bianchi’s group have found that it is possible to detect trisomy 21, the chromosomal triplication that causes Down syndrome, by measuring levels of fetal nucleic acids in maternal plasma.

Adding to the progress in Down syndrome diagnostics, Vincenzo Cirigliano and colleagues at the General Lab in Barcelona, Spain, reported that an alternative to karyotyping called quantitative fluorescent PCR could decrease the time needed to confirm the presence of an extra chromosome 21 in fetal DNA from two to three weeks to one or two days. Cirigliano’s group analyzed some 30,000 amniotic fluid samples, and found that the rapid technique was highly accurate in detecting major fetal abnormalities.

The ability to analyze fetal DNA within a maternal blood sample has already led to changes in clinical practice. Dennis Lo and his colleagues demonstrated in the late 1990s that a test of fetal DNA in maternal serum could reliably indicate whether the fetus has Rh-negative or Rh-positive blood. Mothers who are Rh negative need to find out their baby’s Rh status, because the baby may be at risk of developing Rh disease, in which the mother’s immune system attacks the baby’s blood cells. In parts of Europe, noninvasive maternal blood tests for fetal Rh status are now part of standard prenatal care.

Separation Anxiety

About 10 years ago, the discovery of fetal DNA in maternal plasma had many researchers excited about the potential to screen the DNA for genetic diseases and disorders without invasive procedures such as amniocentesis. Since that time, the problem has been how to distinguish fetal DNA from the maternal DNA around it. Until recently, the only reliable way to know the DNA belonged to the fetus was to detect a Y chromosome. Because females have two X chromosomes, if a Y chromosome were present, it would have to be from a male baby. (At-home baby gender tests that look for the Y chromosome are now on the market, but the tests are controversial.)

The picture is changing now, as researchers have reported two different ways to distinguish the baby’s DNA from the mother’s. One promising marker of circulating fetal DNA is its size. Sinuhe Hahn and colleagues at the University Women’s Hospital in Basel, Switzerland, have found that circulating fetal DNA molecules are measurably smaller than circulating maternal DNA molecules. Using gel electrophoresis, they observed that about 70% of cell-free fetal DNA was less than 300 base pairs in length, while about 75% of cell-free maternal DNA was more than 300 base pairs. They were able to separate out the fetal DNA by selecting for and enriching the smaller DNA molecules.
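
To make the enrichment arithmetic concrete, here is a minimal sketch in Python. The fragment-length distributions and the 10% starting fetal fraction are illustrative assumptions loosely modeled on the proportions Hahn's group reported (roughly 70% of fetal fragments under 300 base pairs, roughly 75% of maternal fragments over); none of it is data from the study.

import random

random.seed(0)

def simulate_fragments(n, frac_short, short_range=(120, 299), long_range=(300, 600)):
    """Draw n fragment lengths in base pairs; frac_short of them fall below
    300 bp. The uniform ranges are illustrative, not measured distributions."""
    lengths = []
    for _ in range(n):
        lo, hi = short_range if random.random() < frac_short else long_range
        lengths.append(random.randint(lo, hi))
    return lengths

# Assumed starting mixture: 10% fetal, 90% maternal (illustrative).
fetal = simulate_fragments(1000, frac_short=0.70)     # ~70% of fetal DNA < 300 bp
maternal = simulate_fragments(9000, frac_short=0.25)  # ~75% of maternal DNA >= 300 bp

def fetal_fraction(fetal, maternal, cutoff=None):
    """Fetal share of fragments, optionally keeping only those below cutoff."""
    def keep(xs):
        return [x for x in xs if cutoff is None or x < cutoff]
    f, m = len(keep(fetal)), len(keep(maternal))
    return f / (f + m)

print(f"before size selection: {fetal_fraction(fetal, maternal):.1%}")
print(f"after selecting < 300 bp: {fetal_fraction(fetal, maternal, cutoff=300):.1%}")

On these toy numbers, keeping only the sub-300-bp fragments roughly doubles the fetal share of the sample, which is the intuition behind the enrichment step.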

The Role of Methylation

Another technique to identify circulating fetal DNA takes advantage of the difference in the methylation state of maternal and fetal DNA. Methylation is an epigenetic factor, meaning that it influences the expression of genes without changing the actual DNA sequence. The process, which plays a major role in gene silencing, occurs when a cytosine base is modified by the addition of a methyl group. Sites called gene promoter regions can be undermethylated (hypomethylated), which may increase transcription levels, or overmethylated (hypermethylated), which may prevent gene transcription.

Lo’s group looked at the methylation state of placental cell DNA and compared it with the methylation state of DNA in maternal blood cells. They discovered that the maspin gene, a well-known tumor suppressor gene, is hypomethylated in the placenta and hypermethylated in the maternal blood cells. They then detected hypomethylated maspin sequences circulating in the plasma of pregnant women and observed that these sequences were rapidly cleared from the plasma after delivery, indicating that they were fetal DNA. Though the source of fetal DNA in maternal plasma has not been established, many researchers believe it comes from the placenta. Researchers expect that maspin could be the first of many fetal epigenetic markers.
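
As a back-of-envelope illustration of that contrast, the Python sketch below computes a promoter's methylated fraction from per-CpG-site read counts (the kind of counts bisulfite sequencing yields, a method the article does not describe) and classifies it with hypothetical 20%/80% cutoffs. The counts and thresholds are invented for the example, not taken from Lo's study.

from typing import Dict, Tuple

def methylation_level(site_counts: Dict[str, Tuple[int, int]]) -> float:
    """Overall methylated fraction across CpG sites; site_counts maps a
    site ID to (methylated_reads, unmethylated_reads)."""
    meth = sum(m for m, _ in site_counts.values())
    total = sum(m + u for m, u in site_counts.values())
    return meth / total

def classify(level, hypo=0.2, hyper=0.8):
    """Hypothetical cutoffs: below 20% is hypomethylated, above 80% hypermethylated."""
    if level < hypo:
        return "hypomethylated"
    if level > hyper:
        return "hypermethylated"
    return "intermediate"

# Invented read counts for one promoter region in two tissue sources.
placenta = {"cpg1": (2, 40), "cpg2": (5, 38), "cpg3": (1, 44)}
maternal_blood = {"cpg1": (39, 3), "cpg2": (41, 2), "cpg3": (37, 5)}

for name, counts in [("placenta", placenta), ("maternal blood", maternal_blood)]:
    level = methylation_level(counts)
    print(f"{name}: {level:.0%} methylated -> {classify(level)}")

With counts in this vicinity, the placental promoter reads as hypomethylated against a hypermethylated maternal background, the same kind of contrast Lo's group exploited for maspin.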

Improving Cancer Diagnosis

Analysis of circulating nucleic acids is also proving fruitful in cancer care. Investigators are analyzing nucleic acids to help detect cancers early, reduce the need for invasive biopsies, and identify people who are likely to respond to treatment.

Many researchers have focused on lung cancer, the leading cause of cancer death worldwide. Most lung cancers are not found until they are in advanced stages, in part because current measures—chest X-rays and cytological sputum tests that look for abnormal cells under a microscope—are not useful for early detection. Research shows that analyzing circulating DNA for methylation of tumor suppressor genes and for genetic instability of microsatellites can improve the diagnosis of lung cancer.

Yi-Ching Wang of National Taiwan Normal University in Taipei and coworkers recently tested a panel of biomarkers for this purpose. They analyzed DNA markers in sputum samples from cancer patients and healthy individuals and compared them with those markers in tumor or normal lung tissue samples from the same people to see whether DNA from sputum pointed to the presence of cancer. Their work yielded seven useful diagnostic markers, including methylation of the tumor suppressor genes p16INK4a and RARβ. The authors suggest that testing for these markers could improve current diagnostic methods, and that markers of DNA methylation could become powerful diagnostic tools.

Predicting Response to Chemotherapy

Doctors who treat lung cancer have more chemotherapy options today than they did 10 years ago. They can try another option if they can determine early on that a drug or drug combination is ineffective, saving the patients precious time and sparing them from unnecessary side effects. The imaging techniques used to assess tumor mass are often not sensitive enough to detect changes until after several rounds of chemotherapy. Stefan Holdenrieder and colleagues at the University of Munich set out to discover whether blood markers could detect the tumor’s response much earlier.

To date, CYFRA 21-1, a serum protein marker, has been the strongest indicator of prognosis in non-small cell lung cancer. Holdenrieder and his group have shown that measuring levels of circulating nucleosomal DNA (the basic unit of packaged DNA, usually found in the nucleus of cells but also found in cell-free form) along with CYFRA 21-1 can identify patients who will respond to the first round of chemotherapy. In their most recent work, they asked whether the same two markers could be used to predict response even earlier—during the first round of treatment.

In a study of more than 300 people with advanced lung cancer, the researchers measured the levels of a number of biomarkers and of nucleosomal DNA to distinguish those patients whose tumors were in remission from those whose tumors were progressing. Higher concentrations of nucleosomal DNA and CYFRA 21-1 identified a subgroup of patients who were unlikely to respond to chemotherapy, and identified them early—nucleosomal DNA was measured on the eighth day of therapy and CYFRA 21-1 before the start of a second round of therapy.

Detecting Lung Cancer with Circulating Nucleic Acids

Out of a subgroup of 270 patients with good clinical status, 84 had cancers that progressed. The combination of markers correctly identified 30% of these patients as non-responders. If the markers had been used to manage treatment, they could have allowed a change of regimen for the non-responders before the start of the second round. Importantly, the markers did not point to any of the remaining 70% of the patients in this group whose tumors responded well to the initial treatment.
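
Recast in standard screening terms, those counts work out as in this short Python sketch. The patient numbers come from the study summary above, while the sensitivity-and-specificity framing is this sketch's, not necessarily how the authors reported their results.

# Figures from the summary above: 270 patients with good clinical status,
# 84 of whom progressed; the marker combination flagged 30% of the
# progressors and none of the responders.
progressors = 84
responders = 270 - progressors               # 186 patients (the "remaining 70%")

true_positives = round(0.30 * progressors)   # ~25 non-responders flagged early
false_positives = 0                          # no responders were flagged

sensitivity = true_positives / progressors
specificity = (responders - false_positives) / responders
ppv = true_positives / (true_positives + false_positives)

print(f"sensitivity: {sensitivity:.0%}")        # ~30% of non-responders caught
print(f"specificity: {specificity:.0%}")        # 100%: no responder falsely flagged
print(f"positive predictive value: {ppv:.0%}")  # every flag was a true non-responder

On these figures the panel's appeal is its perfect positive predictive value: every patient it flagged really was a non-responder, so acting on a flag would not have taken an effective regimen away from anyone.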

Indeed, research on using circulating nucleic acids to detect lung cancer may be ready to move to the clinic. A literature review in Clinical Chemistry (October 2006) found that based on what is now known, it would be possible to develop “a simple blood test” for screening, staging, prognosis, and evaluating response to treatment. The authors called for large studies “to integrate blood marker-based assays into the clinical setting.”

The next meeting devoted to circulating nucleic acids research will be held in May 2007, in Moscow. But before too long, Swaminathan predicts, this research will simply become part of the disciplines in which it is applied. Its techniques are already being adopted by specialists in fetal medicine, oncology, and other diseases. “I see that in a few years’ time, there will be a subsection of oncology conferences,” he says. “It is more important for oncologists to show other oncologists what is happening.” The research has already become a part of fetal medicine conferences. Wherever they share their findings, researchers in this field will continue to work toward earlier, faster, and more accurate diagnosis and management of disease.

Also read: The Primordial Lab for the Origin of Life


About the Author

Jill Pope writes about science and policy issues. She served as Senior Editor for The Cutting Edge: An Encyclopedia of Advanced Technologies (Oxford University Press, 2000).