Reflecting on ethical considerations posed by the famous Irish-born satirist nearly a century after his play critiqued aspects of the medical profession.
You’ve invented a “miracle cure” for tuberculosis. Unfortunately, you have limited supplies of the drug and have room for only one more patient. You must choose between saving the life of a penniless doctor dedicated to helping the poor or a talented but dissipated artist whose neglected wife attracts your eye. Who would you save?
That’s the “dilemma” facing the protagonist of The Doctor’s Dilemma, George Bernard Shaw’s 1906 satire on the medical community and the conflict between the arts and sciences. To examine the play’s treatment of medical ethics and its relevance to today’s physicians, The New York Academy of Sciences (the Academy) co-sponsored a panel discussion and play reading at the Graduate Center of the City University of New York on March 30, 2004. The event, The Doctor’s Dilemma: Quirks and Quacks, was presented with the Martin E. Segal Theater Center, the Bernard Shaw Society, and the City University’s Science and the Arts Program.
After actors from the Juilliard School of Drama read three scenes from Shaw’s play, a panel discussion was held with Mark Horn, MD, MPH, director of medical alliances at Pfizer’s Alliance Development; Howard Kissel, senior theatre critic for the New York Daily News; and John T. Truman, MD, MPH, professor and deputy chairman of the Department of Pediatrics, Columbia University/Children’s Hospital of New York-Presbyterian. Rhonda Nathan served as moderator.
A Lifelong Skeptic
Kissel opened the discussion by stating that Shaw’s portrayal of doctors was too harsh. “Shaw was cantankerous and his plays were often designed to provoke controversy,” he said. The first scene, in which a group of doctors congratulate a colleague on his knighthood, rapidly turns into a debate over which medical procedure is superior. The lead character believes that germs must be coated with a chemical before the body’s immune system can fight them; a surgeon insists that nearly all diseases are caused by blood poisoning; and a third claims disease can be avoided by cutting out everyone’s nuciform sac.
Shaw’s implication is that doctors promote their procedures to gratify their ego or their wallet rather than the needs of the patient. “Shaw counted doctors among his friends, but also remained a lifelong skeptic toward the medical profession,” Kissel said.
While Horn agreed that Shaw had “a contemptuous attitude toward doctors,” he thought the play was a parody that remains timely and contains some uncomfortable truths about medicine. For example, the play’s premise—how a doctor decides whom to treat when there is a limited supply of medicine—echoes the “health care rationing” of medical services offered by today’s HMOs.
The Poor Doctor versus the Brilliant Artist/Scoundrel
Truman noted that Shaw presented the conflict in terms of class and profession—the “poor doctor” versus the “brilliant artist/scoundrel.” But Shaw’s play harks back to an era when health care decision-making was influenced by the idea of “social utility.”
“In those days, if you wanted to get a kidney transplant, there was a ‘scorecard’ determining whether or not you would get it. You were rated according to what you had to offer to society, and that determined whether or not you got a kidney transplant,” he said. Such reasoning (based on Social Darwinism) is obsolete today, he felt, although the wealthy still have more treatment options than the poor.
In addition, the play’s satire on “medical fads” (the doctors each promoting their new procedures like salesmen) still holds up well today, according to Horn. He pointed out that “current styles of intervention” are a fact of health care and that medicine is constantly changing. For example, there has been a radical reassessment of coronary disease in recent years, and once-heralded treatments such as hormone replacement therapy have come under scrutiny.
Truman observed that the doctors in the play represent different schools of thought. “In the field of ethics, these doctors favor their own procedures because they may have an ‘unconscious bias’ toward their own specialty,” he said. He told an amusing anecdote about Rudy Giuliani, who reportedly visited several doctors during his treatment for prostate cancer. When he went to a radiation therapist, the doctor suggested radiation therapy; when he went to a surgeon, the doctor suggested surgery. “That does not mean doctors are bad; they do believe they have the correct solution,” he said.
What Does it Mean to be a Human Being?
Kissel cautioned, however, that we should remember when the play was written. “In the last 50 years, medicine has been miraculous,” he said. Shaw’s play was written at a time when many medical procedures were still unsafe, and it was not uncommon for people to die from them. Hence, the debate over vaccines in the play did not involve big companies like Pfizer, but vaccines that had been manufactured by farmers.
The panelists also questioned whether Shaw’s delineation of the line separating the arts and sciences remains true today. Are there irreconcilable differences between these two branches of human thought? Will they remain forever at odds?
Horn commented that artists and scientists differ in temperament, and that this difference creates a barrier to communication. When human beings speak different languages, he said, they might resort to “contempt, which would serve as a camouflage to hide feelings of fear over what they don’t understand.”
“Nowadays, the arts have gone off in so many weird directions that the gap between art and science is much less than it was in Shaw’s time,” Kissel added. To him, the issue appeared to be more about humanity. “I think the more important question posed by the play is, what does it mean to be a human being?”
On February 26, 2004, the U.S. Food and Drug Administration (FDA) approved Avastin (bevacizumab) for the treatment of metastatic colorectal cancer. It is the first drug to receive FDA approval that works expressly by blocking tumor angiogenesis, the process by which tumors generate new blood vessels. (An important process in embryonic development, angiogenesis is rare in healthy adults, occurring only during menstruation and wound healing.) While Avastin is not a cure, it is being hailed as a significant advance in the long fight against cancer.
Researchers committed to the antiangiogenic approach see the drug’s approval as an early vindication of their central hypothesis: that tumors need a reliable blood supply to grow and spread, and that cutting off that supply could lead to tumor death.
Curiously, most tumors never become angiogenic in the first place. The ability to attract new blood vessels is a special talent, and most human carcinomas are underachievers. The majority remain tiny, stable, and in situ. Their cells grow and die in balance. Ineffectual and unable to feed themselves, they’re as common as they are non-threatening to human health and longevity.
A determined minority of tumors do succeed in tipping the scales in their own favor. To outsmart the body’s immune defenses, they undergo hundreds of mutations at the genomic level, acquiring critical survival skills along the way. At a certain point, the successful tumor switches from non-angiogenic bystander to angiogenic troublemaker. Able to commandeer the blood vessels it needs to grow and proliferate, a tumor may become unstoppable and, with exceptions (e.g., the benign yet highly angiogenic adrenal adenoma), malignant.
The Angiogenic “Switch”
The angiogenic “switch” was the focus of a recent presentation at The New York Academy of Sciences (the Academy) by Judah Folkman, widely considered the founding father of angiogenesis research. Folkman led a panel of experts, including Robert S. Kerbel, Napoleone Ferrara, Stuart Peltz, and Gavin Thurston, in a discussion of anti-angiogenic therapeutic approaches to treating cancer – ideally, in Folkman’s view, before the switch even occurs.
With an accretion of data and insights gleaned from more than 30 years of research, the panelists clarified the scientific foundation for what has been termed the “fourth arm” of cancer treatment (chemotherapy, radiation therapy, and surgery are numbers one, two, and three). The speakers also brought an enthusiastic audience up to date on the latest discoveries, clinical trials, and an entirely new class of drugs – angiogenesis inhibitors – that could help win some major battles in the war on cancer, not through direct attack, but by depriving the enemy of its lifeblood.
New Drugs, New Targets
Instead of aiming their firepower at tumors themselves, anti-angiogenic drugs target the genetically stable endothelial cells in and around the tumor bed. These normal host cells not only provide a nourishing environment for the tumor, but they may also secrete a number of proteins that stimulate tumor cell growth. In other words, they aid and abet the whole undertaking.
Only one meaningful target for the new angiogenesis inhibitors has emerged thus far: vascular endothelial growth factor (VEGF). In his presentation, Napoleone Ferrara, the pioneering Genentech scientist who co-discovered VEGF in 1989 and also led the research effort behind the development of Avastin, explained that VEGF expression occurs in the vast majority of tumors. “To be effective,” he said, “angiogenesis inhibitors must suppress VEGF in both tumor-based and stromal (circulating) cells.”
Today, there are about 30 angiogenesis inhibitors in clinical trials in the United States and 50 worldwide. For example, SU 11248, a promising Pfizer compound, blocks the VEGF receptor, while the newly approved Avastin neutralizes the protein altogether.
Gavin Thurston, director of Cancer Angiogenesis at Regeneron Pharmaceuticals, shared data published in the Proceedings of the National Academy of Sciences on his company’s new drug, VEGF-Trap. In addition to completely blocking new vessel creation in many established tumors, the drug has gone beyond halting tumor growth to actually shrinking tumors, according to some preclinical studies. PTC Therapeutics, too, has announced a promising anti-angiogenic compound. The company has screened tens of thousands of molecules, said Stuart Peltz, the company’s founder and CEO, and continues to make strides in understanding RNA biology – the cornerstone of its research agenda.
The Unintended Anti-Angiogenic Effects
Interestingly, several speakers pointed to the unintended anti-angiogenic effects of a number of existing drugs. Herceptin, a monoclonal antibody sometimes used in late-stage breast cancer, increases the expression of thrombospondin-1, one of the body’s internally produced angiogenesis inhibitors, by 500% and decreases VEGF as well. Iressa (AstraZeneca) takes aim at VEGF by blocking its production in tumor cells.
It turns out that thalidomide, recently approved in Australia for the treatment of multiple myeloma, suppresses circulating levels of endothelial cells. Several commercially available drugs, including interferon, Celebrex (Pfizer), and Velcade (Millennium Pharmaceuticals), inadvertently block angiogenic activity. Even tamoxifen, in use for more than 20 years to fight breast cancer, has hidden antiangiogenic potential.
The human body produces 12 known endogenous angiogenesis inhibitors, including angiostatin, endostatin – a protein some 600 million years old – and thrombospondin. In the near future, Folkman said, small-molecule-based therapies may be able to prod the body’s anti-angiogenic proteins into action to fight tumor growth.
In the meantime, anti-angiogenic therapies are inexorably entering the mainstream of cancer treatment. Clinicians are using them alone or in combination with chemotherapy drugs, radiation therapy, and surgery, and they’re rethinking one of oncology’s most cherished orthodoxies: the idea that the maximum tolerated dose (MTD) of a drug is always the best dose.
Metronomic Chemotherapy
Robert Kerbel, professor of medical biophysics at the University of Toronto, has taken on the MTD oncology model in earnest. “Many oncologists believe that if a drug isn’t highly toxic, it can’t be working,” said Kerbel. High levels of toxicity are often accepted as the norm, a necessary by-product of a drug’s cancer-fighting ability. But in 2000, Kerbel and others started investigating a kinder, gentler form of treatment based on combining lower, more frequent doses of conventional chemotherapy drugs, such as paclitaxel, vinblastine, and cyclophosphamide, with newer, targeted anti-angiogenic therapies. So-called metronomic chemotherapy stipulates dosing schedules that mimic the regular tick of a metronome.
The approach is minimally toxic, said Kerbel, and, because it appears to be endothelial cell-centric, it comes with an even more welcome advantage: “Unlike cancer cells, endothelial cells shouldn’t acquire drug resistance as easily, precisely because genetically, they’re stable,” he said. In contrast to MTD, low, steady dosing also goes far toward preventing or delaying drug resistance. Toxic to cancerous tumors and healthy tissues alike, conventional higher-dose chemotherapy is administered in an all-or-nothing fashion, with several weeks between successive doses. Its anticancer effects may be dramatic in the short run, but tumors may quickly learn to bounce back, especially during breaks in treatment.
“Even patients who have relapsed will sometimes respond to the same drug administered on a metronomic schedule,” Kerbel added. They may do even better, he speculated, with the addition of an anti-angiogenic agent. This approach, he believes, makes good clinical as well as economic sense: “Combining older chemotherapy drugs with newer angiogenesis inhibitors provides an incentive for developing new drugs and using older drugs to keep costs down.”
The Quest for Surrogate Markers
All this comes as welcome news for people who have already been diagnosed with cancer – even for those with late-stage disease. But with all its present-day benefits, angiogenesis research may prove even more relevant for the eventual prevention of tumors than for their destruction after the fact.
Researchers envision a time when angiogenesis inhibitors will be used to treat cancer before it becomes visible. To do that, they need to identify surrogate biomarkers that appear in the blood or urine in tandem with the appearance of tumors, even before angiogenesis truly sets in.
The search for surrogate markers is still at an early stage, but it is no quixotic exercise. Calcitonin, for example, is a known marker for medullary cancer of the thyroid – a deadly disease that, even after surgery, often recurs in the chest and results in mortality rates of 80% and up. Instead of administering late-stage treatment and consigning patients to near-certain death, oncologists may in the future “treat” the calcitonin early on, an approach that Folkman compared to treating an infection.
Major Advances Since 1930
“Before 1930, there was no way to treat an infection except surgery,” said Folkman. “Doctors were always asking, ‘Where’s the pus?’ The infection’s location was all they had to go on. Today, we don’t have to identify its exact location. Guided by blood tests, doctors treat infections with antibiotics, and they go down.”
All of which means it is time to start moving away from the location paradigm. Instead of focusing exclusively on the tumor – where it is, what it’s doing, how fast it’s growing – oncologists may soon be able to move toward an infectious-disease model of treatment. An angiogenesis inhibitor could be used to treat rising markers in the blood and urine, markers that signal the presence of cancer that is still in situ. Once we’ve identified such markers, hypothesized Folkman, we may be able to thwart cancer by preventing the angiogenic switch from occurring in the first place and making sure tumors remain underachieving nonentities in perpetuity.
A regular diet of quarter-pound hamburgers may not do much for your health. Yet could as little as a microgram of orally ingested proteins prevent or treat devastating disease?
For two-and-a-half days, scientists and clinicians from around the world gathered in Mount Sinai Medical Center’s Stern Auditorium to explore this promising approach to human autoimmune disease: Feeding antigens might inhibit subsequent immune responses. The principle, called oral tolerance, holds out hope for such diseases as multiple sclerosis, diabetes and rheumatoid arthritis.
Oral Tolerance: Mechanisms and Applications, held Oct. 23-26, 2003, was the second conference on this potential therapy sponsored by The New York Academy of Sciences (the Academy). The years in between have brought solid evidence for oral tolerance in animal models, but the path to harnessing it in humans remains unclear. By the conference’s end, discussion had turned from the future of antigen-specific therapy to a debate on the value of advances since the first meeting, back in 1995.
Putting Principles into Practice
Oral tolerance sounds like an ideal mechanism for manipulating human immune responses, where drug safety is a paramount concern. All it takes is ingesting tiny amounts of proteins. As Howard L. Weiner, the Robert L. Kroc professor of neurology at Harvard Medical School, explained, “It’s nontoxic and can be given on a chronic basis.”
Animal models firmly support the potential therapeutic value of this simple approach. Numerous – and highly reproducible – studies have shown that feeding disease-related proteins can suppress disease. However, it has proven difficult to translate this success to the clinic.
Clinical attempts have yet to demonstrate that this approach can be used successfully in humans. Among the failures is a well-designed diabetes trial presented in the meeting’s final session that several speakers cited as a particularly “beautiful” study. Even in the early stages of type 1 diabetes, researchers found that oral insulin did not prevent or delay the development of disease.
“It was a very convincing negative study,” said Warren Strober, acting chief of the Laboratory of Clinical Investigation at the National Institute of Allergy and Infectious Diseases and one of the conference organizers. Results such as these are causing researchers to focus their attention on new findings that support and elucidate the underlying mechanisms.
Control of Immunity
Interest was first triggered when scientists found that the same proteins that induce disease when injected can suppress disease when administered orally. Because autoimmune diseases such as multiple sclerosis and arthritis are characterized by a lack of suppression, it is believed that oral tolerance might prove useful as a therapeutic for these conditions.
There are two ways to induce a tolerant response in animals. First, small doses of antigen – milligram or even microgram amounts – are believed to act locally in the gut through regulatory T cells. Second, larger doses of antigen might get into the systemic circulation. When the proteins in, for example, a hamburger interact with peripheral organs such as the spleen and lymph nodes, they can result in the deletion of T cells that react to these antigens.
A clinical trial involving multiple sclerosis patients found evidence for the first response in humans. Although the trial failed to demonstrate any clinical difference between patients who were fed an antigen and those who were not, regulatory T cells were identified in the fed group.
“It’s proof that at least one of these mechanisms exists in humans,” said Caroline Whitacre, professor and chair of the Department of Molecular Virology, Immunology, and Medical Genetics at The Ohio State University in Columbus. This finding suggests that much of the research presented is important not only in immunology, but also for the elucidation of oral tolerance pathways in humans.
The first two days of the conference focused on different pieces of these developments. It started with new insights into the control of immunity in the gut. Although the field originated with the oral administration of antigens, it has broadened to include nasal administration as well – what researchers call mucosal tolerance.
The Role of the Mucosal Immune System
The keynote address, by Allan Mowat of the University of Glasgow’s Division of Immunology, Infection and Inflammation, discussed how the mucosal immune system interacts with the body’s broader, systemic immune system. Throughout the conference, speakers continued to compare these alternative routes for antigen delivery.
Dendritic cells, which take up antigens and present them to T cells, were discussed in the second session. It’s thought that enhancing their development could induce more regulatory T cells and thus result in an enhanced tolerant response. “There may be factors that are produced by the epithelium that then subsequently drive dendritic cell maturation,” hypothesized Brian Kelsall, an investigator at the National Institute of Allergy and Infectious Diseases, based on his work with reovirus.
New findings on the role of regulatory T cells also generated significant attention. Andrew Caton, a professor at the University of Pennsylvania’s Wistar Institute, explored “how the diversity of self-antigens affects tolerance induction.” He has found that self-antigens stimulate regulatory T cells through high-affinity interactions with antigen presenting cells in the thymus.
Fiona Powrie, who heads an immunology research group at the Sir William Dunn School of Pathology, University of Oxford, has found that purified populations of CD25 positive regulatory T cells with specificity for self-antigens can cure intestinal inflammation in animals. “That has a lot of traction,” said Strober. “If you could generate these cells as part of oral tolerance, you could turn off inflammation.”
Translating to Humans
A later session turned again to animal studies. It “focused on mechanisms: what cell is presenting the orally or nasally fed antigen, what happens to the cell it is presented to, what are the important molecules for those presentations to take place, what are the homing molecules to take cell populations where they need to go, what are some of the enzymes that are important for promoting tolerance versus immunity,” summarized Whitacre.
The effort to prove oral tolerance in humans has generated considerable interest beyond scientific circles. One book, published in May 2002, drew particular attention. Human Trials: Scientists, Investors, and Patients in the Quest for a Cure features Weiner himself as the field’s unwavering champion. In it, Susan Quinn chronicles the efforts of the company Weiner founded, AutoImmune, Inc., to develop the concept of oral tolerance into a drug.
AutoImmune’s efforts, however, have not succeeded thus far. Despite the depth of the new findings into the mechanisms of oral tolerance, therefore, the conference’s closing session returned to the challenge of demonstrating success in the clinic. “It’s a testament to people’s tenacity that we’re all here today believing that we might be able to make something of it,” remarked Norman A. Staines, a professor in the Infection and Immunity Research Group at King’s College London.
Promise, Tenacity, and Uncertainty
Often during the conference, the sheer drama of promise, tenacity and uncertainty could be felt. One such moment came during the concluding presentation, by Charles O. Elson, a professor of medicine at the University of Alabama at Birmingham. “As far as getting insight into the fundamental processes that are involved” in oral tolerance, Elson said, “I don’t believe we’ve advanced an awful lot.”
“I can’t disagree more,” Strober hurried to say after the meeting. “We have made enormous progress.” As more is learned about regulatory T cells, it is inevitable that scientists will learn how to induce oral tolerance in a way that allows therapeutic effects, he continued. With each new study, researchers are learning more about how these T cells are generated, expanded and maintained. “The ‘big news’ from this conference is that we are well on our way to this goal.”
Or was uncertainty itself the big news? In the final session, Lloyd Mayer, also one of the conference organizers, addressed those who had presented data from clinical trials in humans. Mayer, professor and chairman of the Immunobiology Center at Mount Sinai, asked “how each one of you decided on the dosing regimen and the scheduling of your oral feeding.” His question pointed to the difficulty of translating doses from animals to man. As he argued, “if we move forward doing human trials with doses that are generated by hand waving, we’re going to wind up with conflicting and variable results.”
Understanding the Mechanisms
How did the field get to this point? Oral tolerance, strictly defined as the active non-response to a foreign protein administered through the mucosal route, was initially described in the early 1900s. The field experienced a resurgence in the late 1970s, and studies over the next decade suggested that there are “cells acting to suppress immune responses both locally and systemically,” explains Mayer. At the time of the Academy’s first conference in 1995, the field was exploding. “When we began working on oral tolerance, we thought that it would be as simple as feeding an antigen and getting positive results in people,” said Weiner, one of the conference organizers. “As we all know, that hasn’t happened.”
What has been learned since the promise in animals was first reported? Essentially, everything reported at the conference about the mechanisms underlying oral tolerance in animals was unknown nine years ago. This includes basic information about the mucosal environment in the gut, the processing of antigens by dendritic cells, and the generation and role of regulatory T cells, as well as advances in animal models of oral tolerance.
It is now known, for example, that regulatory cells can work in what’s called a bystander fashion. This means that “they don’t have to be responsive to precisely the right protein. If it’s a protein in the same organ it’s probably good enough,” explains Whitacre.
Some of the cytokines that are involved in shutting down inflammatory responses have been defined, elucidating the mechanism by which regulatory cells might work. “What you have in oral tolerance is the induction of T cells that produce transforming growth factor beta (TGF-β), which then induces other regulatory cells to produce interleukin 10 (IL-10),” said Strober, summarizing the current thinking on regulatory T cell function. “These cells could downregulate inflammatory processes.”
More Work Needed
The final conference session focused on the results of several clinical trials. Apart from positive results in smaller phase I and phase II studies, “everybody had data that showed it did not work, which basically we knew, but it was sobering to see it in black and white,” said Mayer, referring to the large-scale phase III studies that were presented. “It was depressing. Oral tolerance is not ready for prime time in humans.”
“At first I was really disappointed,” Whitacre admits. “I wanted one of these trials to be overwhelmingly positive and it just wasn’t.” But, she adds, “it’s really important to know what doesn’t work.”
One common theme was that antigen feeding might be a good adjunct therapy, in combination with other drugs or treatments. In addition, some form of mucosal adjuvant might be required. “I’m convinced that, with careful work, we’ll ultimately get mucosal tolerance to be effective in human diseases,” said Weiner.
So what needs to be done for oral tolerance to succeed? Mayer would like to see all the mechanistic studies that have been done in animals duplicated in humans. Ultimately, the goal is to identify the cells that are involved in tolerance induction and determine the antigen form, regimen and dose required for suppression to occur. This work still lies ahead.
To patients suffering from devastating and untreatable conditions such as spinal cord injuries, the near-limitless potential of stem cells offers a beacon of hope. These cells, if coaxed properly, can grow and divide into almost any cell type in the body. When kept in an incubator and served a specialized cocktail of growth factors, stem cells can give rise to neural cells, heart muscle cells, liver cells, and other cell types.
Finding the right recipe to control stem cells and make them differentiate into the right type of cell when and where one chooses, however, is a major challenge, according to researchers addressing a symposium on October 28, 2003. Titled “Stem Cell Technology: Emerging Science, Therapeutic Potential and Challenges Ahead,” the session was sponsored by The New York Academy of Sciences’ (the Academy’s) Biochemical Pharmacology Discussion Group and the Biochemical Group of the American Chemical Society.
Embryonic and Adult Stem Cells
There are essentially two types of stem cells: embryonic and adult. Embryonic stem cells may have the most promise for treating diseases because they are pluripotent; that is, they can grow, divide, and differentiate into any cell type found in the fetus and adult. Scientists collect these cells at a very early stage of development, when the embryo is a fluid-filled sac containing just a few cells, called a blastocyst.
When removed from the blastocyst and placed in an incubator, some of the cells can be grown for at least two years without any noticeable loss of their pluripotency. Due to ethical concerns about the use of human embryos for research, a presidential decision mandates that federal research funds can be used only for a handful of cell lines, created before August 9, 2001. Many researchers conduct studies on mouse embryonic stem cells as a steppingstone to understanding the human variety.
Adult stem cells may provide many of the same advantages as embryonic cells, without the ethical concerns. Researchers have known for decades that bone marrow contains stem cells that can repopulate a damaged blood supply. Liver, skin, and gut also contain stem cells that can regenerate damaged sections of these organs.
More recently, to the surprise of many developmental biologists, some researchers have found that adult stem cells can differentiate into entirely different tissue types. For example, if given the right stimuli, bone marrow stem cells appear to be able to differentiate into epithelial cells of the liver, kidney, lung, skin and GI tract as well as heart and skeletal muscle cells. In the past year, however, studies have presented alternative explanations to this differentiation and it remains to be proven that pluripotent stem cells exist in the adult.
Cells with “Free Will”
Austin Smith of the Institute for Stem Cell Research at the University of Edinburgh said that embryonic stem cells are attractive to researchers because they can be considered to have “free will.” The cells can remain as stem cells in a seemingly endless cycle of self-renewal, or at a time of their choosing can exit self-renewal and differentiate into entirely new types of cells. But finding the signals that can keep the cells in self-renewal or shunt them towards the development of a specific cell type is a challenge.
If left to their own devices in culture, the cells tend to differentiate into what researchers call an “embryoid body,” a tangled mass of overlapping cells where bone, liver, and even beating heart cells coexist. Smith found that by changing the ingredients of the cell culture medium in which the cells grow, he could make a sort of on/off switch for differentiation into neural cells. The results were published in Cell the same week as the presentation (Cell, 115, 281-292, October 31, 2003).
By trial and error, researchers are finding the key factors that control stem cell growth and differentiation. But might there be a more systematic way to discover why and how embryonic stem cells act as they do? Yes, said Ihor Lemischka, professor of molecular biology at Princeton University. He urged biologists to start thinking of stem cell biology as a system of circuits rather than as individual parts like genes, receptors, and signaling molecules.
The Stem Cell Research Database
To catalogue the complex array of genes involved in stem cell regulation, Lemischka and colleagues started the Stem Cell Research Database. By mining the database for stem cell genes common to mice and humans, Lemischka said, researchers can home in on the most important genes to study (Science, 298, 601-4, Oct. 18, 2002). He is now examining the function of these genes by systematically deleting them and observing how the deletions affect stem cell behavior.
While basic research can be done in mouse embryonic stem cells, some researchers will need to verify their findings in human cells. Melissa Carpenter, who until summer 2003 was director of stem cell biology at Geron Corp. and is now at the Robarts Research Institute in Canada, has worked extensively on characterizing four of the human embryonic cell lines available to researchers who want to use federal funds in their work.
Taking note of the growth patterns, lifespan, and potential of these cell lines is essential if they are to be used for basic research, drug discovery and, eventually, therapies in patients. Curiously, although these cells are all “stem cells,” they do not behave identically in culture. If allowed to differentiate, they each form a different array of cell types. However, using a variety of tests, Carpenter’s team at Geron was unable to find major differences among the undifferentiated hES cell lines, and a microarray analysis of three of the lines found only a 5-10% difference in gene expression.
Stem Cells and the Heart
In related studies, the researchers were able to differentiate the stem cells into neurons (Exp Neurol, 172(2), 383-97, 2001), liver cells (hepatocytes) (Cell Transplant, 12(1), 1-11, 2003), and beating heart muscle cells. These heart cells beat faster when given common cardiac-stimulating drugs and stopped beating altogether when treated with calcium-channel blockers, in a dose-dependent and reversible manner (Circ Res., 91, 501-508, Sept. 20, 2002). Researchers hope that these human cells could someday be injected into a heart attack victim to help repair damaged cardiac muscle.
Meanwhile, George Daley’s group at the Children’s Hospital, Harvard Medical School, is working on embryonic stem cells that can make blood. Just as a bone marrow transplant can restore a leukemia patient’s immune and blood systems, a single embryonic stem cell should in theory be able to do the same thing. But so far, attempts to demonstrate the technique in mice have failed: the transplanted embryonic stem cells, like cells of the mouse yolk sac, fail to produce blood cells in adult mice.
Daley and his group have genetically altered the cells so they make extra quantities of a protein called HoxB4, which drives the cells to engraft in mice and differentiate into both branches of the blood cell family – lymphoid and myeloid. The researchers found that they could ramp up blood cell production even more by turning up expression of another gene, Cdx4, which may work by boosting HoxB4 expression (Nature, 425, 300-6, Sept. 18, 2003). By enhancing the expression of both genes, Daley and colleagues found that 70-80% of the host mice’s blood cells derived from the donated stem cells.
Great Hope
Embryonic stem cells raise great hopes and great ethical concerns, but adult stem cells may provide therapeutic advantages without the ethical concerns. However, warned Diane Krause of Yale University, the jury is still very much out on how much potential these cells have – and even on whether highly plastic adult stem cells exist. Recent findings suggest that fusion between the host’s cells and the donated cells, rather than true differentiation, may account for the observed results.
Nonetheless, Krause’s lab has done pioneering work showing that bone marrow stem cells transplanted into mice have the potential to differentiate into lung cells and liver cells (Cell, 105, 369-77, May 4, 2001), and she presented recent unpublished work indicating that this can occur without fusion. However, other researchers have recently shown that such results can be due to fusion between the bone marrow cells and the recipient mouse’s lung or liver cells. “Even if it is fusion,” said Krause, “we need to ask, is this something that we can use for therapeutic benefit?”
While great strides have been made in finding individual factors that control the growth and differentiation of stem cells, much remains to be done to understand and control them. Lemischka summed up the challenge by quoting science philosopher and mathematician Jules Henri Poincaré, “Science is built up of facts as a house is of stones, but a collection of facts is no more science than a heap of stones is a house.”
Scientists and doctors now have a better sense of how the adolescent brain develops and offer guidance on how we can support young people during this often-vulnerable period.
The two young teens meet at a dance, and in an instant are mutually enraptured. They declare their love for each other, and over the next several days think of nothing and no one else. The idea of being apart is worse than death – an untimely fate that befalls them both.
The tale of Romeo and Juliet actually predates Shakespeare’s 16th century play by several centuries, illustrating that the emotionally charged nature of adolescence is not new. Along with the traditional throes of puberty, today’s adolescents juggle more homework, tightly scheduled after-school hours, a barrage of media messages, and the temptations of smoking and alcohol.
It’s no wonder that the adolescent years – the transition between childhood and adulthood – can be the most sensitive time in a person’s life. Indeed, although physical strength and reaction times peak in these years, morbidity and mortality increase by 300% in this group.
In September, 31 experts gathered in New York City to address the changes and challenges of adolescence at a meeting called Adolescent Brain Development: Vulnerabilities and Opportunities. The conference was hosted by The New York Academy of Sciences (the Academy) in collaboration with the Robert Wood Johnson Foundation and the Tobacco Etiology Research Network. The conference was co-sponsored by the National Institutes of Health.
Rooted in Biology
Investigators at the conference agreed that much of the behavior characterizing adolescence is rooted in biology, intermingling with environmental influences to cause teens to conflict with their parents, take more risks, and experience wide swings in emotion. A lack of synchrony between a physically mature body and a still-maturing nervous system appears to be a primary reason.
“It’s like turbo-charging an engine without a skilled driver,” explained conference organizer Ronald E. Dahl, MD, of the University of Pittsburgh Medical Center. Although adolescents reach adult levels of decision-making by age 15, they make poor decisions in real life. “Adolescents make a lot of decisions that the average 9-year-old would say was a dumb thing to do,” he added. Dahl delivered the keynote address in place of Alan I. Leshner, PhD, of the American Association for the Advancement of Science, whose travel plans were cancelled by Hurricane Isabel.
The interaction of biological changes and environmental challenges that make adolescence a time of increased vulnerability can also make it one of great opportunity. The adolescent brain is built to learn, amassing more knowledge in high school and college than at any other time. With the right balance of guidance and understanding, adolescence can be relatively smooth, and for most kids that’s exactly the case: the majority of teens get through those difficult years just fine.
Taking Risks, Despite the Odds
Driving too fast. Sneaking out. Smoking and drinking. Adolescents know very well that these activities are unlawful, dangerous and unhealthy. So why do they do it?
Part of the answer may be found in brain chemistry, explained Rudolf N. Cardinal, PhD, of the University of Cambridge in the UK. His studies in rats demonstrate that low levels of the neurotransmitters dopamine and serotonin (5-HT) make the animals more impulsive, causing them to choose smaller, immediate rewards over larger, delayed ones. In people, such impulsivity has been linked to drug addiction, personality disorders and attention deficit hyperactivity disorder.
Laurence Steinberg, PhD, of Temple University, emphasized that researchers who explore adolescent behavior must consider the context in which risk-taking behaviors are occurring. Adolescents are more likely to engage in these behaviors when they’re in groups than when they’re alone. Moreover, real-life situations are usually highly charged with emotion, unlike the hypothetical events postulated in investigational settings.
Steinberg added that adolescents’ biological sensitivity to rewards differs from that of adults, prompting them to seek higher levels of stimulation to achieve the same feeling of pleasure. “Increased risk-taking in adolescence is normative, biologically driven and inevitable,” he concluded. Rather than trying to change this behavior, Steinberg suggested such measures as increasing the driving age, raising the price of cigarettes, and enforcing laws restricting alcohol sales.
The Images Tell All
Magnetic resonance imaging (MRI) data can illustrate the changes occurring in the adolescent brain. Jay N. Giedd, MD, of the National Institute of Mental Health, presented MR images showing that the brain’s gray matter – which governs thought, decision-making, movement and sensation – thickens during adolescence, peaking around age 11 in girls and 12 1/2 in boys before thinning down to a stable level by age 25.
Such thickening is not due to more nerve cells, but to an increase in connections between neurons. Although the brain then has more choices of pathways through which to send signals, those pathways are not necessarily faster, making some processing inefficient. White matter – which controls motor functions – increases linearly during adolescence, while the cerebellum (which governs balance) also grows in volume. Giedd contends that adolescence is therefore the most efficient time for teens to take on such motor activities as sports, drawing and instrumental music.
One of the last areas of the brain to mature is the prefrontal cortex. Beatriz Luna, PhD, of the University of Pittsburgh Medical Center, used behavioral and functional MRI tests to show how the different parts of the brain learn to collaborate better, and processing becomes more efficient, as adolescents age and the prefrontal cortex matures.
Hormones and Behavior
Are the behaviors we see in adolescents simply a result of “raging hormones”? Certainly that’s not the complete answer, but changes in reproductive and stress hormones can influence behavior. Hormonal changes during adolescence can also trigger depression.
Elizabeth A. Young, MD, of the University of Michigan, asserted that reproductive hormonal changes affecting stress systems during puberty may sensitize girls to stressful life events. Studies show that depressed premenopausal women have a higher baseline level of the stress hormone cortisol than either depressed postmenopausal women or depressed men. The hormone estradiol has not been shown to make women more vulnerable to stress, but progesterone may exaggerate the stress response. During puberty, changes in reproductive hormones can therefore make girls more sensitive to the effects of stress and can trigger adolescent depression, explained Young.
A Heightened Sensitivity
An elevated sensitivity to rewards is most apparent in teens’ use of cigarettes and alcohol. Some 17% of high school seniors say they smoke daily, and 57% report they’ve tried it. More than 80% have tried alcohol, with about 66% of them reporting they’ve had at least one episode of consuming more than five drinks.
Unlike adults, adolescents seem to be more sensitive to the pleasurable effects of nicotine and alcohol, and less likely to experience the adverse effects. Frances M. Leslie, PhD, of the University of California, Irvine, showed that adolescent rats are more likely than adult rats to choose nicotine over a saline placebo, a difference she attributed to an immature forebrain. George F. Koob, PhD, of The Scripps Research Institute, corroborated those findings in his own investigations, adding that adolescent rats are less affected by nicotine withdrawal than adults. “This research provides insight into how and why adolescent humans are so sensitive to nicotine, and why this is such a critical period for intervention,” he concluded.
Why do some adolescents become addicted to smoking, while others quit and move on?
Robin J. Mermelstein, PhD, of the University of Illinois at Chicago, evaluated a group of teens to find out. They completed a questionnaire on a handheld computer seven times a day during a one-week period to assess their moods and activities, and data were correlated with behaviors such as smoking and drinking. She found that smokers are more likely to have friends who smoke and to feel a mood-boosting effect from smoking, while people who have never smoked are less likely to hang out with smoking friends. Moreover, those who try cigarettes but decide not to smoke are less likely to feel a mood boost from smoking.
The Impacts of Alcohol Consumption
Many teens who drink experience alcohol’s pleasurable effects without its negative ones. H. Scott Swartzwelder, PhD, of Duke University Medical Center, demonstrated that ethanol causes memory deficits in adolescent rats by affecting the hippocampus, which regulates conscious memory. He also showed that alcohol impairs memory in human adolescents. Yet adolescents appear to be less susceptible than adults to the sedative effects of alcohol, such as sleepiness. “During adolescence, it may be easier to drink the brain to an impaired state without realizing it,” Swartzwelder contended.
Alcohol-related memory impairment may be long-term and cumulative, suggested Sandra A. Brown, PhD, of the University of California, San Diego. Alcohol-dependent adolescents – those who reported starting early and drinking more than 700 times in their lives – had a 10% lower ability to retain verbal and non-verbal information than nonalcoholic controls, even after a three-week abstinence period. Functional MRI studies confirmed the memory impairment, though Brown noted that it is not yet possible to say whether the neurological differences predated the alcohol abuse or were caused by it.
Studies in macaque monkeys illustrate that alcohol may also increase impulsivity and risk-taking by lowering cerebrospinal fluid (CSF) levels of the serotonin metabolite 5-HIAA. J. Dee Higley, PhD, of the National Institute on Alcohol Abuse and Alcoholism, reported that monkeys with low CSF levels of this metabolite are more likely to take long, risky leaps between trees than those with normal levels. They’re also more aggressive, socially isolated, intolerant of alcohol’s toxic effects, and prone to over-consume alcohol. In addition, the environment plays a role: monkeys reared by their peers are twice as likely to consume too much alcohol as those raised by their mothers.
Just Let Them Sleep?
“Of all the potent insults to the adolescent brain, sleep deprivation is the most widespread,” said Ruth Benca, MD, PhD, of the University of Wisconsin. Without adequate sleep, people can experience deficits in learning, memory, attention, concentration, and psychomotor function, and get into more auto accidents.
Many a parent can recall problems in waking a slumbering adolescent for school. But research shows adolescents’ need to sleep late is biologically ingrained. Their bodies naturally want to go to sleep later at night. And because they need just over nine hours of sleep, they want to sleep in. But the demands of early school starting times and other activities prohibit that, explaining why so many of them make up for lost sleep on weekends.
Theresa M. Lee, PhD, of the University of Michigan, and Mary A. Carskadon, PhD, of Brown Medical School, both reported differences in sleep phase between adolescents and adults. “Many adolescents sleep too little, and in the wrong phase,” concluded Carskadon.
The answer may be simple: start high school later. But in many towns, school busing schedules prohibit that: the buses that take kids to high school in the early morning are needed later to transport elementary school students. A later school start may also cut into the time allotted for after-school activities. It’s not an easy problem to solve. Added David F. Dinges, PhD, of the University of Pennsylvania School of Medicine, “Until we get serious about time and sleep and school start time, it’s going to be difficult for kids.”
Getting Through Adolescence
What distinguishes the kids who travel through adolescence smoothly from those who navigate a rockier path? Ann S. Masten, PhD, of the University of Minnesota, reported that those with the supervision of caring adults, good intellectual skills, a positive self-perception, and a positive social group are most likely to fare well. Strong bonds to school, spirituality, and community also help. Several participants agreed that we should not label adolescence solely as a trouble-prone period. “We need to avoid characterizing some adolescents as having ‘bad brains’ or ‘problem brains,’” said Daniel P. Keating, PhD, of the University of Toronto. “That’s too simplistic.”
“Adolescents are more vulnerable in risky situations because of the immaturity of their brains,” concluded Elizabeth Cauffman, PhD, of the University of Pittsburgh. “By explaining these changes, we have a better chance of understanding their behavior.”
Medical advances in recent years have enabled doctors and other health professionals to better understand the scientific mechanisms behind diabetes, which in turn is enabling them to better treat patients.
A typical supper in Sunflower County, Mississippi, might start with a basket of hot fried cornmeal hush puppies, followed by a heaping plate of spicy barbecued ribs or crispy fried catfish, topped off with a hefty slice of sticky pecan pie, and washed down with a frosty glass of generously sweetened iced tea. To many, this mouthwatering meal may sound like heaven, but for the tens of thousands of residents of this Mississippi Delta community, it could also be a recipe for diabetes.
Although the Delta is famed for its blues and gospel music, lush fields of cotton, and delectable culinary contributions, it also has the unfortunate distinction of having the highest per capita incidence of diabetes in the United States. Due to an ill-fated combination of genetics, ethnic factors, poverty, cultural obstacles, and a downright unhealthy diet, 10.3% of Mississippi’s population has diabetes, with 7.7% having the type 2 variety. In Sunflower County alone – home to some 40,000 people – one in five residents has diabetes.
But big changes are afoot in Sunflower County, noted Scott Nelson, MD, a family physician and Mississippi native. Nelson was one of five presenters who spoke at a meeting in June called Addressing the NEW Diabetes Epidemic: Uncontrolled Diabetes. The gathering – a conference for science writers – was supported by an educational grant from Aventis Pharmaceuticals Inc. and was hosted by The New York Academy of Sciences (the Academy).
Overcoming Cultural and Financial Obstacles
Public health programs have been started in an effort to overcome the cultural and financial obstacles that prevent many Sunflower residents from adequately controlling their diabetes. Moreover, these programs may serve as models for nationwide efforts to control the rapidly escalating epidemic of type 2 diabetes. The conference presenters addressed the physiological basis of type 2 diabetes, its potential complications, the importance of self-monitoring, the growing role of insulin in its treatment, and new approaches with a greater chance of helping people manage their disease.
Stephen N. Davis, MD, chief of the Division of Diabetes, Endocrinology and Metabolism at Vanderbilt University School of Medicine, described the differences between type 1 and type 2 diabetes. Type 1, the type most commonly seen in children, is characterized by destruction of the insulin-secreting beta cells of the pancreas, and results in a lack of insulin. Type 2 – which is commonly called “adult-onset” diabetes, but is now also being detected in children – may feature resistance to insulin and result in insulin deficiency, with beta cells becoming progressively dysfunctional.
Of the 17 million Americans who are estimated to have diabetes, 90-95% have the type 2 variety, but some 5 to 6 million of them don’t know it. Millions more have impaired glucose tolerance, a form of “prediabetes” that can sometimes lead to diabetes if left unchecked. And the problem is only getting worse, with a five-fold increase in the incidence of type 2 diabetes noted during the latter half of the 20th century in the U.S. “You can appreciate what a large public health problem that is,” asserted Davis.
A Genetic Component
Doctors agree that treating diabetes requires a team approach. At a panel discussion, from left: Stephen N. Davis, Richard S. Beaser, Scott Nelson, Alan M. Jacobson, and Stephen Brunton. Photo by Michael Gaffney.
So what can we do about it? The disease has a strong genetic component, a risk factor that can be compounded by an unhealthy lifestyle. Exercise helps by moving glucose from the bloodstream into the muscles. Since fatty acids decrease glucose uptake by the muscles and increase glucose production by the liver, following a diet low in fat can reduce diabetes risk. And different medications work by helping the body to regulate blood glucose levels.
“Despite great advances over the last 10 years, and despite knowledge that if we can control blood glucose to normal levels we can reduce the complications and burden of diabetes, most people [with type 2 diabetes] do not have good glucose control,” said Davis. “We still have great challenges. We’ve got to understand what’s going on in the body so we can intervene appropriately.”
Although monitoring daily blood glucose is an integral part of diabetes management, it’s not the whole story. A more important number today is glycated hemoglobin, or hemoglobin A1C (commonly abbreviated “A1C”). Blood A1C levels represent average glucose levels during the past two to three months. Combined with vigilant daily glucose monitoring, periodic A1C testing offers “a window into the metabolism,” said Richard S. Beaser, MD, a senior physician at the renowned Joslin Diabetes Center in Boston.
The American Diabetes Association recommends that people with diabetes aim for an A1C of less than 7%, while the American College of Endocrinology suggests an even tighter goal of 6.5%. (People without diabetes usually have an A1C level between 4% and 6%.) But getting people to that point isn’t easy, as demonstrated by the statistic that some 57% of people diagnosed with type 2 diabetes still have an A1C level of more than 7%.
A Host of Complications
That could be exposing them to a host of complications. People with type 2 diabetes may have increased blood clotting, high cholesterol and hypertension. If not adequately controlled, diabetes can cause retinopathy (degeneration of the blood vessels in the eye, leading to blindness), abnormal electrocardiogram readings, kidney disease (leading to the need for dialysis and sometimes kidney transplantation), nerve damage, coronary artery disease (which can result in a heart attack), peripheral vascular disease (resulting in leg and foot ulcers and even amputation in some patients), and stroke.
Even modest improvements in A1C can dramatically reduce the risk of diabetes complications. The United Kingdom Prospective Diabetes Study reported that every 1% decrease in A1C lowered the incidence of microvascular complications by 35%, diabetes-related mortality by 25%, myocardial infarction incidence and mortality by 18%, and total mortality by 7%.
Patients can achieve optimal A1C levels by monitoring blood glucose levels several times a day, as directed by their doctors. This can be done using traditional finger-prick techniques, or newer digital blood glucose testers that enable the patient to draw blood from a less sensitive area, such as the arm, and store the information in the testing unit. Patients should share the results with their healthcare providers as well.
The payoff of such self-monitoring has been clinically proven: Beaser noted a study showing that 70% of people who tracked their blood glucose regularly achieved an A1C level below 8%, compared to only 18% of those who tested irregularly. “So clearly there’s a relationship between frequency of monitoring and results,” he contended.
A significant problem, noted Beaser, is that diagnosis happens too late. He explained that 18% of people with type 2 diabetes already have retinopathy at the time of diagnosis, a disorder that may have begun up to five years before.
“Missing the Boat”
“We’re really missing the boat in terms of diagnosis,” he emphasized. “We need to diagnose diabetes earlier, before it does its damage, and perhaps even diagnose insulin resistance before it causes diabetes.”
He encouraged doctors to screen all adults over age 45 for diabetes every 3 years, and to screen those at increased risk earlier or more frequently. Risk is greater among people with a family history of diabetes, the obese (those who are more than 20% above ideal body weight), those from certain ethnic groups (including Native Americans, Hispanics, and African-Americans), those with high blood pressure or cholesterol, and women who have had gestational diabetes or delivered a baby greater than 9 pounds.
Once type 2 diabetes is diagnosed, Beaser encouraged combination therapy, when necessary, to lower A1C levels. Different oral diabetes medications work through different mechanisms: Some increase insulin secretion by beta cells, others increase the body’s sensitivity to insulin, and a third group slows the breakdown and absorption of starches and sugars. As a result, many patients may need more than one drug to control their blood glucose. “These medications, used alone or in combination, can lead to important improvements in glucose control,” he asserted. Medication in combination with lifestyle changes would be optimal, but Beaser noted that it can take years for many patients to adopt healthier practices – years that may lead to potentially lethal complications.
“Our challenge is to allow people to have a lifestyle that is as normal as possible,” he concluded. “With the tools we have today, we can do that better than ever before.”
“This is Not Your Grandmother’s Insulin”
Despite oral diabetes drugs and lifestyle changes, blood glucose remains uncontrolled in many patients with type 2 diabetes. For these patients, insulin injections may be the answer. But insulin isn’t what it used to be: Today some patients can get by with a single dose of long-acting insulin each night, using a fine-gauge needle that causes little discomfort. “This is not your grandmother’s insulin,” emphasized Scott Nelson.
Some 25% of the patients in Nelson’s Mississippi practice have diabetes, and many of them have been helped by insulin therapy. Until recently, insulin for type 2 diabetes has had a bad rap among doctors, many of whom saw it as a last resort and an indication of treatment failure. But today’s long-acting insulins not only control blood glucose and match normal insulin secretion patterns, but also are easier for patients to take regularly.
Typically, patients with type 2 diabetes begin receiving insulin therapy some 10 to 15 years after their diagnosis, when diabetes complications may have already started. Nelson recommended insulin therapy earlier in the course of the disease, “before the proverbial train has run down the mountain and crashed into the village.” Recent studies have shown that early intervention with insulin therapy may not only control blood glucose in type 2 diabetes, but also may prevent or delay the progressive loss of beta cell function caused by the disease.
A Team Approach
Nelson also supported a team approach to controlling diabetes. The patient must monitor his or her blood glucose several times a day, take any medications as prescribed, and see a healthcare provider regularly. But doctors also need to step up to the plate, ensuring that their patients get the education they need and that those without diabetes are screened periodically to find the disease in its earliest stages. “If we put the team structure in place, there’s a lot that can be done,” stressed Nelson. In Mississippi, such an approach has resulted in programs that help impoverished patients obtain access to care they may not have otherwise been able to receive.
Alan M. Jacobson, MD, senior vice president of the Joslin Diabetes Center, underscored the importance of positive messages to encourage people to take charge of their health. “Changes in care over the last 25 years have changed the course of diabetes in some important ways,” he stated. “The challenge is to get this message out to the broadest audience.”
Patients need to know that better blood glucose control can pay off for them, and that such control needs to start early in the course of the disease. Many patients are fearful of starting the journey to such goals because they fear failure. Jacobson encouraged doctors to help patients separate their goals into “achievable bits,” rather than emphasizing the end result all at once. It’s easier to think of reducing A1C by 1% at a time, for example, rather than immediately going for a 3-4% decrease.
Overcoming Patient Fears
Stephen Brunton, MD, of Stamford Hospital/Columbia University Family Practice Residency Program in Stamford, Connecticut, agreed that there’s a need to overcome patients’ fears. “This disease is so fraught with misconceptions,” he said. “People may not only not want to discuss it, but they may not see their physicians when they need to.” He encouraged the development of programs that teach patients both how to control their glucose and how to maintain their quality of life.
Vital to those programs are resources that healthcare providers need to educate their patients effectively. Continuing medical education courses for doctors and simple tools for patients (such as flip charts, booklets, and videos) could facilitate the process. “Our goal as clinicians is to access patients who have less access to care, and to provide tools they may not have,” Brunton concluded. “This disease, like no other, needs to be managed by a team. As a team, we can get a handle on this epidemic.”
Physicians, epidemiologists, public health practitioners, and other experts came together to discuss the emerging threat of SARS and how it can be dealt with.
It seemed to appear out of nowhere: a virulent foe that bore a striking resemblance to other pathogens in its class – and yet dealt a quick and lethal blow to many it infected – reared its menacing head in February.
It was only a matter of weeks before the mysterious new illness, called SARS (a pithy nickname for Severe Acute Respiratory Syndrome, a constellation of symptoms that could not be attributed to any known infection), was linked to a coronavirus, a member of the same family of viruses that causes the common cold. In a very short period of time – by biomedical research standards – the scientific and medical community had identified the enemy, deciphered its genetic code, and made swift and effective strides in controlling its spread.
Yet for all this success, SARS still presents more questions than answers, as demonstrated at a May 17, 2003, conference called SARS in the Context of Emerging Infectious Threats. The conference was presented by The New York Academy of Sciences (the Academy). As of that date, SARS – characterized by a high fever, dry cough, difficulty breathing, and sometimes diarrhea – had infected 7,761 people worldwide and claimed 623 of those lives.
Answering Questions
Where did it come from? Did it start in an infected animal and mutate to infect humans? Why do some people succumb to its grip while others survive? Why does it claim more victims in China than in other countries? What is the natural course of the disease? Are patients who recover still able to spread the infection to others? Can we create targeted therapies that throw a wrench in the viral replication process? Can we develop a vaccine to prevent SARS infection and, if so, what is the best approach?
These were among the questions tackled by 15 scientists, physicians, public health officials and pharmaceutical representatives who made presentations at the Academy’s meeting, which was assembled in just three weeks and was one of the first gatherings to take place so early in the course of this medical story.
The meeting was sponsored by the Academy in partnership with Columbia University’s Mailman School of Public Health and the National Institute of Allergy and Infectious Diseases (NIAID). It was organized by two Columbia faculty members: Scott M. Hammer, MD, chief of the Division of Infectious Diseases and the Harold C. Neu Professor of Medicine, and W. Ian Lipkin, MD, director of the Center for Immunopathogenesis and Infectious Disease, professor of Epidemiology and Neurology, and Special Advisor to China for Scientific Research and International Cooperation in the Fight Against SARS.
Due to respiratory symptoms that began following a trip to Beijing, Lipkin participated by phone from his home, where he was in quarantine through May 25. He is now asymptomatic and is not believed to have had SARS. Generous support for the meeting was provided by Pfizer Inc, Bristol-Myers Squibb Company, Merck Research Laboratories, and Novartis.
Where Did It Come From?
Coronaviruses owe their crown-like appearance to a multitude of spike (S) proteins studding their surfaces, explained Paul S. Masters, PhD, an investigator and professor of molecular genetics at the Wadsworth Center of the New York State Department of Health. These S proteins take on the task of fusing the virus to a victim’s cells, enabling the pathogen to set up shop in the cell. The coronavirus’ other three proteins – the membrane (M) protein, envelope (E) protein, and nucleocapsid (N) protein – then go to work, essentially turning the host cell into a factory that manufactures and exports newly formed coronaviruses that can attack other cells.
Coronaviruses are highly species-specific, noted Kathryn V. Holmes, PhD, a molecular biologist at the University of Colorado Health Sciences Center in Denver. They cause a variety of respiratory, gastrointestinal and neurologic infections in animals and humans. But because host cell receptors differ between species, a coronavirus that causes a respiratory infection in a pig, for example, has no effect on humans or chickens…unless the virus mutates. Such mutation might explain the origin of the SARS virus, which researchers speculate may have come from an animal in south China, where the first SARS cases materialized.
“Many of these viruses have probably been with their hosts for a long time,” Holmes said. “But how much change does there have to be for a virus to jump to a different host?” Holmes studies the mouse hepatitis virus, a coronavirus that may shed light on the behavior of the SARS virus. She outlined several potential targets for treating the SARS coronavirus, including those that interfere with its replication machinery as well as vaccines. “If we can develop these therapies, they will be applicable not only to SARS, but also to a large number of diseases in animals,” she concluded.
Pigs and Cows
Two species that could be especially helped by such treatments are pigs and cows. Linda J. Saif, PhD, a professor and researcher with The Ohio State University’s Agricultural Research and Development Center, described coronaviruses that cause severe and often fatal respiratory and gastrointestinal infections in these animals. Studies have shown that these infections may be exacerbated when the virus is administered via aerosol, at high doses, with immunosuppressive drugs, or in the presence of other viral or bacterial infections – data that may yield clues about who is most vulnerable to SARS infection.
Moreover, cows that commingle with cattle from different farms or that have experienced stress during shipping (causing “shipping fever”) are more susceptible to coronaviral infections. “We see something similar to this in SARS patients who recently experienced the stress of travel,” noted Saif. She described various vaccines that have been developed for these infections in animals. Some are effective, but most offer limited protection.
Thanks to a mix of classic and modern techniques, scientists are refining methods of detecting the SARS virus, explained Thomas G. Ksiazek, DVM, PhD, acting chief of the Special Pathogens Branch in the Division of Viral and Rickettsial Diseases at the Centers for Disease Control and Prevention (CDC). He chronicled the efforts of medical detectives to isolate and characterize the virus – initially using immunohistochemical staining, and later confirming its identity and genome with RT-PCR sequencing and array technology.
Indirect fluorescent antibody testing and ELISA have been employed to garner more information. “The sequencing of the virus’ genome so rapidly is a good use of modern technology, and will make diagnosis of the infection and therapy with vaccines possible in the future,” concluded Ksiazek.
A Disease of Tribes
Catherine Laughlin of the National Institute of Allergy and Infectious Diseases discusses the status of drug screening; Larry Anderson of the U.S. Centers for Disease Control and Prevention; Donald E. Low discusses the outbreak in Toronto. Photo by Michael Gaffney.
SARS first appeared in people last fall, when sporadic cases began to emerge in the southern Chinese province of Guangdong. But the seminal event triggering the current epidemic took place on February 21, when a doctor from Guangdong stayed on the ninth floor of Hong Kong’s Metropole Hotel. Ten other people contracted his infection, taking it with them as they continued to travel. Within weeks, the illness popped up in other nations, including Vietnam, where it took the lives of healthcare workers such as Carlo Urbani, the World Health Organization doctor who first identified the outbreak.
Spreads in Hospitals
The virus most frequently has been spread in hospital settings, indicating that community transmission is less likely, said Larry Anderson, MD, chief of the CDC’s Respiratory and Enteric Viruses Branch. “There is a likelihood that with good infection control practices, we can control the spread of SARS,” he contended. Indeed, the infection already has been brought under control in Canada, Singapore, Thailand, Vietnam, and the United States. “The good news is that the SARS outbreak has been controlled in some settings,” he said, adding: “We still have a great deal to learn.”
Who, for example, is most likely to develop severe, if not fatal, SARS? Evidence to date indicates that elderly patients and those with diabetes or certain other co-existing chronic medical conditions are more likely to succumb, but more data are needed to confirm and explain these associations. Investigators also want to know more about the optimal time during the illness to collect specimens such as urine, respiratory secretions, and stool samples so they can correlate their findings with disease progression.
Cases in Canada
In Toronto, investigators are analyzing the blood of 100 healthcare workers who were exposed to SARS and 100 others who were not, to see if they can pinpoint any indicators of early SARS infection. Donald E. Low, MD, chief microbiologist at Toronto’s Mount Sinai Hospital, discussed the SARS outbreak that, as of May 17, had taken 23 lives and caused economic hardship for the city.
The city’s saga began when a woman returned to Toronto after visiting Hong Kong. She developed SARS symptoms and died on March 5, but not before infecting her husband and her son. They, in turn, infected two other men in nearby hospital beds, setting off a chain of infection that included relatives, members of a church group that had been visiting the hospital emergency room, and patients at other Toronto hospitals.
As of mid-May, however, Low asserted, “It is safe to come to Toronto.” He noted the valuable lessons learned from Toronto’s SARS experience. One is that strict infection control is a must. Second, the disease is more often spread in hospitals – via droplets and contact – than via casual contact in the community. And finally, Low called SARS “a disease of tribes,” be they family members, hospital workers, or close communities such as religious groups.
Recent reports from China indicate that SARS may be abating there as well. Scott Hammer delivered a presentation prepared by Chen Zhu, Sc.D., vice president of the Chinese Academy of Sciences, who could not be at the meeting. According to his presentation, China has established a central command and 10 task forces, and is evaluating potential treatments (including the serum of convalescent patients), building international collaborations and establishing research centers to study the virus.
Turning Challenge into Opportunity
“The situation in China illustrates the awakening and the multidimensional approach that China is taking to control SARS,” said Hammer. “It’s not just a public health event, but a major political and economic event for China.”
Despite more than two decades of research, the clever HIV pathogen has continued to elude us. But some good may yet come out of all those years of study: The coronavirus that causes SARS appears to fuse to host cells in much the same way as HIV. Harnessing this knowledge, David Ho, MD, scientific director of the Aaron Diamond AIDS Research Center at The Rockefeller University – who has scrutinized HIV for 22 years – and his team have designed a peptide that may inhibit this fusion.
Preliminary studies in Hong Kong are producing promising results in tissue culture. Ho speculates that this peptide would have little toxicity in clinical applications. “There are still many obstacles in the way, but this is an example of what one can do in a very short time,” he concluded.
Since so little is known about the virus’ behavior, some doctors have been treating SARS patients with ribavirin and steroids such as dexamethasone, a treatment approach that has not been effective. In fact, corticosteroids may actually delay viral clearance in patients with viral respiratory infections, explained Frederick G. Hayden, MD, professor of internal medicine and pathology at the University of Virginia School of Medicine.
Proceeding Cautiously
“One has to be very cautious about the effects of corticosteroids on viral replication, particularly in the absence of antiviral drugs,” Hayden asserted. Antiviral agents that appear intriguing for use in SARS patients include oseltamivir, zanamivir, and interferon. “We need a better understanding of the natural history of the infection, including mechanisms of injury and host immunopathologic responses,” he added. “Controlled clinical trials are going to be essential to understand what really works in this illness.”
Drugs with the potential for treating SARS will go through an intensive screening process jointly coordinated by the CDC, the U.S. Army Medical Research Institute of Infectious Diseases, and the NIAID. “There are many steps in the viral life cycle where fusion inhibitors might play a role,” said Catherine Laughlin, PhD, chief of the Virology Branch of the Division of Microbiology and Infectious Diseases at the NIAID. Other potential drug targets include cysteine protease, RNA-dependent RNA polymerase, helicase, genome replication and transcription, and the N protein. Laughlin hypothesized that the most effective treatment will probably be a combination of an antiviral agent and another drug that interferes with the viral replication process.
The Race is On
From left: David Ho, Frederick G. Hayden, Catherine Laughlin, C. Richter King, Thomas Monath, Richard Colonno (Bristol-Myers Squibb), Michael Dunne (Pfizer), and Emilio Emini (Merck). Photo by Michael Gaffney.
The race is on among pharmaceutical companies setting out to make a name for themselves in the SARS arena. GenVec, Inc., a Maryland-based biopharmaceutical company, is developing a vaccine against SARS using its adenovector technology, in collaboration with the Vaccine Research Center of the NIAID and the U.S. Navy Medical Research Center. C. Richter King, PhD, vice president of research at GenVec, explained that the “AdVaccine” is based on an adenovirus modified to carry a therapeutic gene capable of triggering an immune response. Moreover, King noted that the highly targeted vaccine is safe, well tolerated, and easily manufactured.
Acambis, a pharmaceutical company specializing in vaccine development, has begun its own investigations into a vaccine for SARS, hoping to build on the success it has had creating vaccines against smallpox and travel-related diseases. Thomas Monath, MD, chief scientific officer at Acambis, noted the scarcity of effective vaccines against coronavirus infections in animals, and highlighted the need for a suitable animal model of SARS. (So far, macaque monkeys have been the only animals offering promise in this regard.)
He cautioned that developing an effective vaccine could take at least five to six years, cost some $60–100 million, and require the collaboration of academic and industrial scientists. “We need to understand the natural history of this disease and develop appropriate animal models, and that will allow us to develop rational approaches,” he advised.
The session concluded with a panel discussion that also included representatives from Bristol-Myers Squibb Company, Pfizer Inc, and Merck Research Laboratories.
An Emerging Threat
Ebola. West Nile. And now SARS. “Every year or two we see a new virus, an old virus that wasn’t supposed to be here, or an old virus doing something new,” noted C.J. Peters, MD, professor of microbiology, immunology, and pathology at the University of Texas Medical Branch in Galveston. He described how viruses travel with their hosts, which carry them to areas they may not have been able to reach on their own. “Viruses can’t just pick up and go – they are ecologically constrained. The age of exploration started mixing viruses. But today we don’t have to wait for Columbus – there’s the airplane.”
Peters explained how the genetic variability of viruses, multiple ecologic niches, urbanization and global travel have combined to create evolutionary opportunities for viruses. He argued that we must learn to understand social, cultural and economic differences among populations in order to control viruses effectively. “We have to find a way to get ahead of this. If SARS gets into certain areas of the world, we will not eradicate it,” he contended.
An Effective Public Health Response
When SARS does strike, especially in a major urban area, an effective public health response is critical for controlling its spread, said Marcelle Layton, MD, assistant commissioner for communicable diseases for the New York City Department of Health. Such a response includes prompt detection of the outbreak, notification of key partners (including the medical community and law enforcement agencies), epidemiologic surveillance, medical and public health interventions (such as mass treatment or mass prophylaxis), and – most importantly – accurate and ongoing public communication.
“Effective communication underlies every aspect of a successful response,” emphasized Layton. “If you don’t communicate well, and if you lose trust with misinformation, it’s extremely hard to regain.”
“We really are facing an important problem,” added John La Montagne, PhD, deputy director of the NIAID. “SARS is an unpredictable and serious disease with dramatic impacts. It could have happened here (in this country) – we’re very lucky.” La Montagne supported continued collaboration both nationally and internationally, and credited the toils of SARS investigators. “It is an unbelievable testimony to the effectiveness of our public health institutions – not just nationally, but globally – that so much work and so much progress have been achieved in such a short period of time.”
Feeling stressed out? Anxious? Frustrated and angry? Looking for a way out?
Significant advances in the neurosciences are revealing that stress is actually the product of a complex interplay of internal and external factors, and that some relatively simple lifestyle changes can contribute to a sense of well-being and improve health.
“A healthy lifestyle is the best way to reduce stress,” according to Bruce S. McEwen, head of the Harold and Margaret Milliken Hatch Laboratory of Neuroendocrinology at the Rockefeller University in New York and co-author of the recently published The End of Stress As We Know It (Joseph Henry Press).
The notion that stress is the result of external pressures is incomplete, said McEwen, who summarized his book during a March 18 lecture at The New York Academy of Sciences (the Academy). Research now reveals how the body’s defense mechanisms are involved in keeping stress at bay, as well as how the body’s defense system breaks down from time to time.
When the body is working properly, a process known as “allostasis” helps individuals adapt to and survive the real or imagined threats that confront them in the course of everyday life. McEwen explained that allostasis is maintained by a complex network in the body that includes hormones, the autonomic nervous system, neurotransmitters in the brain, and chemicals in the immune system.
“When this network is working efficiently, its activity helps to mobilize energy reserves, promote efficient cardiovascular function, enhance memory of important events and enhance the immune defense towards pathogens,” McEwen said. Normally, the body is able to self-regulate the proper responses to external pressures, but occasionally it reaches a limit known as “allostatic overload.”
Bruce S. McEwen
External Stress Factors
Many external pressures can contribute to allostatic overload, according to McEwen, such as conflicts at work or home, fears about war and terrorism, overworking, lack of sleep, economic difficulties, lack of exercise, excessive drinking and bad eating habits. Genetic risk factors, such as a predisposition for cardiovascular disease or diabetes, can also contribute to allostatic overload.
“If the imbalances in the body’s regulatory network persist over long periods of time, the result can lead to disease,” McEwen said. “Hardening of the arteries, arthritis, diabetes, obesity, depressive illness and certain types of memory loss are among the disorders that are accelerated by allostatic overload,” he added. He cited research indicating that long-term stress also affects the amygdala and the hippocampus, the regions of the brain that regulate fear, emotions and memory.
According to McEwen, “genes, early development, and life experiences all contribute to determining how the brain responds to environmental stresses.” Research has revealed that external factors in society also can influence health and disease commonly related to stress.
“In industrialized societies, allostatic overload occurs with increasing frequency at lower levels of education and income,” McEwen noted. He pointed out that mortality rates and levels of diseases associated with allostatic load are much higher among people of lower socioeconomic status. “A combination of lifestyle, perceptions of inequality and stressful life experiences appear to play a role,” he said.
Best Antidote: Healthy Lifestyle
What can be done to reduce allostatic load and the stress associated with it? Changes in lifestyle are the best remedy, according to McEwen. “Maintaining social ties with friends and family is one of the most important factors in reducing stress,” he said. “In addition, restorative sleep, and regular, moderate exercise are all important,” he added. “Regular, moderate exercise not only increases muscle utilization of energy, but also enhances formation of new nerve cells in areas of the brain that support memory.”
McEwen said that, in addition to individual responses to counteract allostatic overload and reduce stress, the private sector and policy makers also can contribute to well-being. “Government policies that recognize the impact of inequality, promote comprehensive health care and reduce smoking, and provide housing and community services are also very important,” he said.
Stress reduction is not only critical for individuals, he added, but for the health and welfare of the wider society as well. “By 2020, depression will be the second-leading cause of disease in this country,” he concluded.
One of the dichotomies between basic and clinical research into childhood mental illness has been the nomenclature of classification. Psychiatrists have historically used “categories” to classify neurological disorders; psychologists have turned to “dimensions.”
Thus, the Roots of Mental Illness in Children and Adolescents conference organizers set out to find a keynote speaker who could bridge this sometimes “cavernous gap,” said Doreen S. Koretz, chief of the Developmental Psychopathology and Prevention Research Branch, Division of Mental Disorders, Behavioral Research and AIDS, at the National Institute of Mental Health, Bethesda. The conference was supported by The New York Academy of Sciences (the Academy).
They turned to Sir Michael Rutter, MD, F.R.S., professor of Developmental Psychopathology at the Social, Genetic and Developmental Psychiatry Research Centre, Institute of Psychiatry, in London, and a leading expert in child psychiatric research.
By his own admission, Sir Michael took a rather “British approach” and, one by one, challenged “meta-theoretical claims” behind the two approaches. “The battle, as it has sometimes been, between dimensional and categorical approaches is rather futile,” he admitted. Ultimately, both are necessary.
Among the claims he challenged to reach that conclusion were:
Sir Michael Rutter, MD, F.R.S.
Dimensional analyses have greater statistical power. But, says Sir Michael, odds ratios can sometimes be preferable. A recent study using Canadian data, for example, found no difference in physical aggression between children ages two and three receiving maternal care and those in group day care — except when the children came from families at high psychosocial risk. “Where there was high family risk, the rates of aggression were substantially higher among those receiving family home care,” he said.
Another claim holds that the most important environmental influences are outside the family and that only extreme environments have any effects of functional importance. “Both are demonstrably false,” said Sir Michael. A French study has shown that children removed from their parents because of abuse or neglect and then adopted between the ages of four and six-and-a-half have a rise in IQ at adolescence, the degree of which is systematically related to the socio-educational level of the adoptive homes. “These are differences within the relatively narrow range of adoptive homes,” he explained.
A further wrong assumption is that causal inferences can be partitioned into those that are genetically or environmentally mediated, with their summation amounting to 100 percent of effects. One example of the shortcoming of this claim is the role that people themselves play in selecting and shaping their environment. A longitudinal study of girls at age 10 with anti-social behavior found that, in the absence of marital support, there is a high degree of persistence 18 years later. “But, given marital support, there is a huge improvement in social functioning,” said Sir Michael.
“There’s an American saying, which says something like, ‘It ain’t ignorance that does the harm, it’s knowing too many things that ain’t true,’” Sir Michael said. “I’m a great believer in that.”
Scientists and clinicians are pursuing the root causes of mental health struggles specific to young people to develop effective behavioral interventions.
Early childhood experiences appear to shape brain function in ways that confer either vulnerability or resilience to mental illness in children and adolescents. As scientists unravel the genetics and physiological mechanisms behind these changes in animals, clinicians are eager to translate their findings into new behavioral interventions. A recent Academy conference, to be the subject of a future Annals volume, explored how they can come together to do this.
Neonatal rats are programmed to form bonds with their mothers. During the first nine days of life, they develop a preference for their mother’s odor regardless of the quality of care that they receive, according to research done by Regina M. Sullivan, PhD, professor of Zoology at the University of Oklahoma.
She has shown that this learning, which enables nipple attachment and other activities necessary for survival, occurs in an experimental model whether the pups receive a “reward” of milk, a stroke or a shock after being exposed to an odor. And, given that about 12 of every 1,000 children are abused or neglected, she expects the same attachment pattern to be true in humans. “My working hypothesis is that the human child’s brain is wired to form an attachment whether the parent is being kind or not,” she said.
Sullivan’s data, though based on a rat model, could have other implications for research into the mental health of human children. There are physiological parallels between learning in human and rat infants. Thus the neural circuitry responsible for this attachment process, which involves the locus coeruleus in neonatal rats, might be shared.
Surprisingly Little Translational Research
While much of today’s medical knowledge is based on animal studies, there has been surprisingly little translational research in this area.
The March 2003 conference – Roots of Mental Illness in Children and Adolescents – at which Sullivan presented her findings was a first step toward filling this gap. The conference was organized by The New York Academy of Sciences (the Academy). It brought students of developmental neurobiology, developmental psychobiology and developmental psychopathology to New York City for two-and-a-half days of focused presentations and discussions.
According to Israel I. Lederhendler, PhD, director of the Behavioral and Systems Neuroscience Research Program at the National Institute of Mental Health and one of the conference organizers, the goal was “a little bit experimental and lots of fun.” Judging from the dialogue that resulted, it appears to have succeeded. The conference “strengthened the sense that there are important linkages that need to be explored and that the science is at the point where translational research is likely to lead to important breakthroughs,” said Megan R. Gunnar, PhD, Distinguished McKnight University Professor at the Institute of Child Development, University of Minnesota.
Attachment Disorders
Secure attachment between infants and their caregivers is known to be a protective factor and insecure attachment a risk factor for subsequent psychopathology. Charles H. Zeanah, Jr., MD, professor of Psychiatry and Pediatrics at Tulane University Health Sciences Center, has been studying one form of insecure attachment, called disorganized attachment, at an orphanage in Bucharest. “There’s a point at which attachment itself is reflective of a psychopathology, reactive attachment disorder, that’s defined in early childhood,” he said.
Zeanah has found the two patterns of this disorder, characterized either by emotional withdrawal or indiscriminate social extroversion, to be “readily identifiable” in these institutionalized children. Interestingly, however, a pilot project that aimed to reduce the number of caregivers for children in one unit of the institution dramatically reduced their signs of emotionally withdrawn reactive attachment disorder. “Secure attachment relationships appear to be protective in the context of high-risk environments,” he concluded.
Maternal separation in rats, where mothers are taken away from their offspring for three hours a day early in the first neonatal week, has been found to be associated with a transient decrease in hippocampal neurogenesis, reductions in hippocampal dendritic branching and a reduction in synaptic density. “This one fairly modest manipulation during the early part of these rats’ lives has profound long-term effects,” said Paul M. Plotsky, PhD, GlaxoSmithKline Professor of Psychiatry and Behavioral Sciences at Emory University School of Medicine. “There are a whole host of changes in the neurochemistry, behavioral profiles and morphology of these animals.”
Genetic Vulnerability
Nonetheless, only 33 to 45 percent of the animals in Plotsky’s study exhibit these effects in response to early maternal separation experiences. This suggests that some rats are genetically vulnerable to maternal separation and others are not. Studies of prairie voles by Thomas R. Insel, MD, director of the NIMH, underscore the importance of social experience in forming attachments.
Adult prairie voles form life-long partnerships, but only after mating triggers a release of dopamine in the nucleus accumbens. A D2 receptor agonist, which has no effect on mating behavior, eliminates partner preference in these animals. “The key then is that social attachment requires that social stimuli become linked to this major information stream in the forebrain,” said Insel.
Social behavior and physiology also appear to be linked in humans. Eye contact, for example, tenses middle ear muscles that enable the human voice to be distinguished from background noise. “If we extract from some of the animal and human work, we start realizing that some social behaviors are not learned behaviors but appear to be emergent properties of specific physiological states,” explained Stephen W. Porges, PhD, professor of Psychiatry at the University of Illinois at Chicago.
False Perceptions
Adult rhesus monkeys whose amygdalae have been lesioned lose innate fear, such as that of snakes, but are able to function in social environments. However, when these monkeys are lesioned early in development, they show increased fear in social settings. There are very few people with amygdala lesions, but these findings are consistent with the behavior of one such patient, a woman known as “S.M.,” in her 30s. Though she appears to engage in normal social interactions, she is unable to detect fear in faces.
This interplay between fear and social motivation was further investigated in a recent study at the University of California, Davis. David G. Amaral, PhD, a professor of Psychiatry, found that amygdala-lesioned animals failed to demonstrate preference for their mothers over another female in a novel environment. Though this might appear to suggest failed maternal attachment, these monkeys didn’t seek out the comfort of their mothers because they were unable to perceive the novel environment as dangerous. “The amygdala may play some role in issues of impairment, such as in social anxiety,” Amaral concluded.
Bradley S. Peterson, MD, Suzanne Crosby Murphy Associate Professor in Pediatric Neuropsychiatry at Columbia University, is studying premature infants as a means to understand how disturbances in normal brain development might contribute to mental illnesses in children. A functional magnetic resonance imaging study of language processing found that they, unlike their full-term counterparts, tend to use phonological circuitry to process semantic material in their environment. “Preterm children may tend to hear semantic material, meaningful sounds and speech utterances, as meaningless junk,” he said.
Responding to Social Cues
Physically abused children studied by Seth D. Pollak, PhD, assistant professor in the Departments of Psychology and Psychiatry at the University of Wisconsin at Madison, also failed to respond appropriately to social cues in their environment. In this instance, however, the children were extremely sensitive to facial expressions of anger. They identified this emotion early in its formation and also detected it with little perceptual information from scrambled facial images.
“Their categories for anger are more inclusive,” explained Pollak. “A face that has maybe 30 percent anger in it but 70 percent sadness or fear is being interpreted as being an angry face. This really changes how children are interpreting social information that they’re receiving from the world.”
Though this heightened perception of anger might be beneficial in their home environments, it will likely cause them to misread cues in other social settings. “If we want to understand human development, especially with the goal of understanding the development of psychopathology, we need to bring together not only an understanding of the neuroscience of what is in children’s heads, but an understanding of what (environment) children’s heads are in,” Pollak concluded.