The Need to Accelerate Therapeutic Development: Must Randomized Controlled Trials Give Way?
The enterprise of drug development is a crucial lifeline for patients and their families. Those who need new and better treatment options depend on researchers to deliver safe and effective therapies as quickly as possible, yet experimental drugs must first be tested on human volunteers before they can be approved for widespread use. Since the mid-twentieth century, the randomized controlled trial (RCT) has been considered the gold standard in research design because of its ability to overcome bias and yield high-quality evidence. But it comes at a steep cost: the average new drug requires six to eight years of human testing and $100 million to fund the clinical trial phase alone. Moreover, conducting an RCT is not always feasible or ethical, such as during a pandemic or in the case of a very rare disease. In such cases, alternative trial designs may produce faster and cheaper results, but regulators, patients and insurers agree that they must not compromise appropriate standards of safety and efficacy. While more rapid development is critical to saving lives, difficult questions remain about how to strike this delicate balance.
On June 21–22, 2017, the Academy convened a colloquium at which academic and pharmaceutical researchers, federal regulators, bioethicists, executives, patient advocates and lawyers met to discuss the relevance of the randomized controlled trial as the default model for human subject research. With the success of emerging interventions like genomic therapy and immunotherapy, a cultural conversation has opened up around how clinical trials should be designed in this new era, who may participate in research and when promising therapies should reach the market. Formulating answers to these urgent questions could benefit millions of patients and reshape the future of medicine.
The opening panel set the stage by examining the role RCTs have played in the history of medical research. Susan Lederer described how clinical trials first came to be. In 1747, James Lind, a ship’s surgeon in the British navy faced with a rash of scurvy cases, made a bid to stop the outbreak: he divided twelve sailors into six pairs and assigned each pair a different treatment. The pairs tried sea water, sulfuric acid, vinegar, cider, a tamarind paste and oranges and lemons. When that last treatment proved effective, Lind realized he had hit upon a cure.
But formally randomizing participants into a control arm and a treatment arm didn’t gain traction until the mid-twentieth century, when World War II prompted a massive influx of federal dollars for research and the pharmaceutical industry began to transform American medicine. In the early 1960s, after many pregnant women took the drug thalidomide, which caused fetal deaths and birth defects, Congress passed legislation requiring “adequate and well-controlled” studies that demonstrated efficacy as well as safety before drugs could be approved.
Speakers also examined the need for alternatives to RCTs, the risks associated with both RCTs and their alternatives, ethics and patient advocacy in clinical trial design, and modern trends in clinical drug development and clinical trial innovation, among many other topics.
Full eBriefing by Kira Peikoff: www.nyas.org/RCT2017-eB
Neuroplasticity, Neuroregeneration, and Brain Repair
Many promising strategies for promoting neuroregeneration have emerged in the past few years, but a further research push is needed for these ideas to be translated into therapies for neurodegenerative diseases. On June 13–14, 2017, a symposium presented by Eli Lilly and Company and the New York Academy of Sciences brought together academic and industry researchers working on multiple neurodegenerative diseases, as well as clinicians and government stakeholders, to discuss cutting-edge basic and clinical research on neuroregeneration and neurorestoration. Topics included neuronal plasticity, inflammation, glial cell function, autophagy and mitochondrial function, as well as analysis of recent drug development failures and how to move forward from them.
Full eBriefing by Alla Katsnelson: www.nyas.org/Neuroregen17-eB
Long-Acting HIV Prevention Methods
The development of antiretroviral drugs for HIV treatment, and more recently for prevention, has dramatically reduced disease burden for millions of people with access to adequate treatment and prevention programs. However, UNAIDS reports that 25 million people in Sub-Saharan Africa are still living with HIV, accounting for more than two-thirds of the infected population worldwide, and that young women (aged 15–24) in this region are disproportionately impacted. Importantly, the number of young women entering or within this age range is currently very high, placing us at a critical juncture. Without adequate intervention and a reduction in the rate of new infections, the most likely outcome is a resurgence of the HIV epidemic.
To prevent this devastating outcome, the scientific community must develop new and effective strategies to reduce HIV transmission. Taking into consideration the economic, social and cultural barriers to current HIV prevention strategies, approaches such as long-acting therapies are a promising path forward; however, key questions remain before this approach can be put to use in community settings. On September 22, 2017, the Bill & Melinda Gates Foundation and the Academy’s Microbiology and Infectious Diseases Discussion Group gathered scientists, policy makers and community leaders to address current scientific barriers to the development of long-acting prevention methods for HIV.
Full eBriefing by Ann Griswold: www.nyas.org/HIV2017-eB