Previous conferences and workshops covering artificial intelligence (AI) for Materials Science have mainly focused on introducing AI into materials simulations, which is only the first step in new materials discovery. These efforts have largely ignored AI’s promise for materials synthesis and translating research into high-volume industrial production.
On October 6–7, 2020, the New York Academy of Sciences hosted the AI for Materials symposium to provide a broader perspective on leveraging the benefits of AI in material simulations, experiments, and development efforts for high-volume production. The symposium brought together materials scientists, industry experts, and AI researchers to cover the application of AI throughout the entire life cycle of new materials, from lab discovery to industrial production. These leaders also shaped future research directions, identified urgent issues in this rising field, and fostered interdisciplinary collaboration opportunities.
In This eBriefing, You’ll Learn
How machine learning is being applied to understand the physical processes behind materials science
Approaches to improve the data infrastructures used in materials science research to facilitate easier integration and promote a better data sharing environment
How AI is being applied to address industry-related issues in materials science, including the scalability of materials production from the lab to the factory and the synthetic and catalytic routes of new materials
Speakers
Muratahan Aykol, PhD Toyota Research Institute
Léon Bottou, PhD Facebook AI Research
Carla Gomes, PhD Cornell University
Philipp Harbach, PhD Merck KGaA
Michael Helander, PhD OTI Lumionics
Phillip M. Maffettone, DPhil Brookhaven National Laboratory
Nobuyuki N. Matsuzawa, PhD Panasonic Corporation
Greg Mulholland Citrine Informatics
Elsa Olivetti MIT
Rampi Ramprasad, PhD Georgia Institute of Technology
Tim Robertson, PhD Schrödinger, Inc.
Sam Samdani, PhD McKinsey & Company
Matthias Scheffler, PhD The Fritz Haber Institute
Rama Vasudevan, PhD Oak Ridge National Laboratory
James Warren, PhD National Institute of Standards and Technology
Léon received a Ph.D. in Computer Science from Université de Paris-Sud. His research career has taken him to AT&T Bell Laboratories, AT&T Labs Research, NEC Labs America, Microsoft, and now Facebook AI Research. The long-term goal of Léon’s research is to understand and replicate human-level intelligence. Because this goal requires conceptual advances that cannot be anticipated, Léon’s research has followed many practical and theoretical turns, including neural networks applications, stochastic gradient learning algorithms, statistical properties of learning systems, computer vision applications with structured outputs, and theory of large-scale learning. Léon’s research aims to clarify the relation between learning and reasoning, with a focus on the many aspects of causation.
Carla Gomes, PhD
Cornell University
Carla is the Ronald C. and Antonia V. Nielsen Professor of Computing and Information Science and the Director of the Institute for Computational Sustainability at Cornell University. She received a Ph.D. from the University of Edinburgh. Her research area is artificial intelligence with a focus on Computational Sustainability. Computational Sustainability aims to develop computational methods to help solve some of the key challenges concerning environmental, economic, and societal issues to help put us on a path towards a sustainable future. Carla is a Fellow of the Association for the Advancement of Artificial Intelligence (AAAI), a Fellow of the Association for Computing Machinery (ACM), and a Fellow of the American Association for the Advancement of Science (AAAS).
Rama Vasudevan, PhD
Oak Ridge National Laboratory
Rama is the Research and Development Associate at the Center for Nanophase Materials Sciences, Oak Ridge National Laboratory. His research focuses on utilizing scanning probe microscopy (SPM) at the mesoscopic and atomic level to unearth structure-property relations in various systems, including ferroics, manganites, and others. In parallel, as vast amounts of imaging and spectroscopic data are gathered, he develops and implements tools from existing computational science literature towards tackling materials science problems and unearthing physics from deep data analysis of SPM-acquired datasets. Rama received his PhD in Materials Science from the University of New South Wales.
Rampi Ramprasad, PhD Georgia Institute of Technology
Matthias Scheffler, PhD The Fritz Haber Institute
Elsa Olivetti MIT
Muratahan Aykol, PhD Toyota Research Institute
Rampi Ramprasad, PhD
Georgia Institute of Technology
Rampi is the Michael E. Tennenbaum Family Chair and Georgia Research Alliance Eminent Scholar in Energy Sustainability at Georgia Tech. His area of expertise is developing and utilizing computational and data-driven (machine learning) methods to design and discover new materials. Materials classes under study include polymers, metals, and ceramics (mainly dielectrics and catalysts), and application areas include energy production and energy storage. Rampi received his B Tech in Metallurgical Engineering at the Indian Institute of Technology, Madras, India, and a PhD in Materials Science & Engineering at the University of Illinois, Urbana-Champaign.
Matthias Scheffler, PhD
The Fritz Haber Institute
Matthias is Director of the NOMAD Laboratory at the Fritz Haber Institute of the Max Planck Society. His research focuses on understanding fundamental aspects of physical and chemical properties of surfaces, interfaces, clusters, nanostructures, and bulk based on electronic-structure theory. In recent years, Matthias developed neural-network and compressed-sensing methods to detect structure and patterns in “big data of materials,” to create “maps of materials properties,” and identify “materials genes” that affect or even actuate materials properties. His “big-data” activities also include creating a FAIR data infrastructure (data are findable and AI-ready) and the largest data store for computational materials science data.
Elsa Olivetti, PhD
Massachusetts Institute of Technology
Elsa is the Esther and Harold E. Edgerton Associate Professor in Materials Science and Engineering at MIT. She received her PhD from the same department in 2007. Elsa’s research focuses on improving the environmental and economic sustainability of materials in the context of rapidly expanding global demand. Her research addresses two major problems where solutions could yield significant environmental benefit: first, improving the sustainability of materials through increased use of recycled and renewable materials, recycling-friendly material design, and intelligent waste disposition; and second, understanding the implications of substitution, dematerialization, and waste mining on materials markets. Her research spans three levels of materials production: operational-level, industrial network-level, and market-level strategies.
Muratahan Aykol, PhD
Toyota Research Institute
Muratahan is a Senior Research Scientist in Accelerated Materials Design and Discovery at the Toyota Research Institute. Before that, he was a postdoctoral research fellow at Lawrence Berkeley National Laboratory, working on materials informatics and infrastructure. He received his BS and MS degrees from the Middle East Technical University and a PhD in Materials Science from Northwestern University. His research focuses on machine-learning, material computations, and network science for materials discovery.
Phillip M. Maffettone, DPhil Brookhaven National Laboratory
Nobuyuki N. Matsuzawa, PhD
Panasonic Corporation
Nobu obtained his PhD in computational materials science in 1994 from The University of Tokyo. He started his career at Sony in 1987, developing various organic materials for electronic devices and lithography processes for semiconductor manufacturing. He served as a visiting research scientist at DuPont Central Research and Development in Wilmington, Delaware, and was the Senior Manager of Material Science Laboratories of Sony Europe from 2001-2004. In 2005, Nobu was named a Distinguished Engineer at Sony. Since 2016, he has been working for Panasonic, designing materials used in various electronic devices produced by Panasonic.
Michael Helander, PhD
OTI Lumionics
Michael is co-founder and CEO of OTI Lumionics, an advanced materials company he co-founded while pursuing his PhD at the University of Toronto in 2011. The company commercializes disruptive materials and process technology for OLED displays from headquarters in Toronto and offices in Asia. OLED is the leading display technology used in virtually all high-end consumer electronics and is the next generation of design-driven lighting. Dr. Helander received a BSc in Engineering Science and a PhD in Materials Science & Engineering from the University of Toronto. He has over 100 patents and peer-reviewed publications related to OLED materials, process, equipment, and displays.
Phillip M. Maffettone, DPhil
Brookhaven National Laboratory
Phil is currently a Research Associate in Computational Science at Brookhaven National Laboratory, where he focuses on developing the laboratory of the future using artificial intelligence to combine simulation and autonomous experimentation. During his career, Phil has developed a healthy disregard for disciplinary boundaries by working at the intersection of physical and computational sciences. He earned a BS in Chemical Engineering at the University at Buffalo (2014), researching silicon nanoparticle synthesis and applications. After receiving a Marshall Scholarship, he completed his DPhil in Inorganic Chemistry at the University of Oxford (2018), focused on simulating disorder in diffraction where Bragg’s law breaks down in hard and soft matter. Phil recently returned home to New York from a role at the University of Liverpool, where he developed the AI for an autonomous mobile robotic scientist searching for new photocatalytic materials.
James Warren, PhD National Institute of Standards and Technology
Greg Mulholland Citrine Informatics
Tim Robertson, PhD Schrödinger, Inc.
Sam Samdani, PhD
McKinsey & Company
Sam is a senior industry expert in the Global Chemicals & Agriculture Practice at McKinsey & Company, a global management consulting firm. His responsibilities include providing thought leadership across a range of complex knowledge domains in advanced/engineered materials, pharmaceutical ingredients, and specialty chemicals for the top management of many multinational chemical, pharmaceutical, and petroleum companies as well as government agencies and NGOs worldwide. Before joining McKinsey, Sam worked at McGraw-Hill as an Associate Editor with Chemical Engineering, a monthly technical publication. He received his BS in chemical engineering from Yale University and his PhD in chemical engineering from the University of Rochester.
Philipp Harbach, PhD
Merck KGaA
Philipp is the Head of In Silico Research in the Digital Organization of Merck KGaA. There he focuses on the digitalization of chemical and experimental processes in R&D, production, and analytics with the help of modern computational modeling and data analytics methods. He is specifically interested in applying quantum mechanical methods to industrial problems and is leading first initiatives to adapt these algorithms to noisy intermediate-scale quantum computers as part of the Merck Quantum Computing Task Force.
James Warren, PhD
National Institute of Standards and Technology
Since 2010, Jim has been focusing his energies on the US Materials Genome Initiative, a multi-agency initiative designed to create a new era of policy, resources, and infrastructure that supports US institutions to discover, manufacture, and deploy advanced materials twice as fast at a fraction of the cost. As Director of the NIST Materials Genome Program, he works with a government-wide team to build out the materials innovation infrastructure needed to realize the initiative’s goals. He is also one of the co-founders and the current Director of the NIST Center for Theoretical and Computational Materials Science. Jim has a PhD in physics from the University of California, Santa Barbara.
Greg Mulholland
Founder and CEO, Citrine Informatics
Greg is the co-founder and CEO of Citrine Informatics and a recognized leader in the use of digital tools and digitization practices in the development of next-generation materials and chemicals products and the creation of next-generation business models. Under his leadership, Citrine has been recognized as a WEF Technology Pioneer, a member of the Cleantech 100, the World Materials Forum Startup of the Year, and CB Insights AI 100 in 2017 and 2020. Greg holds a BS in Electrical Engineering and a BS in Computer Engineering from NC State University, an MPhil in Materials Science from Cambridge University, and an MBA from Stanford University.
Tim Robertson, PhD
Schrödinger, Inc.
Tim is a full-stack software engineer with a doctorate in computational biology and extensive experience in applied machine learning. He worked as a data scientist for companies such as Twitch and Yelp and founded two YCombinator-funded startups. Currently, Tim is Principal Scientist at Schrödinger, where he works in a hybrid scientist/engineer role, developing and applying deep learning and other AI techniques to problems in rational drug design. He has a PhD in Computational Biology (Biochemistry) from the University of Washington.
Diverse top leaders and problem-solvers are critical to fostering and accelerating creativity and innovation in STEM. This diversity is impossible unless we invest in making the STEM workforce more inclusive for women and those from underrepresented populations.
To achieve this, we need to promote diversity at all stages of the STEM pipeline and increase the number of people participating in scientific endeavors, inside and outside academia, as well as those who will help address the most pressing challenges of the 21st century.
This panel discussion, presented by the New York Academy of Sciences and Hudson River Park, features diverse STEM experts as they discuss their career paths and the importance of supporting diversity in the STEM workforce.
In this eBriefing, You’ll Learn:
The traditional and non-traditional routes panelists took into STEM and the nature of their work
The importance of mentorship and how to best leverage these relationships throughout your career
How individuals, especially people of color and members of other minority groups, can find and cultivate supportive communities
Why conversations about diversity and inclusion are meaningful in STEM
How both individuals and large organizations can address systemic inequality to create work environments where everyone can succeed
Speakers
Moderator:
Wanjiku “Wawa” Gatheru Environmental Justice Advocate, Writer, and Rhodes Scholar
Megan Lung NYS DEC Hudson River Estuary Program & NEIWPCC
Tepring Piquado, PhD RAND Corporation
Diversity and Inclusion in STEM: Leveraging Your Network and Skills
Wanjiku “Wawa” Gatheru
Wanjiku “Wawa” Gatheru is a 21-year-old environmental justice advocate passionate about creating a more inclusive environmental movement. As an emerging climate writer, she has bylines in VICE News and Glamour magazine. Wawa is also the first Black person in history to receive the Rhodes, Truman, and Udall Scholarships.
Megan Lung
NYS DEC Hudson River Estuary Program & NEIWPCC
Megan Lung is an Environmental Analyst at NEIWPCC serving the New York State Department of Environmental Conservation Hudson River Estuary Program in stream restoration. Megan coordinates the Culvert Prioritization Project, which seeks to restore stream habitat for migratory fishes and reduce localized flooding through field work, community engagement, and implementation.
Megan hails from the Great Lakes of Michigan and earned a BS in History and Ecology and Evolutionary Biology from the University of Michigan.
Ronald E. Hunter, Jr, PhD
Mérieux NutriSciences
Dr. Ronald E. Hunter, Jr. is the Technical Director of Chemistry for North America at Mérieux NutriSciences. In this role, he directs quality control and technical functions of chemistry labs throughout North America to ensure performance meets corporate standards. Previously, Dr. Hunter was a scientist at The Coca-Cola Company, where he served as a subject-matter expert in beverage analyses, method development, and mass spectrometry. He has over ten years of experience as an analytical chemist in the public, private, and academic sectors.
Dr. Hunter holds BAs in chemistry and Spanish from Mercer University and a PhD in analytical chemistry from Emory University.
Tepring Piquado, PhD
RAND Corporation
Dr. Piquado is a senior policy researcher at RAND Corporation, professor at Pardee RAND Graduate School, chief policy director at California Issues Forum, and CEO of The TMP Group. Through her work, she leads complex, multi-site and multi-disciplinary projects to provide evidence-based guidance to federal, state and local decision-makers; provides advisory guidance and analysis on active bills and major issues being considered by state legislators; and works with institutional leaders to provide outcome-based solutions that advance diversity, equity, and inclusion.
Dr. Piquado earned her MS and PhD in neuroscience from Brandeis University and BS in computer science from Georgetown University.
Mandë Holford, PhD
Hunter College/AMNH/Killer Snails, LLC
Dr. Mandë Holford is an Associate Professor in Chemistry at Hunter College and CUNY-Graduate Center. Her laboratory investigates the power of venom to transform lives when it is adapted to create novel therapeutics for treating human diseases and disorders. Dr. Holford is also actively involved in science education, advancing the public understanding of science, and science diplomacy. She is co-founder of KillerSnails.com, an award-winning EdTech company that uses tabletop, digital, and XR games about extreme creatures in nature to advance scientific learning in K-12 classrooms.
Dr. Holford received her PhD in Synthetic Protein Chemistry from The Rockefeller University.
Many promising strategies for promoting neuroregeneration have emerged in the past few years, but a further research push is needed for these ideas to be translated into therapies for neurodegenerative diseases.
On June 13–14, a symposium presented by Eli Lilly and Company and The New York Academy of Sciences brought together academic and industry researchers working on multiple neurodegenerative diseases, as well as clinicians and government stakeholders, to discuss cutting-edge basic and clinical research on neuroregeneration and neurorestoration. Topics included neuronal plasticity, inflammation, glial cell function, autophagy, and mitochondrial function, as well as analysis of recent drug development failures and how to move forward from them.
Speakers
Benedikt Berninger, PhD, University Medical Center Johannes Gutenberg University Mainz, Germany
Graham Collingridge, PhD, University of Toronto
Ana Maria Cuervo, MD, PhD, Albert Einstein College of Medicine
Valina Dawson, PhD, Johns Hopkins School of Medicine
Roman Giger, PhD, University of Michigan
Steven Goldman, MD, PhD, University of Rochester Medical Center
Eric Karran, PhD, AbbVie
Arthur Konnerth, PhD, Technical University of Munich, Germany
Guo-li Ming, MD, PhD, Johns Hopkins School of Medicine
David Rowitch, MD, PhD, ScD, University of Cambridge and University of California, San Francisco
Amar Sahay, PhD, Massachusetts General Hospital
Reisa A. Sperling, MD, MMSc, Brigham and Women’s Hospital
James Surmeier, PhD, Northwestern University
Richard Tsien, DPhil, New York University Langone Medical Center
Jeffrey Macklis, Harvard University
Mark Mattson, National Institute on Aging
Clive Svendsen, Cedars-Sinai Medical Center
Michael Sofroniew, David Geffen School of Medicine, UCLA
Michael J. O’Neill, Eli Lilly and Company
Presented By
Meeting Reports
Astrocytes in CNS Repair; Disease-Modifying Therapies in the Pipeline
Speakers
Eric Karran AbbVie
Michael V. Sofroniew David Geffen School of Medicine, University of California, Los Angeles
Highlights
Astrocyte scar formation is not detrimental to neuronal regeneration and repair after injury but is in fact critical to the healing process.
The clinical pipeline in Alzheimer’s disease is dominated by amyloid beta-targeting compounds, despite the fact that the approach has not been successful to date.
Astrocytes in CNS Repair
In his keynote talk, Michael V. Sofroniew of the University of California, Los Angeles, described 25 years of work on the overlooked and misunderstood role of astrocytes in the central nervous system (CNS).
These glial cells were discovered in the 19th century, and researchers widely believed that their activation after injury—which often results in scar formation around the lesion—detrimentally affects recovery. “But one has to ask, why would nature conserve this response to injury across all mammalian species if it were purely detrimental?” Sofroniew said.
Astrocytes can play fundamentally different roles in the CNS. In healthy tissue, they help synapses take up and release neurotransmitters and other factors, and help maintain neuronal energy balance and blood flow in surrounding tissue. Their activation in response to damage differs depending on whether recovery requires neurons to grow through lesioned tissue or through intact neural tissue.
Two different phenotypes of reactive astrocytes occur after injury.
Astrocytes responding to injury exist in different phenotypes: a hypertrophic reactive form interacts with neural cells, and a scar-forming reactive form interacts with non-neuronal inflammatory and fibrotic cells. Researchers are just beginning to define the function of hypertrophic astrocytes, but Sofroniew and his colleagues hypothesize that they represent a beneficial gain of function—helping injured neurons make new synapses and reorganize damaged circuits. Much remains to be learned about this process, he said.
Ongoing research from Sofroniew’s lab suggests that scar-forming astrocytes have a different, also beneficial function: recruiting inflammatory cells into the tissue, regulating their activity, and restricting their spread outside the lesion. Inflammation is crucial for getting rid of damaged cells, but too much of it damages surrounding intact tissue.
When neural tissue is injured, astrocytes recruit cells to scavenge damaged tissue. Somehow, astrocytes sense where the border between damaged and healthy tissue should be and wall off the injury with scar tissue. Sofroniew and others have shown that disrupting scar formation causes neurons in surrounding tissue to die.
Entrenched dogma in the field, however, says that astrocyte scar formation prevents axon regeneration. Twenty years ago, Sofroniew’s lab first tested whether disrupting scar formation in mice would spur injured axons to spontaneously regenerate. Their results showed that it didn’t, but the findings went against current dogma, so the team never published them. When a researcher interested in the question joined the lab recently, they began exploring the question again, using two mouse models with mutations that prevent scar formation.
After a spinal cord injury, sensory axons stimulated with growth factors can regrow despite astrocyte scar formation.
They showed that axons in three different types of CNS tracts failed to regrow in the mutant mice. Both astrocytes in lesions and other, non-astrocyte cells produced a variety of molecules both promoting and inhibiting axonal growth, underscoring the multi-component nature of regeneration. And axons that received appropriate stimulatory molecules “grow happily across astrocyte scars,” he said. The group is now confirming the result in additional types of CNS tracts. Sofroniew concluded that astrocyte reactivity and scar formation are not forms of astrocyte dysfunction, but adaptive functions critical for CNS repair and regeneration after injury.
Disease-Modifying Drugs for Alzheimer’s Disease: An Industry Perspective
The 1990s were a rich decade of discovery in Alzheimer’s disease, said Eric Karran of the pharmaceutical company AbbVie. Researchers identified disease-causing autosomal dominant mutations in the amyloid precursor protein, in presenilin, and in tau. The field began to uncover key mechanisms and targets, and many believed that the next decade would yield effective therapeutics. However, that has not transpired, and many uncertainties about Alzheimer’s disease drug development remain.
Researchers still puzzle over the relationship between tau pathology and amyloid beta deposition. And while evidence suggests that Apolipoprotein E (ApoE) is closely involved in amyloid beta pathology, the mechanistic details remain mysterious. Nonetheless, research on the autosomal dominant mutations has geared drug discovery toward the idea that amyloid deposition initiates the disease process. Yet it is not clear that amyloid beta is an effective target for people who already have symptoms of Alzheimer’s disease.
Three questions are critical for therapeutics targeting amyloid: at what stage of the disease is such a drug most likely to be effective; by how much should amyloid beta be lowered, or its clearance be facilitated; and what kind of clinical experiment will test the validity of the amyloid cascade hypothesis?
Karran made a distinction between onset and duration of the disease. Possibly, amyloid beta deposition initiates the disease, he said, but is not the factor that drives its progression. The amyloid cascade hypothesis has many permutations, making proving or disproving it particularly difficult. One clear sign of this is the multiple failed trials that targeted amyloid beta. Lilly’s solanezumab seemed to show a mild effect on cognitive decline, but the signal was too small for a phase 3 trial. One currently promising candidate is Biogen’s aducanumab, which showed time- and dose-dependent reduction of amyloid plaques in early-stage trials.
Tau pathology correlates with disease progression, but amyloid does not.
A drug that intervenes with the onset and spread of tau pathology could potentially have therapeutic value relatively late in disease. Tau pathology is the most proximate marker for neuronal loss and cognitive impairment. Tau proteins are released by a currently unknown mechanism; how they are seeded and travel to distant neurons is also poorly understood. The process points to several points of intervention, such as anti-tau antibodies targeting seeds or fibrils. However, early efforts at tau therapeutics have failed.
Speaker Presentation
Further Readings
Michael Sofroniew
Anderson MA, Burda JE, Ren Y, Ao Y, O’Shea TM, Kawaguchi R, Coppola G, Khakh BS, Deming TJ, Sofroniew MV.
Dendritic Spines, Axons, and Synapses in Neuroplasticity
Speakers
Richard Tsien New York University Langone Medical Center
Roman J. Giger University of Michigan School of Medicine
Jeffrey Macklis Harvard University
James Surmeier Feinberg School of Medicine, Northwestern University
Highlights
Neuronal cell bodies regulate events at the synapse via the CamKII signaling pathway.
Imperfect adaptation to the gradual loss of dopaminergic neurons in the striatum drives Parkinson’s disease symptoms.
Dectin1, a receptor expressed on the surface of macrophages, mediates a neuroregenerative immune response after injury.
Growth cones may contain autonomous machinery for building the neuronal circuitry of the brain.
Regulation of Synapses and Synaptic Strength
Understanding the neural circuitry underlying learning and memory requires understanding how neurophysiological events at the synapse are integrated with molecular events in the nucleus such as gene transcription and protein translation, said Richard Tsien of New York University. At the synapse, this process depends on the combined activation of glutamate receptors and so-called L-type calcium channels. Tsien’s lab discovered that such dual activation is coordinated by the mobilization of a molecule called CamKII—known to be a key player in learning and memory—around tiny protrusions from dendrites called dendritic spines.
Tsien and his colleagues then elucidated how the signal from this synaptic activity is conveyed to the nucleus. Two of the four known forms of CamKII do their jobs at the synapse, but a third form, called gamma CamKII, shuttles calcium and its binding partner calmodulin to the nucleus, where it initiates a signaling cascade that drives the transcription of genes involved in long-term potentiation, a key molecular mechanism underlying learning and memory. Mice mutated to lack gamma-CamKII showed reduced learning and memory and did not upregulate key genes after training in memory tasks.
Mice mutated to lack gamma-CamKII showed reduced learning and memory and did not upregulate key genes after training in memory tasks.
A mutation in gamma CamKII has been linked to intellectual disability in humans; studies on this human mutation revealed that it prevented the protein’s ability to shuttle calcium / calmodulin. Mutations in multiple proteins on this CamKII signaling pathway have been causally implicated in neuropsychiatric disorders such as autism, pointing to its importance in linking neuronal activity with nuclear processes.
Striatal Plasticity in Parkinson’s Disease
The core motor symptoms of Parkinson’s disease (PD) are caused by the loss of dopaminergic neurons in a brain region called the striatum. James Surmeier of Northwestern University described his lab’s research on how the two main pathways of the striatum—the direct (dSPN) and the indirect (iSPN) pathway—maintain homeostasis as the disease progresses.
Dopaminergic signaling in the striatum helps regulate goal-directed behaviors. The dSPN promotes desired actions, while the iSPN suppresses undesired actions, and the two must remain balanced for appropriate action selection to occur. Dopamine helps provide that balance. When its levels are high, it promotes long-term potentiation (LTP) of the dSPN (increasing choice of good actions) and long-term depression (LTD) of the iSPN (limiting opposition to them). When levels fall, the opposite occurs, quashing the selection of “bad” actions. Surmeier’s lab studies what drives LTP and LTD at these synapses by visualizing them. Only a subset of synapses is responsive to dopamine, they found.
Dopamine differentially affects the dSPN and iSPN via D1 and D2 receptors.
According to the standard model of Parkinson’s, loss of dopaminergic innervation of the striatum changes the excitability of the dSPN and iSPN, leading to suppression of motor activity. However, this model fails to account for how the system might compensate for its gradual deterioration. Such compensation may explain why the striatum must lose more than 60% of its dopaminergic cells before a person shows symptoms of the disease, Surmeier said. His work instead suggests that the dSPN and iSPN undergo a more graded but imperfect adaptation to the loss of dopaminergic innervation, which distorts the information that these pathways receive, and which may cause deficits in goal-directed behavior before gross motor symptoms appear.
Immune-mediated Nervous System Regeneration
There is no spontaneous regeneration after nerve injury in the central nervous system. That is probably because extrinsic factors exist that block regeneration, and intrinsic factors that promote it are not activated, said Roman J. Giger of the University of Michigan School of Medicine. However, some types of inflammation can activate such regeneration factors.
His team found that an injection of zymosan (a mixture of proteins and carbohydrates prepared from the yeast cell wall) induced significant long-distance regeneration after optic nerve injury in mice, while the bacterial extract lipopolysaccharide did not. He and his colleagues found that this regenerative antifungal response is mediated primarily by dectin-1, a receptor for a substance called beta glucan, which is expressed on the surface of macrophages and other immune cells, as well as by the immune recognition protein Toll-like receptor 2 (TLR2).
They also found this mechanism in spinal cord regeneration, as tested after a so-called conditioning injury to the sciatic nerve (which activates immune response genes) followed by a spinal cord lesion at the dorsal root ganglion. Wild type mice showed significant spinal cord axon regrowth after zymosan injection, while mice engineered to lack dectin-1 or TLR2 showed none.
The researchers then tried to pinpoint which immune cell types produced dectin-1, and where it had to be localized to spur regeneration. They found that immune cells from the sciatic nerve—that is, from the conditioning lesion—carried the signal. Although mice lacking dectin-1 showed no regeneration, immune cells from the lesioned sciatic nerve of a wild type mouse transplanted into the dectin-1 knockout mouse could rescue this deficit.
Growth Cone Control over Circuit Development
Building the brain’s neuronal circuitry is an enormously complex endeavor: neurons exist in a multitude of diverse subtypes, they project to precise somatotopic targets, and some send projections to more than one specific location. Projections can be up to a meter in length, some 10,000 cell body diameters away. The system’s precision is astounding, said Jeffrey Macklis of Harvard University, and being able to rebuild circuits when they go awry is key to regeneration in the face of injury or disease.
Macklis described work showing that the transcriptional machinery that generates this complexity is present not just in the neuronal cell body, but also in growth cones located at the tips of projections as they extend. His lab has found that growth cones contain locally translated proteins, suggesting that these neuronal outposts might exert autonomous control over circuit development. “As a developmentalist, I view growth cones as little baby synapses,” Macklis said.
Immature axons transplanted in the developing mouse still project to their original, appropriate targets, suggesting a logic and subtype specificity to the process. Macklis’s lab came up with an approach to label and isolate growth cones from different neuronal subtypes. They found specific proteins and RNAs enriched at growth cones that were not present in the neuronal cell body, suggesting a localized projection machinery. Targeting this machinery could be an important strategy for promoting regeneration.
Inflammation, Oxidative Stress, Mitochondrial Function, and Autophagy
Speakers
Ana Maria Cuervo Albert Einstein College of Medicine
Valina L. Dawson Johns Hopkins University
Mark Mattson National Institute on Aging
Highlights
Fasting and exercise exert protective effects on the brain and improve the bioenergetic properties of neurons.
Activators of a selective autophagy process may help clear aggregating proteins implicated in neurodegenerative disease.
A key cluster of Parkinson’s disease proteins regulates mitochondrial biogenesis and function.
Bioenergetic Challenges Bolster Brain Resilience
Mark P. Mattson of the National Institute on Aging described how two bioenergetic challenges—food deprivation and exercise—affect brain health. The ability to function under conditions of food deprivation was a main driving force in brain evolution, he said: fasting was frequent, and it drove humans to search for food. Aging is a major risk factor for dementia and stroke, but sedentary lifestyles contribute as well, by compromising cells’ ability to adapt to the molecular stresses of aging.
Increased exercise is known to boost brain levels of the neuroprotective factor BDNF, and early work in Mattson’s lab found that fasting has the same effect in mice. Also, in mice genetically engineered to be obese and diabetic, alternate day fasting and increased exercise on a running wheel increased the density of synaptic spines in their brain. Further work showed that fasting and exercise also increased the number of mitochondria—the cell’s energy-generating organelles—in cultured hippocampal neurons.
The brains of mice lacking Sirt3 experience more cell death (blue) upon excitotoxic treatment with glutamate, kainic acid, and NMDA.
More recently, Mattson’s lab found that exercise and intermittent fasting upregulate a mitochondrial protein called sirtuin 3 (Sirt3), which goes on to activate enzymes that protect the mitochondria against stress and protect cells against apoptosis. The group has also explored the effects of fasting in humans. Currently, the group is studying whether people at risk for cognitive impairment due to age or metabolic status benefit from fasting two days per week.
Malfunctioning Autophagy Pathways in Neurodegeneration
Autophagy is the process of degradation or recycling of materials inside the cell, and many facets of it are coming under scrutiny as causal factors in neurodegeneration. Ana Maria Cuervo of the Albert Einstein College of Medicine studies chaperone-mediated autophagy (CMA), in which individual proteins targeted with a degradation motif are recognized by a chaperone protein, carried to a receptor called LAMP-2A on the lysosome surface, and pulled inside for degradation. In order to study the role of CMA in neurodegeneration, Cuervo’s lab designed a fluorescent reporter system that can track the process in vivo, in the brain and other organs.
A fluorescent reporter technique developed by Cuervo lab allows researchers to observe chaperone-mediated autophagy in different tissues of a live mouse.
The CMA pathway is highly sensitive to aging; levels of the LAMP-2A receptor drop as animals age. Additionally, many proteins involved in neurodegenerative diseases carry CMA degradation motifs. The mutant form of LRRK2, the protein most often mutated in familial cases of Parkinson’s, interferes with the LAMP-2A receptor’s ability to form the complexes required for translocation into the lysosome. Other neurodegeneration-related proteins, such as tau, show a similar effect, causing these proteins to aggregate because they cannot be broken down inside the lysosome. Human postmortem Alzheimer’s disease brains also appear to have a CMA deficit.
The lab has now developed a selective activator of the CMA pathway and is administering it to a mouse model of Alzheimer’s disease. The intervention ameliorates cellular markers of the disease as well as behavioral deficits, improving visual memory and reducing anxiety- and depression-like behavior in the animals.
Mitochondrial Mechanisms and Therapeutic Opportunities
Mitochondrial dysfunction was first observed in Parkinson’s disease some 40 years ago, but how it plays a role in the disease is unknown. Some genetic causes of PD have been identified, including mutations in Parkin and PINK1. Valina L. Dawson’s lab at Johns Hopkins University is investigating how three closely interacting proteins, Parkin, PINK1, and PARIS, regulate mitochondrial function and, in turn, the integrity of dopaminergic neurons, which malfunction in PD.
In 2011, Dawson’s lab identified PARIS, a protein that tamps down mitochondrial production by repressing another protein called PGC1-alpha. PARIS is ubiquitinated by Parkin, removing this brake on mitochondrial production. Mice genetically engineered to lack Parkin show age-dependent loss of dopaminergic neurons and serve as a model of PD. But if these mice also experience a knock-down of PARIS, the deficit is rescued. Loss- and gain-of-function studies of these proteins in mice revealed a homeostasis between them that regulates mitochondrial biogenesis and function. PINK1 is also central; it must phosphorylate Parkin for this homeostasis to occur.
In human neurons lacking Parkin, knocking down PARIS rescues mitochondrial deficits.
The relationships between these proteins also hold in human embryonic stem cells when these proteins are knocked down, and in induced pluripotent cells derived from Parkinson’s patients with mutations in these proteins. Based on these findings, Dawson’s team and collaborators are exploring whether PARIS inhibitors, Parkin activators, or other molecules affecting this protein network have therapeutic value in PD mice.
Speaker Presentations
Further Readings
Mark Mattson
Cheng A, Yang Y, Zhou Y, Maharana C, Lu D, Peng W, Liu Y, Wan R, Marosi K, Misiak M, Bohr VA, Mattson MP.
Cell Rep. 2017 Jan 24;18(4):918-932. doi: 10.1016/j.celrep.2016.12.090.
Glial Function
Speakers
Steven A. Goldman University of Rochester Medical Center
David H. Rowitch University of Cambridge
Clive Svendsen Cedars-Sinai Medical Center
Highlights
Glial cell dysfunction may causally contribute to schizophrenia and other neurological diseases.
Astrocytes engineered to produce GDNF are in clinical trials for treating amyotrophic lateral sclerosis.
Astrocytes are functionally and regionally heterogeneous, and their dysfunction may contribute to neurodegenerative disease.
Targeting Glial Cell Dysfunction in Neurological Disease
Glial cells make up a significant proportion of cells in the brain, yet their contribution to disease is poorly understood. Steven A. Goldman of the University of Rochester Medical Center studies glia’s role in brain diseases such as schizophrenia. His lab injects human glial progenitor cells into the brains of mutant mice that lack their own glia; the brains of the resultant chimeras become fully repopulated with human astrocytes and oligodendrocytes. This human glial chimera maintains the phenotypes of human glial cells, and mice with human glia show stronger long-term potentiation in the hippocampus and learn fear-conditioning and other behavioral and cognitive tasks more quickly than wildtype mice.
Astrocytes in mice populated by glial cells derived from people with schizophrenia had a different morphology than those derived from control subjects, with fewer and longer processes.
Goldman’s team created chimeric mice populated by glia derived from eight different people with juvenile onset schizophrenia, and compared them to mice with glial cells derived from control subjects. These glial precursor cells migrated abnormally and formed less myelin than precursors from control human subjects. Myelin-producing and glial differentiation genes, as well as genes associated with synaptic development and transmission, were downregulated. Astrocytes in the patient-derived chimeras also had irregular morphology. The animals exhibited impaired response to stimuli as well as anxiety and antisocial behavior. Genes related to glial cells might be potent therapeutic targets for schizophrenia and other diseases, like Huntington’s disease and frontotemporal dementia.
“We never thought of these as glial diseases, but fundamentally they might be,” Goldman said.
Stem-cell-derived Astrocytes for Treating Neurodegenerative Disease
Ninety percent of neurodegenerative diseases have no known genetic cause, and many may be amenable to treatment with cell therapy, said Clive Svendsen of Cedars-Sinai Medical Center. While methods for delivering neurons into the diseased CNS are still evolving, astrocytes have great potential for immediate use, Svendsen said.
His lab developed a protocol for deriving astrocytes from human fetal tissue; these cells migrate to areas of damage when delivered to a rat brain. To give these cells more regenerative capacity, Svendsen and collaborators engineered the cells to release the growth factor GDNF. They initially tested this cell delivery therapy in a Parkinson’s disease model, but it has also been applied in stroke, and both Huntington’s and Alzheimer’s disease.
More recently they have begun to explore its use in amyotrophic lateral sclerosis (ALS), in which life expectancy after diagnosis is a mere three years and no effective treatments exist. They first tested it in a transgenic rat model of ALS carrying a mutant form of the protein SOD1. When they transplanted the therapeutic astrocytes into the lumbar spine, the cells survived well and improved neuronal survival, but did not prevent paralysis. As they moved up the spinal cord, results improved; cell delivery into the brain’s motor cortex yielded improved motor function and survival in the animals.
GDNF-releasing astrocytes injected into the motor cortex spur motor neuron growth in a rat model of ALS.
Last October, Svendsen and his team launched an 18-person clinical trial of this approach. For safety reasons, the U.S. Food and Drug Administration required the researchers to start by delivering cells into the lumbar spine; patients will receive the therapy in one leg, with the other acting as a control. If the first few patients experience no adverse effects, delivery into the cervical spine and the cortex will also be attempted.
Functionally Heterogeneous Astrocytes in the Mammalian CNS
How neuron patterning generates a diversity of neuronal types throughout the central nervous system is well understood. But very little is known about heterogeneity in astrocytes, although they are the most abundant cells in the CNS, comprising about half of all brain cells, said David H. Rowitch of the University of Cambridge.
Early work in Rowitch’s lab identified an astrocyte-specific transcription factor and showed that astrocytes are allocated to specific regions of the brain during development. The team then searched for postnatal astrocytes in the spinal cord that were regionally and functionally distinct by comparing gene expression in the dorsal and ventral parts of the spinal cord. The gene Sema3a was most highly expressed in ventral astrocytes in mice, and when it was deleted, half of the animals’ alpha motor neurons, which innervate fast-twitch muscle, died.
Mice lacking Kir4.1 have abnormal signaling in motor neurons, smaller muscle fibers, and decreased strength.
To investigate how neurons and astrocytes interact, the researchers examined a potassium channel called Kir4.1, which is preferentially expressed in the ventral brain and spinal cord. Loss-of-function mutations in the channel cause epilepsy, and the channel is strongly downregulated in astrocytes of people with ALS. Mice engineered to lack the channel in astrocytes have smaller alpha motor neurons and weaker muscle function. Transfecting the astrocytes of these mice with the channel reverses these deficits. The fact that astrocytes so strongly affect neuron function suggests that dysfunction in specific subsets of astrocytes may play a role in neurodegenerative diseases.
Speaker Presentations
Further Readings
Steven Goldman
Han X, Chen M, Wang F, Windrem M, Wang S, Shanz S, Xu Q, Oberheim NA, Bekar L, Betstadt S, Silva AJ, Takano T, Goldman SA, Nedergaard M.
Science. 2012 Jul 20;337(6092):358-62. doi: 10.1126/science.1222381. Epub 2012 Jun 28.
Innovative Approaches to Promote Neuroregeneration
Speakers
Graham Collingridge University of Toronto
Guo-li Ming University of Pennsylvania
Benedikt Berninger Johannes Gutenberg University Mainz
Amar Sahay Broad Institute of Harvard and MIT
Highlights
Novel therapies targeting synaptic plasticity pathways could address the dysregulation of long term depression underlying Alzheimer’s disease.
Brain organoids grown from human induced pluripotent stem cells recapitulate development and can model brain disease.
Reprogramming pericyte cells into neuronal cells occurs via a distinct developmental program.
Promoting neurogenesis and re-engineering molecular connectivity in the hippocampus reversed age-related memory decline in mice.
Is Alzheimer’s Disease Caused by Long Term Depression Gone Awry?
One key purpose of brains is to enable learning and memory—a process that relies on a balance between long term potentiation (LTP) and long term depression (LTD) to drive synaptic plasticity, said Graham Collingridge of the University of Toronto. Dysregulation of that balance causes Alzheimer’s disease, he said.
In 1983, Collingridge’s lab identified the role of the NMDA receptor in synaptic plasticity, finding that its activation could cause both LTP and LTD. In later work, they sought kinase inhibitors that could block LTP and LTD. One of the few ways to inhibit LTD was to block glycogen synthase kinase 3beta (GSK-3beta). This molecule is also known as tau kinase because it hyperphosphorylates the protein tau—a process implicated in Alzheimer’s disease pathogenesis. “I thought, well, that’s just not coincidence, is it,” Collingridge said.
Dysregulation of the pathway regulating LTD can cause the pathogenic features of Alzheimer’s disease.
Tau regulates microtubules in axons, but Collingridge’s lab found that it also exists in synapses, where it is phosphorylated by GSK-3beta. In mice engineered to lack tau, LTD is absent but LTP is undisturbed. Work from other researchers had shown that amyloid beta, the protein that aggregates in Alzheimer’s disease, inhibits LTP and facilitates LTD. His group showed that inhibiting GSK-3beta reverses this effect, and identified other parts of the signaling pathway linking amyloid beta, tau, GSK-3beta, and both LTP and LTD. Dysregulation of these components can generate amyloid beta plaques, tau tangles, and the neuroinflammation, synapse loss, and memory loss that characterize Alzheimer’s. Modulators of NMDA receptor activity may have therapeutic potential.
Modeling Human Brain Development and Disease with Human Induced Pluripotent Stem Cells
Guo-Li Ming of the University of Pennsylvania is developing 3-dimensional cell culture models of the developing brain—so-called organoids—using induced pluripotent stem cells. High school students working in her lab designed 3D-printed lids with shafts that insert into standard cell culture plates, to divide each individual well of the plate into a separate miniaturized spinning bioreactor. Because most brain organoid protocols produced highly heterogeneous tissue, she used these tiny bioreactors to create organoids containing almost exclusively forebrain tissue.
Using markers specific to different layers of the cerebral cortex, Ming’s lab could show that organoids roughly recapitulated the cortical architecture.
Cell labeling and gene expression studies showed that when grown for 100 days, these organoids recapitulated fetal forebrain development through the end of the second trimester. Progenitor cells generated neurons and glia whose migration pattern mirrored development, and the neurons received both excitatory and inhibitory input. The researchers used the organoids to study how Zika virus affects the developing brain. They found that the virus specifically targets neural progenitor cells, dose-dependently causing cell death and a collapse of tissue that resembles the microcephaly seen in infants affected by Zika. A screen of 6,000 compounds yielded a neuroprotective compound called Emricasan that is positioned to enter clinical trials.
The group has now developed other brain-region specific organoids, modeling the midbrain and the hypothalamus. They plan to use these tools to study other neurodevelopmental disorders. Recent publications suggest the approach can also recapitulate features of neurodegenerative diseases, Ming said.
Engineering Neurogenesis via Lineage Reprogramming
For the past decade, Benedikt Berninger of Johannes Gutenberg University Mainz has been working on identifying cellular signals that can drive the reprogramming of astroglial cells from the early postnatal mouse brain into neurons. More recently, to see if such reprogramming could be achieved in human cells, his lab began working with cells derived from adult human brains during epilepsy surgery. These cells turned out to be pericytes, and Berninger’s team identified two transcription factors—Sox2 and Ascl1—that could reprogram them into functional neurons, which formed synapses and fired action potentials in culture.
To understand how the two transcription factors interact, the researchers investigated gene expression in the early stages—day 2 and day 7—in this two-week reprogramming process. A few genes were regulated by just one of the factors, but most were turned on only when both factors were present, suggesting that the two factors act synergistically. Ascl1 alone appears to target a different set of genes—ones associated with mesodermal cell fate (which generate pericytes), rather than neurogenesis-related genes activated when Ascl1 is co-expressed with Sox2. A similar difference was seen on a single cell level.
The researchers also observed two subpopulations in the starting population of pericytes—one of which was susceptible to reprogramming into neurons while the other was not. That may account for distinct competence in reprogramming in individual cells, Berninger said. For example, cells expressing the leptin receptor had a low level of reprogramming efficiency, indicating subtype differences in reprograming competence.
Three sets of genes are induced during reprogramming of pericytes to neurons—a set associated with pericytes, one associated with a progenitor-like stage, and one associated with neurons.
In the subset of cells that do reprogram successfully, a set of genes was induced transiently, then downregulated. These genes reflect a progenitor-like stage in the reprogramming process. These studies suggest that cells are not transforming directly from pericyte to neuron, but undergo a series of events reminiscent of an unfolding developmental program, Berninger said.
Rejuvenating and Re-engineering Aging Memory Circuits
The hippocampus plays a critical role in the formation of episodic memories, that is, memories of what, when, and where. Essential to this capacity is the ability to keep similar memories separate and to retrieve past memories in a context-appropriate manner; both may become impaired with age, said Amar Sahay of Massachusetts General Hospital and Harvard Medical School. Within the hippocampus, the dentate gyrus-CA3 circuit performs operations such as pattern separation and pattern completion that support resolution of memory interference and retrieval. With age, neurogenesis in the hippocampus declines and CA3 neurons become hyperexcitable in rodents, non-human primates, and humans. Sahay’s lab investigates circuit mechanisms that may be harnessed to optimize hippocampal memory functions in adulthood and aging.
The DG-CA3 circuit in the hippocampus regulates episodic memory.
The hippocampus generates new neurons throughout life, and previous work has suggested that adult-born neurons integrate into the hippocampal circuitry by competing with existing mature neurons for inputs. Sahay and his colleagues identified a transcription factor called Klf9 that, when upregulated just in the mature neurons, biases competition dynamics in favor of integration of the adult-born neurons. This enhanced neurogenesis in adult (3-month-old), middle-aged (12-month-old), and aged (17-month-old) mice. Older rejuvenated animals (with enhanced adult hippocampal neurogenesis) had a memory advantage: they were better at discriminating between two similar contexts, one safe and one associated with a mild footshock.
In a complementary series of experiments, Sahay and his colleagues found age-related changes in connectivity between dentate granule neurons and inhibitory interneurons. They performed a screen and identified a factor with which they re-engineered connectivity between dentate granule neurons and inhibitory interneurons and augmented feed-forward inhibition onto CA3. By targeting this factor in the dentate gyrus of aged mice, the authors were able to reverse age-related alterations in dentate granule neuron-inhibitory interneuron connectivity and enhance memory precision.
Further Readings
Nat Neurosci. 2011 May;14(5):545-7. doi: 10.1038/nn.2785. Epub 2011 Mar 27.
Kimura T, Whitcomb DJ, Jo J, Regan P, Piers T, Heo S, Brown C, Hashikawa T, Murayama M, Seok H, Sotiropoulos I, Kim E, Collingridge GL, Takashima A, Cho K.
Philos Trans R Soc Lond B Biol Sci. 2013 Dec 2;369(1633):20130144. doi: 10.1098/rstb.2013.0144. Print 2014 Jan 5.
Peineau S, Taghibiglou C, Bradley C, Wong TP, Liu L, Lu J, Lo E, Wu D, Saule E, Bouschet T, Matthews P, Isaac JT, Bortolotto ZA, Wang YT, Collingridge GL.
McAvoy KM, Scobie KN, Berger S, Russo C, Guo N, Decharatanachart P, Vega-Ramirez H, Miake-Lye S, Whalen M, Nelson M, Bergami M, Bartsch D, Hen R, Berninger B, Sahay A.
Biomarkers, Hot Topics, and the Future of Therapeutics
Speakers
Reisa Sperling Brigham and Women’s Hospital
Johanna Jackson Eli Lilly and Company
Eliška Zlámalová University of Cambridge
Arthur Konnerth Technical University of Munich
Milo Robert Smith Icahn School of Medicine at Mount Sinai
Highlights
Multimodal imaging is becoming advanced enough to identify people with early-stage disease, which will help determine the critical window for therapies in clinical trials.
Slow wave oscillations are disrupted in Alzheimer’s disease model mice due to a misregulation of excitatory and inhibitory synaptic activity.
Imaging pre- and post-synaptic structures over time can reveal how disease progression affects synapses.
Integrative bioinformatics can identify common pathways across neurodegenerative diseases, as well as drugs that may act on those pathways.
An RNAi-based screen in Drosophila can reveal genes that shape the morphology of axonal ER.
Neuroimaging in Early Alzheimer’s Disease
Alzheimer’s disease evolves over a couple decades, but most research to date has examined the disease at a late stage—perhaps too late to intervene effectively, said Reisa Sperling of Brigham and Women’s Hospital. Multimodal imaging is becoming advanced enough to identify people with early-stage disease, which will help determine the critical window for therapies in clinical trials.
PET amyloid imaging detects amyloid pathology in humans in vivo. Some 30% of clinically normal individuals have high amyloid levels, and accumulating data suggest that this increases the risk of cognitive decline over the next 15 years, particularly when combined with markers of neurodegeneration such as decreased hippocampal volume. Still, Sperling said, “I see that as a glass half full—we’ve got 15 years to intervene.”
Committing something to memory requires activation of a brain region called the medial temporal lobe, where tau accumulates in AD. It also requires disabling the so-called default mode network (DMN), a brain circuit active when the brain is not engaged in a particular task. Amyloid accumulation disrupts the DMN, and disruptions also emerge in other networks and in the specificity of their signaling.
Tau levels are associated with cognitive decline.
It’s the combination of amyloid and tau that is important for cognitive decline. Because tau—though not amyloid—correlates clearly with cognitive decline, tau PET imaging, which emerged just a couple years ago, has the most promise as a neurodegenerative marker for clinical trials, Sperling said. Ultimately, trials should move toward primary prevention—identifying drugs that block disease onset before clinical symptoms emerge. The field also needs biomarkers that show a person’s response to therapy.
Circuitry Dysfunction in Alzheimer’s Disease Mouse Models
A lot is known about the clinical symptoms, pathology, and molecular mechanisms involved in Alzheimer’s disease, but there is a big gap in understanding how neuronal circuits are affected, said Arthur Konnerth of the Technical University of Munich.
About ten years ago, Konnerth’s lab developed a method for measuring neuronal function at the single-cell level in living mice using fluorescent calcium indicators. They used it to investigate neurons surrounding amyloid beta plaques in a mouse model of Alzheimer’s disease that overexpresses mutant amyloid precursor protein (APP). They hypothesized that these neurons would show decreased activity, but to their surprise, the cells were hyperactive, while cells further away were silent. The error signal sent by these hyperactive cells probably disturbs the circuit significantly, Konnerth said.
His team also explored the function of long-range circuits in Alzheimer’s disease model mice. They studied slow wave oscillations, a form of activity that is essential for slow wave sleep and for memory consolidation. These waves travel through the cortex and into the hippocampus in a coherent fashion. In Alzheimer’s disease mice, the coherence of this circuitry is highly disrupted. Enhancing inhibitory (GABAergic) neuron activity reversed the deficit.
Alzheimer’s disease model mice showed improved learning after restoration of slow wave activity.
Tweaking GABAergic activity in normal mice also affected this circuitry, pointing to a synaptic effect. Returning the circuitry to normal also improved performance on a learning task, the Morris water maze, and individual animals’ behavioral performance could be predicted by the coherence of their slow wave oscillations. An fMRI study in humans conducted by another lab also showed a disruption in slow wave oscillation. Targeting the shift in excitation-inhibition balance that underlies slow wave disruption may ameliorate cognitive deficits in the disease, Konnerth said.
Hot Topics in Neuroregeneration
In three short talks, early career researchers described imaging, bioinformatics and candidate gene analyses for probing neurodegenerative diseases.
Johanna Jackson of Eli Lilly used two-photon imaging in two mouse models of Alzheimer’s disease to study how disease progression affects synapses. She and her colleagues tracked axonal boutons and dendritic spines—the presynaptic and postsynaptic points of contact—over time in the same brain region. In the J20 mouse, which develops amyloid plaques, dendritic spine number remained constant, but axonal boutons were lost and the turnover rate of both spines and boutons increased as amyloidopathy progressed. The rTg4510 mouse, which develops tauopathy, showed a different pattern: both spines and boutons were lost, and neurites sickened and then disappeared over time. Switching off the transgene in these mice could partially prevent or delay these deficits.
Milo Robert Smith of the Icahn School of Medicine at Mount Sinai used bioinformatics to probe plasticity mechanisms in neurodegenerative diseases and to identify common disease pathways and potential therapeutic drugs. First, his team conducted microarray experiments to capture gene expression signatures of plasticity in mice. They then matched these signatures to transcriptomic signatures of 436 diseases taken from publicly available databases. The 100-plus illnesses showing a significant association were enriched for neurodegenerative diseases, and inflammatory genes appeared highly implicated. Finally, the researchers matched disease transcriptional signatures to transcriptional signatures of drugs measured in cell lines, also from publicly available databases. Using this approach, they identified drug candidates for restoring plasticity in Huntington’s disease.
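The final matching step can be sketched in code. This is an illustrative toy only, not the actual pipeline: the gene names and fold-change values are invented, and the simple Pearson-correlation scoring stands in for the more sophisticated rank-based statistics such methods typically apply to much larger signatures. The idea is that a drug whose expression signature is strongly anti-correlated with a disease signature may push expression back toward normal.

```python
# Toy sketch of signature matching: score drugs by how strongly their gene
# expression signature opposes a disease signature (hypothetical data).

def pearson(xs, ys):
    """Pearson correlation between two equal-length lists of numbers."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def reversal_candidates(disease_sig, drug_sigs, threshold=-0.5):
    """Return drug names whose signature is strongly anti-correlated with
    the disease signature, most anti-correlated first."""
    genes = sorted(disease_sig)                 # shared gene order
    d = [disease_sig[g] for g in genes]
    scores = {name: pearson(d, [sig[g] for g in genes])
              for name, sig in drug_sigs.items()}
    flagged = [name for name, s in scores.items() if s < threshold]
    return sorted(flagged, key=lambda name: scores[name])

# Hypothetical log-fold-change signatures over a shared gene set
disease = {"GFAP": 1.2, "IL6": 0.8, "BDNF": -1.5, "HTT": 0.3, "TNF": 0.9}
drug_a = {"GFAP": -1.0, "IL6": -0.7, "BDNF": 1.3, "HTT": -0.2, "TNF": -0.8}
drug_b = {"GFAP": 1.1, "IL6": 0.6, "BDNF": -1.2, "HTT": 0.4, "TNF": 0.7}

print(reversal_candidates(disease, {"drug_a": drug_a, "drug_b": drug_b}))
```

Here drug_a, which roughly mirrors the disease signature, is flagged as a reversal candidate, while drug_b, which mimics the disease, is not.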
A strategy for using integrative bioinformatics to identify drugs that target common mechanisms in neurodegenerative disease.
Human motor neuron axons can extend a meter in length, and dysfunction in trafficking over such a distance underlies a neurodegenerative disease called hereditary spastic paraplegia (HSP), in which corticospinal motor neurons progressively degenerate. Eliška Zlámalová of the University of Cambridge is identifying candidate genes involved in long-distance axonal transport and HSP pathology. Three genes associated with HSP—reticulon, REEP1, and REEP2—produce proteins that localize to the smooth endoplasmic reticulum (ER) in axons. When Zlámalová disabled all three in Drosophila, the ER fragmented in the middle of the axon and degenerated distally. To look for additional candidate genes, Zlámalová developed fluorescent markers for two other proteins, knocked down their genes in triple-mutant flies using RNA interference, and imaged ER morphology. She found a trend toward further ER fragmentation; a larger number of experiments may yield more conclusive results.
Further Readings
Reisa Sperling
Jack CR Jr, Bennett DA, Blennow K, Carrillo MC, Feldman HH, Frisoni GB, Hampel H, Jagust WJ, Johnson KA, Knopman DS, Petersen RC, Scheltens P, Sperling RA, Dubois B.
Science. 2014 Jan 31;343(6170):506-511. doi: 10.1126/science.1247363.
Panel Discussion: The Future of Research and Therapies in Neuroregeneration and Restoration
Speakers
Michael J. O’Neill, Moderator Eli Lilly and Company
Ana Maria Cuervo, Panelist Albert Einstein College of Medicine
Mark P. Mattson, Panelist National Institute of Aging
Clive Svendsen, Panelist Cedars-Sinai Medical Center
Jeffrey Macklis, Panelist Harvard University
Panel Discussion The Future of Research & Therapies in Neuroregeneration & Restoration
The panelists began by summarizing what they consider the most exciting dimension in the field of regeneration. Jeffrey Macklis said that since graduate school, he had puzzled over the fact that only certain cell types were vulnerable and selectively damaged in different neurodegenerative diseases. “I find that the most exciting question,” he said. “Until we get to neuron subtype specificity and the circuits involved, we could be looking at a lot of unrelated stuff.”
Ana Maria Cuervo noted that neurodegenerative diseases primarily occur in the elderly population, yet researchers still don’t know enough about the physiology of aging to determine which dimensions of the disease are due to aging and which are not.
Mark P. Mattson agreed, noting that in Alzheimer’s disease, events upstream of amyloid, including generic age-related events such as increased oxidative stress, can affect the disease. “We need to understand those if we want to intervene earlier,” he said. He also wondered whether mechanisms being targeted by drug development could also be activated by exercise or energy restriction. A related approach might be to pharmacologically induce mild, intermittent bioenergetic stress in cells.
“The thing that keeps me up at night in this field is biomarkers,” said Clive Svendsen. Molecules that change as the disease progresses are not necessarily causative; indeed, some of the stress responses observed in Alzheimer’s disease might be neuroprotective, and that holds for Huntington’s disease, too, he explained.
An audience member raised the question of sex differences in neurodegenerative disease, noting that even when boys and girls reach the same cognitive milestones, they often arrive there through different routes. In response, Mattson described a study conducted by his group that compared responses to different diets in male versus female mice. At 40% calorie restriction, females shut down their estrus cycle, increased their physical activity, and lost most of their body fat. Males under the same circumstances remained fertile, and their activity levels did not change. From an evolutionary perspective, that could be because females would want to avoid bearing young when food is scarce and they lack the energy to care for them, while males might benefit from inseminating as widely as possible before starving.
Reisa Sperling noted that women respond more adversely to a smaller amount of amyloid beta. “Something about being female means that you are more vulnerable,” she noted. An audience member noted that although men have a higher risk of Parkinson’s disease, females deteriorate faster once diagnosed. Svendsen noted that these observations speak to broader issues in personalizing treatments for neurodegenerative diseases. Sporadic Alzheimer’s disease likely consists of more than one disease, for example. “We’re trying to subdivide ALS into 10 types,” he said.
Panel Discussion
Open Questions
How do hypertrophic astrocytes help repair damaged neuronal circuits?
What is the best way of clinically testing the amyloid beta hypothesis?
Can the signaling mechanism linking neuronal activity at the synapse and gene transcription in the nucleus be therapeutically targeted?
How should Parkinson’s disease therapeutic efforts account for homeostatic plasticity in striatal neurons?
Why do different inflammatory responses have different effects on CNS regeneration? [Giger]
How can growth cone machinery be targeted to promote regeneration?
Can fasting and exercise protect against dementia and neurodegenerative damage in diseases like Alzheimer’s and Parkinson’s?
How do pathogenic proteins cause the breakdown of chaperone-mediated autophagy, and how does such autophagy contribute to the clearance of pathogenic proteins?
Will improvements in mitochondrial function obtained by targeting Parkin, PARIS or related proteins provide therapeutic benefits in Parkinson’s disease?
How does glial cell dysfunction cause neurological disease and can it be therapeutically targeted? [Goldman]
Can a cell therapy consisting of GDNF-releasing astrocytes stave off paralysis in ALS?
Are there neurodegenerative diseases besides ALS in which genes are maladaptively downregulated in astrocytes?
Will drugs that modulate NMDA activity prove beneficial for Alzheimer’s disease?
How well can organoids reflect the pathology of neurodegenerative diseases?
Can reprogramming strategies that turn non-neuronal cells into neurons be used therapeutically?
Can memory be improved with the help of molecular strategies to rejuvenate hippocampal circuitry that degenerates with age?
Will candidates identified through integrative bioinformatics yield drugs that target common mechanisms in neurodegenerative disease?
How can the optimal window of efficacy for different prospective Alzheimer’s disease therapies be determined?
Will reversing the disintegration of slow wave oscillations ameliorate cognitive impairment in Alzheimer’s disease?
The enterprise of drug development is a crucial lifeline for patients and their families. Those who need new and better treatment options depend on researchers to deliver safe and effective therapies as quickly as possible, yet experimental drugs must first be tested on human volunteers before they can be approved for widespread use.
Since the mid-twentieth century, the randomized controlled trial (RCT) has been considered the gold standard in research design because of its ability to overcome bias and yield high-quality evidence. But it comes at a steep cost: the average new drug requires six to eight years of human testing and $100 million to fund the clinical trial phase alone. Moreover, conducting an RCT is not always feasible or ethical, such as during a pandemic or in the case of a very rare disease.
In such cases, alternative trial designs may produce faster and cheaper results, but regulators, patients, and insurers say they must not compromise standards of safety and efficacy. While more rapid development is critical to saving lives, difficult questions remain about how to strike this delicate balance.
On June 21–22, 2017, the Academy convened a colloquium at which academic and pharmaceutical researchers, federal regulators, bioethicists, executives, patient advocates, and lawyers met to discuss the relevance of the randomized controlled trial as the default model for human subject research. Talks focused on the history of the RCT, the ethics and use of alternative trial designs, the risks of foregoing traditional tools, the role of patient advocacy, lessons learned from a recent case study, and the importance of innovation in reforming a flawed system.
With the success of emerging interventions like genomic therapy and immunotherapy, a cultural conversation has opened up around issues such as determining how clinical trials should be designed in this new era, who may participate in research, and when promising therapies should reach the market. Formulating answers to these urgent questions could benefit millions of patients and reshape the future of medicine.
Speakers
Alison Bateman-House, PhD, MPH, MA NYU School of Medicine
Luciana Lopes Borio, MD U.S. Food & Drug Administration
Timothy Caulfield, LLM, FRSC, FCAHS University of Alberta
Anne Cropp, PharmD Early Access Care, LLC
George D. Demetri, MD Dana-Farber Cancer Institute
Rebecca Susan Dresser, JD Washington University in St. Louis
Susan S. Ellenberg, PhD University of Pennsylvania
Howard Fingert, MD, FACP Takeda Pharmaceuticals
Pat Furlong Parent Project Muscular Dystrophy
Barry J. Gertz, MD, PhD Clarus Ventures
Edward M. Kaye, MD Sarepta Therapeutics
Nancy M.P. King, JD Wake Forest School of Medicine
Clifton Leaf Fortune
Holly Fernandez Lynch, JD, BE The Petrie-Flom Center for Health Law Policy, Biotechnology and Bioethics at Harvard Law School
Susan E. Lederer, PhD University of Wisconsin School of Medicine and Public Health
Andrew McFayden The Isaac Foundation
Jane Perlmutter, PhD Gemini Group
Vinay Prasad, MD, MPH Oregon Health and Science University
Amrit Ray, MD Johnson & Johnson
Jane Reese-Coulbourne, MS, ChE MK&A
Christopher Robertson, PhD, JD University of Arizona
Matthew D. Rotelli, PhD Eli Lilly and Company
Eric H. Rubin, MD Merck & Co., Inc.
David Scheer Scheer & Company, Inc.
J. Russell Teagarden NYU School of Medicine Working Group on Compassionate Use & Pre-Approval Access
John (L.P.) Thompson, PhD Columbia University
Meg Tirrell CNBC
Andrea B. Troxel, ScD NYU School of Medicine
Ellis Frank Unger, MD U.S. Food & Drug Administration
Steve Usdin BioCentury
Joanne Waldstreicher, MD Johnson & Johnson
Jeffrey S. Weber, MD, PhD NYU Langone Medical Center
Charles Weijer, MD, PhD Western University
Robert Walker, MD U.S. Department of Health and Human Services
Sponsors
This symposium was made possible with support from
Presented by
Meeting Reports
History and Contribution of Randomized Controlled Trials to Public Health
Speakers
Susan S. Ellenberg, Panelist University of Pennsylvania
Howard Fingert, Panelist Takeda Pharmaceuticals
Susan E. Lederer, Panelist University of Wisconsin School of Medicine and Public Health
Jane Perlmutter, Panelist Gemini Group
Arthur Caplan NYU School of Medicine
Panel Discussion History and Contribution of Randomized Controlled Trials to Public Health
Highlights
American laws regarding drug testing transformed in the 1960s in response to crisis.
New epidemics and targeted genomic therapy are prompting re-evaluation of the RCT.
Industry sponsors have a responsibility to uphold data quality.
A tension may exist between statistical endpoints and patient experiences.
History lessons
The opening panel set the stage for the role RCTs have played in the history of medical research. Susan Lederer, a professor of medical history and bioethics at the University of Wisconsin, described how clinical trials first came to be. In 1747, James Lind was a ship surgeon in the British navy faced with a rash of scurvy cases.
In a bid to stop the outbreak, he divided twelve sailors into six pairs, assigning each pair a different treatment. The groups tried sea water, sulfuric acid, vinegar, cider, a tamarind paste, and oranges and lemons. When that last treatment proved effective, Lind realized he had hit upon a cure.
But randomly assigning patients to control and treatment arms didn’t gain traction until the mid-twentieth century, when World War II prompted a massive influx of federal dollars for research, and the pharmaceutical industry began to transform American medicine. In the early 1960s, after many pregnant women took the drug thalidomide, which caused fetal deaths and birth defects, Congress established laws calling for “adequate and well-controlled” studies that demonstrated efficacy as well as safety before drugs could be approved.
By the 1970s and 80s, the RCT had become the gold standard, said Arthur Caplan, a bioethicist at the NYU School of Medicine. But criticism soon emerged during the HIV epidemic, when many patients pushed back on ethical grounds against being randomized, contending that scientific advancement should not come at the cost of their own lives. While some patient groups praised the RCT model, some observers, like prominent physician Marcia Angell, called into question researchers’ “slavish adherence” to the RCT at the expense of compassion for individual sufferers.
In the current era of epidemics like Ebola and Zika, and with the increasing prevalence of targeted genomic therapies, the relevance of the standard RCT has been called into question with renewed urgency. Some situations, Caplan said, don’t permit the time or expense of a standard RCT. The point has drawn substantial debate, but some argue that patients may be too sick to participate, the need for treatment too immediate, the number of sufferers too small, or adequate oversight unrealistic.
Jane Perlmutter, a patient advocate, offered additional concerns about RCTs, including limits on generalizability when trial subjects don’t comply with the protocol or when eligibility requirements narrow the testing population.
“This gold standard is not so terrific,” she declared. “We need to innovate.”
Patients also don’t like to be guinea pigs, she noted, suggesting one “baby step” toward innovation could be allowing patients to choose whether to join the experimental or control group, rather than be blinded and randomized. Still, randomizing at least some patients remains important for preventing bias and producing generalizable findings. In such a design, researchers must carefully assess the data to ensure that the randomized subjects and the self-selected subjects show no misleading discrepancies. As long as their results align, this type of study can be both efficient and effective.
At the same time, Howard Fingert, senior medical director at Takeda Pharmaceuticals, said that the management of big data is a major responsibility and opportunity for industry sponsors no matter the trial design. Mechanisms for data sharing can improve understanding in trials that are single-armed, propensity-matched (matched on a statistical score estimating each patient’s likelihood of receiving the treatment), or underpowered (statistically unlikely to distinguish a treatment effect from chance because of a small sample size).
And ensuring that data reflect reality is crucial, he said. For example, if a primary endpoint in a study fails, investigators may look for a positive secondary endpoint that wasn’t originally in the protocol—a practice he called “ubiquitous but not legitimate.”
Caplan asked whether fear of adverse outcomes is hindering innovation, citing the 1999 death of an 18-year-old in a clinical trial for gene therapy, which proved a major setback for the entire burgeoning field. Susan Ellenberg, a professor of biostatistics at the University of Pennsylvania, noted that risk aversion is even more common now, as social media reinforces people’s negative beliefs about the prevalence of dire medical events, like vaccine toxicity. But 37 states have passed right-to-try laws, Perlmutter pointed out, which permit terminally ill patients to volunteer for experimental therapies.
Finally, the panelists discussed how the very notion of efficacy has evolved. Decades ago, patients were asked if they were feeling better after an intervention, but now the focus is on objective endpoints—sometimes to a fault, according to Perlmutter. She questioned whether an outcome such as a shrinking cancer tumor, for example, is truly meaningful if the patient’s quality of life remains unaffected. Caplan reminded the group that in dialysis programs for end-stage renal disease in the 1970s, one measure of success was whether patients could return to work. But Ellenberg added that patient improvement and high-quality data are not mutually exclusive.
“Looking for objectivity doesn’t exclude those endpoints,” she said.
Beyond RCTs — Assessing the Need for Alternatives
Speakers
Luciana Lopes Borio, Panelist U.S. Food and Drug Administration
Barry J. Gertz, Panelist Clarus Ventures
Andrea Troxel, Panelist NYU School of Medicine
Charles Weijer, Moderator Western University
Panel Discussion Beyond RCTs — Assessing the Need for Alternatives
Highlights
Choosing between a standard RCT and a non-RCT alternative can be a false dichotomy.
FDA states that all patients deserve the same evidentiary and regulatory standard.
Tension exists between targeting therapies to the right patient population and understanding drug safety and efficacy.
Innovation without compromise
Moderator Charles Weijer, a bioethicist at Western University, kicked off the discussion by asking panelists to offer up important lessons about clinical trials from the past. Barry Gertz, a partner at Clarus, pointed to randomization as the key to reducing bias, and a reason why the RCT ought to remain the default design for testing new agents. Despite its heavy costs, he argued, the overall societal burden would be even higher without it, citing the example of a new device tested for severe hypertension. It produced what seemed at first to be miraculous results—until a subsequent RCT proved it no better than a sham procedure.
“If you don’t test with adequate rigor,” he warned, “society will pay a very substantial price if it’s not as effective as it’s billed to be.”
Luciana Borio, acting chief scientist at the FDA, agreed that classic RCTs still deserve a primary place in the medical ecosystem.
“Nobody has said, ‘I regret doing an RCT,’” she said. “History has played out the other way around; ‘we didn’t know and had to live with the consequences.’”
Even in cases of public health emergencies like Ebola, when some scientists deem RCTs impractical, she maintained that such situations especially demand informative studies.
“We have to be better prepared for the next epidemic,” she added. “We can’t say it’s too hard to do it.”
Weijer commented on the importance of avoiding a false dichotomy between RCTs and alternatives that are still capable of incorporating randomization into their designs. For example, Borio mentioned the ring study carried out during the Ebola crisis, in which groups of people known to be in contact with a patient—those forming a so-called “ring” around the infected person—were randomized to receive a new vaccine either immediately or after a three-week delay.
Andrea Troxel, a professor of population health and biostatistics at NYU, agreed that randomization in some capacity is necessary for the generation of high-quality knowledge, even if a standard RCT “is not always the answer.”
Weijer then raised the complex issue of whether rare diseases necessitate a lower standard of evidence for drug approval, given the scarcity of patients available for clinical trials. While Troxel said that “sufficient evidence” would not be possible to attain in a standard RCT, others disagreed. Gertz pointed to the approval of a drug for spinal muscular atrophy, the most common genetic cause of death in infancy. An RCT was carried out on just 81 patients using a sham version of the spinal-fluid injection procedure as the control, with results showing that the drug yielded profound increases in motor activity.
“It benefited a very small number, but even in that rare disease, the RCT provided some real evidence of benefit and is now available,” he said.
Regarding common diseases, the panelists disagreed on whether a social imperative to accelerate development justifies a lower evidentiary standard. Borio cautioned against the temptation to take shortcuts, and said that all patients deserve the same high regulatory standard, while Troxel similarly warned of the “unintended consequences” of rushing new therapies out to desperate patients.
But at that moment, patient advocate Jane Perlmutter spoke up from the audience, declaring that patients with terminal illnesses want to take risks. “I don’t think we need to lower the bar, but we need to have innovative approaches to deal with deadly diseases,” she said. Gertz suggested that one potential solution would be for the FDA to grant a drug provisional approval using an intermediate or surrogate endpoint, with later testing to confirm the findings.
But if those findings failed to hold up, withdrawing such a drug from the market would be difficult, due to a backlash from patients still demanding access to the drug. He said this scenario has happened once in oncology and “it wasn’t pretty.”
In the unique case of biologically targeted therapy, Weijer posed the intriguing question of whether the science has evolved to such a point that an RCT is not needed to evaluate a drug’s effectiveness. Gertz acknowledged that oncology studies of drugs targeting driver mutations are typically single-arm, but said that a control group is implicit in standard-of-care response rates. He also maintained that a randomized trial would eventually be needed to determine safety as well as efficacy.
All the panelists agreed that helping the diverse range of patients who exist in the real world is crucial, and requires testing beyond narrow subsets who don’t reflect the larger population. More recently, pragmatic trials—which more realistically mimic day-to-day practice settings—are gaining traction, said Troxel.
Such trials aim to clarify how effective a recommended therapy might be, as opposed to explanatory trials, which aim to elucidate mechanisms of action in a new agent. Yet, “those two goals are not necessarily in conflict,” she said. A trial might be explanatory at first, with small numbers and strict inclusion criteria, then broadened to test the therapy on a wider group.
Borio lamented that researchers don’t learn from most patient encounters because of a lack of access to studies. Today, the practice of medicine often takes place separately from the world of research, so when sick patients visit the doctor, their cases are not analyzed to improve the effectiveness of treatments for others. And those who do participate in studies often fail to represent important demographic subgroups.
Currently, the typical participant is a young, white male who lives near a metropolitan area. Borio would like studies to systematically include patients who are not usually included, like those in rural areas, minorities, babies, and pregnant women. Her dream is for every person in the medical system to be able to enroll in a clinical trial.
“There’s no national trial infrastructure in this country, like highways,” she said. “But we need it so we can make the most use of all the knowledge.”
Doing so, however, she acknowledged, would require a major shift in how doctors are educated about clinical trials and how patients view the riskiness of participating in research. Many patients decline to participate because they view research as inherently riskier than regular medicine, which is not necessarily true.
Finding the Right Balance in Learning about Therapies
Speakers
Robert Califf Duke University
Audience Q&A with Robert Califf
Highlights
There are clear benefits to combining research with clinical practice.
RCTs and alternative trials could be combined and run continuously.
The current clinical trial system is deeply flawed and too expensive.
Data sharing among health care systems will speed evaluation and development.
Asking the right questions
The future of human experimentation is at a crossroads. Sick and dying patients need treatment options as quickly as possible, but rushing out new therapies will not necessarily benefit them.
The real issue, said keynote speaker Robert Califf, is how to accelerate drug development but also “get it right.” The current clinical trial enterprise has “gone awry,” he said, calling it unnecessarily expensive.
Califf, a cardiologist and former commissioner of the FDA from 2016 to 2017, charged that the system fails patients by not asking all the important questions. “It’s not that clinical trials are too hard, it’s that there are questions not even being asked because it’s so costly,” he said.
A schism has developed in medicine, between those who think human experimentation should be conducted within the context of daily practice and those who feel it should remain separate from it. In Califf’s view, human experiments benefit when combined with the insights of clinicians, but the layers of oversight and the risk of punitive action dissuade doctors from participating in research. He criticized what he called common myths about RCTs: that they must exclude patients who represent the likeliest use of therapies, and that clinical trials are risky compared with routine care.
“There’s no reason you can’t enroll real-world clinical patients in a trial,” he said.
He also dismissed the notion that doctors regularly review evidence and make the best decisions for their patients—simply because statistically valid evidence often doesn’t exist in medicine. In fact, he said, many practice recommendations are not based on high-quality evidence, such as the CDC’s recent guidelines for prescribing opioids. Unlike the rigorous approach codified in research studies, much of medical practice is rooted in observational and historical data, leaving doctors with a limited set of tools.
His solution is to run RCTs in combination with alternatives, such as pragmatic trials, to reduce the cost and enable better generalizability from the start. In the early phases of therapeutic development, he suggested randomizing from the first patient — “the quickest way to get treatment to patients even with rare diseases.”
Then, in later phases, every interaction with patients would be logged in a digital database. He urged a national paradigm shift toward the sharing of such data across health care networks. Such an effort would be in keeping with the drive to create incentives for health systems to work together that was written into the 21st Century Cures Act, which was signed into law in December 2016, and the user fee reauthorization bill currently working its way through Congress.
He envisions moving away from inefficient one-off studies, done in a parallel track to clinical practice with passive surveillance, to active surveillance, for instance by collecting information on millions of patients in a central database embedded in the health care system, with broad data-sharing among providers. He discussed his involvement in PCORnet, the National Patient-Centered Clinical Research Network, which collects data across hospitals, doctors’ offices, and community clinics in an attempt to help guide healthcare decisions.
He also urged patients to push academic health systems to stop hoarding data, positing that if the medical world shared the business world’s mentality of persistent data collection, progress would accelerate.
“When you do a Google search, you’re participating in up to ten randomized trials,” he noted.
The bottom line, he concluded, is not to abandon RCTs, but to maintain continuous and constant observation as health care is delivered. Just as Google analyzes its data nonstop to improve user experience and anticipate search queries, medicine ought to catalogue and interpret its abundance of real-world data to bring to light the best treatment options for patients.
Weighing the Risks of Randomized Controlled Trials and Alternatives
Speakers
Holly Fernandez Lynch, Panelist Petrie-Flom Center for Health Law Policy, Biotechnology and Bioethics at Harvard Law School
Amrit Ray, Panelist Johnson & Johnson
Matthew D. Rotelli, Panelist Eli Lilly and Company
Robert Walker, Panelist U.S. Dept. of Health and Human Services
Steve Usdin, Moderator BioCentury
Panel Discussion Weighing the Risks of RCTs and Alternatives
Highlights
Controversy surrounds the use of RCTs during public health emergencies.
Platform trials can reduce costs and increase efficiency.
Vulnerable populations need adequate access to clinical trials.
Increasing research participation is key to obtaining comprehensive data.
Maintaining equipoise
Genuine uncertainty about the comparative effectiveness of different interventions is the ethical foundation for randomized clinical testing, a concept known as equipoise. Moderator Steve Usdin, Washington editor of BioCentury, opened the discussion by asking panelists to weigh in on the challenges of maintaining equipoise during therapeutic development. If a given intervention is known to work, the researcher cannot in good conscience withhold it from test subjects, said Harvard bioethicist Holly Fernandez Lynch.
Robert Walker, acting chief medical officer of the Biomedical Advanced Research and Development Authority (BARDA) within the U.S. Department of Health and Human Services, discussed the challenges of maintaining equipoise during the Ebola crisis.
“There was a sense that you can’t conduct a clinical trial in the midst of an emergency response, but we saw that it was in fact feasible,” he said, citing three randomized vaccine trials carried out in Liberia, Guinea, and Sierra Leone.
As to the private groups that granted some patients emergency access to treatment during the public health emergency—without a trial to gauge efficacy—Walker said, “It’s not even information…We really didn’t learn.”
Usdin asked the panelists to describe when to use RCTs versus alternatives. Amrit Ray, chief medical officer at Johnson & Johnson, posed a solution that would retain the benefits of randomization but reduce the costs and facilitate data sharing: integrated platform trials. In the current system, five companies might test five different drugs for the same disease in isolation. Instead, Ray proposed, what if those companies collaborated on one joint trial with a common control arm? This would lessen the burden of duplicate trials and patient recruitment, and allow for faster data collection to evaluate drugs, as with the innovative I-SPY 2 trials in breast cancer.
Then Matthew Rotelli, a director at Eli Lilly, raised the challenge of how to broaden clinical trials to include vulnerable populations like children, “because if you don’t study them,” he said, “you have no way to guide their treatment.” Walker responded that BARDA has a legislative mandate to study all populations, and that special additional oversight for kids would only make the process more onerous. He stated that existing measures, including institutional review board review, are already responsible for ensuring proper informed consent.
Regarding novel trial designs, Usdin worried that even if the data persuades regulators to approve a drug, insurers still might not pay without the legitimacy conferred by a standard RCT. Ray responded that the comparison design of platform trials could potentially mitigate that risk.
Another new paradigm could emerge to meet serious unmet needs—allowing patients to risk taking a promising experimental drug faster in exchange for the sponsor collecting comprehensive data post-market. While Fernandez Lynch called this an “ideal world,” she was somberly realistic about its prospects. New laws would have to be passed, and sponsors would need to be held accountable.
Rotelli envisioned a future in which clinical trials never end. As new drugs come out, they are added to ongoing randomized platform trials for further study against known drugs. Less effective ones eventually get dropped, while electronic medical records facilitate the data collection. But obtaining and sharing that data would require a “dramatic shakeup of systems,” Fernandez Lynch said.
Right now, research participation requires robust informed consent, with autonomy prized as the highest value. It operates in a separate layer from clinical care, overseen by institutional review boards that regularly review protocols to protect participants from exploitation.
An opposite model, in which data collection is routine, would require most patients to participate by default, rather than choosing to opt in.
“I don’t know that I would go to that end of the spectrum,” she said. “It’s so different from how we’ve done research, given concerns about historical abuses.”
Expecting compulsory participation makes many observers in the medical community uneasy. The twentieth century, after all, is filled with brazen examples of vulnerable subjects who were harmed or killed for the sake of science, including concentration camp victims and the cohort of African-American men with syphilis who, unbeknownst to them, were denied penicillin by Tuskegee researchers. Such infamous cases led to the establishment of rules for human research participation via the 1947 Nuremberg Code and the 1979 Belmont Report, which enshrined the ethical pillars of autonomy and informed consent.
Unless such historical abuses fade in the collective consciousness, any future reforms that dial back the protection of individuals are unlikely to be popular.
Ethics and Patient Advocacy in Clinical Trial Design
Speakers
Rebecca Susan Dresser, Panelist Washington University in St. Louis
Andrew McFadyen, Panelist The Isaac Foundation
Jane Reese-Coulbourne, Panelist MK&A
J. Russell Teagarden, Panelist NYU School of Medicine Working Group on Compassionate Use & Pre-Approval Access
Alison Bateman-House, Moderator NYU School of Medicine
Panel Discussion Ethics and Patient Advocacy in Clinical Trial Design
Highlights
A cultural clash exists between patients and researchers.
Many specialists agree that clinical design could be improved by involving patient experts early on.
Early access programs allow patients to try experimental therapies outside of clinical trials.
Expanding inclusion criteria would allow more people to participate in research.
Balancing acts
When it comes to the policies and guidelines that govern drug development, patients and researchers often find themselves at odds, clashing over aspects such as trial design, compliance protocols, and early access. Differing motivations lie at the heart of the conflict, suggested moderator Alison Bateman-House, a bioethicist at NYU. Researchers want to help push science forward, while patients want access to therapies that will help them and their loved ones.
“When you’re a patient or a parent, [those with] the option of being in a trial think, ‘What would benefit me most?’” said Rebecca Susan Dresser, a cancer survivor and bioethicist at Washington University in St. Louis. “Altruism is low on the list.”
Many of those who can’t participate in trials struggle to persuade sponsors to let them try experimental therapies under the FDA’s program of expanded access, also known as compassionate use. Companies may be reluctant to participate in compassionate use in part because giving patients a drug outside of a trial means a lost opportunity to gain valuable data on safety and efficacy.
Advocacy groups, such as Andrew McFadyen’s Isaac Foundation, fight for access despite these concerns, because offering patients with rare diseases otherwise unavailable treatments can mean the difference between life and death.
“While RCTs are good at getting approval of drugs, [they’re] a hindrance in getting them to our children,” he said.
Patients and researchers should work together from the start of designing a trial so their goals can coincide, the panelists all suggested. Too often, Bateman-House said, patients are silent partners, which can translate into a protocol that’s too lengthy, intensive, or inconvenient. For example, Dresser rejected an offer to join a cancer trial for her advanced illness because its timeline would have meant a delay in starting treatment.
Jane Reese-Coulbourne, a former cancer trial participant and now consultant, stressed the importance of companies bringing in expert patients for feedback. As a patient advisor, she helped Genentech, a major biotech company responsible for several dozen pioneering drugs, understand why its recruitment for one particular trial was so low: the protocol required enduring four bone marrow aspirations, a painful procedure that extracts fluid from the marrow.
But when collaborating, patient advocacy groups can sometimes find themselves toeing a tricky ethical line between accepting funding from pharmaceutical companies and maintaining their organizational independence.
“Often pharma does bring us in for advice,” McFadyen said, “but sometimes the expectation there is that we’re going to be the people out there making sure [the drug] gets reimbursed.”
Some advocacy groups, he said, will accept payment in exchange for not asking tough questions about access. J. Russell Teagarden, a pharmacist, executive and educator, acknowledged that there are advocacy groups vulnerable to coercion.
“But on the other hand, there are some groups so sophisticated that maybe the companies are vulnerable,” he said, to laughs.
Another source of conflict between companies and patients is rhetoric. Too often, companies employ hyperbolic language like “breakthrough” to attract investors, while desperate patients line up for trials that have not actually even begun.
Bateman-House said that a major source of confusion is the lack of scientific literacy in the general population, which is prone to believing that science holds all the answers. In fact, the vast majority of drugs fail in the lab long before the point of human trials: only one in a thousand compounds graduates to clinical testing, and of those, only about 10 percent eventually cross the finish line.
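Those attrition figures can be combined into a rough back-of-the-envelope estimate. The short Python sketch below assumes, as the source does not state, that the two stages are independent:

```python
# Back-of-the-envelope drug attrition estimate from the figures cited above.
# Assumption (ours, not the source's): the two stages multiply independently.

preclinical_success = 1 / 1000  # compounds that graduate to clinical testing
clinical_success = 0.10         # of those, the share that gains approval

overall = preclinical_success * clinical_success
print(f"Overall odds of approval: about 1 in {round(1 / overall):,}")
# prints "Overall odds of approval: about 1 in 10,000"
```

In other words, of ten thousand compounds entering the lab, roughly one reaches the market, consistent with the figures cited in the talk.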
Issues about trial demographics came to the fore during a vibrant Q&A. One audience member expressed concern over how to expand clinical trials to underrepresented groups, like pediatric cancer patients. McFadyen suggested that is possible if parents push for it. Another spoke of disappointment over the fact that only 10 percent of patients qualify for trials. Several panelists responded that legislation is underway in the Senate to expand inclusion criteria so more people can participate in research.
Modern Trends in Clinical Drug Development
Speakers
Janet Woodcock U.S. Food and Drug Administration
Audience Q&A with Janet Woodcock
Highlights
Clinical trials are a prohibitively expensive element of drug development.
Trial designs, no matter how novel, will only be as good as the knowledge underlying them.
Randomization remains an important tool but is not always necessary in novel designs.
The system requires reform to incentivize continuous, collaborative platform trials.
Designing alternatives
The clinical trial system in the United States is broken because it isn’t “fit for purpose,” argued Janet Woodcock, director of the Center for Drug Evaluation and Research at the FDA. Because trials are so expensive and time-consuming, many questions remain unanswered after a drug is approved, leading to health care practices that too often lack high-quality evidence.
She described the waste that occurs when a sponsor sets up a trial, tests an intervention, and then walks away, either because the trial fails or because the drug reaches the market. Instead, the future ought to bring more continuous, ongoing platform trials that can answer multiple questions at once, with data shared among health care networks.
“The goal,” she said, “is not to test a specific therapy, but to bring about continuous improvement in disease outcomes.”
In the current era, rapidly evolving science is driving novel research designs. Molecularly targeted therapies are on the rise, along with drugs that are “disease agnostic,” such as those targeting a specific biomarker that appears across tumor types. For rare, life-threatening diseases that lack treatment options, she suggested it may be adequate to test a targeted therapy, which is expected to show a large treatment effect, in a single-arm trial with an extended phase one cohort.
“I’m going to say something heretical, but oncology has been doing this, and it’s perfectly reasonable in my mind under these circumstances,” she said.
Randomization, in her mind, is a very useful tool but not an imperative. Still, “it’s foolish not to use it if at all possible,” she added, such as when disease outcomes are highly variable or researchers don’t expect a home-run treatment effect.
But as more development programs work on very rare and orphan diseases, the FDA is approving drugs based on limited trials that may lack randomization, as with an antidote for methotrexate toxicity approved on the basis of data from 22 patients. Such cases involve a serious unmet medical need, a well-understood disease, highly plausible biomarkers that can be easily measured in a standardized way, and a drug that yields a large treatment effect.
She cautioned, however, that the design of a trial is only as good as the quality of the knowledge underlying it. Biomarkers may mistakenly drive a development program, for example, if they are not reproducibly measured, accurate, or predictive. Before starting trials in humans, researchers should be confident that a biomarker is “reasonably likely to predict clinical benefit,” she said, suggesting that randomization usually remains the best design in this situation.
Calling for greater efficiency overall, she urged a shift toward collaborative platform trials that integrate research and practice. But implementing changes to the status quo poses a serious challenge.
“In this translational world of continuous improvement in medicine, nobody is charged to do it and that’s the real problem,” she acknowledged. Since there isn’t a key stakeholder, she urged patients to rise up and demand reform.
Lessons from the Eteplirsen Drug Trial for Duchenne Muscular Dystrophy
Speakers
Pat Furlong, Panelist Parent Project Muscular Dystrophy
Edward M. Kaye, Panelist Sarepta Therapeutics
Ellis Frank Unger, Panelist U.S. Food and Drug Administration
David Scheer, Panelist Scheer & Company, Inc.
Meg Tirrell, Moderator CNBC
Panel Discussion Lessons from the Eteplirsen Drug Trial for Duchenne Muscular Dystrophy
Highlights
Patient-researcher collaboration could enhance the planning of clinical trials.
Surrogate endpoints should comport with clinical gains.
Understanding a disease’s natural history and its biomarkers is crucial to guiding high-quality research.
Limited trials can lead to accelerated approvals, but questions may remain about real-world effectiveness and who will pay for the drug.
Case study
The FDA’s accelerated approval in late 2016 of the drug eteplirsen for Duchenne muscular dystrophy (a disorder that predominantly manifests in young boys and progresses rapidly), based on controversial data from a 12-patient randomized trial, set off a lively discussion. Patients with this rare disease progressively lose muscle function and had no treatment options before eteplirsen came on the market, gaining approval even though its effectiveness is still being debated. Moderator Meg Tirrell of CNBC opened by asking the panelists whether the case established any precedents.
Ellis Unger, director of Drug Evaluation-I in the Office of New Drugs at the FDA, offered his concern about the trial’s surrogate endpoint: the drug was approved based on a small increase in the amount of dystrophin, a key protein, found in skeletal muscle. However, whether the increase yields meaningful clinical benefits remains an open question and a flash point for controversy. Unger said he worried that the case will prompt other companies to present similarly limited data and expect to gain FDA approval.
But Pat Furlong, a patient activist and the mother of two boys who died of the disease, argued that the drug did show improvement in the gaits of those who took it, and that desperate patients should not be prevented from taking risks. “It has rocked my world,” she said, speaking of the hope it’s brought to patients who now have the option to try a drug where before none existed.
Edward Kaye, CEO of Sarepta Therapeutics, the drug’s sponsor, contended that small amounts of a biologically active component can work, and said that “the bigger precedent is patient involvement.” Throughout the study, his company worked with patients’ families to understand what quality-of-life outcomes would be most meaningful from a drug, such as the ability to go to the bathroom independently. He also collaborated with other companies to accelerate the search for biomarkers. Their collective pact to publish joint findings represents a notable shift in how research is done.
“It’s not one company against another,” he said, “it’s a number of companies and patient groups against the disease.”
In response to a question about the importance of biomarkers in raising capital for research, David Scheer, an entrepreneur in the life sciences, said that they provide crucial preclinical data.
“Without having some sort of biomarker technology that can facilitate translational medicine, we might be shooting in the blind,” he explained.
The discussion turned heated when Unger described the unusual public comment period during an FDA advisory meeting prior to the drug’s approval. While some of the boys from the trial declared their improvement, Unger said the data showed they were in fact deteriorating. He also described the comment period as “a circus,” because in his view patients and families went over the line, taking too long, and in some instances heckling the committee.
Shortly thereafter, patient advocate Andrew McFadyen approached the microphone and admonished Unger. McFadyen told him to show more respect for families with dying children and to listen to their stories for a year if necessary. “If you can’t do that, you should give your chair up to someone else,” he declared.
While Unger apologized for using the word “circus,” he defended his remarks, recounting the conduct of the session.
“When the deputy director of the Neurology Division told a very personal story of tragedy, he was heckled,” he continued. “We have to draw a line somewhere, and we thought that the time we allotted was reasonable… the catcalls, and the heckling, it was very disheartening.”
Former FDA commissioner Robert Califf stepped up to offer additional insight into the process. “Advisory committees are so named because they do not make the decisions,” Califf said. “Full time government employees are the ones who make the decisions and most people are still confused by that.”
“There was no one in the FDA who thought that the studies were well done,” he added. “I don’t think these decisions would have been so hard otherwise.”
Another audience question addressed the cost of the drug: At a price of $300,000 annually per patient, insurers are balking; they want to see real-life evidence of effectiveness before paying for it. Kaye’s company has started a registry to gather such data, but he said that new therapies in small populations entail expensive drugs because limited numbers of patients receive them.
“So, the cost of developing the drug is transferred to them,” he said. “That is the cost of innovation.”
A Way Forward
Speakers
George D. Demetri, Panelist Dana-Farber Cancer Institute
Anne Cropp, Panelist Early Access Care, LLC
Christopher Robertson, Panelist University of Arizona
John (L.P.) Thompson, Panelist Columbia University
Donald Berry, Moderator MD Anderson Cancer Center
Panel Discussion A Way Forward
Highlights
Platform trials are a key way to merge research and practice.
RCTs are still considered the definitive trial design for reducing bias.
Sharing well-established trial templates for common diseases will eliminate redundant efforts.
Sponsors should make compliance less burdensome for patients.
Refinements or reinvention?
The lively panelists in the afternoon session had no qualms about offering blunt criticism of the status quo. Echoing a theme of the conference, moderator Donald Berry, a statistician at MD Anderson Cancer Center, underscored the need to merge research with practice.
“It’s inevitable,” he said. “We have to figure out how to do it, and platform trials may be the way.”
George Demetri, a medical oncologist at the Dana-Farber Cancer Institute, summed up the situation more derisively: “The clinical trial system is broken,” he said, “and we’re not being honest with the public about what we do and don’t know.” He argued that the benefits of precision medicine and testing for mutations are oversold to patients, particularly by academic institutions.
Christopher Robertson, a law professor at the University of Arizona, raised concerns about bias skewing results, especially in cancer drugs with small effect sizes. Blinding not only patients but also investigators and statisticians is crucial to obtaining legitimate outcomes. Another way to reduce bias is to specify endpoints before the start of a trial, which guards against generating erroneous probability values, he said.
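Robertson’s point about prespecifying endpoints can be illustrated with a small multiplicity calculation (our sketch, not from the talk): if a trial measures many endpoints and the drug truly has no effect, the chance that at least one endpoint crosses p < 0.05 by luck alone grows quickly.

```python
# Multiple-endpoint multiplicity: under the null hypothesis, each endpoint has
# a 5% chance of a false positive, so with n independent endpoints the chance
# of at least one spurious "significant" result is 1 - 0.95**n.
alpha = 0.05
for n_endpoints in (1, 5, 10, 20):
    p_spurious_hit = 1 - (1 - alpha) ** n_endpoints
    print(f"{n_endpoints:2d} endpoints -> {p_spurious_hit:.0%} chance of a false positive")
```

With ten endpoints chosen after the fact, an ineffective drug has roughly a 40 percent chance of appearing to “work” on at least one of them, which is why prespecification matters.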
Anne Cropp, chief scientific officer of Early Access Care, suggested that one way to improve efficiency in the development of new protocols is to make data from the NIH more widely available so that companies working on common diseases like Alzheimer’s and diabetes can use established trial templates, with well-defined endpoints, rather than starting from scratch. She also urged physicians to become more literate in drug testing, citing a Tufts study that revealed a surprising lack of knowledge among doctors regarding the ins and outs of clinical trials.
John Thompson, a professor of biostatistics and neurology at Columbia, cautioned against a premature rejection of double-blind, placebo-controlled RCTs. He worked on just such a trial involving a treatment for ALS, despite being told it was impossible because patients would decline to participate.
“Must RCTs give way?” he asked. “A flexible no, although many people will hear it as a yes, because I envisage considerable fast-moving changes.…It’s a matter of adding to and expanding rather than abandoning the existing techniques.”
A key to the ALS trial’s success, he said, was explaining to patients the importance of its design to gain their trust and cooperation. To help with recruitment, he suggested that disease advocacy groups post the details of ongoing trials on their websites, after networking with investigators, so that patients can find the information and take it to their doctors.
During the audience comment period, Ellis Unger of the FDA offered additional suggestions. He mentioned a “very powerful” trial design that he believes is not used enough: randomized withdrawal, in which responders stop taking a drug and investigators observe what happens. He also urged sponsors to make compliance easier for patients by letting them participate from home, via Skype, when possible, and to limit the number of scans and tests they must undergo.
“The FDA does not need forty thousand blood tests or four X-rays per person to approve a drug,” he said dryly.
Further Readings
Journal Articles
Poor Physician and Nurse Engagement Contributes to Low Patient Recruitment Rates
Tufts Center for the Study of Drug Development, Jan 2017.
Nature Reviews Clinical Oncology 9(4):199-207, Nov 2011.
Ethics Panel Wrap-up
What is the Future of Accelerated Development and the Randomized Controlled Trial Standards?
Speakers
Nancy King, Panelist Wake Forest School of Medicine
Vinay Prasad, Panelist Oregon Health and Science University
Eric H. Rubin, Panelist Merck & Co., Inc.
Jeffrey S. Weber, Panelist NYU Langone Medical Center
Timothy Caulfield, Moderator University of Alberta
Panel Discussion Ethics Panel Wrap-up
Highlights
RCTs remain a key method for overcoming bias and threats to validity.
But not every drug requires a large, expensive RCT for initial approval.
Platform trials retain randomization while operating more efficiently.
The system requires reform to incentivize collaboration and transparency.
One tool along the continuum
The final presenters agreed that RCTs are not going away anytime soon, but there are changes in how and when they are being used.
Vinay Prasad, a hematologist-oncologist at Oregon Health and Science University, defended the RCT against common criticisms. If such trials don’t reflect real-world populations, he said, that is due to strict inclusion criteria, not the design. And if endpoints don’t represent clinical outcomes, the RCT itself is not to blame. That would be “like blaming the Wright brothers for United Airlines,” he quipped.
Eric Rubin, vice president of oncology clinical research at Merck, argued that the title of the conference should not be, “Must RCTs give way?” but rather, “Should RCTs not get in the way?” For example, a single-arm study in a drug with a large effect size can lead to initial approval faster than a standard RCT. Finding such a drug requires high-quality basic research and reproducible preclinical studies, he added.
Jeffrey Weber, deputy director of the Perlmutter Cancer Center at NYU, acknowledged that everyone “worships the god of the randomized, phase three trial,” but he expressed concern over its exploding costs and suggested that modifications in some cases could be appropriate. He proposed using more novel endpoints, such as landmark survival at one and two years, combined with a quality-of-life questionnaire.
“In the immunotherapy era,” he said, “I think we can afford to be more flexible than to do an RCT with a thousand participants and the only endpoint is survival.”
But Prasad was quick with a rebuttal, arguing that progression-free survival is based on an arbitrary line in the sand and is not necessarily correlated to eventual survival.
Rubin agreed with Weber that RCTs are the definitive way to demonstrate benefit, but said that, at least in oncology, an RCT with overall survival as the endpoint is not required to approve a drug. He suggested that it is possible to deliver promising therapies quickly, without sacrificing randomization, by conducting early-phase explanatory trials for initial registration and then a post-market RCT to verify early results.
For example, at Merck he led the development of a melanoma drug that was approved after a single-arm study in patients with advanced illness who were out of options. Afterward, Merck followed up with a randomized trial in less sick patients to compare the new drug against existing alternatives, a setting in which researchers still retained equipoise. This scenario shows it’s possible to do both: accelerate approval, in this case by three years, and still conduct a confirmatory RCT.
Rubin reiterated an idea that got a lot of play across the conference: the notion that collaborative platform trials could streamline the process of matching drugs to patients across a spectrum of disease, as was the case in I-SPY2. Such trials benefit patients by letting them be assigned to an arm no matter their cancer.
But Nancy King, a professor of social sciences and health policy at Wake Forest, pointed out that academic institutions and businesses aren’t structured to reward transparency and collaboration over competition.
“You have to be able to turn the battleship,” she remarked.
In a discussion on the exorbitant expense of running trials, Weber said that regulatory scrutiny has proliferated over the last decade, adding to the cost burden.
“We have monitors that monitor the monitors,” he joked.
Ultimately, even as some areas of drug development move away from the RCT, it remains a fundamental tool along a continuum of designs. One reason it may never vanish is that it can act as a bulwark against the fallibility of human nature.
“Our capacity for hope makes us incredibly susceptible to inferior levels of evidence,” Prasad reflected.
In his opinion, the beauty of randomization is that with a modest effect size, there is no better way to tease apart what works.
Open Questions
How can researchers and industry sponsors be incentivized to collaborate on platform trials and share data?
How can researchers accelerate drug development without sacrificing the collection of data on efficacy and safety?
How can society increase scientific literacy so that the public sees research participation as worthwhile, rather than inherently risky?
How can the medical profession systematically integrate research and clinical practice?
If a drug is granted accelerated approval after a single-arm trial, how can the sponsor be held accountable for collecting post-market data on safety and efficacy?
Will insurers pay for drugs that are approved based on limited or non-randomized trials?
During the next humanitarian emergency, is it feasible and ethical to test interventions with an RCT?
How can costs be managed so that an RCT can be completed without prohibitive expense?
Can decreasing the amount of oversight streamline trials without sacrificing protection for research participants?
Can sponsors and patients work together to design trials at the earliest stages, ensuring that compliance in a protocol is not overly burdensome?
Should a standard RCT remain the default design?
Is choosing between an RCT and an alternative design a false dichotomy?
How can trials better accommodate broad patient populations to yield real-world data and integrate more patients into research?
Who should decide how much risk is appropriate in trying an unproven therapy?
Will big data solve quandaries of speed and the historical limits on the RCT?
Should rare diseases with limited numbers of patients available for trials have a lower evidentiary standard for drug approval?
What can be done to rein in exorbitant drug prices?
Will automation and electronic medical records help motivate doctors to participate in large-scale practical trials?
How can society implement the changes required to modernize the clinical trial enterprise?