9 Young Scientists Are Innovating to Transform Our World for a Better Future

Overview

The Blavatnik Awards for Young Scientists in the United Kingdom are the largest unrestricted prize available to early career scientists in the Life Sciences, Physical Sciences & Engineering, and Chemistry in the UK. The three 2021 Laureates each received £100,000, and two Finalists in each category received £30,000 per person. The honorees are recognized for their research, which pushes the boundaries of our current technology and understanding of the world. In this event, held at the historic Banqueting House in London, the UK Laureates and Finalists had a chance to explain their work and its ramifications to the public.

Victoria Gill, a Science and Environment Correspondent for the BBC, introduced and moderated the event. She noted that “Science has saved the world and will continue to do so,” and stressed how important it is for scientists to engage the public and share their discoveries at events like this. This theme recurred throughout the day.

Symposium Highlights

  • Single-cell analyses can reveal how multicellular animals develop and how our immune systems deal with different pathogens we encounter over the course of our lives.
  • Viruses that attack bacteria—bacteriophages—may help us fight antibiotic-resistant bacterial pathogens.
  • Fossils offer us a glimpse into what life on Earth was like during the eons in which it thrived before mammals took over.
  • Stacking layers of single-atom-thick sheets can make new materials with desired, customizable properties.
  • Memristors are electronic components that can store many memory states, not just on and off, and can be used to build faster and more versatile computer chips than those in use today.
  • The detection of the Higgs boson, which had been posited for decades by mathematical theory but was very difficult to detect, confirmed the Standard Model of particle physics.
  • Single-molecule magnets can be utilized for high-density data storage—if they can retain their magnetism at high enough temperatures.
  • When examining how life first arose on Earth, we must consider all of its requisite components and reactions in aggregate rather than assigning primacy to any one of them.

Speakers

Stephen L. Brusatte
The University of Edinburgh

Sinéad Farrington
The University of Edinburgh

John Marioni
European Bioinformatics Institute and University of Cambridge

David P. Mills
The University of Manchester

Artem Mishchenko
The University of Manchester

Matthew Powner
University College London

Themis Prodromakis
University of Southampton

Edze Westra
University of Exeter

Innovating in Life Sciences

Speakers

John Marioni, PhD
European Bioinformatics Institute and University of Cambridge, 2021 Blavatnik Awards UK Life Sciences Finalist

Edze Westra, PhD
University of Exeter, 2021 Blavatnik Awards UK Life Sciences Finalist

Stephen Brusatte, PhD
The University of Edinburgh, 2021 Blavatnik Awards UK Life Sciences Laureate

How to Build an Animal

John Marioni, PhD, European Bioinformatics Institute and University of Cambridge, 2021 Blavatnik Awards UK Life Sciences Finalist

Animals grow from one single cell: a fertilized egg. During development, that cell splits into two, and then into four, and so on, creating an embryo that grows into the billions of cells comprising a whole animal. Along the way, the cells must differentiate into all of the different cell types necessary to create every aspect of that animal.

Each cell follows its own path to arrive at its eventual fate. Traditionally, the decisions each cell has to make along that path have been studied using large groups of cells or tissues; this is because scientific lab techniques have typically required a substantial amount of starting material to perform analyses. But now, thanks in large part to the discoveries of John Marioni and his lab group, we have the technology to track individual cells as they mature into different cell types.

Marioni has created analytical methods capable of observing patterns in all of the genes expressed by individual cells. Importantly, these computational and statistical methods can be used to analyze the enormous amounts of data generated from the gene expression patterns of many individual cells simultaneously. In addition to furthering our understanding of cell fate decisions in embryonic development, this area of research—single cell genomics—can also be applied to many other processes in the body.

One relevant application is to the immune system: single cell genomics can detect immune cell types that are activated by exposure to a particular pathogen. To illustrate this, Marioni showed many gorgeous, colorized images of individual cells, highlighting their unique morphology and function. These included images showing the profiles of different types of T cells elicited by infection with SARS-CoV-2 (the virus that causes COVID-19).

The cells were computationally grouped by genetic profile and graphed to show how the different cell types correlated with disease severity. There are many other clinical applications of his research into genomics. For instance, he said, if we know exactly which cell types in the body express the targets of specific drugs, we will be better able to predict that drug’s effects (and side effects).
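The computational grouping described above can be sketched in miniature. The following is an illustrative toy, not Marioni's actual pipeline: the "cells," "genes," expression values, and cluster count are all invented, and real single-cell analyses use far more sophisticated methods.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy expression matrix: 100 "cells" x 5 "genes", drawn from two hypothetical
# cell populations with distinct expression profiles (all values invented).
type_a = rng.normal(loc=[5, 5, 0, 0, 0], scale=0.5, size=(50, 5))
type_b = rng.normal(loc=[0, 0, 5, 5, 5], scale=0.5, size=(50, 5))
cells = np.vstack([type_a, type_b])

def kmeans(X, k, iters=20):
    """Minimal k-means: group cells whose expression profiles are similar."""
    centroids = X[:: len(X) // k][:k].copy()  # deterministic, spread-out start
    for _ in range(iters):
        # Assign each cell to its nearest centroid (its putative "cell type")
        labels = ((X[:, None, :] - centroids) ** 2).sum(axis=-1).argmin(axis=1)
        # Move each centroid to the mean profile of the cells assigned to it
        centroids = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels

labels = kmeans(cells, k=2)
# Cells sharing an expression pattern land in the same group, so a cell
# "type" can be defined by its genetic profile rather than its appearance.
```

In practice, tools built on these ideas also reduce dimensionality and correct for technical noise before clustering; correlating cluster membership with clinical variables such as disease severity is then a separate statistical step.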

In addition to his lab work, Marioni is involved in the Human Cell Atlas initiative, a global collaborative project whose goal is to genetically map all of the cell types in healthy human adults. When a cell uses a particular gene, it is said to “transcribe” that gene to make a particular protein—thus, the catalog of all of the genes one cell uses is called its “transcriptome.” The Human Cell Atlas is using these single cell transcriptomes to create the whole genetic map.

This research is redefining how we think of cell types, transforming our definition of a cell type from the way it looks to its genetic profile.

Bacteria and Their Viruses: A Microbial Arms Race

Edze Westra, PhD, University of Exeter, 2021 Blavatnik Awards UK Life Sciences Finalist

All organisms have viruses that target them for infection; bacteria are no exception. The viruses that infect bacteria are called bacteriophages, or phages.

Edze Westra’s lab studies how bacteria evolve to defend themselves against infection by phage and, specifically, how elements of their environment drive the evolution of their immune systems. Like humans, bacteria have two main types of immune systems: an innate immune system and an adaptive immune system. The innate immune system works similarly in both bacteria and humans by modifying molecules on the cell surface so that the phage can’t gain entry to the cell.

In humans, the adaptive immune system is what creates antibodies. In bacteria, the adaptive immune system works a little bit differently—a gene editing system, called CRISPR-Cas, cuts out pieces of the phage’s genome and uses them as templates to identify other phages of the same type. Using this method, the bacterial cell can quickly discover and neutralize any infectious phage by destroying the phage’s genetic material. In recent years, scientists have harnessed the CRISPR-Cas system for use in gene editing technology.
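The core memory-and-matching logic can be illustrated with a short sketch. This is a loose analogy, not molecular reality: real spacers are stored in CRISPR arrays and matching is carried out by RNA-guided Cas proteins, and the sequences below are invented.

```python
# Toy sketch of CRISPR-Cas adaptive immunity as sequence memory and matching.
def acquire_spacer(phage_genome, start=10, length=8):
    """Store a short piece of the invader's genome to remember it by."""
    return phage_genome[start:start + length]

def is_recognized(incoming_genome, spacers):
    """An incoming phage is neutralized if any stored spacer matches it."""
    return any(spacer in incoming_genome for spacer in spacers)

# First infection: the cell survives and keeps a spacer from the phage.
memory = [acquire_spacer("ATGCCGTTAGGCTTACGGATCCA")]

# Later infections: the same phage type is recognized; an unrelated one is not.
same_phage = is_recognized("ATGCCGTTAGGCTTACGGATCCA", memory)
new_phage = is_recognized("TTTTAAAACCCCGGGGTTTTAAA", memory)
```

The key property the sketch captures is specificity: only phages carrying the remembered sequence are targeted, which is why this system is "adaptive" in a way the innate surface-modification defenses are not.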

Westra wanted to know under what conditions bacteria use their innate immune system versus their adaptive immune system: How do they decide?

In studies using the bacterial pathogen Pseudomonas aeruginosa, his lab found that the decision to use adaptive vs. innate immunity is controlled almost exclusively by nutrient levels in the surrounding environment. When nutrient levels are low, the bacteria use the adaptive immune system, CRISPR-Cas; when nutrient levels are high, the bacteria rely on their innate immune system. He recognized that this means we can artificially guide the evolution of bacterial defense by controlling elements in their environment.

When we need to attack pathogenic bacteria for medical purposes, such as in a sick or infected patient, we turn to antibiotics. However, many strains of bacteria have developed resistance to antibiotics, leaving humans vulnerable to infection.

Additionally, our antibiotics tend to kill broad classes of microbes, often damaging the beneficial species we harbor in our bodies along with the pathogenic ones we are trying to eliminate. Phage therapy—a medical treatment where phages are administered to a patient with a severe bacterial infection—might be a good way to circumvent antibiotic resistance while also attacking bacteria in a more targeted manner, harming only those that harm us and leaving the others be.

Although it is difficult to manipulate bacterial nutrients within the context of a patient’s body, we can use antibiotics to direct this behavior. Antibiotics that limit bacterial growth will induce the bacteria to use the CRISPR-Cas strategy, mimicking the effects of a low-nutrient environment; antibiotics that work by killing bacteria outright will induce them to use their innate defenses. In this way, it may be possible to direct the evolution of bacterial defense systems in order to reveal their weaknesses and target them with phage therapy.

The Rise and Fall of the Dinosaurs

Stephen Brusatte, PhD, The University of Edinburgh, 2021 Blavatnik Awards UK Life Sciences Laureate

Stephen Brusatte is a paleontologist, “and paleontologists,” he says, “are really historians.” Just as historians study recorded history to learn about the past, paleontologists study prehistory for the same reasons.

The Earth is four and a half billion years old, and humans have only been around for the last three hundred and fifty thousand of those years. Dinosaurs were the largest animals ever to walk the Earth; they started out around the size of house cats, and over eighty million years they evolved into the giant T. rexes, Stegosauruses, and Brontosauruses in our picture books.

They reigned until a six-mile-wide asteroid struck the Earth sixty-six million years ago at the end of the Cretaceous period, extinguishing them along with seventy-five percent of the other species on the planet. Brusatte called this day “the worst day in Earth’s history.” However, the demise of dinosaurs paved the way for mammals to take over.

Fossils can tell us a lot about how life on this planet used to be, how the earth and its occupants respond to climate and environmental changes, and how evolution works over long timescales. Particularly, fossils show how entirely new species and body plans emerge.

Each fossil can yield new knowledge and new discoveries about a lost world, he said. It can teach us how bodies change and, ultimately, how evolution works. It is from fossils that we know that today’s birds evolved from dinosaurs.

Life Sciences Panel Discussion

Victoria Gill started the life sciences panel discussion by asking all three of the awardees if, and how, the COVID-19 pandemic changed their professional lives: did it alter their scientific approach or were they asking different questions?

Westra replied that the lab shutdown forced different, non-experimental approaches, notably bioinformatics on old sequence data. He said that they found mobile genetic elements, and the models of how they moved through a population reminded him of epidemiological models of COVID spread.

Marioni shared that he was inspired by how the international scientific community came together to solve the problem posed by the pandemic. Everyone shared samples and worked as a team, instead of working in isolation as they usually do. Brusatte agreed that enhanced collaboration accelerated discoveries and should be maintained.

Questions from the audience, both in person and online, covered a similarly broad range of topics. An audience member asked about where new cell types come from; Marioni explained that if we computationally look at gene transcription changes in single cells over time, we can make phylogenetic trees showing how cells with different expression patterns arise.

A digital attendee asked Brusatte why birds survived the asteroid impact when other dinosaurs didn’t. Brusatte replied that the answer is not clear, but it is probably due to a number of factors: they have beaks so they can eat seeds, they can fly, and they grow fast. Plus, he said, most birds actually did not survive beyond the asteroid impact.

Another audience member asked Brusatte if the theory that the asteroid killed the dinosaurs was widely accepted. He replied that it is widely accepted that the impact ended the Cretaceous period, but some scientists still argue that other factors, like volcanic eruptions in India, were the prime mover behind the dinosaurs’ demise.

Another viewer asked Westra why the environment impacts a bacterium’s immune strategy. He answered that in the presence of antibiotics that slow growth, infection and metabolism are likewise slowed so the bacteria simply have more time to respond. He added that the level of diversity in the attacking phage may also play a role, as innate immunity is better able to deal with multiple variants.

To wrap up the session, Victoria Gill asked about the importance of diversity and representation and wondered how to make awards programs like this more inclusive. All three scientists agreed that it is hugely important, that the lack of diversity is a problem across all fields of research, that all voices must be heard, and that the only way to change it is by having hard metrics to rank universities and departments on the demographics of their faculty.

Innovating in Physical Sciences & Engineering

Speakers

Artem Mishchenko, PhD
The University of Manchester, 2021 Blavatnik Awards UK Physical Sciences & Engineering Finalist

Themis Prodromakis, PhD
University of Southampton, 2021 Blavatnik Awards UK Physical Sciences & Engineering Finalist

Sinéad Farrington, PhD
The University of Edinburgh, 2021 Blavatnik Awards UK Physical Sciences & Engineering Laureate

Programmable van der Waals Materials

Artem Mishchenko, PhD, The University of Manchester, 2021 Blavatnik Awards UK Physical Sciences & Engineering Finalist

Materials science is vital because materials define what we can do, and thus define us. That’s why the different eras in prehistory are named for the materials used: the Stone Age, the Copper Age, the Bronze Age, the Iron Age. The properties of the materials available dictated the technologies that could be developed then, and they still dictate the technologies that can be developed now.

Van der Waals materials are materials that are only one or a few atoms thick. The most well-known is probably graphene, which was discovered in 2004 and is made of carbon. But now hundreds of these two-dimensional materials are available, representing almost the whole periodic table, and each has different properties. They are the cutting edge of materials innovation.

Mishchenko studies how van der Waals materials can be made and manipulated into materials with customizable, programmable properties. He does this by stacking the materials and rotating the layers relative to each other. Rotating the layers used to be painstaking, time-consuming work, requiring a new rig for each new angle of rotation. But his lab developed a single device that can twist the layers by any amount he wants, so he can simply reset the device to a new angle to create a new material and assess its properties. Every degree of rotation confers new properties.

His lab has found that rotating the layers can tune the conductivity of the materials and that the right combination of angle and current can make a transistor that can generate radio waves suitable for high frequency telecommunications. With infinite combinations of layers available to make new materials, this new field of “twistronics” may generate an entirely new physics, with quantum properties and exciting possibilities for biomedicine and sustainability.

Memristive Technologies: From Nano Devices to AI on a Chip

Themis Prodromakis, PhD, University of Southampton, 2021 Blavatnik Awards UK Physical Sciences & Engineering Finalist

Transistors are key elements in our electronic devices. They process and store information by switching between on and off states. Traditionally, in order to increase the speed and efficiency of a device one increased the number of transistors it contained. This usually entailed making them smaller. Smartphones contain seven billion transistors! But now it has become more and more difficult to further shrink the size of transistors.

Themis Prodromakis and his team have been instrumental in developing a new electronic component: the memristor, or memory resistor. Memristors are a new kind of switch; they can store hundreds of memory states, beyond on and off, on a single, nanometer-scale device. Sending a voltage pulse across the device tunes the resistance of the memristor to distinct levels, and the device remembers them all.
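That pulse-programmable behavior can be caricatured in a few lines. This is a deliberately simplified toy, not a physical device model; the resistance range and pulse response below are invented for illustration.

```python
class ToyMemristor:
    """Toy multi-level memory element: resistance persists between pulses."""

    def __init__(self, r_min=1_000.0, r_max=100_000.0):
        self.r_min, self.r_max = r_min, r_max
        self.resistance = r_max  # start in the high-resistance state

    def pulse(self, voltage):
        """A voltage pulse nudges the resistance; the change is retained."""
        self.resistance -= voltage * 5_000.0
        self.resistance = min(max(self.resistance, self.r_min), self.r_max)

    def read(self):
        """Reading is non-destructive: the programmed level is remembered."""
        return self.resistance

m = ToyMemristor()
states = []
for _ in range(4):
    m.pulse(2.0)            # each identical pulse programs a new, lower level
    states.append(m.read())
# states now holds four distinct resistance levels, all retained by one device
```

Unlike a transistor's two states, each of these intermediate resistance levels can encode information, which is what lets a single memristor store more than one bit.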

One benefit of memristors is that they allow for more computational capacity while using much less energy than conventional circuit components. Systems made out of memristors allow us to embed intelligence everywhere by processing and storing big data locally, rather than in the cloud. And by removing the need to share data with the cloud, electronic devices made out of memristors can remain secure and private. Prodromakis has not only developed and tested memristors, he is also quite invested in realizing their practical applications and bringing them to market.

Another amazing application of memristors is linking biological neural networks to artificial ones. Prodromakis and his team have already successfully connected biological and artificial neurons together and enabled them to communicate over the internet using memristors as synapses. He speculates that such neuroprosthetic devices might one day be used to fix or even augment human capabilities, for example by replacing dysfunctional regions of the brain in Alzheimer’s patients. And if memristors can be embedded in a human body, they can be embedded in other environments previously inaccessible to electronics as well.

What Do We Know About the Higgs Boson?

Sinéad Farrington, PhD, The University of Edinburgh, 2021 Blavatnik Awards UK Physical Sciences & Engineering Laureate

In the Standard Model of particle physics, the bedrock of modern physics, fermions are the elementary particles comprising all of the stable matter in the universe, while bosons—the other collection of elementary particles—are the ones that transmit forces. The Higgs boson, whose existence was theoretically proposed in 1964, is a unique particle; it gives mass to the other particles by coupling with them.

Sinéad Farrington led the group at CERN that further elucidated the properties of the Higgs boson and thus bolstered the Standard Model. The Standard Model “effectively encapsulates a remarkably small set of particles that make up everything we know about and are able to create,” explained Farrington.

“The Higgs boson is needed to maintain the compelling self-consistency of the Standard Model. It was there in theory, but the experimental observation of it was a really big deal. Nature did not have to work out that way,” Farrington said.

Farrington and her 100-person international team at the Large Hadron Collider demonstrated that the Higgs boson spontaneously decays into two fermions called tau leptons. This was experimentally challenging because the tau is unstable, so the group had to infer its presence from its decay products. She then went on to develop the analytical tools needed to further record and interpret the tau lepton data and was the first to use machine learning to trigger, record, and analyze the massive amounts of data generated by experiments at the LHC.
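The role a learned classifier plays in a trigger can be sketched with a toy example. This is not the actual LHC trigger or Farrington's analysis code: the "events," features, and cut value are all invented, and the real systems operate under severe real-time constraints.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "events" with 3 summary features each: background events cluster at low
# values, signal-like events at higher values (all numbers invented).
background = rng.normal(1.0, 0.3, size=(200, 3))
signal = rng.normal(2.5, 0.3, size=(200, 3))
X = np.vstack([background, signal])
y = np.array([0] * 200 + [1] * 200)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30, 30)))

# Train a minimal logistic-regression "trigger" by gradient descent: learn a
# score for each event, then keep only events scoring above a cut.
w, b = np.zeros(3), 0.0
for _ in range(500):
    p = sigmoid(X @ w + b)
    w -= 0.3 * (X.T @ (p - y)) / len(y)
    b -= 0.3 * (p - y).mean()

kept = sigmoid(X @ w + b) > 0.5  # the "trigger decision" for each event
```

On this toy data the cut keeps nearly all signal-like events while rejecting nearly all background, which is the point of a trigger: deciding, event by event, which data are worth recording.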

Now she is looking to discover other long-lived but as yet unknown particles beyond the Standard Model that also decay into tau leptons, and plans to make more measurements using the Large Hadron Collider to further confirm that the Higgs boson behaves the way the Standard Model posits it will.

In addition to the satisfaction of verifying that a particle predicted by mathematical theorists actually does exist, Farrington said that another consequence of knowing about the Higgs boson is that it may shed light on dark matter and dark energy, which are not part of the Standard Model. Perhaps the Higgs boson gives mass to dark matter as well.

Physical Sciences & Engineering Panel Discussion

Victoria Gill started this session by asking the participants what they plan to do next. Farrington said that she would love to get more precise determinations of known processes, reducing the error bars on them. And she will also embark on an open search for new long-lived particles—i.e., those that don’t decay rapidly—beyond the Standard Model.

Prodromakis wants to expand the possibilities of memristive devices, since they can be deployed anywhere and don’t need a lot of power. He envisions machine-machine interactions like those already in play in the Internet of Things as well as machine-human interactions. He knows he must grapple with the ethical implications of this new technology, and mentioned that it will also require a shift in how electricity, electronics, and computational fabrics are taught in schools.

Mishchenko is both seeking new properties in extant materials and making novel materials and seeing what they’ll do. He’s also searching for useful applications for all of his materials.

A member of the audience asked Farrington whether, given all of the new research in quantum physics, we have new data to resolve the Schrödinger’s cat conundrum. But she said no, the puzzle still stands. That is the essence of quantum physics: there is uncertainty in the (quantum) world, and both states exist simultaneously.

Another wondered why she chose to look for tau leptons as evidence of the Higgs boson’s decay and not other particles; she noted that the tau was the simplest to see over the background even though it does not make up the largest share of the decay products.

An online questioner asked Prodromakis if memristors could be used to make supercomputers since they allow greater computational capacity. He answered that they could, in principle, and could be linked to our brains to augment our capabilities.

Someone then asked Mishchenko if his technology could be applied to biological systems. He said that in biological systems current comes in the form of ions, whereas in electronic systems current comes in the form of electrons, so there would need to be an interface that could translate the current between the two systems. Some of his materials can do that by using electrochemical reactions that convert electrons into ions. But the materials must also be nontoxic in order to be incorporated into human tissues, so he thinks this innovation is thirty to forty years away.

The last query regarded whether the participants viewed themselves as scientists or engineers. Farrington said she is decidedly a physicist and not an engineer, though she collaborates with civil and electrical engineers and relies on them heavily to build and maintain the colliders and detectors she needs for her work.

Prodromakis was trained as an engineer, but now works at understanding the physics of devices so he can design them to reliably do what he wants them to do. And Mishchenko summarized the difference between them by saying that engineering problems are quite specific, while scientists mostly work in the dark. At this point, he considers himself an entrepreneur.

Innovating in Chemistry

Speakers

David P. Mills, PhD
The University of Manchester, 2021 Blavatnik Awards UK Chemistry Finalist

Matthew Powner, PhD
University College London, 2021 Blavatnik Awards UK Chemistry Finalist

Building High Temperature Single-Molecule Magnets

David P. Mills, PhD, The University of Manchester, 2021 Blavatnik Awards UK Chemistry Finalist

David Mills’ lab “makes molecules that have no right to exist.” He is specifically interested in synthesizing small molecules with unusual shapes that contain metal ions, and in using them as tiny molecular magnets to increase data storage capacity for high-performance computing. Mills offers a bottom-up approach to this problem: making new molecules for high-density data storage could ultimately make computers smaller and reduce the amount of energy they use.

Single-molecule magnets (SMMs) were discovered about thirty years ago. Unlike regular magnets, which derive their magnetic properties from interactions between many atoms, an SMM is a single molecule that acts as a magnet on its own, but it still has two states: up and down. These can be used to store data in a manner similar to the bits of binary code that computers currently use. Initially, SMMs could only work at extremely cold temperatures, just above absolute zero. For many years, scientists were unable to create an SMM capable of operation above −259°C, only 10°C above the temperature of liquid helium, which makes them decidedly less than practical for everyday use.

Mills works with a class of elements called the lanthanides, sometimes known as the rare-earth metals, that are already used in smartphones and hybrid vehicles. One of his students utilized one such element, dysprosium, to create an SMM dubbed dysprosocenium. Dysprosocenium briefly held its magnetic properties even at a blistering −213°C, the warmest temperature at which any SMM had ever functioned. This temperature approaches that of liquid nitrogen, which has a boiling point of −195.8°C. If an SMM could function indefinitely at that temperature, it could potentially be used in real-world applications.

When developing dysprosocenium, the Mills group and their collaborators learned that controlling molecular vibrations is essential to allowing the single-molecule magnet to work at such high temperatures. So, his plan for the future is to learn how to control these vibrations and work toward depositing single-molecule magnets on surfaces.

The Chemical Origins of Life

Matthew Powner, PhD, University College London, 2021 Blavatnik Awards UK Chemistry Finalist

The emergence of life is the most profound transition in the history of Earth, and yet we don’t know how it came about. Earth formed four-and-a-half billion years ago, and it is believed that the earliest life-forms appeared almost a billion years later. However, we don’t know what happened in the interim.

Life’s Last Universal Common Ancestor (LUCA) is believed to be much closer to modern life forms than to that primordial originator, so although we can learn about life’s common origins from LUCA, we can’t learn about the true Origin of Life. Where did life come from? How did the fundamental rules of chemistry give rise to life forms? Why did life organize itself the way that it did?

Matthew Powner thinks that to answer these vital existential questions, which lie at the nexus of chemistry and biology, we must simultaneously consider all of life’s components—nucleic acids, amino acids and peptides, metabolic reactions and pathways—and their interactions. We can’t just look at any one of them in isolation.

Since these events occurred in the distant past, we can’t observe them directly—we must reinvent them. To test how life came about, we must build it ourselves, from scratch, by generating and combining membranes, genomes, and catalysis, and eventually metabolism to generate energy.

In this presentation, Powner focused on his lab’s work with proteins. Our cells, which are highly organized and compartmentalized machines, use enzymes—proteins themselves—and other biological macromolecules to synthesize proteins. So how did the first proteins get made? Generally, the peptide bonds linking amino acids together to make proteins do not form at pH 7, the pH of water and therefore of most cells. But Powner’s lab showed that derivatives of amino acids could form peptide bonds at this pH in the presence of ultraviolet light from the sun, and sulfur and iron compounds, all of which were believed to have been present in the prebiotic Earth.

Chemistry Panel Discussion

Victoria Gill started this one off by asking the chemists how important it is to ask questions without a specific application in mind. Both agreed that curiosity defines and drives humanity, and that the most amazing discoveries arise just from trying to satisfy it. Powner said that science must fill all of the gaps in our understanding, and the new knowledge generated by this “blue sky research” (as Mills put it) will yield applications that change the world in unpredictable ways. Watson and Crick provide the perfect example; they didn’t set out to make PCR, but just to understand basic biological questions. Trying to drive technology forward may be essential, but it will never change the world the way investigating fundamental phenomena for their own sake can.

One online viewer wanted to know if single-molecule magnets could be used to make levitating trains, but Mills said that they only work at the quantum scale; trains are much too big.

Other questions were about the origin of life. One asked whether life arose in hydrothermal vents, one asked about the RNA world hypothesis (which posits that RNA was the first biological molecule to arise, since it can be both catalytic and self-replicating), and one asked what Powner thought about synthetic biology. On hydrothermal vents, Powner said that we know that metabolism is nothing if not adaptable—so it is difficult to put any constraints on the environment in which it arose.

He said that the RNA world is a useful framework in which to form research questions, but he no longer thinks it is a viable explanation for how life actually arose since any RNA reactions would need a membrane to contain them in order to be meaningful. And he said that synthetic biology—the venture of designing and generating cells from scratch, and even using non-canonical nucleic acids and amino acids beyond those typically used by life forms—is a complementary approach to the one his lab takes to investigate why biological systems are the way they are.

The Future of Research in the UK: How Will We Address the Biggest Challenges Facing Our Society?

Contributors

Stephen Brusatte, PhD
The University of Edinburgh, 2021 Blavatnik Awards UK Life Sciences Laureate

Sinéad Farrington, PhD
The University of Edinburgh, 2021 Blavatnik Awards UK Physical Sciences & Engineering Laureate

Victoria Gill moderated this discussion with the Blavatnik Laureates, Stephen Brusatte and Sinéad Farrington. First, they discussed how COVID-19 affected their professional lives. Both spoke of how essential it was to support their students and postdocs throughout the pandemic: these people may live alone or with multiple roommates, often far from family and home, and both scientists said they spent a lot of time simply talking to them and listening to them. This segued into a conversation about how the rampant misinformation on social media about COVID-19 highlighted the incredible need for science outreach, and how both Laureates view it as a duty to communicate their work to the public by writing popular books and going into schools.

Next, they tackled the lack of diversity in STEM fields. Farrington said that she has quite a diverse research group—but that it took effort to achieve that. This led right back to public outreach and schooling. She said that one way to increase diversity would be to develop all children’s analytical thinking skills early on to yield “social leveling” and foster everyone’s interest in science. Brusatte agreed that increased outreach and engagement is an important way to reach larger audiences and counteract the deep-seated inequities in our society.

Lastly, they debated whether science education in the UK specializes too early, and whether it should be broader, given the interdisciplinary nature of so many breakthroughs today. Brusatte, who was educated under a different system, declined to opine, but Farrington was loath to sacrifice depth for breadth: deep expert knowledge is important.

How the Brain Gives Rise to the Mind

A professor gives a presentation to students.

This Year’s Blavatnik National Awards for Young Scientists Laureate in the Life Sciences is connecting the activity of cells and synapses to emotions and social behavior

Published October 21, 2021

By Roger Torda

Neuroscientist Kay Tye has challenged orthodoxy in her field by studying the connection between the brain and the mind. The work has led to breakthroughs in basic science. It also points to new approaches to mental illness, with significant potential impact.

Tye is a professor in the Systems Neurobiology Laboratory at the Salk Institute for Biological Studies. She and her research team work to identify the neural mechanism of emotional and social processing, in health and disease. Tye explained to the New York Academy of Sciences why this work is so important.

Impacts on Mental Health

“Mental health disorders have a prevalence of one in two. This is half the population. If we could understand how the brain gives rise to the mind, we could de-stigmatize mental health, and everyone would go and get the treatment that they need,” she says.

Current therapies for mental disorders are developed by trial-and-error, with drugs that have broad ranges of effects. Tye envisions a much different approach, with treatments that target specific mechanisms in the brain.

“Our insights could revolutionize our approach to mental health treatments, supporting individualized therapies that would be effective for everyone and have the precision to be free of side effects,” she says.

Neuroscientist Kay Tye at the Salk Institute

Tye’s work is widely recognized, and this year the Blavatnik National Awards for Young Scientists named Tye its 2021 Life Sciences Laureate.

Tye’s Background

Tye is the daughter of two scientists—a biologist and a physicist—who met while travelling to the U.S. from Hong Kong to pursue their educations. From a young age, Tye says she was fascinated by subjective experiences, foreshadowing her studies on the connection between brain and mind.

“How do I feel the way I feel?” Tye recalls wondering as a child. “How can two people listen to the same song and one person loves it and one person hates it? What are emotions?”

Tye with her children

Tye went to MIT for her undergraduate degree and received her Ph.D. from the University of California, San Francisco. After a postdoctoral fellowship at Stanford, she opened her lab as an assistant professor at MIT in 2012. In 2019, she moved across the country again, to the Salk Institute.

As Tye gained confidence as a young scientist, she took on a difficult professional challenge, seeking to examine questions that had not traditionally been the purview of her field.

“As a neuroscientist, I’m often told I am not allowed to study how internal states like anxiety, or craving, or loneliness are represented by the brain,” she recalled in a TED Talk. “And so, I decided to set out and do exactly that.”

Research in Optogenetics

In her research, Tye uses technology called “optogenetics,”  which transfers the light sensitivity of certain proteins found in some algae to specific neurons in the brains of lab animals. Researchers can then use light to control signaling by the neuron, and they can establish links between the neuron and specific behavior. Tye developed an approach using this tool called “projection-specific optogenetic manipulation.”

“This permits scientists to dissect the tangled mess of wires that is our brains to understand where each wire goes and what each wire does,” Tye said.

Kay Tye in the lab

Tye’s postdoctoral training was in the Stanford University lab of Karl Deisseroth, who had recently developed optogenetics. Many young neuroscientists wanted to be among the first to use optogenetics, and Tye was eager to use it to study behavior and emotion. Tye recalled that period.

“It was a very exciting time in neuroscience, and in 2009 I already felt like I had come late to the party, and knew I needed to push the field forward to make a new contribution,” Tye says. “I worked absurdly hard during my postdoc, fueled by the rapidly changing landscape of neuroscience, and feel like I did five years of work in that two-year period.”

Analyzing Neural Circuits

Tye’s research program initially focused on the neural circuits that process emotional valence, the degree to which the brain assigns positive or negative value to certain sensory information.  Her lab has analyzed the neural circuits controlling valence processing in psychiatric and substance abuse disorders.

This work includes the discovery of a group of neurons connecting the cerebral cortex to the brainstem that can serve as a biomarker to predict whether an animal will develop compulsive alcohol drinking behavior. Recent research has focused on neurons activated when animals experience social isolation and enter “loneliness-like” states.

Kay Tye in the lab

Tye and her research team are also exploring how the brain represents “social homeostasis”— a new field of research which seeks to understand how individuals know their place within a social group and identify optimal amounts of social contact.

Kay Tye and her lab team

Pushing Boundaries in Her Field

Even after considerable success in her field, Tye says she still feels as though she is pushing boundaries of her discipline. In doing so, she is continuing to bring neuroscience rigor to the study of feelings and emotions. Referring to her recent work, Tye said:

We faced a lot of pushback with this line of research, just because “loneliness” isn’t a word that has been used in neuroscience until now. These types of processes, these psychological constructs didn’t belong in what people considered to be hardcore neuroscience.

We are now bringing rigorous neuroscience approaches to ideas that were purely conceptual before. And so we’re being quantitative. We are being mechanistic. We are creating biologically grounded, predictive dynamical models for these nebulous ideas like “feelings” and “emotions.” And this is something that I find extremely gratifying.

Kay and colleagues at the Salk Institute

Targeting Molecules with Tiny Sponges

Two men smile and shake hands.

Growing up in Romania, Mircea Dincă got his first exposure to science. Now he’s engineering an electric Lamborghini.

Published October 1, 2021

By Roger Torda

Mircea Dincă (left) poses with Nick Dirks, President and CEO of The New York Academy of Sciences.

Mircea Dincă creates materials in the lab with surface features that can’t be found in nature. He then makes variants with electrical properties that other scientists once thought impossible. This is groundbreaking basic research with many emerging applications. One is particularly exciting: a supercapacitor to power a Lamborghini supercar.

Dincă, a professor of chemistry at MIT, is this year’s Blavatnik National Awards for Young Scientists Laureate in Chemistry. He heads a lab that synthesizes novel organic-inorganic hybrid materials and manipulates their electrochemical and photophysical properties.

Dincă and his students work with metal-organic frameworks, or MOFs. “These are basically what I like to call sponges on steroids because they are enormously porous,” Dincă told the Academy in a recent interview. “They have fantastically high surface areas, higher than anything that humanity has ever known.”

Metal-Organic Frameworks (MOFs)

MOFs have a hollow, crystalline, cage-like structure, consisting of an array of metal ions surrounded by organic “linker” molecules. Scientists can “tune” their porosity, creating MOFs that can capture molecules of different properties and sizes.

To help conceptualize the large surface area of MOFs, Dincă says a gram of the material would, if flattened out, cover an entire football field. This means their pores can hold an almost unimaginably large number of molecules. One application capitalizing on this capacity is gas storage. For example, a canister filled with MOFs would hold nine times more CO₂ than an empty canister. Other emerging uses have included devices to manage heat, antimicrobial products, gas separation, and devices for scrubbing emissions and carbon capture.
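The football-field comparison can be sanity-checked with quick arithmetic. The specific surface area used below is an illustrative assumption (high-surface-area MOFs are commonly reported in the thousands of square meters per gram), not a figure from the article:

```python
# Rough check of the "a gram covers a football field" comparison.
# ASSUMPTION: a high-surface-area MOF at roughly 7000 m^2 per gram.
surface_area_m2_per_g = 7000

# American football field including end zones: 120 yd x 53.3 yd.
YARD_M = 0.9144
field_area_m2 = (120 * YARD_M) * (53.3 * YARD_M)

fields_per_gram = surface_area_m2_per_g / field_area_m2
print(f"Field area: {field_area_m2:.0f} m^2")          # ~5348 m^2
print(f"Fields covered by 1 g: {fields_per_gram:.1f}")  # ~1.3
```

At the assumed value, a single gram would indeed blanket a football field with room to spare, consistent with Dincă’s description.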

Dincă first encountered MOFs as a graduate student. Several years later, after considerable research on the electronic structure of materials, he started envisioning MOFs with properties that had not been widely considered before. “Previously, people thought that metal-organic frameworks are just ideal insulators,” Dincă said. “But we realized that there are certain types of building blocks that, when put together, would allow the free flow of electrical charges.”  This was something of a paradigm shift in the field.

A Partnership with Lamborghini

Dincă and his students started synthesizing MOFs with a variety of organic ligands and metal combinations to create materials that are both porous and conducting. They also developed ways to grow MOF crystals so they can be more easily studied with imaging tools, permitting analysis of their structure, atom-by-atom. The new techniques and materials have led to MOFs that might prove valuable for batteries, fuel cells, and energy storage. Dincă’s lab and MIT have signed a partnership with Lamborghini to use MOF supercapacitors in the company’s planned Terzo Millennio sportscar.

Dincă and his students also study the use of MOFs as catalysts, and as chemical sensors. They explore how these materials interact with light, which could lead to smart windows that lighten or darken automatically. Better solar cells are yet another possible application.

More efficient air conditioning, with considerable environmental benefit, is another goal. Dincă has co-founded a start-up called Transaera to build MOF-based cooling equipment that pulls water molecules out of the air so the AC doesn’t have to work as hard. The key is tuning the pores of the MOFs to just the right size to capture water at just the right humidity.

Scaling up remains a challenge for many of these applications. “It’s one thing to make a few grams in a laboratory, it’s quite another to make hundreds of kilograms so you can take them out into the real world,” Dincă said.

“Thirsty for Knowledge”

Dincă grew up in Romania, and says he got his first taste of chemistry in 7th grade. An MIT departmental biography playfully suggests “that having a dedicated teacher that did spectacular demonstrations with relatively limited regard for safety” was the initial influence. One imagines awe-inspiring, semi-controlled explosions at the front of a classroom of 12-year-olds. In the following years, Dincă started participating in the Chemistry Olympiads, and in 1998, when he was in high school, he won first place at an international competition in Russia.

At the time, Dincă found he was running up against limits to his education. “I think the biggest challenges to my becoming a scientist were, early on in Romania where I grew up, that we just didn’t have access to labs, to books,” Dincă said. “That made me thirsty for knowledge.” So Dincă was eager to travel to the U.S. when he was offered a scholarship for undergraduate studies at Princeton. He then earned a Ph.D. from UC Berkeley. He has been teaching and conducting research at MIT since 2008.

Dincă met his wife, who is also from Romania, while they were both students at Princeton. She is a lawyer, and the couple have two children, Amalia and Gruia. Dincă’s father is a retired Romanian Orthodox priest, and his mother, a retired kindergarten teacher.

When he is not with his family or at work, Dincă might be running, hiking, or taking photographs.

Constant Exposure to the Unknown

Dincă enjoys teaching, including freshman chemistry. For his more advanced students and postdocs, Dincă says he fosters original thinking by giving them as much responsibility as possible. “As a Principal Investigator myself, I tend to be very hands-off,” Dincă explained. “And that’s good because it allows students to take ownership of their projects and become creative themselves. In fact, most of the best ideas in my lab come from the students, not myself.”

One of the best things about being a scientist, Dincă said, is constant exposure to the unknown, and he is pleased when his commitment to basic research is recognized. “Being a Blavatnik National Award Laureate is, of course, fantastic recognition of my research, of my group’s efforts,” Dincă said. “But also, most importantly for me, it is recognition of the fact that curiosity-driven research is still appreciated.”

While curiosity may drive Dincă’s scientific inquiries, he believes applied research with new classes of MOFs will help address important environmental challenges. At the same time, there can be no doubt that one application may prove especially thrilling. “Never in my wildest dreams did I believe that just thinking about electrical current in porous materials would take me on a path to helping make an electric Lamborghini,” Dincă said. “But that is where our research has led us.”

Also read: Exploring Metamaterials and Photonics

Exploring Metamaterials and Photonics

A man smiles for the camera.

Andrea Alù is challenging the laws of physics to improve data transmission. Oh yeah, he’s working on an invisibility cloak, too!

Published October 1, 2021

By Roger Torda

Andrea Alù

Andrea Alù isn’t satisfied with how light waves and sound travel through objects and space. So he engineers new materials that appear to violate some well-established laws of physics. Enhanced wireless communication and computing technologies, improved bio-medical sensors, and invisibility cloaks are just some of the achievements of his lab.

“We create our own materials, engineered at the nanoscale,” explained Alù, who is Director of the Photonics Initiative at the Advanced Science Research Center at the City University of New York (CUNY). “We call them metamaterials, which push technologies forward, to realize optical properties, electromagnetic properties, or acoustic properties that go well beyond what nature and natural materials offer us.”

This work has led to many honors, and this year the Blavatnik National Awards for Young Scientists is recognizing Alù as its 2021 Laureate in Physical Sciences and Engineering.

In a recent interview with The New York Academy of Sciences (the Academy), Alù explained a core behavior of light that is at the heart of his research:

One of the most basic phenomena in optics is light refraction, which describes the change in direction of propagation of an optical beam as it enters a material. We can understand this as the collective excitation of molecules and charges in the material, produced by light. In metamaterials, we make up our own molecules—we call them metamolecules.

Metamaterials feature many different geometries at the nanoscale. Some can be engineered to interact with light in such a way that they may actually make objects disappear from sight. It is a phenomenon called “cloaking.” Alù continued:

Engineering at the Nanoscale

This engineering at the nanoscale allows us to change the ways in which light refracts as it enters a metamaterial. By bending light in unusual ways, we can actually realize highly unusual optical phenomena, like enhancing or suppressing the reflections and scattering of light from an interface, making a small object appear much larger, or conversely, even disappear altogether, by hiding it from the impinging electromagnetic waves.

“Invisibility” has long been part of our popular imagination and science fiction, from H.G. Wells’ novels to Star Trek and Harry Potter. A pioneering theoretical step dates back to 1968, when a Russian physicist wondered if a phenomenon called “negative refraction” might be possible. But no materials featuring this property were known, and some scientists believed none would be found because negative refraction might violate widely used equations describing the propagation of light. More than thirty years later, in 2000, a team of scientists was able to demonstrate negative refraction in a metamaterial for a certain frequency of electromagnetic radiation. A few years later, experiments demonstrated actual metamaterial cloaking, and Scientific American proclaimed: “Invisibility Cloak Sees Light of Day.”

Alù started working on metamaterials in 2002, when he spent a year at the University of Pennsylvania as a visiting student. He has conducted pioneering research in the field ever since. A major achievement came in 2013. Alù, then at the University of Texas at Austin, and his collaborators demonstrated the cloaking of a three-dimensional object using radio waves. The work showed that antennas, like the ones in our cell phones, could be made transparent to radio waves, a finding of potential commercial and military value, as it eliminates interference between closely spaced transmitters.

A Childhood Fascination

Alù’s interest in light and other electromagnetic waves began as a child in Italy when he was fascinated by how our radios and television sets receive broadcast information without wiring. His interest intensified in high school when he realized that a “beautiful common mathematical framework” describes the propagation of light, radio signals, and sound, and that no information can be transmitted faster than the speed of light.

Alù went on to study at the University of Roma Tre, where he earned a Ph.D. in electronic engineering. After a postdoctoral fellowship at the University of Pennsylvania, he joined the faculty of UT Austin in 2009, and moved to CUNY in 2018.

Nanomaterials being developed in Alù’s lab may also improve near-field microscopy for better biomedical imaging, and lead to optical computers, enabling faster and more efficient PCs that use light instead of electric signals.

Yet another area of intense research for Alù and his research team has been “breaking reciprocity,” with implications for improved transmission of sound as well as radio waves and light. “Light, sound, and radio waves typically travel with symmetry between two points in space,” Alù explained. “If you hear me, I can hear you back. If you can see me, typically you can see me back. This property is rooted in the time reversal symmetry of the wave equations.”

Connecting Basic and Applied Research

Alù said his lab’s work in breaking this symmetry with metamaterials is a good illustration of the connection between basic and applied research:

Interestingly, making materials that transmit waves one way and not the other started as a curiosity, but it has rapidly become extremely useful, from improving data rates with which our cell phones or WiFi technologies operate to protecting sensitive lasers from reflections. This has been a very exciting quest, from basic research to applications.

Alù began his research and teaching career in the U.S. only after he earned his Ph.D. in Italy and, as a result, he found he initially had a smaller professional network than many of his peers. But Alù says the U.S. was very welcoming, and he quickly caught up:

I come from Italy and I did all my undergraduate and graduate studies there. So, coming to the U.S. first as a postdoc, then as a faculty member, I didn’t have a large support network around me, I didn’t initially have a lot of connections…. But at the same time, I have to say, the United States offers tremendous opportunities, in particular to young scientists, to help build up their research groups, and to thrive.

Alù continued: “The U.S. is an amazing country in welcoming young people, new talent, and supporting them in the broadest possible terms… An excellent example of this is the Blavatnik National Awards program, and the broad range of scientists it recognizes.”

The Key to Balancing a Research Career and Parenting

A family consisting of a man, a woman, a toddler daughter, and a newborn baby.

Much like being a parent, science never stops. Daniel Straus, 2021 Blavatnik Regional Awards Winner in Chemistry, provides insight on how to balance these two responsibilities.

Published September 23, 2021

By Daniel Straus

Daniel Straus with his family

Science never stops, for better or for worse. I am a competitive person. A constant fear of mine is being “scooped” by another lab, rendering months or years of research unpublishable for a lack of novelty. Taking time off work exacerbates this risk—people in other labs will keep working while I am not. This fear preoccupied me when I took time off after my first child was born.

When my daughter Elizabeth was born in 2016, I was a graduate student at the University of Pennsylvania. Graduate students at Penn are not considered employees, so I did not have access to the 12 weeks of protected unpaid leave under the Family and Medical Leave Act. I was fortunate that Penn offered eight paid weeks to graduate students after the birth of a child—paternity leave is often overlooked, and many graduate schools do not provide any paternity leave.

After Elizabeth was born, I took the first two weeks off to take care of her, bond with her, and support my wife. I then went back to work for ten more weeks while my wife stayed home with Elizabeth. Then, my wife returned to work, and I took the remaining six weeks of my leave. My productivity at work in the ten weeks I was back was poor and I don’t remember much of this time because I barely slept. I can only imagine how unproductive I would have been had I gone back to work immediately after her birth.

During the last six weeks of my leave I was more relaxed because I realized my time was much better spent with my daughter. There was nothing as spectacular as watching my child learn and do new things every day. Nothing can replace family—I enjoy my work and doing science, but I work to live and to support my family. The time spent at home did not impede my science anyway; rather, it helped me bond with my daughter and rest so that when I did go back to work full-time, I could maximize my productivity and not fall asleep at my desk.

Being a parent has improved my science. I have learned to be more productive in the time I spend in lab so I can spend as much time as possible with my family at home. I am much better at planning my days in lab in advance and also at saying “no” to non-essential things for which I do not have time, such as reviewing manuscripts during busy times.

My mentoring skills have also improved from being a parent. Elizabeth loves doing things herself—even as a one-year-old, she hated having things done for her. When I would try to buckle her into her highchair, for instance, she would scream “SELF” or “LIZZY DO IT.” She couldn’t buckle herself the first few times she tried, and she relented to letting me help after five minutes of struggling. But by trying so many times, she eventually figured out how to do it herself. No student has ever screamed “SELF” when I would “help” (they would probably say “interfere”) with something they were doing, so Elizabeth taught me that when someone is figuring out how to do something, many times the most helpful thing is to do nothing until asked.

Being a scientist has also improved my parenting. In the lab, I reason through questions on my own or with the help of my mentors—there usually isn’t an immediately correct answer because if there were, it would not be novel research. When Elizabeth asks me a question, being a scientist has taught me to first ask her, “what do you think?” so she can develop her own reasoning skills. Curiosity is better satisfied through discovery than through answers. She also loves science and likes to learn new things—her favorite YouTube channel is SciShow Kids, where she watches age-appropriate videos about topics in science, and after watching one she is so excited to share the new things she learned with me. Much of my postdoctoral work involves solving the crystal structures of materials, so when she saw me looking at a crystal structure on my laptop, she wanted to “do crystals” too because she thinks that I drag around a 3D model of a crystal structure all day at work (she’s not entirely wrong…).

My son Noah was born in April, and thankfully Princeton provides employees (including postdocs) 12 paid weeks of parental leave. Parenting is hard work, and people who are not parents may not understand this. While on parental leave, I received emails from multiple people saying, “I hope you’re enjoying your vacation.” Being with Noah is more exhausting than being in lab, but also more rewarding.  Science still never stops—I had to submit a manuscript revision while home to meet a deadline—but I am trying to enjoy every minute with Noah before going back to work full-time, because parenting never stops either.

This piece was originally published on the National Postdoctoral Association member blog as part of 2021 National Postdoc Appreciation Week. Current Academy Members can receive a 20% discount on a National Postdoctoral Association postdoc individual membership by emailing info@nyas.org and requesting the NPA membership discount code.

Learn more about the 2021 Blavatnik Regional Awards for Young Scientists


About the Author

Daniel B. Straus is the Chemistry Winner of the 2021 Blavatnik Regional Awards for Young Scientists. You can learn more about him and the Blavatnik Awards at Blavatnikawards.org

The Economic Imperative for Better Battery Technology

A graphic illustration of a battery.

A married research duo are studying ways to better predict the feasibility and potential economic benefits of adopting battery technologies for renewable energy.

Published May 13, 2021

By Roger Torda

(Left to Right) Graham Elliott and Shirley Meng at the 2019 Blavatnik National Awards Ceremony at the American Museum of Natural History

What can we learn from a marriage of physical and social sciences?

Materials scientist and Blavatnik National Awards for Young Scientists Finalist (2018, 2019) Shirley Meng, PhD, shares her answer to this question. She and her husband, economist Graham Elliott, PhD, combine their expertise in battery chemistry and economic modeling.

In an intriguing collaboration, they developed ways to better predict the feasibility and potential economic benefits of adopting battery technologies to integrate renewable energy, such as solar and wind energy, into energy grids. Together with their research team members, they published “Combined Economic and Technological Evaluation of Battery Energy Storage for Grid Applications” in the journal Nature Energy.

Meng is the Zable Chair Professor in Energy Technologies and Director of the Institute for Materials Design and Discovery at the University of California San Diego (UCSD). Elliott is also at UCSD, where he is Professor and Chair of the Department of Economics. We recently interviewed both to discuss this collaboration and what they learned through the process.

Can you tell us how this collaboration was initiated?  

Meng: UCSD is a place where interdisciplinary and convergent research is not only highly valued but practiced.  I founded the Sustainable Power and Energy Center (SPEC) at UCSD in 2015. SPEC reaches out beyond engineering and physical sciences to study economic and sociological issues that need to be addressed to create truly robust ecosystems for low-carbon electric vehicles and carbon-neutral microgrids. We won a competitive grant from the US Department of Energy, which provided the resources for this work.

Why did you choose to study batteries for energy grid applications? What question about batteries did you study?

Meng: With energy grids showing their age and continuing to distribute energy generated with high environmental costs, efforts that enable grids to distribute cleaner, renewable energy more efficiently would be a technological advance with a positive societal impact. While there have been exciting moves toward renewables, many problems lie ahead if we are to move from renewables being important to renewables being dominant.

Elliott: Grid energy storage remains a major challenge both scientifically and economically. Batteries, or energy storage systems, play critical roles in the successful operation of energy grids by better matching the energy supply with demand and by providing services that help grids function. They will not just transform the market for supplying energy but also transform consumer demand by lowering the prices of energy for households and businesses.

In this work, we used models that account for market rules, realistic market prices for services, and the energy and power constraints of the batteries under real-world applications to study the potential revenues that different battery technologies deployed in the grid could generate.
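The interview does not detail the paper’s model, but the flavor of such an analysis can be sketched with a toy energy-arbitrage calculation: buy energy at the cheapest hours, sell at the most expensive, subject to the battery’s power and energy constraints. The `arbitrage_revenue` helper and all numbers below are hypothetical illustrations, not the authors’ method:

```python
def arbitrage_revenue(prices, power_kw, energy_kwh, efficiency=0.9):
    """Upper-bound daily arbitrage revenue ($) for one charge/discharge cycle.

    prices: hourly electricity prices in $/kWh.
    A full charge or discharge takes energy_kwh / power_kw hours. This toy
    version ignores the time ordering of hours (you must buy before you can
    sell), so the result is only an upper bound on achievable revenue.
    """
    hours = int(energy_kwh / power_kw)
    buy_hours = sorted(prices)[:hours]                  # charge when cheap
    sell_hours = sorted(prices, reverse=True)[:hours]   # discharge when dear
    cost = sum(p * power_kw for p in buy_hours)
    revenue = sum(p * power_kw for p in sell_hours) * efficiency
    return revenue - cost

# A 1 MW / 2 MWh battery facing a spiky price day ($/kWh):
day = [0.05, 0.03, 0.20, 0.30, 0.10, 0.02]
print(round(arbitrage_revenue(day, power_kw=1000, energy_kwh=2000), 2))
```

Even this crude sketch shows the interplay Elliott describes: the same battery earns very different revenues depending on market prices, round-trip efficiency, and how its power rating limits which hours it can exploit.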


Graham Elliott

What was the biggest finding of this collaboration? Were you surprised by your findings?

Meng: We found that while some battery technologies hold the greatest potential from an engineering perspective, the choice based on economics is less clear. The current rules of grid operations dictate which battery technologies are used for those particular grids—some of these rules may be out-of-date, and will be updated as the grids modernize. So even though we continue to see improvement in the energy/power performance of battery technologies and reduction in cost, policymakers are the ultimate decision-makers. Policymakers setting those rules have considerable influence on how fast and how successfully those battery technologies can be deployed, and therefore industry needs to work closely with policymakers to define the best practices for faster deployment of battery technologies.

We also found that there are a wide variety of factors that should be considered in choosing a battery technology. For instance, the battery recycling method is an important technical variable that determines the sustainability of a particular battery technology.

How could your findings eventually affect individual people and society? How can it help our economy?

Elliott: All gains in human welfare arise from what economists call productivity gains—people creating more with less effort, so there is more to go around. Technological advances in energy storage enable productivity gains. But for it to work, we need not only to be able to provide effective energy storage from an engineering perspective, but also it needs to be economically feasible. Different choices at the engineering stage mean differences in the economic feasibility, and how markets are arranged impacts engineering choices. Bringing these together in an interactive way—examining the engineering and economic aspects as two parts of the problem together—allows for a complete look at the problem, and ultimately a better outcome for the economy.

Meng: We are delighted to see that battery grid storage is starting to gain more momentum—policymakers are becoming informed about the economic, scientific, and engineering aspects of battery technologies.

A small-scale energy grid at the University of California San Diego, consisting of a network of solar cells with battery storage (Credit: University of California San Diego)

What did you learn from this collaboration? Are there any tips you would like to share with other researchers who would like to pursue similar collaborations between physical and social sciences?

Meng: Perhaps the most important thing for the collaborative team to do is to build a common vocabulary so we can truly understand each other. In our case, we started by explaining the most basic symbols and units in engineering, like the energy unit Wh (Watt-hour) and the power unit W (Watt). Without understanding the differences between these symbols, we will make mistakes in constructing important parameters in our economic modeling.
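The distinction Meng highlights between energy (Wh) and power (W) feeds directly into economic models: power is a rate, and energy is power integrated over time. A minimal sketch, with purely illustrative numbers:

```python
# Power (W) is a rate; energy (Wh) is power integrated over time.
# All figures below are illustrative, not taken from the study.

def energy_wh(power_w: float, hours: float) -> float:
    """Energy (Wh) delivered by a constant power draw over a duration."""
    return power_w * hours

def storage_capital_cost(energy_needed_wh: float, cost_per_kwh: float) -> float:
    """Capital cost of a battery sized for a given energy requirement."""
    return (energy_needed_wh / 1000.0) * cost_per_kwh

# A 500 W load running for 4 hours needs 2,000 Wh (2 kWh) of storage;
# at a hypothetical $150/kWh that is $300 of battery capacity.
needed = energy_wh(500.0, 4.0)
print(needed)                               # 2000.0
print(storage_capital_cost(needed, 150.0))  # 300.0
```

Confusing the two units here—for example, sizing the battery by its power rating alone—would misprice the system by a factor equal to the discharge duration, which is exactly the kind of modeling mistake Meng warns about.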

Elliott: Another thing we learned is that different fields have very different understandings of the big picture. Collaboration across fields helps focus everyone’s efforts. For example, engineers typically view markets as fixed, and the engineering problem is to find something that works for the market. Economists tend to think of products (such as batteries) as fixed and design markets that work for the available products.

There is a whole research area waiting patiently for economists to understand which parts of the engineering problem are important and for scientists and engineers to understand from their perspective which parts of the market design are important.

Harnessing CRISPR to Revolutionize COVID Testing

A gloved hand holds a COVID-19 test.

Professor Pardis Sabeti applied findings from her research on Ebola to develop a test for detecting COVID-19.

Published March 9, 2021

By Brittany Aguilar, PhD

Pardis Sabeti, MD, DPhil, MSc

This isn’t the first time that Pardis Sabeti, MD, DPhil, MSc, a professor of organismic and evolutionary biology at Harvard University and newly elected member of the National Academy of Medicine, has worn the hat of viral genome detective in the earliest days of a deadly outbreak. Sabeti and her team began sequencing Ebola samples just days after the virus was first detected in Sierra Leone during the 2013-2016 West African outbreak. Since January 2020, she has been working on diagnostics for COVID-19, developing models to predict the most sensitive and accurate assay designs for the rapid detection of SARS-CoV-2, including an assay that harnesses the accuracy of CRISPR technology.

Describe the innovative, rapid COVID-19 test that you helped create—how does it work, and why is it an improvement on current testing methods?

Over the last several years, my lab, colleagues, and I have been developing an assortment of technologies for genomic surveillance of pathogens. In particular, we have been deeply invested in CRISPR technologies. CRISPR was first discovered within bacterial immune systems, where it is used to protect the bacteria from invading pathogens by rapidly identifying and targeting a genomic sequence with very high fidelity. Thus, it is immensely powerful as a diagnostic tool, since it can be designed to detect any sequence of genetic material with impressive accuracy.

It is an incredibly exciting technology: it is highly accurate, it can rapidly detect pathogens using little equipment and a simple paper-strip readout, and it can be adapted in a matter of days to detect newly discovered pathogens or new variants of known ones. Crucially, the test is also inexpensive to manufacture, which means it could be easily scaled and distributed as pathogens—or novel variants of pathogens—emerge.
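The core idea, that a CRISPR assay is programmed with a guide sequence complementary to its target, can be caricatured in a few lines. This is a deliberately naive sketch: real CRISPR diagnostics rely on Cas enzymes and amplified biochemical readouts, not string matching, and the sequences below are invented.

```python
# Naive illustration of sequence-programmable detection. Real CRISPR
# diagnostics use Cas enzymes and amplified readouts, not string
# matching; the sequences here are invented for illustration.

COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def reverse_complement(seq: str) -> str:
    """Watson-Crick reverse complement of a DNA sequence."""
    return "".join(COMPLEMENT[base] for base in reversed(seq))

def guide_detects(guide: str, sample: str) -> bool:
    """A guide 'fires' when the sequence it base-pairs with is present."""
    return reverse_complement(guide) in sample

guide = "ACGTTAGCCGATAACGGTTA"                        # hypothetical 20-nt guide
positive = "TTTT" + reverse_complement(guide) + "CC"  # contains the target site
negative = "TTTTACGTACGTACGTACGTCC"                   # no target site

print(guide_detects(guide, positive))  # True
print(guide_detects(guide, negative))  # False
```

Because the guide is just a sequence, retargeting the "assay" to a new variant only means swapping one string, which is the property that lets real CRISPR tests be redesigned in days.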

Throughout the COVID-19 pandemic, some have suggested that testing is optional, unnecessary, or unreliable—can you describe why the creation of rapid, reliable tests is so important? Does that change depending on where we are in the infection curve?

Testing is critical to fighting the spread of any infectious disease, as has been demonstrated throughout history. However, testing technology has been achievable but not prioritized—if we had invested in this space after the SARS-CoV epidemic [the SARS outbreak in 2003], I believe we could have been poised to respond to SARS-CoV-2 before it spread throughout the world.

The need for diagnostics is critical everywhere, from pre-empting a pandemic, to response and recovery. To be as useful as possible, diagnostics must also be affordable and accessible to all—this is not just in infectious disease but throughout all medicine. The sooner individuals and communities have information, the better they can respond, enabling better outcomes.

You wrote a book last year entitled “Outbreak Culture.” Are there any key learnings from that book that can be applied to COVID or future pandemics?

In this book we argue that a dysfunctional “outbreak culture”—the collective mindset that emerges among responders and communities in the chaos and crucible of a disease outbreak—poses a great threat to our ability to curb outbreaks and save lives, and that we must continually watch for and dismantle toxic response systems where possible. This includes data and resource hoarding, perverse capitalistic incentives, the spread of misinformation, and the loss of empathy and good citizenship.

I think people are still just beginning to understand the gravity of outbreak culture and how it is operating amidst COVID. For example, we all now know the importance of detecting outbreaks, through track-and-trace methods, before they have the chance to spread widely. But what is given less attention is how those efforts can be sidelined or undermined by many surrounding societal and political forces.

I always advocate for a massively increased effort for empathy during outbreaks. We need resilient communities to be able to do the best work against infectious disease. With our trust in our fellow citizens, our leaders, and our scientists undermined during this time, it is crucial to work within the community and low to the ground. We must listen to others, respect their opinions, and understand their fears. For that reason, I believe we must double down on empathy when it comes to community participation. If we do not work with communities and support them in the right ways, we end up causing more harm than good.

About Prof. Sabeti

Pardis Sabeti, MD, DPhil, MSc is a Professor at the Center for Systems Biology and Department of Organismic and Evolutionary Biology at Harvard University and the Department of Immunology and Infectious Disease at the Harvard School of Public Health.  She was a 2016 and 2017 Finalist for the Academy’s Blavatnik National Award for Young Scientists. To learn more about Dr. Sabeti and her work, click here to listen to the “Deciphering Zika” podcast.

When Artificial Intelligence Meets Physical Sciences

Artificial intelligence is quickly becoming a ubiquitous part of our daily lives. What can we expect as this technology continues to grow? And how will it impact you?

Published September 14, 2020

By Liang Dong

Alexandra Boltasseva, PhD

From virtual assistants like Siri to self-driving cars and computer-aided medical diagnoses, artificial intelligence (AI) is transforming our lives at unprecedented speed. Slowly but steadily, scientists in a broad range of fields have started to embrace AI in their research, hoping to significantly reduce the time needed to make new discoveries. This trend has become most visible in the physical sciences, and in the field of materials science in particular, which focuses on the discovery and production of new, advanced materials imbued with desirable properties or functions. Think: screens of foldable smartphones; batteries that power electric cars; or materials that bend light around an object, rendering it invisible.

How exactly could AI help materials scientists? We recently interviewed three honorees of the Blavatnik Awards for Young Scientists: Alexandra Boltasseva, PhD, Professor of Electrical and Computer Engineering at Purdue University; Léon Bottou, PhD, Principal Researcher at Facebook AI Research; and Sergei V. Kalinin, PhD, Corporate Fellow at Oak Ridge National Laboratory, all of whom are contributing to an upcoming virtual symposium on October 6 and 7, AI for Materials: From Discovery to Production. Here’s what they had to say about the opportunities, as well as the challenges, in this rising field.

It is only recently that researchers in the physical sciences, like materials scientists, have begun to incorporate AI techniques into their work. Why do we need to take advantage of AI for this field? What benefits may AI offer materials science?    

Kalinin
Sergei V. Kalinin, PhD

AI offers a set of powerful tools to explore large volumes of multidimensional data in the physical sciences, and promises to uncover hidden functional relationships between the physical properties that we can observe. As such, AI methods are poised to become an inseparable part of all physical sciences, enabling both discovery- and hypothesis-driven research and guiding the planning of experiments. We can take advantage of a broad range of AI techniques—from multivariate statistics to convolutional networks, unsupervised and semi-supervised methods, Gaussian processes, and reinforcement learning.

In addition, the proliferation of laboratory automation in areas from materials synthesis to imaging of materials’ molecular structures opens up broad opportunities for AI-driven experiments. For example, we will be able to adopt large-scale robotic systems or microscale lab-on-a-chip platforms in our experiments, producing thousands of materials or more in a single process.

Boltasseva

My own field, photonics, has truly been transformed by the concept of “inverse design”: scientists input the desired performance of a photonic system into a computer and run physics-informed algorithms to find the best possible optical design. The daunting challenge of this field lies in the enormous computational power required for an exhaustive search of the extremely large, hyper-dimensional space of optical design parameters and constituent materials. Merging AI techniques with photonics is expected not only to enhance and enrich the design space but, most importantly, to unlock novel functionalities and bring about disruptive performance improvements.
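The inverse-design loop Boltasseva describes can be sketched numerically: choose a target response, define a forward model that maps design parameters to a response, and search the parameter space to minimize the mismatch. The forward model below is a toy stand-in (a single Gaussian transmission peak), not a real photonics simulation, and the crude random search stands in for the physics-informed optimizers used in practice.

```python
import math
import random

wavelengths = [400 + 6 * i for i in range(51)]  # 400-700 nm, visible range

def forward_model(center: float, width: float) -> list:
    """Toy 'optical response': a Gaussian transmission peak whose
    position and width are the two design parameters."""
    return [math.exp(-((wl - center) ** 2) / (2 * width ** 2))
            for wl in wavelengths]

target = forward_model(550.0, 20.0)  # desired performance: a peak at 550 nm

def mismatch(center: float, width: float) -> float:
    """Sum of squared errors between candidate and target responses."""
    return sum((m - t) ** 2
               for m, t in zip(forward_model(center, width), target))

# Crude random search over the design space; real inverse design uses
# far smarter, physics-informed optimizers over far larger spaces.
random.seed(0)
best_err, best_params = float("inf"), None
for _ in range(5000):
    c, w = random.uniform(450, 650), random.uniform(5, 60)
    err = mismatch(c, w)
    if err < best_err:
        best_err, best_params = err, (c, w)

print(best_params)  # lands near the true design (550, 20)
```

Even this toy shows the scaling problem: with only two parameters a blind search needs thousands of evaluations, and real design spaces have hundreds of dimensions, which is where AI-guided search is expected to help.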

Compared to the life sciences and pharmaceutical sciences, the application of AI in the physical sciences is at least 10 years behind. What do you think is the biggest challenge for applying AI in the physical sciences? How could the AI and physical sciences communities work together to address these challenges?

Bottou
Léon Bottou, PhD

Using machine learning in the physical sciences is not an obvious proposition. Recent advances in AI have shown how tasks in computer science, such as computer vision and machine translation, can be solved using big data. Yet it would be unwise to claim that this success can be replicated in all scientific fields. Big data only reveal statistical correlations, which are not always indicative of the causal relations that physicists often seek. To address this question, the AI and physics communities might define a hierarchy of problems for which one could envision using AI, such as:

  • Visualizing or measuring an ongoing physical phenomenon. These problems are the most accessible to AI/machine learning because they can directly leverage recent advances in computer vision and signal analysis in collecting data from physical experiments and computations.
  • Explaining a physical phenomenon. These problems belong to the next rung of difficulty because we need AI/machine learning systems that incorporate enough of our current knowledge of physics, and can then clarify the phenomenon of interest by constructing something interpretable on top of our current knowledge.
  • Designing a physical system that leverages a certain phenomenon in new ways. These are by far the most difficult problems, because they require AI/machine learning systems to accurately predict how the physical phenomenon will be affected by changes that are not included or prominent in the experimental data on which AI models have been trained.
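As a concrete (and hypothetical) instance of the first, most accessible rung, a statistical fit can "measure" a physical quantity from noisy observations; here, ordinary least squares recovers a spring constant from synthetic Hooke's-law data:

```python
# Toy instance of the first rung: measuring a physical quantity from
# noisy data with a statistical fit. We recover a spring constant k
# from synthetic Hooke's-law measurements (F = k * x).
import random

random.seed(42)
true_k = 3.5  # N/m, the quantity the "experiment" should recover

# Simulated experiment: displacements with noisy force readings.
xs = [0.1 * i for i in range(1, 21)]                   # meters
fs = [true_k * x + random.gauss(0, 0.05) for x in xs]  # newtons

# Least-squares estimate for a line through the origin:
# k_hat = sum(x * F) / sum(x * x)
k_hat = sum(x * f for x, f in zip(xs, fs)) / sum(x * x for x in xs)
print(round(k_hat, 2))  # close to the true value of 3.5
```

The higher rungs resist this recipe: a fit like this interpolates within the data it saw, whereas explaining a phenomenon or designing a new system requires extrapolating beyond the training data, which is Bottou's point.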
Boltasseva

The physical sciences community should ultimately build extensive databases to unleash the power of AI. We should even set up an ‘optical structures and materials genome’ project to construct a comprehensive dataset of photonic concepts, architectures, components, and photonic materials to enable hierarchical machine learning algorithms that could provide ultimate-efficiency devices.

Kalinin

I agree with Alexandra. AI tends to proliferate in the communities that adopt the model of open sharing of codes and data. While some areas of physics research have undergone this transformation, many more require both enabling tools and proof-of-benefit to accelerate this process.

I also want to add on to Léon’s comment on the fundamental difference between the AI and physics communities. AI starts with purely correlative models, and tends to rely on big data. In comparison, research in physical sciences is strongly based on prior knowledge to explore the cause and effect relationships, and often assumes the presence of simple rules or descriptors that can give rise to complex behaviors in macroscopic systems. Experiments in physical sciences can give rise to huge data volumes, but these data can pertain only to one specific situation of the system and hence are not “big.”

In order to further leverage the benefits of AI in physical sciences, researchers have to possess both sufficient domain knowledge in physical sciences and expertise in machine learning, or forge robust interdisciplinary collaborations. Conferences like AI for Materials will help researchers in both fields form these kinds of interdisciplinary teams.


Takeda and the New York Academy of Sciences Announce 2020 Innovators in Science Award Winners

The 2020 Innovators in Science Award winners include a biochemist/molecular geneticist from Cold Spring Harbor Laboratory and a brain disorder researcher from the Korea Advanced Institute of Science and Technology.

New York, NY | July 8, 2020 and Osaka, Japan | July 8, 2020 – Takeda Pharmaceutical Company Limited (“Takeda”) (TSE:4502) and the New York Academy of Sciences announced today the Winners of the third annual Innovators in Science Award for their excellence in and commitment to innovative science that has significantly advanced the field of rare disease research. Each Winner receives a prize of US $200,000.

Senior Scientist Award: Adrian R. Krainer

The 2020 Winner of the Senior Scientist Award is Adrian R. Krainer, Ph.D., St. Giles Foundation Professor at Cold Spring Harbor Laboratory. Prof. Krainer is recognized for his outstanding research on the mechanisms and control of RNA splicing, a step in the normal process by which genetic information in DNA is converted into proteins. Prof. Krainer studies splicing defects in patients with spinal muscular atrophy (SMA), a devastating, inherited pediatric neuromuscular disorder caused by loss of motor neurons, resulting in progressive muscle atrophy and eventually, death. Prof. Krainer’s work culminated notably in the development of the first drug to be approved by global regulatory bodies that can delay and even prevent the onset of an inherited neurodegenerative disorder.

“Collectively, rare diseases affect millions of families worldwide, who urgently need and deserve our help. I’m extremely honored to receive this recognition for research that my lab and our collaborators carried out to develop the first approved medicine for SMA,” said Prof. Krainer. “As basic researchers, we are driven by curiosity and get to experience the thrill of discovery; but when the fruits of our research can actually improve patients’ lives, everything else pales in comparison.”

Early-Career Scientist Award: Jeong Ho Lee

The 2020 Winner of the Early-Career Scientist Award is Jeong Ho Lee, M.D., Ph.D., Associate Professor at the Korea Advanced Institute of Science and Technology (KAIST). Prof. Lee is recognized for his research investigating genetic mutations in stem cells in the brain that result in rare developmental brain disorders.

He was the first to identify the causes of intractable epilepsies and has identified the genes responsible for several developmental brain disorders, including focal cortical dysplasias, Joubert syndrome—a disorder characterized by underdevelopment of the brainstem—and hemimegalencephaly, the abnormal enlargement of one side of the brain. Prof. Lee is also the Director of the National Creative Research Initiative Center for Brain Somatic Mutations, and Co-founder and Chief Technology Officer of SoVarGen, a biopharmaceutical company aiming to discover novel therapeutics and diagnostics for intractable central nervous system (CNS) diseases caused by low-level somatic mutations.

“It is a great honor to be recognized by a jury of such globally respected scientists whom I greatly admire,” said Prof. Lee. “More importantly, this award validates research into brain somatic mutations as an important area of exploration to help patients suffering from devastating and untreatable neurological disorders.”

The 2020 Innovators in Science Award Ceremony and Symposium

The 2020 Winners will be honored at the virtual Innovators in Science Award Ceremony and Symposium in October 2020. This event provides an opportunity to engage with leading researchers, clinicians and prominent industry stakeholders from around the world about the latest breakthroughs in the scientific understanding and clinical treatment of genetic, nervous system, metabolic, autoimmune and cardiovascular rare diseases.

“At Takeda, patients are our North Star and those with rare diseases are often underserved when it comes to the discovery and development of transformative medicines,” said Andrew Plump, M.D., Ph.D., President, Research & Development at Takeda. “Insights from the ground-breaking research of scientists like Prof. Krainer and Prof. Lee can lead to pioneering approaches and the development of novel medicines that have the potential to change patients’ lives. That’s why we are proud to join with the New York Academy of Sciences to broadly share and champion their work — and hopefully propel this promising science forward.”

“Connecting science with the world to help address some of society’s most pressing challenges is central to our mission,” said Nicholas Dirks, Ph.D., President and CEO, the New York Academy of Sciences. “In this third year of the Innovators in Science Award we are privileged to recognize two scientific leaders working to unlock the power of the genome to bring innovations that address the urgent needs of patients worldwide affected by rare diseases.”

About the Innovators in Science Award

The Innovators in Science Award grants two prizes of US $200,000 each year: one to an Early-Career Scientist and the other to a well-established Senior Scientist, each recognized for the creative thinking and impact of their research. The Award is a limited submission competition in which research universities, academic institutions, and government or non-profit institutions around the globe with a well-established record of scientific excellence are invited to nominate their most promising Early-Career Scientists and most outstanding Senior Scientists working in one of four selected therapeutic fields: neuroscience, gastroenterology, oncology, and regenerative medicine.

Prize Winners are determined by a panel of judges, independently selected by The New York Academy of Sciences, with expertise in these disciplines. The New York Academy of Sciences administers the Award in partnership with Takeda.

For more information please visit the Innovators in Science Award website.

About Takeda Pharmaceutical Company Limited

Takeda Pharmaceutical Company Limited (TSE:4502/NYSE:TAK) is a global, values-based, R&D-driven biopharmaceutical leader headquartered in Japan, committed to bringing Better Health and a Brighter Future to patients by translating science into highly-innovative medicines. Takeda focuses its R&D efforts on four therapeutic areas: Oncology, Rare Diseases, Neuroscience, and Gastroenterology (GI).

We also make targeted R&D investments in Plasma-Derived Therapies and Vaccines. We are focusing on developing highly innovative medicines that contribute to making a difference in people’s lives by advancing the frontier of new treatment options and leveraging our enhanced collaborative R&D engine and capabilities to create a robust, modality-diverse pipeline. Our employees are committed to improving quality of life for patients and to working with our partners in health care in approximately 80 countries. For more information, visit https://www.takeda.com.

For more information, visit https://www.takeda.com/newsroom/

Game Changers: Scientists Shaping the Future of Research in the UK

On March 5, 2020, the New York Academy of Sciences celebrated the Laureates and Finalists of the 2020 Blavatnik Awards for Young Scientists in the United Kingdom. The one-day symposium featured fast-paced, engaging research updates from nine scientists working in diverse fields within the life sciences, chemistry, and physical sciences and engineering. This year’s Blavatnik UK honorees are probing deep mysteries, from the universe to the human mind, tackling longstanding questions that have occupied scientists and philosophers for millennia. Is there life beyond our Solar System? How is knowledge organized in the brain? What is the fundamental nature of gravity? Find out how this game-changing group of young scientists is working to answer these questions in this summary of the symposium.

Symposium Highlights

  • Environmental factors can influence the defense strategies bacteria use to fend off invading viruses. Insights into this process are advancing the potential for phage therapy as an alternative to antibiotics.
  • New analytical and computational tools are revealing the neural machinery that allows the brain to create models of the world and facilitates decision-making and behavior.
  • Chemists can exploit chirality to create novel molecules with a wide variety of applications in drug design, consumer electronics, and catalysis.
  • The scientific community is closer now than ever to realizing the commercial potential of nuclear fusion as a source of clean energy.
  • The first viable theory of massive gravity might help explain some of the biggest mysteries in physics, including the accelerated expansion of the universe.

Hosted By

Victoria Gill
Science Correspondent
BBC News

Speakers

Tim Behrens, DPhil
University of Oxford and University College London

Ian Chapman, PhD
UK Atomic Energy Authority

Matthew J. Fuchter, PhD
Imperial College London

Stephen M. Goldup, PhD
University of Southampton

Kirsty Penkman, PhD
University of York

Claudia de Rham, PhD
Imperial College London

Eleanor Stride, PhD
University of Oxford

Amaury Triaud, PhD
University of Birmingham

Edze Westra, PhD
University of Exeter

Changing the Game in Life Sciences

Speakers

Eleanor Stride, PhD
University of Oxford

Edze Westra, PhD
University of Exeter

Tim Behrens, DPhil
University of Oxford & University College London

Engineering Bubbles

Mechanical engineer Eleanor Stride never planned to design drug delivery systems. She was “convinced I wanted to spend my career designing Aston Martins,” until a chance discussion with a supervisor piqued her interest in therapeutic applications of engineered microbubbles. Just two microns in diameter, microbubbles can be used as ultrasound contrast agents, but Stride sees a role for these tiny tools in the fight against cancer. “In many cases, the problem with cancer drugs [is] how we deliver them,” she said, explaining that systemic chemotherapy agents often cannot penetrate far enough into tumors to be effective. These drugs can also cause side effects and damage healthy tissues.

Microbubbles can help sidestep these challenges, safely encapsulating drug molecules within a stabilizing shell. The shell can be functionalized with magnetic nanoparticles, allowing clinicians to direct the bubbles’ aggregation at tumor sites and visualize them with ultrasound. As the bubbles compress and expand in response to the ultrasound beam, the oscillation helps the bubbles penetrate into the surrounding tissue. “If we increase the ultrasound energy, we can destroy the bubble, allowing us to release the drugs on demand,” said Stride, noting that molecules released from a single 2-micron microbubble can circulate up to 100 times that diameter, pumping drugs deep into tumor tissues. This approach is highly localized—drugs are only released at the tumor site—which minimizes the potential for systemic toxic effects.
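A back-of-envelope calculation (not from the talk) shows why micron-scale bubbles pair so naturally with medical ultrasound: the classical Minnaert formula for the resonance frequency of a gas bubble in water, f0 = (1 / (2*pi*R)) * sqrt(3*gamma*p0 / rho), puts a 2-micron-diameter bubble right in the clinical frequency range.

```python
# Minnaert resonance of a gas bubble in water: a standard textbook
# estimate, used here as a plausibility check rather than a model of
# Stride's actual (shell-stabilized) microbubbles.
import math

gamma = 1.4        # adiabatic index of the gas core
p0 = 101_325.0     # ambient pressure (Pa)
rho = 1000.0       # density of water (kg/m^3)
radius = 1e-6      # 1 micron radius, i.e. a 2-micron-diameter bubble (m)

f0 = (1.0 / (2.0 * math.pi * radius)) * math.sqrt(3.0 * gamma * p0 / rho)
print(f"{f0 / 1e6:.1f} MHz")  # ~3.3 MHz, within the 1-15 MHz clinical range
```

The estimate lands near 3 MHz, squarely inside the frequency band of diagnostic ultrasound scanners, which is why such bubbles oscillate so strongly under an imaging beam.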

Ultrasound-stimulated oscillation of microbubbles creates a vortex in surrounding fluids. The vortex pumps drug molecules deep into tumor sites.

In 2019, Stride and a team of collaborators published the results of trials using oxygen-loaded magnetic microbubbles to treat malignant pancreatic tumors. In animal models, tumors treated with microbubble-delivered drugs showed dramatic spikes in cell death and also shrank in size, “which can mean the difference between a surgeon being able to remove a tumor or not,” said Stride. Additional experiments have helped hone techniques for external magnetic control of microbubbles within blood vessels to ensure precise, targeted drug delivery—a critical step toward tailoring this method for use in humans. Stride and her collaborators aim to launch a clinical trial in pancreatic cancer patients “in the very near future.”

Insights From Bacteria-Phage Interactions

As the fight against viruses dominates the news cycle, 2020 Blavatnik Awards UK Finalist Edze Westra shared an update from the front lines of a viral war billions of years in duration: the “evolutionary arms race” between bacteria and the viruses that infect them, called phages. The interactions between bacteria and phages—the most abundant biological entities on Earth—have profound implications for the development of phage-based therapies as alternatives to antibiotics.

Phages are often successful killers, but bacteria have evolved sophisticated immune strategies to resist attacks. Understanding how and when bacteria deploy each of these defensive tactics is key to designing phage therapies to treat bacterial infections.

Like humans, bacteria utilize both innate and adaptive immune responses to invading pathogens. In bacteria, innate immunity relies on the modification of surface structures to prevent phages from attaching. This system is effective, yet it creates no “record,” or memory, of which phages it encounters. The adaptive immune system, however, allows bacteria to build a database of previously encountered pathogens in the form of bits of genetic material snipped from invading phages and incorporated into the bacterium’s own DNA. The adaptive immune system, known as CRISPR immunity, forms the basis of CRISPR-Cas genome editing techniques. “There’s a critical balance between these two systems, and both are critical for survival,” said Westra, whose research aims to determine the factors that influence whether a bacterium mounts an innate or adaptive immune defense against a particular phage.

Using Pseudomonas aeruginosa, an antibiotic-resistant pathogen that often infects cystic fibrosis patients, Westra determined that a bacterium’s environment—specifically, the level of available nutrients—determined which defensive strategy was utilized. In high-nutrient environments, almost all bacteria deployed an innate immune response to phage attacks, whereas in lower nutrient settings, CRISPR immunity dominated.

The level of available nutrients influences which immune strategy bacteria use to defend against phage attacks.

In experiments using moth larvae, Westra discovered that infections were more severe when bacteria utilized CRISPR immunity, whereas bacteria that evolved innate immunity often caused less aggressive infections. “If we can manipulate how bacteria evolve resistance to phages, this could potentially revolutionize the way we approach antimicrobial resistance, with major benefits to our healthcare,” Westra said.

Building Models of the World

Computational neuroscientist Timothy Behrens is fascinated with the basic functions and decisions of everyday life—the process of navigating our home or city, the steps involved in completing household tasks, the near-subconscious inferences that inform our understanding of the relationships between people and things. Behrens designs analytical tools to understand how neuronal activity in the brain gives rise to these thought processes and behaviors, and his research is illuminating how knowledge is organized in the brain.

The activities of grid cells and place cells are well understood. By creating spatial maps of the world, grid and place cells allow us to navigate familiar spaces and locate items, such as car keys. Behrens explained that much less is known about how the brain encodes non-spatial, abstract concepts and sequence-based tasks, such as loading, running, and emptying a dishwasher. Over the past several years, Behrens and his collaborators have demonstrated that abstract information is similarly mapped as grid-like codes within the brain. “On some level, all relational structures are the same, and all are handled by the same neural machinery,” he said. This insight helps explain the effects of diseases like Alzheimer’s, which targets grid and place cells first and impacts both spatial and non-spatial knowledge.

Relational information is encoded by the same neural machinery that encodes spatial and navigational maps.

In another line of research, Behrens is probing a phenomenon called replay, during which the brain revisits recent memories as a means to consolidate knowledge about current events and anticipate future ones. Behrens illustrated the concept by showing patterns of neuronal activity as a rat runs around a track, then rests. Even at rest, the rat’s brain displays millisecond-long flashes of neuronal activity that mimic those that take place during running. “He’s not running down the track anymore, but his brain is,” said Behrens. Replay also underlies the human ability to understand a simple story even when it’s told in the wrong order. “Our knowledge of the world tells us…what the correct order is, and replay will rapidly stitch together the events in the correct order.”

Computational tools developed in Behrens’ lab have been shared with thousands of scientists around the globe as they pursue new hypotheses about the neural computations that control cognition and behavior. “It’s an exciting time to be thinking about the brain,” Behrens said.

Further Readings

Stride

Beguin E, Shrivastava S, Dezhkunov NV, et al.

Direct Evidence of Multibubble Sonoluminescence Using Therapeutic Ultrasound and Microbubbles

ACS Appl Mater Interfaces. 2019 Jun 5;11(22):19913-19919

Beguin E, Bau L, Shrivastava S, Stride E.

Comparing Strategies for Magnetic Functionalization of Microbubbles

ACS Appl Mater Interfaces. 2019 Jan 16;11(2):1829-1840

Westra

Alseth EO, Pursey E, Luján AM, et al.

Bacterial Biodiversity Drives the Evolution of CRISPR-based Phage Resistance in Pseudomonas aeruginosa

Nature. 2019 Oct;574(7779):549-552

Westra ER, van Houte S, Gandon S, Whitaker R.

The Ecology and Evolution of Microbial CRISPR-Cas Adaptive Immune Systems

Philos Trans R Soc Lond B Biol Sci. 2019 May 13;374(1772):20190101

Behrens

Liu Y, Dolan RJ, Kurth-Nelson Z, Behrens TEJ

Human Replay Spontaneously Reorganizes Experience

Cell. 2019 Jul 25;178(3):640-652.e14

Constantinescu AO, O’Reilly JX, Behrens TEJ

Organizing Conceptual Knowledge in Humans With a Gridlike Code

Science. 2016 Jun 17;352(6292):1464-1468

Behrens TEJ, Muller TH, Whittington JCR

What Is a Cognitive Map? Organizing Knowledge for Flexible Behavior

Neuron. 2018 Oct 24;100(2):490-509

Changing the Game in Chemistry

Speakers

Matthew J. Fuchter, PhD
Imperial College London

Stephen M. Goldup, PhD
University of Southampton

Kirsty Penkman, PhD
University of York

Exploiting Molecular Shape to Develop Materials and Medicines

Consider the handshake: a greeting so automatic it takes place without thinking. Two right hands extend and naturally lock together, but as Matthew Fuchter explained, that easy connection becomes impossible if one party offers their left hand instead. The fumbling that ensues stems from a type of asymmetry called chirality. Chiral objects, such as hands, are mirror-image forms that cannot be superimposed or overlapped, and when one chiral object interacts with another, their chirality dictates the limits of their interaction. Chirality can be observed throughout nature, from the smallest biological molecules to the structures of skyscrapers.

In organic chemistry, molecular chirality can be exploited to tremendous advantage. Fuchter explained that the shape of molecules “is not only critical for their molecular properties, but also for how they interact with their environment.” By controlling subtle aspects of molecular shape, Fuchter is pioneering new strategies in drug design and devising solutions to technological problems that plague common electronic devices.

The notion of pairing complementary molecular geometries to achieve a specific effect is not unique to drug design—such synchronicities can be found throughout nature, including in the “lock and key” structure of enzymes and their substrates. Fuchter’s work aims to invent new drug molecules with geometries perfectly suited to bind to specific biological targets, including those implicated in diseases such as malaria and cancer.

Only one of these two chiral molecules has the correct orientation, or “handedness” to bind to the receptor site on the target protein.

Fuchter is also exploring applications for chirality in a field where the concept is less prominent—consumer electronics. Organic LED, or OLED, technology has “revolutionized the display industry,” allowing manufacturers to create ultra-thin, foldable screens for smartphones and other displays. Yet these features come at a steep efficiency cost—more than half of the light generated by OLED pixels is blocked by anti-glare filters added to the screens to minimize reflectiveness. A novel solution, in the form of chiral molecules bound to non-chiral OLED-optimized polymers, induces a chiral state of light called circularly polarized light. Light emitted in this chiral state can pass through the anti-glare filter on OLED screens rather than being blocked. Fuchter noted that displays are far from the only technology that stands to be impacted by the introduction of chiral molecules. “Our research is generating new opportunities for chiral molecules to control electron transport and electron spin, which could lead to new approaches in data storage,” he said.

Making Use of the Mechanical Bond

Most molecules are held together by chemical bonds—strong, glue-like connections that maintain the integrity of molecules both simple, such as hydrogen, and highly complex, such as DNA. 2020 Blavatnik Awards UK Finalist Stephen Goldup’s work focuses on a less familiar bond. Mechanical bonds join molecules in a manner akin to an interconnected chain of links—the components retain movement, yet cannot separate.

Mechanically interlocked molecules have the potential to yield materials with “exciting properties,” according to Goldup, but in the decades since they were first synthesized, they have largely been regarded as “molecular curiosities.” Goldup’s lab is working to push these molecules beyond the laboratory bench by characterizing the properties of interlocked molecules and probing their potential applications in unprecedented ways. His work focuses on two types of mechanically bound molecules—catenanes, in which components are linked together like a chain, and rotaxanes, which consist of a ring component threaded through a dumbbell-shaped axle.

Goldup’s lab has taken cues from nature to introduce additional elements into rotaxanes, resulting in novel molecules with a variety of potential applications. For example, much as enzymes contain “pockets” within which small molecules can bind, rotaxanes too contain a space that can trap a molecule or ion of interest. Rotaxanes that bind metal ions have unique magnetic and electronic properties that could be used in memory storage devices or medical imaging. Inspired by proteins and enzymes that bind DNA, Goldup’s lab has also designed rotaxanes in which DNA itself is the “axle.” In theory, these molecules can be used to effectively “hide” portions of DNA and alter its biological behavior.

Just as enzymes bind small molecules with their structures, rotaxanes can bind molecules in the cavity between the ring and the axle.

Perhaps most significantly, Goldup’s lab has overcome a longstanding obstacle to studying rotaxanes: the difficulty of making them. The problem lies in the fact that rotaxanes can be chiral even when their components are not, making it extremely challenging to synthesize a distinct “hand,” or version, of the molecule. Recalling Matthew Fuchter’s example of how an awkward left-hand/right-hand handshake differentiates the “handedness” of two chiral objects, Goldup explained how his lab developed a technique for synthesizing distinctly “left” or “right” handed rotaxanes by utilizing a chiral axle to build the molecules. “Our insight was that by making the axle portion chiral on its own, when we thread the axle into the ring, the rotaxanes we make are no longer mirror-images of each other. They have different properties, and they can now be separated,” he said. Once separate, the chiral portion of the axle can be chemically removed and replaced with other functional groups.

Goldup’s lab is conducting experiments with new mechanically interlocked molecules—including chiral rotaxane catalysts—to determine where they may outperform existing catalysts.

Amino Acids as a Portal to the Past

Scientists have multiple methods for peering into the history of Earth’s climate, including sampling marine sediment and ice cores that encapsulate environmental conditions stretching back millions of years. “But this is an incomplete picture—akin to a musical beat with no notes,” said Kirsty Penkman, the 2020 Blavatnik Awards UK Laureate in Chemistry. The records of life on land—fossil records—provide “the notes to our tune, and if we know the timing, that gives us the whole melody,” she said.  Archaeologists, paleontologists, and climate scientists can harmonize fossil records with climate history to understand the past, yet their efforts stall with fossils older than 50,000 years—the limit of radiocarbon dating.

Penkman’s lab is developing dating methods for organic remains that reach far deeper into the history of life on Earth. Their strategy relies not on the decay of carbon, but the conversion of amino acid molecules from one form to another. Continuing the theme of chirality from previous presentations, Penkman explained that amino acids exist in two mirror-image forms. However, the body only synthesizes amino acids in the “left-handed,” or L-form. This disequilibrium shifts after death, when a portion of L-amino acids begins a slow, predictable conversion to the right-handed, or D-form. The older the fossil, the higher its ratio of D- to L-isomers, until the two eventually reach equilibrium. This conversion process, called racemization, was first proposed as a dating method in the 1960s. Yet it became clear that some fossil amino acids were vulnerable to environmental factors that affect the racemization rate, and therefore the date.

About 15 years ago, Penkman discovered that minute stores of proteins within the remains of snail shells are entrapped in intracrystalline voids. These tiny time capsules are unaffected by environmental factors. Studies have since confirmed that shells found in older horizons—for example, deeper underground—contain higher ratios of D-amino acids than those found at younger sites, validating the technique.

Calcitic snail shells found at older horizons have higher ratios of D-amino acids than those found at younger horizons.
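Under the common assumption of reversible first-order kinetics, a measured D/L ratio maps to an elapsed time via ln[(1 + D/L)/(1 − D/L)] = 2kt. The sketch below illustrates the arithmetic only; the rate constant k is purely illustrative, since in practice racemization rates must be calibrated per amino acid, species, and thermal history.

```python
import math

def racemization_age(d_over_l, k_per_year):
    """Estimate age from a D/L amino acid ratio, assuming reversible
    first-order kinetics: ln[(1 + D/L) / (1 - D/L)] = 2*k*t."""
    dl = d_over_l
    return math.log((1 + dl) / (1 - dl)) / (2 * k_per_year)

# Illustrative rate constant only -- real calibrations are
# species- and temperature-specific.
k = 1e-6  # per year

for dl in (0.1, 0.3, 0.6):
    print(f"D/L = {dl:.1f} -> ~{racemization_age(dl, k):,.0f} years")
```

Note how the relationship captures the qualitative pattern in the figure above: higher D/L ratios correspond to older material, with ages growing rapidly as D/L approaches equilibrium at 1.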

Snail shells are often found in archeological sites, a serendipity that has led to astonishing findings about early human migration. Shells found alongside several Paleolithic tools “dated as far back as 700,000 years,” according to Penkman. “We’ve successfully shown that early humans were living in Northern Europe 200,000 years earlier than previously believed,” she said.

Penkman’s team has analyzed remains of ostrich eggshells at some of the earliest human sites in Africa, discovering fully preserved, stable sequences of proteins in shells dating back 3.8 million years. Mammalian remains are the next frontier for Penkman’s lab. They have analyzed amino acids in ancient tooth enamel—including that of a 1.7-million-year-old rhinoceros—and are developing microfluidic techniques to sample enamel from early human remains.

Further Readings

Fuchter

Yang Y, Rice B, Shi X, et al.

Emergent Properties of an Organic Semiconductor Driven by its Molecular Chirality

ACS Nano. 2017 Aug 22;11(8):8329-8338

Yang Y, Correa da Costa R, Fuchter MJ, Campbell AJ

Circularly Polarized Light Detection by a Chiral Organic Semiconductor Transistor

Nat Photonics. 2013 Jul 21;7:634-638

Goldup

Jamieson EMG, Modicom F, Goldup SM

Chirality in Rotaxanes and Catenanes

Chem Soc Rev. 2018 Jul 17;47(14):5266-5311

Lewis JEM, Beer PD, Loeb SJ, Goldup SM

Metal Ions in the Synthesis of Interlocked Molecules and Materials

Chem Soc Rev. 2017 May 9;46(9):2577-2591

Galli M, Lewis JEM, Goldup SM

A Stimuli-responsive Rotaxane–Gold Catalyst: Regulation of Activity and Diastereoselectivity

Angew Chem Int Ed. 2015

Penkman

Penkman KEH, Kaufman DS, Maddy D, Collins MJ

Closed-system Behavior of the Intra-crystalline Fraction of Amino Acids in Mollusk Shells

Quat Geochronol. 2008 Feb-May;3(1-2):2-25

Demarchi B, Hall S, Roncal-Herrero T, et al.

Protein Sequences Bound to Mineral Surfaces Persist Into Deep Time

eLife. 2016 Sep 27;5:e17092

Penkman KEH, Preece RC, Bridgland DR, et al.

A Chronological Framework for the British Quaternary Based on Bithynia Opercula

Nature. 2011 Jul 31;476(7361):446-9

Changing the Game in Physical Sciences and Engineering

Speakers

Amaury Triaud
University of Birmingham

Ian Chapman
UK Atomic Energy Authority and Culham Centre for Fusion Energy

Claudia de Rham
Imperial College London

Worlds Beyond Our Solar System

For millennia, humans have wondered whether life exists beyond our planet. Amaury Triaud, a 2020 Blavatnik Awards UK Finalist, believes we are closer to answering that question now than at any other time in history. The study of exoplanets—planets that orbit stars other than the Sun—offers what Triaud believes is “the best hope for finding out how often genesis happens, and under what conditions.”

The search for exoplanets has revealed remarkable variety among stars and planets in our galaxy. “The universe is far more surprising and diverse than we anticipated,” said Triaud. Astronomers have identified thousands of exoplanets since 1995, and now estimate that there are more planets in the Milky Way than stars—”something we had no idea about ten years ago,” Triaud said. Many exoplanets orbit stars so much smaller than the Sun that these stars cannot be seen with the naked eye.  Yet these comparatively small stars provide “optimal conditions” for exoplanet hunters.

Exoplanets are often detected using the transit method—as an orbiting planet passes in front of a star, it temporarily dims the star’s brightness. The larger the planet relative to the star, the greater the dip in the brightness curve and the easier it is for astronomers to detect. While monitoring TRAPPIST-1, a small star 39 light-years from Earth, a team of astronomers including Triaud discovered an exoplanet system composed of seven rocky planets similar in size to Earth, Venus, and Mercury.
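The transit depth scales as the square of the planet-to-star radius ratio, which is why small stars like TRAPPIST-1 offer such favorable conditions. A rough comparison (the TRAPPIST-1 radius of ~0.12 solar radii is an approximate literature value used here for illustration):

```python
R_EARTH_KM = 6_371
R_SUN_KM = 696_000

def transit_depth(planet_radius_km, star_radius_km):
    """Fractional dimming as the planet crosses the stellar disk:
    depth ~ (R_planet / R_star) ** 2."""
    return (planet_radius_km / star_radius_km) ** 2

# An Earth-size planet transiting a Sun-like star...
depth_sun = transit_depth(R_EARTH_KM, R_SUN_KM)
# ...versus the same planet transiting TRAPPIST-1 (~0.12 solar radii).
depth_trappist = transit_depth(R_EARTH_KM, 0.12 * R_SUN_KM)

print(f"Sun-like host: {depth_sun:.6%} dimming")
print(f"TRAPPIST-1:    {depth_trappist:.4%} dimming")  # ~70x deeper signal
```

The same Earth-size planet produces a dimming roughly 70 times deeper around the smaller star, turning an almost undetectable 0.008% dip into one of order half a percent.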

“The next question is to find out whether biology is happening out there,” said Triaud, joking that the biology of interest is not little green men, but rather green algae or microbes similar to the ones that fill our atmosphere with oxygen. The presence of oxygen “acts like a beacon through space, broadcasting that here on Earth, there is life,” said Triaud, explaining that the only way to gauge the presence of life on exoplanets is through atmospheric analysis. Using transmission spectroscopy, Triaud and other astronomers will look for exoplanets that possess an atmosphere, then search that atmosphere’s composition for chemical signatures of life, such as oxygen, ozone, or methane.

Measurements of spectral signatures in a planet’s atmosphere can reveal the presence of gases associated with life, including oxygen and methane. 

Such analyses will begin with the launch of the James Webb Space Telescope in 2021. In the meantime, a ground-based mission called SPECULOOS, based partially in Chile’s Atacama Desert, is monitoring 1,400 stars in search of additional exoplanets. “It’s rather poetic that from one of the most inhospitable places on Earth, we are on the path to investigating habitability and the presence of life in the cosmos,” Triaud said.

The Path to Delivering Fusion Power

“There’s an old joke that nuclear fusion is 30 years away and somehow always will be,” said 2020 Blavatnik Awards UK Finalist Ian Chapman, but he insists that the joke will end soon. According to Chapman, the “ultimate energy source” is entering the realm of reality. “We’re now in the delivery era, where fusion lives up to its potential,” he said. Low-carbon, low-waste, capable of producing tremendous amounts of energy from an unlimited fuel source—seawater—and far safer than nuclear fission, fusion power has a long list of desirable qualities. Chapman is the first to acknowledge that fusion is “really hard,” but his work is helping to ease the challenges and bring a future of fusion into focus.

Nuclear fusion relies on the collision of two atoms—deuterium, or “heavy” hydrogen, and tritium, an even heavier isotope of hydrogen. Inside the Sun, these atoms collide and fuse, producing the heat and energy that powers the star. Replicating that process on Earth requires enough energy to heat the fuel of deuterium and tritium gases to temperatures ten times hotter than the Sun, a feat that Chapman admits “sounds bonkers, but we do it every day.”

Within fusion reactors called tokamaks, this superhot fuel is trapped between arrays of powerful magnets that “levitate” the plasma as it spins around a central magnetic core, preventing the fuel from melting the reactor walls. Yet this is an imperfect process, explained Chapman, and due to fuel instabilities, eruptions akin to “throwing a hand grenade into the bottom of the machine” happen as often as once per second. Chapman devised a method, based on his numerical calculations, for preventing these eruptions using additional magnet arrays that induce three-dimensional perturbations, or “lobes,” at the edge of the plasma stream. Just as a propped-open lid on a pot of boiling water allows steam to escape, these lobes provide a path to release excess pressure.

An array of magnets near the plasma edge creates perturbations in the fuel stream, allowing pressure to escape safely.

Chapman’s technique has been incorporated into “the biggest scientific experiment ever undertaken by humankind”—a massive tokamak called ITER, roughly the size of a football stadium and equipped with a central magnet strong enough to lift an aircraft carrier. Scheduled to begin producing power in 2025, ITER aims to demonstrate the commercial viability of nuclear fusion. “We can put 50 megawatts of power into the machine, and it produces 500 megawatts of power out,” said Chapman. “That’s enough to power a medium-sized city for a day.”
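The figures Chapman quotes correspond to a fusion gain factor Q of 10—output power divided by input heating power. A quick check of the arithmetic (the city-scale comparison depends on assumed consumption, so treat the daily-energy figure as order-of-magnitude context):

```python
def fusion_gain(p_out_mw, p_in_mw):
    """Fusion gain factor Q = fusion output power / input heating power."""
    return p_out_mw / p_in_mw

# ITER's design target, as quoted in the talk.
P_IN_MW, P_OUT_MW = 50, 500

q = fusion_gain(P_OUT_MW, P_IN_MW)
energy_per_day_mwh = P_OUT_MW * 24  # continuous operation for 24 hours

print(f"Q = {q:.0f}, daily output ~{energy_per_day_mwh:,} MWh")
```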

Even before ITER’s completion, Chapman and others are setting their sights on designing less expensive fusion devices. Late last year, the UK committed to building a compact tokamak that offers the benefits of fusion with a smaller footprint, and Chapman is the leader of this project.

The Nature of Gravity

Claudia de Rham, the 2020 Blavatnik Awards UK Laureate in Physical Sciences and Engineering, concluded the day’s research presentations with an exploration of nothing less than “the biggest mystery in physics today.” For decades, cosmologists and physicists have grappled with discrepancies between observations about the universe—for example, its accelerated expansion—and Einstein’s general theory of relativity, which dictates that gravity should gradually slow that expansion. “The universe is behaving in unexpected ways,” said de Rham, whose efforts to resolve this question stand to profoundly impact all areas of physics.

Understanding the fundamental nature of gravity is key to understanding the origin and evolution of the universe. As de Rham explained, gravity can be detected in the form of gravitational waves, which are produced when two black holes or neutron stars rotate around each other, perturbing the fabric of spacetime and sending rippling waves outward like a stone tossed into a pond. But gravity can also be represented as a fundamental particle, the graviton, similar to the way light can be considered as a particle, the photon, or an electromagnetic wave.  Unlike the other fundamental particles such as the photon, the electron, the neutrino, or even the famously elusive Higgs boson, the graviton has never been observed. In theory, the graviton would, like all fundamental particles, exist even in a perfect vacuum, a phenomenon known as vacuum quantum fluctuation. Unknown in Einstein’s day, vacuum quantum fluctuations, when factored into the general theory of relativity, do predict an accelerated expansion of the universe. “That’s the good news,” said de Rham. “The bad news is that the predicted rate of expansion is too fast by at least 28 orders of magnitude.”

This raises the possibility that “general relativity may not be the correct description of gravity on large cosmological scales,” said de Rham. If the graviton had mass, however, it would impact the behavior of gravity on the largest scales and could explain the observed rate of expansion.

Signal patterns from gravitational wave events can serve as models for estimating the mass of the graviton. By comparing the expected signals produced by either a massless particle or a high-mass particle with actual signal patterns from detected events, physicists can place an upper and lower boundary on the graviton’s potential mass.
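One common way to express such a bound is as a Compton wavelength: a graviton of mass m would cause gravity to weaken appreciably beyond a distance of roughly λ = hc/(mc²). The sketch below uses an illustrative mass of 10⁻²³ eV/c²—the rough order of magnitude of gravitational-wave bounds, not an exact published value:

```python
HC_EV_M = 1.23984e-6   # Planck constant x speed of light, in eV*m
LIGHT_YEAR_M = 9.4607e15

def compton_wavelength_m(rest_energy_ev):
    """Compton wavelength lambda = h*c / (m*c^2), with rest energy in eV."""
    return HC_EV_M / rest_energy_ev

# Illustrative graviton mass bound (order of magnitude only).
m_graviton_ev = 1e-23
lam = compton_wavelength_m(m_graviton_ev)

print(f"lambda ~ {lam:.2e} m (~{lam / LIGHT_YEAR_M:.0f} light-years)")
```

Because a smaller mass means a longer Compton wavelength, tightening the mass bound pushes any deviation from general relativity out to ever larger cosmological distances.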

The idea of a massive graviton has been considered—and refuted—by physicists as far back as the 1930s. Several years ago, de Rham, along with collaborators Andrew Tolley and Gregory Gabadadze, “realized a loophole that had evaded the whole community.” Together, they derived the first consistent theory of massive gravity. “Through gravity, we can now connect small vacuum fluctuations with the acceleration of the universe, linking the infinitely small with the infinitely large,” de Rham said.

Determining the mass of the graviton requires the most precise scale imaginable, and de Rham believes that gravitational wave observatories are perfectly suited to the task. Whether her theory will hold up in future tests remains to be seen, but when it comes to solving this epic mystery, “the possibility is now open.”

Further Readings

Triaud

Gillon M, Triaud AH, Demory BO, et al.

Seven temperate terrestrial planets around the nearby ultracool dwarf star TRAPPIST-1

Nature. 2017 Feb 22;542(7642):456-460

Gillon M, Jehin E, Lederer SM, et al.

Temperate Earth-sized Planets Transiting a Nearby Ultracool Dwarf Star

Nature. 2016 May 12;533(7602):221-4

de Wit J, Wakeford HR, Gillon M, et al.

A Combined Transmission Spectrum of the Earth-sized Exoplanets TRAPPIST-1 B and C

Nature. 2016 Sep 1;537(7618):69-72

Chapman

Kirk A, Harrison J, Liu Y, et al.

Observation of Lobes Near the X Point in Resonant Magnetic Perturbation Experiments on MAST

Phys Rev Lett. 2012 Jun 22;108(25):255003

Chapman IT, Morris AW

UKAEA Capabilities to Address the Challenges on the Path to Delivering Fusion Power

Philos Trans A Math Phys Eng Sci. 2019 Mar 25;377(2141):20170436

de Rham

de Rham C.

Massive Gravity

Living Rev Relativ. 2014;17(1):7.

de Rham C, Gabadadze G, Tolley AJ

Resummation of Massive Gravity

Phys Rev Lett. 2011 Jun 10;106(23):231101

de Rham C, Deskins JT, Tolley AJ, Zhou S.

Graviton Mass Bounds

Rev Mod Phys. 2017;89:025004

Panel Discussion: Hopes for the Future

Speakers

Ian Chapman, PhD
UK Atomic Energy Authority

Kirsty Penkman, PhD
University of York

Eleanor Stride, PhD
University of Oxford

Edze Westra, PhD
University of Exeter

Victoria Gill
BBC News (Moderator)

Several Laureates and Finalists of the 2020 Blavatnik Awards in the UK joined BBC science reporter Victoria Gill for the final session of the day, a wide-ranging panel discussion that touched on issues both current and future-looking.

Two themes—fear and opportunity—emerged as powerful forces shaping science and society, especially as it relates to climate change and the threat of emerging infectious disease. Gill noted that climate change is “the biggest challenge ever to face humanity,” and that many efforts to raise awareness of its impacts focus on bleak projections for the future. Asked for insights on shifting the tone of climate change communications, Kirsty Penkman acknowledged that “there needs to be a certain level of fear to get people’s attention.” She then advocated for a solutions-oriented plan rooted in the fast pace of scientific progress in clean energy, among other areas. “This is an amazing opportunity,” she said. “Humans are ingenious….in the last 120 years we’ve moved from a horse-drawn economy to a carbon-based economy, and in 5 or 20 years we could be in a fusion-based economy. We have the potential to open up a whole new world.” Eleanor Stride suggested combatting complacency by emphasizing the power of small changes in mitigating the impact of climate change. “One billion people making a tiny change has a huge impact,” she said.

The specter of a coronavirus pandemic had not yet become a reality at the time of the symposium. But Edze Westra presciently detailed the challenges of containing a highly contagious emerging pathogen in a “tightly connected world.” He commented that detecting and containing emerging diseases hinges on the development of new diagnostics, and that preventing future outbreaks will require cultural shifts to limit high-risk interactions with wildlife. For zoonotic diseases such as the novel coronavirus, “it’s all about opportunity,” Westra said.

Panelists also looked to the future of science, touching on issues of equality, discrimination, and diversity, and emphasizing the importance of raising the bar for science education. Stride noted that children are natural scientists, gravitating toward problem-solving and puzzles regardless of nationality or gender. “But something happens later,” she said, lamenting the drop in interest in science as children progress in school. “One of the things that gets lost is that creativity, which is what science really is—we’re coming up with a guess and trying to gather evidence for it—we’re not just learning a huge number of facts and regurgitating them,” she said.

In the wake of Brexit, panelists expressed concern about potential difficulties in attracting international students to their labs. “Diversity is so important,” said Penkman. “Getting ideas from all around the world from people with different backgrounds is essential to making science in the UK—and the world—the best it can be.” In her closing comments, Penkman said that ultimately, the trajectory of science comes down to the people in the field. “My eternal optimism is in the people I work with and the people I talk to when I visit schools—it’s that innate interest and curiosity. Whenever I see it, I feel that is the future of science,” she said.