
Shaping the Future of Science: 2019 Blavatnik Science Symposium

Overview

The New York Academy of Sciences and the Blavatnik Family Foundation hosted the annual Blavatnik Science Symposium on July 15–16, 2019, uniting 75 Finalists, Laureates, and Winners of the Blavatnik Awards for Young Scientists. Honorees from the UK and Israel Awards programs joined Blavatnik National and Regional Awards honorees from the U.S. for what one speaker described as “two days of the impossible.” Nearly 30 presenters delivered research updates over the course of nine themed sessions, offering a fast-paced peek into the latest developments in materials science, quantum optics, sustainable technologies, neuroscience, chemical biology, and biomedicine.

Symposium Highlights

  • Computer vision and machine learning have enabled novel analyses of satellite and drone images of wildlife, food crops, and the Earth itself. 
  • Next-generation atomic clocks can be used to study interactions between particles in complex many-body systems.
  • Bacterial communities colonizing the intestinal tract produce bioactive molecules that interact with the human genome and may influence disease susceptibility.
  • New catalysts can reduce carbon emissions associated with industrial chemical production.
  • Retinal neurons display a surprising degree of plasticity, changing their coding in response to repetitive stimuli.
  • New approaches for applying machine learning to complex datasets are improving predictive algorithms in fields ranging from consumer marketing to healthcare.
  • Breakthroughs in materials science have resulted in materials with remarkable strength and responsiveness.
  • Single-cell genomic studies are revealing some of the mechanisms that drive cancer development, metastasis, and resistance to treatment.

Speakers

Emily Balskus, PhD
Harvard University

Chiara Daraio, PhD
Caltech

William Dichtel, PhD
Northwestern University

Elza Erkip, PhD
New York University

Lucia Gualtieri, PhD
Stanford University

Ive Hermans, PhD
University of Wisconsin – Madison

Liangbing Hu, PhD
University of Maryland, College Park

Jure Leskovec, PhD
Stanford University

Heather J. Lynch, PhD
Stony Brook University

Wei Min, PhD
Columbia University

Seth Murray, PhD
Texas A&M University

Nicholas Navin, PhD, MD
MD Anderson Cancer Center

Ana Maria Rey, PhD
University of Colorado Boulder

Michal Rivlin, PhD
Weizmann Institute of Science

Nieng Yan, PhD
Princeton University

Technology for Sustainability

Speakers

Heather J. Lynch
Stony Brook University

Lucia Gualtieri
Stanford University

Seth Murray
Texas A&M University

Highlights

  • Machine learning algorithms trained to analyze satellite imagery have led to the discovery of previously unknown colonies of Antarctic penguins.
  • Seismographic data can be used to analyze more than just earthquakes—typhoons, hurricanes, iceberg-calving events and landslides are reflected in the seismic record.
  • Unmanned aerial systems are a valuable tool for phenotypic analysis in plant breeding, allowing researchers to take frequent measurements of key metrics during the growing season and identify spectral signatures of crop yield.

Satellites, Drones, and New Insights into Penguin Biogeography

Satellite images have been used for decades to document geological changes and environmental disasters, but ecologist Heather Lynch, 2019 Blavatnik National Awards Laureate in Life Sciences, is one of the few to probe the database in search of penguin guano. She opened the symposium with the story of how the Landsat satellite program enabled a surprise discovery of several of Earth’s largest colonies of Adélie penguins, a finding that has ushered in a new era of insight into these iconic Antarctic animals.

Steady streams of high-quality spatial and temporal data regularly support environmental science. In contrast, Lynch noted that wildlife biology has advanced so slowly that many field techniques “would be familiar to Darwin.” Collecting information on animal populations, including changes in population size or migration patterns, relies on arduous and imprecise counting methods. The quest for alternative ways to track wildlife populations—in this case, Antarctic penguin colonies—led Lynch to develop a machine learning algorithm for automated identification of penguin guano in high-resolution commercial satellite imagery, which can be combined with lower-resolution imagery like that coming from NASA’s Landsat program. Pairing measurements of vast, visible tracts of penguin guano—the excrement colored bright pink due to the birds’ diet—with information about penguin colony density yields surprisingly precise population estimates. The technique has been used to survey populations in known penguin colonies and enabled the unexpected discovery of a “major biological hotspot” in the Danger Islands, on the tip of the Antarctic Peninsula. This Antarctic archipelago is so small that it doesn’t appear on most maps of the continent, yet it hosts some of the world’s largest Adélie penguin colonies.
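At its core, the population estimate layers a simple area-times-density calculation on top of the image classification. The sketch below illustrates the idea in Python with invented numbers; the pixel size, the crude color threshold standing in for the trained classifier, and the nest-density figure are all placeholders, not values from Lynch's published work.

```python
import numpy as np

# Toy scene: each pixel holds (red, green, blue) reflectance in [0, 1].
# A real workflow would run a trained classifier over high-resolution
# commercial satellite imagery; a crude "pink" threshold stands in here.
rng = np.random.default_rng(0)
scene = rng.random((500, 500, 3))
is_guano = (scene[..., 0] > 0.8) & (scene[..., 1] < 0.4) & (scene[..., 2] > 0.6)

PIXEL_AREA_M2 = 0.25   # hypothetical 0.5 m x 0.5 m ground sample distance
NESTS_PER_M2 = 0.7     # hypothetical nest density from field surveys

guano_area_m2 = is_guano.sum() * PIXEL_AREA_M2
estimated_pairs = guano_area_m2 * NESTS_PER_M2
print(f"Guano area: {guano_area_m2:.0f} m^2, "
      f"estimated breeding pairs: {estimated_pairs:.0f}")
```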

Satellite images of the pink stains of Antarctic penguin guano have been used to identify and track penguin populations.

Lynch and her colleagues are developing new algorithms that utilize high-resolution drone and satellite imagery to create centimeter-scale, 3D models of penguin terrain. These models feed into detailed habitat suitability and population-tracking analyses that further basic research and can even influence environmental policy decisions. Lynch noted that the discovery of the Danger Island colony led to the institution of crucial environmental protections for this region that may have otherwise been overlooked.  “Better technology actually can lead to better conservation,” she said.

Listening to the Environment with Seismic Waves

The study of earthquakes has dominated seismology for decades, but new analyses of seismic wave activity are broadening the field. “The Earth is never at rest,” said Lucia Gualtieri, 2018 Blavatnik Regional Awards Finalist, while reviewing a series of non-earthquake seismograms that show constant, low-level vibrations within the Earth. Long discarded as “seismic noise,” these data, which comprise more than 90% of seismograms, are now considered a powerful tool for uniting seismology, atmospheric science, and oceanography to produce a holistic picture of the interactions between the solid Earth and other systems.

In addition to earthquakes, events such as hurricanes, typhoons, and landslides are reflected in the seismic record.

Nearly every environmental process generates seismic waves. Hurricanes, typhoons, and landslides have distinct vibrational patterns, as do changes in river flow during monsoons and “glacial earthquakes” caused by ice calving events. Gualtieri illustrated how events on the surface of the Earth are reflected within the seismic record—even at remarkably long distances—including a massive landslide in Alaska detected by a seismic sensor in Massachusetts. Gualtieri and her collaborators are tapping this exquisite sensitivity to create a new generation of tools capable of measuring the precise path and strength of hurricanes and tropical cyclones, and for making predictive models of cyclone strength and behavior based on decades of seismic data.
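One simple way to see non-earthquake sources in a seismogram is through its time-frequency content: storm-generated microseisms concentrate energy at long periods, roughly 0.05 to 0.3 Hz, well below typical earthquake body waves. The snippet below builds a synthetic trace and computes a spectrogram with SciPy; it is a generic illustration of this style of analysis, not Gualtieri's actual processing pipeline.

```python
import numpy as np
from scipy.signal import spectrogram

fs = 20.0                              # samples per second
t = np.arange(0, 3600, 1 / fs)         # one hour of synthetic data

# Synthetic "microseism": narrow-band energy near 0.15 Hz whose amplitude
# swells and fades, mimicking the passage of a distant storm.
envelope = np.exp(-((t - 1800.0) / 600.0) ** 2)
trace = envelope * np.sin(2 * np.pi * 0.15 * t)
trace += 0.1 * np.random.default_rng(1).standard_normal(t.size)

f, times, Sxx = spectrogram(trace, fs=fs, nperseg=2048)
band = (f > 0.05) & (f < 0.3)          # microseism band
peak_time = times[Sxx[band].sum(axis=0).argmax()]
print(f"Microseism energy peaks around t = {peak_time:.0f} s")
```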

Improving Crop Yield Using Unmanned Aerial Systems and Field Phenomics

Plant breeders like Seth Murray, 2019 Blavatnik National Awards Finalist, are uniquely attuned to the demands a soaring global population places on the planet’s food supply. Staple crop yields have skyrocketed thanks to a century of advances in breeding and improved management practices, but the pressure is on to create new strategies for boosting yield while reducing agricultural inputs. “We need to grow more plants, measure them better, use more genetic diversity, and create more seasons per year,” Murray said. It’s a tall order, but one that he and a transdisciplinary group of collaborators are tackling with the help of a fleet of unmanned aerial systems (UAS), or drones.

Drones facilitate frequent measurement of plant height, revealing variations between varietals early in the growth process.

Genomics has transformed many aspects of plant breeding, but phenotypic, rather than genotypic, information is more useful for predicting crop yield. Using drones equipped with specialized equipment, Murray has not only automated many of the time-consuming measurements critical for plant phenotyping, such as tracking height, but has also identified novel metrics that can accelerate the development of new varietals. Spectral signatures obtained via drone can be used to identify top-yielding varietals of maize even before the plants are fully mature. Phenotypic features distilled from drone images are also being used to determine attributes such as disease resistance, which directly influence crop management. Murray’s team is modeling the influence of thousands of phenotypes on overall crop performance, paving the way for true phenomic selection in plant breeding.
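Many of the spectral metrics used in aerial phenotyping reduce to simple band ratios computed from the drone's multispectral imagery. The normalized difference vegetation index (NDVI) is the classic example and is sketched below as a generic illustration of a drone-derived spectral signature; it is not necessarily the specific metric Murray's group uses to predict maize yield.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized difference vegetation index: (NIR - red) / (NIR + red)."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / np.clip(nir + red, 1e-6, None)

# Hypothetical per-plot mean reflectances from a multispectral drone camera.
nir_plot = np.array([0.62, 0.55, 0.70, 0.48])
red_plot = np.array([0.08, 0.12, 0.06, 0.15])
print(ndvi(nir_plot, red_plot))  # higher values suggest denser, healthier canopy
```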

Further Readings

Lynch

Borowicz A, McDowall P, Youngflesh C, et al.

Multi-modal survey of Adélie penguin mega-colonies reveals the Danger Islands as a seabird hotspot.

Sci Rep. 2018 Mar 2;8(1):3926.

Che-Castaldo C, Jenouvrier S, Youngflesh C, et al.

Pan-Antarctic analysis aggregating spatial estimates of Adélie penguin abundance reveals robust dynamics despite stochastic noise.

Nat Commun. 2017 Oct 10;8(1):832.

Murray

Zhang M, Cui Y, Liu YH, et al.

Accurate prediction of maize grain yield using its contributing genes for gene-based breeding.

Genomics. 2019 Feb 28. pii: S0888-7543(18)30708-0.

Shi Y, Thomasson JA, Murray SC, et al.

Unmanned Aerial Vehicles for High-Throughput Phenotyping and Agronomic Research.

PLoS One. 2016 Jul 29;11(7):e0159781.

Quantum Optics

Speakers

Ana Maria Rey
University of Colorado Boulder

Highlights

  • Quantum mechanics underlies the technologies of modern computing, including transistors and integrated circuits.
  • Most quantum insights are derived from studies of single quantum particles, but understanding interactions between many particles is necessary for the development of devices such as quantum computers.
  • Atoms cooled to one billionth of a degree above absolute zero obey the laws of quantum mechanics, and can be used as quantum simulators to study many-particle interactions.

Atomic Clocks: From Timekeepers to Quantum Computers

The discovery of quantum mechanics opened “a new chapter in human knowledge,” said 2019 Blavatnik National Awards Laureate in Physical Sciences & Engineering, Ana Maria Rey, describing how the study of quantum phenomena has revolutionized modern computing, telecommunications, and navigation systems. Transistors, which make up integrated circuits, and lasers, which are the foundation of the atomic clocks that maintain the precision of satellites used in global positioning systems, all stem from discoveries about the nature of quantum particles.

The next generation of innovations—such as room temperature superconductors and quantum computers—will be based on new quantum insights, and all of this hinges on our ability to study interactions between many particles in quantum systems. The complexity of this task is beyond the scope of even the most powerful supercomputers. As Rey explained, calculating the possible states for a small number of quantum particles (six, for example) is simple. “But if you increase that by a factor of just 10, you end up with a number of states larger than the number of stars in the known universe,” she said.
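The scaling Rey described follows from how the joint quantum state space grows: for particles with d internal levels, the Hilbert-space dimension is d^N, so every added particle multiplies the number of basis states. A back-of-the-envelope illustration for two-level particles:

```latex
\dim \mathcal{H} = d^{N}, \qquad
d = 2:\quad 2^{6} = 64, \qquad 2^{60} \approx 1.2 \times 10^{18}.
```

Real atoms have many more than two relevant internal levels, so the count climbs far faster still, which is why classically simulating many-body quantum dynamics becomes intractable so quickly.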

Calculating the number of possible states for even a small number of quantum particles is a task too complex for even the most powerful supercomputer.

Researchers have developed several experimental platforms to clear this hurdle and explore the quantum world. Rey shared the story of how her work developing ultra-precise atomic clocks inadvertently led to one experimental platform that is already demystifying some aspects of quantum systems.

Atomic clocks keep time by measuring oscillations of atoms—typically cesium atoms—as they change energy levels. Recently, Rey and her collaborators at JILA built the world’s most sensitive atomic clock using strontium atoms instead of cesium, and using many more atoms than are typically found in these clocks. The instrument had the potential to be 1,000 times more sensitive than its predecessors, yet collisions between the atoms compromised its precision. Rey explained that by suppressing these collisions, their clock became “a window to explore the quantum world.” Within this framework, the atoms can be manipulated to simulate the movement and interactions of quantum particles in solid-state materials. Rey reported that this clock-turned-quantum simulator has already generated new findings about phenomena including superconductivity and quantum magnetism.
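The “oscillations” a clock counts are set by a fixed atomic transition frequency. In the cesium standard that defines the SI second, that frequency is exact by definition, while optical clocks such as the JILA strontium lattice clock count oscillations of light nearly five orders of magnitude faster, one reason they can be so much more precise:

```latex
1\ \text{s} \equiv 9\,192\,631\,770\ \text{periods of the Cs-133 hyperfine transition}
\quad (\nu_{\mathrm{Cs}} \approx 9.19\ \mathrm{GHz}),
\qquad
\nu_{\mathrm{Sr}} \approx 4.29 \times 10^{14}\ \mathrm{Hz}\ \text{(optical)}.
```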

Further Readings

Rey

Goban A, Hutson R, Marti GE, et al.

Emergence of multi-body interactions in a fermionic lattice clock.

Nature. 2018 Nov;563(7731):369-373.

Kolkowitz S, Bromley SL, Bothwell T, et al.

Spin-orbit-coupled fermions in an optical lattice clock.

Nature. 2017 Feb 2;542(7639):66-70.

Chemical Biology

Speakers

Emily Balskus
Harvard University

Highlights

  • The human gut is colonized by trillions of bacteria that are critical for host health, yet may also be implicated in the development of diseases including colorectal cancer.
  • For over a decade, chemists have sought to resolve the structure of a genotoxin called colibactin, which is produced by a strain of E. coli commonly found in the gut microbiome of colorectal cancer patients.
  • By studying the specific type of DNA damage caused by colibactin, researchers found a trail of clues that led to a promising candidate structure of the colibactin molecule.

Gut Reactions: Understanding the Chemistry of the Human Gut Microbiome

The composition of the trillions-strong microbial communities that colonize the mammalian intestinal tract is well characterized, but a deeper understanding of their chemistry remains elusive. Emily Balskus, the 2019 Blavatnik National Awards Laureate in Chemistry, described her lab’s hunt for clues to solve one chemical mystery of the gut microbiome—a mission that could have implications for colorectal cancer (CRC) screening and early detection.

Some commensal E. coli strains in the human gut produce a genotoxin called colibactin. When cultured with human cells, these strains cause cell cycle arrest and DNA damage, and studies have shown increased populations of colibactin-producing E. coli in CRC patients. Previous studies traced colibactin production to a gene cluster in the E. coli genome and hypothesized that the toxin is synthesized through an enzymatic assembly line. Yet every attempt to isolate colibactin and determine its chemical structure had failed.

Balskus’ group took “a very different approach” in their efforts to discover colibactin’s structure. By studying the enzymes that make the toxin, the team uncovered a critical clue: a cyclopropane ring in the structure of a series of molecules they believed could be colibactin precursors. This functional group, when present in other molecules, is known to damage DNA, and its detection in the molecular products of the colibactin assembly line led the researchers to consider it as a potential mechanism of colibactin’s genotoxicity.

In collaboration with researchers at the University of Minnesota School of Public Health, Balskus’ team cultured human cells with colibactin-producing E. coli strains as well as strains that cannot produce the toxin. They identified and characterized the products of colibactin-mediated DNA damage. “Starting from the chemical structure of these DNA adducts, we can work backwards and think about potential routes for their production,” Balskus explained.

A proposed structure for the genotoxin colibactin, which is associated with colorectal cancer, features two cyclopropane rings capable of interacting with DNA to generate interstrand cross links, a type of DNA damage.

Further studies revealed that colibactin triggers a specific type of DNA damage that requires two reactive groups—likely represented by two cyclopropane rings in the final toxin structure—a pivotal discovery in deriving what Balskus believes is a strong candidate for the true colibactin structure. Balskus emphasized that this work could illuminate the role of colibactin in carcinogenesis, and may lead to cancer screening methods that rely on detecting DNA damage before cells become malignant. The findings also have implications for understanding microbiome-host interactions. “These studies reveal that human gut microbiota can interact with our genomes, compromising their integrity,” she said.

Further Readings

Balskus

Jiang Y, Stornetta A, Villalta PW et al.

Reactivity of an Unusual Amidase May Explain Colibactin’s DNA Cross-Linking Activity.

J Am Chem Soc. 2019 Jul 24;141(29):11489-11496.

Wilson MR, Jiang Y, Villalta PW, et al.

The human gut bacterial genotoxin colibactin alkylates DNA.

Science. 2019 Feb 15;363(6428).

Synthetic Methodology

Speakers

Ive Hermans
University of Wisconsin – Madison

William Dichtel
Northwestern University

Highlights

  • The chemical industry is a major producer of carbon dioxide, and efforts to create more efficient and sustainable chemical processes are often stymied by cost or scale.
  • Boron nitride is not well known as a catalyst, yet experiments show it is highly efficient at converting propane to propylene—one of the most widely used chemical building blocks in the world.
  • Two-dimensional polymers called covalent organic frameworks (COFs) can be used for water filtration, energy storage, and chemical sensing.
  • Until recently, researchers have struggled to control and direct COF formation, but new approaches to COF synthesis are advancing the field.

Boron Nitride: A Surprising Catalyst

Industrial chemicals “define our standard of living,” said Ive Hermans, 2019 Blavatnik National Awards Finalist, before explaining that nearly 96% of the products used in daily life arise from processes requiring bulk chemical production. These building block molecules are produced at an astonishingly large scale, using energy-intensive methods that also produce waste products, including carbon dioxide.

Despite pressure to reduce carbon emissions, the pace of innovation in chemical production is slow. The industry is capital-intensive—a chemical production plant can cost more than $2 billion—and it can take a decade or more to develop new methods of synthesizing chemicals. Concepts that show promise in the lab often fail at scale or are too costly to make the transition from lab to plant. “The goal is to come up with technologies that are both easily implemented and scalable,” Hermans said.

Catalysts are a key area of interest for improving chemical production processes. These molecules bind to reactants and can boost the speed and efficiency of chemical reactions. Hermans’ research focuses on catalyst design, and one of his recent discoveries, made “just by luck,” stands to transform production of one of the most in-demand chemicals worldwide—propylene.

Historically, propylene was one product (along with ethylene and several others) produced by “cracking” carbon–carbon bonds in naphtha, a crude oil component that has since been replaced by ethane (from natural gas) as a preferred starting material. However, ethane yields far less propylene, leaving manufacturers and researchers to seek alternative methods of producing the chemical.

Boron nitride catalyzes a highly efficient conversion of propane to propylene.

Enter boron nitride, a two-dimensional material whose catalytic properties took Hermans by surprise when a student in his lab discovered its efficiency at converting propane, also a component of natural gas, to propylene. Existing methods for running this reaction are endothermic and produce significant CO2. Boron nitride catalysts facilitate an exothermic reaction that can be conducted at far cooler temperatures, with little CO2 production. Better still, the only significant byproduct is ethylene, an in-demand commodity.
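The reaction at issue is the oxidative dehydrogenation (ODH) of propane. Written out, the contrast with conventional non-oxidative dehydrogenation shows why the boron nitride route is attractive: co-feeding oxygen makes the overall reaction exothermic, whereas simply stripping hydrogen from propane is strongly endothermic. (These are standard textbook equations, not figures taken from Hermans' papers.)

```latex
\text{Non-oxidative:}\quad \mathrm{C_3H_8} \longrightarrow \mathrm{C_3H_6} + \mathrm{H_2}
\qquad (\Delta H > 0,\ \text{endothermic})
\\[4pt]
\text{Oxidative (ODH):}\quad \mathrm{C_3H_8} + \tfrac{1}{2}\,\mathrm{O_2} \longrightarrow \mathrm{C_3H_6} + \mathrm{H_2O}
\qquad (\Delta H < 0,\ \text{exothermic})
```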

Hermans sees this success as a step toward a more sustainable future, where chemical production moves “away from a linear economy approach, where we make things and produce CO2 as a byproduct, and more toward a circular economy where we use different starting materials and convert CO2 back into chemical building blocks.”

Polymerization in Two Dimensions

William Dichtel, a Blavatnik National Awards Finalist in 2017 and 2019, offered an update from one of the most exciting frontiers in polymer chemistry—two-dimensional polymerization. The synthetic polymers that dominate modern life are comprised of linear, repeating chains of linked building blocks that imbue materials with specific properties. Designing non-linear polymer architectures requires the ability to precisely control the placement of components, a feat that has challenged chemists for a decade.

Dichtel described the potential of a class of polymers called covalent organic frameworks, or COFs—networks of polymers that form when monomers are polymerized into well-defined, two-dimensional structures. COFs can be created in a variety of topologies, dictated by the shape of the monomers that comprise them, and typically feature pores that can be customized to perform a range of functions. These materials hold promise for applications including water purification membranes, energy and gas storage, organic electronics, and chemical sensing.

Dichtel explained that COF development is a trial and error process that often fails, as the mechanisms of their formation are not well understood. “We have very limited ability to improve these materials rationally—we need to be able to control their form so we can integrate them into a wide variety of contexts,” he said.

Two-dimensional polymer networks can be utilized for water purification, energy storage, and many other applications, but chemists have long struggled to understand their formation and control their structure.

A breakthrough in COF synthesis came when chemist Brian Smith, a former postdoc in Dichtel’s lab, discovered that certain solvents allowed COFs to disperse as nanoparticles in solution rather than precipitating as powder. These particles became the basis for a new method of growing large, controlled crystalline COFs using nanoparticles as structural “seeds,” then slowly adding monomers to maximize growth while limiting nucleation. “This level of control parallels living polymerization, with well-defined initiation and growth phases,” Dichtel said.

More recently, Dichtel’s group has made significant advances in COF fabrication, successfully casting them into thin films that could be used in membrane and filtration applications.

Further Readings

Hermans

Zhang Z, Jimenez-Izal E, Hermans I, Alexandrova AN.

Dynamic Phase Diagram of Catalytic Surface of Hexagonal Boron Nitride under Conditions of Oxidative Dehydrogenation of Propane.

J Phys Chem Lett. 2019 Jan 3;10(1):20-25.

Love AM, Thomas B, Specht SE, et al.

Probing the Transformation of Boron Nitride Catalysts under Oxidative Dehydrogenation Conditions.

J Am Chem Soc. 2019 Jan 9;141(1):182-190.

Dichtel

Côté AP, Benin AI, Ockwig NW, et al.

Porous, crystalline, covalent organic frameworks.

Science. 2005 Nov 18;310(5751):1166-70.

Bisbey RP, Dichtel WR.

Covalent Organic Frameworks as a Platform for Multidimensional Polymerization.

ACS Cent Sci. 2017 Jun 28;3(6):533-543.

Mulzer CR, Shen L, Bisbey RP, et al.

Superior Charge Storage and Power Density of a Conducting Polymer-Modified Covalent Organic Framework.

ACS Cent Sci. 2016 Sep 28;2(9):667-673.

Smith BJ, Parent LR, Overholts AC, et al.

Colloidal Covalent Organic Frameworks.

ACS Cent Sci. 2017 Jan 25;3(1):58-65.

Li H, Evans AM, Castano I, et al.

Nucleation-Elongation Dynamics of Two-Dimensional Covalent Organic Frameworks.

ChemRxiv, 2019.

Advances in Neuroscience

Speakers

Michal Rivlin
Weizmann Institute of Science

Nieng Yan
Princeton University

Highlights

  • The 80 subtypes of retinal ganglion cells each encode different aspects of vision, such as direction and motion.
  • The “preferences” of these cells were believed to be hard-wired, yet experiments show that retinal ganglion cells can be reprogrammed by exposure to repetitive stimuli.
  • Sodium ion channels control electrical signaling in cells of the heart, muscles, and brain, and have long been drug targets due to their connection to pain signaling.
  • Cryo-electron microscopy has allowed researchers to visualize Nav 1.7, a sodium ion channel implicated in pain syndromes, and to identify molecules that interfere with its function.

Retinal Computations: Recalculating

The presentation from Michal Rivlin, the Life Sciences Laureate of the 2019 Blavatnik Awards in Israel, began with an optical illusion, a dizzying exercise during which a repetitive, unidirectional pattern of motion appeared to rapidly reverse direction. “You probably still perceive motion, but the image is actually stable now,” Rivlin said, completing a powerful demonstration of the action of direction-sensitive retinal ganglion cells (RGCs), whose mechanisms she has studied for more than a decade. The approximately 80 subtypes of RGCs each encode a different aspect, or modality, of vision—motion, color, and edges, as well as perception of visual phenomena such as direction. These modalities are hard-wired into the cells and were thought to be immutable—a retinal ganglion cell that perceived left-to-right motion was thought incapable of responding to visual signals that move right-to-left. Rivlin’s research has challenged not only this notion, but also many other beliefs about the function and capabilities of the retina.

Rather than simply capturing discrete aspects of visual information like a camera and relaying that information to the visual thalamus for processing, the cells of the retina actually perform complex processing functions and display a surprising level of plasticity. Rivlin’s lab is probing both the anatomy and functionality of various types of retinal ganglion cells, including those that demonstrate selectivity, such as a preference for movement in one direction or attunement to increases or decreases in illumination. By exposing these cells to various repetitive stimuli, Rivlin has shown that the selectivity of RGCs can be reversed, even in adult retinas.

Direction-selective retinal ganglion cells that prefer left-to-right motion (Before) can change their directional preference (After) following a repetitive visual stimulus.

These dynamic changes in cells whose preferences were believed to be singular and hard-wired have implications not just for understanding retinal function but for understanding the physiological basis of visual perception. Stimulus-dependent changes in the coding of retinal ganglion cells also have downstream impacts on the visual thalamus, where retinal signals are processed. This unexpected plasticity in retinal cells has led Rivlin and her collaborators to investigate the possibility that the visual thalamus and other parts of the visual system might also display greater plasticity than previously believed.

Targeting Sodium Channels for Pain Treatment

Nature’s deadliest predators may seem an unlikely inspiration for developing new analgesic drugs, but as Nieng Yan, 2019 Blavatnik National Awards Finalist, explained, the potent toxins of some snails, spiders, and fish are the basis for research that could lead to safer alternatives to opioid medications.

Voltage-gated ion channels are responsible for electrical signaling in cells of the brain, heart, and skeletal muscles. Sodium channels are one of many ion channel subtypes, and their connection to pain signaling is well documented. Sodium channel blockers have been used as analgesics for a century, but they can be dangerously indiscriminate, inhibiting both the intended channel as well as others in cardiac or muscle tissues. The development of highly selective small molecules capable of blocking only channels tied to pain signaling seemed nearly impossible until two breakthroughs—one genetic, the other technological—brought a potential path for success into focus.

A 2006 study of families with a rare genetic mutation that renders them fully insensitive to pain turned researchers’ focus to the role of the gene SCN9A, which codes for the voltage-gated sodium ion channel Nav 1.7, in pain syndromes. Earlier studies had shown that gain-of-function mutations in SCN9A caused patients to suffer extreme pain sensitivity, and it was now clear that loss-of-function mutations resulted in the opposite condition.

A powerful natural toxin derived from cone snails blocks the pore of a voltage-gated sodium channel, halting the flow of ions and inhibiting the initiation of an action potential.

As Yan explained, understanding this channel required the ability to resolve its structure, but imaging techniques available at that time were poorly suited to large, membrane-bound proteins. With the advent of cryo-electron microscopy, Yan and other researchers have not only resolved the structure of Nav 1.7, but also characterized small molecules—mostly derived from animal toxins—that precisely and selectively interfere with its function. Developing synthetic drugs based on these molecules is the next phase of discovery, and it’s one that may happen more quickly than expected. “When I started my lab, I thought resolving this protein’s structure would be a lifetime project, but we shortened it to just five years,” said Yan.

Further Readings

Rivlin

Warwick RA, Kaushansky N, Sarid N, et al.

Inhomogeneous Encoding of the Visual Field in the Mouse Retina.

Curr Biol. 2018 Mar 5;28(5):655-665.e3

Rivlin-Etzion M, Grimes WN, Rieke F.

Flexible Neural Hardware Supports Dynamic Computations in Retina.

Trends Neurosci. 2018 Apr;41(4):224-237.

Vlasits AL, Bos R, Morrie RD, et al.

Visual stimulation switches the polarity of excitatory input to starburst amacrine cells.

Neuron. 2014 Sep 3;83(5):1172-84.

Rivlin-Etzion M, Wei W, Feller MB.

Visual stimulation reverses the directional preference of direction-selective retinal ganglion cells.

Neuron. 2012 Nov 8;76(3):518-25.

Yan

Shen H, Liu D, Wu K, et al.

Structures of human Nav1.7 channel in complex with auxiliary subunits and animal toxins.

Science. 2019 Mar 22;363(6433):1303-1308.

Pan X, Li Z, Huang X, et al.

Molecular basis for pore blockade of human Na+ channel Nav1.2 by the μ-conotoxin KIIIA.

Science. 2019 Mar 22;363(6433):1309-1313.

Pan X, Li Z, Zhou Q, et al.

Structure of the human voltage-gated sodium channel Nav1.4 in complex with β1.

Science. 2018 Oct 19;362(6412).

Shen H, Li Z, Jiang Y, et al.

Structural basis for the modulation of voltage-gated sodium channels by animal toxins.

Science. 2018 Oct 19;362(6412).

Computer Science

Speakers

Jure Leskovec
Stanford University

Elza Erkip
New York University


Highlights

  • A novel approach to developing machine learning algorithms has improved applications for non-linear datasets.
  • Neural networks can now be used for complex predictive tasks, including forecasting polypharmacy side effects.
  • 5G wireless networks will expand the capabilities of internet-connected devices, providing dramatically faster data transmission and increased reliability.
  • Tools used to design wireless networks can also be used to understand vulnerabilities in the design of online platforms and social networks, particularly as it pertains to user privacy and data anonymization.

Machine Learning with Networks

“For the first time in history, we are using computers to process data at scale to gain novel insights,” said Jure Leskovec, a Blavatnik National Awards Finalist in 2017, 2018, and 2019, describing one aspect of the digital transformation of science, technology, and society. This shift, from using computers to run calculations or simulations to using them to generate insights, is driven in part by the massive data streams available from the Internet and internet-connected devices. Machine learning has catalyzed this transformation, allowing researchers to not only glean useful information from large datasets, but to make increasingly reliable predictions based on it. Just as new imaging techniques reveal previously unknown structures and phenomena in biology, astronomy, and other fields, so too are big data and machine learning bringing previously unobservable models, signals, and patterns to the surface.

This “new paradigm for discovery” has limitations, as Leskovec explained. Machine learning has advanced most rapidly in areas where data can be represented as simple sequences or grids, such as computer vision, image analysis, and speech processing. Analysis of more complex datasets—represented by networks rather than linear sequences—was beyond the scope of neural networks until recently, when Leskovec and his collaborators approached the challenge from a different angle.

The team considered networks as computation graphs, recognizing that the key to making predictions was understanding how information propagates across the network. By training each node in the network to collect information about neighboring nodes and aggregating the resulting data, they can use node-level information to make predictions within the context of the entire network.
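A minimal sketch of that aggregation step, using NumPy: each node's representation is updated from the mean of its neighbors' features combined with its own, and a final readout produces a per-node score. This is a bare-bones illustration of neighborhood aggregation in a graph neural network, not Leskovec's GraphSAGE or Decagon models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph: 5 nodes, symmetric adjacency matrix A, 4-dimensional features X.
A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)
X = rng.standard_normal((5, 4))

W_self = rng.standard_normal((4, 4))    # weights for a node's own state
W_neigh = rng.standard_normal((4, 4))   # weights for the neighborhood summary
w_out = rng.standard_normal(4)          # linear readout

def neighbor_mean(A, H):
    """Average the representations of each node's neighbors."""
    deg = A.sum(axis=1, keepdims=True)
    return (A @ H) / np.maximum(deg, 1.0)

H = X
for _ in range(2):                      # two rounds of message passing
    H = np.tanh(H @ W_self + neighbor_mean(A, H) @ W_neigh)

scores = 1 / (1 + np.exp(-(H @ w_out))) # per-node prediction in (0, 1)
print(scores)
```

In the polypharmacy setting described below, the nodes would be drugs and proteins, the edges would be known interactions, and the readout would score candidate drug-drug side effect combinations rather than single nodes.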

Each node within a network collects information from neighboring nodes. Together, this information can be used to make predictions within the context of the network as a whole.

Leskovec shared two case studies demonstrating the broad applicability of this approach. In healthcare, a neural network designed by Leskovec is identifying previously undocumented side effects from drug-drug interactions. Each network node represents a drug or a protein target of a drug, with links between the nodes emerging based on shared side effects, protein targets, and protein-protein interactions. This type of polypharmacy side-effect analysis is infeasible through clinical trials, and Leskovec is working to optimize it as a point-of-care tool for clinicians.

A similar system has been deployed on the online platform Pinterest, where Leskovec serves as Chief Scientist. It has improved the site’s ability to classify users’ preferences and suggest additional content. “We’re generalizing deep learning methodologies to complex data types, and this is leading to new frontiers,” Leskovec said.

Understanding and Engineering Communications Networks

Elza Erkip has never seen a slide rule. In two decades as a faculty researcher and electrical and computer engineer, Erkip, 2010 Blavatnik Awards Finalist, has corrected her share of misconceptions about her field, and about the role of engineering among the scientific disciplines. She joked about stereotypes portraying engineers—most of them men—wielding slide rules or wearing hard hats, but emphasized the importance of raising awareness about the real-life work of engineers. “Scientists want to understand the universe, but engineers use existing scientific knowledge to design and build things,” she explained. “We contribute to discovery, but mostly we want to solve problems, to find solutions that work in the real world.”

Erkip focuses on one of the most impactful areas of 21st century living—wireless communication—and the ever-evolving suite of technologies that support it. She reviewed the rapid progression of wireless device capabilities, from phones that featured only voice calling and text messaging, through the addition of Wi-Fi capability and web browsing, all the way to the smartphones of today, which boast more computing power than the Apollo 11 spacecraft that landed on the moon. She described the next revolution in wireless—5G networks and devices—which promises higher data rates and significant increases in speed and reliability. Tapping the millimeter-wave bands of the electromagnetic spectrum, 5G will rely on different wireless architectures featuring massive arrays of small antennae, which are better suited to propagating shorter wavelengths. The increased bandwidth will enable many more devices to come online. “It won’t just be humans communicating—we’ll have devices communicating with each other,” Erkip said, describing the future connectivity between robots, autonomous cars, home appliances, and sensors embedded in transportation, manufacturing, and industrial equipment.
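The term “millimeter wave” is literal: wavelength is the speed of light divided by frequency, so the high bands earmarked for 5G (roughly 24 GHz and above) correspond to wavelengths of about a centimeter or less, which is why dense arrays of many small antenna elements become practical.

```latex
\lambda = \frac{c}{f}, \qquad
\lambda_{28\ \mathrm{GHz}} = \frac{3.0 \times 10^{8}\ \mathrm{m/s}}{2.8 \times 10^{10}\ \mathrm{Hz}} \approx 1.1\ \mathrm{cm},
\qquad
\lambda_{60\ \mathrm{GHz}} \approx 5\ \mathrm{mm}.
```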

Despite efforts to anonymize data, many social media sites and online databases remain vulnerable to efforts to match users’ identities across platforms.

Erkip also described how the tools used to understand and build wireless networks can be applied to privacy issues within social networks. De-anonymization of user data has long plagued online platforms. Studies have shown that it’s often possible to identify and match users across multiple social platforms or databases using publicly available information—a breach that has greater implications for a database of health or voting records than it does for a consumer-oriented site such as Netflix. Erkip is working to understand the fundamental properties of these networks to elucidate the factors that predispose them to de-anonymization attacks.
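A toy version of the matching attack: given two “anonymized” platforms that share many users, compare accounts by the overlap of their public connection lists and take the best match. The snippet below uses Jaccard similarity purely to illustrate the idea; Erkip's work analyzes when such matching is information-theoretically possible, not this particular heuristic, and all account names here are invented.

```python
from itertools import product

# Hypothetical friend lists; platform B uses anonymized IDs for the same people.
platform_a = {"alice": {"bob", "carol", "dave"},
              "bob":   {"alice", "carol"},
              "carol": {"alice", "bob", "erin"}}
platform_b = {"u1": {"u2", "u3", "u9"},
              "u2": {"u1", "u3"},
              "u3": {"u1", "u2", "u7"}}

# Seed knowledge: a few accounts already linked across platforms.
seed = {"bob": "u2", "carol": "u3", "dave": "u9", "erin": "u7"}

def jaccard(x, y):
    """Overlap of two sets, 0.0 when both are empty."""
    return len(x & y) / len(x | y) if (x | y) else 0.0

def translate(friends):
    """Map platform-A friends into platform-B IDs via the seed links."""
    return {seed[f] for f in friends if f in seed}

for a, b in product(["alice"], platform_b):
    score = jaccard(translate(platform_a[a]), platform_b[b])
    print(f"{a} vs {b}: similarity {score:.2f}")
# The top-scoring pair ("alice" vs "u1") re-identifies the anonymized account.
```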

Further Readings

Leskovec

Zitnik M, Agrawal M, Leskovec J.

Modeling polypharmacy side effects with graph convolutional networks.

Bioinformatics. 2018 Jul 1;34(13):i457-i466.

Erkip

Shirani F, Garg S, Erkip E.

A Concentration of Measure Approach to Database De-anonymization.

IEEE International Symposium on Information Theory. 2019.

Shirani F, Garg S, Erkip E.

Optimal Active social Network De-anonymization Using Information Thresholds.

IEEE International Symposium on Information Theory. 2018.

Materials Science

Speakers

Chiara Daraio
Caltech

Liangbing Hu
University of Maryland, College Park

Highlights

  • Computer-aided manufacturing is enabling researchers to design materials with precisely tuned properties, such as responsiveness to light, temperature, or moisture.
  • Structured materials can mimic robots or machines, changing shape and form repeatedly in the presence of various stimuli.
  • Ultra-strong, lightweight wood-based materials made of nanocellulose fibers may one day resolve some of the world’s most pressing challenges in water, energy and sustainability, replacing transparent plastic packaging, window glass, and even steel and other alloys in vehicles and buildings.

Mechanics of Robotic Matter

Chiara Daraio’s work challenges the traditional definition of words like material, structure, and robot.  Working at the intersection of physics, materials science, and computer science, she designs materials with novel properties and functionalities, enabled by computer-aided design and 3D fabrication. Rather than considering a material as the foundation for assembling a structure, Daraio, 2019 Blavatnik National Awards Finalist, designs materials with intricate structures in unique and complex geometries.

Daraio demonstrated a series of responsive materials—those that morph in the presence of stimuli such as temperature, light, moisture, or salinity. In their simplest forms, these materials change shape—a piece of heat-responsive material folds and unfolds as air temperature changes, or a leaf-shaped hydro-sensitive material opens and closes as it transitions from wet to dry. In more complex forms, materials can display time-dependent responses, as shown in a video demonstration of a row of polymer strips changing shape at different rates, depending on their thickness. Daraio showed how computer-graphical approaches allow researchers to design a single material with different properties in different regions, enabling complex actuation in a time-dependent manner, such as a polymer “flower” with interconnecting leaves taking shape or a polymer “ribbon” slowly weaving itself into a knot.

A thin foil elastomer comprised of materials with alternating temperature-sensitivity (heat and cold) folds up and “walks” across a table as the temperature varies.

Conventional ideas dictate that a robot is a programmable machine capable of completing a task. “But what if the material is the machine?” asked Daraio, showing the remarkable capabilities of a thin liquid crystal elastomer foil composed of one heat-sensitive and one cold-sensitive material. At room temperature, the foil is flat. Heat from a warm table causes it to curl upward, turn over, and “walk” forward. “As long as there’s some kind of external environmental stimulus, we can design a material that can repeatedly perform actions in time,” Daraio said. Similar responsive materials have been used in a self-deploying solar panel that unfolds in response to heat.

Materials have been “the seeds of technological innovation” throughout human history, and Daraio believes that structured materials will enable new functionalities at the macroscale—for use in wearables such as helmets as well as in smart building technologies—and at the microscale, where responsive materials could be used for medical diagnostics or drug delivery.

Sustainable Applications for Wood Nanotechnologies

Wood, glass, plastic, and steel are among the most ubiquitous materials on Earth, and Liangbing Hu, 2019 Blavatnik National Awards Finalist, is rethinking them all. Inspired by the global need to develop sustainable materials, Hu turned to the most plentiful source of biomass on Earth— trees—to create a new generation of wood-based materials with astonishing properties. Hu relies on nanocellulose fibers, which can be engineered to serve as alternatives to commonly used unsustainable or energy-intensive materials.

Hu introduced a transparent film that could pass for plastic and can be used for packaging, yet is ten times stronger and far more versatile. This transparent nanopaper, made of nanocellulose fibers, could also be used as a display material in flexible electronics or as a photonic overlay that boosts the efficiency of solar cells by 30%.

Hu has also tested transparent wood—a heavier-gauge version of nanopaper made by removing lignin from wood and injecting the channels with a clear polymer—as an energy-saving building material. More than half of home energy loss is due to poor wall insulation and leakage through window glass. By Hu’s calculations, replacing glass windows with transparent wood would provide a six-fold increase in thermal insulation. Pressed, delignified wood has also proven to be a superior material for wall insulation. Used on roofs, it is a highly efficient means of passive cooling—the material absorbs heat and then re-radiates it, cooling the surface below it by about ten degrees.

White delignified wood is pressed to increase its strength. It can be used on roofs to passively cool homes by absorbing and re-radiating light, cooling the area below it by about ten degrees.

Comparisons of mechanical strength between wood and steel are almost laughable, unless the wood is another of Hu’s creations—the aptly named “superwood.” Delignified and compressed to align the nanocellulose fibers, even inexpensive woods become thinner and 10-20 times stronger. Superwood rivals steel in strength and durability, and could become a viable alternative to steel and other alloys in buildings, vehicles, trains, and airplanes. Sustainable sourcing would eliminate pollution and carbon dioxide associated with steel production, and its lightweight profile could drastically improve vehicle fuel efficiency.

Further Readings

Daraio

Celli P, McMahan C, Ramirez B, et al.

Shape-morphing architected sheets with non-periodic cut patterns.

Soft Matter. 2018 Dec 12;14(48):9744-9749.

Chen T, Bilal OR, Shea K, Daraio C.

Harnessing bistability for directional propulsion of soft, untethered robots.

Proc Natl Acad Sci USA. 2018 May 29;115(22):5698-5702.

Bauhofer AA, Krödel S, Rys J, et al.

Harnessing Photochemical Shrinkage in Direct Laser Writing for Shape Morphing of Polymer Sheets.

Adv Mater. 2017 Nov;29(42).

Hu

Song J, Chen C, Zhu S, et al.

Processing bulk natural wood into a high-performance structural material.

Nature. 2018 Feb 7;554(7691):224-228.

Huang J, Zhu H, Chen Y, et al.

Highly transparent and flexible nanopaper transistors.

ACS Nano. 2013 Mar 26;7(3):2106-13.

Huang J, Zhu H, Chen Y, et al.

Novel nanostructured paper with ultrahigh transparency and ultrahigh haze for solar cells.

Nano Lett. 2014 Feb 12;14(2):765-73.

Zhu M, Song J, Li T, et al.

Highly Anisotropic, Highly Transparent Wood Composites.

Adv Mater. 2016 Jul;28(26):5181-7.

Li T, Zhai Y, He S, et al.

A radiative cooling structural material.

Science. 2019 May 24;364(6442):760-763.

Zhu H, Luo W, Ciesielski PN, et al.

Wood-Derived Materials for Green Electronics, Biological Devices, and Energy Applications.

Chem Rev. 2016 Aug 24;116(16):9305-74.

Medicine and Medical Diagnostics

Speakers

Nicholas Navin
MD Anderson Cancer Center

Wei Min
Columbia University

Highlights

  • Tumor cells are genetically heterogeneous, complicating efforts to sequence DNA from tumor tissue samples.
  • Techniques for isolating and sequencing single-cell samples have transformed the study of cancer genetics.
  • Stimulated Raman scattering, a non-invasive imaging technique, can visualize processes including glucose uptake and fatty acid metabolism within living cells.

Single Cell Genomics: A Revolution in Cancer Biology

Nicholas Navin, 2019 Blavatnik National Awards Finalist, doesn’t use the word “revolution” lightly, but when it comes to the field of single-cell genomics and its impact on cancer research, he stands by the term. Over the past ten years, DNA sequencing of single tumor cells has led to major discoveries about the progression of cancer and the process by which cancer cells resist treatment.

Unlike healthy tissue cells, tumor cells are characterized by genomic heterogeneity. Samples from different areas of the same tumor often contain different mutations or numbers of chromosomes. This diversity has long piqued researchers’ curiosity. “Is it stochastic noise generated as tumor cells acquire different mutations, or could this diversity be important for resistance to therapy, invasion, or metastasis?” Navin asked.

Answering that question required the ability to do comparative studies of single tumor cells, a task that was long out of reach. DNA sequencing technologies historically required a large sample of genetic material—a tricky proposition when sampling a highly diverse population of tumor cells. Some mutations, which could drive invasion or resistance, may be present in just a few cells and thus not be represented in the results. Navin was part of the first team to develop a method for excising a single cancer cell from a tumor, amplifying the DNA, and producing an individualized genetic sequence. As amplification and sequencing methods have improved, so too have the insights gleaned from single-cell genomic studies, which Navin likens to “paleontology in tumors”—the notion that a sample taken at a single point in time can allow researchers to make inferences about tumor evolution.

Single-cell genomic studies reveal that some cancer cells have innate mechanisms of resistance to chemotherapy, and undergo further transcriptional changes that enhance this resistance.

Single-cell studies have contradicted the idea of a stepwise evolution of cancer cells, with one mutation leading to another and ultimately tipping the scales toward malignancy. Instead, Navin’s studies reveal a punctuated evolution, whereby many cells simultaneously become genetically unstable. Longitudinal studies of single-cell samples in patients with triple-negative breast cancer are beginning to answer questions about how cancer cells evade treatment, showing that cells that survive chemotherapy have innate resistance, and then undergo further transcriptional changes during treatment, which increase resistance.

Translating these findings to the clinic is a longer-term process, but Navin envisions single-cell genomics will significantly impact strategies for targeted therapy, non-invasive monitoring, and early cancer detection.

Chemical Imaging in Biomedicine

Wei Min, a Blavatnik Awards Finalist in 2012 and 2019, concluded the session with a visually striking glimpse into the world of stimulated Raman scattering (SRS) microscopy. This noninvasive imaging technique provides both sub-cellular resolution and chemical information about living cells, while transcending some of the limitations of fluorescence-based optical microscopy. The probes used to tag molecules for fluorescent imaging can alter or destroy small molecules of interest, including glucose, lipids, amino acids, or neurotransmitters. Rather than using tags, SRS builds on traditional Raman spectroscopy, which captures and analyzes light scattered at the unique vibrational frequencies of the chemical bonds in biomolecules. The original method, pioneered in the 1930s, is slow and lacks sensitivity, but in 2008, Min and others improved the technique.

SRS has since become a leading method for label-free visualization of living cells, providing an unprecedented window into cellular activities. Using SRS and a variety of custom chemical tags—“vibrational tags,” as Min described them—bound to biomolecules such as DNA or RNA bases, amino acids, or even glucose, researchers can observe the dynamics of biological functions. SRS has visualized glucose uptake in neurons and malignant tumors, and has been used to observe fatty acid metabolism, a critical step in understanding lipid disorders. Imaging small drug molecules is notoriously difficult, but Min reported the results of experiments pairing SRS with vibrational tags on therapeutic drug molecules to study their activity within tissues.

Stimulated Raman scattering microscopy uses chemical tags to image small biological molecules in living cells. The technique can visualize cellular processes including glucose uptake in healthy cells and tumor cells.

A recent breakthrough in SRS technology involves pairing it with Raman dyes to break the “color barrier” in optical imaging. Because fluorophores have broad emission spectra, fluorescence imaging is limited to five or six distinguishable colors per sample, which prevents researchers from imaging many structures within a tissue sample simultaneously. Min has introduced a hybrid imaging technique that allows for super-multiplexed imaging—up to 10 colors in a single cell image—and utilizes a dramatically expanded palette of Raman frequencies that yield at least 20 distinct colors.

Further Readings

Navin

Kim C, Gao R, Sei E, et al.

Chemoresistance Evolution in Triple-Negative Breast Cancer Delineated by Single-Cell Sequencing.

Cell. 2018 May 3;173(4):879-893.e13.

Casasent AK, Schalck A, Gao R, et al.

Multiclonal Invasion in Breast Tumors Identified by Topographic Single Cell Sequencing.

Cell. 2018 Jan 11;172(1-2):205-217.e12.

Gao R, Davis A, McDonald TO, et al.

Punctuated copy number evolution and clonal stasis in triple-negative breast cancer.

Nat Genet. 2016 Oct;48(10):1119-30.

Wang Y, Navin NE.

Advances and applications of single-cell sequencing technologies.

Mol Cell. 2015 May 21;58(4):598-609.

Navin NE.

Cancer genomics: one cell at a time.

Genome Biol. 2014 Aug 30;15(8):452.

Wang Y, Waters J, Leung ML, et al.

Clonal evolution in breast cancer revealed by single nucleus genome sequencing.

Nature. 2014 Aug 14;512(7513):155-60.

Min

Xiong H, Shi L, Wei L, et al.

Stimulated Raman excited fluorescence spectroscopy and imaging.

Nat Photonics. 2019;13:412-417.

Xiong H, Qian N, Miao Y, et al.

Stimulated Raman Excited Fluorescence Spectroscopy of Visible Dyes.

J Phys Chem Lett. 2019 Jul 5;10(13):3563-3570.

Zhang L, Shi L, Shen Y, et al.

Spectral tracing of deuterium for imaging glucose metabolism.

Nat Biomed Eng. 2019 May;3(5):402-413.

Shen Y, Hu F, Min W.

Raman Imaging of Small Biomolecules.

Annu Rev Biophys. 2019 May 6;48:347-369.

Wei M, Shi L, Shen Y, et al.

Volumetric chemical imaging by clearing-enhanced stimulated Raman scattering microscopy.

Proc Natl Acad Sci U S A. 2019 Apr 2;116(14):6608-6617.

Shi L, Zheng C, Shen Y, et al.

Optical imaging of metabolic dynamics in animals.

Nat Commun. 2018 Aug 6;9(1):2995.

Darwin’s Dilemma: The Origin and Evolution of the Eye


Award-winning science writer Carl Zimmer explains the “creation” of the organ so complex that it baffled even Darwin.

Published October 1, 2019

By Carl Zimmer

“The eye to this day gives me a cold shudder,” Charles Darwin once wrote to a friend.

If his theory of evolution was everything he thought it was, a complex organ such as the human eye could not lie beyond its reach. And no one appreciated the beautiful construction of the eye more than Darwin—from the way the lens was perfectly positioned to focus light onto the retina to the way the iris adjusted the amount of light that could enter the eye. In The Origin of Species, Darwin wrote that the idea of natural selection producing the eye “seems, I freely confess, absurd in the highest possible degree.”

For Darwin, the key word in that sentence was seems. If you look at the different sorts of eyes out in the natural world and consider the ways in which they could have evolved, Darwin realized, the absurdity disappears. The objection that the human eye couldn’t possibly have evolved, he wrote, “can hardly be considered real.”

Dozens of Different Kinds of Eyes

Today evolutionary biologists are deciphering the origins of not just our own eyes but the dozens of different kinds of eyes that animals use. Fly eyes are built out of columns. Scallops have a delicate chain of eyes peeking out from their shells. Flatworms have simple light-sensitive spots. Octopuses and squids have camera eyes like we do, but with some major differences. The photoreceptors of octopuses and squids point out from the retina, towards the pupil. Our own eyes have the reverse arrangement. Our photoreceptors are pointed back at the wall of the retina, away from the pupil.

For decades, most scientists argued that these different eyes evolved independently. The earliest animals that lived over 600 million years ago were thought to be eyeless creatures. As their descendants branched out into different lineages, some of them evolved their own kinds of eyes. It now turns out, however, that this is not really true.

All eyes, in all their wonderful variety, share an underlying unity in the genes used to build them. By tracing the history of these shared genes, scientists are uncovering how complex eyes evolved through a series of intermediate forms.

Opsins in Common

When light enters your eye, it strikes a molecule known as an opsin. Opsins sit on the surface of photoreceptor cells, and when they catch photons, they trigger a series of chemical reactions that causes the photoreceptor to send an electrical message towards the brain.

Biologists have long known that all vertebrates carry the same basic kind of opsin in their eyes, known as a c-opsin. All c-opsins have the same basic molecular shape, whether they’re in the eye of a shark or the eye of a hummingbird. All c-opsins are stored in a stack of disks, each of which grows out of a hair-like extension of the photoreceptor cell called a cilium.

In all vertebrates, c-opsins relay their signal from the stack of disks through a pathway of proteins called the phosphodiesterase pathway. All of these homologies suggest that c-opsins were present in the common ancestor of all living vertebrates.

Vertebrates belong to a much larger group of species known as bilaterians—in other words, animals that develop a left-right symmetry. The main lineage of these other bilaterians, known as protostomes, includes millions of species, ranging from insects to earthworms and squid.

Protostome eyes don’t have the c-opsins found in vertebrates. Instead, protostomes build another molecule, known as an r-opsin. Instead of keeping r-opsins in a stack of disks, they store r-opsins in foldings in the membranes of photoreceptors. R-opsins all send their signals through the same pathway of proteins, a pathway distinct from the one c-opsins use in vertebrates.

Humans Also Make R-Opsins

These similarities in the r-opsins suggest they evolved in the common ancestor of protostomes, only after their ancestors had branched off from the ancestors of vertebrates. Likewise, vertebrates only evolved c-opsins in their eyes after the split. In recent years, however, evolutionary biologists have discovered opsins where they weren’t supposed to be.

It turns out, for example, that humans also make r-opsins. We just don’t make them on the surfaces of photoreceptors where they can catch light. Instead, r-opsins help to process images captured by the retina before they’re transmitted to the brain.

In 2004, Detlev Arendt of the European Molecular Biology Laboratory and his colleagues also found c-opsins where they weren’t supposed to be. They were probing the nervous system of an animal known as a ragworm, which captures light with r-opsins. Arendt and his colleagues discovered a pair of organs atop the ragworm’s brain that grew photoreceptors packed with c-opsins.

Arendt sequenced the gene for the ragworm c-opsins and compared it with genes for other opsins. He found that it is more closely related to the genes for c-opsins in our own eyes than it is to the genes for r-opsins in the ragworm’s own eyes. These findings have led Arendt and other researchers to revise their hypothesis about the origin of opsins: the common ancestor of all bilaterians must already have had both kinds of opsins.

Clues from Cnidarians

But Todd Oakley, a biologist at the University of California at Santa Barbara, wondered if opsins might be even older. To find out, Oakley and his colleagues turned to the closest living relatives of bilaterians. Known as the cnidarians, this lineage includes jellyfish, sea anemones, and corals.

Adapted with permission from The Tangled Bank: An Introduction to Evolution, by Carl Zimmer (copyright 2010, Roberts & Company, Greenwood Village, CO).

Biologists have long known that some cnidarians can sense light. Some jellyfish even have eye-like organs that can form crude images. In other ways, though, cnidarians are radically different from bilaterians. They have no brain or even a central nerve cord, for example. Instead, they have only a loose net of nerves. These dramatic differences had led some researchers to hypothesize that bilaterians and cnidarians had evolved eyes independently. In other words, the common ancestor of cnidarians and bilaterians did not have eyes.

In recent years, scientists have sequenced the entire genomes of two species of cnidarians, the stellar sea anemone (Nematostella vectensis) and a freshwater hydra (Hydra magnipapillata). Scanning their genomes, Oakley and his colleagues discovered that both species of cnidarians have genes for opsins—the first time opsin genes had ever been found in a nonbilaterian. The scientists carried out experiments on some of these genes and discovered that they are expressed in the sensory neurons of the cnidarians. Oakley’s research suggests that, as he had suspected, opsins evolved much earlier than bilaterians.

How Opsins Evolved

With discoveries from scientists such as Oakley and Arendt, we can start to get a sense of how opsins evolved. Opsins belong to a family of proteins called G-protein coupled receptors (GPCRs). They’re also known as serpentine proteins, for the way they snake in and out of cell membranes. Serpentine proteins relay many different kinds of signals in the cells of eukaryotes. Yeast cells use them to detect odorlike molecules called pheromones released by other yeast cells. Early in the evolution of animals, a serpentine protein mutated so that it could detect a new kind of signal: light.

At some point, the original opsin gene was duplicated. The two kinds of opsins may have carried out different tasks. One may have been sensitive to a certain wavelength of light, for example, while the other tracked the cycle of night and day. When cnidarians and bilaterians diverged, perhaps 620 million years ago, they each inherited both kinds of opsins. In each lineage, the opsins were further duplicated and evolved into new forms. And thus, from a single opsin early in the history of animals, a diversity of light-sensing molecules has evolved.

The Crystalline Connection

The earliest eyes were probably just simple eyespots that could only tell the difference between light and dark. Only later did some animals evolve spherical eyes that could focus light into images. Crucial to these image-forming eyes was the evolution of lenses that could focus light. Lenses are made of remarkable molecules called crystallins, which are among the most specialized proteins in the body. They are transparent, and yet can alter the path of incoming light so as to focus an image on the retina. Crystallins are also the most stable proteins in the body, keeping their structure for decades. (Cataracts are caused by crystallins clumping late in life.)

It turns out that crystallins also evolved from recruited genes. All vertebrates, for example, have crystallins in their lenses known as α-crystallins. They started out not as light-focusing molecules, however, but as a kind of first aid for cells. When cells get hot, their proteins lose their shape. Cells use so-called heat-shock proteins to cradle overheated proteins so that they can still carry out their jobs.

Scientists have found that α-crystallins not only serve to focus light in the eye, but also act as heat-shock proteins in other parts of the body. This evidence indicates that in an early vertebrate, a mutation caused α-crystallins to be produced on the surface of its eyes. The protein turned out to have the right optical properties for bending light. Later mutations fine-tuned α-crystallins, making them better at their new job.

The Evolution of the Vertebrate Eye

Vertebrates also produce other crystallins in their eyes, and some crystallins are limited to only certain groups, such as birds or lizards. And invertebrates with eyes, such as insects and squid, make crystallins of their own. Scientists are gradually discovering the origins of all these crystallins. It turns out that many different kinds of proteins have been recruited, and they all proved to be good for bending light.

In 2007, Trevor Lamb and his colleagues at Australian National University synthesized these studies and many others to produce a detailed hypothesis about the evolution of the vertebrate eye. The forerunners of vertebrates produced light-sensitive eyespots on their brains that were packed with photoreceptors carrying c-opsins. These light-sensitive regions ballooned out to either side of the head, and later evolved an inward folding to form a cup.

Early vertebrates could then do more than merely detect light: they could get clues about where the light was coming from. The ancestors of hagfish branched off at this stage of vertebrate eye evolution, and today their eyes offer some clues to what the eyes of our own early ancestors would have looked like.

The Evolution Doesn’t Stop

After hagfish diverged from the other vertebrates, Lamb and his colleagues argue, a thin patch of tissue evolved on the surface of the eye. Light could pass through the patch, and crystallins were recruited into it, leading to the evolution of a lens. At first the lens probably only focused light crudely. But even a crude image was better than none. A predator could follow the fuzzy outline of its prey, and its prey could flee at the fuzzy sight of its attackers. Mutations that improved the focusing power of the lens were favored by natural selection, leading to the evolution of a spherical eye that could produce a crisp image.

The evolution of the vertebrate eye did not stop there. Some vertebrates evolved the ability to see in the ultraviolet. Some species of fish evolved double lenses, which allowed them to see above and below the water’s surface at the same time. Vertebrates adapted to seeing at night and in the harsh light of the desert. Salamanders crept into caves and ended up with tiny vestiges of eyes covered over by skin. But all those vertebrate eyes were variations on the same basic theme established half a billion years ago.


About the Author

Carl Zimmer is a lecturer at Yale University, where he teaches writing about science and the environment. He is also the first Visiting Scholar at the Science, Health, and Environment Reporting Program at New York University’s Arthur L. Carter Journalism Institute.

Zimmer’s work has been anthologized in both The Best American Science Writing series and The Best American Science and Nature Writing series. He has won numerous fellowships, honors, and awards, including the 2007 National Academies Science Communication Award for “his diverse and consistently interesting coverage of evolution and unexpected biology.”

His books include Soul Made Flesh, a history of the brain; Evolution: The Triumph of an Idea; At the Water’s Edge, a book about major transitions in the history of life; The Smithsonian Intimate Guide to Human Origins; and Parasite Rex, which the Los Angeles Times described as “a book capable of changing how we see the world.”

His newest book, The Tangled Bank: An Introduction to Evolution, will be published this fall to coincide with the 150th anniversary of the publication of The Origin of Species.

Developing Practical Solutions to Everyday Challenges


The Academy works with partners in industry, academia and government to develop solutions for everyday challenges.

Published October 1, 2019

By Robert Birchard


For more than a decade the Academy has worked with partners in industry, academia and government to identify solutions to everyday challenges through its innovation challenges.

“These challenges provide a platform for people to hone their STEM skills on a level playing field — no lab, credentials or financial commitment required — and apply them in an interdisciplinary, real world environment,” explains Chenelle Bonavito Martinez, MS, Vice President, STEM Talent Programs.

Challenges are not just about working on a solution to a problem. They also provide an opportunity for students to practice time and project management, as well as communication and presentation skills.

Lessening the Impact of Wildfires

In one such challenge, a team of five Junior Academy students from five different countries devised a solution to lessen the impact of wildfires.

“Not only do [wildfires] destroy homes, they also halt local economies, raze whole habitats, injure and kill many, send carcinogens into the air, and so much more,” says Matt Friedman, 16, United States, a member of the winning Wildfire team. “Understanding the factors related to real-world problems can help us solve them.”


The team looked at how best to counter wildfire embers and maintain an adequate water supply in pumping stations without electricity. In addition to the scientific and engineering questions, the group also grappled with questions of cost-effectiveness and how to implement their solution in existing communities.

“I think it is really easy to fall into the trap of putting science into neat little boxes where each idea or development belongs in its own discipline,” says Wildfire team member Isabelle Robertson, 18, New Zealand. “But the real world isn’t like that and global problems require us to use collaborative approaches and tie aspects of different disciplines into one solution.”

Devising Healthier Snack Options

Rubi Lopez, of the Monterrey Institute of Technology and Higher Education, and Bianka Martinez, of the Technological Institute of Morelia, were completing their undergraduate degrees when they won the PepsiCo Healthy Snack Challenge, devising a healthy snack that would appeal to children. Their solution required not just extensive nutrition research, but also thorough market research.


“My experience with this challenge expanded my vision of the food industry and focused my attention on creating bigger impact in the world,” says Martinez, a biochemical engineer who recently finished a Master’s degree in Food Technology and Innovation at the Polytechnic School of Design in Milan, Italy.

“The best way to solve worldwide problems is by applying scientific skills combined with creative and design skills. Science lays the foundations, the procedures and the means to solve problems, while design thinking helps us create innovative and unique solutions by focusing on people,” says Martinez.

“Scientific skills are like a yellow brick road that lead you to the truth. You don’t know if Oz is near or far, but you know you’re on the right path,” echoes Lopez, an international business major. “I participated in this challenge despite it not being directly related to my major. I thought my skills could be useful and that this challenge offered the opportunity to learn new things. It’s not necessary to have a science degree to generate solutions to real problems, but critical thinking and constant curiosity are always necessary to make a positive change.”


“The tools and techniques of science help people make breakthrough discoveries in understanding phenomena,” says Bhavna Mehra, General Manager, Infosys Science Foundation. “Therefore, science and its pursuers and practitioners have the responsibility, along with the vision, to solve these problems.”

A Real-World Scenario

This belief in the responsibilities of a scientist led to the development of the Infosys Science Foundation Nutrition Challenge. Originally envisioned as a way to raise awareness about the number of deaths attributed to malnutrition in children under the age of five, the challenge also gave participants a platform on which to develop and test their skills.

“The skills of observing, experimenting, data collection and applying a concept in a real-world scenario were all tested as the solvers worked on the nutrition challenge,” explains Mehra.

The top two teams — team Podible and team Nutri-APP — came up with their own hypotheses, collected data and applied the results to come up with executable plans to tackle malnutrition.

“Cultivating an understanding and practice of scientific thinking in all fields will go a long way in helping solve social, economic and civic issues,” says Mehra.

Advancing Science in an App-Driven World

Apps and other digital platforms have become part of our daily lives for everything from social interaction to ordering dinner. These technologies are also providing intriguing opportunities to accelerate the use of science to improve our daily lives.

Published June 1, 2019

By Jennifer L. Costley and Chenelle Bonavito Martinez


According to the Pew Research Center, 77 percent of all Americans own smartphones. For the 18-to-29 set, this number increases to 93 percent and continues to rise. According to analysts who track such things, the number of apps downloaded daily across iOS and Google Play has reached 300 million, and the average number of apps downloaded to every iPhone/iPod touch and iPad is more than 60.

So it is safe to say that we are increasingly living in an app-driven world and that digital technology is now an integral part of how most of us manage our time and lives. Science is no exception — digital technologies are providing intriguing opportunities to accelerate the use of science to improve our daily lives.

This exciting trend is underlined by recent 5G announcements from Verizon and AT&T. The impact of 5G (fifth-generation wireless connectivity) has yet to be felt, but with transmission speeds much faster than current capabilities and a capacity for many more devices to connect simultaneously, it is clear that 5G is poised to transform our world.

A Network of “Solvers” from Around the Globe

Here at the Academy, the transformation has already begun. Virtual, cloud-based innovation challenges — sponsored by some of the world’s most dynamic companies — are enabling us to tap into a network of “solvers” from around the globe. Thus far, Academy challenges have generated potentially groundbreaking ideas on topics ranging from future aircraft design to wildfire management, alternative energy sources and sustainable urban development, to name a few.

One recent example, sponsored by aerospace giant Lockheed Martin, was “Disruptive Ideas for Aerospace and Security.” In this challenge, researchers were invited to submit ideas for novel innovations utilizing autonomy, human augmentation or blockchain technologies. The entries included an extraordinary range of truly game-changing ideas, some with the potential to upend the aerospace industry.

And researchers are not the only ones getting involved. In the “Future of Buildings and Cities Challenge,” young people from around the world were invited to develop sustainable building concepts for future urban landscapes. The winners, six gifted teens from five countries, collaborated virtually to develop an ingenious “green” building design that incorporated a water recycling system, solar roof panels and “green walls” (a collection of vines, leaf twiners and climbers on a grid-like support to help purify the air and provide additional insulation). The concept also featured an ingenious “home assistant,” leveraging a series of indoor sensors to detect occupancy, light intensity, temperature, humidity and air quality, an idea that 5G connectivity could soon enable.
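The students’ published concept stops at the idea level; purely as an illustrative sketch, with invented sensor names, thresholds, and rules, a home assistant of the sort they describe might aggregate indoor readings along these lines:

    from dataclasses import dataclass

    # Hypothetical reading from the indoor sensors the students describe;
    # field names, units, and thresholds are invented for this sketch.
    @dataclass
    class RoomReading:
        occupied: bool
        light_lux: float
        temperature_c: float
        humidity_pct: float
        air_quality_index: int

    def suggest_actions(reading: RoomReading) -> list[str]:
        """Turn one reading into simple energy- and comfort-oriented suggestions."""
        actions = []
        if not reading.occupied and reading.light_lux > 50:
            actions.append("dim lights in unoccupied room")
        if reading.occupied and reading.temperature_c > 26:
            actions.append("increase ventilation or shading")
        if reading.air_quality_index > 100:
            actions.append("route air past the green wall for filtering")
        return actions

    # Example: an empty, warm room with poor air quality.
    print(suggest_actions(RoomReading(False, 300.0, 27.5, 55.0, 120)))

A real design would push such readings over the network to a central assistant rather than deciding locally, which is where the high device density promised by 5G becomes relevant.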

Artificial Intelligence

But 5G is not the only game-changing technology at play. The field of artificial intelligence (AI) has also made astounding progress over the past decade. Machine learning and natural language processing are particularly dynamic subfields of AI, with the potential to revolutionize critical elements of the economy, including the media, finance, and healthcare sectors.

That’s why the Academy will be building upon the success of our annual Machine Learning Symposium to launch a new symposium series on natural language, dialog and speech in November of this year. We’re also thrilled that Yann LeCun, Chief AI Scientist at Facebook, and Manuela Veloso, Head of AI Research at J.P. Morgan, have agreed to serve as honorary chairs for the launch of a new initiative on applications of AI to critical sectors of the New York City economy.

We stand at the forefront of a massive shift in how society compiles, shares and learns from massive data sets. But there are serious obstacles to overcome before we can unlock the potential of digital technology, AI, and big data to drive positive change. As advocates of evidence-based policy and decision-making, we in the scientific community must be at the forefront of efforts to ensure these new technologies are used to the benefit of humankind, and the planet upon which we live.

Science and Social Media: #facepalm or #hearteyes?

Beneath all the negative noise, science can flourish on social media, but users must be diligent, measured, and ethical with how they use this powerful platform.

Published June 1, 2019

By Kari Fischer, PhD


Somewhere in between those halcyon days of Facebook as a friendly college social media network and the acrimonious 2016 elections, meme-filled newsfeeds took over, and social media sites like Facebook, Twitter, YouTube and Pinterest transformed into new express lanes for the spread of misinformation. This development feels especially glaring in science.

As the use of social media expanded, it also became a major source for news and information. A 2018 Pew Research Center study found that 68 percent of American adults get news through social media sites. That change held not only for politically themed content, but for science too. Another 2018 Pew study found that most users report seeing science-related posts, and 33 percent view these platforms as a source for science news. Millions follow science-related pages on social media, with the most popular pages including National Geographic, IFL Science, NASA, and ScienceAlert.

As news sources become increasingly fractured, it is difficult to dig through the mountains of contradictory articles, especially when we are asked to evaluate highly technical subjects that might be communicated poorly — sometimes intentionally so. The aforementioned list of influential “science-related” pages also includes those whose basis in empirical data is more loosely defined, like that of Dr. Mehmet Oz. In 2014 he was called before Congress for promoting sham supplements, and recently tweeted about the link between astrology and health. His page has over 5.5 million followers.

Flawed information has a way of spreading quickly. Of the 100 most shared health-related articles in 2018, over half of the articles contained misleading or exaggerated statements, or even outright falsehoods. Some of those articles even came from reputable news sources.

The Pervasiveness of False Information

The pervasiveness of false information on social media may translate to an effect on public health. When measles outbreaks increased 30 percent worldwide, vaccine misinformation on the internet took center stage. A recent study in the United Kingdom from the Royal Society for Public Health found that 50 percent of parents with young children were exposed to negative messages about vaccines on social media.

This did not happen entirely organically. Russian trolls engaged not only in spreading political falsehoods, but they heightened the debate around vaccines too. A study analyzing tweets from 2014 to 2017 revealed that known Russian accounts tweeted about vaccines at higher rates than average users. The content of their tweets presented both pro- and anti-vaccine messages, a known tactic that amplifies a sense of “debate” and therefore propagates a sense of uncertainty.

Why are these misleading posts so attractive? Dominique Brossard, professor and chair in the Department of Life Sciences Communication at the University of Wisconsin-Madison, pulls no punches in her assessment, “They’re using all the strategies that unfortunately the scientific community has not been using.” She emphasizes that they exploit the most fundamental driver of whether or not information is accepted: trust. “What are the main things that build trust? Concern, care and honesty.” Or at least the perception of honesty.

The strength of these tactics can be especially heightened when they are insulated from outside influence. Many organizations against vaccines structure their Facebook groups so that they are closed or private, allowing for misinformation to be stated entirely unchecked and out of the public eye.

The Effect on Public Opinion

But, as all good scientists know, correlation does not equal causation. The pervasiveness of false information does not mean that there is a straight line of causality to an effect on public opinion. “It’s hard to quantify the effects of misinformation,” Brossard cautions. That same 2018 Pew study revealing 68 percent of American adults getting news on social media also stated that 57 percent expect the news they see to largely be inaccurate.

The public may also be changing how they’re interacting with social media. After the 2016 elections and the Cambridge Analytica scandal, some users needed a pause. On Facebook, 54 percent of adults modified their use in 2018: adjusting their privacy settings, deleting the app from their cellphone, or even taking extended breaks.

Social media companies are also modifying their approach. Pinterest blocked users from searching for vaccine-related terms. YouTube removed advertisements from anti-vaccine themed videos, and recently pledged to curb the spread of misinformation by modifying its recommendation algorithms — hopefully preventing users from following conspiracy-laden video rabbit holes.

And in spite of all the misleading content, which prompts all scientists to reply #headdesk or #facepalm — that’s social media speak for frustration or exasperation — there are many exciting online communities that may provide some redemption for these platforms.

Recognizing the opportunity to cater to the sci-curious, experts in science outreach jumped online as a way to spread a passion for science. YouTube accounts like AsapSCIENCE and Physics Girl have millions of subscribers, and take the time to break down complex subjects for their audiences.

Scientists and Instagram

On Instagram, science.sam is the account of Samantha Yammine, who uses the platform as a new line of communication with the public. While earning her PhD, she shares her daily life as a researcher through photos and videos both in and outside of the lab, with a humanizing effect. She also contributes to a research study nicknamed #ScientistsWhoSelfie, which is systematically exploring how scientists’ Instagram posts influence public perception of scientists.

Social media also provides a megaphone to amplify diverse voices in science, and remove hierarchies that exist offline. The accounts belonging to #VanguardSTEM link to live, monthly interviews with both “emerging and established women of color in STEM,” where they cover research, career advice and social commentary.

Kyle Marian Viterbo, social media manager at Guerilla Science and producer of The Symposium: Academic Stand-Up, cites her experience in biological anthropology groups on Facebook as some of the earliest examples of social forums for scientific discussion, where status and titles were stripped away. “We talked about papers and coverage of papers in depth, in a way that only an academic community can. It’s been an amazing experience to see that community grow, and add new scientists who have equal conversation power with folks who are emeritus professors.”

Scientists and Twitter

A 2017 study estimated that over 45,000 scientists use Twitter. From volcanologists, to climate scientists, to evolutionary biologists, they’re all online in a professional capacity. There, they share new papers, announce job openings in their labs, comment on published research and network with other scientists both in and outside of their field.

For science professionals who feel emboldened to get online, but don’t know how, Viterbo advises easing your way in, “My number one advice is to just lurk. You’re silent, you’re observing, it’s almost like an ethnography situation…you don’t have to be active. A lot of it is also getting to know what you want out of that experience, and you don’t really know that until you see other people doing it well, and it resonates.”

Once your field observations are complete, Viterbo says it’s time to experiment with a few posts, “You just have to play in this space, and allow yourself to make a few mistakes.” She reminds scientists that we have the instincts for learning how to do well, but we can also get out of our own way, “Apply the scientific method to communication and social media, but also be more forgiving. We’re not necessarily the most forgiving of ourselves in science, but do it for fun!”

Communication Works Both Ways

If you plan on venturing into social media with an agenda in mind, perhaps take a cue from Tamar Haspel, a science journalist who writes the award-winning Washington Post column Unearthed. She spends much of her time researching controversial topics like pesticides, GMOs and diet recommendations, and cautions scientists to remember that “communication works both ways.”

Haspel makes a point to read thoughtful discussions from all sides, even on Twitter, “I have smart people with wildly different views in my feed, and I pay attention when they post something, because of course when we see something that we don’t want to believe we have a tendency to just scroll down. I try to stop, click through, and listen.” Her own posts are comprehensive explainers on the complex science of agriculture, and she also readily self-corrects and engages politely on divisive topics.

The result has positioned her as a trustworthy source for information. Haspel’s number one piece of advice for scientists who want to achieve the same? “We need to think less about being persuasive, and think more about being persuadable.”


Climate Science: Decision-Making in a Warmer World

Overview

Climate change is a growing threat with global impact. Shifts in the climate present special challenges for urban areas where more than half of the world’s population lives. New York City residents, for example, are already feeling the effects through recurrent flooding in coastal communities, warmer temperatures across all five boroughs, and strains in the city’s infrastructure during heavy downpours and extreme weather events. As a result, cities like New York require the best-available climate science to develop tangible policies for resilience, mitigation, and adaptation.

On March 15, 2019, climate scientists, city planners, and community and industry stakeholders attended the Science for Decision-Making in a Warmer World summit at the New York Academy of Sciences to discuss how cities are responding to the effects of climate change. The event marked the 10th anniversary of a successful partnership between the New York City Panel on Climate Change (NPCC), the City of New York, and the New York Academy of Sciences. Established in 2008, the NPCC has opened new frontiers of urban climate science to build the foundation for resiliency actions in the New York metropolitan region.

Learn about the NPCC’s latest research findings and their implications for New York City and other cities seeking to identify and mitigate the effects of climate change in this summary.

Meeting Highlights

  • NPCC research provides tools to inform and shape climate change resilience in New York City and other cities around the globe. 
  • Shifts in mean and extreme climate conditions significantly impact cities and communities worldwide. 
  • Cities can move forward by adopting flexible adaptation pathways, an overall approach to developing effective climate change adaptation strategies for a region under conditions of increasing risk.
  • There is a growing recognition that resilience strategies need to be inclusive of community perspectives.

Speakers

Dan Bader
Columbia University, New York City Panel on Climate Change

Jainey Bavishi
New York City Mayor’s Office of Recovery and Resiliency

Sam Carter
Rockefeller Foundation

Alan Cohn
New York City Department of Environmental Protection

Kerry Constabile
Executive Office of the UN Secretary General

Susanne DesRoches
New York City Mayor’s Office of Recovery and Resiliency

Alexander Durst
The Durst Organization

Sheila Foster
Georgetown, New York City Panel on Climate Change

Vivien Gornitz
Columbia University, New York City Panel on Climate Change

Mandy Ikert
C40 Cities Climate Leadership Group

Klaus Jacob
Columbia University, New York City Panel on Climate Change

Michael Marrella
New York City Department of City Planning

Richard Moss
American Meteorological Society

Kathy Robb
Sive, Paget, and Riesel

Seth Schultz
Urban Breakthroughs

Daniel Zarrilli, PE
New York City Office of the Mayor

Climate Change, Science, and New York City

Speakers

Alan Cohn
New York City Department of Environmental Protection

Susanne DesRoches
New York City Mayor’s Office of Recovery and Resiliency

Alexander Durst
The Durst Organization

Michael Marrella
New York City Department of City Planning

Daniel Zarrilli (keynote)
New York City Office of the Mayor

James Gennaro (panel moderator)
New York State Department of Environmental Conservation

Keynote: Preparing for Climate Change — NPCC and Its Role in New York City

Daniel Zarrilli, of the New York City Office of the Mayor, gave the first keynote presentation. In addition to outlining NPCC history, he emphasized the meaning of NPCC to the city. NPCC has provided the tools to inform policy since before Hurricane Sandy in 2012. Because of NPCC, Zarrilli stated, people now know that the waters around New York City are rising “twice as quickly as the global average” and that climate change will affect communities disproportionately. The city can and will take on the responsibility to protect those who are most vulnerable.  Zarrilli highlighted steps the Mayor’s Office is taking: fossil fuel divestment, bringing a lawsuit against big oil for causing climate change, and launching a new OneNYC strategic plan to confront our climate crisis, achieve equity, and strengthen our democracy. He concluded by saying that with “8.6 million New Yorkers and all major cities watching,” NPCC is providing the best possible climate science to drive New York City policy.

Panel 1: NPCC and Its Role in New York City

How are NPCC findings used in developing resiliency in New York City?

The first panel was moderated by William Solecki of the Hunter College Institute for Sustainable Cities – City University of New York, and featured three city representatives: Susanne DesRoches, of the New York City Mayor’s Office of Recovery and Resiliency; Michael Marrella, of the New York City Department of City Planning; and Alan Cohn, of the New York City Department of Environmental Protection; as well as one industry stakeholder, Alexander Durst, of The Durst Organization.

DesRoches noted that the NPCC research has made possible a proliferation of guidelines regulating building design in the city. In fact, the New York City Climate Resiliency Design Guidelines, released the same day that the panel took place, provide instruction on how to use climate projections in the design of city buildings. The Department of City Planning also uses NPCC data in its Coastal Zone Management Program to require that coastal site developers disclose and address current and future flood risks. Marrella added that NPCC research tools allow public and private stakeholders to make informed decisions on how to shape policy. NPCC methods, approaches, and climate data are also being used for New York State and national projections.

Panelists also addressed how New York City’s mitigation goals enable resilience in the face of climate change challenges. DesRoches pointed to the city’s aggressive climate targets, including an “80% [emissions] reduction by 2050,” and a goal to limit temperature increase to 1.5°C, as targeted by the Paris Agreement (UN Climate Change 2015). She gave two examples of adaptations that align with the City’s mitigation goals: adopting high “passive house” and green building standards for a reduced carbon footprint; and diversifying how the city receives energy, including the development of a renewable energy grid. Cohn added that the Department of Environmental Protection aims to free up capacity through water conservation and to implement the use of methane as an energy source. With resilience in mind, Durst stressed that energy models should be uniform and based on the future, not just today.

Further Readings

Zarrilli

Wallace-Wells D. The Uninhabitable Earth: Life After Warming. New York: Tim Duggan Books; 2019.

Panel 1

UN Climate Change. The Paris Agreement: What is the Paris Agreement? December 2015.

NYC Mayor’s Office of Recovery and Resiliency. Climate Resiliency Design Guidelines. March 2019.

Wuebbles DJ, Fahey DW, Hibbard KA, Dokken DJ, et al. Climate Science Special Report: Fourth National Climate Assessment. U.S. Global Change Research Program, Washington, DC, USA; 2017:1-477.

Rosenzweig C, Solecki W, DeGaetano A, O’Grady M, et al. Responding to Climate Change in New York State: The ClimAID Integrated Assessment for Effective Climate Change Adaptation in New York State. Final report, NYSERDA; 2011:1-149.

Findings from the New York City Panel on Climate Change

Panelists

Dan Bader
Columbia University, New York City Panel on Climate Change

Sheila Foster
Georgetown, New York City Panel on Climate Change

Vivien Gornitz
Columbia University, New York City Panel on Climate Change

Klaus Jacob
Columbia University, New York City Panel on Climate Change

Julie Pullen (panel moderator)
Jupiter Intelligence

Panel 2: Latest Findings from the New York City Panel on Climate Change

What types of information are the most useful?

The second panel was moderated by Julie Pullen of Jupiter Intelligence, and featured four NPCC members who presented the latest NPCC3 report findings: Vivien Gornitz, Klaus Jacob, and Daniel Bader of Columbia University; and Sheila Foster, of Georgetown Law.

The latest NPCC3 findings confirmed the climate projections from the 2015 report as the projections of record for New York City planning and decision-making. For example, by the end of the century, “ocean levels will be higher than they are now due to thermal expansion; changes in ocean heights; loss of ice from Greenland and Antarctic Ice Sheets; land-water storage; vertical land movements; and gravitational, rotational, and elastic ‘fingerprints’ of ice loss,” said Gornitz. Under the NPCC’s new Antarctic Rapid Ice Melt (ARIM) scenario, there could be up to a 9.5 ft. rise in sea level by 2100 at the high end of the projections. The new report advises that levees or raised streets might reduce the effects that sea level rise will have on New York City’s coastline.

Vulnerability to climate change varies by neighborhood and socioeconomic status. Foster presented a new three-dimensional approach to community-based adaptation through the lens of equity: distributional, contextual, and procedural. Distributional equity emphasizes disparities across social groups, neighborhoods, and communities in vulnerability, adaptive capacity, and the outcomes of adaptation actions. Contextual equity emphasizes social, economic, and political factors and processes that contribute to uneven vulnerability and shape adaptive capacity. Procedural equity emphasizes the extent and robustness of public and community participation in adaptation planning and decision-making.

Echoing Mayor Bloomberg’s sentiment that “if you can’t measure it, you can’t manage it,” Jacob presented the proposed NPCC New York City Climate Change Resilience Indicators and Monitoring system (NYCLIM). Through the new proposed NYCLIM system, NPCC recommends climate, impact, vulnerability, and resilience indicators for the City’s decision-making processes.


Cities as Solutions for Climate Change and Closing Remarks

Keynote Speaker and Panelists

Jainey Bavishi
New York City Mayor’s Office of Recovery and Resiliency

Sam Carter
Rockefeller Foundation

Kerry Constabile
Executive Office of the UN Secretary General

Seth Schultz
Urban Breakthroughs

Mandy Ikert (keynote)
C40 Cities Climate Leadership Group

Richard Moss (panel moderator)
American Meteorological Society

Keynote: Role of Cities in Achieving Progress

Mandy Ikert, of C40 Cities Climate Leadership Group, gave the second keynote presentation. The Future We Don’t Want, a study recently released by C40, the Urban Climate Change Research Network (UCCRN), and Acclimatise, found that billions of urban citizens are at risk of climate-related heat waves, droughts, floods, food shortages, and blackouts by 2050 (UCCRN 2018). Cities are situated at the forefront of these effects and urgently need to respond. Ikert stated that “we live in an urbanizing world,” where 68% of the world’s population will be living in cities by 2050, up from approximately 54% today. Ikert stressed that “mayors and city agencies are directly accountable to their constituency” to protect and preserve their lives and livelihoods. She also urged cities to reach out to researchers to obtain accurate modeling for extreme events. Cities have the potential to account for 40% of the emissions reductions required to align with the Paris Agreement’s goal to limit temperature rise to 1.5°C (UN Climate Change 2015). Therefore, the way a city responds to climate change, Ikert said, determines how livable and competitive it will be in the future.

Panel 3: City Stakeholders and Beyond

How can knowledge networks and city networks improve interactions to achieve climate change solutions?

The final panel was moderated by Richard Moss of the American Meteorological Society, and featured Corinne LeTourneau, of 100 Resilient Cities (North America Region); Kerry Constabile, of the Executive Office of the UN Secretary General; Jainey Bavishi, of the New York City Mayor’s Office of Recovery and Resiliency; and Seth Schultz, of Urban Breakthroughs. The panelists spoke about the enormous value and knowledge of stakeholders.

In this session, all of the participants highlighted that many cities are playing a critical role in meeting the challenge of climate change, both by reducing their own greenhouse gas footprints and by updating infrastructure and programs to meet the needs of their citizens as climate change impacts occur.

Panelists discussed how finances are a major challenge to addressing climate change. For example, Constabile noted that only a small percentage of megacities in developing countries have credit ratings. This lack of “creditworthiness” hinders cities from raising their own bonds and attracting private investment, both of which are significant sources of funding for climate-related projects. Schultz suggested that private money may jumpstart some climate resiliency and adaptation efforts, and stated that eight of the world’s ten largest countries are funding research on climate change. LeTourneau and Schultz noted that without the climate data to assess risks, money will not be directed to the areas of greatest need. LeTourneau highlighted the importance of describing how climate change affects risks and “the bottom line” in a way that decision makers and citizens find compelling and relatable.

Panelists also highlighted that climate does not have boundaries, but government bodies do. As Bavishi pointed out, New York City is lucky that climate change adaptation has been codified into law. Chief resilience officers are retained even after city funding is spent, so continuity is in place. City governments around the country and the globe are following suit, but as the panelists pointed out, these ideas should spread more widely.

Closing Remarks

NPCC member Michael Oppenheimer remarked that the NPCC offers a “local picture at granular level with the best possible science.” Hurricane Sandy taught the City about its vulnerability and drove research on flood tides and rising coastal tides. With the 2010 NPCC report, he said, a firm research agenda was drafted that shifted the City’s view of climate change to resiliency. Oppenheimer stressed that NPCC science is useful for policy and praised New York City for utilizing NPCC data in policy decisions. In closing, Oppenheimer said that dissemination assures that communities worldwide are able to use NPCC data.

Further Readings

Ikert

Rosenzweig C, Solecki W, Romero-Lankao P, Mehrotra S, et al. Climate Change and Cities: Second Assessment Report of the Urban Climate Change Research Network. Cambridge: Cambridge University Press; 2018.

United Nations, Department of Economic and Social Affairs, Population Division. World Urbanization Prospects: The 2018 Revision, Online Edition. 2018.

Moss RH, Avery S, Baja K, Burkett M, et al. Evaluating Knowledge to Support Climate Action: A Framework for Sustained Assessment. Weather, Climate, and Society; April 2019.

The New York City Mayor’s Proclamation

Whereas: Global issues are often felt most deeply at the local level, and in the face of worldwide threats to our environment, infrastructure, and economy, cities have the power and responsibility to lead our planet in the right direction.  After Hurricane Sandy, when the devastating effects of climate change hit home for far too many of our residents, New York City reaffirmed our commitment to building a sustainable path forward.  On the 10th anniversary of its founding, it is a great pleasure to recognize the New York City Panel on Climate Change for its exceptional leadership in this work.

Whereas: Since 2008, the NPCC’s innovations in urban climate science have propelled New York to the forefront of the global fight against climate change.  Its recommendations have informed ambitious policies that have helped the five boroughs recover from past damage and emerge stronger, and its successful partnership with the City of New York and the New York Academy of Sciences demonstrates the power of collaboration between the public sector, industry and local leaders, and the scientific community.  With the NPCC’s guidance, we are better prepared to anticipate and conquer the climate challenges that lie ahead.

Whereas: New Yorkers have always been known for their resiliency and boldness, and our city must meet concerns of this scale with solutions that are worthy of its residents. From increasing our coastal resiliency to pioneering a global protocol for cities to attain carbon neutrality by 2050, my administration remains steadfast in our efforts to protect people of all backgrounds from the impacts of climate change. As we continue to grapple with the grave risks that global warming poses, we are grateful to the NPCC for providing our city with the rigorous science needed to thrive in our rapidly changing world. Today’s Summit offers a wonderful opportunity to applaud this organization for a decade of service to New York City, and I look forward to the progress its members will continue to inspire in the years ahead.

Now therefore, I, Bill de Blasio, Mayor of the City of New York, do hereby proclaim Friday, March 15th, 2019, in the City of New York as:

 “NEW YORK CITY PANEL ON CLIMATE CHANGE DAY”

Proclamation of the Mayor of New York City

The Need for Centralized Info in Crisis Management


Junior Academy students develop an app that addresses the immediate mental health needs of those impacted by hurricanes and other traumatic natural crises, needs that are often worsened by the lack of centralized information during crisis scenarios.

Published May 1, 2019

By Mandy Carr

Four high school students from around the globe came together for the Junior Academy’s Natural Disasters: Relief & Recovery Challenge to create a solution that could help reduce future devastation. The team designed a response model that could be used for many types of disasters, not just hurricanes. They used Hurricane Katrina as their case study, with a focus on addressing mental health needs for those impacted.

In their analysis, the lack of central information is a common struggle for those responding to disasters. To address that struggle, the team determined that gathering critical information in high-risk and disaster-prone areas before disasters happen would provide a useful baseline for responders. To that end, they created a smartphone-based community survey app that can regularly collect information about residents’ financial and employment status, mindset, living habits, and mental health. These same survey tools could then be used again after a disaster to understand what has shifted. Additionally, the data might help responders assess how to tailor interventions and where critical needs and assets exist.
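The article does not describe the app’s internals; purely as a hypothetical sketch of the baseline-versus-post-disaster comparison the team envisions (all field names, scores, and data below are invented):

    from dataclasses import dataclass
    from statistics import mean

    # Hypothetical survey record; fields are illustrative, not from the team's app.
    @dataclass
    class SurveyResponse:
        resident_id: str
        employed: bool
        housing_stable: bool
        mental_health_score: int  # e.g., self-reported 1 (poor) to 10 (good)

    def summarize(responses):
        """Aggregate individual responses into community-level indicators."""
        return {
            "employment_rate": mean(r.employed for r in responses),
            "housing_stability": mean(r.housing_stable for r in responses),
            "avg_mental_health": mean(r.mental_health_score for r in responses),
        }

    def compare(baseline, post_disaster):
        """Show how each indicator shifted after the disaster."""
        return {key: post_disaster[key] - baseline[key] for key in baseline}

    # Example: one survey taken before the disaster and one taken after.
    baseline = summarize([
        SurveyResponse("a1", True, True, 7),
        SurveyResponse("a2", True, True, 8),
    ])
    post = summarize([
        SurveyResponse("a1", False, True, 4),
        SurveyResponse("a2", True, False, 5),
    ])
    # Negative shifts flag where interventions and assets are most needed.
    print(compare(baseline, post))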

The team’s winning solution was one of 40 submitted. It garnered them a trip to New York City for the Global STEM Alliance Summit, held in July 2019.

Meet the students and learn about why they feel passionate about their idea:

Luis G. Alvarez


Luis G. Alvarez, 17, is from Colegio Integral Mesoamericano Patzicia in Guatemala. He has personal experience with natural disasters following the eruption of Volcán de Fuego in 2018. He and his family were required to evacuate.

“I remember getting some tools and hearing something like rain falling on the fallen leaves,” said Alvarez. “At first, I didn’t recognize what it was, but once I put on my raincoat, I realized it was ashes and sand, not rain. I told my parents, and we quickly got into the car and left.”

This inspired him to participate in the Natural Disasters challenge.

Samiksha Raviraja


“Looking at the world around, there are events happening constantly,” said Samiksha Raviraja, 17, from Renaissance High School in Charlottesville, VA. “Some of the most haunting ones are those that happen in nature and result in great damage to communities. I wanted to be able to help in some way.”

It scared her to see the disasters that were happening across the globe on TV. The word “disaster” was what drew her to this challenge in particular.

“While procedures exist to help people in the best possible way to save their lives, not many procedures exist that look into the mental health of the patient after a disaster has happened,” said Raviraja. “With PTSD, it is possible for the trauma to be passed down to children.”

Eszter Varga


Natural disasters are something Eszter Varga, 19, from Szerb Antal High School in Budapest, Hungary, has always wanted to help resolve, especially because they are “becoming an emerging issue with climate change.”

“The part that really touched me and my fellow teammates, was the fact that post-Katrina, PTSD claimed thousands of lives,” said Varga.

“We discovered the mental health aspect of disaster relief is typically an untreated issue.”

Thuy Tran

Thuy Tran, 16, from Le Hong Phong High School for the Gifted in Nam Dinh, Vietnam, echoed the team’s desire to focus on mental health when creating their solution.

“Hurricane Katrina claimed many lives post-disaster because of rushed treatment ideas, poorly planned information flow, as well as lack of education and data gathering,” said Tran.

Citizen Science in the Digital Age: Eagle Eyes

Science is a tool for combatting disinformation and making informed decisions.

Published May 1, 2019

By Robert Birchard

The term “citizen science” first entered the Oxford English Dictionary in 2014. It describes a long-standing tradition of collaboration between professional and amateur scientists. Perhaps no field is as closely associated with citizen science as astronomy, where amateur stargazers continue to sweep the skies for unidentified heavenly bodies. Today, with the advent of smartphone technology, even more fields of scientific inquiry are open to the curious amateur.

Eagle Eyes


One of the oldest continuing citizen science projects is the National Audubon Society’s annual Christmas Bird Count (CBC). The CBC was founded in 1900 by Frank Chapman, an ornithologist at the American Museum of Natural History. Conceived as an alternative to traditional hunts, the first CBC included 27 participants at 25 count sites across North America. It has since grown to 76,987 participants counting 59,242,067 birds at 2,585 sites during the 118th count, held across the United States, Canada, Latin America, the Caribbean and Pacific Islands.

Documentation and verification of CBC counts has been revolutionized by mobile technologies and digital photography.

“If somebody said they saw a scarlet tanager or an eastern kingbird, which are common in the summer, but which conventional ornithological wisdom says are always in South America during the CBC, those sightings used to be rejected,” explained Geoffrey LeBaron, the longtime Audubon Society Christmas Bird Count Director.

Observing the Past, Predicting the Future

“Everything today is 100 percent electronic and no longer published in print. All results are posted online as soon as a compiler checks off that their count is completed. The data then becomes viewable to the public. Once a region is completed, we have a team of expert reviewers that go over every single count. If they feel there’s something that needs documentation, they’ll be in touch with the compiler, who will get in touch with the observer,” LeBaron continued.

Scientists use the collected CBC data to observe long-term trends and to predict the future effects of climate change on species at risk.

“When people are analyzing CBC data, they’re not usually looking at year to year variations, because there is too much variability caused by weather and other factors,” explained Mr. LeBaron. “We looked at the center of abundance of the most common and widespread species and how they varied from the 1960s to the present. We found that a lot of species have moved the center of abundance of their range as much as 200 miles northward and inward away from the coasts.”
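LeBaron’s “center of abundance” is, in essence, an abundance-weighted centroid of the locations where a species is counted. A purely illustrative calculation, with invented count-site coordinates and bird counts, might look like this:

    # Hypothetical illustration of a "center of abundance" calculation:
    # the abundance-weighted centroid of count-site coordinates for one species.
    # Coordinates and counts below are invented for the example.

    def center_of_abundance(sites):
        """sites: list of (latitude, longitude, birds_counted) tuples."""
        total = sum(count for _, _, count in sites)
        lat = sum(lat * count for lat, _, count in sites) / total
        lon = sum(lon * count for _, lon, count in sites) / total
        return lat, lon

    counts_1960s = [(38.9, -77.0, 120), (40.7, -74.0, 80), (42.4, -71.1, 40)]
    counts_2010s = [(38.9, -77.0, 40), (40.7, -74.0, 90), (42.4, -71.1, 150)]

    old_lat, _ = center_of_abundance(counts_1960s)
    new_lat, _ = center_of_abundance(counts_2010s)
    # Roughly 69 miles per degree of latitude; a positive shift is a northward move.
    print(f"Northward shift: {(new_lat - old_lat) * 69:.0f} miles")

Real analyses draw on thousands of count circles and correct for observer effort, but the underlying centroid idea is the same.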

Keeping Citizens in Science

Citizen science requires enthusiastic participation of the public, but how can researchers keep the public engaged? This question was recently considered in a paper from Maurizio Porfiri, PhD, of the Dynamical Systems Laboratory at New York University. The paper is titled “Bring them aboard: Rewarding participation in technology-mediated citizen science projects.”

The team hypothesized that monetary rewards and online or social media acknowledgments would increase engagement of participants.

“People contribute to citizen science projects for a variety of different reasons,” said Jeffrey Laut, PhD, a postdoctoral researcher in Dr. Porfiri’s lab. “If you just want to contribute to help out a project, and then you’re suddenly being paid for it, that might undermine the initial motivation.”

“For example, one of the things we point out in the paper is that people donate blood for the sake of helping out another human,” explained Dr. Laut. “Another study found that if you start paying people to donate blood, it might decrease the motivation to donate blood.”

Proper Rewards for Participation

If a citizen science project is suffering from low levels of participation, researchers need to carefully choose the level of reward.

“I think with citizen science projects the intrinsic motivation is to contribute to a science project and wanting to further scientific knowledge,” said Dr. Laut. “If you’re designing a citizen science project, it would be helpful to consider incentives to enhance participation and also be careful on the choice of level of reward for participants.”

The technology used and the scope of information collected may have changed, but the role of the citizen scientist remains as important as ever.

“It is important that citizens understand the world in which they live and are capable of making informed decisions,” said Ana Prieto, a GLOBE program volunteer. “It’s also important that all people understand science, especially to combat disinformation. From this point of view citizen science is vital and a needed contributor to the greater field of science.”



Citizen Science in the Digital Age: Learning Across the Globe


The GLOBE program aims to understand how the Earth’s spheres interact as a single system.

Published May 1, 2019

By Robert Birchard

The term “citizen science” first entered the Oxford English Dictionary in 2014. It describes a long-standing tradition of collaboration between professional and amateur scientists. Perhaps no field is as closely associated with citizen science as astronomy. Here amateur stargazers continue to sweep the skies for unidentified heavenly bodies. Today, with the advent of smartphone technology, even more fields of scientific inquiry are open to the curious amateur.

Learning Across the Globe

The Global Learning and Observations to Benefit the Environment (GLOBE) program is an environmental science and education program active in over 120 countries. It seeks to understand how the Earth’s atmosphere, biosphere, hydrosphere and pedosphere interact as a single system.

“The data we collect varies depending on our research,” explained Ana Prieto, a former high school science teacher and GLOBE program volunteer in Argentina. “We’re currently taking land cover measurements in the field, and in the summer we will start taking hydrology measurements. This provides students with first-hand scientific knowledge.”

Collected data is uploaded to the GLOBE database using their customized app.

“The GLOBE protocols (instructions on how to take measurements) are updated and respond to a range of opportunities for measurement and research,” said Ms. Prieto. “It teaches students to use measuring devices, perform physical-chemical analysis, make estimations, pose questions, make hypotheses and design investigations. In short, STEM is applied to real-world problems.” For non-GLOBE members, the GLOBE Observer app allows any citizen science enthusiast to collect and send data from GLOBE countries.

Data with Various Applications

The data is used for a variety of purposes.

“We collaborate with NASA Scientists and Science Missions,” explained Tony Murphy, PhD, GLOBE Implementation Office Director. “One example is the August 2017 North American eclipse. NASA scientists are looking at the temperature data collected. They are examining the impact of the eclipse on air temperature and solar radiation. Another use is data gathered on mosquito larvae detection and identification, which is then used to help local communities combat the spread of mosquito-borne diseases by identifying and eliminating sources of standing water, such as containers and spare tires, in which mosquitoes breed.”

The data collected by GLOBE is verified in their system of checks and balances. “We’re looking primarily for outliers,” explained Dr. Murphy. “There’s a range of acceptability for the data in different protocols. Also, we have had scientists look at particular data sets and they found that the data is, for the most part, accurate.” He concluded, “It’s important to get people involved, get them outside, using technology in a positive way for an educational purpose.”
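The article does not detail GLOBE’s quality-control code; purely as a hypothetical sketch of the kind of protocol-specific acceptability check Dr. Murphy describes (the protocols and ranges below are invented, not GLOBE’s actual values):

    # Hypothetical range-based outlier check; protocol names and acceptable
    # ranges are invented examples for illustration only.
    ACCEPTABLE_RANGES = {
        "air_temperature_c": (-90.0, 60.0),
        "water_ph": (0.0, 14.0),
        "mosquito_larvae_count": (0, 10_000),
    }

    def flag_outliers(measurements):
        """Return measurements that fall outside their protocol's acceptable range."""
        outliers = []
        for protocol, value in measurements:
            low, high = ACCEPTABLE_RANGES[protocol]
            if not (low <= value <= high):
                outliers.append((protocol, value))
        return outliers

    # Example volunteer submission: the 72 C air temperature would be flagged
    # for review rather than accepted automatically.
    submission = [("air_temperature_c", 72.0), ("water_ph", 7.2)]
    print(flag_outliers(submission))

Consistent with Dr. Murphy’s description, values flagged this way would be routed to a human reviewer rather than rejected outright.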

Keeping Citizens in Science

Citizen science requires enthusiastic participation of the public, but how can researchers keep the public engaged? This question was recently considered in a paper from Maurizio Porfiri, PhD, of the Dynamical Systems Laboratory at New York University, titled “Bring them aboard: Rewarding participation in technology-mediated citizen science projects.”

The team hypothesized that monetary rewards and online or social media acknowledgments would increase engagement of participants.

“People contribute to citizen science projects for a variety of different reasons,” said Jeffrey Laut, PhD, a postdoctoral researcher in Dr. Porfiri’s lab. “If you just want to contribute to help out a project, and then you’re suddenly being paid for it, that might undermine the initial motivation.”

“For example, one of the things we point out in the paper is that people donate blood for the sake of helping out another human,” explained Dr. Laut. “Another study found that if you start paying people to donate blood, it might decrease the motivation to donate blood.”

Proper Rewards for Participation

If a citizen science project is suffering from low levels of participation, researchers need to carefully choose the level of reward.

“I think with citizen science projects the intrinsic motivation is to contribute to a science project and wanting to further scientific knowledge,” said Dr. Laut. “If you’re designing a citizen science project, it would be helpful to consider incentives to enhance participation and also be careful on the choice of level of reward for participants.”

The technology used and the scope of information collected may have changed, but the role of the citizen scientist remains as important as ever.

“It is important that citizens understand the world in which they live and are capable of making informed decisions,” said Ms. Prieto. “It’s also important that all people understand science, especially to combat disinformation. From this point of view citizen science is vital and a needed contributor to the greater field of science.”



Big Data: Balancing Privacy and Innovation

Presented by:

Science & the City

Often cited as the “4th Industrial Revolution,” big data has the potential to transform health and healthcare by drawing medical conclusions from new and exciting sources such as electronic health records, genomic databases, and even credit card activity. In this podcast you will hear from tech, healthcare, and regulatory experts on potential paths forward that balance privacy and consumer protections while fostering innovations that could benefit everyone in our society.

This podcast was produced following a conference on this topic held in partnership between the NYU School of Medicine and The New York Academy of Sciences. It was made possible with support from Johnson & Johnson.