Scientists are searching for ways to develop long-term sustainable food production systems, while preserving fragile ecosystems.
Published May 1, 2020
By Alan Dove, PhD
Image courtesy of The New York Academy of Sciences Magazine, Spring 2020
What should we eat? This fundamental question has bedeviled humanity throughout our history, spurring a series of urgent, society-changing innovations. For millennia, food production changed little, from hunter-gatherer tribes living off the limited bounty of wild plants and animals, to the subsistence and open-field farming of village serfs. But these approaches could not produce enough food to feed a growing population.
The enclosures of the common lands in 16th-century England caused massive civil unrest, but they made the land more productive. The industrial revolution introduced such innovations as the combine harvester, as well as storage and shipping technologies that allowed the cultivation of a greater variety of food for an even larger population. Today “agribots” and other AI-based technologies are helping farmers keep pace with the demand for food as the global population balloons into the billions.
Unfortunately, these innovations have come at a steep environmental cost. Modern farming guzzles fossil fuels and scarce water reserves, emits hazardous chemicals, and overturns entire ecosystems. The planet can’t sustain this pace much longer, so we’re faced with the prospect of balancing the interests of commercial farmers against those of people who believe human health is only as good as the environment in which we live. With environmental laws now being weakened, farmers and researchers will need to work together more closely to develop long-term sustainable food production systems while preserving fragile ecosystems.
“Oh, The Farmer And The [Researcher] Should Be Friends …”
A central problem in restructuring global agriculture is the sheer scale and diversity of the industry. Ideas that work well in an Iowa cornfield are irrelevant to an Indonesian rice paddy, which in turn bears little relation to cattle grazing on Argentine rangeland.
That’s why the United Nations’ Sustainable Development Solutions Network launched the Food, Agriculture, Biodiversity, Land-Use, and Energy (FABLE) Consortium in 2017. “The idea was that we need to build the capacity in many countries … to do some long-term analysis of the food and land systems in order to design policies,” says Aline Mosnier, scientific director of FABLE in Paris, France. Based mostly at research institutes, each of the 22 current FABLE country teams focuses on analyzing and modeling a specific country’s agricultural systems.
FABLE released its first report in 2019, a comprehensive overview that identified “pathways to sustainability” for different countries and types of agriculture. Consortium teams focused on strategies to increase food security while reducing greenhouse gas emissions and deforestation, tailoring them to local conditions. “It’s very important to have this being driven at the country level,” says Mosnier, adding that “studies before this were just at the global level.”
The FABLE models suggest that with appropriate policies and careful implementation, farming doesn’t have to come at the expense of the environment. “It seems feasible that we could reach many of our targets toward greater sustainability … it’s feasible by 2050 if we have some proactive measures implemented by the different countries,” says Mosnier.
However, sustainability requires the political will to follow the science. Brazil, for example, achieved significant, rapid reductions in deforestation in recent years. But a far-right government elected in 2018 appointed pro-industry administrators to top posts and set aside many of the scientists’ earlier recommendations. Deforestation rates in the country have since skyrocketed.
This Land Is My Land, This Land Is Your Land
Problematic land use changes aren’t limited to the clearing of rainforests. Indeed, all forms of agriculture entail some degree of ecosystem engineering. “There’s such a variety of approaches, and they vary in how much they sort of coerce the ecological system,” explains Craig Allen, Director of the Center for Resilience in Working Agricultural Landscapes at the University of Nebraska in Lincoln.
Clear-cutting forests to make room for farms can extirpate many native species, but even converting grasslands to similar-looking fields of wheat or corn can disrupt an ecosystem. And while many environmentalists argue — correctly — that modern animal farming can be quite destructive, not all meat is the same. “Here in Nebraska we have extensive range lands, and those are pretty much native prairie, managed quite well and fulfilling habitat requirements for a wide range of species,” says Allen.
The fundamental problem is that any ecosystem can only support a finite number of organisms, so growing plants or animals for human consumption will always carry some environmental cost. “Our landscapes produce a wide range of ecosystem services, and one of those services is food production,” says Allen, adding that on land dominated by modern monoculture farming, biodiversity inevitably suffers.
Experts project that by the year 2050, the growing global population will need as much as 70 percent more food than the world’s farmers currently produce. One solution is to farm existing agricultural land more intensely. New irrigation, fertilization, and crop breeding strategies are already boosting yields in many areas. “I’m relatively optimistic about our ability to increase productivity, but of course the planet’s becoming smaller and smaller,” says Allen. He adds that pollution from some of the fundamental inputs of intensive agriculture, such as nitrogen fertilizers and fossil fuels, may soon strain the planet’s ecological limits.
In the meantime, farmers in many areas are already running up against another critical limit: water scarcity. While drought has been a hazard to agriculture throughout history, growing demand and climate change exacerbate the problem. “Water supplies are becoming increasingly erratic, and that’s a function of the rainfall and the snowfall and changes in dry times, but more importantly, it’s becoming less predictable from year to year,” says Todd Jarvis, director of the Institute for Water and Watersheds at Oregon State University in Corvallis.
That volatility is a problem for scientists as well as farmers. Researchers such as Jarvis have long built their prediction models based on stationarity, the idea that past trends will continue into the future. With droughts and floods becoming more erratic, that approach doesn’t work anymore.
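The failure of stationarity can be sketched numerically: a forecast built on the historical mean performs well while the series stays stationary, but degrades once the mean starts to drift. The rainfall numbers below are synthetic, purely for illustration — not data from Jarvis’s models.

```python
import random
import statistics

random.seed(42)

# Stationary era: annual rainfall fluctuates around a stable mean of 100.
past = [100 + random.gauss(0, 10) for _ in range(50)]

# A stationarity-based forecast: the future looks like the historical average.
forecast = statistics.mean(past)

# Non-stationary future: the mean drifts downward and the swings widen.
future = [100 - 0.8 * t + random.gauss(0, 25) for t in range(50)]

stationary_error = statistics.mean(abs(x - forecast) for x in past)
shifted_error = statistics.mean(abs(x - forecast) for x in future)

print(f"mean error on stationary data:     {stationary_error:.1f}")
print(f"mean error on non-stationary data: {shifted_error:.1f}")
```

The same historical average that tracked the stationary era closely misses badly once the trend and variance change — which is why models calibrated on past climate records are losing their predictive power.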
Worse, the changing climate will have radically different effects in different parts of the world. “Everybody makes reference to climate change as global drying, and that’s not the case. In just the Midwest of the United States alone, the change in climate is resulting in more water in some places,” says Jarvis.
Adapting to these changes will require different approaches, depending on the types of water problems each region is facing. Infrastructure such as dams and reservoirs built to handle floods may have to be expanded or modified, while irrigation systems designed for current droughts may prove inadequate in the future. As with changes in land use, infrastructure shifts may also have unintended consequences.
“A lot of the infrastructure that’s been constructed over the past 50 to 100 years was in response to building the agricultural industry and settling lands, and today we’re having a completely different bundle of challenges,” says Jarvis. As an example, he cites the flood-control dams of the Pacific Northwest, which are now considered threats to the region’s salmon fishery.
The Answer May Not Necessarily Lie in the Soil
The difficult tradeoffs involved in farming have led some scientists to explore a different approach to food production: fermentation. Microbes growing in industrial-scale fermenters can produce vast quantities of proteins, carbohydrates, and fats in a matter of days. The idea of turning this nutritional bonanza into food isn’t new. Marmite, a popular sandwich spread in Britain made from leftover brewer’s yeast, was developed in the 19th century.
More recent efforts to brew staple foods have focused on other microbes, especially soil bacteria that can grow on simple inputs. Solar Foods in Helsinki, Finland, is at the leading edge of this field. “We want to disconnect food production from agriculture,” says Pasi Vainikka, the company’s CEO. The core of Solar Foods’ system is a fermentation process that runs on hydrogen, carbon dioxide, ammonia, and a few minor nutrients such as calcium and phosphorus. The hydrogen comes from splitting water molecules. For energy, the company relies on solar-generated electricity.
“From a physicist’s point of view, we’re just converting electricity to edible calories,” says Vainikka. Based on current solar electricity production capabilities, Vainikka has calculated that a kilogram of Solein, the company’s food product, uses one-tenth the land area required for a kilogram of soy protein, and one-hundredth the land area needed for a kilogram of beef. The production process also uses orders of magnitude less water than conventional agriculture.
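The quoted ratios are enough for a back-of-envelope comparison of land footprints. The absolute figure for beef below is a hypothetical placeholder chosen for illustration, not a number from the article; only the 1 : 10 : 100 ratios come from Vainikka’s calculation.

```python
# Relative land area per kilogram of product, as quoted in the article:
# Solein needs 1/10 the land of soy protein and 1/100 the land of beef.
land_ratio = {"beef": 100.0, "soy protein": 10.0, "Solein": 1.0}

# Assumed land footprint for beef (hypothetical value, for illustration only).
beef_m2_per_kg = 300.0

for food, ratio in land_ratio.items():
    implied = beef_m2_per_kg * ratio / land_ratio["beef"]
    print(f"{food}: {implied:.1f} m2 per kg")
```

Whatever the true beef figure turns out to be, the ratios mean a hectare devoted to Solein production would yield roughly a hundred times the protein mass of the same hectare devoted to beef.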
Solar Foods’ pilot plant in Helsinki now produces about a kilogram of Solein per day. The company is using that material to carry out the testing required by regulatory agencies in the U.S. and E.U., while refining the fermentation process. Vainikka hopes to scale up to a full-size factory by 2025, to supply Solein as a protein-rich additive for various food products worldwide.
If that happens, humanity may soon be pulling food out of thin air.
The New York Academy of Sciences and the Blavatnik Family Foundation hosted the annual Blavatnik Science Symposium on July 15–16, 2019, uniting 75 Finalists, Laureates, and Winners of the Blavatnik Awards for Young Scientists. Honorees from the UK and Israel Awards programs joined Blavatnik National and Regional Awards honorees from the U.S. for what one speaker described as “two days of the impossible.” Nearly 30 presenters delivered research updates over the course of nine themed sessions, offering a fast-paced peek into the latest developments in materials science, quantum optics, sustainable technologies, neuroscience, chemical biology, and biomedicine.
Symposium Highlights
Computer vision and machine learning have enabled novel analyses of satellite and drone images of wildlife, food crops, and the Earth itself.
Next-generation atomic clocks can be used to study interactions between particles in complex many-body systems.
Bacterial communities colonizing the intestinal tract produce bioactive molecules that interact with the human genome and may influence disease susceptibility.
New catalysts can reduce carbon emissions associated with industrial chemical production.
Retinal neurons display a surprising degree of plasticity, changing their coding in response to repetitive stimuli.
New approaches for applying machine learning to complex datasets are improving predictive algorithms in fields ranging from consumer marketing to healthcare.
Breakthroughs in materials science have resulted in materials with remarkable strength and responsiveness.
Single-cell genomic studies are revealing some of the mechanisms that drive cancer development, metastasis, and resistance to treatment.
Speakers
Emily Balskus, PhD Harvard University
Chiara Daraio, PhD Caltech
William Dichtel, PhD Northwestern University
Elza Erkip, PhD New York University
Lucia Gualtieri, PhD Stanford University
Ive Hermans, PhD University of Wisconsin – Madison
Liangbing Hu, PhD University of Maryland, College Park
Jure Leskovec, PhD Stanford University
Heather J. Lynch, PhD Stony Brook University
Wei Min, PhD Columbia University
Seth Murray, PhD Texas A & M University
Nicholas Navin, PhD, MD MD Anderson Cancer Center
Ana Maria Rey, PhD University of Colorado Boulder
Michal Rivlin, PhD Weizmann Institute of Science
Nieng Yan, PhD Princeton University
Technology for Sustainability
Speakers
Heather J. Lynch Stony Brook University
Lucia Gualtieri Stanford University
Seth Murray Texas A & M University
Highlights
Machine learning algorithms trained to analyze satellite imagery have led to the discovery of previously unknown colonies of Antarctic penguins.
Seismographic data can be used to analyze more than just earthquakes—typhoons, hurricanes, iceberg-calving events and landslides are reflected in the seismic record.
Unmanned aerial systems are a valuable tool for phenotypic analysis in plant breeding, allowing researchers to take frequent measurements of key metrics during the growing season and identify spectral signatures of crop yield.
Satellites, Drones, and New Insights into Penguin Biogeography
Satellite images have been used for decades to document geological changes and environmental disasters, but ecologist Heather Lynch, 2019 Blavatnik National Awards Laureate in Life Sciences, is one of the few to probe the database in search of penguin guano. She opened the symposium with the story of how the Landsat satellite program enabled a surprise discovery of several of Earth’s largest colonies of Adélie penguins, a finding that has ushered in a new era of insight into these iconic Antarctic animals.
Steady streams of high-quality spatial and temporal data regularly support environmental science. In contrast, Lynch noted that wildlife biology has advanced so slowly that many field techniques “would be familiar to Darwin.” Collecting information on animal populations, including changes in population size or migration patterns, relies on arduous and imprecise counting methods. The quest for alternative ways to track wildlife populations—in this case, Antarctic penguin colonies—led Lynch to develop a machine learning algorithm for automated identification of penguin guano in high-resolution commercial satellite imagery, which can be combined with lower-resolution imagery like that coming from NASA’s Landsat program. Pairing measurements of vast, visible tracts of penguin guano—the excrement colored bright pink due to the birds’ diet—with information about penguin colony density yields remarkably precise population estimates. The technique has been used to survey populations in known penguin colonies and enabled the unexpected discovery of a “major biological hotspot” in the Danger Islands, on the tip of the Antarctic Peninsula. This Antarctic archipelago is so small that it doesn’t appear on most maps of the Antarctic continent, yet it hosts one of the world’s largest Adélie penguin hotspots.
Satellite images of the pink stains of Antarctic penguin guano have been used to identify and track penguin populations.
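The core of the population estimate is a simple scaling step: once the algorithm has measured the guano footprint from orbit, multiplying by a nesting density converts area into birds. The function and every number below are hypothetical, for illustration only; Lynch’s actual pipeline works from classified satellite pixels and field-calibrated densities.

```python
def estimate_population(guano_area_m2: float,
                        nests_per_m2: float,
                        birds_per_nest: float = 2.0) -> int:
    """Scale a satellite-measured guano footprint by an assumed nesting
    density to get a rough colony head count (illustrative only)."""
    return round(guano_area_m2 * nests_per_m2 * birds_per_nest)

# Hypothetical colony: 200,000 m2 of visible guano, 0.5 nests per m2,
# two breeding adults per nest.
print(estimate_population(guano_area_m2=200_000, nests_per_m2=0.5))
```

The hard part of the real method is upstream of this arithmetic: reliably separating pink guano pixels from rock and ice across thousands of scenes, which is where the machine learning comes in.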
Lynch and her colleagues are developing new algorithms that utilize high-resolution drone and satellite imagery to create centimeter-scale, 3D models of penguin terrain. These models feed into detailed habitat suitability and population-tracking analyses that further basic research and can even influence environmental policy decisions. Lynch noted that the discovery of the Danger Island colony led to the institution of crucial environmental protections for this region that may have otherwise been overlooked. “Better technology actually can lead to better conservation,” she said.
Listening to the Environment with Seismic Waves
The study of earthquakes has dominated seismology for decades, but new analyses of seismic wave activity are broadening the field. “The Earth is never at rest,” said Lucia Gualtieri, 2018 Blavatnik Regional Awards Finalist, while reviewing a series of non-earthquake seismograms that show constant, low-level vibrations within the Earth. Long discarded as “seismic noise,” these data, which comprise more than 90% of seismograms, are now considered a powerful tool for uniting seismology, atmospheric science, and oceanography to produce a holistic picture of the interactions between the solid Earth and other systems.
In addition to earthquakes, events such as hurricanes, typhoons, and landslides are reflected in the seismic record.
Nearly every environmental process generates seismic waves. Hurricanes, typhoons, and landslides have distinct vibrational patterns, as do changes in river flow during monsoons and “glacial earthquakes” caused by ice calving events. Gualtieri illustrated how events on the surface of the Earth are reflected within the seismic record—even at remarkably long distances—including a massive landslide in Alaska detected by a seismic sensor in Massachusetts. Gualtieri and her collaborators are tapping this exquisite sensitivity to create a new generation of tools capable of measuring the precise path and strength of hurricanes and tropical cyclones, and for making predictive models of cyclone strength and behavior based on decades of seismic data.
Improving Crop Yield Using Unmanned Aerial Systems and Field Phenomics
Plant breeders like Seth Murray, 2019 Blavatnik National Awards Finalist, are uniquely attuned to the demands a soaring global population places on the planet’s food supply. Staple crop yields have skyrocketed thanks to a century of advances in breeding and improved management practices, but the pressure is on to create new strategies for boosting yield while reducing agricultural inputs. “We need to grow more plants, measure them better, use more genetic diversity, and create more seasons per year,” Murray said. It’s a tall order, but one that he and a transdisciplinary group of collaborators are tackling with the help of a fleet of unmanned aerial systems (UAS), or drones.
Drones facilitate frequent measurement of plant height, revealing variations between varietals early in the growth process.
Genomics has transformed many aspects of plant breeding, but phenotypic, rather than genotypic, information is more useful for predicting crop yield. Using drones equipped with specialized equipment, Murray has not only automated many of the time-consuming measurements critical for plant phenotyping, such as tracking height, but has also identified novel metrics that can accelerate the development of new varietals. Spectral signatures obtained via drone can be used to identify top-yielding varietals of maize even before the plants are fully mature. Phenotypic features distilled from drone images are also being used to determine attributes such as disease resistance, which directly influence crop management. Murray’s team is modeling the influence of thousands of phenotypes on overall crop performance, paving the way for true phenomic selection in plant breeding.
Quantum mechanics underlies the technologies of modern computing, including transistors and integrated circuits.
Most quantum insights are derived from studies of single quantum particles, but understanding interactions between many particles is necessary for the development of devices such as quantum computers.
Atoms cooled to one billionth of a degree above absolute zero obey the laws of quantum mechanics, and can be used as quantum simulators to study many-particle interactions.
Atomic Clocks: From Timekeepers to Quantum Computers
The discovery of quantum mechanics opened “a new chapter in human knowledge,” said 2019 Blavatnik National Awards Laureate in Physical Sciences & Engineering, Ana Maria Rey, describing how the study of quantum phenomena has revolutionized modern computing, telecommunications, and navigation systems. Transistors, which make up integrated circuits, and lasers, which are the foundation of the atomic clocks that maintain the precision of satellites used in global positioning systems, all stem from discoveries about the nature of quantum particles.
The next generation of innovations—such as room temperature superconductors and quantum computers—will be based on new quantum insights, and all of this hinges on our ability to study interactions between many particles in quantum systems. The complexity of this task is beyond the scope of even the most powerful supercomputers. As Rey explained, calculating the possible states for a small number of quantum particles (six, for example) is simple. “But if you increase that by a factor of just 10, you end up with a number of states larger than the number of stars in the known universe,” she said.
Calculating the number of possible states for even a small number of quantum particles is a task too complex for even the most powerful supercomputer.
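Rey’s scaling argument follows from the fact that the joint state space of N two-level particles has dimension 2^N. The sketch below shows how quickly that outruns any classical machine; the 16 bytes per complex amplitude is an assumption about storage, not a figure from the talk.

```python
def hilbert_dim(n_particles: int, levels: int = 2) -> int:
    """Dimension of the joint state space of n identical d-level particles."""
    return levels ** n_particles

BYTES_PER_AMPLITUDE = 16  # one double-precision complex number (assumption)

for n in (6, 60):
    dim = hilbert_dim(n)
    mem_exabytes = dim * BYTES_PER_AMPLITUDE / 1e18
    print(f"{n:>2} particles: {dim:.3e} basis states, "
          f"~{mem_exabytes:.2f} exabytes to store one state vector")
```

Six particles need only 64 amplitudes; sixty need about 10^18, so merely writing down a single quantum state — let alone simulating its dynamics — exceeds classical memory, which is why quantum simulators like Rey’s clock are needed.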
Researchers have developed several experimental platforms to clear this hurdle and explore the quantum world. Rey shared the story of how her work developing ultra-precise atomic clocks inadvertently led to one experimental platform that is already demystifying some aspects of quantum systems.
Atomic clocks keep time by measuring oscillations of atoms—typically cesium atoms—as they change energy levels. Recently, Rey and her collaborators at JILA built the world’s most sensitive atomic clock using strontium atoms instead of cesium and using many more atoms than are typically found in these clocks. The instrument had the potential to be 1,000 times more sensitive than its predecessors, yet collisions between the atoms compromised its precision. Rey explained that by suppressing these collisions, their clock became “a window to explore the quantum world.” Within this framework, the atoms can be manipulated to simulate the movement and interactions of quantum particles in solid-state materials. Rey reported that this clock-turned-quantum simulator has already generated new findings about phenomena including superconductivity and quantum magnetism.
The human gut is colonized by trillions of bacteria that are critical for host health, yet may also be implicated in the development of diseases including colorectal cancer.
For over a decade, chemists have sought to resolve the structure of a genotoxin called colibactin, which is produced by a strain of E. coli commonly found in the gut microbiome of colorectal cancer patients.
By studying the specific type of DNA damage caused by colibactin, researchers found a trail of clues that led to a promising candidate structure of the colibactin molecule.
Gut Reactions: Understanding the Chemistry of the Human Gut Microbiome
The composition of the trillions-strong microbial communities that colonize the mammalian intestinal tract is well characterized, but a deeper understanding of their chemistry remains elusive. Emily Balskus, the 2019 Blavatnik National Awards Laureate in Chemistry, described her lab’s hunt for clues to solve one chemical mystery of the gut microbiome—a mission that could have implications for colorectal cancer (CRC) screening and early detection.
Some commensal E. coli strains in the human gut produce a genotoxin called colibactin. When cultured with human cells, these strains cause cell cycle arrest and DNA damage, and studies have shown increased populations of colibactin-producing E. coli in CRC patients. Previous studies have localized production of colibactin within the E. coli genome and hypothesized that the toxin is synthesized through an enzymatic assembly line. Yet every attempt to isolate colibactin and determine its chemical structure had failed.
Balskus’ group took “a very different approach” in its efforts to discover colibactin’s structure. By studying the enzymes that make the toxin, the team uncovered a critical clue: a cyclopropane ring in the structure of a series of molecules they believed could be colibactin precursors. This functional group, when present in other molecules, is known to damage DNA, and its detection in the molecular products of the colibactin assembly line led the researchers to consider it as a potential mechanism of colibactin’s genotoxicity.
In collaboration with researchers at the University of Minnesota School of Public Health, Balskus’ team cultured human cells with colibactin-producing E. coli strains as well as strains that cannot produce the toxin. They identified and characterized the products of colibactin-mediated DNA damage. “Starting from the chemical structure of these DNA adducts, we can work backwards and think about potential routes for their production,” Balskus explained.
A proposed structure for the genotoxin colibactin, which is associated with colorectal cancer, features two cyclopropane rings capable of interacting with DNA to generate interstrand cross links, a type of DNA damage.
Further studies revealed that colibactin triggers a specific type of DNA damage that requires two reactive groups—likely represented by two cyclopropane rings in the final toxin structure—a pivotal discovery in deriving what Balskus believes is a strong candidate for the true colibactin structure. Balskus emphasized that this work could illuminate the role of colibactin in carcinogenesis, and may lead to cancer screening methods that rely on detecting DNA damage before cells become malignant. The findings also have implications for understanding microbiome-host interactions. “These studies reveal that human gut microbiota can interact with our genomes, compromising their integrity,” she said.
The chemical industry is a major producer of carbon dioxide, and efforts to create more efficient and sustainable chemical processes are often stymied by cost or scale.
Boron nitride is not well known as a catalyst, yet experiments show it is highly efficient at converting propane to propylene—one of the most widely used chemical building blocks in the world.
Two-dimensional polymers called covalent organic frameworks (COFs) can be used for water filtration, energy storage, and chemical sensing.
Until recently, researchers have struggled to control and direct COF formation, but new approaches to COF synthesis are advancing the field.
Boron Nitride: A Surprising Catalyst
Industrial chemicals “define our standard of living,” said Ive Hermans, 2019 Blavatnik National Awards Finalist, before explaining that nearly 96% of the products used in daily life arise from processes requiring bulk chemical production. These building block molecules are produced at an astonishingly large scale, using energy-intensive methods that also produce waste products, including carbon dioxide.
Despite pressure to reduce carbon emissions, the pace of innovation in chemical production is slow. The industry is capital-intensive — a chemical production plant can cost more than $2 billion—and it can take a decade or more to develop new methods of synthesizing chemicals. Concepts that show promise in the lab often fail at scale or are too costly to make the transition from lab to plant. “The goal is to come up with technologies that are both easily implemented and scalable,” Hermans said.
Catalysts are a key area of interest for improving chemical production processes. These substances bind to reactants and can boost the speed and efficiency of chemical reactions without being consumed themselves. Hermans’ research focuses on catalyst design, and one of his recent discoveries, made “just by luck,” stands to transform production of one of the most in-demand chemicals worldwide—propylene.
Historically, propylene was one product (along with ethylene and several others) produced by “cracking” carbon–carbon bonds in naphtha, a crude oil component that has since been replaced by ethane (from natural gas) as a preferred starting material. However, ethane yields far less propylene, leaving manufacturers and researchers to seek alternative methods of producing the chemical.
Boron nitride catalyzes a highly efficient conversion of propane to propylene.
Enter boron nitride, a two-dimensional material whose catalytic properties took Hermans by surprise when a student in his lab discovered its efficiency at converting propane, also a component of natural gas, to propylene. Existing methods for running this reaction are endothermic and produce significant CO2. Boron nitride catalysts facilitate an exothermic reaction that can be conducted at far cooler temperatures, with little CO2 production. Better still, the only significant byproduct is ethylene, an in-demand commodity.
Hermans sees this success as a step toward a more sustainable future, where chemical production moves “away from a linear economy approach, where we make things and produce CO2 as a byproduct, and more toward a circular economy where we use different starting materials and convert CO2 back into chemical building blocks.”
Polymerization in Two Dimensions
William Dichtel, a Blavatnik National Awards Finalist in 2017 and 2019, offered an update from one of the most exciting frontiers in polymer chemistry—two-dimensional polymerization. The synthetic polymers that dominate modern life are composed of linear, repeating chains of linked building blocks that imbue materials with specific properties. Designing non-linear polymer architectures requires the ability to precisely control the placement of components, a feat that has challenged chemists for a decade.
Dichtel described the potential of a class of polymers called covalent organic frameworks, or COFs—networks of polymers that form when monomers are polymerized into well-defined, two-dimensional structures. COFs can be created in a variety of topologies, dictated by the shape of the monomers that comprise them, and typically feature pores that can be customized to perform a range of functions. These materials hold promise for applications including water purification membranes, energy and gas storage, organic electronics, and chemical sensing.
Dichtel explained that COF development is a trial and error process that often fails, as the mechanisms of their formation are not well understood. “We have very limited ability to improve these materials rationally—we need to be able to control their form so we can integrate them into a wide variety of contexts,” he said.
Two-dimensional polymer networks can be utilized for water purification, energy storage, and many other applications, but chemists have long struggled to understand their formation and control their structure.
A breakthrough in COF synthesis came when chemist Brian Smith, a former postdoc in Dichtel’s lab, discovered that certain solvents allowed COFs to disperse as nanoparticles in solution rather than precipitating as powder. These particles became the basis for a new method of growing large, controlled crystalline COFs using nanoparticles as structural “seeds,” then slowly adding monomers to maximize growth while limiting nucleation. “This level of control parallels living polymerization, with well-defined initiation and growth phases,” Dichtel said.
More recently, Dichtel’s group has made significant advances in COF fabrication, successfully casting them into thin films that could be used in membrane and filtration applications.
The 80 subtypes of retinal ganglion cells each encode different aspects of vision, such as direction and motion.
The “preferences” of these cells were believed to be hard-wired, yet experiments show that retinal ganglion cells can be reprogrammed by exposure to repetitive stimuli.
Sodium ion channels control electrical signaling in cells of the heart, muscles, and brain, and have long been drug targets due to their connection to pain signaling.
Cryo-electron microscopy has allowed researchers to visualize NaV1.7, a sodium ion channel implicated in pain syndromes, and to identify molecules that interfere with its function.
Retinal Computations: Recalculating
The presentation from Michal Rivlin, the Life Sciences Laureate of the 2019 Blavatnik Awards in Israel, began with an optical illusion, a dizzying exercise during which a repetitive, unidirectional pattern of motion appeared to rapidly reverse direction. “You probably still perceive motion, but the image is actually stable now,” Rivlin said, completing a powerful demonstration of the action of direction-sensitive retinal ganglion cells (RGCs), whose mechanisms she has studied for more than a decade. The approximately 80 subtypes of RGCs each encode a different aspect, or modality, of vision—motion, color, and edges, as well as perception of visual phenomena such as direction. These modalities are hard-wired into the cells and were thought to be immutable—a retinal ganglion cell that perceived left-to-right motion was thought incapable of responding to visual signals that move right-to-left. Rivlin’s research has challenged not only this notion, but also many other beliefs about the function and capabilities of the retina.
Rather than simply capturing discrete aspects of visual information like a camera and relaying that information to the visual thalamus for processing, the cells of the retina actually perform complex processing functions and display a surprising level of plasticity. Rivlin’s lab is probing both the anatomy and functionality of various types of retinal ganglion cells, including those that demonstrate selectivity, such as a preference for movement in one direction or attunement to increases or decreases in illumination. By exposing these cells to various repetitive stimuli, Rivlin has shown that the selectivity of RGCs can be reversed, even in adult retinas.
Direction-selective retinal ganglion cells that prefer left-to-right motion (Before) can change their directional preference (After) following a repetitive visual stimulus.
These dynamic changes in cells whose preferences were believed to be singular and hard-wired have implications not just for understanding retinal function but for understanding the physiological basis of visual perception. Stimulus-dependent changes in the coding of retinal ganglion cells also have downstream impacts on the visual thalamus, where retinal signals are processed. This unexpected plasticity in retinal cells has led Rivlin and her collaborators to investigate the possibility that the visual thalamus and other parts of the visual system might also display greater plasticity than previously believed.
Targeting Sodium Channels for Pain Treatment
Nature’s deadliest predators may seem an unlikely inspiration for developing new analgesic drugs, but as Nieng Yan, 2019 Blavatnik National Awards Finalist, explained, the potent toxins of some snails, spiders, and fish are the basis for research that could lead to safer alternatives to opioid medications.
Voltage-gated ion channels are responsible for electrical signaling in cells of the brain, heart, and skeletal muscles. Sodium channels are one of many ion channel subtypes, and their connection to pain signaling is well documented. Sodium channel blockers have been used as analgesics for a century, but they can be dangerously indiscriminate, inhibiting not only the intended channel but also others in cardiac or muscle tissues. The development of highly selective small molecules capable of blocking only channels tied to pain signaling seemed nearly impossible until two breakthroughs—one genetic, the other technological—brought a potential path for success into focus.
A 2006 study of families with a rare genetic mutation that renders them fully insensitive to pain turned researchers’ focus to the role of the gene SCN9A, which codes for the voltage-gated sodium ion channel Nav 1.7, in pain syndromes. Earlier studies showed that overexpression of SCN9A caused patients to suffer extreme pain sensitivity, and it was now clear that loss of function mutations resulted in the opposite condition.
A powerful natural toxin derived from cone snails blocks the pore of a voltage-gated sodium channel, halting the flow of ions and inhibiting the initiation of an action potential.
As Yan explained, understanding this channel required the ability to resolve its structure, but imaging techniques available at that time were poorly suited to large, membrane-bound proteins. With the advent of cryo-electron microscopy, Yan and other researchers have not only resolved the structure of Nav 1.7, but also characterized small molecules—mostly derived from animal toxins—that precisely and selectively interfere with its function. Developing synthetic drugs based on these molecules is the next phase of discovery, and it’s one that may happen more quickly than expected. “When I started my lab, I thought resolving this protein’s structure would be a lifetime project, but we shortened it to just five years,” said Yan.
Highlights
A novel approach to developing machine learning algorithms has extended neural networks to complex, network-structured datasets.
Neural networks can now be used for complex predictive tasks, including forecasting polypharmacy side effects.
5G wireless networks will expand the capabilities of internet-connected devices, providing dramatically faster data transmission and increased reliability.
Tools used to design wireless networks can also be used to understand vulnerabilities in the design of online platforms and social networks, particularly as it pertains to user privacy and data anonymization.
Machine Learning with Networks
“For the first time in history, we are using computers to process data at scale to gain novel insights,” said Jure Leskovec, a Blavatnik National Awards Finalist in 2017, 2018, and 2019, describing one aspect of the digital transformation of science, technology, and society. This shift, from using computers to run calculations or simulations to using them to generate insights, is driven in part by the massive data streams available from the Internet and internet-connected devices. Machine learning has catalyzed this transformation, allowing researchers to not only glean useful information from large datasets, but to make increasingly reliable predictions based on it. Just as new imaging techniques reveal previously unknown structures and phenomena in biology, astronomy, and other fields, so too are big data and machine learning bringing previously unobservable models, signals, and patterns to the surface.
This “new paradigm for discovery” has limitations, as Leskovec explained. Machine learning has advanced most rapidly in areas where data can be represented as simple sequences or grids, such as computer vision, image analysis, and speech processing. Analysis of more complex datasets—represented by networks rather than linear sequences—was beyond the scope of neural networks until recently, when Leskovec and his collaborators approached the challenge from a different angle.
The team considered networks as computation graphs, recognizing that the key to making predictions is understanding how information propagates across the network. Each node collects information from its neighboring nodes, and trained neural network layers aggregate the resulting data, so that node-level information can be used to make predictions within the context of the entire network.
Each node within a network collects information from neighboring nodes. Together, this information can be used to make predictions within the context of the network as a whole.
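The neighborhood-aggregation idea described above can be sketched in a few lines. This is a minimal, hand-built illustration of one round of message passing on a hypothetical four-node graph with made-up feature vectors; it is not Leskovec's actual implementation, whose published systems learn the aggregation functions from data.

```python
import numpy as np

# Hypothetical 4-node undirected network and per-node feature vectors.
edges = [(0, 1), (0, 2), (1, 2), (2, 3)]
features = np.array([[1.0, 0.0],
                     [0.0, 1.0],
                     [1.0, 1.0],
                     [0.5, 0.5]])

# Build adjacency lists from the edge list.
neighbors = {i: [] for i in range(len(features))}
for a, b in edges:
    neighbors[a].append(b)
    neighbors[b].append(a)

def propagate(h):
    """One round of message passing: each node averages its neighbors'
    features and concatenates the result with its own representation."""
    out = []
    for i in range(len(h)):
        agg = np.mean([h[j] for j in neighbors[i]], axis=0)
        out.append(np.concatenate([h[i], agg]))
    return np.array(out)

h1 = propagate(features)  # each node now "knows" about its neighborhood
print(h1.shape)           # 2 original features + 2 aggregated per node
```

Stacking several such rounds lets information from progressively larger neighborhoods reach each node; in a real graph neural network, the aggregation step is a trainable layer rather than a fixed average.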
Leskovec shared two case studies demonstrating the broad applicability of this approach. In healthcare, a neural network designed by Leskovec is identifying previously undocumented side effects from drug-drug interactions. Each network node represents a drug or a protein target of a drug, with links between the nodes emerging based on shared side effects, protein targets, and protein-protein interactions. This type of polypharmacy side-effect analysis is infeasible through clinical trials, and Leskovec is working to optimize it as a point-of-care tool for clinicians.
A similar system has been deployed on the online platform Pinterest, where Leskovec serves as Chief Scientist. It has improved the site’s ability to classify users’ preferences and suggest additional content. “We’re generalizing deep learning methodologies to complex data types, and this is leading to new frontiers,” Leskovec said.
Understanding and Engineering Communications Networks
Elza Erkip has never seen a slide rule. In two decades as a faculty researcher and electrical and computer engineer, Erkip, 2010 Blavatnik Awards Finalist, has corrected her share of misconceptions about her field, and about the role of engineering among the scientific disciplines. She joked about stereotypes portraying engineers—most of them men—wielding slide rules or wearing hard hats, but emphasized the importance of raising awareness about the real-life work of engineers. “Scientists want to understand the universe, but engineers use existing scientific knowledge to design and build things,” she explained. “We contribute to discovery, but mostly we want to solve problems, to find solutions that work in the real world.”
Erkip focuses on one of the most impactful areas of 21st century living—wireless communication—and the ever-evolving suite of technologies that support it. She reviewed the rapid progression of wireless device capabilities, from phones that featured only voice calling and text messaging, through the addition of Wi-Fi capability and web browsing, all the way to the smartphones of today, which boast more computing power than the Apollo 11 spacecraft that landed on the moon. She described the next revolution in wireless—5G networks and devices—which promises higher data rates and significant increases in speed and reliability. Tapping the millimeter-wave bands of the electromagnetic spectrum, 5G will rely on different wireless architectures featuring massive arrays of small antennas, which are better suited to propagating shorter wavelengths. The increased bandwidth will enable many more devices to come online. “It won’t just be humans communicating—we’ll have devices communicating with each other,” Erkip said, describing the future connectivity between robots, autonomous cars, home appliances, and sensors embedded in transportation, manufacturing, and industrial equipment.
Despite efforts to anonymize data, many social media sites and online databases remain vulnerable to efforts to match users’ identities across platforms.
Erkip also discussed the application of tools used to understand and build wireless networks to gain insight into privacy issues within social networks. De-anonymization of user data has long plagued online platforms. Studies have shown that it’s often possible to identify and match users across multiple social platforms or databases using publicly available information—a breach that has greater implications for a database of health or voting records than it does for a consumer-oriented site such as Netflix. Erkip is working to understand the fundamental properties of these networks to elucidate the factors that predispose them to de-anonymization attacks.
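To see why publicly visible connection patterns alone can betray identity, consider a toy sketch. The data and the structural "fingerprint" below are entirely hypothetical and deliberately crude; real de-anonymization studies work with far larger, noisier graphs and combine many more signals.

```python
# Hypothetical social graph on platform A, with real names visible.
platform_a = {"alice": {"bob"},
              "bob":   {"alice", "carol", "dave", "erin"},
              "carol": {"bob", "dave"},
              "dave":  {"carol", "erin", "bob"},
              "erin":  {"dave", "frank", "bob"},
              "frank": {"erin"}}

# The same people on platform B, with "anonymized" ids but the same ties.
platform_b = {"u1": {"u4", "u5", "u2", "u3"},
              "u2": {"u5", "u3", "u1"},
              "u3": {"u2", "u6", "u1"},
              "u4": {"u1"},
              "u5": {"u1", "u2"},
              "u6": {"u3"}}

def signature(graph, node):
    """Crude structural fingerprint: own degree plus sorted neighbor degrees."""
    return (len(graph[node]),
            tuple(sorted(len(graph[n]) for n in graph[node])))

# Index platform B's accounts by fingerprint.
sigs_b = {}
for node in platform_b:
    sigs_b.setdefault(signature(platform_b, node), []).append(node)

# Link any account whose fingerprint matches exactly one candidate.
matches = {}
for name in platform_a:
    candidates = sigs_b.get(signature(platform_a, name), [])
    if len(candidates) == 1:  # unambiguous structural match
        matches[name] = candidates[0]

print(matches)  # every "anonymous" account is re-identified by structure alone
```

In this small example every account is re-identified without using a single profile field, which is the core of the vulnerability Erkip studies: anonymizing names does little if the graph structure itself is distinctive.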
Materials Science
Speakers
Chiara Daraio Caltech
Liangbing Hu University of Maryland, College Park
Highlights
Computer-aided manufacturing is enabling researchers to design materials with precisely tuned properties, such as responsiveness to light, temperature, or moisture.
Structured materials can mimic robots or machines, changing shape and form repeatedly in the presence of various stimuli.
Ultra-strong, lightweight wood-based materials made of nanocellulose fibers may one day resolve some of the world’s most pressing challenges in water, energy and sustainability, replacing transparent plastic packaging, window glass, and even steel and other alloys in vehicles and buildings.
Mechanics of Robotic Matter
Chiara Daraio’s work challenges the traditional definition of words like material, structure, and robot. Working at the intersection of physics, materials science, and computer science, she designs materials with novel properties and functionalities, enabled by computer-aided design and 3D fabrication. Rather than considering a material as the foundation for assembling a structure, Daraio, 2019 Blavatnik National Awards Finalist, designs materials with intricate structures in unique and complex geometries.
Daraio demonstrated a series of responsive materials—those that morph in the presence of stimuli such as temperature, light, moisture, or salinity. In their simplest forms, these materials change shape—a piece of heat-responsive material folds and unfolds as air temperature changes, or a leaf-shaped hydro-sensitive material opens and closes as it transitions from wet to dry. In more complex forms, materials can display time-dependent responses, as shown in a video demonstration of a row of polymer strips changing shape at different rates, depending on their thickness. Daraio showed how computer-graphical approaches allow researchers to design a single material with different properties in different regions, allowing complex actuation in a time-dependent manner, such as a polymer “flower” with interconnecting leaves taking shape and a polymer “ribbon” slowly interweaving a knot.
A thin foil elastomer composed of materials with alternating temperature-sensitivity (heat and cold) folds up and “walks” across a table as the temperature varies.
Conventional ideas dictate that a robot is a programmable machine capable of completing a task. “But what if the material is the machine?” asked Daraio, showing the remarkable capabilities of a thin liquid crystal elastomer foil composed of one heat-sensitive and one cold-sensitive material. At room temperature, the foil is flat. Heat from a warm table causes it to curl upward, turn over, and “walk” forward. “As long as there’s some kind of external environmental stimulus, we can design a material that can repeatedly perform actions in time,” Daraio said. Similar responsive materials have been used in a self-deploying solar panel that unfolds in response to heat.
Materials have been “the seeds of technological innovation” throughout human history, and Daraio believes that structured materials will enable new functionalities at the macroscale—for use in wearables such as helmets as well as in smart building technologies—and at the microscale, where responsive materials could be used for medical diagnostics or drug delivery.
Sustainable Applications for Wood Nanotechnologies
Wood, glass, plastic, and steel are among the most ubiquitous materials on Earth, and Liangbing Hu, 2019 Blavatnik National Awards Finalist, is rethinking them all. Inspired by the global need to develop sustainable materials, Hu turned to the most plentiful source of biomass on Earth—trees—to create a new generation of wood-based materials with astonishing properties. Hu relies on nanocellulose fibers, which can be engineered to serve as alternatives to commonly used unsustainable or energy-intensive materials.
Hu introduced a transparent film that could pass for plastic and can be used for packaging, yet is ten times stronger and far more versatile. This transparent nanopaper, made of nanocellulose fibers, could also be used as a display material in flexible electronics or as a photonic overlay that boosts the efficiency of solar cells by 30%.
Hu has also tested transparent wood—a heavier-gauge version of nanopaper made by removing lignin from wood and injecting the channels with a clear polymer—as an energy-saving building material. More than half of home energy loss is due to poor wall insulation and leakage through window glass. By Hu’s calculations, replacing glass windows with transparent wood would also provide a six-fold increase in thermal insulation. Pressed, delignified wood has also proven to be a superior material for wall insulation. Used on roofs, it is a highly efficient means of passive cooling—the material absorbs heat and then re-radiates it, cooling the surface below it by about ten degrees.
White delignified wood is pressed to increase its strength. It can be used on roofs to passively cool homes by absorbing and re-radiating light, cooling the area below it by about ten degrees.
Comparisons of mechanical strength between wood and steel are almost laughable, unless the wood is another of Hu’s creations—the aptly named “superwood.” Delignified and compressed to align the nanocellulose fibers, even inexpensive woods become thinner and 10-20 times stronger. Superwood rivals steel in strength and durability, and could become a viable alternative to steel and other alloys in buildings, vehicles, trains, and airplanes. Sustainable sourcing would eliminate pollution and carbon dioxide associated with steel production, and its lightweight profile could drastically improve vehicle fuel efficiency.
Highlights
Tumor cells are genetically heterogeneous, complicating efforts to sequence DNA from tumor tissue samples.
Techniques for isolating and sequencing single-cell samples have transformed the study of cancer genetics.
Stimulated Raman scattering, a non-invasive imaging technique, can visualize processes including glucose uptake and fatty acid metabolism within living cells.
Single Cell Genomics: A Revolution in Cancer Biology
Nicholas Navin, 2019 Blavatnik National Awards Finalist, doesn’t use the word “revolution” lightly, but when it comes to the field of single-cell genomics and its impact on cancer research, he stands by the term. Over the past ten years, DNA sequencing of single tumor cells has led to major discoveries about the progression of cancer and the process by which cancer cells resist treatment.
Unlike healthy tissue cells, tumor cells are characterized by genomic heterogeneity. Samples from different areas of the same tumor often contain different mutations or numbers of chromosomes. This diversity has long piqued researchers’ curiosity. “Is it stochastic noise generated as tumor cells acquire different mutations, or could this diversity be important for resistance to therapy, invasion, or metastasis?” Navin asked.
Answering that question required the ability to do comparative studies of single tumor cells, a task that was long out of reach. DNA sequencing technologies historically required a large sample of genetic material—a tricky proposition when sampling a highly diverse population of tumor cells. Some mutations, which could drive invasion or resistance, may be present in just a few cells and thus not be represented in the results. Navin was part of the first team to develop a method for excising a single cancer cell from a tumor, amplifying the DNA, and producing an individualized genetic sequence. As amplification and sequencing methods have improved, so too have the insights gleaned from single-cell genomic studies, which Navin likens to “paleontology in tumors”—the notion that a sample taken at a single point in time can allow researchers to make inferences about tumor evolution.
Single-cell genomic studies reveal that some cancer cells have innate mechanisms of resistance to chemotherapy, and undergo further transcriptional changes that enhance this resistance.
Single-cell studies have contradicted the idea of a stepwise evolution of cancer cells, with one mutation leading to another and ultimately tipping the scales toward malignancy. Instead, Navin’s studies reveal a punctuated evolution, whereby many cells simultaneously become genetically unstable. Longitudinal studies of single-cell samples in patients with triple-negative breast cancer are beginning to answer questions about how cancer cells evade treatment, showing that cells that survive chemotherapy have innate resistance, and then undergo further transcriptional changes during treatment, which increase resistance.
Translating these findings to the clinic is a longer-term process, but Navin envisions that single-cell genomics will significantly impact strategies for targeted therapy, non-invasive monitoring, and early cancer detection.
Chemical Imaging in Biomedicine
Wei Min, a Blavatnik Awards Finalist in 2012 and 2019, concluded the session with a visually striking glimpse into the world of stimulated Raman scattering (SRS) microscopy. This noninvasive imaging technique provides both sub-cellular resolution and chemical information about living cells, while transcending some of the limitations of fluorescence-based optical microscopy. The probes used to tag molecules for fluorescent imaging can alter or destroy small molecules of interest, including glucose, lipids, amino acids, or neurotransmitters. Rather than using tags, SRS builds on traditional Raman spectroscopy, which captures and analyzes light scattered by the unique vibrational frequencies between atoms in biomolecules. The original method, pioneered in the 1930s, is slow and lacks sensitivity, but in 2008, Min and others improved the technique.
SRS has since become a leading method for label-free visualization of living cells, providing an unprecedented window into cellular activities. Using SRS and a variety of custom chemical tags—“vibrational tags,” as Min described them—bound to biomolecules such as DNA or RNA bases, amino acids, or even glucose, researchers can observe the dynamics of biological functions. SRS has visualized glucose uptake in neurons and malignant tumors, and has been used to observe fatty acid metabolism, a critical step in understanding lipid disorders. Imaging small drug molecules is notoriously difficult, but Min reported the results of experiments using SRS and vibrational tags to track therapeutic drug molecules and study their activity within tissues.
Stimulated Raman scattering microscopy uses chemical tags to image small biological molecules in living cells. The technique can visualize cellular processes including glucose uptake in healthy cells and tumor cells.
A recent breakthrough in SRS technology involves pairing it with Raman dyes to break the “color barrier” in optical imaging. Due to the width of the fluorescent spectrum, labels are limited to five or six colors per sample, which prevents researchers from imaging many structures within a tissue sample simultaneously. Min has introduced a hybrid imaging technique that allows for super-multiplexed imaging—up to 10 colors in a single cell image—and utilizes a dramatically expanded palette of Raman frequencies that yield at least 20 distinct colors.
Climate change is a growing threat with global impact. Shifts in the climate present special challenges for urban areas where more than half of the world’s population lives. New York City residents, for example, are already feeling the effects through recurrent flooding in coastal communities, warmer temperatures across all five boroughs, and strains in the city’s infrastructure during heavy downpours and extreme weather events. As a result, cities like New York require the best-available climate science to develop tangible policies for resilience, mitigation, and adaptation.
On March 15, 2019, climate scientists, city planners, and community and industry stakeholders attended the Science for Decision-Making in a Warmer World summit at the New York Academy of Sciences to discuss how cities are responding to the effects of climate change. The event marked the 10th anniversary of a successful partnership between the New York City Panel on Climate Change (NPCC), the City of New York, and the New York Academy of Sciences. Established in 2008, the NPCC has opened new frontiers of urban climate science to build the foundation for resiliency actions in the New York metropolitan region.
Learn about the NPCC’s latest research findings and their implications for New York City and other cities seeking to identify and mitigate the effects of climate change in this summary.
Meeting Highlights
NPCC research provides tools to inform and shape climate change resilience in New York City and other cities around the globe.
Shifts in mean and extreme climate conditions significantly impact cities and communities worldwide.
Cities can move forward by adopting flexible adaptation pathways, an overall approach to developing effective climate change adaptation strategies for a region under conditions of increasing risk.
There is a growing recognition that resilience strategies need to be inclusive of community perspectives.
Speakers
Dan Bader Columbia University, New York City Panel on Climate Change
Jainey Bavishi New York City Mayor’s Office of Recovery and Resiliency
Sam Carter Rockefeller Foundation
Alan Cohn New York City Department of Environmental Protection
Kerry Constabile Executive Office of the UN Secretary General
Susanne DesRoches New York City Mayor’s Office of Recovery and Resiliency
Alexander Durst The Durst Organization
Sheila Foster Georgetown, New York City Panel on Climate Change
Vivien Gornitz Columbia University, New York City Panel on Climate Change
Mandy Ikert C40 Cities Climate Leadership Group
Klaus Jacob Columbia University, New York City Panel on Climate Change
Michael Marrella New York City Department of City Planning
Richard Moss American Meteorological Society
Kathy Robb Sive, Paget, and Riesel
Seth Schultz Urban Breakthroughs
Daniel Zarrilli, PE New York City Office of the Mayor
Climate Change, Science, and New York City
Speakers
Alan Cohn New York City Department of Environmental Protection
Susanne DesRoches New York City Mayor’s Office of Recovery and Resiliency
Alexander Durst The Durst Organization
Michael Marrella New York City Department of City Planning
Daniel Zarrilli (keynote) New York City Office of the Mayor
James Gennaro (panel moderator) New York State Department of Environmental Conservation
Keynote: Preparing for Climate Change — NPCC and Its Role in New York City
Daniel Zarrilli, of the New York City Office of the Mayor, gave the first keynote presentation. In addition to outlining NPCC history, he emphasized what the NPCC means to the city. NPCC has provided the tools to inform policy since before Hurricane Sandy in 2012. Because of NPCC, Zarrilli stated, people now know that the waters around New York City are rising “twice as quickly as the global average” and that climate change will affect communities disproportionately. The city can and will take on the responsibility to protect those who are most vulnerable. Zarrilli highlighted steps the Mayor’s Office is taking: fossil fuel divestment, bringing a lawsuit against big oil for causing climate change, and launching a new OneNYC strategic plan to confront our climate crisis, achieve equity, and strengthen our democracy. He concluded by saying that with “8.6 million New Yorkers and all major cities watching,” NPCC is providing the best possible climate science to drive New York City policy.
Panel 1: NPCC and Its Role in New York City
How are NPCC findings used in developing resiliency in New York City?
The first panel was moderated by William Solecki of Hunter College Institute for Sustainable Cities – City University of New York, and featured three city representatives: Susanne DesRoches, of the New York City Mayor’s Office of Recovery and Resiliency; Michael Marrella, of the New York City Department of City Planning; and Alan Cohn, of the New York City Department of Environmental Protection; along with one industry stakeholder, Alexander Durst, of the Durst Organization.
DesRoches noted that NPCC research has made possible a proliferation of guidelines regulating building design in the city. In fact, the New York City Climate Resiliency Design Guidelines, released the same day that the panel took place, provide instruction on how to use climate projections in the design of city buildings. The Department of City Planning also uses NPCC data in its Coastal Zone Management Program to require that coastal site developers disclose and address current and future flood risks. Marrella added that NPCC research tools allow public and private stakeholders to make informed decisions on how to shape policy. NPCC methods, approaches, and climate data are also being used for New York State and national projections.
Panelists also addressed how New York City’s mitigation goals enable resilience in the face of climate change challenges. DesRoches pointed to the city’s aggressive climate targets, including an “80% [emissions] reduction by 2050,” and a goal to limit temperature increase to 1.5°C, as targeted by the Paris Agreement (UN Climate Change 2015). She gave two examples of adaptations that align with the City’s mitigation goals: adopting high “passive house” and green building standards for a reduced carbon footprint; and diversifying how the city receives energy, including the development of a renewable energy grid. Cohn added that the Department of Environmental Protection aims to free up capacity through water conservation and to use methane as an energy source. With resilience in mind, Durst stressed that energy models should be uniform and based on the future, not just today.
Panel 2: Latest Findings from the New York City Panel on Climate Change
What types of information are the most useful?
The second panel was moderated by Julie Pullen of Jupiter Intelligence, and featured four NPCC members who presented the latest NPCC3 report findings: Vivien Gornitz, Klaus Jacob, and Daniel Bader of Columbia University; and Sheila Foster, of Georgetown Law.
The latest NPCC3 findings confirmed climate projections from the 2015 report as the projections of record for New York City planning and decision-making. For example, by the end of the century, “ocean levels will be higher than they are now due to thermal expansion; changes in ocean heights; loss of ice from Greenland and Antarctic Ice Sheets; land-water storage; vertical land movements; and gravitational, rotational, and elastic ‘fingerprints’ of ice loss,” said Gornitz. Under the NPCC’s new Antarctic Rapid Ice Melt (ARIM) scenario, there could be up to 9.5 ft. of sea level rise by 2100 at the high end of the projections. The new report advises that levees or raised streets might reduce the effects that sea level rise will have on New York City’s coastline.
Vulnerability to climate change varies by neighborhood and socioeconomic status. Foster presented a new three-dimensional approach to community-based adaptation through the lens of equity: distributional, contextual, and procedural. Distributional equity emphasizes disparities across social groups, neighborhoods, and communities in vulnerability, adaptive capacity, and the outcomes of adaptation actions. Contextual equity emphasizes social, economic, and political factors and processes that contribute to uneven vulnerability and shape adaptive capacity. Procedural equity emphasizes the extent and robustness of public and community participation in adaptation planning and decision-making.
Echoing Mayor Bloomberg’s sentiment that “if you can’t measure it, you can’t manage it,” Jacob presented the proposed NPCC New York City Climate Change Resilience Indicators and Monitoring system (NYCLIM). Through the proposed NYCLIM system, the NPCC recommends climate, impact, vulnerability, and resilience indicators for the City’s decision-making processes.
Cities as Solutions for Climate Change and Closing Remarks
Keynote Speaker and Panelists
Jainey Bavishi New York City Mayor’s Office of Recovery and Resiliency
Sam Carter Rockefeller Foundation
Kerry Constabile Executive Office of the UN Secretary General
Seth Schultz Urban Breakthroughs
Mandy Ikert (keynote) C40 Cities Climate Leadership Group
Richard Moss (panel moderator) American Meteorological Society
Keynote: Role of Cities in Achieving Progress
Mandy Ikert, of C40 Cities Climate Leadership Group, gave the second keynote presentation. The Future We Don’t Want, a study recently released by C40, the Urban Climate Change Research Network (UCCRN), and Acclimatise found that billions of urban citizens are at risk of climate-related heat waves, droughts, floods, food shortages, and blackouts by 2050 (UCCRN 2018). Cities are situated at the forefront of these effects and urgently need to respond. Ikert stated that “we live in an urbanizing world,” where 68% of the world’s population will be living in cities by 2050, up from approximately 54% today. Ikert stressed that “mayors and city agencies are directly accountable to their constituency” in order to protect and preserve their lives and livelihoods. She also urged cities to reach out to researchers to obtain accurate modeling for extreme events. Cities have the potential to account for 40% of the emissions reductions required to align with the Paris Agreement’s goal to limit temperature rise to 1.5°C (UN Climate Change 2015). Therefore, the way a city responds to climate change, Ikert said, determines how livable and competitive it will be in the future.
Panel 3: City Stakeholders and Beyond
How can knowledge networks and city networks improve interactions to achieve climate change solutions?
The final panel was moderated by Richard Moss of the American Meteorological Society and featured Corinne LeTourneau, of 100 Resilient Cities’ North America Region; Kerry Constabile, of the Executive Office of the UN Secretary General; Jainey Bavishi, of the New York City Mayor’s Office of Recovery and Resiliency; and Seth Schultz, of Urban Breakthroughs, who spoke about the enormous value and knowledge of stakeholders.
In this session, all of the participants highlighted that many cities are playing a critical role in meeting the challenge of climate change, both through efforts to reduce their own greenhouse gas footprints, and to update infrastructure and programs to meet the needs of their citizens as climate change impacts occur.
Panelists discussed how finances are a major challenge to addressing climate change. For example, Constabile noted that only a small percentage of megacities in developing countries have credit ratings. This lack of “creditworthiness” hinders cities from raising their own bonds and attracting private investment, both of which are significant sources of funding for climate-related projects. Schultz suggested that private money may jumpstart some climate resiliency and adaptation efforts, and stated that eight of the world’s ten largest countries are funding research on climate change. LeTourneau and Schultz noted that without the climate data to assess risks, money will not be directed to the areas of greatest need. LeTourneau highlighted the importance of describing how climate change affects risks and “the bottom line” in a way that decision makers and citizens find compelling and relatable.
Panelists also highlighted that climate does not have boundaries, but government bodies do. As Bavishi pointed out, New York City is lucky that climate change adaptation has been codified into law. Chief resilience officers are retained even after city funding is spent, so continuity is in place. City governments around the country and the globe are following suit, but as the panelists pointed out, these ideas should spread more widely.
Closing Remarks
NPCC member Michael Oppenheimer remarked that the NPCC offers a “local picture at granular level with the best possible science.” Hurricane Sandy taught the City about its vulnerability and drove research on flood risk and rising coastal tides. With the 2010 NPCC report, he said, a firm research agenda was drafted that shifted the City’s view of climate change to resiliency. Oppenheimer stressed that NPCC science is useful for policy and praised New York City for utilizing NPCC data in policy decisions. In closing, Oppenheimer said that dissemination assures that communities worldwide are able to use NPCC data.
Further Readings
Ikert
Rosenzweig C, Solecki W, Romero-Lankao P, Mehrtotra S, et al.
Whereas: Global issues are often felt most deeply at the local level, and in the face of worldwide threats to our environment, infrastructure, and economy, cities have the power and responsibility to lead our planet in the right direction. After Hurricane Sandy, when the devastating effects of climate change hit home for far too many of our residents, New York City reaffirmed our commitment to building a sustainable path forward. On the 10th anniversary of its founding, it is a great pleasure to recognize the New York City Panel on Climate Change for its exceptional leadership in this work.
Whereas: Since 2008, the NPCC’s innovations in urban climate science have propelled New York to the forefront of the global fight against climate change. Its recommendations have informed ambitious policies that have helped the five boroughs recover from past damage and emerge stronger, and its successful partnership with the City of New York and the New York Academy of Sciences demonstrates the power of collaboration between the public sector, industry and local leaders, and the scientific community. With the NPCC’s guidance, we are better prepared to anticipate and conquer the climate challenges that lie ahead.
Whereas: New Yorkers have always been known for their resiliency and boldness, and our city must meet concerns of this scale with solutions that are worthy of its residents. From increasing our coastal resiliency to pioneering a global protocol for cities to attain carbon neutrality by 2050, my administration remains steadfast in our efforts to protect people of all backgrounds from the impacts of climate change. As we continue to grapple with the grave risks that global warming poses, we are grateful to the NPCC for providing our city with the rigorous science needed to thrive in our rapidly changing world. Today’s Summit offers a wonderful opportunity to applaud this organization for a decade of service to New York City, and I look forward to the progress its members will continue to inspire in the years ahead.
Now therefore, I, Bill De Blasio, Mayor of the City of New York, do hereby proclaim Friday, March 15th, 2019 in the City of New York as:
We caught up with New York City Panel on Climate Change (NPCC) member Michael Oppenheimer to discuss the importance of sound science informing effective policy.
Published February 22, 2019
By Marie Gentile, Mandy Carr, and Richard Birchard
Michael Oppenheimer, PhD
It will take more than a village — even when that “village” is the size of New York City — to find solutions to climate change, but that hasn’t deterred the New York City Panel on Climate Change (NPCC).
The panel consists of leading climate change scientists, policy makers, and private sector practitioners. Together, they are identifying and communicating the impacts of climate change. We recently sat down with NPCC member Michael Oppenheimer — head of Princeton University’s Center for Policy Research on Energy and Environment — to discuss the importance of sound science informing effective policy.
Why should NYC take the lead on identifying the impact of climate change?
Not only does NYC have the financial and intellectual capital to address climate change, it has the ability to deploy that capital to find solutions and to weigh the looming risks and the options for dealing with them. Its resources, in that way, are greater than those of any other city on earth.
Secondly, the city has a very high level of risk along its coast, compared to other places around the world. We are subject to both sea level rise and North Atlantic hurricanes and that’s a one, two punch. When it goes bad, you get Hurricane Sandy. So we have to learn to live in an already risk-laden world. If we can figure out how to deal with current risks and sustain the viability of the city through future, growing risks, that will be an important lesson for other places.
What role does the private sector have in helping to shape and implement NYC’s climate change response?
The private sector can be very helpful in terms of gathering the information we need to design potential options. A lot of the progress that’s been made in places like The Netherlands has been made with heavy private sector involvement. The private sector will have to be deeply involved in capital intensive solutions, like a surge barrier or the Big U, not as investors in the projects but because these will have significant implications for businesses. Their support could be a critical factor in the success of such efforts.
Conversely the private sector can create obstacles to progress by being resistant to the financial arrangements that are needed for adaptation and resilience building. NYC’s real estate industry is very politically influential and its preferences have often been quite visible. Sometimes their proposals are smart, and sometimes they are counterproductive and focused on rather narrow interests rather than the welfare of the city. Instead, I hope the industry provides forward-looking engagement that helps the city to protect its people at an affordable cost.
Why is scientific research critical to the development of good policy?
If we don’t have science, we have nothing. We have no evidence to provide a basis for rational decisions, no way to know whether it’s wise to retreat from certain areas of the city, and no way to compare the effects of surge barriers with those of more modest control efforts.
We have to understand these things as best we are able decades in advance, in order to implement cost-effective solutions. Policymakers cannot make efficient decisions on any broad-scale adaptation project unless they have at least a rough idea of how fast the sea level may rise. Without science, for example, we won’t know whether to begin certain activities now or defer them for 10 years.
If there was ever a problem where you need cutting edge science, climate change is it. The city has been very wise in engaging scientists in understanding what the risk is through the NPCC. That way, the city is in the position to make the best decisions that can be made today, even given significant uncertainty.
How can scientists more effectively communicate with policymakers to translate their findings into effective policy?
Scientists need to be honest with policymakers about what the uncertainties are, what might happen, and what the risks are of taking certain steps (or not taking them). Scientists have to be willing to engage in a two-way conversation, listening carefully to what policymakers need, so that they can better formulate their responses.
In general scientists are not brilliant communicators, but it isn’t necessarily their fault. It’s also difficult to decipher what politicians are willing to hear. Scientists have to talk to political leaders as if they’re average people, and not in jargon. They need to understand, when they approach politicians and policy makers, that in a democracy everyone involved in the decision process, including scientists, is ultimately responsible to the average citizen.
To learn more on this topic, read the full report published in our Annals Special Issue: Advancing Tools and Methods for Flexible Adaptation Pathways and Science Policy Integration: NPCC 2019 Report.
Recently, Vice President Pence laid out an ambitious plan to establish a new military “Space Force” as soon as 2020. NASA has already outlined its plans to send humans to Mars in the 2030s. Private companies like Boeing, SpaceX and Sierra Nevada Corp., are investing heavily in commercial spacecraft. And Orion Span, Bigelow Aerospace, Virgin Galactic and Blue Origin are just a few of the players testing the space tourism waters as the ultimate vacation destination for those who have lots of disposable income and have already been everywhere on Earth, twice.
But what impact might increased human activity have on the fragile space eco-system? How will space travelers grow enough food to sustain a trip of months or years? Already some experts are sounding the alarm about the amount of “space debris” in orbit around the Earth. Who gets to own space and how will commercial and military use of space be governed?
2019 will mark yet another milestone for space travel. As we celebrate the 50th anniversary of the moon landings, the first fleet of private “space taxis” will be deployed. If all goes as planned, SpaceX’s Crew Dragon capsule and Boeing’s CST-100 Starliner are both scheduled to blast off on test flights with NASA astronauts on board.
A Tremendous Expansion in Scientific Knowledge
We have had nearly sixty years of space travel, and almost fifty years since the iconic “giant leap for mankind.” Human exploration of space has resulted in a tremendous expansion in scientific knowledge about our solar system, and orbiting satellites have provided critical knowledge about the Earth itself — continuously collecting data on global climate, environmental change and natural hazards.
But the scientific benefits of space exploration are only the tip of the iceberg. Our activity in space has improved nearly every aspect of quality of life on Earth. Early satellites contributed critical knowledge and capabilities for communication and global positioning. The challenges of energy efficiency for space exploration drove the development of solar cells, batteries and fuel cells. The precision and reliability required of robots for space have advanced robotic capabilities on Earth, such as a robotic glove developed as a grasp assist device, first for astronauts and then factory workers.
The International Space Exploration Coordination Group recently published an overview of the benefits stemming from space exploration, listing the following technological innovations: implantable heart monitors, light-based anti-cancer therapy, cordless tools, light-weight high temperature alloys for jet engines, cell phone cameras, compact water purification systems, global search-and-rescue systems and biomedical technologies.
An Exciting New Phase of Space Exploration
We are poised on the edge of an exciting new phase of space exploration — what Bloomberg Businessweek recently called “The New Space Age.” This new phase is characterized not only by a new mission — Mars and beyond — but by a new focus on sustainability. Spending years in an enclosed environment on a planet with no breathable atmosphere, a long-haul space mission will get no replenishments of food, water, equipment, clothing or anything else.
As astronaut Cady Coleman put it, “Sustainability, for someone like myself planning to go to Mars, is a closed loop system, not being able to go home or bring supplies. The things we need to think about are exactly the things we need to think about for a sustainable Earth.”
Sustainable space exploration promises to be an essential driver for exciting and dynamic discoveries. The possibilities of providing solutions to some of our most urgent problems, creating ecosystems of innovation, fueling job creation, and inspiring new generations of young people toward careers in science, engineering and technology are limitless.
And by overcoming the challenges of sustainable space travel, we have an opportunity to realize a whole new set of benefits for the 7.5 billion people here on Earth.
Amy Pruden’s research examines the spread of antibiotic resistance, a major public health and environmental concern.
Amy Pruden, PhD
Published August 13, 2018
By Marie Gentile, Mandy Carr, and Richard Birchard
The spread of antibiotic resistance is a major public health concern, prompting a movement to reduce antibiotic use in food animal production and to prevent resistance buildup in people and the environment.
Amy Pruden, PhD, the W. Thomas Rice Professor in the Department of Civil and Environmental Engineering at Virginia Tech, was among the first researchers to describe antibiotic resistance genes (ARGs) as environmental “contaminants.”
Her research has laid a foundation for understanding why and how agricultural, wastewater, and water environments may represent key pathways for receiving and spreading antimicrobial resistance.
This interview has been edited for space and clarity.
What first led you to investigate water pathways as locations that contribute to the antibiotic resistant genes burden?
When I was a new faculty member at Colorado State University, there was this growing awareness of emerging pollutants – the trace chemicals that end up in our water. Things like pharmaceuticals, personal care products, etc.
Things that in the past, we thought, ‘Oh, it goes down the drain and it goes away,’ or, ‘I took that pill, it’s gone. My body broke it down.’ Now we know that isn’t the case.
At the time, my collaborator, Dr. Ken Carlson had begun looking at antibiotic residuals in Colorado’s Poudre River. Ken is a water chemist and had developed techniques to look for pharmaceuticals at trace levels in environmental water samples. He was able to distinguish between antibiotics typically found in livestock and in people.
This led me to think, ‘Antibiotics in the environment might not be much of a concern, unless they’re influencing the resident microbial communities and stimulating the spread of antibiotic resistance.’ At the same time, I was well-aware of the complexity of microbial communities in the environment and that culture-based methods would only provide information about a small fraction of a percent of the bacteria in the river.
It all came together: if we wanted to understand antibiotic resistance in these river sediments, we had to use DNA-based tools, and not look at one culture or strain at a time.
What are some of the practical challenges of your work?
A big challenge is the lack of a standard, agreed-upon method for monitoring antibiotic resistance in the environment. Most of the antibiotic resistance work that’s been done has been done in the clinic, but the single strain-based diagnostic methods used there are not necessarily appropriate for environmental monitoring.
Ideally, what is needed are tools and metrics that capture microbial ecological dimensions of antibiotic resistance, including types, mechanisms, and magnitudes of ARGs, and their potential to spread.
Assessing the potential for bacteria to share their ARGs, which they can do within and among members of microbial communities via horizontal gene transfer, is especially key.
Currently we’re working on methods using next-generation DNA sequencing and bioinformatics analysis to gain a holistic “resistome” perspective: a full assessment of all the ARGs that are present, along with mobile genetic elements, like plasmids, transposons, and integrons, that may facilitate the development of multi-drug resistance and the capacity for ARGs to spread among bacteria.
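To make the idea concrete, here is a minimal sketch of the kind of summary a resistome analysis produces. The gene names below are real ARG families, but the annotation list itself is invented for illustration; actual pipelines derive such annotations by matching millions of sequencing reads against curated ARG databases.

```python
from collections import Counter

# Hypothetical read annotations: (gene, resistance class, on a mobile genetic element?)
# In a real pipeline these come from aligning sequencing reads to an ARG database.
annotations = [
    ("sul1",   "sulfonamide",  True),
    ("tetW",   "tetracycline", False),
    ("ermB",   "macrolide",    True),
    ("sul1",   "sulfonamide",  True),
    ("blaTEM", "beta-lactam",  False),
]

# Resistome profile: number of hits per resistance class
profile = Counter(cls for _, cls, _ in annotations)

# Fraction of hits associated with mobile genetic elements, a rough proxy
# for the potential of ARGs to spread via horizontal gene transfer
mobile_fraction = sum(1 for _, _, mobile in annotations if mobile) / len(annotations)

print(profile)          # hit counts by resistance class
print(mobile_fraction)  # 0.6
```

Even this toy tally captures the two questions Pruden describes: which resistance types are present and in what magnitudes, and how much of that resistance sits on elements that can move between bacteria.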
How can we better control the spread of antibiotic resistance genes?
We need to get at the root causes, understanding how antibiotic resistance evolves and spreads in the first place. Identifying hotspots can be a useful way to achieve this.
A hotspot is a place where many factors come together to increase the chances that antibiotic-resistant pathogens can evolve. For example, wastewater treatment plants are potential hotspots, because they bring together everything that’s flushed down the drain, pathogens, ARGs, and antibiotics. Hotspots would be a useful target both for monitoring and mitigation.
The other big area is in agriculture. The majority of antibiotics used in the world are for agriculture and livestock. Yet we don’t have wastewater treatment plants on farms – that would be too costly and impractical.
Instead, there are opportunities to improve manure management. For this to work, we need simple, practical guidelines that determine which antibiotics best protect livestock while having the least effect on human health and the environment. Then we need to decide how to handle manure from livestock treated with antibiotics.
Should it be composted or digested? What are the safest practices for land application as a soil amendment?
Textile waste has been on the rise in recent years because of “fast fashion” trends. Companies are exploring ways to recycle these otherwise discarded materials.
How much stuff do you have in your closet? If you’re like most people, way too much, and much of it clothing you probably seldom wear. According to Mattias Wallander, CEO of USAgain, Americans purchase five times as much clothing as they did in 1980 — largely due to “fast-fashion” — low-quality, inexpensive clothing typically found at retailers like H&M and Forever 21. As a result, textile waste grew 40 percent between 1999 and 2009, according to the Council for Textile Recycling. In 2014 the EPA reported that 10,460,000 tons of textile waste was thrown into landfills.
In the State of Fashion 2018 report by Business of Fashion and McKinsey & Company, Dame Ellen MacArthur said, “Today’s textiles economy is so wasteful that in a business-as-usual scenario, by 2050 we will have released over 20 million tons of plastic microfibers into the ocean.” Those stats show a frightening trend, but according to a 2014 article in The Atlantic, of the clothing that is collected by charities: 45 percent is used for secondhand clothing, 30 percent is cut down and made into industrial rags, 20 percent is ground down and reproduced and five percent is unusable. Less than one percent is recycled into new textile fiber.
Barriers to Recycling Textiles
So why isn’t more disused clothing being recycled? According to Natasha Franck, the founder of EON, a collective focused on making fashion sustainable, the biggest barrier to recycling textiles is the lack of material transparency: fabric cannot be recycled if its composition is unknown. Seventy percent of retailers plan to provide item-level tagging by 2021, and EON is developing the first global tagging system for textile recycling, making it easier to sort fabrics.
Some retail companies are developing their own solutions. International fashion retailer Zara, for example, is installing collection bins across all its stores in China, while Swedish retailer H&M has invested in Re:Newcell, maker of the first garment in the world made from chemically recycled used textiles. C&A introduced a mass-market-price T-shirt that is “Cradle-to-Cradle” certified, meaning its designers and manufacturers have undergone a continual improvement process that looks at five quality categories: material health, material reutilization, renewable energy and carbon management, water stewardship, and social fairness. Each product receives a level of achievement in each category — basic, bronze, silver, gold or platinum.
Many cities have their own recycling programs. New York City has GrowNYC collection points where residents can donate clothing. Unwanted clothes are picked up at collection stations and then taken to a facility to be sorted and recycled. Germany-based I:CO — short for I:Collect — provides global solutions for the collection, reuse and recycling of used clothing and shoes. Its worldwide take-back system and logistics network currently operates in 60 countries and helps cities and retail outlets to develop recycling solutions.
According to aerodynamic laws, bumblebees should not be able to fly, and yet they do. Similarly, if past lessons of human history are reliable guides to future performance, ambitious global commitments to address poverty, inequality and sustainable development should quickly founder amid human foibles. And yet, in the three years since their adoption, the United Nations’ Sustainable Development Goals (SDGs) have already changed the conversation about what collective will can accomplish. The shift has taken place thanks in part to members of the world’s scientific community, who have stepped into informal roles as conceptual interpreters, brokers between advocacy and realpolitik, and coalition builders.
The Power of Collective Effort
When 193 U.N. member states signed onto the SDGs in 2015, there was fresh evidence that seemingly intractable issues of poverty, growth and inequality could in fact yield to collective effort. The U.N.’s preceding framework, the Millennium Development Goals (MDGs), had met its best-known objective of “cutting extreme poverty in half” five years ahead of schedule. The SDGs raise the poverty goalposts even higher — by redefining poverty beyond purely monetary terms as a threefold condition that includes economic, social and environmental factors.
The SDGs have pulled in active participation from a growing spectrum of stakeholders that include governments, multi-lateral organizations, NGOs and private-sector actors. But with every stakeholder pressing ahead with its own SDG priorities, what actually addresses global poverty is the question that connects all parties. This common need for shared, fact-based understanding has put scientific disciplines into a position of de-facto referee. The perceived apolitical objectivity of scientific methods and the historic training of scientists in the transfer of knowledge offer a glue strong enough to hold together would-be SDG collaborators and partners, and dissolve tensions born out of perceived biases or competing agendas.
An Unfolding, Dynamic Entity
Scientists involved with the SDGs acknowledge they are a complex, even sprawling web of interdependent causes and effects. Scientifically disentangling causes, conditions and valid findings would be challenging even before all the cultural, political and environmental variables that prevail across the globe are factored in. “How do you talk to people when sustainability is an unfolding, dynamic entity?” asks Dr. Mark B. Milstein, who directs the Center for Sustainable Global Enterprise at Cornell University’s Samuel Curtis Johnson Graduate School of Management. “The SDGs really capture that—they’re overlapping, they’re not clean, with sub-areas that are not mutually exclusive.”
A strategic management expert by training, Milstein straddles the intersection where situation-specific solutions and broad, transferable scientific insights merge or collide. Explicitly, Milstein specializes in framing the world’s social and environmental challenges as unmet market needs, often best addressed by the private sector. Tacitly, as someone who consults extensively with business entities to help them effect change, he’s a translator. “For somebody like myself, rigorous scientific inquiry means training to examine and analyze data sets, and look for trends,” says Milstein. “How do you go about doing work that can adhere to scientific rigor while still trying to move the needle on these critical issues that we believe have to be addressed?”
Immediate Problem Solving
The private-sector SDG actors who are making decisions and on-the-ground investments, Milstein notes, tend to be focused on immediate problem solving. They’re equally committed to their own SDG projects, he notes, but often working with shorter deadlines, and applied research that leans more to market needs and decisions. Part of his job, he elaborates, is using the kind of knowledge science can produce to help private business along.
“Since we’re talking about how it makes sense for the private sector to get involved and stay involved, we have to make sure the questions we’re asking are as clear as can be, that we’re being very specific about the language that we use and the data that we collect, and the conclusions that we draw from that,” he explains. “There’s no reason why applied research cannot be rigorous the way academic research is.”
For SDG scientist stakeholders, dynamic tension is built into the multiple roles they are asked to play. Working as a policy expert for the U.N. Development Program (UNDP), Dr. Esuna Dugarova walks a tightrope every day between scientific detachment and the realities of SDG realpolitik.
“Being part of the U.N. system, I’m here to promote the framework of the SDGs, and to provide recommendations to governments on how to implement the 2030 Agenda,” says Dugarova, emphasizing that her perspective on SDG multi-tasking is her own, and not that of UNDP. “On the other hand, in my capacity as a researcher, I do research and analysis. Sometimes, the recommendations are not always what governments want to hear. I’m also critical about what kind of data should be used, and how to incorporate that data to make good policy advice.”
Processing Data Mindfully
As one example, Dugarova points to her research work on unemployment and poverty in Central Asia. Accurate findings are difficult to obtain, she recounts, partly because large portions of local employment are not parts of formal economies, and thus underreported. Additionally, host governments are sensitive about their image, creating a delicate atmosphere for the presentation of the data. “One must be mindful about how to process data,” says Dugarova.
Dugarova has a very definite point of view about one of the major levers that drive progress against poverty, inequality and towards sustainable development: gender equality. “There are certain universal accelerators. Gender equality is one of them, capable of achieving many goals at the same time, whether it’s economic development, food security, climate change or political participation.”
But here again, Dugarova is keenly aware of her role as an informal broker of facts to sometimes unreceptive national governments, who happen to be her major professional stakeholders. She can easily point to gender-equality progress. For example, two-thirds of developing countries have achieved gender equality in primary education, female political participation is growing strongly in Latin America and U.N. economic models show strong correlation between female labor force participation and economic growth.
Structural, Institutional, and Cultural Bottlenecks
She’s also aware of structural, institutional and cultural bottlenecks in the way of further progress, citing gender-based violence as an example. As a policy expert and advocate for gender equality, Dugarova realizes it’s one thing to know that 49 countries still have no legal framework to address domestic violence; it’s entirely another to go up against social and cultural norms that are often woven into national identity. “If you address gender norms that are embedded in national identity, you have to address or even change national identity, and these are deeply embedded in the nation-state,” she elaborates. Dugarova does not have to state the obvious: that the nation-state is the foundation of the U.N. system.
There does seem to be consensus among stakeholders that achievement of the SDGs will require unprecedented levels of cooperation, and entirely new models of partnership. Dr. Robert Lepenies is a Research Scientist at the Helmholtz Centre for Environmental Research (UFZ) in Leipzig, Germany, and a member of the Global Young Academy. He has watched the specific ways in which the world’s scientific communities coalesce around the SDGs, and is an active participant in related coalition-building.
The SDGs, Lepenies points out, have put new initiatives in motion to bring together scientists, policy specialists and non-governmental actors, with impacts yet to be revealed. Lepenies mentions cooperation between statistical agencies worldwide to agree on metrics to determine whether the SDGs have been successfully met. In no way is this a finished process, notes Lepenies, and scientists must use the prestige of their positions to continue to press for accountability and statistical rigor. “I think the major advantage is that the discussion has been changed for good now,” Lepenies says. “It is simply assumed that partnerships must be interdisciplinary, transdisciplinary, participatory and draw on different types of input.”
Processes, Methodologies, and Approaches
Lepenies is particularly optimistic about relatively new entities such as the Global Young Academy, and innovative hybrid frameworks such as Future Earth’s Knowledge-Action Networks. “I am personally very excited about the pioneering roles played by national science academies, particularly young academies in places like Africa, and even associations of science academies such as the InterAcademy Partnership,” Lepenies observes. “Poverty is back on the agenda, defined in ways that will contribute to huge capacity building for social, economic and environmental statistics around the world.”
The Holy Grail for SDG scientists who attempt to address the economic, social and environmental dimensions of poverty are universally applicable solutions — processes, methodologies and approaches — that are in fact sustainable, scalable and replicable.
But the reality seems to be much messier, with progress that takes the form of scalpels rather than hammers, and localized, population-specific solutions rather than sweeping antidotes. In the past three years, scientists invested in the success of the SDGs may have built or picked up an increasingly fine-grained understanding of what works, what doesn’t and why. They’ve learned new ways of communicating with SDG partners who think and speak in a different idiom. And they’ve demonstrated willingness to partner with each other and with non-scientist stakeholders.
A More Just World is Possible
Scientists are also learning, perhaps, to remain participants in an SDG universe of calibrated expectations and incremental advancements. The U.N.’s own SDG charter contains terms like “slow and uneven progress.” As Lepenies says, “The SDGs are primarily about the long-term vision we have for our planet. Even though the agreed-upon goals represent a non-binding consensus, I think we should look at the 2030 Agenda as the best chance to achieve a ‘realistic utopia,’ a global endeavor to bring about social and intergenerational justice. A more just world is possible, and the SDGs give us a pretty good shot at achieving this.”
Editor’s Note: The views expressed by the participants quoted in this article are personal and do not necessarily reflect the positions of their affiliated institutions or The New York Academy of Sciences.
Imagine an “Intellicity,” where neural networks ensure everything works together.
Published May 1, 2018
By Lori Greene
Today’s students will be the inhabitants of tomorrow’s cities, and they want more sustainable ways of living and working in urban ecosystems.
That was the premise behind United Technologies’ Future of Buildings Innovation Challenge, an event created by The New York Academy of Sciences and launched in September 2017.
Fifty-two teams of students 13 to 18 years old from across the globe competed. Their goal: to conceive the most inventive green building solution.
One team’s creation was an “Intellicity,” in which neural networks run a building’s systems to ensure people, machines and the environment work in concert to adroitly use and conserve resources.
Reducing Waste
In the “Intellicity” paradigm, little is wasted. Solar panels and wind turbines create an ongoing source of clean, abundant, renewable energy. Rainwater collected from the roofs of buildings provides water for indoor plumbing, while hydroponic walls repurpose it to grow food. Intellicity’s student founders want to ensure that people are harnessing energy generated by city activity and putting it to use.
Floor tiles in larger structures convert footsteps into electrical energy, and waste is turned into fertilizer. Solar panels on windows maximize sunlight and capture the energy to help run a building’s lighting and temperature systems. Revolving doors connected to electric generators can be used to capture energy as people walk in and out. This creates another source to power the structure’s electricity, heating and cooling needs.
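The footstep-capture idea can be put in perspective with a back-of-envelope calculation. Both figures below are assumptions, not measurements (commercial kinetic tiles advertise a few joules per footstep), used only to illustrate the scale involved:

```python
# Rough estimate of daily energy from kinetic floor tiles in a busy lobby.
# Both inputs are assumed values for illustration, not vendor data.
JOULES_PER_STEP = 4.0        # assumed harvest per footstep
STEPS_PER_DAY = 50_000       # assumed daily foot traffic

energy_joules = JOULES_PER_STEP * STEPS_PER_DAY
energy_kwh = energy_joules / 3.6e6   # 1 kWh = 3.6 million joules

print(f"{energy_kwh:.3f} kWh per day")
```

Even under generous assumptions the daily yield is modest (well under 0.1 kWh), which suggests footstep and revolving-door capture work best as supplements to solar and wind rather than primary sources.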
The Applications of Artificial Intelligence
Using artificial intelligence (AI), energy is redistributed to increase the comfort and productivity of building occupants. The AI system that would run the integrated interior and exterior building networks “learns” from several inputs and the resulting outputs. For example, during high-usage times, power could go toward controlling lighting as well as heating and cooling rooms. Over time, the network records occupant preferences and automatically adjusts a room’s temperature and lighting depending on who enters and leaves.
Similarly, the team sought to give people an opportunity to interact with their building through a “neural network,” a computer system modeled on the human nervous system. The aim is to let the building communicate back through an app detailing the energy being collected, used and wasted in the structure.
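As a concrete illustration of the “learning” loop described above, here is a minimal sketch in Python. A simple running average stands in for the students’ proposed neural network, and every class name, unit and default value is invented for illustration:

```python
# Minimal sketch of occupant-preference learning for the Intellicity
# concept. A running average substitutes for a real neural network;
# all names and defaults here are hypothetical.

class RoomController:
    def __init__(self):
        # Per-occupant running averages of (count, temperature, light level)
        self.preferences = {}

    def record(self, occupant, temperature, light):
        """Learn from an observed manual adjustment by one occupant."""
        n, temp, lum = self.preferences.get(occupant, (0, 0.0, 0.0))
        n += 1
        temp += (temperature - temp) / n  # incremental mean update
        lum += (light - lum) / n
        self.preferences[occupant] = (n, temp, lum)

    def settings_for(self, occupants):
        """Average the learned preferences of everyone now in the room."""
        known = [self.preferences[o] for o in occupants if o in self.preferences]
        if not known:
            return (21.0, 0.5)  # assumed building-wide defaults
        temp = sum(p[1] for p in known) / len(known)
        lum = sum(p[2] for p in known) / len(known)
        return (temp, lum)
```

In use, each manual thermostat or dimmer adjustment would call `record`, and the building would call `settings_for` whenever the room’s occupancy changes, which is the adjust-on-entry-and-exit behavior the team describes.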
Retrofitting Existing Infrastructure
Given AI’s flexibility, the team theorizes that the system could be implemented in a variety of structures, including transportation hubs such as airports as well as offices and apartment buildings. According to the plan, each section of a building could provide sustainable energy with minimal impact on the surrounding environment. Rather than redesigning structures, the team suggests installing sensors in every room, paired with monitoring software that can devise a customized solution to precisely redistribute energy.
Integrating neural networks into buildings to create an energy-efficient, sustainable future is Intellicity’s ultimate goal.
Visit nyas.org/challenges for information about the UTC Future Buildings and Cities Challenge winners.
From global data-sharing efforts to local educational campaigns, new urban sustainability projects are shaping the cities of a greener future.
Published May 1, 2018
By Alan Dove, PhD
In 1900, about 13 percent of the world’s population lived in cities. Today, well over half of it does, and that proportion continues to grow. Cities now account for three-fourths of global gross domestic product, and about the same fraction of human-generated carbon emissions.
Because they concentrate huge amounts of human activity into small areas, cities are ideal test beds for new sustainability efforts. Inspired by the United Nations’ Sustainable Development Goals (SDGs), new collaborations have sprung up between political leaders, scientists, communities and non-governmental organizations. From global data-sharing efforts to local educational campaigns, these new urban sustainability projects are shaping the cities of the future.
The Political Climate
Nations formally sign international agreements such as the SDGs, but in the case of urban sustainability, it falls to the leaders of individual cities to implement relevant policies. Fortunately, compared to national or regional governments, “cities are much more in tune with the direct impact of their policies, and they are much more in tune with the quality of life of citizens … from day to day,” says Christiana Figueres, Vice Chair of the Brussels-based Global Covenant of Mayors for Climate and Energy.
Figueres’ group provides a global network through which city leaders can share their ideas and results in pursuing sustainability.
“We’re a very important platform for city officials to learn what has worked,” says Figueres, pointing to examples such as Seoul’s renewable energy campaign, Paris’ expanding bicycle infrastructure, and a multi-city effort in India that has exchanged over 700 million incandescent lightbulbs for high-efficiency ones.
The central focus of the Global Covenant of Mayors is helping cities design and implement ambitious climate action plans, but that remit intersects with many of the U.N.’s other SDGs.
“How we pursue building our cities for the future — such as using high-carbon or low-carbon infrastructure, the way we change our consumption and production patterns, the way we deliver economic growth — are all relevant to the sustainable development goals and will largely determine the quality of life on this planet,” says Figueres.
United by Common Problems, Divided by Different Regulations
While cities around the world face common problems, they’re also bound by the particular laws and circumstances of their nations. Figueres emphasizes that the Global Covenant of Mayors has neither the authority nor the desire to try to synchronize urban policies across national boundaries. Instead, the group serves as a clearinghouse for cities to share data, strategies and ideas and discuss their experiences and results.
Science is central to all of these efforts: measuring greenhouse gas emissions, studying and predicting the potential impacts of future climate change, and identifying the most effective measures cities can take to reduce their environmental impact and mitigate risks. Figueres points to a project in Myanmar, where scientists are developing models that can predict storm surges from cyclones, and others that identify areas at the highest risk of earthquakes and fires.
That information will help local leaders plan disaster responses to focus on the areas with the greatest needs, while also guiding future infrastructure development. Data from that project could inform similar efforts in coastal cities around the world, as rising seas and temperatures will likely make natural disasters more frequent.
Fundamentally a Problem of Physics and Atmospheric Chemistry
Climate change is fundamentally a problem of physics and atmospheric chemistry, but responding to it will require many other disciplines. Figueres emphasizes that in cities especially, researchers need to focus on social aspects of sustainability.
“We have a tendency to dehumanize cities, as though the purpose of cities were to have buildings and infrastructure, [but] the purpose of cities is actually to be the home for human beings,” says Figueres.
For policymakers to make the best use of science, scientists also need to explain it in human terms. “It does no good to come with science, accurate as it may be, if it’s not made relevant and understandable,” says Figueres.
Hungry For Change
While the Global Covenant of Mayors is helping scientists and city leaders work together globally, individual researchers are also taking local action in their own towns. New York’s Urban17 Initiative exemplifies this trend.
“I wanted the students who are part of our team to focus on urban sustainability in New York City, because it’s a great city to model hypotheses,” says Melanie Uhde, Urban17’s founder and managing director.
Urban17 currently consists of about a half-dozen volunteer analysts, mostly graduate students and young researchers from different disciplines and universities around the city. Despite its small size and lack of funding, the ambitious group is already tackling a project with global relevance, studying the overlapping problems of obesity and hunger.
“We know that, for example, the rates of obesity and hunger in the Bronx are the highest [in the city], so they’re basically bedfellows, which is a very common phenomenon in urban environments throughout the world,” says Uhde.
The Paradoxical Overlap of Hunger and Obesity
It may seem paradoxical for hunger and obesity to overlap, but interconnected problems can yield exactly that result.
“It’s definitely poverty, but it’s unfortunately much more complicated,” says Uhde, adding “even if you have money, do you have access to food, do you have the education, do you know what’s actually good for you, [and] do you have the time to put effort into a nutritious meal?”
In poor urban neighborhoods, the answers to those questions are often ‘no,’ causing synergistic deficits that can produce the entire spectrum of dietary problems. To address that, Uhde and her team are combining data on obesity and hunger with the locations of groceries, parks, fitness centers and schools.
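The kind of data overlay Urban17 describes can be sketched as a simple join between neighborhood health indicators and the locations of food and fitness resources. All of the data, field names and values below are invented for illustration; the team’s actual datasets and methods are not public in this article:

```python
# Hypothetical sketch of joining neighborhood health indicators with
# counts of nearby amenities. Every record here is invented.

neighborhoods = [
    {"name": "A", "obesity_rate": 0.32, "food_insecurity": 0.28},
    {"name": "B", "obesity_rate": 0.18, "food_insecurity": 0.10},
]

amenities = [
    {"neighborhood": "A", "kind": "grocery"},
    {"neighborhood": "B", "kind": "grocery"},
    {"neighborhood": "B", "kind": "park"},
    {"neighborhood": "B", "kind": "fitness"},
]

def overlay(neighborhoods, amenities):
    """Attach per-kind amenity counts to each neighborhood record."""
    merged = []
    for n in neighborhoods:
        counts = {}
        for a in amenities:
            if a["neighborhood"] == n["name"]:
                counts[a["kind"]] = counts.get(a["kind"], 0) + 1
        merged.append({**n, "amenities": counts})
    return merged

result = overlay(neighborhoods, amenities)
```

A merge like this makes the “bedfellows” pattern visible at a glance: the neighborhood with the worst health indicators is also the one with the fewest food and fitness resources.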
The Impact of Obesity and Hunger on Education
Public schools provide good anchors for the project, not only in mapping the extent of obesity and hunger in some of the most vulnerable populations, but also in implementing solutions.
“Education is a very important factor to achieve sustainability, and we’re seeing [how] other factors like obesity or hunger influence education,” says Uhde. Malnourished students aren’t likely to learn well, which in turn can perpetuate poverty and poor health. Improving school meal programs and health classes could help break that cycle.
Uhde hopes other scientists will start tackling sustainability problems in their own towns. “Sustainability … affects everyone in every aspect of life,” she says, adding that “we’re living in this era where we have to do something no matter what.”
Jennifer Costley, PhD, Director, Physical Sciences, Sustainability and Engineering, New York Academy of Sciences contributed to this story.