
I Imagine: How Technology Will Shape Scientific Research in the Next Century


by Charles Cooper

When the New York Academy of Sciences marked its centennial in 1917, just eight percent of homes had landline telephones, and it took a full five days to travel from New York to London.

Albert Einstein would introduce the idea of the stimulated emission of radiation that year with the publication of On the Quantum Theory of Radiation. However, it wasn't until mid-century that researchers were able to apply his insights to build the maser, and then the laser.

In a speech he gave in 1917, inventor and Academy Honorary Member Alexander Graham Bell offered several prescient predictions about things like industrialization and the prospects for commercial aviation 100 years later. Yet even the most clairvoyant observers at the time would not have foreseen the transformations wrought by science and technology in the world of 2017.

But what about 2117? What can we expect in the coming century given our understanding of the trajectory of scientific and technological advances? We put that question to Academy Members in a number of different disciplines, and here's what they said.

Ryan Rose


No Knowledge Ever Gets Left Behind Again

A century into the future, predictive analytics and machine learning systems will be in a position to anticipate what human beings need to know, according to Ryan Rose, who leads Customer Experience and Product Design for a new social learning platform at Cisco.

"Right now, we're just trying to leverage data to give us better ideas," said Rose. "But if we project 100 years ahead, computer systems won't just be making recommendations to people, they will make the decisions. Machine learning won't be just about finding a way to get that information to a human. It will make the leap in logic to actually say, 'This customer needs this system to be this way' and then make that happen."

With machines poring over disparate bits of information, systems will be able to connect the dots and achieve what Rose describes as "instant adaptation."

"That's going to be huge. You will see innovation occurring as quickly as the machine thinks it and asks, 'Why don't we try this?' You can still have all of the human touch points, but the speed at which this happens will be much faster simply because we will not be waiting on someone to say, 'I think that these things are related.'"

Rose also expects a future in which no knowledge gets left behind as information is captured and retained digitally.

"Now, when we want to review knowledge from yesteryear, it's archived in a movie or maybe some type of audio recording that we cannot interact with. But think about a society with access to the great experts or just the everyday experiences of people from any time. We'll have all this information about individuals, their knowledge and expertise, and it will be stored so that someone in the future can 'speak' with any individual. Your descendants will be able to get a better understanding, even if it is just a digital understanding, of what you felt or thought."

"The interaction could be something as simple as a 3D projector or augmented reality, but you'll be able to talk back and forth through natural language processing. I think there is a great future where the wealth of information about humanity is preserved and we are able to interact with those moments in perpetuity."

William Schmidt


Imagining a Pain-Free World

William Schmidt, a pharmacologist and the President of NorthStar Consulting, LLC, is optimistic that pain treatments in the next century will no longer carry high risks of addictive side effects.

"Within the next 100 years, we will have additional analgesics to prescribe along with opioids so that we can use lower dosages, replace opioids altogether, or (perhaps) have safer opioid analgesics that are less likely to show an addictive profile," he said.

That would be a welcome development. An epidemic of opioid abuse has led to one of the worst drug crises in American history. Indeed, the Centers for Disease Control and Prevention estimates that 91 Americans die every day from an opioid overdose.

Schmidt, one of the world's leading researchers into the discovery and development of novel analgesic and narcotic antagonist drugs, also expects developmental breakthroughs in the products that doctors can prescribe to deal with pain.

"I expect we will have analgesic products that are unlikely to cause respiratory depression, either acutely or chronically, were someone to take a higher dose. I also expect we will have not only medicines to treat inflammation and pain directly, but genetic mechanisms for controlling some types of pain or pain signaling pathways that we can exploit to reduce the impact of pain within the body," he said. "We are already finding that we are able to treat things like rheumatoid arthritis in ways that are far more effective than what I learned when I was taking pharmacology in medical training."

It would also mark a veritable revolution in pain treatment, a field whose limitations Schmidt learned about through personal experience as a five-year-old when he broke his arm. Back then, doctors were afraid to use opioids to relieve his excruciating pain.

"I now recognize the medication they used hadn't a chance of working because they were afraid to use more effective medications in children," Schmidt recalled. "But that was the best that doctors knew how to do back then." A century from now, Schmidt says, no one may ever have to suffer that sort of trauma.

The Countdown to a Big Bio-Ethics Debate

When evolutionary biologists like UC Berkeley's Jan Buellesbach look at the trajectory of recent advances in genetics and molecular biology, they see a future laden with untold scientific potential.

"The field is developing so quickly — especially in genomics," said Buellesbach. "It's unbelievable when you think how expensive and cumbersome it used to be to sequence a genome. Now, they almost come at a rate of a dime a dozen ... and we're just scratching the surface."

Department of Environmental Science, Policy & Management, University of California, Berkeley. From left to right: Jan Buellesbach, Maria Tonione, Kelsey Scheckel, John Lau, Elizabeth I. Cash, Rebecca Sandidge, Brian Whyte, Jenna Florio, Neil Tsutsui (Principal Investigator), and Joshua D. Gibson. Photo credit: Elizabeth I. Cash.

One example of that new technical prowess is CRISPR, a gene editing technology that scientists are now using to develop treatment therapies for a range of diseases, including cancer. Researchers have already successfully used gene editing to repair a disease-causing mutation in a human embryo.

But access to that kind of capability has also fueled debate about the ethics of using technology to alter human genes. In the world of 2117, Buellesbach expects genomics breakthroughs will give society the theoretical ability to selectively eradicate the genetic conditions that lead to diseases, or any traits that might be considered detrimental. It also means society will need to navigate an ethical minefield where so-called designer babies are no longer a theoretical possibility.

"With computational power getting exponentially faster and cheaper all the time, it's not such a sci-fi scenario anymore," he said. "I think we are likely heading towards a future where there will be research on how to perfect Homo sapiens in certain ways, especially if we start to manipulate our own genomes."

Before then, he noted, more cautious naturalists, who don't believe we should interfere with human nature, are likely to argue that just because science can do something doesn't mean it's wise to put theory into practice.

"What would be considered genetic perfection?" pondered Buellesbach. "I would find that very troubling. Who is to say what trait can be considered universally negative? Even 100 years from now, I don't think we'll have a unified view about that. There's no question that this would entail too much power. We know from history that this ... can be very dangerous, and decisions about that shouldn't be left in the hands of the few people in positions of authority."

Genomics Will Revolutionize Medicine

Doctors nowadays choose among myriad treatments to help patients suffering from heart disease and other ailments. By the time 2117 rolls around, however, trial-and-error will have been relegated to the history books. Genomics advances will pave the way for the right treatments for the right diseases for the right patients and at the right times, according to Kent Lloyd, a professor in the Department of Surgery at UC Davis.

In the future, Lloyd says, doctors will have drugs that don't just target the protein product, the end result of genes gone bad, but actually fix the faulty genes themselves, so the drug no longer has to go after the protein product at all.

"Also, if we have enough knowledge and can predict with great certainty that someone will develop a disease — why not try to prevent the disease from progressing or even starting?" he said. "That's where the future is — not only more precise treatments for diseases when they happen, but further down the road, more precise preventive measures for individuals you can predict are highly likely to contract the disease," he added.

These breakthroughs are predicated on research now underway to uncover deeper understanding of basic gene functions and how they impact human health.

"When we scan a person's genome, we might find a variant in gene X, another variant in gene Y, and another variant in gene Z. If we didn't know what those genes do, we wouldn't know which of those are more related to the cardiovascular disease that a patient might have," he said.

"We can test therapies in mice with that mutated gene to assess whether that therapy might be good or bad, what the effect might be and whether it might cause other things that we wouldn't want it to cause," Lloyd said. "This new knowledge will greatly catalyze and accelerate the implementation and practice of precision medicine. I think this will have a huge impact on health ... around the world."

Lloyd also sees potential in harnessing new genome editing technologies. In the future, he expects doctors to be able to correct gene variants that create mutant proteins. The patient's system would then produce the normal protein, potentially reducing symptoms or heading off disease altogether.

"We definitely need to improve on extant technologies and develop newer and more precise (or targeted) ones than today, no question about that," Lloyd said. "And we have the scientific power to be able to do it. If we put a little bit of effort in now ... the return on investment will be enormous."

Subhro Das


Engineering a Stress-Free Life

Ongoing advances in engineering and computer science are going to transform the global healthcare system, raising the prospect of breakthroughs in various areas of personal health, according to Subhro Das, a computer engineering researcher at the IBM T. J. Watson Research Center.

"Life expectancy will go beyond what we might imagine," said Das, part of an interdisciplinary team at IBM working on developing new computational approaches for improving health behaviors. "We might be able to find cures for diseases like cancer, and to find more effective ways of preventing things like type 2 diabetes."

That's the long-term view. In the shorter term, Das expects intelligent systems to analyze real-time data collected from body sensors and other mobile technologies, triggering commands to other connected devices to address signs of stress, such as elevated blood pressure or cortisol levels.

"For instance, I might be having a hard day at work. But my laptop, my phone, my house thermostat and my car — they are all going to be connected and sharing data among themselves," he said. "My car would get a signal from my laptop and put it in a mode so that when I'm driving home, soothing music would come on. Also, my house thermostat now knows that I was having a bad day at the office, so it will be able to adjust the temperature of my house to make me feel more comfortable."
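The scenario Das sketches amounts to a simple event rule: a stress reading crosses a threshold, and commands go out to other connected devices. Here is a minimal illustration of that idea in Python. Every name, threshold, and device command below is an invented placeholder for the sake of illustration, not any real IBM system or API.

```python
# Illustrative sketch of sensor-driven device coordination, as described
# in Das's scenario. All thresholds and device commands are hypothetical.

STRESS_THRESHOLD = 0.7  # illustrative cutoff on a 0-1 stress score


def stress_score(heart_rate_bpm, cortisol_level):
    """Combine two hypothetical sensor readings into a 0-1 stress score."""
    # Clamp each component to the 0-1 range before averaging.
    hr_component = min(max((heart_rate_bpm - 60) / 60, 0.0), 1.0)
    cortisol_component = min(max(cortisol_level / 25.0, 0.0), 1.0)
    return 0.5 * hr_component + 0.5 * cortisol_component


def coordinate_devices(score):
    """Return the commands that would be sent to connected devices."""
    commands = []
    if score >= STRESS_THRESHOLD:
        commands.append(("car", "play_soothing_music"))
        commands.append(("thermostat", "set_comfort_temperature"))
    return commands


# Example: a hard day at the office, with elevated heart rate and cortisol.
print(coordinate_devices(stress_score(heart_rate_bpm=110, cortisol_level=20)))
```

In a real deployment the rule engine would of course be far richer, but the shape is the same: shared data from one device triggers actions on the others without a human in the loop.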

More broadly, Das said continuing improvements in machine learning and data mining will allow more "smart buildings" to be equipped with sensors that can alert medical teams when somebody needs assistance. If residents have medical conditions like Alzheimer's disease, or suffer a fall, those sensors will communicate among themselves and be able to summon help quickly.

Winning the Battle to Beat Brain Pathologies

While the study of the brain presents dauntingly complex challenges, Dr. Marcie Zinn, a cognitive neuroscientist at DePaul University, believes medical practitioners will one day be able to reverse the process of brain degeneration.

One hundred years from now, Zinn expects new technologies will transform our understanding of the functioning of the central nervous system. Armed with new tools, future researchers will be equipped to gain new insights into brain pathologies and uncover more effective ways to diagnose, treat, prevent and even cure disorders.

"There has been a lot of excellent research telling us why brain degeneration occurs. Take, for example, ALS (a progressive neurodegenerative disease of the central nervous system). Currently, there is no cure for ALS. The degeneration takes place rather quickly without impediment. I think the first thing that anyone wants is to figure out how to slow down the process."

The brain poses obvious challenges for cognitive neuroscientists because it is continually changing on a millisecond timescale. But the study of neurologically impaired people has been aided by recent imaging advances, such as visualization tools, which allow researchers to more accurately understand neural networks.

Looking over the horizon, though, Zinn expects more breakthroughs thanks to the increasing intersection of biochemistry and technology that might lead to new treatments for many neurological impairments, including the regrowth of brain cells.

"Formerly, science thought that new brain cells did not grow or regrow throughout the lifespan," she said, "but we now know that brain cells do regenerate under the right conditions."

Slow But Steady: Closing In on a World Without Cancer

Roughly $300 billion has been spent since 1971, when President Nixon declared the nation's "war on cancer," but as new technologies give researchers a deeper understanding of genes and molecular pathways, it's possible to imagine a future world free of cancer. Just don't bet on bolt-from-the-blue breakthrough announcements.

To be sure, the history of medicine is replete with serendipitous, sometimes world-changing observations, such as the 1928 discovery of penicillin by bacteriologist Alexander Fleming. That discovery resulted in the development of antibiotics that have saved millions of lives. In contrast, the field of cancer treatment has been marked by steady improvements in technology and better patient care.

Indeed, Academy Member Ijaz S. Jamall, a toxicologist and Principal Scientist with the biomedical consultancy Risk-Based Decisions Inc., working in conjunction with Dr. Björn LDM Brücher, a surgical oncologist in Germany, noted that while our understanding of cancer biology "has increased by leaps and bounds during the last 50 years," it's wise not to get too carried away.

"We should try to avoid using terms such as landmark, hallmark, breakthroughs or war against cancer, etc.," he said. "Such terms imply a lot more than can be delivered."

Still, slow but steady advances offer encouragement about the future. Jamall pointed to the deployment of new immunotherapy and nanotechnology techniques that help doctors diagnose and treat cancers earlier than ever before. Also, researchers now benefit from increased computer and data processing power as well as more precise 3D imaging tools. In addition, Jamall said, some vaccines are proving effective in preventing cancers caused by pathogens like HPV (human papillomavirus), HCV (the Hepatitis C virus), and HBV (the Hepatitis B virus), a development that he predicted will influence future therapies worldwide.

Even more progress is possible in the future with the development of nanobots and nano-drug delivery tools that improve the diagnosis and treatment of cancers by targeting features specific to cancer cells or malignant tissue without damaging nearby healthy cells and tissues.

Jamall said that nanotechnology can further improve the earlier detection of cancers by homing in on particular features of early cancers such as inflammation that currently slip below the radar of existing imaging and blood tests (biomarkers) of cancer.

In the meantime, he said, science is on the right path with the development of more effective vaccines and immunotherapies that will become better over time. But just as critical to the future, said Jamall, is a re-thinking of diseases and their treatments with an eye toward developing new and relevant approaches.

"One goal is interdicting the multi-sequence steps leading up to carcinogenesis," he said. "This would be a giant leap forward in cancer prevention." In conjunction with early screening and more effective treatments, he said science would advance closer toward the goal of making the majority of cancers (approximately 80 percent) "diseases of inconvenience" such as diabetes or arthritis.

Charles Cooper is a Silicon Valley-based technology writer and former Executive Editor of CNET.