What Happens When Innovative Scientists Embrace Entrepreneurship?

Deciding to make the leap from research to start-up doesn’t mean you have to leave your passion for science behind.

Published October 1, 2019

By Chenelle Bonavito Martinez

Sean Mehra
Chief Strategy Officer and co-founder, HealthTap

The days of lifetime employment with one employer are long gone. Most people will hold at least half a dozen jobs over a working lifetime, and possibly pursue two or three career paths. Many will also try their hand at starting their own business. Unfortunately, small-business statistics show that more than half of new businesses are gone within four years.

But scientists may have a distinct advantage when they decide to become entrepreneurs. Forbes contributor and STEM consultant Anna Powers writes in a 2018 article titled “One Scientific Principle Will Make You A Better Entrepreneur” that “…the process of entrepreneurship mirrors the process of innovation in science. A lot can be learned [about innovation and entrepreneurship] from science, which has formulated certain guidelines about the process of innovation. Perhaps that is why almost 30 percent of Fortune 50 CEOs are scientists.”

The key to easing the transition from employee to “boss” is recognizing how the skills you already possess translate from one job to another. This applies not only to the direct transfer of specific technical knowledge or soft skills like communication and collaboration, but also to the ways skills specific to your current career map onto those you need to become a successful entrepreneur.

What it Takes

So what does it take for a scientist to become an entrepreneur? Opinions vary, but mostly it starts with a question and a desire to make an impact. However, deciding to make the leap from research to start-up doesn’t mean you are leaving your passion for science behind.

Sean Mehra, Chief Strategy Officer and co-founder of HealthTap, a digital health company that enables convenient access to doctors, says, “People think of the decision to be an entrepreneur as a choice to leave your skills and knowledge as a scientist behind, when that’s not really the case.” Scientists are innovators and they can easily identify as entrepreneurs. Mehra cites several examples of skills developed in the lab that can be applied to starting a business.

“Writing grants to acquire funds for research is not much different than fundraising, corporate development and sales,” he says. “Conducting experiments is product R&D and market fit. If you have recruited postdocs to work in your lab and guided their work, then you have hired talent and managed a team. Publishing and presenting your research at conferences is pretty much like marketing your vision. And networking and connecting with colleagues in your field is no different than prospecting for business connections and talking to your customers.”

Myriam Sbeiti and Daniela Blanco, co-founders of Sunthetics, met in school and, as graduation approached, saw an opportunity to launch a viable business. In 2018 they developed a more efficient and more sustainable electrochemical manufacturing path for a chemical intermediate of Nylon 6,6. The process uses electricity rather than heat to power the reaction, consuming 30 percent less raw materials and energy and reducing a variety of harmful emissions in the process.

Sunthetics co-founders from left to right: Professor Miguel Modestino, Myriam Sbeiti, Daniela Blanco

Similar to the Scientific Method

In the future, Sbeiti and Blanco plan to apply this electrochemical process to a variety of reactions, making the chemical industry greener one reaction at a time. Sbeiti reflects that much of the research and many of the interviews conducted to figure out whether their idea was viable were very similar to the scientific method and scientific experiments: they created a hypothesis and then needed to validate it. The major difference was that they did not need to confirm their hypothesis through years of research; instead, they needed to talk to potential customers to find the right fit.

As scientists and researchers themselves, both emphasized that coping with failure was the hardest skill to master. “The chemical industry wasn’t really interested in our original idea and the fashion industry didn’t really see value.” After a round of customer interviews, they realized they had been designing the product they thought the customer needed instead of the product the customer said they wanted. In addition, efficacy and cost were customer priorities, so Sbeiti and Blanco pivoted their idea to develop a product that fit the market. The Sunthetics team is shaping up to make the impact the founders envisioned after graduate school. In fact, Blanco continues to pursue the technology as part of her PhD research and “thinks of it like R&D.”

Entrepreneurship is definitely a “higher risk and higher reward” scenario, says Mehra. Most traditional researchers have a lower risk tolerance than the average innovator or entrepreneur, and it can be very uncomfortable for a trained researcher turned entrepreneur to accept failure and pivot away from an original idea. But Mehra says that “even if the original idea isn’t quite right, there is still a lot of good knowledge acquired through the process.”

Unlocking the “Why”

Unlocking the “why” and a desire to create impact at scale are drivers behind making the shift into entrepreneurship. While contemplating his own career path, Mehra reflects that “thinking about my passion for technology, I realized that technology has a way to scale and have a one-to-many impact on the world. I started to think about ways I could use my technology skills to help people on a global scale instead of, for example, treating patients one-at-a-time as a doctor.”

Sbeiti and Blanco also began their journey by observing their surroundings and asking why. These common traits make up what Clayton Christensen, the Kim B. Clark Professor of Business Administration at Harvard Business School, and his co-authors call “The Innovator’s DNA.” After six years of studying innovative entrepreneurs, executives and individuals, they concluded that this common skill set is present in every innovative entrepreneur. Christensen and his co-authors argue that if innovation can be developed through practice, then the first step on the journey to becoming more innovative is to sharpen those skills.

Studies of identical twins separated at birth indicate that only about one-third of our ability to think creatively comes from genetics. “That means that roughly two-thirds of our innovation skills come through learning — from first understanding the skill, then practicing it, and ultimately gaining confidence in our capacity to create,” says Christensen. The most important skill to practice is questioning. Asking “why” or “what if” can help strengthen the other skills and allow you to see a problem or opportunity from a different perspective.

Ted Cho
StartupHoyas MED

A Search for Something That’s Never Been Done

Ted Cho, President of StartupHoyas MED, an organization dedicated to healthcare startups and innovators at Georgetown University, sees that skill in many of the innovators and entrepreneurs who are part of the StartupHoyas community. Like Drs. Jean-Marc Voyadzis and Hakim Morsli who created Amie, a “virtual surgical assistant” to help patients prepare for surgery and recovery, entrepreneurs often create their companies by observing and questioning their surroundings, identifying a problem, and developing a solution.

Cho says that “one of the most common pitfalls for entrepreneurs is building solutions without problems. Often times the most successful startups are those that are rooted in problems that the founders experienced firsthand. However, that doesn’t mean that you necessarily have to be an insider. Some of the most innovative ideas with the greatest potential to create impact have come from outsiders with fresh perspectives who aren’t locked into the conventions that seem to restrict many of the traditional players in the healthcare space.” While all of the innovators and entrepreneurs in the StartupHoyas community are focused on improving healthcare, not all are medical students. In fact, many are students and faculty from other areas of life sciences.

Starting one’s own company is much like scientific research — it’s the search for something that’s never been done before, because there is no product that is exactly like yours. But it’s important for researchers considering a business launch to stay flexible. As Cho says, “pick something you love, but be careful not to fall in love with your own science.”


Creative Intelligence

Innovative entrepreneurs have something called “creative intelligence,” which enables discovery yet differs from other types of intelligence. This means innovators are more than “right-brained” individuals. They engage both sides of the brain and leverage what Christensen and his co-authors call the “five discovery skills” to create new ideas:

  • Associating: Connecting seemingly unrelated questions, ideas or problems from different areas.
  • Questioning: Challenging the status quo by asking “why,” “why not” and “what if.”
  • Observing: Scrutinizing common phenomena, particularly behavior.
  • Experimenting: Trying new ideas.
  • Networking: Meeting people with different viewpoints, ideas and perspectives to expand your knowledge.

Source: Excerpted from “The Innovator’s DNA.”


Darwin’s Dilemma: The Origin and Evolution of the Eye

Award-winning science writer Carl Zimmer explains the “creation” of the organ so complex that it baffled even Darwin.

Published October 1, 2019

By Carl Zimmer

“The eye to this day gives me a cold shudder,” Charles Darwin once wrote to a friend.

If his theory of evolution was everything he thought it was, a complex organ such as the human eye could not lie beyond its reach. And no one appreciated the beautiful construction of the eye more than Darwin—from the way the lens was perfectly positioned to focus light onto the retina to the way the iris adjusted the amount of light that could enter the eye. In The Origin of Species, Darwin wrote that the idea of natural selection producing the eye “seems, I freely confess, absurd in the highest possible degree.”

For Darwin, the key word in that sentence was seems. If you look at the different sorts of eyes out in the natural world and consider the ways in which they could have evolved, Darwin realized, the absurdity disappears. The objection that the human eye couldn’t possibly have evolved, he wrote, “can hardly be considered real.”

Dozens of Different Kinds of Eyes

Today evolutionary biologists are deciphering the origins of not just our own eyes but the dozens of different kinds of eyes that animals use. Fly eyes are built out of columns. Scallops have a delicate chain of eyes peeking out from their shells. Flatworms have simple light-sensitive spots. Octopuses and squids have camera eyes like we do, but with some major differences. The photoreceptors of octopuses and squids point out from the retina, towards the pupil. Our own eyes have the reverse arrangement. Our photoreceptors are pointed back at the wall of the retina, away from the pupil.

For decades, most scientists argued that these different eyes evolved independently. The earliest animals that lived over 600 million years ago were thought to be eyeless creatures. As their descendants branched out into different lineages, some of them evolved their own kinds of eyes. It now turns out, however, that this is not really true.

All eyes, in all their wonderful variety, share an underlying unity in the genes used to build them. By tracing the history of these shared genes, scientists are uncovering the steps by which complex eyes evolved through a series of intermediate forms.

Opsins in Common

When light enters your eye, it strikes a molecule known as an opsin. Opsins sit on the surface of photoreceptor cells, and when they catch photons, they trigger a series of chemical reactions that causes the photoreceptor to send an electrical message towards the brain.

Biologists have long known that all vertebrates carry the same basic kind of opsin in their eyes, known as a c-opsin. All c-opsins have the same basic molecular shape, whether they’re in the eye of a shark or the eye of a hummingbird. All c-opsins are stored in a stack of disks, each of which grows out of a hair-like extension of the photoreceptor cell called a cilium.

In all vertebrates, c-opsins relay their signal from the stack of disks through a pathway of proteins called the phosphodiesterase pathway. All of these homologies suggest that c-opsins were present in the common ancestor of all living vertebrates.

Vertebrates belong to a much larger group of species known as bilaterians—in other words, animals that develop a left-right symmetry. The main lineage of these other bilaterians, known as protostomes, includes millions of species, ranging from insects to earthworms and squid.

Protostome eyes don’t have the c-opsins found in vertebrate eyes. Instead, protostomes build another molecule, known as an r-opsin. Rather than keeping r-opsins in a stack of disks, they store them in folds in the membranes of their photoreceptors. R-opsins all send their signals through the same pathway of proteins, though not the same pathway that c-opsins use in vertebrates.

Humans Also Make R-Opsins

These similarities in the r-opsins suggest they evolved in the common ancestor of protostomes, only after their ancestors had branched off from the ancestors of vertebrates. Likewise, vertebrates only evolved c-opsins in their eyes after the split. In recent years, however, evolutionary biologists have discovered opsins where they weren’t supposed to be.

It turns out, for example, that humans also make r-opsins. We just don’t make them on the surfaces of photoreceptors where they can catch light. Instead, r-opsins help to process images captured by the retina before they’re transmitted to the brain.

In 2004, Detlev Arendt of the European Molecular Biology Laboratory and his colleagues also found c-opsins where they weren’t supposed to be. They were probing the nervous system of an animal known as a ragworm, which captures light with r-opsins. Arendt and his colleagues discovered a pair of organs atop the ragworm’s brain that grew photoreceptors packed with c-opsins.

Arendt sequenced the gene for the ragworm c-opsins and compared it with genes for other opsins. He found that it is more closely related to the genes for c-opsins in our own eyes than it is to the genes for r-opsins in the ragworm’s own eyes. These findings have led Arendt and other researchers to revise their hypothesis about the origin of opsins: the common ancestor of all bilaterians must already have had both kinds of opsins.

Clues from Cnidarians

But Todd Oakley, a biologist at the University of California at Santa Barbara, wondered if opsins might be even older. To find out, Oakley and his colleagues turned to the closest living relatives of bilaterians. Known as the cnidarians, this lineage includes jellyfish, sea anemones, and corals.

Adapted with permission from The Tangled Bank: An Introduction to Evolution, by Carl Zimmer (copyright 2010, Roberts & Company, Greenwood Village, CO).

Biologists have long known that some cnidarians can sense light. Some jellyfish even have eye-like organs that can form crude images. In other ways, though, cnidarians are radically different from bilaterians. They have no brain or even a central nerve cord, for example. Instead, they have only a loose net of nerves. These dramatic differences had led some researchers to hypothesize that bilaterians and cnidarians had evolved eyes independently. In other words, the common ancestor of cnidarians and bilaterians did not have eyes.

In recent years, scientists have sequenced the entire genomes of two species of cnidarians, the starlet sea anemone (Nematostella vectensis) and a freshwater hydra (Hydra magnipapillata). Scanning their genomes, Oakley and his colleagues discovered that both cnidarian species have genes for opsins—the first time opsin genes had ever been found in a nonbilaterian. The scientists carried out experiments on some of these genes and discovered that they are expressed in the sensory neurons of the cnidarians. Oakley’s research suggests that, as he had suspected, opsins evolved much earlier than bilaterians.

How Opsins Evolved

With discoveries from scientists such as Oakley and Arendt, we can start to get a sense of how opsins evolved. Opsins belong to a family of proteins called G-protein coupled receptors (GPCRs). They’re also known as serpentine proteins, for the way they snake in and out of cell membranes. Serpentine proteins relay many different kinds of signals in the cells of eukaryotes. Yeast cells, for example, use them to detect odorlike molecules called pheromones released by other yeast cells. Early in the evolution of animals, a serpentine protein mutated so that it picked up a new kind of signal: light.

At some point, the original opsin gene was duplicated. The two kinds of opsins may have carried out different tasks. One may have been sensitive to a certain wavelength of light, for example, while the other tracked the cycle of night and day. When cnidarians and bilaterians diverged, perhaps 620 million years ago, they each inherited both kinds of opsins. In each lineage, the opsins were further duplicated and evolved into new forms. And thus, from a single opsin early in the history of animals, a diversity of light-sensing molecules has evolved.

The Crystalline Connection

The earliest eyes were probably just simple eyespots that could only tell the difference between light and dark. Only later did some animals evolve spherical eyes that could focus light into images. Crucial to these image-forming eyes was the evolution of lenses that could focus light. Lenses are made of remarkable molecules called crystallins, which are among the most specialized proteins in the body. They are transparent, and yet can alter the path of incoming light so as to focus an image on the retina. Crystallins are also the most stable proteins in the body, keeping their structure for decades. (Cataracts are caused by crystallins clumping late in life.)

It turns out that crystallins also evolved from recruited genes. All vertebrates, for example, have crystallins in their lenses known as α-crystallins. They started out not as light-focusing molecules, however, but as a kind of first aid for cells. When cells get hot, their proteins lose their shape. They use so-called heat-shock proteins to cradle overheated proteins so that they can still carry out their jobs.

Scientists have found that α-crystallins not only serve to focus light in the eye, but also act as heat-shock proteins in other parts of the body. This evidence indicates that in an early vertebrate, a mutation caused α-crystallins to be produced on the surface of the eye. The protein turned out to have the right optical properties for bending light. Later mutations fine-tuned α-crystallins, making them better at their new job.

The Evolution of the Vertebrate Eye

Vertebrates also produce other crystallins in their eyes, and some crystallins are limited to only certain groups, such as birds or lizards. And invertebrates with eyes, such as insects and squid, make crystallins of their own. Scientists are gradually discovering the origins of all these crystallins. It turns out that many different kinds of proteins have been recruited, and they all proved to be good for bending light.

In 2007, Trevor Lamb and his colleagues at the Australian National University synthesized these studies and many others to produce a detailed hypothesis about the evolution of the vertebrate eye. The forerunners of vertebrates produced light-sensitive eyespots on their brains that were packed with photoreceptors carrying c-opsins. These light-sensitive regions ballooned out to either side of the head, and later evolved an inward folding to form a cup.

Early vertebrates could then do more than merely detect light: they could get clues about where the light was coming from. The ancestors of hagfish branched off at this stage of vertebrate eye evolution, and today their eyes offer some clues to what the eyes of our own early ancestors would have looked like.

The Evolution Doesn’t Stop

After hagfish diverged from the other vertebrates, Lamb and his colleagues argue, a thin patch of tissue evolved on the surface of the eye. Light could pass through the patch, and crystallins were recruited into it, leading to the evolution of a lens. At first the lens probably only focused light crudely. But even a crude image was better than none. A predator could follow the fuzzy outline of its prey, and its prey could flee at the fuzzy sight of its attackers. Mutations that improved the focusing power of the lens were favored by natural selection, leading to the evolution of a spherical eye that could produce a crisp image.

The evolution of the vertebrate eye did not stop there. Some evolved the ability to see in the ultraviolet. Some species of fish evolved double lenses, which allowed them to see above and below the water’s surface at the same time. Vertebrates adapted to seeing at night and in the harsh light of the desert. Salamanders crept into caves and ended up with tiny vestiges of eyes covered over by skin. But all those vertebrate eyes were variations on the same basic theme established half a billion years ago.


About the Author

Carl Zimmer is a lecturer at Yale University, where he teaches writing about science and the environment. He is also the first Visiting Scholar at the Science, Health, and Environment Reporting Program at New York University’s Arthur L. Carter Journalism Institute.

Zimmer’s work has been anthologized in both The Best American Science Writing series and The Best American Science and Nature Writing series. He has won numerous fellowships, honors, and awards, including the 2007 National Academies Science Communication Award for “his diverse and consistently interesting coverage of evolution and unexpected biology.”

His books include Soul Made Flesh, a history of the brain; Evolution: The Triumph of an Idea; At the Water’s Edge, a book about major transitions in the history of life; The Smithsonian Intimate Guide to Human Origins; and Parasite Rex, which the Los Angeles Times described as “a book capable of changing how we see the world.”

His newest book, The Tangled Bank: An Introduction to Evolution, will be published this fall to coincide with the 150th anniversary of the publication of The Origin of Species.

What Can Science Tell Us About Death?

Sam Parnia, a leading expert in resuscitation science research, explains how death is not an absolute, but a process, and what happens when patients experience death.

Sam Parnia MD, PhD

Published September 30, 2019

By Robert Birchard

Across time and cultures, people have been conditioned to view death as an endpoint to the experience of life. However, advances in resuscitation science and critical care medicine have challenged assumptions about the finality of death. Sam Parnia, Director of the Critical Care & Resuscitation Research Division of Pulmonary, Critical Care & Sleep Medicine at New York University Langone Medical Center, recently spoke to The New York Academy of Sciences about his resuscitation science research. Dr. Parnia’s work illuminates how death is not an absolute but a process, and what happens when patients experience death. He shares insights from his research in his own words:

What is death?

Death occurs when the heart stops beating. We call this death by cardiopulmonary criteria and it is how death is defined for more than 95 percent of people. A person stops breathing and their brain shuts down, causing all life processes to cease. More recently with the birth of modern intensive care medicine and the ability to artificially keep people’s hearts beating, doctors like myself can keep a patient’s heart beating longer.

In cases where people have suffered irreversible brain damage, the brain has died but the person’s heart is still beating, so legally they are declared dead based upon irreversible brain death, or death by brain-death criteria. This happens in a small fraction of the cases where people are declared dead.

For millennia, death was considered an irreversible event, and nothing could restore life. During the last decade, we’ve realized it’s only after a person has died that the cells inside their body, including the brain, begin their own death process. We used to think that you had five or 10 minutes before brain cells died from a lack of oxygen, but we now know that’s wrong.

You have hours, if not days, before the brain and other organs in the body are irreversibly damaged after death. It’s actually the restoration of oxygen and blood flow to the organs after a person’s heart stops and they are then resuscitated that, paradoxically, leads to accelerated cell death. So, this accelerated secondary injury process is what we need to combat in medicine now.

Why is the term “near-death” experience inaccurate?

The problem with this term is that it is inconsistent with what people actually experience. It is undefined and imprecise. If I said ‘an airplane was involved in a near-miss incident,’ what does that mean? Did one plane come within an inch of another, or were they a mile apart? The term is ill-defined, and it doesn’t take into consideration the fact that a lot of people have biologically died and returned.

What is a death experience?

I call it an “experience of death” because that’s what it is. People report a unique cognitive experience in relation to death. They may have a perception of seeing their body and the doctors and nurses trying to revive them, yet feel very peaceful while observing. Some report a realization that they may have actually died.

Later they develop a perception or a sensation of being pulled towards a type of destination. During the experience, they review their life from birth until death, and interestingly this review is based upon their humanity.

They don’t review their lives based on what people strive for, like a career, promotions, or an amazing vacation. Their perspective is focused on their humanity. They notice incidents where they lacked dignity, acted inappropriately towards others, or conversely, acted with humanity and kindness.

They re-experience and relive these moments, but also, what’s fascinating, which sort of blows me away because I can’t really explain it, is they also describe these experiences from the other person’s perspective.

If they caused pain, they experience the same pain that other person felt, even if they didn’t realize it at the time. They actually judge themselves. They suddenly realize why their actions were good or bad, and many claim to see the downstream consequences of their actions.

How do studies of cardiac arrest inform the debate on the nature of consciousness?

Traditionally, researchers had proposed that mind or consciousness – our self – is produced from organized brain activity. However, nobody has ever been able to show how brain cells, which produce proteins, can generate something so different, i.e., thoughts or consciousness. Interestingly, there has never been a plausible biological mechanism proposed to account for this.

Recently some researchers have started to raise the question that maybe your mind, your consciousness, your psyche, the thing that makes you, may not be produced by the brain. The brain might be acting more like an intermediary. It’s not a brand new idea. They have argued that we have no evidence to show how brain cells or connections of brain cells could produce your thoughts, mind or consciousness.

The fact that people seem to have full consciousness, with lucid well-structured thought processes and memory formation from a time when their brains are highly dysfunctional or even nonfunctional is perplexing and paradoxical.

I do agree that this raises the possibility that the entity we call the mind or consciousness may not be produced by the brain. It’s certainly possible that maybe there’s another layer of reality that we haven’t yet discovered that’s essentially beyond what we know of the brain, and which determines our reality.

So, I believe it is possible for consciousness to be an as-yet-undiscovered scientific entity that may not necessarily be produced by synaptic activity in the brain.

What are PROTACs and How Do They Treat Diseases?

“Optimized degrader molecules will have fast rates of degradation and relatively short exposure with therapeutic doses that result in complete elimination of the target protein, which can result in a more durable and deeper effect.”

Published July 23, 2019

By Robert Birchard

Eric Fischer, PhD

Around 80% of disease-causing proteins, including key drivers of many cancers and of serious neurological conditions like Alzheimer’s disease, cannot be targeted by currently available therapeutics. These so-called “undruggable” proteins lack the specific surface features that treatments such as small molecule inhibitors or antibodies need in order to bind to them and modulate their function.

However, an alternative therapeutic strategy known as targeted protein degradation has shown the potential to reach these “undruggable” proteins. Utilizing small molecules known as PROTACs, this strategy harnesses the cell’s waste disposal system to promote the destruction of disease-causing proteins. Dr. Eric Fischer, Assistant Professor of Biological Chemistry and Molecular Pharmacology at Harvard Medical School, recently sat down with us to help create this primer on PROTACs and their potential implications for treating disease.

What are PROTACs?

PROteolysis TArgeting Chimeras, or PROTACs for short, are two separate molecules bound together to form a single two-headed molecule. One end binds to a ubiquitin ligase, while the other end binds to the “undruggable” protein targeted by pharmacologists. In illustrations, PROTACs are often depicted as dumbbells, but it may be more helpful to think of them as flexible harnesses.

How do PROTACs work?

PROTACs are designed to take advantage of the cell’s waste disposal system, which removes unneeded or damaged proteins and recycles their building blocks to make new ones. This system, centered on the proteasome, plays critical roles in cell growth, management of cellular stress, and the immune system. One end of a PROTAC binds to the targeted protein, while the other end binds to a ubiquitin ligase, which then marks the targeted protein for destruction. The mark tells the cell’s proteasome that this specific protein can be destroyed. In this way the body’s regularly occurring mechanisms are co-opted to destroy disease-causing proteins.

What makes PROTACs so unique?

Most therapies are divided between small molecule inhibitors and therapeutic antibodies/biologics. However, “PROTACs are small molecules and as such not restricted to targeting surface proteins; however, in contrast to traditional small molecule inhibitors, PROTACs are fundamentally different,” explained Dr. Fischer. “While inhibitors need to achieve an almost perfect degree of target engagement over an extended period of time to exert their pharmacologic effect, PROTACs follow more of a hit and run strategy.”

“Optimized degrader molecules will have fast rates of degradation and relatively short exposure with therapeutic doses that result in complete elimination of the target protein, which can result in a more durable and deeper effect,” he explained. “More importantly, however, small molecule degraders completely eliminate the disease-causing protein and as such can target the non-catalytic activity of enzymes but also scaffolding proteins, and other non-enzymatic targets.”

When will PROTACs be more widely available?

While researchers have demonstrated the potential of PROTACs in the lab, the first clinical trials are just opening. Still, Dr. Fischer is very optimistic: “The technology has rapidly spread, and we can expect to see many more programs entering clinical development. Due to the pioneering work of a growing academic community spearheading this field, the concepts underlying protein degradation are largely public domain and widely available.”

What is the future of PROTACs research?

“I believe the field of targeted protein degradation is here to stay and will significantly expand our repertoire of therapeutic modalities,” said Dr. Fischer. “I also believe it is still in its infancy and many challenges lie ahead of us to broadly deploy this to the more challenging targets.” PROTACs could potentially prove the impossible is possible by allowing scientists to destroy disease-causing proteins that were previously considered beyond their reach.


Lockheed Martin Challenge Inspires Innovative Ideas

The winners of Lockheed Martin’s 2019 Challenge are developing innovative ways to advance national defense.

Published May 15, 2019

By Marie Gentile, Robert Birchard, and Mandy Carr

Big ideas come from the unlikeliest sources. Their only common attributes are the passion and ingenuity of their inventors. Recently, Lockheed Martin sponsored the “Disruptive Ideas for Aerospace and Security” Challenge to find the next big idea. Meet the winners who hope to transform the future with their innovative solutions.

Grand Prize Winner: IRIS

Bryan Knouse

The ability to make decisions can be compromised by cognitive overload, especially during stressful situations, so Bryan Knouse envisioned IRIS, a voice-controlled interface for Patriot Missile Systems that would help people make better decisions.

“IRIS leverages software automation and speech technology in high pressure scenarios to reduce human cognitive overload and enable the operator to better focus on mission-critical decisions,” explained Mr. Knouse. “I came at this project thinking about using AI and software interfaces to make sophisticated experiences simpler and safer.”

A mechanical engineer by training, but an AI software and programming tinkerer by habit, Mr. Knouse believes voice interfaces present the greatest opportunity to make complicated and sophisticated processes simpler. In the aerospace and security field, simplicity is valued because complexity can lead to poor decision-making that costs lives.

“Artificial intelligence excels at not getting overwhelmed with scales of information. Unlike humans, a computer won’t get paranoid, or disturbed, or stressed out after being fed a spreadsheet with millions of rows of data. A computer will process the information.”

“This challenge was an awesome opportunity. Not just because I was able to build a cool project, but also to connect with a company that I’d otherwise not really have an opportunity to interface with,” Mr. Knouse concluded. “These kinds of technology competitions are a great way for the private sector and established companies to interface with innovators.”

Second Place: Improving Urban Situational Awareness

Dan Cornett

Ninety-four percent of vehicular accidents in the United States are caused by driver error, but what if assistive technologies could help drivers focus? This is the premise advanced by Garrett Colby and Dan Cornett, two solutions-oriented engineering students from the University of North Carolina at Charlotte.

While no technology can remove modern day distractions, a modular sensor array could collect data about roadside conditions and unobtrusively alert the driver to potential hazards.

The pair plan to combine neural networks, RADAR, LiDAR, and a 360-degree camera to continuously collect information on roadside conditions. The weakness of one sensor could be compensated for by the strength of another, while the data each sensor provides could be cross-checked against the others to ensure accuracy.
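
As a rough illustration of that cross-checking idea (and not the students’ actual design), the Python sketch below fuses distance estimates from three hypothetical sensors and flags any sensor that disagrees sharply with the consensus. The sensor names, confidence weights and disagreement threshold are all assumptions made for the example.

```python
# Illustrative sketch only: cross-check and confidence-weighted fusion of
# distance estimates (in meters) from three hypothetical sensors.
from statistics import median

def fuse_distance(readings, weights, max_spread_m=2.0):
    """Fuse per-sensor distance estimates for one detected object.

    readings: dict like {"radar": 31.8, "lidar": 32.1, "camera": 35.0}
    weights:  dict of relative confidence per sensor.
    Returns (fused_estimate, flagged_sensors).
    """
    med = median(readings.values())
    # Flag any sensor that disagrees strongly with the consensus.
    flagged = [name for name, d in readings.items() if abs(d - med) > max_spread_m]
    trusted = {k: v for k, v in readings.items() if k not in flagged} or readings
    total_w = sum(weights[k] for k in trusted)
    fused = sum(weights[k] * v for k, v in trusted.items()) / total_w
    return fused, flagged

if __name__ == "__main__":
    readings = {"radar": 31.8, "lidar": 32.1, "camera": 35.0}  # camera degraded by glare
    weights = {"radar": 0.4, "lidar": 0.4, "camera": 0.2}
    est, flagged = fuse_distance(readings, weights)
    print(f"fused distance: {est:.1f} m, flagged sensors: {flagged}")
```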

Garrett Colby

“Challenges like this are a good illustration for potential engineers that anyone can make a difference,” said Mr. Colby. “This project was different in that the sky was the limit, being a conceptual project you got to really think outside the box,” added Mr. Cornett.

“Challenges like this give young engineers a place to demonstrate their creativity.”

Third Place: Augmented Superluminal Communication

The sense of isolation experienced during spaceflight could contribute to the degradation of mission performance. Gabriel Bolgenhagen Schöninger, a physics student at the Technical University of Berlin with a communications technology background, believes his proposal could help lonely astronauts stay focused. His solution combines wearable technologies, biometric sensors and augmented reality to simulate conversation.

Gabriel Bolgenhagen Schöninger

The idea came from Mr. Bolgenhagen Schöninger’s own experience with the rigors of living far from his native Brazil.

“My intention was to create an environment where you can simulate a conversation by collecting communications data and then emulating this data in a virtual environment,” he explained.

In advance of space travel, information could be condensed and inserted into intelligent communications platforms. The compressed communications data could then be “reanimated” to respond to the astronaut. While he developed this idea for long distance travel, Mr. Bolgenhagen Schöninger believes it could have implications in the field of education.

“This challenge creates a great opportunity for young people to get feedback on their ideas,” he finished. “It can help motivate young engineers, to display their ideas, while developing more confidence in that idea.”


Citizen Science in the Digital Age: Eagle Eyes

Science is a tool for combatting disinformation and making informed decisions.

Published May 1, 2019

By Robert Birchard

The term “citizen science” first entered the Oxford English Dictionary in 2014. It describes a long-standing tradition of collaboration between professional and amateur scientists. Perhaps no field is as closely associated with citizen science as astronomy, where amateur stargazers continue to sweep the skies for unidentified heavenly bodies. Today, with the advent of smartphone technology, even more fields of scientific inquiry are open to the curious amateur.

Eagle Eyes

Ana Prieto, GLOBE program volunteer

One of the oldest continuing citizen science projects is the National Audubon Society’s annual Christmas Bird Count (CBC). The CBC was founded in 1900 by Frank Chapman, an ornithologist at the American Museum of Natural History. Conceived as an alternative to traditional hunts, the first CBC included 27 participants at 25 count sites across North America. By its 118th count, it had grown to 76,987 participants counting 59,242,067 birds at 2,585 sites across the United States, Canada, Latin America, the Caribbean and the Pacific Islands.

Documentation and verification of CBC counts has been revolutionized by mobile technologies and digital photography.

“If somebody said they saw a scarlet tanager or an eastern kingbird, which are common in the summer, but which conventional ornithological wisdom says are always in South America during the CBC, those sightings used to be rejected,” explained Geoffrey LeBaron the longtime Audubon Society Christmas Bird Count Director.

Observing the Past, Predicting the Future

“Everything today is 100 percent electronic and no longer published in print. All results are posted online as soon as a compiler checks off that their count is completed. The data then becomes viewable to the public. Once a region is completed, we have a team of expert reviewers that go over every single count. If they feel there’s something that needs documentation, they’ll be in touch with the compiler, who will get in touch with the observer.”

Scientists use the collected CBC data to observe long-term trends. Additionally, they predict future effects of climate change on species at risk.

“When people are analyzing CBC data, they’re not usually looking at year to year variations, because there is too much variability caused by weather and other factors,” explained Mr. LeBaron. “We looked at the center of abundance of the most common and widespread species and how they varied from the 1960s to the present. We found that a lot of species have moved the center of abundance of their range as much as 200 miles northward and inward away from the coasts.”
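
As a simplified illustration of what a “center of abundance” calculation involves (not Audubon’s actual methodology, which corrects for observer effort and much else), the Python sketch below computes a count-weighted centroid for one species in two periods. The sample records are invented.

```python
# Illustrative sketch: a count-weighted "center of abundance" for one species,
# computed from (latitude, longitude, birds counted) records.

def center_of_abundance(records):
    """Return the count-weighted mean (latitude, longitude) of sightings."""
    total = sum(count for _, _, count in records)
    mean_lat = sum(lat * count for lat, _, count in records) / total
    mean_lon = sum(lon * count for _, lon, count in records) / total
    return mean_lat, mean_lon

if __name__ == "__main__":
    counts_1960s = [(35.2, -80.8, 120), (33.7, -84.4, 200), (36.1, -86.8, 40)]
    counts_2010s = [(38.9, -77.0, 150), (40.7, -74.0, 90), (37.5, -77.4, 160)]
    for label, recs in [("1960s", counts_1960s), ("2010s", counts_2010s)]:
        lat, lon = center_of_abundance(recs)
        print(f"{label}: center of abundance at ({lat:.2f}, {lon:.2f})")
```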

Keeping Citizens in Science

Citizen science requires the enthusiastic participation of the public, but how can researchers keep the public engaged? This question was recently considered in a paper from Maurizio Porfiri, PhD, of the Dynamical Systems Laboratory at New York University, titled “Bring them aboard: Rewarding participation in technology-mediated citizen science projects.”

The team hypothesized that monetary rewards and online or social media acknowledgments would increase engagement of participants.

“People contribute to citizen science projects for a variety of different reasons,” said Jeffrey Laut, PhD, a postdoctoral researcher in Dr. Porfiri’s lab. “If you just want to contribute to help out a project, and then you’re suddenly being paid for it, that might undermine the initial motivation.”

“For example, one of the things we point out in the paper is that people donate blood for the sake of helping out another human,” explained Dr. Laut. “Another study found that if you start paying people to donate blood, it might decrease the motivation to donate blood.”

Proper Rewards for Participation

If a citizen science project is suffering from low levels of participation, researchers need to carefully choose the level of reward.

“I think with citizen science projects the intrinsic motivation is to contribute to a science project and wanting to further scientific knowledge,” said Dr. Laut. “If you’re designing a citizen science project, it would be helpful to consider incentives to enhance participation and also be careful on the choice of level of reward for participants.”

The technology used and the scope of information collected may have changed, but the citizen scientist’s role remains as important as ever.

“It is important that citizens understand the world in which they live and are capable of making informed decisions,” said Ms. Prieto. “It’s also important that all people understand science, especially to combat disinformation. From this point of view citizen science is vital and a needed contributor to the greater field of science.”



How to Improve Your Presentation Skills

Jayne Latz

A professional communication coach provides guidance on how you can improve your communication skills.

Published May 1, 2019

By Jayne Latz

You have a major presentation and you work on the perfect PowerPoint and practice reading your notes. But on the big day it feels like your presentation falls flat. Sound familiar?

If public speaking gives you anxiety, you’re not alone. Comedian Jerry Seinfeld once said that “According to most studies, people’s number one fear is public speaking. Number two is death … This means to the average person, if you go to a funeral, you’re better off in the casket than doing the eulogy.”

Unfortunately, such anxiety can interfere with your delivery. It doesn’t matter how strong your presentation is: if you’re unable to speak in a clear, confident manner, your message will suffer. In fact, research has shown that how you say something actually matters twice as much as what you say!

Learning to master the art of public speaking is crucial to professional success. Whether you’re giving a sales presentation, pitching an idea to a committee, or presenting a concept to a potential funder, the ability to speak in an engaging and convincing manner is important. You may only get one chance to make your case, so a polished and dynamic presentation could be a game-changer.

You should develop a style that works best for you, but here are 10 tips that may help you improve your overall presentation skills:

1. Open strong.

Enter with a confident stride and take a moment to make eye contact with the audience. Smile, and take a deep breath. This will help center you and allow you to begin your presentation in a strong, confident way.

2. Own your space.

Be mindful of body language. Don’t fold your arms, stand with your hands on your hips or put your hands in your pockets. Incorporate natural gestures into your speech — but be careful of “talking” with your hands. You will appear more relaxed, confident and in control.

3. Connect with your audience.

Looking into a sea of faces can be intimidating. Focus on connecting with one audience member at a time by making eye contact with individuals rather than just scanning the crowd. If you have a friend or colleague in the audience, focus on that person to start.

4. Tone matters.

When giving a presentation, your vocal quality can make all the difference. Does your tone sound positive or negative? Confident or tentative? The energy in your voice tells your listener a lot about whether what you have to say has value.

5. Be engaging.

Keep your audience involved and invested in your presentation to drive your message home. Ask questions that require a show of hands, have people stand up, or include moments where audience members need to interact.

6. Use strategic pausing to deliver with impact.

Pauses not only make your speech slower and more understandable, they can also be a powerful tool for drawing your audience’s attention to the parts of your message you want to highlight the most.

7. Don’t let your voice “fall into the abyss.”

Be careful not to drop sounds, particularly at the ends of words, or to trail off at the end of a sentence. I refer to this as “falling into the abyss,” and it can cause your audience to miss your most important point.

8. Let your audience know why your message matters.

Understand your audience and how what you have to say will benefit them. Then, spell it out. Let everyone know what they have to gain up front, and you’ll have a more attentive audience throughout your presentation.

9. Tell stories.

Including a story or specific case study in your presentation that relates to your audience’s interests will help them feel more connected to you and your message.

10. Close strong!

Finish with a quote or a point that illustrates the one takeaway you want the audience to remember. Memorize your closing in advance so that you can concentrate on your delivery and nerves won’t get in the way of a strong ending.

Polishing your public speaking skills will help you to gain confidence and increase your professional credibility. Take the time to focus on your speaking style, and make sure your presentation is doing your message justice. Remember: It’s not just what you say, it’s how you say it!

Jayne Latz is an executive communication coach and President and Founder of Corporate Speech Solutions, LLC.


AI and Big Data to Improve Healthcare

The next decade will be a pivotal one for the integration of AI and Big Data into healthcare, bringing both tremendous advantages as well as challenges.

Suchi Saria, PhD

Published May 1, 2019

By Sonya Dougal, PhD

One of the most common causes of death among hospital patients in the United States is also one of the most preventable — sepsis.

Sepsis symptoms can resemble other common conditions, making it notoriously challenging to identify, yet early diagnosis and intervention are critical to halting the disease’s rapid progress. In children, for each hour that sepsis treatment is delayed, the risk of death increases by as much as 50 percent.

Innovations such as the one pioneered by Suchi Saria, director of the Machine Learning and Healthcare Lab and the John C. Malone Assistant Professor at Johns Hopkins University, are helping to reverse this trend. In 2013, Saria and a team of collaborators began testing a machine learning algorithm designed to improve early diagnosis and treatment of sepsis.

Using troves of current and historical patient data, Saria’s artificial intelligence (AI) system performs real-time analysis of dozens of inpatient measurements from electronic health records (EHRs) to monitor physiologic changes that can signal the onset of sepsis, then alert physicians in time to intervene.
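
To make the idea of real-time scoring concrete, here is a minimal Python sketch that counts simple warning criteria over a stream of vital signs and raises an alert when enough of them co-occur. It is not Saria’s system, which applies machine learning to dozens of EHR measurements; the field names and SIRS-style thresholds below are assumptions made for illustration.

```python
# Illustrative sketch of real-time screening over streaming vital signs.
# The rule set is a toy stand-in for a learned model.

def sepsis_risk_flags(vitals):
    """Return the list of warning criteria met by one set of vital signs."""
    flags = []
    if vitals["temp_c"] > 38.0 or vitals["temp_c"] < 36.0:
        flags.append("abnormal temperature")
    if vitals["heart_rate"] > 90:
        flags.append("tachycardia")
    if vitals["resp_rate"] > 20:
        flags.append("tachypnea")
    if vitals["wbc_k_per_uL"] > 12.0 or vitals["wbc_k_per_uL"] < 4.0:
        flags.append("abnormal white cell count")
    return flags

def monitor(stream, alert_threshold=2):
    """Alert when a reading meets at least `alert_threshold` criteria."""
    for timestamp, vitals in stream:
        flags = sepsis_risk_flags(vitals)
        if len(flags) >= alert_threshold:
            print(f"{timestamp}: ALERT ({', '.join(flags)})")

if __name__ == "__main__":
    stream = [
        ("08:00", {"temp_c": 37.1, "heart_rate": 82, "resp_rate": 16, "wbc_k_per_uL": 8.0}),
        ("09:00", {"temp_c": 38.4, "heart_rate": 104, "resp_rate": 24, "wbc_k_per_uL": 13.5}),
    ]
    monitor(stream)
```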

“Some of the greatest therapeutic benefits we’re going to see in the future will be from computational tools that show us how to optimize and individualize medical care,” Saria said. She explained that the emergence of EHRs, along with the development of increasingly sophisticated AI algorithms that derive insights from patient data, will fuel a seismic shift in medicine — one that merges “what we are learning from the data, with what we already know from our best physicians and best practices.”

Nick Tatonetti, PhD

Electronic Health Records: A Gold Mine for Computer Scientists

EHRs have become a data gold mine for computer scientists and other researchers who are tapping them in ways designed to improve physician-patient encounters, inform and simplify treatment decisions, and reduce diagnostic errors. Like many other technological advances, though, EHR systems are regarded with considerably less enthusiasm by some physicians.

A 2016 American Medical Association study revealed that physicians spend nearly twice as much time on EHR tasks as they do in direct clinical encounters. Physician and author Atul Gawande recently lamented in The New Yorker that “a system that promised to increase my mastery over my work has, instead, increased my work’s mastery over me.”

Yet data scientist Nicholas Tatonetti, the Herbert Irving Assistant Professor of Biomedical Informatics at Columbia University, envisions a day when such AI algorithms will enable physicians to deepen their interaction with patients by freeing them from the demands of entering data into the EHR. Tatonetti has designed a system using natural language processing algorithms that takes accurate notes while physicians talk with patients about their symptoms. Like Saria’s AI system, Tatonetti’s takes advantage of the vast amount of data captured in EHRs to alert physicians in real time to potentially dangerous drug interactions or side effects.

Unknown Interactions

Anyone who has filled a prescription is familiar with the patient information leaflet that accompanies each medication, detailing potential side effects and known drug interactions. But what about the unknown interactions between medications?

Ajay Royyuru, PhD

Tatonetti has also developed an algorithm to analyze existing data in electronic health records, along with information in the FDA’s “adverse outcomes” database, to tease out previously unknown interactions between drugs. In 2016, he published a study showing that ceftriaxone, a common antibiotic, can interact with lansoprazole, an over-the-counter heartburn medication, increasing a patient’s risk of a potentially dangerous form of cardiac arrhythmia.
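
As a toy illustration of how adverse-event reports can hint at an interaction (not Tatonetti’s published method, which is far more rigorous), the Python sketch below compares how often an outcome appears in reports listing both drugs versus reports listing only one of them. The report data is invented for the example.

```python
# Illustrative toy example: compare outcome rates in adverse-event reports
# that list a drug pair versus reports that list only one of the two drugs.

PAIR = {"ceftriaxone", "lansoprazole"}

def outcome_rate(reports, keep, outcome):
    """Fraction of reports selected by `keep` that mention `outcome`."""
    selected = [r for r in reports if keep(r["drugs"])]
    if not selected:
        return 0.0
    return sum(outcome in r["outcomes"] for r in selected) / len(selected)

if __name__ == "__main__":
    reports = [  # invented reports
        {"drugs": {"ceftriaxone"}, "outcomes": {"rash"}},
        {"drugs": {"ceftriaxone"}, "outcomes": set()},
        {"drugs": {"lansoprazole"}, "outcomes": {"nausea"}},
        {"drugs": {"ceftriaxone", "lansoprazole"}, "outcomes": {"QT prolongation"}},
        {"drugs": {"ceftriaxone", "lansoprazole"}, "outcomes": {"QT prolongation", "nausea"}},
        {"drugs": {"ceftriaxone", "lansoprazole"}, "outcomes": set()},
    ]
    both = outcome_rate(reports, lambda d: PAIR <= d, "QT prolongation")
    single = outcome_rate(reports, lambda d: len(PAIR & d) == 1, "QT prolongation")
    print(f"rate with both drugs: {both:.2f}, rate with one drug alone: {single:.2f}")
```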

As data-driven AI techniques become more accessible to clinicians, the treatment of conditions both straightforward, like hypertension, and highly complex, such as cancer, will be transformed.

A Paradigm Shift in Physician-Patient Interactions

Ajay Royyuru, vice president of healthcare and life sciences research at IBM and an IBM Fellow, explained that, “when a practitioner makes a patient-specific decision, the longitudinal trail of information from thousands of other patients from that same clinic is often not empowering that physician to make that decision. The data is there, but it’s not yet being used to provide those insights.”

In the coming years, physicians and researchers will be able to aggregate and better utilize EHR data to guide treatment decisions and help set patients’ expectations.

The ability to draw on information from tens or even hundreds of thousands of patients, in addition to a physician’s own experience and expertise, could represent a paradigm shift in physician-patient interactions, according to Bethany Percha, assistant professor at the Icahn School of Medicine at Mount Sinai, and CTO of the Precision Health Enterprise, a team that turns AI research into tangible products for the health system.

“Big Data offers us the promise of using data to have a real dialogue with patients — if you’re newly diagnosed with cancer, it means giving people a realistic, data-driven assessment of what their future is likely to be,” she said.

Biases and Pitfalls

Despite the surge of interest and investment in AI over the past two decades, significant barriers to its widespread application and deployment in healthcare remain.

AI systems that tap current and historical patient health data risk reinforcing well-noted biases and embedded disparities. Medical research and clinical trials have long suffered from a lack of both ethnic and gender diversity, and EHR data may reflect patient outcomes and treatment decisions influenced by race, sex or socioeconomic status. AI systems that “learn” from datasets that include these biases will inherently share and perpetuate them.

Percha noted that greater transparency within the algorithms themselves — such as systems that learn which features an algorithm uses to make a prediction — could alert users to obvious examples of bias. Removing bias from AI algorithms is a work in progress, but the research community’s awareness of the issue and efforts to address it mirror a greater push to eliminate bias and decrease inequities in medicine overall. Optimistically, Percha noted that Big Data and AI may ultimately help create a more level playing field in healthcare delivery.
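
As a small illustration of the kind of transparency Percha describes, the Python sketch below fits a simple model to an invented dataset and prints the weight each feature carries, so that a suspiciously large weight on a demographic proxy would stand out. The feature names, data, and the choice of scikit-learn’s LogisticRegression are assumptions made for the example.

```python
# Illustrative sketch only: inspect which features drive a toy model's
# predictions. Real bias audits of clinical AI are far more involved.
from sklearn.linear_model import LogisticRegression

# Columns: [age (decades), abnormal lab flag, patient group (0/1 proxy attribute)]
X = [
    [5, 1, 0], [6, 1, 0], [7, 0, 0], [4, 0, 0],
    [5, 1, 1], [6, 1, 1], [7, 0, 1], [4, 0, 1],
]
# Outcome labels that happen to track the group proxy as well as the lab flag,
# mimicking a biased historical dataset.
y = [1, 1, 0, 0, 1, 1, 1, 0]

model = LogisticRegression().fit(X, y)
for name, weight in zip(["age", "abnormal_lab", "group_proxy"], model.coef_[0]):
    print(f"{name:>12}: {weight:+.2f}")
# A large weight on `group_proxy` is a red flag that the model leans on a
# demographic attribute rather than clinical signal.
```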

“Clinical decisions made on the basis of data have the potential to be much more standardized across different health facilities, so people who are in a rural area, for example, might have access to the same decision-making benefits as someone in a city,” she said.

Patient Data Privacy

Ensuring patient data privacy is another hot-button issue. Training artificial intelligence systems requires access to massive troves of patient data. Despite the fact that this information is anonymized, some patient advocates and bioethicists object to this access without explicit permission from the patients themselves.

Another privacy issue looms equally large: how to safely collect and protect the streams of potentially useful health data generated by wearable devices and in-home technologies without making patients and consumers feel, in Royyuru’s words, “like they are living their lives in front of a camera.” Studies have shown that data from smartphone apps can provide valuable information about the progression of certain diseases, such as Parkinson’s.

Wearables and in-home IoT devices can also extend the realm of clinical observation well beyond the doctor’s office, revealing, for example, important details about a Parkinson’s patient’s ability to complete the tasks of daily living. Yet Royyuru emphasizes that unless patients trust that their data will be kept private and ethically utilized, these technologies will fizzle long before they’re widely adopted.

Building Trust

The next decade will be a pivotal one for the integration of AI and Big Data into healthcare, bringing both tremendous advantages as well as challenges. Some applications of AI, such as image recognition, are already especially well-suited to healthcare — AI algorithms often match or even outperform radiologists in interpreting medical images — while others are far from ready for widespread use.

Saria, who has deployed her system successfully at multiple hospitals, says, “physicians often greet news of AI breakthroughs with skepticism because they’re being over-promised results without clear data demonstrating this promise. True integration and adoption of AI requires not just careful attention to physician workflows, but transparency into exactly how and why an algorithm has arrived at a particular recommendation.”

Rather than replacing or challenging a physician’s place in the healthcare ecosystem, Saria believes that AI has the ability to lighten the load, and as algorithms improve, generate diagnostic and treatment recommendations that physicians and patients can both deem trustworthy.

“We are still figuring out how to make real-time information available so that it’s possible for physicians or expert decision-makers to understand, interpret and determine the right thing to do — and to do that in an error-free way, over and over again,” Saria said. “It’s a high-stakes scenario, and you want to get to a good outcome.”

Mark Shervey, Max Tomlinson, Matteo Danieletto, Sarah Cherng, Cindy Gao, Riccardo Miotto, and Bethany Percha, PhD, Mount Sinai Health System, Icahn School of Medicine at Mount Sinai.

The Cutting Edge: There’s An App for That

A graphic illustration of a smart watch and its various medical/health applications.

Researchers are making greater use of the increasing computational power found in smartphones. This means apps may soon be able to help improve human health outcomes.

Published May 1, 2019

By Charles Cooper

The Apple Watch Series 4 helps users stay connected, be more active and manage their health in powerful new ways.
Photo credit: Apple Inc.

Apple CEO Tim Cook has major ambitions to “democratize” the health sector. In a recent interview with CNBC, Cook said that “health will be the company’s greatest contribution to mankind.” He’s also enlisted an important ally to help Apple make that happen.

Atrial fibrillation, which affects 33 million people worldwide, can lead to blood clots, stroke and heart failure. But later this year, Johnson & Johnson (J&J), which developed a heart health application, will carry out a study of volunteer patients 65 and older wearing an Apple Watch to understand whether smartphone technology can enhance the accuracy and speed of clinicians’ efforts toward earlier detection, diagnosis and treatment of the condition.

“Five years from now — and certainly within a decade — wearable devices will be an integral part of healthcare diagnosis and delivery,” said Paul Burton, MD, PhD, FACC, Vice President, Medical Affairs, Janssen Scientific Affairs, LLC, noting that the app will work in conjunction with the Apple Watch Series 4’s irregular rhythm notifications and ECG feature.

Real-Time Data

The diodes on the back of an Apple Watch Series 4 essentially look for a pulse to check blood flow, and the device applies an algorithm to determine whether the pulse pattern is irregular. It can also take a high-fidelity ECG reading, which is then sent to a physician. That kind of real-time data is crucial when you consider that around 20 percent of individuals who experience a stroke are not aware of their underlying AFib condition.
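
Apple’s and J&J’s algorithms are proprietary; the sketch below is only a rough, hypothetical illustration of one common approach to flagging an irregular rhythm: measuring how much successive inter-beat intervals vary and raising a flag when that variability crosses a threshold. The threshold and the simulated data are illustrative, not clinically validated.

```python
# Hypothetical sketch of irregular-rhythm flagging from inter-beat (RR) intervals.
# Not the algorithm used by any commercial device.
import numpy as np


def irregularity_score(rr_intervals_ms: np.ndarray) -> float:
    """Spread of successive RR-interval differences, normalized by the mean interval."""
    diffs = np.abs(np.diff(rr_intervals_ms))
    return float(diffs.std() / rr_intervals_ms.mean())


def flag_irregular(rr_intervals_ms, threshold=0.08) -> bool:
    # The threshold is illustrative; a real device would tune and validate it clinically.
    return irregularity_score(np.asarray(rr_intervals_ms, dtype=float)) > threshold


# Regular rhythm: roughly 800 ms beats with small jitter.
regular = 800 + np.random.default_rng(1).normal(0, 10, 60)
# Irregular rhythm: beat-to-beat intervals vary widely (AFib-like pattern).
irregular = np.random.default_rng(2).uniform(550, 1100, 60)

print(flag_irregular(regular))    # expected: False
print(flag_irregular(irregular))  # expected: True
```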

The widening availability of digital tools, paired with advances in technologies like artificial intelligence and machine learning, is raising hopes that history will repeat itself. In the last decade, business applications helped organizations become more efficient and engage better with their customers. Now researchers are making greater use of the increasing computational power found in smartphones, and it’s no longer a stretch to imagine a future in which there’s an app for nearly every step in the research process.

Burton expressed excitement at the potential of apps to change behavior and improve health outcomes in ways that were unimaginable less than a decade ago. “I think this is an amazingly exciting point bound only by our imagination. I think the possibilities are endless,” Burton said. “AFib is treatable but you need definitive, compelling data to really make a difference in healthcare.” At the same time, Burton cautions that “apps don’t work if people download them but can’t be bothered to use them.” The point is that all the technology in the world won’t help if the people who need it most don’t incorporate the tools into their lives.

Promises and Reality Checks

That challenge was faced head-on by University of Southern California research scientists Susan Evans and Peter Clarke, as they tested out a mobile app they developed to help low-income people who use food pantries to obtain fresh vegetables, which, while often plentiful in supply, may be limited in variety.

Though health-related mobile apps are now common, the promise and the performance often don’t match up. Clarke noted that fewer than one percent of the estimated 330,000 apps available on the Apple and Android download stores have been subjected to rigorous testing for effectiveness.

“Getting people to incorporate devices and apps into their lives is a whole separate science,” he said.

In developing their app, Evans and Clarke made sure the design incorporated user input early in the process, just as if they were creating a consumer app. For example, even though food banks collect fresh food and vegetables, many low-income people aren’t incorporating those offerings into their diet because they may not know how to cook or preserve the food that’s available.

Evans and Clarke, who started the project with certain assumptions about what was needed, were forced to refine their ideas about how to change dietary habits, and that shift came only after extensive field research and conversations with the people they hoped would ultimately use the app.

“We had to customize the app in order to meet clients’ needs and not impose this on them from the top,” said Evans. “It took years of tinkering. In terms of functionality and navigation, we designed it over and over again to try and get it right.”

Technology Is Only as Good as the User

An app recipe for broccoli burritos. This user wanted Latino-flavored and kid-friendly recipes.

As scientists and researchers struggle with the alchemy of user engagement, they have the advantage of being able to lean on the experience of software developers working in the consumer and business markets. Unfortunately, there’s no one-size-fits-all answer for how to get a target audience not just to download the applications, but also to use them consistently.

University of Michigan computer scientist Kentaro Toyama struggled to understand the nuances surrounding successful user engagement when he worked as assistant managing director of Microsoft Research in India. Toyama’s team built several different digital apps in areas like healthcare and social services that performed well in the labs. But few survived the test of time after they were released to the public.

“When we did these research projects in relatively constrained contexts, we could show how technology has a positive impact,” he said. “However, when we scaled those projects, we found that it did not have the same impact. Technology can be extremely good at delivering what people want,” he said. “It’s not so good when it comes to encouraging [people] to become better versions of themselves.”

Marissa Burgermaster would probably agree. As an elementary and middle school teacher, she became interested in how food and nutrition influenced the lives of the students she taught. Ultimately she decided to pursue a doctorate in behavioral nutrition.

Nutrition Education Interventions

During the course of her research, she also discovered a seeming contradiction: As a whole, nutrition education interventions didn’t produce tremendous results, but anecdotally they did appear to work for at least some students.

“What kept coming across from the data was … that different groups of kids … responded quite differently to the intervention,” she said. “That explained why an average intervention didn’t get great results — even though for some kids, it was exactly what they needed.”

Burgermaster said it underscored the importance of accumulating as much data as possible before the fact. She went on to do her postdoctoral research in biomedical informatics and now teaches in the Department of Nutritional Sciences at the University of Texas at Austin. Burgermaster kept the lesson in mind when she set out to develop an app that provides nutrition information to underserved communities.

“The reason why I was drawn to intervening via technology was not just to use data, but also it’s about meeting people where they are and get them to where they need to be. And let’s be honest: people are stuck in their phones,” she said.

The app, which is rolling out this spring in Austin, offers users personalized nutritional recommendations and interventions tailored to help them reach their goals. Like J&J’s test project with Apple, it’s another indication of the potential for health practitioners to use smartphone and wearable technology to generate data about their patients to help with diagnoses.

A Mobile Lab in Every Home

Mobile Instruments — Ozcan Lab

When Aydogan Ozcan talks about the potential of smartphone apps to effect transformative changes, don’t expect to hear him riff about cool new ways to arrange virtual candies on a screen or share adorable cat videos. He has a far bigger goal in mind.

Over the years, Ozcan’s lab has focused on developing field-portable medical diagnostics and sensors for resource-poor areas, coming up with relatively inexpensive ways to equip smartphones with advanced imaging and sensory capabilities that once were only found in expensive high-end medical instruments.

In the last decade, he has come up with ways to exploit the functionality available in contemporary smartphone hardware and software to further bio- and nano-photonics research. For example, one technique allowed a smartphone to produce images of thousands of cells in samples that were barely eight micrometers wide — and at the cost of less than $50 in off-the-shelf parts.

More recently, Ozcan demonstrated how the application of deep learning techniques can generate smartphone images that approach the resolution and color details found in laboratory-grade microscopes using 3-D printed attachments that cost less than $100 apiece.
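
The published work relies on far more sophisticated architectures and carefully registered training pairs; as a rough sketch of the general idea only, the toy PyTorch model below learns to map low-quality smartphone captures toward matched high-quality reference images. The random tensors stand in for real paired image patches, and nothing here should be read as the Ozcan lab’s actual method.

```python
# Minimal sketch of learned image enhancement: a small convolutional network
# trained to map low-quality smartphone patches to matched reference patches.
# Illustrative only; not the published models.
import torch
import torch.nn as nn


class EnhanceNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, kernel_size=3, padding=1),
        )

    def forward(self, x):
        # Predict a residual correction and add it to the input image.
        return x + self.net(x)


model = EnhanceNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.L1Loss()

# Placeholder tensors standing in for paired (smartphone, microscope) patches.
phone_batch = torch.rand(8, 3, 64, 64)
reference_batch = torch.rand(8, 3, 64, 64)

for step in range(100):  # toy training loop
    optimizer.zero_grad()
    loss = loss_fn(model(phone_batch), reference_batch)
    loss.backward()
    optimizer.step()
```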

“Instrumentation is very expensive. The cost of advanced microscopes, for example, can run to hundreds of thousands of dollars,” said Ozcan, a professor of electrical and computer engineering and bioengineering at the UCLA Samueli School of Engineering, and a three-time Blavatnik National Awards for Young Scientists finalist.

Smartphones: Mobile Medical Labs

Smartphones are relatively inexpensive, and more than 3 billion people use them around the world. That reach encourages Ozcan to envision a future in which resource-poor nations have expanded access to advanced measurement tools that provide data local residents can use to better treat medical conditions. Think of the average smartphone one day functioning as a mobile medical lab.

Ozcan also believes that people in their homes will soon be using a growing assortment of advanced mobile technologies and apps for preventive care, particularly when it comes to monitoring an aging patient or someone with a chronic condition.

“In the U.S., five percent of patients cause 50 percent of health expenditures per year. We can reduce that cost with better preventive care but for that, the home needs better technology. We should be able to provide that with mobile cost-effective systems so you can do some of the measurements that would normally require sending people to the hospital to take a sample, wait for the results and then go to the pharmacy with a prescription.”

While we may not be there yet, the world is fast approaching a tipping point where mobile apps lead to a veritable explosion of powerful, cost-effective alternatives to some of the most advanced biomedical imaging and measurement tools now on the market.

Also read: Tech’s Messy Challenge: Finding the Rx for Global E-Waste


About the Author

Charles Cooper is a Silicon Valley-based technology writer and former Executive Editor of CNET.

Citizen Science in the Digital Age: Get Out the Maps

An over-the-shoulder shot of a person driving, using an iPhone as a dashcam.

Mapillary aims to make the world a smaller place with maps that continually update street-level conditions.

Published May 1, 2019

By Robert Birchard

The term “citizen science” first entered the Oxford English Dictionary in 2014, but it describes a long-standing tradition of collaboration between professional and amateur scientists. Perhaps no field is as closely associated with citizen science as astronomy, where amateur stargazers continue to sweep the skies for unidentified heavenly bodies. Today, with the advent of smartphone technology, even more fields of scientific inquiry are open to the curious amateur.

Jan Erik Solem, CEO and Founder of Mapillary

Making the World a Smaller Place

With more than 440 million images from more than 190 countries, the street-level imagery platform Mapillary is trying to make the world a smaller place with maps that continually update street-level conditions.

“Carmakers can use the data to help train their autonomous systems — essentially ‘teaching’ cars to see and understand their surroundings — and mapmakers to populate their maps with up-to-date data. Cities can use it to keep inventories of traffic signs and other street assets among other things,” explained Jan Erik Solem, PhD, CEO and founder of Mapillary.

The data is collected by contributors who upload it onto Mapillary’s platform.

“The traditional approach to mapping places includes sending out fleets of cars to map cities and towns, but these places change faster than mapping corporations are able to keep up with,” Solem added.

Simple Tools Like Mobile Phones and Action Cameras

“Using simple tools like mobile phones or action cameras, anyone can go out, map their town and have data instantly generated from the images to update maps everywhere,” said Dr. Solem. “No one else collects data in this collaborative way.” The data is free for educational and personal use, he added. “The company is closely tied to the research community and we recognize how helpful it is for researchers to have access to the kind of data that’s hosted on our platform,” explained Dr. Solem. “Mapillary is a commercial entity, but we are driven by research and this is part of our way of paying it forward.”

The data that Mapillary receives is verified through computer vision technology and GPS coordinates, integrated with the mobile phones and cameras that map the roads. “Our computer vision technology detects and recognizes objects in images including things like traffic signs, fire hydrants, benches and CCTVs. Having diverse imagery from all over the world means we have a rich training dataset that enables us to build some of the world’s best computer vision algorithms for street scenes.”
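
Mapillary’s own detection pipeline is not public; as a rough sketch of the same idea, the snippet below runs an off-the-shelf pretrained detector from torchvision over a contributed street photo and reports the objects it recognizes with reasonable confidence. The file name is a placeholder and the model is a generic one, not Mapillary’s.

```python
# Rough sketch: detect street objects in a contributed image with an
# off-the-shelf pretrained detector (not Mapillary's own models).
import torch
from torchvision.io import read_image
from torchvision.models.detection import (
    fasterrcnn_resnet50_fpn,
    FasterRCNN_ResNet50_FPN_Weights,
)

weights = FasterRCNN_ResNet50_FPN_Weights.DEFAULT
model = fasterrcnn_resnet50_fpn(weights=weights).eval()
preprocess = weights.transforms()
labels = weights.meta["categories"]

img = read_image("street_scene.jpg")  # placeholder path for a contributed photo

with torch.no_grad():
    detections = model([preprocess(img)])[0]

# Keep confident detections and report what the model "sees" in the scene.
for label_idx, score in zip(detections["labels"], detections["scores"]):
    if score.item() > 0.7:
        print(f"{labels[int(label_idx)]}: {score.item():.2f}")
```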

Mapillary’s mobile app allows for instant updates with the latest road conditions.

Keeping Citizens in Science

Citizen science requires the enthusiastic participation of the public, but how can researchers keep the public engaged? This question was recently considered in a paper from Maurizio Porfiri, PhD, of the Dynamical Systems Laboratory at New York University, titled “Bring them aboard: Rewarding participation in technology-mediated citizen science projects.”

The team hypothesized that monetary rewards and online or social media acknowledgments would increase engagement of participants.

“People contribute to citizen science projects for a variety of different reasons,” said Jeffrey Laut, PhD, a postdoctoral researcher in Dr. Porfiri’s lab. “If you just want to contribute to help out a project, and then you’re suddenly being paid for it, that might undermine the initial motivation.”

“For example, one of the things we point out in the paper is that people donate blood for the sake of helping out another human,” explained Dr. Laut. “Another study found that if you start paying people to donate blood, it might decrease the motivation to donate blood.”

Proper Rewards for Participation

If a citizen science project is suffering from low levels of participation, researchers need to carefully choose the level of reward.

“I think with citizen science projects the intrinsic motivation is to contribute to a science project and wanting to further scientific knowledge,” said Dr. Laut. “If you’re designing a citizen science project, it would be helpful to consider incentives to enhance participation and also be careful on the choice of level of reward for participants.”

The technology used and the scope of information collected may have changed, but the citizen scientist’s role remains as important as ever.

“It is important that citizens understand the world in which they live and are capable of making informed decisions,” said Ms. Prieto. “It’s also important that all people understand science, especially to combat disinformation. From this point of view citizen science is vital and a needed contributor to the greater field of science.”


Learn more about citizen science: