Help Wanted to Close the Skills Gap

The fastest growing occupations over the next decade will be in the energy, health and education sectors.

Published October 1, 2019

By Joan Lebow

Fabio Manca, Head of the Skills Analysis team at the OECD Centre for Skills

According to the Bureau of Labor Statistics, the fastest growing occupations over the next decade will be in the energy, health and education sectors, while the medical and technical sectors will contain the highest paying occupations. All these occupations will require a STEM education.

STEM learning is often cited by the public and private sectors as the way to prepare for a technology-driven future. A recently published study by Randstad USA, an employment/recruitment agency, found that 68 percent of U.S. workers surveyed would focus on studying science, technology, engineering and math (STEM) fields if they could restart their educational journeys at age 18.

Spending for STEM education has grown substantially at all levels of schooling, largely due to the investment of billions of public and private sector dollars. This trajectory continues even with the persistent challenge of keeping young people, especially girls, engaged in STEM learning from their elementary years through higher education.

Filling the “Skills Gap” in STEM Careers

On the surface, an emphasis on STEM would seem to be all that’s needed to prepare the next-generation workforce. But with employment in STEM-related occupations projected to grow to more than nine million jobs by 2022, and a steady drumbeat of corporate leaders saying they cannot find qualified workers for millions of open positions, the issues surrounding the so-called “skills gap” are not quite that straightforward.

“To thrive in a digital world, workers will need not only digital skills, but a broad mix of skills including strong cognitive and socio-emotional skills. High-level information communication technology skills will also be increasingly important in growing occupations linked to new technologies,” says Fabio Manca, Head of the Skills Analysis team at the Organisation for Economic Co-operation and Development (OECD) Centre for Skills.

The OECD is an international forum and knowledge hub for data and analysis, best-practice sharing, and advice on public policies and global standard-setting. “[Workers] will also need complementary skills, ranging from good literacy and numeracy to the socio-emotional skills required to work collaboratively and flexibly,” says Manca.

Also Developing Soft Skills

Peter Robinson, President and CEO, United States Council for International Business (USCIB)

Analysts agree that more training and more types of abilities are needed now and in the future for workers to fill those jobs. Along with STEM knowledge, it’s traits like “flexibility” and “adaptability” that analysts repeatedly mention as signposts to success.

“It’s not just the hard skills, but critical thinking and soft skills that will be valued,” says Peter Robinson, president and CEO of the United States Council for International Business (USCIB), a policy advocacy and trade services organization dedicated to promoting open markets and representing American business interests internationally.

Technological advances mean work itself will keep evolving. Robinson and others call for more public-private partnerships among business, education and government to help the labor force prepare for, and respond to, change. Without this shared burden they see a skills gap that will only widen.

“You won’t be able to front load your education. You will have to be adaptable to change down the road in your career,” says Robinson.

It Starts with Education

Any one-dimensional academic or on-the-job background could pose challenges. As the OECD’s 2019 Report on Skills points out, “Initial education systems have a key role to play in providing young people with the skills required for a successful entry into the labor market. However, deep and rapid changes in technology make it difficult for initial education to equip young people with the knowledge and capabilities they will need throughout their work life.”

Says the OECD’s Manca, “Recent research by the OECD also highlights that labor market shortages are widespread in high-skilled occupations that make an intense use of communication and verbal abilities, these latter influencing the acquisition and application of information in problem solving contexts.”

The abilities to collaborate, problem-solve, think creatively and stay malleable enough for a future of lifelong learning are essential, experts agree. A paradox is emerging. Such skills are often best learned on the job, and not having them is an impediment to hiring, the USCIB’s Robinson explains. He says companies will need to partner with the education system much earlier. “They can’t just show up on graduation day.”

New approaches to curriculum, modern versions of industrial apprenticeships, and efforts to re-skill existing employees and returning mid-career employees through “returnships” are among the ways to accomplish these expanded training needs. “Employers who want the right work force will also need to invest in training workers,” says Robinson. “But it will not be just about training in computers or robotics. Entire industries may change in ways we don’t foresee.”

Sangheon Lee, Director of the Employment Policy Department of the International Labour Organization (ILO)

Filling the “Investment Gap”

“We have an investment gap,” says Sangheon Lee, Director of the Employment Policy Department of the International Labour Organization (ILO). The ILO seeks to promote full and productive employment by developing integrated employment, development and skills policies. Lee also views reinvigorated job training initiatives as essential to creating a productive workforce.

“The most important thing is to reduce the gap between the rhetoric and investment,” Lee says. “In over 20 countries, people are learning more and doing more in STEM. But what they are learning is theoretical and needs to be more reality-based. You need to come out of your education with some reasonable set of skills, and the job would train you further.”

Lee and other labor policy analysts concur: a forward-thinking combination of government, education and industry must support this focus on training and especially lifelong learning. For now, employers are poaching skilled workers from other companies.

“They are hesitant to spend money on training for transferable skills, the very skills that are often important to success. Instead, employers typically want to invest only in training related to a specific job, keeping their investments targeted to their bottom line,” says Lee.

This is especially true in the tech sector where innovative businesses are small and agile, but don’t have the money for significant training programs, Lee notes.

Tax Incentives for Job Training

Neither students nor individuals seeing their jobs morph mid-career can afford to pay for additional training without help. Public incentives will be necessary, from apprenticeships to late-career pivots. According to Lee, new accounting structures, tax incentives for job training, and more up-front government investment will be important tools bridging the skills gap as work changes.

Another critical issue to address that will ultimately narrow the skills gap, Lee says, is gender bias. More attention is needed to improve workplace policies and attitudes towards qualified women in the labor force. STEM skills may land a woman a job, he points out, but attitudes and stereotypes are a persistent barrier to their success, especially in STEM professions.

“There is still a lot of implicit discrimination. It’s not just about the ability to do the job,” Lee says.

Labor policy analysts say it’s an over-simplification to divide jobs of the future into tech and non-tech roles; the future of work will be far more nuanced than a simple split between the STEM haves and have-nots. To prepare for what’s ahead and be able to address changes when the time comes, as well as to find a workforce with the necessary skills, will take a longer, collaborative view from many societal sectors.

“There needs to be a paradigm shift, from employment to employability,” says Robinson from USCIB.

Grant Rejection Could Be the Best Thing for Your Career

Four scientists and engineers share their experiences of transitioning from academia into research-focused private sector positions.

Published October 1, 2019

By Ann Delfaro

As a doctoral student, microbiologist Natasha Frank was known for challenging assumptions. Her scientific skepticism and technical skills steered more than one experiment to safety when it threatened to tank, and classmates routinely approached her for advice.

Few were surprised, then, when Frank accepted a postdoctoral position at the Pacific Northwest National Laboratory and started down the path of a traditional academic career. Later, as a research scientist at Washington State University, she divided her days between teaching, bench work and grant applications.

It’s not that Frank particularly wanted to become a professor — that’s simply the path graduate students are steered down, she says.

“I’d heard of a few alternate careers in science but they seemed out of reach,” says Frank. “I always thought, how do you get into those things?”

She eventually accepted a microbiologist position at Clorox, reasoning that industry was basically science with added job stability.

But that wasn’t quite true, as she discovered when her department was dissolved. While scanning LinkedIn for new opportunities, she noticed that she met all the qualifications for a position unlike any other on her CV.

She landed the job. Now she works as a patent agent for a large molecular diagnostics company, using her science training to gauge whether new products or services might infringe on existing patents.

“I went from thinking alternative careers were out of reach to having one,” she says.

If Frank’s story seems familiar, that’s because it is. More and more students are graduating from PhD programs — a 41 percent increase between 2003 and 2013 — but ultimately, only 26 percent move into tenured or tenure-track positions in the United States. Others migrate to jobs in business, government or industry.

And still others leave science entirely. Sort of.

Define ‘Anomaly’

Joseph Brown, a senior data scientist, holds a PhD in biomedical sciences and was working for Thermo Fisher Scientific — writing software to analyze peptide behavior in different conditions — when a friend mentioned the strong culture and benefits at nearby Netflix.

On a whim, Brown went online and scanned the company’s job listings. He noticed one for a data scientist to do anomaly detection; that is, to pinpoint a small number of problematic servers among the company’s hundreds of thousands of servers.

“And I thought, you know — it’s kind of similar to my past work, identifying individual peptides or genes that are behaving unusually in a huge swath of the proteome or transcriptome,” Brown says.

Video streaming might seem a far reach from molecular biology, but for Brown the shift was a natural progression of his lifelong interests in statistics and computer programming.

“The math is what really tied everything together,” he says.
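
That math translates almost directly into code. As a minimal, hypothetical sketch (the function, data and threshold below are invented for illustration and are not Netflix’s or Thermo Fisher’s actual tooling), a robust z-score can flag the few misbehaving servers in a fleet the same way it can flag genes or peptides behaving unusually:

```python
import numpy as np

def flag_anomalies(values, threshold=3.5):
    """Return a boolean mask of entries with extreme robust z-scores.

    The same logic applies whether `values` holds per-server error
    rates or per-gene expression levels: most entries cluster near
    the median, and the misbehaving few stand far outside it.
    """
    values = np.asarray(values, dtype=float)
    median = np.median(values)
    mad = np.median(np.abs(values - median))  # median absolute deviation
    if mad == 0:  # all values identical; nothing can be an outlier
        return np.zeros(values.shape, dtype=bool)
    robust_z = 0.6745 * (values - median) / mad  # 0.6745 scales MAD to sigma
    return np.abs(robust_z) > threshold

# Hypothetical fleet: 10,000 healthy servers plus three problem ones
rng = np.random.default_rng(0)
error_rates = rng.normal(0.01, 0.002, 10_000)
error_rates[:3] = [0.25, 0.31, 0.19]  # the servers we hope to catch
print(np.flatnonzero(flag_anomalies(error_rates, threshold=6)))  # -> [0 1 2]
```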

Now he works alongside other scientists, most holding doctorates in physics, economics, mathematics or computer science. While few have a life sciences background, it isn’t unheard of, according to Brown.

Rebranding the PhD

David Cox
MIT-IBM Watson Artificial Intelligence Lab

As it turns out, math isn’t the most critical common denominator, according to David Cox, director of the MIT-IBM Watson Artificial Intelligence (AI) Lab at the Cambridge Research Center.

“A lot of it is training you how to think, how to solve problems, how to be resilient,” Cox says.

During his years as a Harvard professor of engineering, computer science, and molecular and cell biology, Cox saw many PhD graduates apply their critical thinking skills to successful careers in consulting. In particular, he says, the routine practice of “analyzing data” is now called “data science” — and it’s in high demand.

“Scientists have been doing that for a long time and didn’t think anything of it, but industry has woken up to the idea that this is an interesting thing to do with business data,” Cox says. “If you know how to wrangle data, run statistically valid and rigorous tests to understand it, that’s a marketable and valuable skill.”
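
As a hedged illustration of what a “statistically valid and rigorous” test on business data can look like, the sketch below runs Welch’s two-sample t-test with SciPy, the same tool a bench scientist might use to compare treatment and control groups; the scenario and numbers are invented for the example:

```python
import numpy as np
from scipy import stats

# Hypothetical experiment: daily conversion rates under two page layouts
rng = np.random.default_rng(42)
layout_a = rng.normal(0.042, 0.008, 60)  # 60 days with the current layout
layout_b = rng.normal(0.047, 0.008, 60)  # 60 days with the variant

# Welch's t-test does not assume the two groups have equal variances
t_stat, p_value = stats.ttest_ind(layout_a, layout_b, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4g}")
if p_value < 0.05:
    print("The difference is unlikely to be chance at the 5% level.")
```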

It’s obvious how computer science graduates might leverage that skill, but scientists in fields such as neurology can bank on that, too. The combination of data analysis and specialized knowledge — for example, how the brain and intelligence work — is especially transferable.

“Those skills are often transferable to thinking about AI and structuring experiments to understand what is happening in an artificial system,” Cox says.

Emphasizing Marketable Skills

Sometimes PhDs need help rebranding themselves to emphasize these marketable skills. That’s where physicist Alejandro de la Puente comes in.

“Nowadays, there are fewer options in academia and more options elsewhere,” says de la Puente, who completed a postdoc in physics and now offers career and professional development for STEM graduates at the New York Academy of Sciences.

The pressures that discourage recent STEM graduates from entering academia are cyclical, de la Puente notes. Few tenure positions exist because scientists who land those positions tend to stay a long time and retire late in life. At the same time, university enrollment is up. To deal with the demand, institutions are hiring more adjuncts or non-tenure-track professors than in years past.

“When you join as an adjunct, most of your responsibility is teaching,” de la Puente explains. “So it’s a circular thing: You want to stay in academia, but most positions are not tenure track. And if you’re not tenure track, you’re doing more teaching and less research. That limits your chances of getting grants and gives you no chance at tenure.”

Through the Academy’s Science Alliance Initiative, de la Puente teaches scientists how to transfer their skills to nonacademic jobs, how to broaden their reach — and most importantly, how to communicate the technicalities of their work to a broader audience, including job recruiters. The program fills an unmet need for graduate students like Frank, who may hear about alternate careers but have no idea how to pursue one.

Counter Culture

Chacko Sonny
Blizzard Entertainment

How does one land quite so far from the lab, though?

Chacko Sonny, executive producer and vice president at Blizzard Entertainment, the company behind the game Overwatch, knew he wanted to be an engineer years before enrolling in Stanford’s undergraduate and master’s electrical engineering programs. But what he didn’t count on was eventually applying that training to the video game industry.

Strategic by nature, Sonny was working as a consultant for the international strategy firm McKinsey & Company when he realized he craved a change of pace. Specifically, he wanted to use his training in electrical engineering and economics to build and market things, and he wanted those things to be fun and creative. He saw two options: the visual effects industry or the video game industry.

Sonny began applying to every game company he could think of, finally landing an interview with Los Angeles-based Activision. He noticed a “massive” culture divide between the engineering and video game industries.

A Heterogeneous Blend of Talent

Whereas both his McKinsey and video game colleagues were exceptionally smart, his game industry colleagues were talented across more dimensions than is typically found at consulting companies. Teams of 200 people, consisting of a third each of artists, designers and engineers, collaborated on projects that demanded a heterogeneous blend of talent.

For one thing, debugging problems becomes a massive ordeal for video games built on millions of lines of code.

“If a character behaves oddly on screen or doesn’t display an expected behavior, you need a structured problem-solving approach to figure out why,” he says.

Games can take hundreds of hours to play to completion, so Sonny used his engineering mindset to home in on small yet critical errors in the code. The consistent challenge and excitement of the game industry propelled him forward, and before long he’d made a career of it.

Forward Momentum

Like Sonny, Frank has gained valuable, diverse skills since leaving the traditional academic route.

“I get exposed to business development and even finance, regulatory, marketing, communications,” she says.

She’s learned how to calculate prospective revenue and determine if the company can afford a certain license. These challenges keep her engaged, but she hasn’t ruled out a future career shift.

“This experience has created opportunities to do different things, should I decide later on that I might take a different turn.”

Also read: So, You Want to Publish a Scientific Paper? and Legendary Labs: Secrets for Scientific Excellence.

Minding the Science and Technology Skills Gap

Boosting STEM classes in public schools and retraining adults so they can enter STEM fields are only the first steps to closing the employment skills gap. Long-term solutions are much more complex.

Published October 1, 2019

By Alan Dove, PhD

Mark Dembo
Cornell University eCornell

In the 21st century, advances in science and technology drive much of the global economy, employing millions of people while causing fundamental shifts in the nature of work and the distribution of wealth. These changes have led many corporate leaders, academics and policy experts to warn of a widening “skills gap,” in which a lack of workers with the necessary training holds companies back and exacerbates inequity.

Traditional labor markets follow the law of supply and demand, where filling a position requires little more than offering adequate pay and benefits based on the number of workers able to do the job. When the employer pays the market price for a position, someone will take it. In some science and technology fields today, however, companies have trouble finding qualified employees at any price.

Policymakers and educators have offered blanket solutions for the problem, ranging from efforts to boost science, technology, engineering and math (STEM) classes in public schools, to retraining skilled adults looking to change fields. However, discussions with subject experts reveal that the reality of the skills gap is complex, and suffused with thorny geographic, economic and political challenges.

Serfs Up

The U.S. unemployment rate, often cited as a major indicator of economic health, has been falling since 2010 and now hovers below four percent. Beneath that rosy figure hides a troubling reality, with huge swaths of the population in precarious, low-paying jobs.

“The people at the bottom of the skills spectrum have experienced wage stagnation and lower mobility, while the people in high skill jobs have seen more job opportunities and … great upward mobility,” says Marcela Escobari, Senior Fellow in Global Economy at the Brookings Institution in Washington, D.C.

The skills gap lies at the core of this bifurcation; educated, skilled workers, especially in science and technology fields, enjoy expanding opportunities and growing wages, while less-skilled individuals see their options narrowing and wages shrinking. Advances in automation promise to make the problem worse, as computers and robots replace mostly low-skilled workers. Geography also influences this trend, with most of the high-skill, high-paying jobs concentrated in a handful of major cities.

Drawing on large databases of employment and social trends, Escobari and her colleagues have identified the factors that could drive a more broad-based form of economic growth. Brookings is now producing a series of reports based on their findings, to help regions address not only the skills gap but the broader social and economic forces that exacerbate it.

“For your low-wage workers to be able to take advantage of opportunities, [they need] affordable housing, accessible transport, [and] childcare,” says Escobari. Most important, “cities need thriving industries that create opportunities for upward mobility,” adds Escobari.

CEOs: The Skills Gap Is a Big Problem

Even with the basic services in place, training and re-training workers for fast-evolving businesses will require a major change in tactics. One recent survey found that the vast majority of CEOs say the skills gap is a big problem for them, but few have invested in training programs to address it.

Brookings Center for Universal Education Senior Fellow Marcela Escobari presents her May 2019 report “Growing Cities that Work for All: A Capability-based Approach to Regional Economic Development” at the 2019 Building the Workforce of the Future: Resilient People and Places symposium. Photo: Brookings

Companies that do implement training programs often see their workers poached by competitors who didn’t have to make that investment. Escobari contrasts that with the situation in many European countries, where strong unions and labor regulations encourage companies to collaborate on training and building the pipeline of talent, “then even when people move from company to company, they all benefit from having more highly skilled and technically able people.”

With 44 percent of the American workforce now in low-wage jobs, the problem may be coming to a head.

“People are thinking about this because we are seeing the repercussions not only in increased inequality, and financial precariousness of low wage workers, but also in the political sphere,” says Escobari.

The Express Train

That political discontent is likely to get worse when the current economic boom reaches its inevitable end. “It’s sort of the calm before the storm, because you have high employment, [but] as in any economic cycle, when that starts to go down you’re going to see a major transformation,” says Art Langer, director of the Columbia University Center for Technology Management in New York, N.Y. Langer, who also founded the job training nonprofit Workforce Opportunity Services (WOS), has been working on multiple fronts to close the skills gap.

Traditionally, students interested in science and technology have been encouraged to get college degrees rather than vocational certifications, but that’s now leaving some industries shorthanded.

“You have advanced manufacturing that is using all different types of scientific and computerized equipment, and there are huge skills gaps there,” says Langer.

Meanwhile, the best-trained white collar workers gravitate to trendy high-tech companies such as Google and Facebook, leaving insurers, banks and other traditional businesses short of skilled labor as they try to adopt more sophisticated technologies. The irony of these skills gaps is that Americans are attending college in record numbers, and racking up massive debts to do so.

Addressing the Skills Gap

Many graduate with degrees that haven’t prepared them for the jobs that are available. While many educators and policymakers focus on public schools and state universities to address the skills gaps, Langer doesn’t have much hope for that approach.

“Public institutions of higher education [are] controlled by political forces,” says Langer, adding that changes in legislatures and governorships often jerk policies and funding in different directions every few years. “This concept that somehow these institutions … are going to change themselves is a dream,” he says.

Instead, Langer advocates transforming the relationship between employers and job training programs. WOS, for example, works directly with companies to identify the skills they need, then finds and trains people for those positions. By focusing on underserved job seekers, including minorities, women and veterans, WOS is able to recruit eager, talented individuals who would otherwise be left out of highly skilled jobs. As that and other collaborative job training programs take off, Langer hopes more traditional educational institutions will adopt similar approaches.

The Gospel According to the Peter Principle

Some major universities are already working to boost their vocational training programs, especially online.

“Our focus is primarily on providing online certificate programs that are really focused on the working professionals [and] online professional development,” says Mark Dembo, director of corporate programs at Cornell University’s eCornell in Ithaca, N.Y. Dembo explains that eCornell works closely with major employers to determine industries’ current needs, and tailors programs to meet those needs.

That perspective reveals two major types of skills gaps. First, companies need increasing numbers of technically trained people to take on entry-level positions, especially jobs requiring data analysis and computer programming capabilities. The second gap, which has received less publicity, comes after those employees have advanced in their fields for a few years.

“What we hear quite often is ‘we’ve got people that have very strong technical backgrounds, [but] now I need them to lead teams,’” says Dembo, adding that many companies have “people that are strong technically, and then they get to a point of failure because they don’t have those softer skills” required to manage people.

More Technical Training

In particular, Dembo distinguishes between leadership and management skills. The former refers to the ability to influence people and unify teams around common goals, while the latter entails an understanding of budgeting, administration and group organization. To meet the growing need, eCornell and other online universities now offer programs to teach both. Conversely, Dembo says he also hears from established managers who need more technical training to be able to understand what their subordinates are doing.

The rising need for continuing education underscores another major trend in the labor market: companies want to hire lifelong learners.

“In today’s world you’re going to have to continue to adapt because the needs are going to change, of what’s needed in the labor market,” says Dembo. Faculty will also need to adapt, keeping ahead of trends in employers’ needs so they can continue teaching relevant knowledge and skills to their students.

With student loan debt in the U.S. now ballooning past $1.5 trillion, employers’ demand for lifelong learners is taking a heavy toll on their future and present employees. Though he declines to comment on the student loan issue, Dembo urges people to take careful stock of their skills and finances, and consider the return they expect to get from their educational investments.

Gaps in the Clouds

The ways technology firms respond to the skills gap reflect their unique needs, as well as a less appreciated aspect of the problem: geography. Companies outside major cities have been hit especially hard.

“We … consistently have to go outside of our area and outside of our state to source sufficient talent, credentials, experiences and diversity,” says Bill Avey, global head of personal systems services at Hewlett-Packard in Boise, Idaho.

As a major employer in Boise and a leading manufacturer of personal computers, HP faces an ongoing struggle to find and develop the skilled workers it needs. The problem extends across the educational spectrum.

Almost half of Idaho children need remedial education as early as kindergarten to meet minimum standards, and many fail to thrive academically in later years. In response, Avey and leaders in other local companies have banded together to lobby Idaho’s deeply conservative politicians for solutions.

“Business leaders are the one group of folks that can credibly show up in the legislature and say … something as crazy as ‘we suggest you raise our taxes to spend more on education,’” says Avey, adding that “it’s very different than a teachers’ union showing up.”

Expanding Access to Education

Boosting education budgets is only a partial solution, though. Even for companies in major metropolitan areas with access to top university graduates, science and technology businesses are changing and growing so fast that demand for skills vastly outstrips supply.

“The country produces about 60,000 computer scientists every year, whereas we’re seeing more than 700,000 technology jobs open,” says Obed Louissaint, vice president of talent at IBM in Armonk, N.Y.

The drastic expansion of artificial intelligence technology is one of the biggest drivers of the skills gap. Previously the domain of a handful of high-tech companies, AI is now considered indispensable in numerous industries.

“We have financial services firms, retailers and insurance companies who are all looking for people with AI skills or data science capability, [which] puts a strain on the available talent,” says Louissaint.

IBM is attacking the problem aggressively, with multiple initiatives to retrain many of the same groups targeted by Langer’s team: blue collar workers, veterans and women, whom the company then places in rapidly expanding fields. Like others confronting the skills gap, Louissaint also emphasizes the need for workers to change their perspectives on education and training: “They should be thinking of learning … as a continuous journey.”

What Happens When Innovative Scientists Embrace Entrepreneurship?

Deciding to make the leap from research to start-up doesn’t mean you have to leave your passion for science behind.

Published October 1, 2019

By Chenelle Bonavito Martinez

Sean Mehra
Chief Strategy Officer and co-founder, HealthTap

The days of lifetime employment with one employer are long gone. Most people will have at minimum half a dozen jobs over a working lifetime and possibly two or three career paths. And just as many will try their hand at starting their own business. Unfortunately, small business statistics show that by the end of four years more than half of them will be gone.

But scientists may have a distinct advantage when deciding to become entrepreneurs. Forbes contributor and STEM consultant Anna Powers writes in a 2018 article titled “One Scientific Principle Will Make You A Better Entrepreneur” that “…the process of entrepreneurship mirrors the process of innovation in science. A lot can be learned [about innovation and entrepreneurship] from science, which has formulated certain guidelines about the process of innovation. Perhaps that is why almost 30 percent of Fortune 50 CEOs are scientists.”

The key to easing the transition from employee to “boss” is recognizing how the skills you possess for one job translate into another. This applies not only to a direct transfer of specific technical knowledge or soft skills like communication and collaboration, but also to how certain skills specific to your current career are the same as those you need to become a successful entrepreneur.

What it Takes

So what does it take for a scientist to become an entrepreneur? Opinions vary, but mostly it starts with a question and a desire to make an impact. However, deciding to make the leap from research to start-up doesn’t mean you are leaving your passion for science behind.

Sean Mehra, Chief Strategy Officer and co-founder of HealthTap, a digital health company that enables convenient access to doctors, says, “People think of the decision to be an entrepreneur as a choice to leave your skills and knowledge as a scientist behind, when that’s not really the case.” Scientists are innovators and they can easily identify as entrepreneurs. Mehra cites several examples of skills developed in the lab that can be applied to starting a business.

“Writing grants to acquire funds for research is not much different than fundraising, corporate development and sales,” he says. “Conducting experiments is product R&D and market fit. If you have recruited postdocs to work in your lab and guided their work, then you have hired talent and managed a team. Publishing and presenting your research at conferences is pretty much like marketing your vision. And networking and connecting with colleagues in your field is no different than prospecting for business connections and talking to your customers.”

Myriam Sbeiti and Daniela Blanco, co-founders of Sunthetics, met in school and, as graduation approached, saw an opportunity to launch a viable business. In 2018 they developed a more efficient and more sustainable electrochemical manufacturing path for a chemical intermediate of Nylon 6,6. The process uses electricity rather than heat to power the reaction in a way that uses 30 percent less raw materials and energy, reducing a variety of harmful emissions in the process.

Sunthetics co-founders from left to right: Professor Miguel Modestino, Myriam Sbeiti, Daniela Blanco

Similar to the Scientific Method

In the future, Sbeiti and Blanco plan to apply this electrochemical process to a variety of reactions, making the chemical industry green one reaction at a time. Sbeiti reflects that much of the research and interviewing they conducted to figure out whether their ideas were viable was very similar to the scientific method and scientific experiments: they created a hypothesis and then needed to validate it. The major difference was that they did not need to confirm their hypothesis through years of research; instead they needed to talk to potential customers to find the right fit.

As scientists and researchers themselves, both emphasized that accepting failure was the hardest skill to master. “The chemical industry wasn’t really interested in our original idea and the fashion industry didn’t really see value.” After a round of customer interviews, they realized they were designing a product they thought the customer needed instead of the product the customer said they wanted. In addition, efficacy and cost were a customer priority, so Sbeiti and Blanco pivoted their idea to develop a product that fit the market. The Sunthetics team is shaping up to make the impact they envisioned after graduate school. In fact, Blanco continues to pursue her technology as part of her PhD research and “thinks of it like R&D.”

Entrepreneurship is definitely a “higher risk and higher reward” scenario says Mehra. Most traditional researchers typically have a lower risk tolerance than the average innovator or entrepreneur. It can be very uncomfortable for a trained researcher turned entrepreneur to accept failure and pivot away from their original idea. But Mehra says that “even if the original idea isn’t quite right, there is still a lot of good knowledge acquired through the process.”

Unlocking the “Why”

Unlocking the “why” and a desire to create impact at scale are drivers behind making the shift into entrepreneurship. While contemplating his own career path, Mehra reflects that “thinking about my passion for technology, I realized that technology has a way to scale and have a one-to-many impact on the world. I started to think about ways I could use my technology skills to help people on a global scale instead of, for example, treating patients one-at-a-time as a doctor.”

Sbeiti and Blanco also began their journey by observing their surroundings and asking why. These common traits make up what Clayton Christensen, the Kim B. Clark Professor of Business Administration at Harvard Business School, and his co-authors call “The Innovator’s DNA.” After six years of studying innovative entrepreneurs, executives and individuals, they concluded that this common skill set is present in every innovative entrepreneur. Christensen et al. argue that if innovation can be developed through practice, then the first step on the journey to being more innovative is to sharpen these skills.

Studies of identical twins separated at birth indicate that one’s ability to think creatively comes one-third from genetics. “That means that roughly two-thirds of our innovation skills come through learning — from first understanding the skill, then practicing it, and ultimately gaining confidence in our capacity to create,” says Christensen. The most important skill to practice is questioning. Asking “why” or “what if” can help strengthen the other skills and allow you to see a problem or opportunity from a different perspective.

Ted Cho
StartupHoyas MED

A Search for Something That’s Never Been Done

Ted Cho, President of StartupHoyas MED, an organization dedicated to healthcare startups and innovators at Georgetown University, sees that skill in many of the innovators and entrepreneurs who are part of the StartupHoyas community. Like Drs. Jean-Marc Voyadzis and Hakim Morsli, who created Amie, a “virtual surgical assistant” to help patients prepare for surgery and recovery, entrepreneurs often create their companies by observing and questioning their surroundings, identifying a problem, and developing a solution.

Cho says that “one of the most common pitfalls for entrepreneurs is building solutions without problems. Oftentimes the most successful startups are those that are rooted in problems that the founders experienced firsthand. However, that doesn’t mean that you necessarily have to be an insider. Some of the most innovative ideas with the greatest potential to create impact have come from outsiders with fresh perspectives who aren’t locked into the conventions that seem to restrict many of the traditional players in the healthcare space.” While all of the innovators and entrepreneurs in the StartupHoyas community are focused on improving healthcare, not all are medical students. In fact, many are students and faculty from other areas of life sciences.

Starting one’s own company is much like scientific research — it’s the search for something that’s never been done before, because there is no product that is exactly like yours. But it’s important for researchers considering a business launch to stay flexible. As Cho says, “pick something you love, but be careful not to fall in love with your own science.”


Creative Intelligence

Innovative entrepreneurs have something called “creative intelligence,” which enables discovery, yet differs from other types of intelligence. This means innovators are more than “right-brained individuals.” They engage both sides of the brain and leverage what Christensen and his co-authors call the “five discovery skills” to create new ideas.

  • Associating: Connecting seemingly unrelated questions, ideas or problems from different areas.
  • Questioning: Challenging the status quo by asking “why,” “why not” and “what if.”
  • Observing: Scrutinizing common phenomena, particularly behavior.
  • Experimenting: Trying new ideas.
  • Networking: Meeting people with different viewpoints, ideas and perspectives to expand your knowledge.

Source: Excerpted from “The Innovator’s DNA.”

Also read: Advancing Innovation and Entrepreneurship in Clean Energy

Darwin’s Dilemma: The Origin and Evolution of the Eye

Award-winning science writer Carl Zimmer explains the “creation” of the organ so complex that it baffled even Darwin.

Published October 1, 2019

By Carl Zimmer

“The eye to this day gives me a cold shudder,” Charles Darwin once wrote to a friend.

If his theory of evolution was everything he thought it was, a complex organ such as the human eye could not lie beyond its reach. And no one appreciated the beautiful construction of the eye more than Darwin—from the way the lens was perfectly positioned to focus light onto the retina to the way the iris adjusted the amount of light that could enter the eye. In The Origin of Species, Darwin wrote that the idea of natural selection producing the eye “seems, I freely confess, absurd in the highest possible degree.”

For Darwin, the key word in that sentence was seems. If you look at the different sorts of eyes out in the natural world and consider the ways in which they could have evolved, Darwin realized, the absurdity disappears. The objection that the human eye couldn’t possibly have evolved, he wrote, “can hardly be considered real.”

Dozens of Different Kinds of Eyes

Today evolutionary biologists are deciphering the origins of not just our own eyes but the dozens of different kinds of eyes that animals use. Fly eyes are built out of columns. Scallops have a delicate chain of eyes peeking out from their shells. Flatworms have simple light-sensitive spots. Octopuses and squids have camera eyes like we do, but with some major differences. The photoreceptors of octopuses and squids point out from the retina, towards the pupil. Our own eyes have the reverse arrangement. Our photoreceptors are pointed back at the wall of the retina, away from the pupil.

For decades, most scientists argued that these different eyes evolved independently. The earliest animals that lived over 600 million years ago were thought to be eyeless creatures. As their descendants branched out into different lineages, some of them evolved their own kinds of eyes. It now turns out, however, that this is not really true.

All eyes, in all their wonderful variety, share an underlying unity in the genes used to build them. By tracing the history of these shared genes, scientists are uncovering the steps by which complex eyes evolved through a series of intermediate forms.

Opsins in Common

When light enters your eye, it strikes a molecule known as an opsin. Opsins sit on the surface of photoreceptor cells, and when they catch photons, they trigger a series of chemical reactions that causes the photoreceptor to send an electrical message towards the brain.

Biologists have long known that all vertebrates carry the same basic kind of opsin in their eyes, known as a c-opsin. All c-opsins have the same basic molecular shape, whether they’re in the eye of a shark or the eye of a hummingbird. All c-opsins are stored in a stack of disks, each of which grows out of a hair-like extension of the retina called a cilium.

In all vertebrates, c-opsins relay their signal from the stack of disks through a pathway of proteins called the phosphodiesterase pathway. All of these homologies suggest that c-opsins were present in the common ancestor of all living vertebrates.

Vertebrates belong to a much larger group of species known as bilaterians—in other words, animals that develop a left-right symmetry. The main lineage of these other bilaterians, known as protostomes, includes millions of species, ranging from insects to earthworms and squid.

Protostome eyes don’t have the c-opsins found in vertebrates. Instead, protostomes build another molecule, known as an r-opsin. Instead of keeping r-opsins in a stack of disks, they store them in foldings in the membranes of photoreceptors. R-opsins all send their signals through the same pathway of proteins, which is different from the pathway c-opsins use in vertebrates.

Humans Also Make R-Opsins

These similarities in the r-opsins suggest they evolved in the common ancestor of protostomes, only after their ancestors had branched off from the ancestors of vertebrates. Likewise, vertebrates only evolved c-opsins in their eyes after the split. In recent years, however, evolutionary biologists have discovered opsins where they weren’t supposed to be.

It turns out, for example, that humans also make r-opsins. We just don’t make them on the surfaces of photoreceptors where they can catch light. Instead, r-opsins help to process images captured by the retina before they’re transmitted to the brain.

In 2004, Detlev Arendt of the European Molecular Biology Laboratory and his colleagues also found c-opsins where they weren’t supposed to be. They were probing the nervous system of an animal known as a ragworm, which captures light with r-opsins. Arendt and his colleagues discovered a pair of organs atop the ragworm’s brain that grew photoreceptors packed with c-opsins.

Arendt sequenced the gene for the ragworm c-opsins and compared it with genes for other opsins. He found that it is more closely related to the genes for c-opsins in our own eyes than it is to the genes for r-opsins in the ragworm’s own eyes. These findings have led Arendt and other researchers to revise their hypothesis about the origin of opsins: the common ancestor of all bilaterians must already have had both kinds of opsins.

Clues from Cnidarians

But Todd Oakley, a biologist at the University of California at Santa Barbara, wondered if opsins might be even older. To find out, Oakley and his colleagues turned to the closest living relatives of bilaterians. Known as the cnidarians, this lineage includes jellyfish, sea anemones, and corals.

Adapted with permission from The Tangled Bank: An Introduction to Evolution, by Carl Zimmer (copyright 2010, Roberts & Company, Greenwood Village, CO).

Biologists have long known that some cnidarians can sense light. Some jellyfish even have eye-like organs that can form crude images. In other ways, though, cnidarians are radically different from bilaterians. They have no brain or even a central nerve cord, for example. Instead, they have only a loose net of nerves. These dramatic differences had led some researchers to hypothesize that bilaterians and cnidarians had evolved eyes independently. In other words, the common ancestor of cnidarians and bilaterians did not have eyes.

In recent years, scientists have sequenced the entire genomes of two species of cnidarians, the starlet sea anemone (Nematostella vectensis) and a freshwater hydra (Hydra magnipapillata). Scanning their genomes, Oakley and his colleagues discovered that both cnidarian species have genes for opsins—the first time opsin genes had ever been found in a nonbilaterian. The scientists carried out experiments on some of these genes and discovered that they are expressed in the sensory neurons of the cnidarians. Oakley’s research suggests that, as he had suspected, opsins evolved much earlier than bilaterians.

How Opsins Evolved

With discoveries from scientists such as Oakley and Arendt, we can start to get a sense of how opsins evolved. Opsins belong to a family of proteins called G-protein coupled receptors (GPCRs). They’re also known as serpentine proteins, for the way they snake in and out of cell membranes. Serpentine proteins relay many different kinds of signals in the cells of eukaryotes. Yeast cells use them to detect odorlike molecules called pheromones released by other yeast cells. Early in the evolution of animals, a serpentine protein mutated so that it could pick up a new kind of signal: light.

At some point, the original opsin gene was duplicated. The two kinds of opsins may have carried out different tasks. One may have been sensitive to a certain wavelength of light, for example, while the other tracked the cycle of night and day. When cnidarians and bilaterians diverged, perhaps 620 million years ago, they each inherited both kinds of opsins. In each lineage, the opsins were further duplicated and evolved into new forms. And thus, from a single opsin early in the history of animals, a diversity of light-sensing molecules has evolved.

The Crystalline Connection

The earliest eyes were probably just simple eyespots that could only tell the difference between light and dark. Only later did some animals evolve spherical eyes that could focus light into images. Crucial to these image-forming eyes was the evolution of lenses that could focus light. Lenses are made of remarkable molecules called crystallins, which are among the most specialized proteins in the body. They are transparent, and yet can alter the path of incoming light so as to focus an image on the retina. Crystallins are also the most stable proteins in the body, keeping their structure for decades. (Cataracts are caused by crystallins clumping late in life.)

It turns out that crystallins also evolved from recruited genes. All vertebrates, for example, have crystallins in their lenses known as α-crystallins. They started out not as light-focusing molecules, however, but as a kind of first aid for cells. When cells get hot, their proteins lose their shape. Cells use so-called heat-shock proteins to cradle overheated proteins so that they can still carry out their jobs.

Scientists have found that α-crystallins not only serve to focus light in the eye, but also act as heat-shock proteins in other parts of the body. This evidence indicates that in an early vertebrate, a mutation caused α-crystallins to be produced on the surface of the eyes. The protein turned out to have the right optical properties for bending light. Later mutations fine-tuned α-crystallins, making them better at their new job.

The Evolution of the Vertebrate Eye

Vertebrates also produce other crystallins in their eyes, and some crystallins are limited to only certain groups, such as birds or lizards. And invertebrates with eyes, such as insects and squid, make crystallins of their own. Scientists are gradually discovering the origins of all these crystallins. It turns out that many different kinds of proteins have been recruited, and they all proved to be good for bending light.

In 2007, Trevor Lamb and his colleagues at the Australian National University synthesized these studies and many others to produce a detailed hypothesis about the evolution of the vertebrate eye. The forerunners of vertebrates produced light-sensitive eyespots on their brains that were packed with photoreceptors carrying c-opsins. These light-sensitive regions ballooned out to either side of the head, and later evolved an inward folding to form a cup.

Early vertebrates could then do more than merely detect light: they could get clues about where the light was coming from. The ancestors of hagfish branched off at this stage of vertebrate eye evolution, and today their eyes offer some clues to what the eyes of our own early ancestors would have looked like.

The Evolution Doesn’t Stop

After hagfish diverged from the other vertebrates, Lamb and his colleagues argue, a thin patch of tissue evolved on the surface of the eye. Light could pass through the patch, and crystallins were recruited into it, leading to the evolution of a lens. At first the lens probably only focused light crudely. But even a crude image was better than none. A predator could follow the fuzzy outline of its prey, and its prey could flee at the fuzzy sight of its attackers. Mutations that improved the focusing power of the lens were favored by natural selection, leading to the evolution of a spherical eye that could produce a crisp image.

The evolution of the vertebrate eye did not stop there. Some evolved the ability to see in the ultraviolet. Some species of fish evolved double lenses, which allowed them to see above and below the water’s surface at the same time. Vertebrates adapted to seeing at night and in the harsh light of the desert. Salamanders crept into caves and ended up with tiny vestiges of eyes covered over by skin. But all those vertebrate eyes were variations on the same basic theme established half a billion years ago.


About the Author

Carl Zimmer is a lecturer at Yale University, where he teaches writing about science and the environment. He is also the first Visiting Scholar at the Science, Health, and Environment Reporting Program at New York University’s Arthur L. Carter Journalism Institute.

Zimmer’s work has been anthologized in both The Best American Science Writing series and The Best American Science and Nature Writing series. He has won numerous fellowships, honors, and awards, including the 2007 National Academies Science Communication Award for “his diverse and consistently interesting coverage of evolution and unexpected biology.”

His books include Soul Made Flesh, a history of the brain; Evolution: The Triumph of an Idea; At the Water’s Edge, a book about major transitions in the history of life; The Smithsonian Intimate Guide to Human Origins; and Parasite Rex, which the Los Angeles Times described as “a book capable of changing how we see the world.”

His newest book, The Tangled Bank: An Introduction to Evolution, will be published this fall to coincide with the 150th anniversary of the publication of The Origin of Species.

What Can Science Tell Us About Death?

Sam Parnia, a leading expert in resuscitation science research, explains how death is not an absolute, but a process, and what happens when patients experience death.

Sam Parnia MD, PhD

Published September 30, 2019

By Robert Birchard

Across time and cultures, people have been conditioned to view death as an endpoint to the experience of life. However, advances in resuscitation science and critical care medicine have challenged assumptions about the finality of death. Sam Parnia, Director of the Critical Care & Resuscitation Research Division of Pulmonary, Critical Care & Sleep Medicine at New York University Langone Medical Center, recently spoke to The New York Academy of Sciences about his resuscitation science research. Dr. Parnia’s work illuminates how death is not an absolute, but a process, and what happens when patients experience death — sharing insights from his research in his own words:

What is death?

Death occurs when the heart stops beating. We call this death by cardiopulmonary criteria and it is how death is defined for more than 95 percent of people. A person stops breathing and their brain shuts down, causing all life processes to cease. More recently with the birth of modern intensive care medicine and the ability to artificially keep people’s hearts beating, doctors like myself can keep a patient’s heart beating longer.

In cases where people have suffered irreversible brain damage, the brain has died but the person’s heart is still beating, so legally they are declared dead based upon irreversible brain death, or death by brain-death criteria. This happens in a small fraction of the cases where people are declared dead.

For millennia death was considered an irreversible event and nothing could restore life. During the last decade, we’ve realized it’s only after a person has died that the cells inside their body, including the brain, begin their own death process. We used to think that you had five or 10 minutes before brain cells died, from a lack of oxygen, but we now know that’s wrong.

You have hours, if not days, before the brain and other organs in the body are irreversibly damaged after death. Paradoxically, it’s the restoration of oxygen and blood flow into the organs after a person’s heart stops and is then resuscitated that leads to accelerated cell death. So, this accelerated secondary injury process is what we need to combat in medicine now.

Why is the term “near-death” experience inaccurate?

The problem with this term is that it is inconsistent with what people actually experience. It is undefined and imprecise. If I said ‘an airplane was involved in a near-miss incident,’ what does that mean? Did another plane come within an inch, or were they a mile apart? The term is ill-defined, and it doesn’t take into consideration the fact that a lot of people have biologically died and returned.

What is a death experience?

I call it an “experience of death” because that’s what it is. People report a unique cognitive experience in relation to death. They may have a perception of seeing their body and the doctors and nurses trying to revive them, yet feel very peaceful while observing. Some report a realization that they may have actually died.

Later they develop a perception or a sensation of being pulled towards a type of destination. During the experience, they review their life from birth until death, and interestingly this review is based upon their humanity.

They don’t review their lives based on what people strive for, like a career, promotions, or an amazing vacation. Their perspective is focused on their humanity. They notice incidents where they lacked dignity, acted inappropriately towards others, or conversely, acted with humanity and kindness.

They re-experience and relive these moments, but also, what’s fascinating, which sort of blows me away because I can’t really explain it, is they also describe these experiences from the other person’s perspective.

If they caused pain, they experience the same pain that other person felt, even if they didn’t realize it at the time. They actually judge themselves. They suddenly realize why their actions were good or bad, and many claim to see the downstream consequences of their actions.

How do studies of cardiac arrest inform the debate on the nature of consciousness?

Traditionally, researchers had proposed that mind or consciousness – our self – is produced from organized brain activity. However, nobody has ever been able to show how brain cells, which produce proteins, can generate something so different, i.e., thoughts or consciousness. Interestingly, there has never been a plausible biological mechanism proposed to account for this.

Recently, some researchers have started to raise the possibility that your mind, your consciousness, your psyche, the thing that makes you, may not be produced by the brain; the brain might be acting more like an intermediary. It's not a brand-new idea. These researchers argue that we have no evidence to show how brain cells, or connections of brain cells, could produce your thoughts, mind or consciousness.

The fact that people seem to have full consciousness, with lucid, well-structured thought processes and memory formation, from a time when their brains are highly dysfunctional or even nonfunctional, is perplexing and paradoxical.

I do agree that this raises the possibility that the entity we call the mind or consciousness may not be produced by the brain. It’s certainly possible that maybe there’s another layer of reality that we haven’t yet discovered that’s essentially beyond what we know of the brain, and which determines our reality.

So, I believe it is possible for consciousness to be an as-yet-undiscovered scientific entity that is not necessarily produced by synaptic activity in the brain.

What Are PROTACs and How Do They Treat Diseases?

“Optimized degrader molecules will have fast rates of degradation and relatively short exposure with therapeutic doses that result in complete elimination of the target protein, which can result in a more durable and deeper effect.”

Published July 23, 2019

By Robert Birchard

Eric Fischer, PhD

Around 80 percent of disease-causing proteins, including key drivers of many cancers and of serious neurological conditions like Alzheimer's disease, cannot be targeted by currently available therapeutics. These so-called "undruggable" proteins lack the specific surface features that treatments such as small molecule inhibitors or antibodies need in order to bind to them and modulate their function.

However, an alternative therapeutic strategy known as targeted protein degradation has shown the potential to reach these "undruggable" proteins. Utilizing small molecules known as PROTACs, this strategy harnesses the cell's waste disposal system to promote the destruction of disease-causing proteins. Dr. Eric Fischer, Assistant Professor of Biological Chemistry and Molecular Pharmacology at Harvard Medical School, recently sat down with us to help create this primer on PROTACs and their potential implications for treating disease.

What are PROTACs?

PROteolysis TArgeting Chimeras, or PROTACs for short, are two separate molecules bound together to form a single two-headed molecule. One end binds to a ubiquitin ligase, while the other end binds to the "undruggable" protein targeted by pharmacologists. In illustrations, PROTACs are often depicted as dumbbells, but it may be more helpful to think of them as flexible harnesses.

How do PROTACs work?

PROTACs are designed to take advantage of the cell's waste disposal system, the proteasome, which removes unneeded or damaged proteins and recycles their building blocks to make new proteins. The proteasome plays critical roles in cell growth, management of cellular stress, and the immune system. One end of a PROTAC binds to the targeted protein, while the other end binds to a ubiquitin ligase, which then marks the targeted protein for destruction. This tag lets the cell's proteasome know that the specific protein can be destroyed. In this way, the body's regularly occurring machinery is co-opted to destroy disease-causing proteins.
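To make this tag-and-destroy sequence concrete, the toy model below walks through it in Python. It is a sketch for illustration only, not drawn from Dr. Fischer's work: the protein names, the tagging threshold, and the binding loop are all invented.

```python
# Purely illustrative toy model of the PROTAC cycle: the PROTAC bridges a
# target protein and a ubiquitin ligase, the ligase adds ubiquitin tags, and
# the proteasome destroys any protein carrying enough tags. All names and
# numbers are hypothetical.
from dataclasses import dataclass

UBIQUITIN_THRESHOLD = 4  # hypothetical tag count that marks a protein for destruction

@dataclass
class Protein:
    name: str
    ubiquitin_tags: int = 0
    destroyed: bool = False

@dataclass
class Protac:
    target_name: str  # the "undruggable" protein this degrader recognizes

    def bridges(self, protein: Protein) -> bool:
        """One end binds the target; the other end recruits the ligase."""
        return protein.name == self.target_name and not protein.destroyed

def ligase_tag(protein: Protein) -> None:
    """The recruited ubiquitin ligase attaches one ubiquitin tag."""
    protein.ubiquitin_tags += 1

def proteasome_sweep(cell: list[Protein]) -> None:
    """The proteasome destroys proteins carrying enough ubiquitin tags."""
    for protein in cell:
        if protein.ubiquitin_tags >= UBIQUITIN_THRESHOLD:
            protein.destroyed = True

cell = [Protein("disease_driver"), Protein("healthy_protein")]
degrader = Protac(target_name="disease_driver")

# "Hit and run": the same degrader molecule keeps tagging its target.
for _ in range(UBIQUITIN_THRESHOLD):
    for protein in cell:
        if degrader.bridges(protein):
            ligase_tag(protein)
proteasome_sweep(cell)

for protein in cell:
    print(protein.name, "destroyed" if protein.destroyed else "intact")
```

Running the sketch prints "disease_driver destroyed" and "healthy_protein intact," mirroring the selectivity described above: only the protein the PROTAC can harness gets tagged and degraded.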

What makes PROTACs so unique?

Most therapies fall into one of two categories: small molecule inhibitors or therapeutic antibodies/biologics. "PROTACs are small molecules and as such are not restricted to targeting surface proteins; however, in contrast to traditional small molecule inhibitors, PROTACs are fundamentally different," explained Dr. Fischer. "While inhibitors need to achieve an almost perfect degree of target engagement over an extended period of time to exert their pharmacologic effect, PROTACs follow more of a hit-and-run strategy."

“Optimized degrader molecules will have fast rates of degradation and relatively short exposure with therapeutic doses that result in complete elimination of the target protein, which can result in a more durable and deeper effect,” he explained. “More importantly, however, small molecule degraders completely eliminate the disease-causing protein and as such can target the non-catalytic activity of enzymes but also scaffolding proteins, and other non-enzymatic targets.”

When will PROTACs be more widely available?

While researchers have demonstrated the potential of PROTACs in the lab, the first clinical trials are just opening. Still, Dr. Fischer is optimistic: "The technology has rapidly spread, and we can expect to see many more programs entering clinical development. Due to the pioneering work of a growing academic community spearheading this field, the concepts underlying protein degradation are largely public domain and widely available."

What is the future of PROTACs research?

“I believe the field of targeted protein degradation is here to stay and will significantly expand our repertoire of therapeutic modalities,” said Dr. Fischer. “I also believe it is still in its infancy and many challenges lie ahead of us to broadly deploy this to the more challenging targets.” PROTACs could potentially prove the impossible is possible by allowing scientists to destroy disease-causing proteins that were previously considered beyond their reach.

Also read: Cancer Metabolism and Signaling in the Tumor Microenvironment

Lockheed Martin Challenge Inspires Innovative Ideas

A shot of a pilot in a cockpit hovering above planet Earth.

The winners of Lockheed Martin’s 2019 Challenge are developing innovative ways to advance national defense.

Published May 15, 2019

By Marie Gentile, Robert Birchard, and Mandy Carr

Big ideas come from the unlikeliest sources. Their only common attributes are the passion and ingenuity of their inventors. Recently, Lockheed Martin sponsored the “Disruptive Ideas for Aerospace and Security” Challenge to find the next big idea. Meet the winners who hope to transform the future with their innovative solutions.

Grand Prize Winner: IRIS

Bryan Knouse

The ability to make decisions can be compromised by cognitive overload, especially during stressful situations. To counter this, Bryan Knouse envisioned IRIS, a voice-controlled interface for Patriot Missile Systems that would help operators make better decisions.

“IRIS leverages software automation and speech technology in high pressure scenarios to reduce human cognitive overload and enable the operator to better focus on mission-critical decisions,” explained Mr. Knouse. “I came at this project thinking about using AI and software interfaces to make sophisticated experiences simpler and safer.”

A mechanical engineer by training, but an AI software and programming tinkerer by habit, Mr. Knouse believes voice interfaces present the greatest opportunity to make complicated and sophisticated processes simpler. In the aerospace and security field, simplicity is valued because complexity can cause poor decision-making, which costs lives.

“Artificial intelligence excels at not getting overwhelmed with scales of information. Unlike humans, a computer won’t get paranoid, or disturbed, or stressed out after being fed a spreadsheet with millions of rows of data. A computer will process the information.”

“This challenge was an awesome opportunity. Not just because I was able to build a cool project, but also to connect with a company that I’d otherwise not really have an opportunity to interface with,” Mr. Knouse concluded. “These kinds of technology competitions are a great way for the private sector and established companies to interface with innovators.”

Second Place: Improving Urban Situational Awareness

Dan Cornett

Ninety-four percent of vehicular accidents in the United States are caused by driver error, but what if assistive technologies could help drivers focus? This is the premise advanced by Garrett Colby and Dan Cornett, two solutions-oriented engineering students from the University of North Carolina at Charlotte.

While no technology can remove modern day distractions, a modular sensor array could collect data about roadside conditions and unobtrusively alert the driver to potential hazards.

The pair plan to combine neural networks, radar, LiDAR, and a 360-degree camera to continuously collect information on roadside conditions. The weakness of one sensor could be compensated for by the strength of another, while the data provided by each could be cross-checked against the others to ensure accuracy.
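As a rough illustration of that cross-checking idea, here is a minimal Python sketch, not the students' actual design: each sensor reports an estimated distance to a hazard, estimates that disagree sharply with the group consensus are discarded, and the rest are combined using confidence weights. All sensor names, weights, and readings are invented.

```python
# Hypothetical sensor-fusion sketch: outlier rejection plus weighted averaging.
from statistics import median

def fuse_distances(readings: dict[str, float], weights: dict[str, float],
                   tolerance_m: float = 2.0) -> float:
    """Fuse per-sensor distance estimates (in meters) into a single value."""
    consensus = median(readings.values())
    # Cross-check: drop any sensor that disagrees with the consensus, so the
    # strength of one sensor compensates for the weakness of another.
    trusted = {name: dist for name, dist in readings.items()
               if abs(dist - consensus) <= tolerance_m}
    total_weight = sum(weights[name] for name in trusted)
    return sum(weights[name] * dist for name, dist in trusted.items()) / total_weight

readings = {"radar": 41.8, "lidar": 42.1, "camera_360": 55.0}  # camera fooled by glare
weights = {"radar": 0.4, "lidar": 0.5, "camera_360": 0.1}
print(f"fused distance: {fuse_distances(readings, weights):.1f} m")
```

Here the camera's glare-corrupted estimate is voted out by radar and LiDAR, so the fused distance stays near 42 meters.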

Garrett Colby

“Challenges like this are a good illustration for potential engineers that anyone can make a difference,” said Mr. Colby. “This project was different in that the sky was the limit, being a conceptual project you got to really think outside the box,” added Mr. Cornett.

“Challenges like this give young engineers a place to demonstrate their creativity.”

Third Place: Augmented Superluminal Communication

The sense of isolation experienced during space flight could contribute to the degradation of mission performance. Gabriel Bolgenhagen Schöninger, a physics student at the Technical University of Berlin with a background in communications technology, believes his proposal could help lonely astronauts focus. His solution combines wearable technologies, biometric sensors and augmented reality to simulate conversation.

Gabriel Bolgenhagen Schöninger

The idea came from Mr. Bolgenhagen Schöninger’s own experience with the rigors of living far from his native Brazil.

“My intention was to create an environment where you can simulate a conversation by collecting communications data and then emulating this data in a virtual environment,” he explained.

In advance of space travel, information could be condensed and inserted into intelligent communications platforms. The compressed communications data could then be “reanimated” to respond to the astronaut. While he developed this idea for long distance travel, Mr. Bolgenhagen Schöninger believes it could have implications in the field of education.

"This challenge creates a great opportunity for young people to get feedback on their ideas," he concluded. "It can help motivate young engineers to display their ideas while developing more confidence in those ideas."

Learn more about our challenges

Citizen Science in the Digital Age: Eagle Eyes

Science is a tool for combatting disinformation and making informed decisions.

Published May 1, 2019

By Robert Birchard

The term “citizen science” first entered the Oxford English Dictionary in 2014. It describes a long-standing tradition of collaboration between professional and amateur scientists. Perhaps no field is as closely associated with citizen science as astronomy, where amateur stargazers continue to sweep the skies for unidentified heavenly bodies. Today, with the advent of smartphone technology, even more fields of scientific inquiry are open to the curious amateur.

Eagle Eyes

Ana Prieto, GLOBE program volunteer

One of the oldest continuing citizen science projects is the National Audubon Society's annual Christmas Bird Count (CBC). The CBC was founded in 1900 by Frank Chapman, an ornithologist at the American Museum of Natural History. Conceived as an alternative to traditional hunts, the first CBC included 27 participants at 25 count sites across North America. By its 118th count, it had grown to 76,987 participants counting 59,242,067 birds at 2,585 sites across the United States, Canada, Latin America, the Caribbean and the Pacific Islands.

Documentation and verification of CBC counts has been revolutionized by mobile technologies and digital photography.

"If somebody said they saw a scarlet tanager or an eastern kingbird, which are common in the summer, but which conventional ornithological wisdom says are always in South America during the CBC, those sightings used to be rejected," explained Geoffrey LeBaron, the longtime director of the Audubon Society Christmas Bird Count.

Observing the Past, Predicting the Future

“Everything today is 100 percent electronic and no longer published in print. All results are posted online as soon as a compiler checks off that their count is completed. The data then becomes viewable to the public. Once a region is completed, we have a team of expert reviewers that go over every single count. If they feel there’s something that needs documentation, they’ll be in touch with the compiler, who will get in touch with the observer.”

Scientists use the collected CBC data to observe long-term trends and to predict the future effects of climate change on species at risk.

“When people are analyzing CBC data, they’re not usually looking at year to year variations, because there is too much variability caused by weather and other factors,” explained Mr. LeBaron. “We looked at the center of abundance of the most common and widespread species and how they varied from the 1960s to the present. We found that a lot of species have moved the center of abundance of their range as much as 200 miles northward and inward away from the coasts.”
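To show what a "center of abundance" calculation can look like, here is a hedged sketch in Python. It is not Audubon's actual methodology; the coordinates and counts are fabricated purely to illustrate the count-weighted centroid idea.

```python
# Illustrative "center of abundance" sketch: for each period, compute the
# count-weighted centroid of sighting locations, then measure how far the
# centroid shifted. All coordinates and counts are made up.

def center_of_abundance(sightings: list[tuple[float, float, int]]) -> tuple[float, float]:
    """Count-weighted mean latitude/longitude of (lat, lon, count) records."""
    total = sum(count for _, _, count in sightings)
    lat = sum(lat_ * count for lat_, _, count in sightings) / total
    lon = sum(lon_ * count for _, lon_, count in sightings) / total
    return lat, lon

MILES_PER_DEGREE_LAT = 69.0  # rough conversion, fine for an illustration

counts_1960s = [(35.2, -80.8, 120), (36.1, -79.9, 80)]   # fictitious 1960s records
counts_today = [(38.0, -81.5, 150), (38.9, -80.2, 90)]   # fictitious recent records

lat_then, _ = center_of_abundance(counts_1960s)
lat_now, _ = center_of_abundance(counts_today)
shift_north = (lat_now - lat_then) * MILES_PER_DEGREE_LAT
print(f"center of abundance moved about {shift_north:.0f} miles northward")
```

With these made-up numbers, the centroid moves roughly 190 miles north between the two periods, the same order of shift Mr. LeBaron describes.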

Keeping Citizens in Science

Citizen science requires the enthusiastic participation of the public, but how can researchers keep the public engaged? This question was recently considered in a paper from the lab of Maurizio Porfiri, PhD, of the Dynamical Systems Laboratory at New York University, titled "Bring them aboard: Rewarding participation in technology-mediated citizen science projects."

The team hypothesized that monetary rewards and online or social media acknowledgments would increase engagement of participants.

“People contribute to citizen science projects for a variety of different reasons,” said Jeffrey Laut, PhD, a postdoctoral researcher in Dr. Porfiri’s lab. “If you just want to contribute to help out a project, and then you’re suddenly being paid for it, that might undermine the initial motivation.”

“For example, one of the things we point out in the paper is that people donate blood for the sake of helping out another human,” explained Dr. Laut. “Another study found that if you start paying people to donate blood, it might decrease the motivation to donate blood.”

Proper Rewards for Participation

If a citizen science project is suffering from low levels of participation, researchers need to choose the level of reward carefully.

“I think with citizen science projects the intrinsic motivation is to contribute to a science project and wanting to further scientific knowledge,” said Dr. Laut. “If you’re designing a citizen science project, it would be helpful to consider incentives to enhance participation and also be careful on the choice of level of reward for participants.”

The technology used and the scope of information collected may have changed, but the citizen scientist's role remains as important as ever.

“It is important that citizens understand the world in which they live and are capable of making informed decisions,” said Ms. Prieto. “It’s also important that all people understand science, especially to combat disinformation. From this point of view citizen science is vital and a needed contributor to the greater field of science.”


Learn more about citizen science:

How to Improve Your Presentation Skills

A woman gives a presentation.
Jayne Latz

A professional communication coach provides guidance on how you can improve your communication skills.

Published May 1, 2019

By Jayne Latz

You have a major presentation coming up, so you work on the perfect PowerPoint and practice reading your notes. But on the big day, it feels like your presentation falls flat. Sound familiar?

If public speaking gives you anxiety, you’re not alone. Comedian Jerry Seinfeld once said that “According to most studies, people’s number one fear is public speaking. Number two is death … This means to the average person, if you go to a funeral, you’re better off in the casket than doing the eulogy.”

Unfortunately, such anxiety can interfere with your delivery. No matter how strong your presentation is, if you're unable to speak in a clear, confident manner, your message will suffer. In fact, research suggests that how you say something can matter as much as, or more than, what you say!

Learning to master the art of public speaking is crucial to professional success. Whether you're giving a sales presentation, pitching an idea to a committee, or proposing a concept to a potential funder, the ability to speak in an engaging and convincing manner is important. You may only get one chance to make your case, so a polished and dynamic presentation can be a game-changer.

You should develop a style that works best for you, but here are 10 tips that may help you improve your overall presentation skills:

1. Open strong.

Enter with a confident stride and take a moment to make eye contact with the audience. Smile, and take a deep breath. This will help center you and allow you to begin your presentation in a strong, confident way.

2. Own your space.

Be mindful of body language. Don’t fold your arms, stand with your hands on your hips or put your hands in your pockets. Incorporate natural gestures into your speech — but be careful of “talking” with your hands. You will appear more relaxed, confident and in control.

3. Connect with your audience.

Looking into a sea of faces can be intimidating. Focus on connecting with one audience member at a time by making eye contact with individuals rather than just scanning the crowd. If you have a friend or colleague in the audience, focus on that person to start.

4. Tone matters.

When giving a presentation, your vocal quality can make all the difference. Does your tone sound positive or negative? Confident or tentative? The energy in your voice tells your listener a lot about whether what you have to say has value.

5. Be engaging.

Keep your audience involved and invested in your presentation to drive your message home. Ask questions that require a show of hands, have people stand up, or include moments where audience members need to interact.

6. Use strategic pausing to deliver with impact.

Pauses not only make your speech slower and easier to follow; they can also be a powerful tool for drawing your audience's attention to the parts of your message you most want to highlight.

7. Don’t let your voice “fall into the abyss.”

Be careful not to drop sounds, particularly at the ends of words, or to trail off at the end of a sentence. I refer to this as "falling into the abyss"; when it happens, your audience may miss your most important point.

8. Let your audience know why your message matters.

Understand your audience and how what you have to say will benefit them. Then, spell it out. Let everyone know what they have to gain up front, and you’ll have a more attentive audience throughout your presentation.

9. Tell stories.

Including a story or specific case study in your presentation that relates to your audience’s interests will help them feel more connected to you and your message.

10. Close strong!

Finish with a quote or a point that illustrates the one takeaway you want the audience to remember. Memorize your closing in advance so that you can concentrate on your delivery and nerves won’t get in the way of a strong ending.

Polishing your public speaking skills will help you to gain confidence and increase your professional credibility. Take the time to focus on your speaking style, and make sure your presentation is doing your message justice. Remember: It’s not just what you say, it’s how you say it!

Jayne Latz is an executive communication coach and President and Founder of Corporate Speech Solutions, LLC.

Are you looking for an expert to present at an upcoming event? Check out our Speaker’s Bureau to find the Academy expert that fits your needs.