
The Structural Design Of The Twin Towers


One of the structural engineers of the Twin Towers reflects on the destruction of the 9/11 terrorist attacks.

Published January 1, 2002

By Linda Hotchkiss Mehta

The Twin Towers circa March 2001. Image courtesy Jeffmock, GNU Free Documentation License, via Wikimedia Commons. No changes were made to the original work.

Although he lost many friends on September 11, Academy Member Leslie Robertson is thankful to be among the fortunate New Yorkers who did not lose family members or coworkers, as did thousands of others. Still, the shock and grief he felt during and after the attacks might be somewhat akin to the incomparable horror of suddenly losing two dear children.

For Robertson, now Director of Design at Leslie E. Robertson Associates, Consulting Structural Engineers, the World Trade Center has been a central part of his professional life –– the defining project that launched a distinguished career –– since the early 1960s. Together with then-partner John Skilling and architect Minoru Yamasaki, Robertson and his team conceived and helped develop the structural designs for five of the seven buildings in the WTC complex, including the 110-story Twin Towers.

An active member of the Academy’s Human Rights of Scientists Committee, Robertson was in Hong Kong on September 11 discussing a new skyscraper when he first received word that a plane had hit the WTC’s north tower. Everyone believed that it had been a helicopter or other small aircraft. He then was able to reach his wife, Saw-Teen See, an Academy Member and engineer in her own right, who reported the seriousness of the event and that the second tower had been struck. He rushed to his room to prepare for a return to New York.

The Structural Strength of the Towers

After turning on the TV and registering the shock of the dreaded images of death and destruction, Robertson said his memory of the following hours is somewhat blurred. “You wanted to reach out and stop it,” he recalled, “but there was nothing you could do.”

Although he’s still plagued with thoughts about “what we might have done differently,” Robertson acknowledged in an interview that –– as many Members and other colleagues have told him –– the structural strength of the towers allowed them to stand long enough for perhaps 25,000 occupants to escape after each of the Boeing 767 aircraft crashed into them. The north tower was struck between the 94th and 99th floors at 8:45 a.m. and did not collapse until 10:28 a.m.; the south tower, which was impacted at a lower level, between the 78th and 84th floors, was the first to collapse, at 9:59 a.m., 53 minutes after the second aircraft struck.

“When I started work on this project, the tallest building I’d worked on had only 22 floors,” Robertson said. “The WTC engineering was a first of a new kind of high-rise building.” Aware of the military aircraft that hit the Empire State Building in a dense fog in 1945, Robertson said, “I thought we should consider the structural integrity that would be needed to sustain the impact of a (Boeing) 707 –– the largest aircraft at that time.”

Achieving Structural Strength

Leslie Robertson

Robertson added, “We didn’t have suicidal terrorists in mind.” Rather, he was considering an accident: a 707 flying at low speed, most likely lost in a dense fog. To achieve the structural strength, Robertson and his team designed the Twin Towers as steel boxes around hollow steel cores. An unusually large number of rigid, load-bearing columns of hollow-tube steel –– each column only 14 inches wide and set just 40 inches on center –– supported the Towers’ walls.
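A back-of-envelope check shows what that column spacing implies for one tower face. The 40-inch spacing and 14-inch column width come from the text; the roughly 207-foot face width is an assumed figure for illustration, not one stated in the article:

```python
# Rough count of perimeter columns on one face of a tower.
# Spacing (40 in on center) and column width (14 in) are from the text;
# the ~207-foot face width is an assumed value for illustration only.

face_width_in = 207 * 12        # assumed face width, converted to inches
spacing_in = 40                 # column spacing, center to center
column_width_in = 14            # width of each hollow box column

columns_per_face = face_width_in // spacing_in
window_gap_in = spacing_in - column_width_in

print(f"about {columns_per_face} columns per face")
print(f"clear gap between columns: {window_gap_in} inches")
```

The tight 26-inch clear gap between columns is what gave the towers their distinctive narrow windows and made the perimeter walls behave as rigid load-bearing tubes.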

Because the 767s were traveling at high speeds, were somewhat larger than 707s and each carried about 80 tons of jet fuel, Robertson said, “the energy that was absorbed by the impact was not less than three times, and probably as much as six times, greater than the impact we had considered.

“The idea that someone might plant a plastic explosive or the like somewhere in the structure was considered in the design. The structure was redundant –– two-thirds of the columns on one face of each of the two towers were removed (by the aircraft) and yet the buildings were able to stand. But it was the combination of the impact from the speeding aircraft and the burning jet fuel –– both the kinetic and petrochemical energy released –– that ultimately brought them down.”
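Robertson's three-to-six-fold estimate is consistent with simple kinetic-energy scaling, since impact energy grows with the square of speed. The masses and speeds in this sketch are assumed round numbers chosen only to show how such a ratio arises; they are not figures from the original design studies:

```python
# Kinetic energy E = (1/2) m v^2 grows with the square of speed, which is
# why a fast, heavy 767 delivers several times the design-case energy.
# All masses and speeds below are assumed, illustrative values.

def kinetic_energy_j(mass_kg, speed_m_s):
    """Kinetic energy in joules."""
    return 0.5 * mass_kg * speed_m_s ** 2

# Design case: a Boeing 707 off course at moderate speed (assumed values).
e_design = kinetic_energy_j(mass_kg=120_000, speed_m_s=140)

# Actual case: a heavier Boeing 767 at high speed (assumed values).
e_actual = kinetic_energy_j(mass_kg=160_000, speed_m_s=250)

ratio = e_actual / e_design
print(f"impact energy ratio: about {ratio:.1f}x")
```

With these assumptions the ratio lands at roughly four, inside the three-to-six range Robertson cites; the dominant term is the squared speed, not the difference in aircraft mass.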

Impact on Future Design

Robertson said he doubts that the attacks will have a major impact on the structural design of new tall structures. “If you design buildings as fortresses that can withstand anything, then the terrorists will just avoid the fortresses,” he said. “There are plenty of other, smaller buildings that could be targets, and the threat of chemical or biological weapons is an even greater concern.

“Structural engineering is applied science. If a ceiling sags or a lobby is too drafty, life goes on. But structural reliability has been high; building collapses are rare. When they do occur, they’re usually caused by natural events –– wind or water or the ground shaking. I don’t believe we should engineer against the kind of event that happened on September 11, much less the impact and fire that could be created by the much larger Boeing 747 or the new Airbus A380.”

Robertson concluded that the solution lies in confronting the root causes of hatred among mankind: “There’s no end to the number of ways that man can do harm to man.”

Also read: Saving Lives in the Aftermath of Sept 11 Attack

The Ethics of Surveillance Technology


In the wake of the Sept. 11 attacks there’s been more emphasis on protecting public places and tracking terror threats. But what are the ethics of this?

Published January 1, 2002

By Fred Moreno, Dan Van Atta, Jill Stolarik, and Jennifer Tang

Image courtesy of Kate via stock.adobe.com.

Picture yourself living each day under the watchful eye of a network of surveillance cameras that track your movements from place to place. Every time you enter a large building or public space, your facial features are compared with those in a database of known criminals and terrorists. Do you feel safer knowing that someone, somewhere is watching?

This may sound farfetched, or something out of George Orwell’s dystopian novel 1984, but closed circuit TVs (CCTVs) –– like those being widely used in the United Kingdom –– and facial recognition systems are just two of the many well developed technologies the government and private companies are considering to bolster security. The Pentagon issued a request for new security proposals in the wake of the September 11 terrorist attacks and, already, new anti-terrorism laws have expanded the government’s surveillance powers.

Complex technological security measures are “coming on faster than lawmakers and the public can process and evaluate them,” said Susan Hassler, editor-in-chief of the IEEE Spectrum and moderator of a recent media briefing on surveillance technology at The New York Academy of Sciences (the Academy). Sponsored by the Academy and the IEEE Spectrum, the briefing mirrored the debate now being waged in the Congress, the Pentagon, the media –– and on the streets.

A New Manhattan Project

To sift through the myriad security ideas, Michael Vatis, director of the Institute for Security Technology Studies at Dartmouth College, issued “a clarion call for a new Manhattan Project.” Vatis proposed that security experts from industry, academia and government be asked to assess and recommend available surveillance technologies.

“I urge that we develop a mechanism to bring together expertise from across different fields to develop a research and development agenda to counter the threats now facing us,” Vatis said. Such an effort is even more urgent in light of the Pentagon’s recently published security technology “wish list,” he added.

Biometrics, a technology used for analysis and quantification of the physical features of an individual, is already “on the radar” of law enforcement and airport security companies. Facial recognition is one aspect of biometrics that could be deployed in counter-terrorism efforts. “The cornerstone of our defense against crime and terror is our ability to identify and deter those who pose a threat to public safety,” said Joseph Atick, chairman and CEO of the Visionics Corp., a leader in the biometrics field.

Atick said facial recognition systems could be used in airports. As passengers pass through security gates, the systems could capture an image of each face, analyze its features and produce a unique, 84-byte computer code to describe it.
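Visionics never published the details of its encoding, so the following is purely a toy sketch of the general idea: quantizing 84 facial measurements into a fixed 84-byte template that can be compared against a database. Every function and value here is an assumption for illustration:

```python
# Toy sketch only: this is NOT the Visionics algorithm, which was
# proprietary. It shows how 84 facial measurements could be quantized
# into a fixed-size 84-byte code and compared between two faces.

def make_template(features):
    """Quantize 84 normalized measurements (0.0 to 1.0) to one byte each."""
    assert len(features) == 84
    return bytes(min(255, max(0, round(f * 255))) for f in features)

def similarity(a: bytes, b: bytes) -> float:
    """Crude match score: 1.0 for identical templates, 0.0 for opposite."""
    return 1.0 - sum(abs(x - y) for x, y in zip(a, b)) / (255 * len(a))

enrolled = make_template([0.5] * 84)    # template stored in the database
probe = make_template([0.52] * 84)      # template from a live camera image

print(len(enrolled), "bytes")
print(f"match score: {similarity(enrolled, probe):.3f}")
```

The appeal of a fixed-size code is that millions of faces can be stored compactly and compared quickly, which is what makes real-time screening at a security gate conceivable at all.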

Vatis said this technology is an adjunct to security measures already in place such as X-rays, bag checks and metal detectors. Unlike a person scanning a crowd, he said, this technology “delivers security in a non-discriminatory fashion — free of prejudices.”

Increasingly Pervasive and Invasive Surveillance

Barry Steinhardt, associate director of the American Civil Liberties Union, said he was troubled not only by the specter of increasingly pervasive and invasive surveillance technologies, but also by the danger that government and industry leaders could, under pressure to act, invest in technologies that don’t work and instead provide a false sense of security. “As we look at any technology that may be introduced into society, we have to ask: Does it improve security? How much does it threaten our liberties? And do the benefits outweigh the risks?”

While facial recognition systems may or may not ever be implemented widely, we can look across the Atlantic to study the effects of a surveillance technology that’s been adopted with enthusiasm. Over the past decade, Britons have welcomed the installation of CCTVs in public places, work spaces and homes. Estimates are that some 2 million CCTVs are now scattered throughout the country, said Stephen Maybank, of the department of computer science at the University of Reading in the U.K.

The British fervor for CCTV comes from the belief that the cameras deter criminal activity, a contention that some studies support. The London Underground alone is laced with 4,000 cameras, and the sheer numbers of CCTVs pose problems: how does one store all the data and how can one find a particular image amongst all the data that’s stored?
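The scale of that storage problem is easy to see with back-of-envelope arithmetic. The per-camera bitrate and retention period below are assumed values chosen only to illustrate the order of magnitude; only the camera count comes from the text:

```python
# Rough storage estimate for a large CCTV network. The per-camera
# bitrate and retention period are assumed values for illustration.

cameras = 4_000                 # London Underground figure cited above
bitrate_mbps = 1.0              # assumed average per-camera video bitrate
retention_days = 7              # assumed retention period

seconds = retention_days * 24 * 3600
bytes_total = cameras * (bitrate_mbps * 1e6 / 8) * seconds
terabytes = bytes_total / 1e12

print(f"{terabytes:,.0f} TB for {retention_days} days of footage")
```

Even these modest assumptions yield hundreds of terabytes per week for a single transit system, which is why Maybank's second question, finding a particular image in all that data, is the harder one.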

Better and Cheaper Cameras

Improvements are coming in CCTV technology that will further encourage their use, said Maybank. “Cameras are becoming better and cheaper; they will soon work on low power and will be easy to install –– some are reduced to the size of a thumb. Software for people-tracking and behavior recognition also is improving. And large, coordinated camera networks are coming that will enable the analysis and description of people as they move over large areas.”

Closer to home and on a much smaller scale, anecdotal reports about CCTVs point to drawbacks in their use as crime stoppers. Robert Freeman, executive director of the New York State Committee on Open Government, reported that some residents and shopkeepers on the perimeter of New York City’s Washington Square Park believe the installation of CCTVs in the park simply pushed crime to the fringes of the area.

New ideas will continue to emerge on how best to protect ourselves from future threats. Government’s challenge will be to select the best of the alternatives, technologies that pose the least threat to our civil liberties, and to knit them together to form an invisible shield –– without creating a technological version of the Emperor’s new clothes.

Also read: The Ethics of Developing Voice Biometrics

What Caused the ‘Bang’ of the Big Bang?


We are living in “the golden age of cosmology” as scientists and engineers continue to learn more about the universe’s origin that led to us being here today.

Published January 1, 2002

By Fred Moreno, Dan Van Atta, Jill Stolarik, and Jennifer Tang

Image courtesy of sripfoto via stock.adobe.com.

We’re all familiar with the Big Bang theory –– the most widely known model explaining the evolution of the universe. According to this standard model, the universe began some 10 to 15 billion years ago in a hot, dense state where particles were rapidly expanding and cooling. As the universe cooled, matter congealed to form stars, galaxies and clusters of galaxies. Today the universe continues to expand, and at an accelerating rate.

But what sparked the “bang” of the Big Bang? What circumstances existed just prior to this nascent event to trigger the birth of the modern universe? The answers to these questions may lie in another scenario about the origins of the universe –– the inflationary model –– proposed by Alan H. Guth.

Guth, the Victor F. Weisskopf Professor of Physics at the Massachusetts Institute of Technology, described his model at The New York Academy of Sciences (the Academy) in October. The event, “Inflationary Cosmology and the Accelerating Universe,” was jointly hosted by the Academy and the M.I.T. Alumni Club.

An Inflating Cosmos

The notion of an inflating cosmos, which has received substantial support in the last two decades, may explain many of the mysteries of the universe: its enormity, its uniformity, why it began so extraordinarily close to its critical density and why it is considered geometrically “flat.” It even offers a possible explanation for the origin of essentially all matter and energy in the observable universe –– no small feat.

Guth noted that the Big Bang model does not explain the “bang” itself, but rather its aftermath. “Inflation provides a prehistory, a possible explanation for what happened before the Big Bang. Moreover, the same force that was responsible for triggering inflation billions of years ago is still at work, causing our universe to continue to swell in size at a rate faster than ever before.”

According to the inflationary model, the initial matter of the universe could have been a billion times smaller than a single proton. This patch of matter grew exponentially, doubling and redoubling in size every 10⁻³⁷ seconds, but its density remained the same and energy was conserved. At this point, Guth explained, gravity was “turned on its head.” A repulsive gravitational field arose, the opposite of what we know as gravity here on Earth — a force that repelled, rather than attracted, matter. This initial inflationary period was blindingly fast, lasting only a tiny fraction of a second.
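The arithmetic behind that exponential growth takes one line. The doubling time comes from the text; the total duration of the burst is an assumed illustrative value, since the article does not give one:

```python
# Exponential growth during inflation: the patch doubles in size every
# 10**-37 seconds. The total duration below is an assumed value chosen
# only to illustrate the scale of the resulting expansion.

doubling_time = 1e-37      # seconds per doubling (from the text)
duration = 1e-35           # assumed total length of the inflationary burst

doublings = duration / doubling_time
growth_factor = 2 ** doublings

print(f"{doublings:.0f} doublings -> growth by a factor of 2**{doublings:.0f}")
print(f"that is roughly 10**{doublings * 0.30103:.0f}")
```

A hundred doublings in a hundred-billion-billion-billionth of a second already expands the patch by thirty orders of magnitude, which is why inflation can take something far smaller than a proton to cosmic scale almost instantly.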

The repulsive gravitational field was highly unstable, however, and decayed much like a radioactive substance. It then erupted, releasing energy and creating the hot primordial soup of particles that is thought to have existed at the moment of the Big Bang.

Cooled Too Quickly

According to the Big Bang model, the universe cooled too quickly to explain current uniformity, the even distribution of stars and galaxies. The theory of inflation gives us a way to understand it, since the universe during inflation was small enough to distribute its contents uniformly. Cosmic radiation is also remarkably uniform, the same intensity to about 1 part in 100,000. The inflationary theory received a further boost in 1992 when the Cosmic Background Explorer (COBE) found enough tiny variations, or “ripples,” in this uniformity to explain how, despite inflation and overall uniformity, there could still be local distribution of matter into stars, galaxies, and galaxy clusters, interspersed with patches of empty space.

The inflationary model may also explain the geometric “flatness” of the universe, a universe critically balanced between eternal expansion and eventual collapse. At one second after the Big Bang, the ratio of the universe’s mass density to the critical density was apparently very close to one, which would be an inexplicable coincidence without inflation. Guth’s theory shows that unlimited inflation can take any curved surface and make it appear flat, thus providing a general principle for explaining a phenomenon that is at the same time consistent with astronomical observations.

Recent observations of distant supernovae lend further support to Guth’s inflationary model. Astronomers measure changes in the expansion rate of the universe by using type Ia supernova explosions as “standard candles.” By observing that these supernovae appear dimmer — and are therefore farther away — than a decelerating universe would predict, they’ve determined to their great surprise that the rate of expansion in the universe today is actually larger than it was five billion years ago.

Repulsive Gravity

Guth attributes this once again to repulsive gravity. “So the universe today is not slowing down under the influence of gravity, which is what everybody had thought previously,” he said, “but in fact is actually speeding up in its expansion rate.”

Inflation is certainly not the answer to all of the questions about the origins and future of the universe. For one thing, some of the tenets of the model may be at odds with the uncertainty principles of quantum physics. But the coming together of cosmology and particle physics, coupled with new data generated from recording devices such as COBE, give astrophysicists great reason for excitement. Concluded Guth, “We are living in the golden age of cosmology.”

Also read: Can Our Knowledge of Nature Ever Be Complete?

Saving Lives in the Aftermath of Sept 11 Attack


Academy member and medical doctor Robert Lahita didn’t hesitate to use his medical knowledge to help others during this traumatic experience.

Published November 1, 2001

By Fred Moreno, Dan Van Atta, Jill Stolarik, and Jennifer Tang

Image courtesy of VOJTa Herout via stock.adobe.com.

On September 24, in a cheerful ceremony as part of the Academy’s 183rd Annual Meeting, Dr. Robert Lahita received a special award in appreciation of his years of service as a member of the Board of Governors of The New York Academy of Sciences (the Academy).

Less than two weeks earlier, on Tuesday, September 11, Lahita was at center stage of a far different venue — a New Jersey pier across from the smoking ruins of what had been the twin towers of the World Trade Center. What had started as a quiet morning making rounds at St. Vincent’s Hospital in New York’s Greenwich Village, where he is Chief of Rheumatology, became a living nightmare of burned and mangled bodies arriving by tugboat and ferry from the collapsed buildings across the Hudson River.

“As soon as I heard about the attack, I left the hospital and caught a train to Jersey City, where I’m the medical director of the mobile intensive care units of Hudson County and EMS at Jersey City Medical Center,” Lahita said. Most of his equipment, such as burn kits and trauma materials for treating patients, was in his car in New Jersey. “An EMS dispatcher sent me to the Colgate-Palmolive piers, where hundreds of victims were being unloaded by the Coast Guard and other groups. Had I parked that morning in Manhattan, I might have gone directly to the scene and been among the missing,” he observed.

The Walking Wounded

When Lahita arrived in Jersey City, a handful of paramedics and EMS technicians were trying to deal with the wounded. As the only doctor on the scene, Lahita took over and began treating injuries that ranged from open skull fractures and crushed pelvises to broken arms and legs. Many were firefighters and police officers, as well as “the walking wounded” – people temporarily blinded from the billowing smoke and ash.

“It was the most devastating scene I’ve ever seen in my life,” he said. “There was lots of blood and a great deal of emotion. It seemed like Armageddon.”

Because the radio transmitter atop the towers was destroyed, Lahita’s efforts to call for more help were thwarted. He immediately assigned specific tasks to everyone working with him. Chairs with wheels were converted into makeshift stretchers, splints were fashioned out of window blinds and, as other supplies like bandages began dwindling, office workers contributed their first-aid kits.

A Scene of Mass Confusion

Dr. Bob Lahita.

After an hour Lahita was joined by another doctor and more medical personnel began arriving. As the 200 most critical patients were delivered to area hospitals, Port Authority officials asked Lahita to accompany them on a caravan headed to “ground zero” via the Holland Tunnel. There he found a scene of mass confusion, debris, smoke, fire and five inches of smoldering ash.

“I saw dust, papers and scattered personal belongings everywhere,” he said. “Everyone was covered with ash and it was difficult to breathe.” Lahita carried boxes of masks and began distributing them to rescue workers.

A resident of Ridgewood, New Jersey, Lahita later learned that 35 people from his area were among the dead. However, he knows that his efforts helped save an untold number of people. “I work best under pressure, but this was beyond what I’ve ever experienced,” he said. “I’ll never forget it.” Nor will the people whose lives he saved.

Lahita joins other Members and friends of the Academy in expressing their condolences to those who have lost loved ones in the tragedy. “The Academy personifies science,” he said. “This is a sad occasion for all of us, as the World Trade Center was also a magnificent feat of engineering science.”

Lahita is a Fellow of The New York Academy of Sciences and has been a Member since 1979. He chairs the Academy’s Conference Committee, which he joined in 1991. He also has co-organized two major Academy conferences, B Lymphocytes and Autoimmunity and Neuropsychiatric Manifestations of Systemic Lupus Erythematosus (SLE). Since 1994, he has been a Member of the Academy’s Committee on the Annals of the New York Academy of Sciences.

Also read: How Trauma Changes Us: Life after 9/11

An Anthropologist’s Reflections on Margaret Mead


Dr. Constance Sutton reflects on the lasting impact imparted on her by pioneering female anthropologist Margaret Mead.

Published October 1, 2001

By Constance Sutton, PhD

Margaret Mead

Margaret Mead had a profound influence, personally as well as professionally, on the lives of many people. Dr. Constance Sutton, professor of Anthropology at New York University and a fellow of The New York Academy of Sciences, knew Mead for 24 years. Following are some recollections of Mead and her work by Sutton.

A few weeks after my arrival in New York in late 1954 I began working as Mead’s editorial assistant in her turret office in the American Museum of Natural History. (We were working on the manuscripts of New Lives for Old: Cultural Transformation — Manus, 1928–1953 [1956] and Childhood in Contemporary Cultures [1955], co-edited with Martha Wolfenstein.)

I had come with an MA in anthropology from the University of Chicago and uncertainty about how to chart my future. At that time it was thought that women who were married to non-anthropologists and who wanted to have children would not be able to manage the fieldwork necessary for a PhD in cultural anthropology.

It took only a week for Margaret Mead to banish all that! Having said her characteristic “fiddlesticks” to the reasons I offered when she asked why I wasn’t working for my doctorate degree, she arranged a two-year fellowship for me in anthropology at Columbia University, where she was an adjunct professor and where I was to work as her teaching and research assistant. (She was the only woman professor I was to have in my entire undergraduate and graduate education!)

Make the Way

Peppered with advice on how to handle this and that, she had said in effect that it was possible to “have it all.” I did my doctoral research on sugar workers in Barbados and she chaired my dissertation committee. Indeed, she gave me support and encouragement throughout the rest of her life.

Mead helped “make the way” for many of us, male and female. She early erased the line between the personal and the professional, integrating an interest in our lives with an interest in our ideas and the work we were doing. It was for her both a mode of interaction and a method of scientific work and theorizing. Moreover, she was remarkably open to new ideas whatever their source.

Bridging the personal/professional divide was also present in her teaching and writing about “participant observation” — the code word for field-based research methodology. She was concerned with giving it a scientific grounding. In so doing, she prefigured much that has become current in contemporary writing on research methodology in the human sciences. Writing against the narrow, logical-positivist concept of objectivity prevalent in the social sciences of the ’50s and ’60s, she emphasized not a distancing of oneself from the subjects of one’s research, but an active engagement that included observing oneself and one’s reactions as an explicit part of the data.

Research in New Guinea

Constance Sutton, PhD

About her early research in New Guinea in 1932 with Reo Fortune and Gregory Bateson, she wrote: “Our sense of discovery was completely combined with our own personal sense of discovering ourselves.” I referred to this aspect of the research process in my university teaching as recognizing that you are an important datum in your own research.

Today this is called “positioning yourself” in relation to the people you are researching. Those of us who took Margaret Mead’s course in “Field Methods” at Columbia University in the ’50s remember it as the only “hands on” information we received about how to do field research in a systematic way. Most of our training was about what to research, not how to do it.

Margaret Mead has been particularly well known for her belief in the relevance of anthropological knowledge (and social science more generally) to issues of public policy and everyday life. She drew upon the broad sweep of her research and knowledge in addressing both micro-cultural issues, such as nutrition or breastfeeding, and global issues, such as nuclear disarmament—and everything in between.

Mead strongly believed that if we knew how to ask the right questions we could find solutions. Asking the right question meant understanding that the way a question is asked shapes its answer. Given this awareness, she felt that a great deal could be achieved by putting social science knowledge to work on behalf of human welfare and justice. As this involved making social science knowledge widely available, she committed herself to writing in a way that would make what she said accessible to a wide public.

Public Engagement

Mead’s public engagement and what this kind of public engagement means today are important aspects of her contributions to science and one focal point in the series of events at the Academy this Fall celebrating Mead’s centennial. The Academy is an appropriate site for re-examining Mead’s legacy. She had been active at the Academy as vice president in 1966–72, as a Life Governor of the Board, as a member of its Committee on Science and Public Policy in 1974–75, and as a participant in a number of its conferences.

I want to underscore an important point about this celebration. It emphasizes one of Mead’s chief interests, namely the changing nature of the cross-generational transmission of knowledge and culture (see her Continuities in Cultural Evolution, 1964). In examining the ways Mead prefigured the kinds of knowledge and the approaches to knowledge that remain of concern to us today, we will be addressing the key issues of how knowledge is communicated across generations.

Also read: Celebrating Girls and Women in Science

Environmentalism in the K–12 Science Classroom


Advocacy or science? A recent forum sponsored by The New York Academy of Sciences emphasizes challenges teachers face when teaching environmental science.

Published October 1, 2001

By Fred Moreno, Jill Stolarik, and Jennifer Tang

Educating young people about global warming, biodiversity, the importance of conservation and other matters has become a major issue in K–12 education. Students are taught sensitivity to the natural environment, the potential impact of human activities and the value of conservation. However, ecological science is difficult and complex, and many questions remain open on how we might best understand the diverse factors—geological, biological, economic, societal—involved in natural systems and man-nature interactions.

Some fear that science education is being shortchanged in favor of advocacy, with the promotion of specific policies or practices (e.g., recycling and composting) substituting for a deeper education in the sciences that promotes scientific literacy. In the wake of studies such as the Third International Mathematics and Science Study (TIMSS), which show that America’s high school seniors’ math and science skills are superior only to those of their peers in Cyprus and South Africa, some educators and scientists are concerned that environmental education is yet another field in which students are not learning enough science.

Advocacy or Science?

Is there a way to bring environmental issues into the science classroom while maintaining a strong focus on the underlying science? How does learning ecological science relate to traditional biology, chemistry, and physics?

These questions and more prompted the NYC Science EduNetWork and the Science Education Section of The New York Academy of Sciences (the Academy) to sponsor a forum entitled “Environmentalism in the K–12 Science Classroom: Advocacy or Science?” Featured panelists were: Dr. Paul R. Gross, professor emeritus of biology at the University of Virginia and coauthor of Higher Superstition: The Academic Left and Its Quarrels with Science; Dr. William F. Schuster, executive director of the Black Rock Forest Consortium; and Mr. Don S. Cook, director of the Tiorati Workshop for Environmental Learning at New York’s Bank Street College.

Environmental Education

No one disputes that K-12 education should offer courses on the environment. The Kyoto Protocol on Global Warming, the energy blackouts in California and other high-profile events attest to the importance of understanding environmental issues. Currently, there are more registered specialists in environmental education in American public schools (26,000) than there are in physical science. Most state K–12 science frameworks and science standards documents place some major emphasis upon environmental science.

Environmental education often covers a wide range of areas including: the workings of ecosystems and threats to ecosystem viability; pollution prevention; conservation; waste and recycling; human health; the economics of electric power grids; and the thermodynamics of planetary atmospheres. However, while some stress the importance of teaching environmental stewardship, others are more concerned that fundamental scientific concepts are being omitted or given less classroom time in environmental education.

Gross espoused the latter view. “The fraction of our population with even minimal comprehension of scientific inquiry and scientific claims is dangerously small and the same holds true, on the whole, for our schoolchildren,” he said. “Are those children, in environmental education, learning the basic science whose classroom and fieldwork time has been preempted by it? From what I have seen and heard, the answer is no.”

Preach Rather Than Teach?

From left: Paul Gross, William Schuster, Don Cook

Gross highlighted two factors affecting the quality of environmental education: the quality of the textbooks being used in schools and the level of teacher preparation in K–12 science education. In some textbooks, he observed, “The dominant tone is one of proud advocacy rather than science.” Although Gross agrees that the existence of serious environmental concerns warrants the inclusion of environmental science in the curriculum, he fears environmentalism in the science classroom may promote an activist mentality in students while failing to teach them the scientific complexity surrounding environmental issues.

He noted that only one in five science teachers at the middle-school level has ever taken a college physical science course. For teachers who have not been adequately prepared to teach science, it may be easier to “preach rather than teach.”

In addition, Gross believes that environmental education should focus on environmental science. He defined environmental science as an applied science grounded in facts, concepts, and techniques from the basic sciences and mathematics. “You cannot have a useful, serious notion of the scientific or even the economic issues of global climate change, historical and current, without a reasonable background in the physics of heat and energy, the elementary thermodynamics of gases, and the elements of geology,” he said.

Environmental Stewardship

While Schuster agrees that advocacy should not replace basic science teaching, he believes environmental literacy should be an integral component of scientific literacy. “From scientific studies, we know we are substantially changing the makeup of our planet’s atmosphere. The quality and availability of water is severely compromised in many areas and human activities are causing one of the biggest episodes of extinction in our planet’s history. These are serious matters and ones that deserve to come under the microscope of scientific research and teaching,” he said.

As executive director of the Black Rock Forest Consortium, an organization that operates a nature preserve 50 miles north of New York City, he has led and overseen outdoor forest experiences for thousands of pre-college students. In his experience, most students enjoy nature field studies and seem to thrive in a classroom “without walls.” He noted that “interest in organisms and their environment often leads not just to knowledge but also to care, respect and even love for these ecosystems. These feelings may naturally engender what is typically considered environmentalism.”

Schuster sees more value in holistic science and nature studies than Gross does, however, and views them as a valid way to introduce K–12 students to the scientific world. “Science education should put an emphasis on an active process of inquiry as opposed to an inert body of information to be memorized,” he added. However, he cautioned that classroom lessons and field experiences complement each other and are both necessary to give students “a well-rounded education that includes scientific and environmental understanding, as well as knowledge about human social systems so that they will have the tools they need to make informed, responsible decisions on the environment.”

Experiential Learning

While Cook agreed with Gross’ assessment that science education in the U.S. needs to be improved, his focus was on making science more accessible to students and on the importance of experiential learning. He believes that students need to engage actively with subject matter in order to understand it. To give students a basis for learning more complex concepts, scientific experiences should begin with phenomena described in everyday language before introducing the terminology used by scientists. “We need to rethink the roles of language and experience in the education of non-scientists,” he said.

Also read: From the Lab to the Classroom

The Economics of Transportation Infrastructure

A bird's eye view of a shipping harbor with thousands of shipping containers.

From jobs to goods, the region’s transportation infrastructure is critical to the economic prosperity not just of the tri-state region, but of the entire country.

Published March 1, 2001

By Veronica Hendrickson, Allison L. C. de Cerreño, Ph.D., and Susan U. Raymond, Ph.D.

Image courtesy of leungchopan via stock.adobe.com.

The Tri-State region represents 19% of the nation’s total value of product shipments annually, and 10% of the nation’s tonnage. But New York and New Jersey are each their own most important trading partner: roughly 49% of New York’s tonnage shipped remains in-state; for New Jersey, that figure is 39%. Facilitating this intrastate and interstate regional trade are 115,000 miles of roads and 5,000 miles of rail lines, which knit the states together.

Annually, the three states originate and ship product worth more than $305 billion entirely within the region. This represents nearly 25% of the value of all goods shipped domestically into and out of the region. That value also represents nearly half a billion tons of goods, or 38% of the tonnage shipped domestically into and out of the region.

Moreover, for both tonnage and value, the top trading partners for all three states, after each other, are two proximate states: Massachusetts and Pennsylvania. Maintaining the efficiency and effectiveness of the region’s transportation system is critical for the smooth and timely flow of goods and people, which in turn is needed for a strong regional economy.

How Many SUVs can You Fit in Your Garage?

New Jersey holds the dubious distinction of having the greatest number of drivers per square mile in the nation. But the big news on the road is not numbers; it is size. Truck registrations in all three states have increased by between 37% and 43% since 1992. And the dominant force behind that growth has been the Sport Utility Vehicle. Between 1992 and 1997, SUV registrations grew by 133% in Connecticut, 138% in New York, and 107% in New Jersey. In Connecticut, there is now one SUV for every nine licensed drivers.

The Port Still Matters, but the Region Faces Increased Competition

From the early 1700s through the mid-twentieth century, the New York–New Jersey Port served as America’s trading center. By 1950, half of all the nation’s trade entered or left via its docks. Container shipping, born at Port Newark, promised to keep the Port in the forefront of trade. But the world changed. Ships became behemoths, with drafts approaching 45 feet, deeper than the Port’s channels at all but high tide.

Population and production shifted to the south and west of the nation. The Pacific Rim became an economic engine, overcoming Europe’s eastward pull. In turn, shipping shifted: south to Norfolk and Miami, and west to Long Beach and Seattle. These ports invested in technology to manage trade growth efficiently. The NY–NJ Port invested as well, in containerized cargo capacity and on-dock rail service. But the competition remained stiff.

The price of not keeping pace with trade opportunity is high for the Tri-State region. The Port now accounts for 166,500 jobs in the 17-county metropolitan area. It generates $23 billion annually in economic activity, and saves the region’s citizens and businesses $750 million per year in transportation costs.

The Economic Impact of Globalization

Globalization, however, is providing the opportunity to reassert the region’s shipping leadership and to increase these economic benefits. Growth in maritime trade could generate an additional 238,000 jobs by 2040, nearly three times current levels. That level of growth would generate another 165,000 indirect jobs. Furthermore, Port job growth extends across skill levels, providing opportunity for management, but also for warehousing, transport, cargo handling, and trucking personnel. Growth at the Port could be an anchor for job diversity.

But investment will be needed to cope with both growth and new transport technologies. For example, by 2020, 65% of all maritime trade will be carried on ships with drafts of more than 40 feet; 30% on ships with drafts exceeding 45 feet. Dredging the Port’s channels to accommodate such size is estimated to cost $3 billion.

Also read: Railroads and Transportation Infrastructure in the Tri-State

Sources

  • United States Department of Transportation, 1998
  • United States Department of Commerce, Bureau of the Census, 1997 Economic Census, “Vehicle Inventory and Use Survey,” December 1998.
  • Port Authority of New York/New Jersey, Strategy Plan for Building and Expanding the Port of New York/New Jersey, “Building a 21st Century Port,” 2000.

Transportation Infrastructure in the Tri-State

An old railroad bridge across a river in a rural setting.

Adequate funding has enabled the tri-state region to develop some of the most robust transportation infrastructure in the nation, but more funding is necessary for the region’s economic prosperity.

Published March 1, 2001

By Veronica Hendrickson, Allison L. C. de Cerreño, Ph.D., and Susan U. Raymond, Ph.D.

Image courtesy of Guy Bryant via stock.adobe.com.

Rail Infrastructure in the Region

TREND: A Shrinking Asset…

The region’s freight system is home to a whopping 5,119 miles of operated track, which transported over 111 million tons of freight in 1999. Nevertheless, while miles of track operated in the nation have declined by 1.6% since 1996 (and increased by 13.6% in Massachusetts!), the region has lost 3.4%.

UPSHOT: …But Growing Need

Nationally, coal accounts for 42% of rail tonnage. In the region, the picture is more diverse, with mixed commercial freight, nonmetallic minerals, chemicals, and food products accounting for over half the region’s tonnage. While freight tonnage has increased by only 0.86% nationally, the region is well ahead at 4.0%; nevertheless, this is still a smaller increase than Massachusetts’ 9.3% growth. Connecticut is doing much better than the rest of the region: it increased its miles of operated track by 8.2%, with a corresponding 4.9% increase in freight tonnage transported.

Jobs and Pay

TREND: A Declining Workforce…

Railroads provide 24,888 Tri-State residents with jobs, a decline of 7.9% since 1996. New Jersey led the way with its 13.6% decrease. Other states have seen smaller decreases (5.6% in California, for example) and some have even seen increases, like Massachusetts at 7.4%.

UPSHOT: …While Wages Rise and Fall

Rail wages in the region totaled about $1.36 billion in 1999. That was down from 1998’s high of $1.48 billion, but still represents a 3.8% increase since 1996. Connecticut’s wages have increased 10.2%, but still trail Massachusetts in terms of both actual dollars and percent increase (20.7%).

Ah, the Commute

TREND: Alive and Well on the 7:16…

The region’s commuter rail system comprises 1,645 miles of track and carries nearly 200 million passengers per year. The Long Island Rail Road holds the dubious distinction of being the busiest commuter railroad in the nation.

UPSHOT: …And It Costs a Pretty Penny

Maintaining and improving rail systems is expensive. The MTA’s Metro-North capital program for 2000–2004 totals $1.3 billion, over half of which is dedicated to rolling stock and track improvements.

Maintaining Transportation Infrastructure

“It’s gonna take money, a whole lotta spendin’ money.” George Harrison was right, even though the subject is rolling stock, not true love. Maintaining and improving the capital infrastructure of the Tri-State region’s transportation system takes money, lots of it. The highway capital programs of the region’s three transportation departments totaled just over $4 billion in 1999.

For New Jersey, the capital budget constituted one-third of its total DOT expenditures; for Connecticut, road capital improvement is 58% of the DOT’s combined balance sheet. With 200,000 customer trips per day, the MTA has even bigger plans: a $14.4 billion capital program lasting through 2004. Of that, $10 billion is reserved for New York City Transit, including nearly $4 billion for subway cars and stations.

Also read: The Economic Importance of Transportation Infrastructure

Sources

  • Connecticut, New Jersey, and New York Departments of Transportation; Association of American Railroads, Policy and Communications Department, State Specific Railroad Data, “Railroads and States,” 1996-1999; Annual Reports of the Long Island Railroad, New Jersey Transit, the Metropolitan Transit Authority, and Metro-North.
  • Construction and state highway expenditures budgets of the Departments of Transportation of NY, NJ, and CT; 2000-2004 Capital Improvement Program of MTA Metro-North Railroad.

Federal R&D Spending in the Tri-State Region

The seal for the United States Federal Reserve System as seen on a $100 bill.

With federal research funders like the National Institutes of Health and the Department of Defense, the tri-state region is well positioned to advance research and development in the coming years.

Published January 1, 2001

By Frank B. Hicks, Ph.D., Allison L. C. de Cerreño, Ph.D., and Susan U. Raymond, Ph.D.

Image courtesy of AlexGo via stock.adobe.com.

Federal funding for research in the Tri-State region (New York, New Jersey, Connecticut) depends on broader trends in Federal budget allocation. While the fiscal 2001 budget shows large increases for research and development (R&D), they come out of a historically shrinking discretionary pot.

In the early 1960s, discretionary spending (that portion of the budget over which Congress has annual control) represented 70% of the total Federal budget. Today it is about one-third. While the overall budget itself has tripled in real terms since 1960, mandated entitlements have increased nearly ten-fold.

Hence, R&D must compete for resources with other societal and economic sectors within a narrowing portion of the overall budget. This fiscal year, science did itself proud. The final fiscal 2001 budget agreed to in December 2000 contains $91 billion in Federal funding for R&D, a 9% increase over the previous year. The big winner was the National Institutes of Health, with a 14.5% increase ($2.5 billion). The Department of Defense also registered just over $2.5 billion in gains, making the NIH and DOD the largest R&D winners in dollar terms.

All of which is good news for the Tri-State Region. Medical research and the continued presence of a strong DOD infrastructure position the Region to attract greater levels of Federal R&D funding in the coming years.

Federal R&D Funds in the Region

The Tri-State region received about $5.2 billion in Federal funding for research and development during 1999. While in New Jersey this represents only 11% of all statewide R&D expenditures, Federal resources make up 24.5% of R&D funding in Connecticut and 19.5% in New York.

In terms of per capita Federal R&D funds, all three states fall far below the national leaders and technology competitors. The highest concentrations of Federal resources are, unsurprisingly, in Maryland and Virginia, home to many Federal agencies. But both California and Massachusetts receive over twice as much Federal R&D funding per capita as any of the states in the region. Only in New Jersey did Federal funding growth rates outpace the national average.

Apart from funding for Federal laboratories in the region, Federal funds flow to academic research, private sector contracts, and cooperative agreements with both industrial and non-profit institutions.

The Academic Pipeline

The region’s academic institutions receive about $1.5 billion in research grants annually from the Federal government. In 1998, this funding represented more than 10,000 individual academic grants. In Connecticut and New York, academic grants represent one-third of all Federal R&D funding; in New Jersey, where industrial contracts play a much more important role, that portion is 13%.

Federal flows for academic research in the region tend to be highly concentrated. Yale University receives 81% of Connecticut’s academic grants. In New Jersey, Rutgers and Princeton together receive 70%; and in New York, which has a larger number of research universities, Columbia and Cornell together still account for 40% of Federal academic research grant funds.

Health and Defense

For academic institutions, health sector research capacity is critical. The Department of Health and Human Services is the source of 50% of the academic research grant funds in New Jersey, 75% in New York, and a whopping 83% in Connecticut.

Except in New York, however, the Department of Defense remains the largest supplier of overall Federal R&D funding, accounting for 61% of New Jersey’s flow and 50% of the flows to Connecticut.

Also read: Federal Lab and Research Funding in the Tri-State Region

Sources

  • Executive Office of the President, Office of Management and Budget; American Association for the Advancement of Science; The Sciences, November/December 2000.
  • NSF Science and Engineering Indicators 2000; “Discovery and Innovation: Federal Research and Development Activities in the Fifty States, District of Columbia and Puerto Rico,” RAND 2000.

The Convergence of Natural and Human Science

A man speaks into a microphone during an Academy event, while two men in the background listen intently.

Leading scientists and scholars ponder the ethical and philosophical dimensions at the intersection of molecular biology and neuroscience.

Published September 1, 2000

By Henry Moss

Stuart A. Kauffman of the Santa Fe Institute presents material from his forthcoming book, Investigations, at the Academy Conference, “Unity of Knowledge: The Convergence of Natural and Human Sciences.” On the far left is Joshua Lederberg, Nobel laureate and President Emeritus of the Rockefeller University. Seated in the middle is Edward O. Wilson of Harvard University.

It was inevitable that the extraordinary progress in molecular biology and neuroscience of the last few decades would rekindle philosophical debates about human nature and the limits of science. Scientists have been mapping the genetic, neuronal, endocrinal, and somatosensory correlates of human behavior, emotions, memory, language, and thinking, and scenarios abound for explaining our sexual, aesthetic, ethical, and religious predispositions in terms of the blind processes of Darwinian selection. The popular press offers a steady diet of stories and books telling us how much of what we do and think relates to our genes and brains.

The New York Academy of Sciences (the Academy) brought some of the world’s most respected scientists and scholars to New York City from June 23–25 for the conference Unity of Knowledge: The Convergence of Natural and Human Science to reflect on all this before an audience of nearly 400, from virtually every discipline and from as far away as Belgium, Chile, and Armenia.

The conference keynote was delivered by Edward O. Wilson of Harvard whose controversial book Consilience has done much to rekindle this debate. Updating decades of work in sociobiology with recent findings in behavioral genetics and cognitive neuroscience, Wilson expressed confidence that modern biological science would soon provide material evidence of sufficient scope and depth to reduce even the most esoteric of human cultural precepts to underlying deterministic mechanisms.

Unity: Perhaps Possible

On the opening panel with Wilson were Stuart Kauffman of the Santa Fe Institute and Joshua Lederberg, Academy Life Governor and one of the founders of modern molecular biology. Providing a counterpoint to Wilson, Kauffman suggested that unity was perhaps possible, but only through a non-reductionist approach, based upon a universe that spontaneously creates wholes from parts, and order from random interactions.

Presenting material from his forthcoming book Investigations, Kauffman proposed that complexity theory can provide adequate definitions of life and intelligence, ones that will hold up in the laboratory. But Lederberg reminded the audience that grand programs, reductionist or holist, are prone to running ahead of the evidence, and that while we have come a long way in modern biology, human culture appears to have left its natural context far behind, perhaps defying a purely natural interpretation.

The next three panels, organized by neuroendocrinologist Bruce McEwen of Rockefeller University, neurologist Antonio Damasio of the University of Iowa, and psychologist Jerome Kagan of Harvard, presented a remarkable array of research results that have driven biological science deep into domains usually left to traditional social sciences. There were striking examples of the genetic and neurobiological underpinnings of behavior and states of mind including stress, anxiety, and depression, and a particular emphasis on the important role of biologically-rooted emotion and affect in understanding higher-order mental activity including language and reason.

Looking 25 Years Ahead

The Kagan panel asked what social science might look like 25 years hence, given such powerful biological results, initiating a debate that continued for the rest of the conference. Kagan and others insisted that convergence, though real enough, must balance biological and non-biological perspectives, and incorporate the human environment. University of Chicago social scientist Richard Shweder went much further, suggesting that a “science-driven unity” of knowledge was just old, discredited genetic determinism in new guise, the same determinism that brought us social Darwinism, eugenics, and other such excesses of past “scientism.”

The ensuing panels, “Science, Culture, Meaning, Values —a Dialogue,” organized by science historian Anne Harrington of Harvard, and “Science in the Liberal Arts Curriculum,” a roundtable chaired by Academy president Rodney Nichols, continued the discussion, drawing in the humanities, religion, education and a lively and engaged audience. And the debate will continue as the Academy plans further excursions into the broader ethical and philosophical implications of the progress of modern science.

Also read: Teaming Up to Advance Brain Research