Strategy or Happenstance: Science Policy in the U.S.A.
President Shirley M. Tilghman
November 1, 2007
Ullyot Lecture, Presented at the Chemical Heritage Foundation
It is a pleasure to be joining you this evening, and to have the honor to give the Ullyot Lecture. Speaking as a lapsed chemist who was seduced by the mysteries of molecular biology, I very much appreciate the work of the Chemical Heritage Foundation to preserve the history of discovery in chemistry and to promote research and teaching in the field. The Ullyot lecture, as you have just heard, is part of that educational mission of promoting the public understanding of science. I am grateful to Glenn and Barbara Ullyot for their commitment to science education, and for creating a lecture series that focuses on the intersection of science and public affairs, and delighted to have Barbara Ullyot with us this evening. Having read the list of distinguished Ullyot Lecturers who have come before me, which includes luminaries such as Robert Langer, Harold Varmus, Jackie Barton, and Roy Vagelos, I know I have a lot to live up to this evening.
I chose the topic of this lecture, “Strategy or Happenstance: Science Policy in the U.S.A.,” because I believe in the profound importance of scientific discovery and innovation as an engine for economic and social progress, and have become increasingly concerned, as a scientist and as a university president, about the lack of rigorous and thoughtful planning to ensure that science and engineering will thrive in the 21st century as they did in the 20th.
The remarkable impact that science had in the 20th century is everywhere to be seen: in the dramatic increase in life expectancy and particularly the reduction in infant mortality; in the virtual eradication of a disease like smallpox through systematic world-wide vaccination; in the generation of household conveniences that have freed us from punishing manual labor; in the provision of safe drinking water and sanitation; in the power of television, radio, and film to foster greater understanding among people of different cultures; and in the development of the Internet, a powerful tool that provides global and instantaneous access to everything from the world’s great literature and art to mindless chatter on blog sites.
All of this progress, and the economic prosperity it has created, arose from public and private investments in science and technology throughout the world. The economic return on investments in science and technology has been documented many times over. It has been estimated that upwards of 40% of the growth in the U.S. economy over the last 50 years has come from investments in fundamental research. In the last 20 years we have seen the creation of entirely new industries – industries that depended on discoveries such as recombinant DNA, semiconductors, the Internet, and lasers. These form the bases for some of the most powerful drivers of today’s economy. What is remarkable is that most of these advances grew out of research in university laboratories and, as often as not, research conducted by students and faculty pursuing knowledge for its own sake, with no commercial application in mind.
This remarkable scientific progress did not happen by chance. It arose out of a social contract between the federal government on the one hand and research universities and institutes on the other. Although it is hard to imagine it today, prior to the Second World War, the U.S. government invested very little in fundamental scientific research. In those days, foundations like the Rockefeller Foundation were the most important supporters of research in universities, with state and federal governments providing relatively modest funds. The war changed everything as the federal government turned to academic scientists, particularly in physics, to develop the weapons that would ultimately end the war. National research laboratories were created at Oak Ridge and Los Alamos, and others that already existed were greatly expanded. The impact of academic scientists on the outcome of the war was probably startling at the time, but it helps to explain what happened next. When Vannevar Bush, who had directed the nation’s wartime research effort, was asked to advise on postwar science policy, he changed history by writing a highly influential report, delivered to President Harry Truman in 1945, entitled “Science – the Endless Frontier.” In it he laid out the principles by which the federal government would link its future investments in fundamental research with education, particularly the education of graduate students. By investing in the young, the system acquired a vitality, an energy, and a capacity to change continually that would make it the envy of the world.
The confidence that society placed in scientific progress as the path to prosperity was reflected for decades in everything from surveys that identified science as among the most respected professions to the yearly generous allocation of tax dollars to basic and applied research. In return for this broad support, society rightfully expected, and indeed received, the discovery of new knowledge that would lead to better lives for everyone.
So why, you might ask, am I concerned about the future? Shouldn’t we just continue to do what we have been doing all along? Before I go into the basis for my concerns, it might be helpful to lay out, at the 30,000-foot level, three issues that need to be addressed in a coherent and comprehensive science policy. 1) What are the scientific priorities for the country, and what is the best way to allocate finite resources to address those priorities? 2) How do you balance the funding of fundamental research, whose payback horizons may stretch 50 to 100 years, with the funding of applied research whose benefits will be realized immediately? 3) How do universities attract the best and brightest men and women to become the next generation of scientists and engineers? This is by no means an exhaustive list of the issues you would like a science policy to address, but they constitute a good beginning.
I would like to explore these questions with a series of vignettes – beginning with a story about priority setting gone awry. On January 14, 2004, President George W. Bush announced major new goals for the publicly funded exploration of space, most prominently, the goals of sending humans back to the moon by 2015 and eventually to Mars. This announcement came at a difficult time in the history of NASA. Its two programs in human-based space exploration, the International Space Station and the Space Shuttle Program, are both in trouble. The International Space Station, originally announced by President Ronald Reagan in 1984 for completion in ten years, is dramatically behind schedule and over budget, and the Space Shuttle Program, just beginning to recover from the 2003 Columbia shuttle disaster, is slated for mothballing in 2010.
President Bush’s announcement also came at one of the most extraordinarily productive points in the history of astronomy and cosmology, when explorations with space telescopes such as the Hubble Telescope and the Wilkinson Microwave Anisotropy Probe, with the ground-based Sloan Digital Sky Survey, and with unmanned space missions like Voyager are providing us with breathtaking insights into the structure of the universe and our solar system. We are learning that our cosmos is much stranger than we thought. It is flat, not curved, and it is flinging itself apart at an accelerating rate. To explain these observations, cosmologists have invoked a new force, to which they have given the Darth Vader-like moniker of “dark energy,” another way of saying we don’t understand what we are observing. At the same time, we are beginning to fill in remarkable details about our solar system, with new celestial bodies being discovered almost monthly, to the point where we have had to reconsider what constitutes a planet in our lexicon. These discoveries comprise a golden age of space exploration – but of a very different kind from what President Bush has proposed.
This highlights a tension that has always existed between the scientific community and the political process whereby priorities are set. Ideally, priorities should reflect the relative importance and potential impact of competing questions, coupled with a dispassionate assessment of the likelihood that they can be answered by the proposed experimental or theoretical approach. In many fields, including my own, priority-setting has been a “bottom-up” process, in which scientists compete individually or in groups for resources through a peer review system. While government agencies like the NIH can, and sometimes do, create set-asides for Congressional priorities like HIV vaccines or bioterrorism prevention, the system is always open to new ideas that arise in the minds of individual creative scientists.
Astrophysics, of course, faces a challenge for which there is little precedent in biology – a single “experiment” can cost hundreds of millions of dollars and require years of up-front investment before any payoff is realized. For that reason the astrophysics community has evolved a variation on “bottom-up” priority setting in which its leaders come together once every 10 years, under the auspices of the National Academy of Sciences, and, through an inclusive and collegial process, establish priorities for the next decade. In 2002, less than two years before President Bush’s announcement, the National Academy of Sciences produced one of these decadal reports, entitled “New Frontiers in the Solar System: An Integrated Exploration Strategy.” In this document, the Academy proposed priorities and recommended substantial investments in space flights like the Voyager missions to the outer planets, as well as Earth-based experiments. It was a comprehensive list of projects and missions, including the exploration of Mars, but not by human beings.
Now there are a number of plausible reasons why the president and NASA chose to ignore the advice of our nation’s most distinguished scientists. They may have made a practical judgment that the American public will not continue to support large outlays of dollars for “pure science” in which new knowledge is an end in itself, but instead will require the tangible – even romantic – symbols of space science that the Apollo missions provided. They may have made a military decision that establishing American dominance in space is strategically important, or an economic decision that mining the natural resources in space will be essential to the future prosperity of the United States. Former President George H. W. Bush (a.k.a. Bush 41) endorsed human space exploration with another rationale in a 1989 address: “Why the Moon? Why Mars? Because it is humanity’s destiny to strive, to seek, to find. And because it is America’s destiny to lead.”
Without judging the persuasiveness of these possible rationales, it is worth noting that if President Bush’s proposal to launch manned flights to the moon and, ultimately, Mars goes forward, the United States will repeat the decision-making process that led it to establish the International Space Station. Then, as now, the scientific community was highly skeptical of its utility, most especially its scientific value, and was concerned that support for the Station would preclude support for what, in their view, were significantly higher scientific priorities. Scientists then, as now, were anxious that the project not be seen as a scientific priority, or worse, be judged by its scientific accomplishments. Twenty years later, history has proven the skeptics of the 1980s to have been highly prescient.
The resignation last year of three members of NASA’s Advisory Council – all prominent scientists – over the failure of the agency to give due weight to its scientific programs does not bode well for the future. Two of the three departures were forced, prompting one former Advisory Council member to lament, “If we can’t have a robust debate at the NAC level, then where in the heck is it supposed to happen?” This is a reasonable question and reflects the frustration that many scientists feel when political considerations overshadow scientific ones. One lesson I would draw from this case is that “top-down” politically-driven science projects, especially those that will be enormously expensive, are unlikely to be successful in scientific terms unless they have the input and support of scientists who understand the challenges and potential benefits of the undertaking. Keeping scientists in the loop, and paying close attention to their expertise, is essential for a big science project to succeed.
Once scientific priorities are set, whether by top-down or bottom-up strategies, those decisions must be translated by the administration and the Congress into appropriations, and ultimately by the scientific agencies into individual funding decisions. This process is highly Balkanized, with over a dozen federal agencies competing for research dollars, making it very difficult to match strategic priorities with appropriations. Balkanization also makes it difficult to develop a policy or target for overall funding, and science must compete within the discretionary segment of the domestic budget. One consequence of this dispersion of responsibility is that in FY2006, for the first time in 25 years, overall federal spending for academic research and development fell after adjusting for inflation. In FY2004 federal funding in the physical sciences as a fraction of GDP was 54% less than in 1970; in engineering, it was 51% less. In other words, we have been under-investing, particularly in the physical sciences. And although the total national R&D budget has been growing steadily for many years, the ratio of government to private-sector investment has reversed itself, from the government providing two-thirds of the total budget 40 years ago to one-third today. Some of that is good news, of course, but the underlying problem is that private-sector investment tends to be for short-term gain, which means that investments in fundamental sciences have been declining. This is equivalent to eating your seed corn.
Indeed, retrenchment in the support of science and engineering could not come at a worse time, for there are dangerous signs that America’s dominance in scientific competitiveness is at risk from newly rising economic powers, particularly in Asia. That concern lies at the heart of a highly influential report that was released in the fall of 2005 by the National Academy of Sciences entitled “Rising Above the Gathering Storm: Energizing and Employing America for a Brighter Economic Future.” In it a panel of experts ably chaired by Norm Augustine (Princeton Class of 1957) warned Americans – and I quote – “that the scientific and technical building blocks of our economic leadership are eroding at a time when many other nations are gathering strength. . . . We fear the abruptness with which a lead in science and technology can be lost – and the difficulty of recovering a lead once lost, if indeed it can be regained at all.” The signs include the fact that the U.S. share of the world’s leading edge semiconductor manufacturing capacity dropped from 36% to 11% in the last seven years; IBM, once the gold standard for computer hardware, recently sold its once-promising PC business to a Chinese company; nearly 60% of patents filed in the U.S. in information technology now originate in Asia; in 2000, the U.S. ranked 1st in broadband Internet access; now we are 16th in the fraction of our citizens having broadband connections, and 61st in the use of cell phones per capita. I chose those examples for a reason – they represent industries in which the U.S. was not just pre-eminent, but industries we invented with research that began in universities and research institutes. We are not only losing our edge in what could be considered “old” manufacturing industries such as textiles, steel, and automobiles. We are losing ground in areas that are considered the “New New Thing,” to use the title of Michael Lewis’ book about Netscape founder Jim Clark. 
Darwin famously said that it is not the strongest who survives, or even the most intelligent, but the individual who is most responsive to change. And to change we have to be investing in innovation and creativity. This will be the only way to compete in this new flat world described so starkly by New York Times columnist Tom Friedman.
Another concern I have about the way we fund science can best be appreciated by looking at the slide containing the annualized growth rate in the NIH budget from 1971 to 2005. As you can readily see, the budget has never had a sustained period of stability, but has been in what biochemists call a futile cycle of growth and retrenchment over the past 35 years. This whipsaw effect, which is a direct consequence of the political budget process in Washington, means that careful and effective planning that permits wise allocation of resources is virtually impossible. Scientific initiatives that need years of nurturing are launched, only to be caught suddenly without essential support. Universities also fall victim to the mixed signals, for example, by building new facilities and hiring new faculty in response to the availability of additional resources during the doubling of the NIH budget from 1998 to 2004, only to find that those new faculty could not attract funding when the tide turned, as it inevitably did. This cycle has particularly corrosive effects on young investigators – or as I will indicate in the next section, the not-so-young investigators – who have the misfortune to enter the grant system at one of the downturns in funding. They find themselves unable to attract the grants needed to begin their independent research careers after years of training. At this moment, when the NIH budget is “in the red,” the morale in the field is as low as I have ever seen it in my career. This is clearly a case where the tortoise, not the hare, wins the race. What is needed is a way to stabilize the annual growth in science budgets to avoid these feasts and famines.
This brings me to the third policy issue I wish to address – the education of the future scientific work force. Vannevar Bush’s social contract between the federal government and universities catalyzed an enormous expansion in the number of graduate students that were trained in the sciences in the 1950s and 1960s. The expansion occurred to meet two needs: students became the unit of scientific work; they were the workers who carried out the research agenda of the country. At the same time, the expansion created the next generation of scientists and faculty members, who were badly needed as the research enterprise expanded. However, eventually this exponentially growing apparatus – a classical Malthusian system – had to slow down. The problem became: how could it slow down, that is, produce fewer students, without having a negative effect on scientific progress?
The answer to this question has been resolved in different ways in different fields. In physics, a relatively small and coherent discipline where funding has been relatively constant over the last few decades, there was a nation-wide effort by the American Physical Society to decrease graduate admissions through the 1980s and early 1990s to adjust to the fact that there were no longer enough jobs for its graduates.
In my own field of life sciences – a much larger and more diverse intellectual landscape that includes everything from evolutionary biology to public health – no such agreement could be reached, and the number of students didn’t simply remain constant but, fueled by additional funds from the NIH, continued to grow faster than the number of available jobs. Something had to give, and what gave was the length of time that students spent in training. Since I was a graduate student in the 1970s, the average time it takes to obtain a Ph.D. in molecular biology has expanded by more than two years, from four to well over six, and the length of postdoctoral training has grown by at least as much. This has resulted in young scientists who are in “training” well into their 30s while their classmates from college are settling down, raising families, and adding to their pension plans. The average age of first-time principal investigators at the NIH is now 42.9 years – a truly shocking number. I have referred to this phenomenon as the “La Guardia effect.” Students stayed longer and longer in graduate school and postdoctoral fellowships as they metaphorically circled La Guardia airport, waiting for their turn to land in a job.
Aside from the personal cost to individual students, should we be worried that late 30-somethings are still in training positions? I think the answer is yes, and the most important reason comes from conversations that I have had with undergraduates at Princeton over the last 10 years. Princeton attracts some of the most talented students in the world, and of those who concentrate in molecular biology, many have the intellectual potential to become world-class scientists. Yet every year they look at their options – which are infinite – and conclude that the long and indeterminate training regimen that leads to a very difficult job market simply doesn’t stack up against their other options, where the training may be long but at least they know how long, and the job prospects are much brighter. I hasten to add that this is not about money, but about a sense of fairness in the trade-off they are being asked to make between the income they forgo while they train and the likelihood of finding the job of their dreams.
There is no surer way to sound the death knell of science than to have a career path that discourages highly qualified students from entering the field. In my own view, it is the responsibility of universities and professional scientific societies to strike the right balance between the conduct of research on the one hand, and the education of graduate students on the other. This cannot be accomplished without paying close attention to trends in the labor market. A graduate student rightfully expects to be educated by the faculty; otherwise we should not call them students but workers. Graduate education needs to be more focused on what a student needs to learn in order to become a scientist, and less focused on how much they are able to produce over longer periods of time. Our 50-year-old system that links fundamental and applied research with graduate education has created the best engine for innovation and training in the world, precisely because we attracted the very best and ablest students into the profession.
Devising a coherent set of science policies that set priorities at the intersection of anticipated scientific opportunity and societal need, fund the enterprise stably enough to ensure America’s future economic competitiveness, and educate the next generation of scientists and engineers for productive careers will not be easy. It almost goes without saying that American science cannot be conducted in a vacuum: we depend on public funding to sustain our research, and much of what we do is governed by laws and regulations enacted by non-scientists. Without the public’s confidence, and without the support of their elected representatives, the scientific enterprise will founder. For this reason there is a pressing need for intelligent and open-minded discourse among scientists and policymakers so that, together, we can craft sound science policy for the United States at a time when its scientific pre-eminence is being challenged by other nations. This exchange is essential in any society but especially in a democracy, where policies forged in isolation will ultimately fail.
And while I have emphasized the difficulties in sustaining our scientific pre-eminence in the flattened world, I remain optimistic. In the second half of the 20th century America created the most impressive and powerful engines for innovation and creativity that the world has ever known. The ingredients of that success are still with us, slightly battered but unbowed: this nation always encouraged a competitive and entrepreneurial spirit that is so critical to scientific progress; it always bet on the young by giving them their independence and freedom to explore relatively early in their careers; it welcomed foreigners to study in our great universities, many of whom stayed and contributed to America’s prosperity; it rewarded the best ideas through a peer review process; and through investments in fundamental research, it took the long view. As long as we continue to nurture and protect those qualities that made America’s scientific enterprise such a source of economic growth and prosperity, we will prevail.
Thank you all for coming this evening, and I look forward to your questions.