When the U.S. Senate Judiciary Committee asked me to testify about the National Security Agency’s Internet surveillance program in October, I welcomed the opportunity.
Beyond the implications of the program itself for every American, the revelations were a lightning bolt that illuminated the gap between the policy process and the complex reality of today’s digital technologies. Making good policy in any area requires some understanding of that field, but many policymakers are out of the loop when it comes to technology. Closing this gap is one of the most important challenges in technology policy.
The particular point I discussed at the committee hearing was the privacy implications of the NSA’s collection of phone call “metadata” – who called whom, when, and for how long they talked – about all Americans. Some have sought to dismiss the privacy implications of metadata surveillance by assuring the public that authorities are looking only at metadata, which perhaps seems anonymous. But listening to actual conversations would be like trying to identify a crime suspect by sifting papers at a city dump; the highly structured, organized nature of metadata, by contrast, allows cleverly written computer code to identify telling patterns and extract stunningly specific inferences at vast scale. Although I did not take a stance on the cost-benefit calculation for the nation – it is indeed a difficult judgment – I wanted to make clear that metadata collection has a very real impact on Americans’ privacy.
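To see how little code such inference requires, consider this minimal sketch. Everything in it – the phone numbers, the call records, and the directory of sensitive numbers – is invented for illustration; real analysis would run over billions of records, but the logic is the same:

```python
from collections import Counter

# Hypothetical call-detail records: (caller, callee, hour_of_day, minutes).
# All numbers and the "directory" below are invented for illustration.
records = [
    ("555-0101", "555-0199", 23, 12),
    ("555-0101", "555-0199", 2, 45),
    ("555-0101", "555-0150", 14, 3),
    ("555-0123", "555-0188", 10, 5),
    ("555-0123", "555-0188", 10, 6),
]

# A lookup table mapping certain numbers to what they are --
# the kind of directory any analyst could assemble from public sources.
directory = {"555-0199": "crisis hotline", "555-0188": "oncology clinic"}

def profile(caller):
    """Infer simple facts about a caller from metadata alone -- no
    conversation content is examined anywhere in this function."""
    calls = [r for r in records if r[0] == caller]
    top_contact, n = Counter(r[1] for r in calls).most_common(1)[0]
    late_night = sum(1 for r in calls if r[2] >= 22 or r[2] < 5)
    return {
        "top_contact": directory.get(top_contact, top_contact),
        "calls_to_top_contact": n,
        "late_night_calls": late_night,
    }

print(profile("555-0101"))
# {'top_contact': 'crisis hotline', 'calls_to_top_contact': 2, 'late_night_calls': 2}
```

A few lines of bookkeeping reveal that this caller repeatedly dialed a crisis hotline in the middle of the night – a stunningly specific inference drawn without hearing a single word.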
In a later submission to the committee, I addressed a broader issue. A key complaint of the Foreign Intelligence Surveillance Court, which scolded the NSA, was that it had not received adequate technical information about the NSA’s technology, causing the court to mistakenly approve certain NSA activities. I urged the Senate to take advantage of this nation’s great technical expertise by giving technical experts a direct role in informing decision-makers such as the court.
I became acutely aware of the scarcity of such expertise in Washington when I served as the first chief technologist at the Federal Trade Commission. Since returning to Princeton, I’ve been on a soapbox advocating for scientists and engineers to serve their country by engaging in public-policy processes that connect with their expertise.
This need is not confined to questions of Internet security. It’s hard to imagine any area of science and engineering that does not have some policy connection. As we develop technology to mitigate the effects of climate change or achieve breakthroughs in biological engineering for human health, we inevitably will create capabilities that, along with their intended benefits, have potential for misuse. Policy can foster the development and use of breakthrough technologies, and at the same time mitigate their harmful side effects and limit their misuse. Choices about how to use technology are fundamentally human; as we train new generations of scientists and engineers we must ensure that they are broadly educated and prepared to engage and communicate in the public arena.
Research and teaching at that intersection are the primary goals of the Center for Information Technology Policy (CITP), but they also are a growing focus across the engineering school. The Program in Technology and Society brings together CITP with the Andlinger Center for Energy and the Environment and the Keller Center to create special tracks of study for students interested in the broader implications of information technology or energy. The Keller Center also administers the Wong Fund for Engineering and Policy, which supports internships and other projects for engineering students.
The engagement of so many of my engineering colleagues on matters of policy and the enthusiasm of our students bode well. Their voices will illuminate hidden pitfalls and build bridges for decision-makers who lack direct expertise in technology. Attacking the grand challenges of our society requires not only transformative technologies but also wise policy.