Can we control surveillance tech?

By Arthur Molella, NMAH

The National Museum of American History's Lemelson Center for the Study of Invention and Innovation recently hosted its annual symposium, this year focusing on surveillance inventions and developments. Art Molella, the Center's director, discusses our surveillance society. Note: you can now view the symposium video online.

The theme of our New Perspectives on Invention and Innovation symposium was ripped from the headlines. We knew we had a hot potato in this topic when we began planning early in 2013, but we had no idea it would soon explode when Edward Snowden leaked information about the National Security Agency's (NSA) spying at home and abroad. To say that this has raised the stakes for us at the Lemelson Center would be a huge understatement. While there are myriad aspects to the ongoing controversy, our symposium focused on the technology of surveillance and the related issues of social and ethical responsibility.

 


If history has taught us anything, it is that technology and invention can often escape our control, however good our intentions. If we wait to address social problems downstream after they arise, it is usually too late. It then becomes mostly a futile game of catch-up. The ideal approach is to try to anticipate such problems from the start of major projects, building in front-end attention to the social and ethical impacts of emergent technologies. I say "ideal," because there are major obstacles to doing so. Mostly it's a matter of money, but second-guessing an emerging technology in this way may also be criticized and dismissed as a brake on innovation.

Occasionally, though, such foresight is evident. Consider the government's Human Genome Project, in which the National Institutes of Health (NIH) and the Department of Energy (DOE) were major players (the latter because of concern over health issues related to radiation from atomic testing and chemical exposure). Both the DOE and the NIH genome programs set aside fully 3-5% of their annual budgets for risk assessment and for the investigation of Ethical, Legal, and Social Implications (ELSI). While such efforts may not have fully anticipated, much less solved, future problems, at least they were a move in the right direction.

 

The Human Genome Project budgeted for risk assessment and investigation of ethical, legal, and social implications. Image courtesy the U.S. Department of Energy Genome Programs.

That privacy issues were deeply implicated in the genome project was recognized early on. There was a real danger that personal genetic information could get into the wrong hands or be used inappropriately, with truly scary consequences for individuals, including denial of employment and health insurance. One might reasonably ask why the same care has not been taken with the information and communications technologies that have allowed the NSA to do what it does. Why were we taken by surprise by both government and commercial abuse of digital innovations? Perhaps it's because they emerged over a relatively long period of time and from a disparate set of players, whereas the Human Genome Project(s) had much more the flavor of a crash program like the Manhattan Project, with its reliable funding, central management, and tightly controlled access.

 

During World War II, the "careless talk" campaign used posters like these to discourage leaking secret information.

Yet, as with the genome project, government- and particularly military-sponsored R&D played a critical role in launching today's breakthrough digital technologies, and it still does. The Internet owed its birth to the Defense Department's Advanced Research Projects Agency, which developed ARPANET. With national security as the overarching motivation and justification, keeping the new technologies under control and out of the wrong hands had to be a major concern. Because of the veil of secrecy, I can't say whether there was actually an attempt to establish regulations with respect to the privacy issues that bedevil us today. Perhaps there was, but clearly the events of September 11, 2001, have fundamentally changed the rules of the game.

 

During World War II, the Office for Emergency Management produced posters such as this one, which depicts the stylized helmet of a German soldier.

A major factor in the whole problem of management and control was the privatizing of government R&D, resulting in hybrid organizations combining the private and government sectors (see Kevin R. Kosar, The Quasi Government: Hybrid Organizations with Both Government and Private Legal Characteristics, Congressional Research Service, June 22, 2011). The pattern was established after the Second World War with the creation of Federally Funded Research and Development Centers (FFRDCs), so-called GOCO (government-owned, contractor-operated) organizations. The first of these was the Air Force's RAND Corporation, established in 1948 in Santa Monica, California. Government atomic weapons labs such as Oak Ridge and Los Alamos soon followed suit. Such quasi-government arrangements allowed for much more flexibility in spending, procurement, hiring, and personnel, as well as more rapid technology transfer from basic research to application.

 

The entrance to Oak Ridge National Laboratory. Part of the Department of Energy, the Lab is run by a contractor.

There were clear advantages to this model, but it has come under increasing scrutiny and criticism over the last decade because of the potential for corruption, lack of accountability and oversight, and loss of government control of research. In particular, the FFRDCs greatly complicated the problem of regulation. More than a decade ago, public policy expert Ann Markusen argued persuasively against privatizing national security. She pointed out that government outsourcing requires strong management, but "such capacity is undercut by the unpopularity of regulation and unwillingness to spend on it" (The Case Against Privatizing National Security, June 2001).

As I write, the question of governmental oversight of the National Security Agency's data mining, monitoring, and outright spying is being hotly debated. Perhaps the NSA was and is indeed working within its own regime of regulation and accountability. But the cozy relationship today between government agencies like the NSA and the companies they outsource to makes it far too easy for classified government innovation and information to flow into the commercial sector, where there is little if any incentive for regulation. The Snowden case was a prime example.

 

The Detectifone was touted as "a mechanically perfect device for producing the evidence." Its brochure boasted, "The Detectifone is Insurance, a Silent Watchman bringing directly to your ear absolute facts, which are necessary to your business."

Today, I often hear it argued that no one should have been surprised by the revelations of government spying. After all, social media users, not to mention online shoppers, have already ceded much of their personal privacy to corporations, willingly and with little if any apparent concern for the consequences. As my colleague Jeff Brodie noted (with tongue firmly in cheek), "We want our cake, we want the icing, and we want to eat it without gaining weight." (A penetrating satire on this incredibly self-destructive social behavior is Dave Eggers's recent novel on the ultimate perils of Big Data, The Circle.)

Invention and innovation, however, can also be powerful forces for democracy and the public good. Recent history has shown that cell phones and social media have made it far more difficult for dictators to control information. Such technology has clearly been crucial to the Arab Spring, for example. But it is also a double-edged sword that can be used by ill-intentioned regimes to undermine democracy in unprecedented ways. With mounting concerns for national security, surveillance technologies are not going away. But is it too late to bring them back under at least some semblance of democratic control?

Art Molella is the director of the museum's Lemelson Center for the Study of Invention and Innovation. This post originally appeared on the Center's blog, Bright Ideas. Art has previously blogged about the intersection of art and technology, as well as the life of James Smithson.