H-Diplo | Robert Jervis International Security Studies Forum
Roundtable Review 15-11
Alex Wellerstein, Restricted Data: The History of Nuclear Secrecy in the United States (Chicago: University of Chicago Press, 2021) ISBN: 9780226020389
30 October 2023 | PDF: https://issforum.org/to/jrt15-11 | Website: rjissf.org | Twitter: @HDiplo
Editor: Diane Labrosse
Commissioning Editors: Thomas Maddux and Diane Labrosse
Production Editor: Christopher Ball
Copy Editor: Bethany Keenan
Contents
Introduction by Eglė Rindzevičiūtė, Kingston University London
Review by Michael A. Falcone, Harvard University
Review by Jonathan Hunt, US Naval War College
Review by Sam Lebovic, George Mason University, and Kaeten Mistry, University of East Anglia
Review by Sudha Setty, City University of New York (CUNY) School of Law
Response by Alex Wellerstein, Stevens Institute of Technology
Introduction by Eglė Rindzevičiūtė, Kingston University London
Alex Wellerstein’s Restricted Data is an extensive and impressive study of the organisational production of nuclear secrecy in the United States. The overarching aim of Restricted Data is to trace the development of the political, social, and organisational mechanisms that limited the spread of scientific and technical knowledge of nuclear technology. The purpose of nuclear secrecy was to secure governmental ownership of, and monopoly over, the lethal nuclear arsenal. Wellerstein defines nuclear secrecy as “a regime,” by which he refers to “a bundle of thoughts, activities, and organizations that try to make secrecy ‘real’ in the world, to perform the multitude of acts of epistemological slicing that result in some people knowing things, and other people not” (6).
In terms of chronology, the scope of the book is ambitious: Restricted Data explores the evolution of the secrecy regime from the Second World War and the Manhattan Project to the end of the Cold War. As some of the book’s reviewers in this forum note, this long view imposes some limitations on the depth of the analysis. However, it helps the reader to understand the bureaucratic and legalistic legacies of the twentieth century that fundamentally shape present-day governmental institutions. As Wellerstein shows, the end of the Cold War did not constitute a particular rupture in this path of development: although the Second World War and the Manhattan Project were fundamental to the establishment of the bureaucratic system of secrecy, the history of modern forms of controlling technoscientific secrets goes back to the eighteenth and nineteenth centuries.
The implications of this study are wide-ranging and significant for many fields of scholarship, particularly the sociology of knowledge, science and technology studies, legal studies, public policy, and governance. The reviews in this forum, by Michael A. Falcone, Jonathan Hunt, Sam Lebovic, Kaeten Mistry, and Sudha Setty, make it clear just how multifaceted the study is and just how wide-ranging its impact is. To research the production of nuclear secrecy, argues Wellerstein, is vital. The aim is not to disclose the secrets, many of which need to stay out of the public domain, but to keep governmental and corporate institutions accountable, especially as there is ample evidence of the damage that excessive and pervasive concealment has caused to society and the environment. Sudha Setty considers the legal pathway for the normalisation of secrecy as it was connected to national security and defence in the US. However, as Setty notes, the secrecy regime has been used not only to enhance security, but also to cover up organisational inefficiencies and malfunctions, particularly where the operation of nuclear establishments had high human and environmental costs. It is therefore important, Setty argues, that the public continues probing and challenging this secrecy, even though the available means are limited. Indeed, such challenges come from different parts of civil society, not least contemporary art, as, for instance, in the work of Trevor Paglen, which explores the spatial traces of the infrastructures of national security.[1]
In Restricted Data, Wellerstein focuses on the systemic, organisational apparatus that was developed to secure nuclear technoscience. His study therefore stands out in the context of existing research that has highlighted the importance of nuclear secrecy but focused on perhaps less mundane elements, such as spying, censorship, and the suppression of information about the human and environmental impacts of nuclear accidents and contamination.[2] The forum reviewers note the welcome focus on systemic and routine practices as opposed to spectacular events or controversies. Sam Lebovic and Kaeten Mistry praise the focus on the “really existing scientific bureaucracy” that produced nuclear secrecy as a “quotidian” practice.
Like Setty, Michael Falcone considers the implications of Restricted Data for our understanding of the character and extent of the national security state in the US. Falcone points out the many ways in which the security state is far from being a monolith: as Restricted Data shows, the demand for and supply of secrecy were shaped not only by instrumental military-industrial goals, but also by other agencies, many of which chose not to challenge the systematic production of non-knowledge about nuclear weapons. Falcone reflects on the clash between civil society and the security state, on the public indifference that leads to damaging technocracies, and on the clash between openness and secrecy. However, it should not be forgotten that the lack of public interest in the nuclear sector, particularly in obtaining more fine-grained knowledge of nuclear risks, is also due to systematic lobbying[3] and the refusal of institutions to collect data about radiological hazards.[4] To this I would add the observation that the question of at whom transparency should be directed was articulated differently at different times. As Stefanos Geroulanos showed in his excellent study of the idea of transparency in France, the idea that the state—and power structures—should be transparent to society is indeed quite recent, dating to the 1980s. In turn, in modern France there has been a strong social demand for opacity, where the individual right to opacity has long been cast as a key prerogative for asserting personal autonomy from the state.[5]
In Restricted Data, Wellerstein focuses on nuclear weapons even while acknowledging that the boundaries of military nuclearity are blurred: it is not clear to what extent the supporting engineering and material sciences can be considered nuclear and therefore subject to secrecy regimes. But the blurring occurs even when focusing on atomic weapons. For instance, in his review Jonathan Hunt refers to Hiroshima and Nagasaki as the “last and only” atomic bombings, using this as an argument that nuclear deterrence works. However, this line of argument has been widely questioned, with critics pointing out that nuclear test explosions were real bombings, equal to the deployment of the bombs over Hiroshima and Nagasaki. There is a growing research area in which the secrecy associated with nuclear testing is probed and challenged by the army veterans who participated in US, British, and French nuclear testing; see, for instance, the emerging work by Christopher Hill and the many theorists who have traced the environmental, medical, and social impacts of nuclear colonisation expressed through testing.[6] In turn, historians like Gabrielle Hecht, Susan Bauer, and Tanja Penter have demonstrated just how intertwined uranium mining has been with Western and Soviet colonisation.[7]
Lebovic and Mistry consider the possibility of international control of atomic weapons in the 1940s as an alternative scenario to the US secrecy regime. However, it is very unlikely that such controls would have been supported at this very early stage by the Soviet Union. Historians of Soviet atomic weapons, such as David Holloway and Michael Gordin, emphasize Stalin’s paranoid personality.[8] Soviet forces were encroaching on Eastern Europe, and Soviet leaders, who viewed the Western democracies as hostile, had their own reasons for secrecy, given that Stalin was complicit in the Nazi invasion of Poland and its division into spheres of influence, as agreed through the secret protocol of the Ribbentrop-Molotov Pact of 23 August 1939, named for the German and Soviet foreign ministers. The secret protocol was discovered by the Western powers only in 1945, and Soviet leaders denied its existence until 1989.[9] Soviet dissident scientists, such as Andrei Sakharov, began to call for the abolition of nuclear weapons only in the 1960s. Later, the idea of greater international regulation of nuclear weapons was voiced by Soviet globalist scientists in the 1970s-80s.[10] These voices, however, were absent during the 1940s. The designers of the first Soviet bomb, according to Paul Josephson, were loyal to the Soviet regime and deeply sceptical of the Western bloc.[11] They read the Smyth report, the first official and detailed account of the history of the Manhattan Project, written by the physicist Henry DeWolf Smyth, which was published in the wake of the Hiroshima and Nagasaki bombings and made available in Russian in late 1946.[12] The Soviet nuclear scientists and engineers, who were deeply invested in the idea of nuclear parity and were not prepared to lose the arms race, constituted the grassroots of Soviet nuclear secrecy. There remains a long way to go to fully account for the evolution of the Soviet nuclear secrecy regime, even if its important elements have been discussed by Gordin and Holloway.
Restricted Data is such a goldmine of details, insights, and arguments that no review can do justice to its complexity. In addition to the points discussed by the roundtable reviewers, and in line with Alex Wellerstein’s response to the readers, I would like to suggest that Restricted Data makes an important contribution to research on the late modern cultures of secrecy.[13] Secrecy, as anthropologists have demonstrated, is a complex social phenomenon in which restricting access to knowledge, events, or objects is not only about the distribution of power.[14] Anthropologists have shown that secrecy is also a process of valuation and of communicating societal values and priorities. Secrecy is a key part of creating rituals that attract, engage, and bind. Secrets, as scholars have argued, are never fully and completely concealed, for if they were, they would cease to fulfil their societal role: complete lack of knowledge would equal non-existence. Secrets are always partially revealed.[15] In the case of security, total concealment can be a destabilising factor, as was famously articulated in Stanley Kubrick’s film “Dr. Strangelove or: How I Learned to Stop Worrying and Love the Bomb” (1964).[16] Wellerstein’s book is a masterly work that demonstrates precisely that the organisation and management of revealing is a constitutive part of a secrecy regime: “The bomb may have been born in secrecy, but that secrecy was always controversial and contested” (3).
In his response to the reviews, Wellerstein explains that his focus in the book was the rhetoric, practices, and institutions of secrecy, but I would add that he has also revealed the importance of materiality. At its early stages, as Wellerstein shows, nuclear secrecy revolved around materialities: forms of engineering, physical locations, and materials rather than equations. Wellerstein argues that the historiographic focus on nuclear fission in effect obscured “the contributions of chemists, metallurgists, and engineers.” In turn, “concentration of attention at the very high level (administrative choices) and the very low level (basic physics) preserved technical secrecy” (102), which was prized the most. Michael Gordin made a similar argument. Commenting on the question of whether the first Soviet atomic bomb was a copy of the American plutonium bomb, he noted that “the basic building materials—epoxies, washers, solder—were different in the Soviet Union and the United States, so the physical properties of the Soviet materials had to be established precisely even in those few cases where the Soviets had access to full American details.”[17] To this I would add that the focus on materialities has important implications for the consideration of the historical development of the hierarchies of fundamental and applied sciences and the legacies of these hierarchies in the existing historiography of science.[18] Indeed, it is symptomatic that the newly released film “Oppenheimer,” written and directed by Christopher Nolan (2023), focuses predominantly on the theories of fission and fusion as well as the themes of political loyalties and spying.[19] The cyclotrons, the first pile reactor, the wires—the whole material apparatus that enabled the bomb—make only brief appearances in the background of a theoretical and human drama.
The reviews highlight distinct themes and together demonstrate just what a valuable work Restricted Data is. They offer much food for thought, not only to historians of nuclear power, but also to all those interested in the development of the late modern government of liberal democracies. I would like to see the study of nuclear secrecy extend from the creation of the bomb and non-proliferation to the creation of secrecy around uranium mining, nuclear testing, and nuclear decommissioning, so as to embrace the full cycle of nuclear technology. After all, as Wellerstein writes in his response, the institutional configurations of secrecy are embedded in the pragmatics of action, where the goals and rationales will change depending on the context. Wellerstein argues that there is no ready-made, “nuclear” or “Cold War” regime of secrecy. Instead, performing secrecy is part of performing power, a process whose genealogy goes back in time as far as scholars care to look.
Contributors:
Alex Wellerstein is Associate Professor and Director of Science and Technology Studies at Stevens Institute of Technology in Hoboken, New Jersey. Prior to becoming a professor, he was an Associate Historian at the Center for the History of Physics at the American Institute of Physics, and was prior to that a postdoctoral fellow at the Belfer Center’s Managing the Atom and International Security Programs at the Harvard Kennedy School. He received his PhD from the Department of History of Science at Harvard University in 2010, and received a BA in History from the University of California, Berkeley, in 2002.
Eglė Rindzevičiūtė is Associate Professor of Criminology and Sociology in the Department of Criminology, Politics and Sociology, Kingston University London, the UK. She is the author of The Power of Systems: How Policy Sciences Opened Up the Cold War World (Cornell University Press, 2016) and The Will to Predict: Orchestrating the Future through Science (Cornell University Press, 2023). She is currently writing a book on nuclear cultural heritage.
Michael A. Falcone is Ernest May Fellow in History and Policy at the Harvard Kennedy School. His work has appeared in the L.A. Times and Washington Post, among others. His book project, When Knowledge Became Power: Technology and the Rise of U.S. Global Power, 1940-1960, examines the role of British technological diplomacy in influencing the United States’ transition to global power during and after the Second World War.
Jonathan R. Hunt is Assistant Professor of Strategy at the US Naval War College. He is the co-editor with Simon Miles of The Reagan Moment: America and the World in the 1980s (Cornell University Press, 2021) and the author of The Nuclear Club: How America and the World Governed the Atom from Hiroshima to Vietnam (Stanford University Press, 2022).
Sam Lebovic is Professor of History at George Mason University. He is the author of Free Speech and Unfree News: The Paradox of Press Freedom in America (Harvard 2016) and A Righteous Smokescreen: Postwar America and the Politics of Cultural Globalization (Chicago 2022). His next book, State of Silence: The Espionage Act and the Rise of America’s Secrecy Regime, will be published by Basic Books in November 2023.
Kaeten Mistry is Associate Professor of American History at the University of East Anglia. Among his publications are The United States, Italy, and the Origins of Cold War: Waging Political Warfare (Cambridge 2014) and, with Hannah Gurman, Whistleblowing Nation: The History of National Security Disclosures and the Cult of State Secrecy (Columbia 2020). He is currently writing a monograph that will be published by Harvard University Press, on the culture of state secrecy in modern America.
Sudha Setty is Dean and Professor of Law, City University of New York (CUNY) School of Law. She is the author of National Security Secrecy: Comparative Effects on Democracy and the Rule of Law (Cambridge University Press 2017), the editor of Constitutions, Security, and the Rule of Law (iDebate 2014), and the author of over two dozen articles and book chapters on comparative national security and rule of law. She currently serves on the editorial board of the Journal of National Security Law and Policy, and has been an elected member of the American Law Institute since 2018.
Review by Michael A. Falcone, Harvard University
I am on a list now for having written this review, and readers are on a list for clicking on it. Alex Wellerstein is likely already on a list. In his case, it could be for his track record of historical expertise in matters of atomic energy and secrecy (of which the excellent Nuclear Secrecy Blog and horrifying-but-useful NUKEMAP are just a primer).[20] These days, however, it is equally likely that the security establishment is interested in him for Restricted Data—an ambitious, edifying, and thought-provoking contribution to the historiography on science and the national security state.[21] Among the book’s important interventions is its consideration of a conundrum that has yet to be fully resolved by scholars of the US state and society: On the one hand, in 2022 I can say “I am on a list” and readers understand what I mean—the will and capacity of the US government to be Orwellian is a commonplace. On the other hand, as Wellerstein shows, the American state had almost no secrecy framework until the 1940s; the one it eventually got was made up on the fly; and its attempts to install and maintain secrecy have always been deeply contested, even during the witch-burning frenzy of the McCarthy era. So how can a security regime, which is today perceived as so totalizing, so panoptic, have been fabricated from almost nothing? How did we get from the one to the other, especially given the stubborn context of devolved US political culture? And how totalizing is it, really? Those are the questions that Restricted Data convincingly addresses.
This is less a book about the evolution of secrecy policy than one about the never-ending contestation of the notion of secrecy itself. Inevitably, then, it is a story of dead-ends and long, grinding debates about policy nuances and procedural amendments that, in many cases, went nowhere after years of effort. But the potential frustration in seeking coherence in this story becomes a lesson in itself: As Wellerstein demonstrates, the slog of red tape and the exasperation of fruitless clashes of incompatible worldviews are actually what nuclear secrecy looks like on the inside. It is a grind that, if far from the dark military-industrial tale we are often told, also seems far from productive or salutary, at least in public policy terms.
Wellerstein chalks up this endless contestation to national characteristics: the US context for these debates matters, he argues, because the United States is the nuclear power that has struggled most with the meaning, implications, and values of nuclear secrecy. Other countries—and not just totalitarian ones but even democracies like France—find the nuclear omerta far easier to assimilate into national life. They seem to accept Langdon Winner’s classic formulation that “if you accept nuclear power plants, you also accept a techno-scientific industrial-military elite. Without these people in charge, you could not have nuclear power.”[22] But the United States, the very inventor of that elite, has never stopped writhing against the technological determinism at the heart of Winner’s notion. Paradoxically, then, the United States produced both McCarthyism and the Smyth Report—that astonishingly complete technical and institutional disclosure of the Manhattan Project, which was released just days after the attack on Nagasaki (98-105).
The presence and persistence of these contradictions suggests that the US national security state has never been as totalizing as hawks and critics alike often think. As Wellerstein guides us through key episodes in nuclear secrecy history, it becomes apparent that many attempts by the Atomic Energy Commission (AEC), the Department of Energy, and the Department of Justice (DOJ) to circle the wagons around “restricted data” have stemmed not from the power of those agencies but from their potential weakness—and, especially, from their knowledge that the whole secrecy regime could be dismantled overnight if American courts were to use the government’s own laws against it. Security administrators and their detractors, in other words, have seemed always to sense that the tools to take down the secrecy apparatus are already vested in the state and constitution.
The book makes a convincing case of this by detailing flashpoints that reveal obvious shortcomings in the security framework vis-à-vis the justice system. To maintain secrecy during the 1951 espionage trial of Ethel and Julius Rosenberg, for example, government prosecutors who were bound up by legal and security red tape had to rely on stoking the emotions and paranoias of the public and jury, on hoping the defense did not probe too much on certain subjects, and on trusting (in vain) that the press would not print too much of what it heard. In this light, the actors behind the secrecy regime look almost like the Freemasons, their power derived from the control one thinks they have, rather than the dreary bureaucratic reality.
It is important, however, not to overstate the limitations. Monolithic regime or not, the Rosenbergs were executed, the midcentury Lavender Scare did successfully persecute LGBTQ+ scientists, and former Manhattan Project leader Robert Oppenheimer did get publicly disgraced. Those outcomes—and the means by which administrators effectuated them using a limited toolkit—remind us that national security is a practice, rather than a thing, and it can often manifest through the misuse of bureaucracies, legal arrangements, and regulations, rather than through design. Restricted Data thus gives us important food for thought on how we, as scholars, seek to characterize and label institutions and the people inside them, particularly when we fall into the trap of valuing essences (like “the censor”) over outcomes.
Further breaking down commonly held perceptions about secrecy, Restricted Data highlights that the most volatile tensions between secrecy and openness have frequently occurred within the state, rather than between state actors and civil society. Indeed, the press and public often come across badly in this story. The journalist’s nose for drama usually blunted sustained public interest in the dry scientific and engineering minutiae at the heart of nuclear decision-making. As Wellerstein details, nuanced attempts to shift nuclear energy towards something cooperative and multilateral were often met with yawns by the press, whose members were not interested, for example, in the positivist world vision of Atoms for Peace, but rather in the titillation of whether the program might give away America’s precious “secret” (275).
Moreover, the book cautions its readers not to underestimate the appetite of both Congress and the public for violence and potential atrocity. Once the populace learned of the existence of a potential H-bomb project, for example, “there was hardly any position [President Harry Truman] could take but to approve it,” so great was the hunger to stay “ahead” and to have bigger and more powerful weapons (220). The moral and genocidal implications of thermonuclear bombs did nothing to stymie the fervor, suggesting that openness is a quite distinct variable from pacifism.
The exception to this public complicity is what Wellerstein calls the anti-secrecy movement. Emerging from the late-60s and early-70s anti-establishment zeitgeist, and using the leaked Pentagon Papers as its lodestar, anti-secrecy waged a protracted struggle to disinfect the national security state with sunshine. Leaks, scientific interviews, probing study of public documents, crowdsourced “hydrogen bomb collegiate design contest[s],” and other tactics became a continual nuisance to national security administrators, and they evolved into the culture of watchdog journalism, Open-Source Intelligence, and “secret seeking” that we have today (337-362). As a result, an issue originally focused on the openness of science came to recenter on the volcanic political question of freedom of speech.
Wellerstein’s narrative of these ongoing battles between secrecy and its latter-day discontents provides perhaps the best window of all into not just the information practices of secrecy but its philosophical and ideological implications. In the case of activist Howard Morland, for example, most of the data he amassed came from encyclopedias, but in order to control him, the government had to declare how accurate and dangerous his information nevertheless was. Cases like these went (and continue to go) in circles, and their endpoint is mosaic theory, which holds that a member of the lay public can possess classified data by being good enough at assembling unclassified data. That argument gives the government, in the words of scholar David Pozen, an “unfalsifiable” claim with which to assert and abuse legal privilege.[23] (Other circular classification formulations highlighted by Wellerstein include this 1947 gem from the AEC: “The areas in which we can make no comment are as follows…” [191]).
There is also rich potential here to analyze these radical anti-secrecy activists in the context of existing science and technology studies (STS) literature on hackers and hacker culture.[24] Indeed, the parallels are striking between the exploits of nuclear anti-establishmentarians like Morland, Chuck Hansen, and Keith Brueckner and those of the early dissidents of Silicon Valley, who were driven by many of the same ideological impulses as the anti-nuclear “secret seekers,” resorted to some of the same tactics, communicated using the same means (campus newspapers), moved in the same spaces (tech-focused universities and clubs for anti-establishment ‘outsiders’), and were driven by similar incentives of play and peer recognition. Morland and Hansen did not want to build an H-bomb, after all—rather, they wanted simply to puncture and deflate the establishment, reveling in the raw desire to untangle the “magician’s tricks” of science (365). The nonconformists who wrested the world of computers away from its military-dominated status quo thought along much the same lines—and were similarly problematic in their sociocultural, racial, and gender insularity.
It is also worth mentioning (albeit straying outside the realm of nuclear) that for all the epistemological changes detailed here—and particularly the transformation of the concept of ‘secret material’ into a non-material abstraction—many of the most high-profile secrecy controversies in recent decades (think Wikileaks, the Panama Papers, and the battle between the DOJ and James Risen at the New York Times over source anonymity) actually pertain to good old-fashioned document-stealing (or, as Risen’s charge sheet put it, the “unauthorized conveyance of government property”).[25] Given Wellerstein’s argument that this history is about the constant interrogation of what secrecy means, it is perhaps fitting that some of today’s most explosive leak scandals have circled back to a classification formulation first used during World War I (31).
Beyond the link with hackers, Wellerstein’s work is so authoritative and elaborate that many of the historical themes it touches on could merit whole other monographs. More study would be welcome, for example, on who populated bodies like the AEC’s Office of Technical and Public Information. As the book says, national security administrators tried at first to separate the roles of security agents and scientists, but the politics of secrecy and the ever-growing reach of nuclear energy and its discourses meant that the two categories became hopelessly blurred. The question, then, is who did this kind of work, in the mid-level, technocratic arena of implementation? What were their political backgrounds? Was there a revolving door between the AEC and the nongovernment laboratory, or was redaction a second-tier career? What are the implications of that? Drilling further into the recruitment, training, retention, and oversight of what we might call ‘the redactor class’ could provide further insight into the everyday practice and institutional entrenchment of secrecy.
It would also be useful to know more about how nuclear secrecy did or did not operate in a vacuum vis-à-vis other forms of secrecy. Wellerstein’s work naturally stokes curiosity, for example, about the presence of feedback loops between military and civilian security regimes. Moreover, the notion of secrecy practices spilling across institutional lines also applies in a transnational register—what, for example, was the role of British precedent in the evolution of US secrecy? As the book mentions, the United States’ first secrecy acts were directly modelled on British ones (30). And Anglo-American emulation continued in the formation of bodies like the National Security Council.[26] Given how tightly intertwined the US and UK nuclear programs usually were; given how critical European expatriates were to US atomic research; given that it was British impetus that kickstarted the Manhattan Project in the first place; and given that (as Wellerstein notes) the bilateral agreement with Britain necessitated a surprising degree of mutuality before disclosures like the Smyth Report could be made, the issue of mutual learning on secrecy practices takes on an intriguing relevance.
And what of flows in the other direction? How did US ideas and precedents influence other states’ secrecy regimes? There is a rich seam here to be mined by scholars of the transnational movement of science. Wellerstein briefly discusses American administrators’ attempts to impose domestic “restricted data” regulations on allies like Germany and the Netherlands, for example (294). But what about emulation—information flows in which bureaucratic organization for secrecy itself becomes a form of “tacit knowledge” for new and aspiring nuclear states? Put another way, how did the manner in which the US cobbled together its nuclear secrecy apparatus cause that apparatus to seem natural and automatic, even in other countries? How did the relationship between atom-splitting and the growth of an atomic elite become self-fulfilling, as Winner suggested?
Perhaps, in the end, Restricted Data is an invitation for renewed engagement in public policy. The story here suggests that if politics and civil society cannot meaningfully engage with these issues, then they get the technocratic secrecy regime they deserve. Wellerstein rightly gestures at contemporary parallels in his conclusion, and indeed, the urgency that drove the makeshift secrecy of the Manhattan Project looks decreasingly sui generis as time goes on and as new technologies emerge that potentially pose just as great a set of policy conundrums as nuclear. Does Winner’s formulation about atomic power and technocracy apply to artificial intelligence (AI) and biotechnology?
Of course, the question of the appropriateness and limits of states policing risky scientific fields has been a fraught one for decades. Witness the striking parallels between the 1939 Szilard-Joliot controversy over scientific self-policing and the 1975 Asilomar conference on the new technique of recombinant DNA. In the latter case, just as in 1939, the absence of state regulation led a group of individual scientists to raise the alarm about certain directions of research (in this case, the possibility of uncontainable genetic mutation). And just as in 1939, their call for a voluntary moratorium on research and development (R&D) met with fierce pushback from other scientists, who railed against perceived violations of scientific freedom. Unlike in 1939, however, state actors largely stayed out of it, and the result was an antagonistic conference, a narrow set of voluntary technical guidelines, and an eschewal of any real engagement with the responsibility or ethics of the issue. R&D sallied forth free of specialized restrictions, and the $750 billion biotech industry was born.[27] Was that too ‘open’ a result?
The sequel came in 2015, when concerned scientists raised new alarm at the potential impropriety of germline genetic modification—but this time, even the concerned scientists stopped short of looking too deeply into the question of ethics. No call came for even a consensus-driven set of research guardrails, and the momentum for scientific self-regulation slowed. Three years later, the bombshell news broke that Shenzhen geneticist He Jiankui had sloppily used CRISPR technology to alter the genes of twin baby girls, stoking global outrage. In the aftermath of the scandal, the Chinese government responded by mulling use of a very blunt hammer: draconian regulations across a wide swath of biomedical science, which could have the result of hampering important research even in uncontroversial areas.[28] Is that result too ‘closed?’
In both cases, the 1945 exhortations of Niels Bohr and the science administrators Vannevar Bush and James Conant that revolutionary new technologies might “require a remaking of the world…[and that] international politics and industrial practices would both have to be forever changed” (139) still echo today. But so do their failures to bring such change about. Effective frameworks for governing risky technologies require something beyond the dialectic that has defined nuclear secrecy—the needle swinging violently between policy and protest, action and reaction, fiat and resistance. Rather, such frameworks require constructive, nuanced, and society-wide political engagement. If that does not happen, then in arenas like AI and gene editing we might be doomed not only to the wild cycles of punch and counterpunch between secrecy and openness that are detailed so commandingly in Restricted Data, but also to the force that inexorably fills the void between them: an entrenched, permanent, and inflexible technocracy, one that long outlives the technologies and people it is ostensibly there to protect.
Review by Jonathan Hunt, US Naval War College[29]
Historians have long pondered what Sherlock Holmes’s “dog that did not bark”—referred to as “the curious incident of the dog in the night-time” in Arthur Conan Doyle’s Silver Blaze with reference to a canine that remains mute as a prize racehorse is stolen—reveals about nuclear science and technology.[30] After all, the most interesting fact about nuclear weapons is what they have not done—mercifully, Hiroshima and Nagasaki remain the last and only atomic bombings.
But what about dogs whose barks were redacted by government censors? Alex Wellerstein’s Restricted Data pursues this elusive quarry: the secrecy regime that enshrouded the Manhattan Project and thereafter the US national-security state. The result is a wide-ranging, confident, and often surprising meditation on how the broken nucleus led scientists and officials to classify whole branches of natural knowledge in the name of national security and contrary to the professed principles of an open society.
Wellerstein is a companionable guide to an arcane realm. He has earned a cult following through his blog (same title as this book) and NukeMap, an online tool that lets aspiring wizards of Armageddon visualize what Little Boy (the first atom bomb) or Ivy MIKE (the first hydrogen bomb) would do to their favorite (or least favorite) town. It’s a lifesaver for those who teach Nuclear Weapons 101 and a timewaster for everyone else. As of July 10, 2021, the website counted 223.9 million virtual detonations since 2012.
Nuclear secrecy is a subject with which most historians engage only in their efforts to puncture it by dint of Freedom of Information Act (FOIA) requests.[31] The National Security Archive at George Washington University derives its raison d’être from this activity. Wellerstein is more dispassionate, crediting the “Cold War regime” of secrecy, which sent Julius and Ethel Rosenberg to the electric chair, with disseminating “as widely as possible” any data “deemed safe” (7). His acknowledgement that US declassification efforts have outstripped those of fellow nuclear club members rings true. However frustrating FOIA requests may be, regulations around the storage and transmission of classified documents have had one positive effect: the executive branch is legally required to preserve its secret papers. So much so, in fact, that administrations have found ever more inventive ways of bypassing FOIA requirements, whether by using home e-mail servers or by the time-honored approach of not writing things down.
Wellerstein identifies “tension with democratic desires” at the boundary of the classified and the public as the central dynamic of secrecy since before the Manhattan Project. It was neither paranoid politicians nor risk-averse bureaucrats who introduced atomic censorship, but rather scientists such as the Hungarian polymath Leo Szilard and the Italian Nobel laureate Enrico Fermi (both in flight from fascist Europe). The power to suppress weaponizable data quickly slipped their grasp. Szilard later dubbed the “SECRET stamp … the most powerful weapon ever invented”—an epigram made all the more memorable by an earlier reference to physicist C. P. Baker’s plea, “WE NEED A STAMP,” after having written “SECRET” on every page of a technical manual by hand (3-4).
Manhattan Project chief operating officer Army Brigadier General Leslie Groves systematized secrecy in the United States, introducing such terms as “Top Secret” and “over-classification,” as well as such practices as compartmentalization, secrecy oaths, codenames, misinformation, and background checks. Wellerstein makes clear that the project’s reputation as “the best-kept secret of the war” told half the story. Groves later admitted to the US Congress that his censors were better at preventing leaks to the media, to allied nations, and to legislators, than at catching spies such as Ethel Rosenberg’s brother David Greenglass or British physicist Klaus Fuchs. The presumed race with Nazi Germany had come first. While mass recruitment of physicists and chemists limited those who were capable of uncovering the project’s secret purpose, it also let in not a few moles.
What the author of the Manhattan Project’s official history, Henry De Wolf Smyth, styled “the problem of secrecy” became even more challenging after August 1945 (86). Where Groves had sequestered freethinking scientists behind federal contracts and barbed-wire fences, after the bombings they were let loose on university campuses to train the next generation of scientists and engineers. While Groves sought to control atomic energy by monopolizing uranium and pushing a massive “Publicity” campaign with the complicity of New York Times science journalist William Laurence, Niels Bohr (the Danish theoretical physicist whose quantum model had helped inspire the wartime breakthrough), J. Robert Oppenheimer, Los Alamos’s scientific director, and likeminded allies promoted transparency and international control of atomic energy on the grounds that a state could never monopolize nature’s inner mysteries.
With fresh memories of President Franklin D. Roosevelt’s New Deal, in peacetime the US Congress turned a skeptical eye on a state-dominated nuclear sector. Events intervened, not for the last time, as the uncovering of a Soviet spy ring in Canada led Capitol Hill to stiffen punishments for disclosures of “restricted data.” This phrase, which the 1946 Atomic Energy Act would define as “all data concerning the manufacture or utilization of atomic weapons, the production of fissionable material, or the use of fissionable material in the production of power,” lacked constitutional precedent (154).[32] Under this definition, nuclear secrets were not made; they were born that way.
The remainder of the book traces repeated challenges to this secrecy regime, from within and without. Unwelcome developments repeatedly frustrated reform. Early efforts to emphasize “declassification” were sidelined. David Lilienthal, the first chairman of the US Atomic Energy Commission (AEC), found his attempt to lower information drawbridges undone by a series of hammer blows: the first Soviet nuclear test in 1949, the Klaus Fuchs espionage affair, and malicious leaks from internal debates over development of a “Super” weapon—the hydrogen bomb. The odds of reform were never high. In a telling moment, Wellerstein relates that the Committee on Declassification’s initial report was itself classified.
At times, his judgments feel uncharitable. The miseducation of Lilienthal, who began his tenure committed to democratic public deliberation, is easier to grasp when early Cold War paranoia is borne in mind. Lilienthal was accountable to the inquisitorial and turf-minded Joint Committee on Atomic Energy (JCAE). His successor, Gordon Dean, comes off better in Wellerstein’s eyes largely because he never claimed to be anything but a by-the-book lawyer. Are “lofty ideals” (269) not in the final analysis preferable to no ideals, even dressed in a gray-flannelled suit?
After all, Lilienthal and Oppenheimer’s push to focus secrecy on information that might facilitate the acquisition of nuclear weapons by foreign powers anticipated the US government’s dawning focus on nuclear nonproliferation. Oppenheimer became one of the Red Scare’s most infamous casualties, as the “Super” debate set in motion his public humiliation via that mainstay of secrecy regimes: a security hearing to review his “Q” clearance. (Among the many morsels of trivia served up by Restricted Data is that “Q” originated in the acronym for “personal security questionnaire,” 193-194).
Dean’s successor, Lewis Strauss, the villain of Oppenheimer’s show trial, consolidated a Cold War regime with a “deliberately ambiguous mandate”—uphold national security while releasing commercially or diplomatically beneficial information. Together the revised US Atomic Energy Act and President Dwight D. Eisenhower’s “Atoms for Peace” program institutionalized and internationalized this approach in 1954, which would “propel declassification efforts well beyond anything the postwar reformers could have imagined, in the name of generating peaceful and cheap electricity” (233)—also the subject of Jacob Darwin Hamblin’s The Wretched Atom.[33] To win hearts and minds the following year at the First International Conference on Peaceful Uses of Atomic Energy in Geneva, Switzerland, the AEC declassified reams of studies for a sixteen-volume conference set. The larger Second International Conference three years later yielded thirty-three more volumes on subjects ranging from basic chemistry and physics to reactor principles, and from the geology of uranium and thorium to radiation effects on plants, animals, and people.
The final third of the book addresses “ruptures” in that private-public regime that would multiply in the aftermath of the Pentagon Papers and Watergate (283). The blurred line between peaceful and non-peaceful work yielded issues related to increasingly compact uranium-enrichment centrifuges and fusion reactors whose physical principles mirrored those of hydrogen bombs. The case of Abdul Qadeer Khan, who disseminated blueprints stolen from a Dutch contractor working for URENCO (a British/Dutch/German private-public uranium-enrichment consortium), attested to the difficulty of governing a nuclear trade in which states played leading roles. Pakistan, Iran, Libya, and North Korea would all patronize Khan’s black market, and all four established nuclear-weapons programs, with Pakistan and North Korea succeeding in their endeavors.
Meanwhile, the AEC’s successor, the US Department of Energy (DOE), weathered frontal assaults from “anti-secrecy” campaigners against its “crown jewel”—the Teller-Ulam design for thermonuclear explosives. Wellerstein compellingly recreates the detective work of Howard Morland, a journalist, to reverse-engineer H-bomb secrets from public sources ranging from the Encyclopedia Americana to the highly technical Effects of Nuclear Weapons.[34] When The Progressive, a radical magazine, moved to publish his article with accompanying faux-blueprint illustrations, the DOE faced a dilemma: censor the publication via prior restraint and risk tacitly authenticating it (the dreaded “Streisand effect” whereby censorship merely draws attention to a controversy) or stay quiet and hope that the tempest passed.[35] Its leadership did not anticipate that The Progressive’s editors would seek out censorship with the aim of drawing attention to themselves and their anti-secrecy cause. When the DOE offered to “sanitize” the article, they “took the bait” (348-9). Although United States v. The Progressive never acquired the notoriety of the Pentagon Papers case, New York Times Co. v. United States, it did put the concept of “restricted data” to an unprecedented legal test before the government abandoned the case.
A brief final chapter on “Nuclear Secrecy and Openness after the Cold War” recounts the largely forgotten labors of Hazel O’Leary under President Bill Clinton, when the Office of Classification was rebranded the Office of Declassification. Dueling congressional reports from the Commission on Protecting and Reducing Government Secrecy under conservative Democratic Senator Daniel Patrick Moynihan and the partisan Cox Committee, formed to investigate accusations of Chinese Communist Party espionage, established the new parameters of secrecy talk. Wellerstein concludes that the rhetoric of secrecy continues to surprise in an era of resurgent conspiracy thinking and “newfound powers of scrutiny and observation” (388). With the White House and the Pentagon under presidents Donald Trump and Joe Biden beating the drums of “great-power” and “strategic” competition with the People’s Republic of China, the friction between openness, distrust in elite institutions, and the raison d’état will continue to throw up sparks. One can only hope that secrecy laws will preserve the documents that historians need to do their jobs even as the watchdogs of democracy challenge government secrecy and surveillance at every turn.
Review by Sam Lebovic, George Mason University, and Kaeten Mistry, University of East Anglia
“Science, Secrets, and the State”
The challenges to writing a history of nuclear secrecy are considerable. To do the job well, one must master not only the science, but also the bureaucratic and legal machinery. To understand how these things evolved over time means tracking arcane and technical (and technocratic) debates across multiple bureaucratic and institutional sites—an archival task that is frankly intimidating even to imagine. And reading all that dry, yellowing paper is the best-case scenario. When the subject matter is something as highly classified as nuclear secrecy, there are inevitable gaps in the record, things that cannot be known. Where a word or a paragraph has remained stubbornly redacted even decades after the fact, as Alex Wellerstein notes wryly, the gaps are sometimes literally in the middle of a document.
Luckily for us, Wellerstein is a remarkably skilled historian of the scientific bureaucracy. Despite all the hurdles, he has given us, with great clarity, an account that recreates the improvised, experimental, and evolutionary construction of the nuclear secrecy regime. Secrecy emerges here not as the result of a functionalist drive to bureaucratic rationality or as a geopolitical imperative produced by runaway scientific breakthroughs, but as the product of quotidian politics. It was shaped by clashing personalities and by tired, frustrated employees who made it up as they went along. The reader can sympathize, for instance, with a Cornell physicist who, when forced to write “secret” at the top of every page of a long 1942 report, added a final note on the last page: “we need a stamp” (40). The process could have unintended consequences, as revealed by the Smyth report, an official history of the Manhattan Project published after the war in an effort to give the public enough information that it would not demand that others go digging for more. While the report emphasized the physics of bomb development because that was less sensitive than the more crucial engineering processes, it ironically added to the public assumption that there was some “secret” to the bomb hidden in the darkest corners of atomic science.
Wellerstein is particularly good at explaining how institutional logics emerged out of the granular problem of trying to keep a domain of information secret. He provides a terrific account, for instance, of the drafting of the Atomic Energy Act, showing how its remarkably sweeping rules about “Restricted Data” emerged out of a mess of political imperatives. He gives us, too, the best account we have read of the dynamics of the failed effort to block The Progressive magazine from publishing nuclear “secrets” in 1979. Wellerstein walks the reader through the difficulties of trying to close the barn door when so much information was already free-range—so much, in fact, that it turned out to be hard for bureaucrats to work out what was still supposed to be secret. “Nobody really knew, and nobody could know, what was out there,” confessed one frustrated Department of Justice (DOJ) lawyer shortly before the government abandoned the case (361).
The adept handling of these processes means that Wellerstein expertly guides us through some of the historiographical histrionics of the subject. His treatment of the atomic spies clearly explains what Klaus Fuchs et al. actually did and, moreover, the effect of their actions.[36] Espionage did not speed up the Soviet acquisition of the atomic bomb by much, if at all, but the episodes had a crucial legacy for the popular understanding—fueled by political rhetoric—of enemies stealing American secrets (228). Not for the first time, the disconnect between the science and public perception around nuclear issues is striking. While this is not a book about political philosophy, Wellerstein’s conclusions speak to the question of secrets themselves. Is there such a thing as a ‘nuclear secret’ and can it be regulated by a secrecy regime? Most of the information for the atomic bomb was based on universal, open, and shared scientific knowledge. There are some secrets concerning the Teller-Ulam design for thermonuclear weapons. Yet secrecy has not stopped the development or spread of nuclear weapons, where the hurdles are more material and political than a matter of gaining access to secrets per se. “Ultimately,” Wellerstein concludes, “the value of ‘secrets’ appears overblown” (407).
There is of course a tradeoff to Wellerstein’s measured scholarship. The detailed exploration of the scientific bureaucracy and the myriad complexities of nuclear science make this a long book (just shy of 500 pages including notes). The analytical focus is primarily on the three decades after 1939. Such detail means that we don’t reach 1950 until page 233; the half-century up to the present is covered in a relatively quick-fire and patchy fashion. A book cannot cover everything, of course, and choices must be made, although the imbalance is telling.
More importantly, the author’s scholarly caution means that Restricted Data does not have a big, splashy argument. “The singular motif that reappears throughout this work,” notes Wellerstein at the outset, “is that of tension” (3). More curiously, when reflecting on the big question of whether nuclear secrecy actually works, he more or less throws up his hands: “As someone who has thought about this for a very long time now, I admit: I still don’t know, and I am not sure I ever will” (405). How one feels about such a conclusion is likely to depend on personal taste. In a culture of hot takes, this approach is refreshing. Nonetheless, it is perhaps not the payoff we might hope for, given Wellerstein’s unparalleled knowledge of the institutional logics.
The circumspection seems to reflect the ambivalence of Wellerstein’s main protagonists. This is a history of nuclear secrecy written in the frame of the history of science, and largely from the perspective of scientists. Caution, prudence, and dedication to the ethics and practice of science are the order of the day. That is, of course, a necessary part of the story. But the history of nuclear secrecy is also a story of democracy, of the problems raised by a secretive military and foreign policy infrastructure. Nuclear secrets are one, albeit unique, part of a broader secrecy regime. We want to suggest here two places where thinking about the story in this wider frame might help generate further insight.
The first comes in the formative period of the 1940s. The decision to keep the bomb secret was a function of two questions that were so tightly coiled around each other that it is still hard to think them through separately. Was the bomb a technology of an existentially different order to previous weapons, one whose apocalyptic terror required a new regime of non-proliferation? And if so, could any single nation-state, or even any one alliance of nations, be entrusted with monopoly power over that weapon? For the architects of atomic secrecy, the answers to both of these questions were “yes.” The bomb was different, and only a virtuous, liberal hegemon like the US could be trusted to possess it. Ergo, there was a lot of secrecy.
Wellerstein gives us reason to think that the bomb was not, at first, quite so obviously an existential breach with what had come before. The first decisions to keep the project secret, he points out, were made at a time when the bomb was a pipe-dream; it was not yet certain that it would work, let alone pose a threat greater than, say, firebombing at scale. In this light, it was less the nature of the bomb that drove secrecy than the ideologies surrounding it, which presumed that there were military secrets that had to be kept from enemies, and that for this reason also had to be kept from the American public. If the fear of information spilling from the inside of the nascent national security bureaucracy to its outside pre-dated the bomb, it suggests that the secrecy which accompanied the bomb was a product not of the technology itself, but of a series of ideological assumptions. Wellerstein does not give us much of a genealogy of national security secrecy in domains other than the nuclear—for example, fears about preserving the secret of the Norden bombsight in the 1930s—which is perhaps fair enough: no one can do everything. But placing the bomb in that lineage does raise questions about how exceptional the bomb was, and whether it might have been possible, with different political commitments, to manage the technology in a more transparent fashion.
The stakes of the decision to try to prevent proliferation by controlling knowledge were astonishingly high, as Wellerstein acknowledges in his conclusion. The effort to preserve the secret of the bomb played a central role in bringing into being the very kind of threatening, zero-sum, Hobbesian world order that the logic of secrecy presumed. Trying, and failing, to keep the bomb secret contributed to the souring of US-Soviet relations at the dawn of the Cold War, to the rise of the Second Red Scare, to the legitimation of a broader culture of national security secrecy, to the creation of a disconnect between the US public and its state, a space into which flowed all manner of conspiracy theories.
Was there another path? Wellerstein acknowledges that the best hope for an off-ramp came in the flux of the 1940s. But Wellerstein never really considers whether such an off-ramp could have been built, in large part because he assumes that “no workable system of international control” could ever have been developed. It is a big, important assumption. “Had international control somehow been implemented much earlier,” he concedes, “it is possible to imagine a very different twentieth century” (411).
It is not exactly clear why Wellerstein concludes that international control was such a non-starter. To be sure, it would have taken a lot of politics, a lot of clever thinking, and a lot of negotiating. But some of the best minds of the era were invested in precisely such a system, and even if only to historicize what did emerge, it seems worthwhile to explore what such a system could have looked like. Yet Wellerstein discusses plans for international control only in passing, dismissing physicist Niels Bohr’s 1944 proposals as perhaps “naively idealistic” and Bohr as “probably the worst ambassador for science one could imagine” (137). All that may be true, yet Bohr was not the only one thinking about international control. Wellerstein’s reasons for focusing on Bohr reflect his broader methodological disposition to focus on the scientific bureaucracy—it was Bohr who had the inside route, Bohr who presented international control to the powers-that-be, and therefore Bohr’s inadequacies which helped consign it to the dustbin of history. Whether that was inevitable or a quirk of fate, whether the problem was with the message or the messenger, is unclear. If Wellerstein had broadened the lens of his book, one wonders whether a clearer story about the possibilities and limits of international control might have emerged. But that would have been a story about democracy and diplomacy as much as a story about science.[37]
The limitations of the narrative frame also appear in a second moment, the 1970s and 1980s, when, with the Cold War consensus fractured, secrecy was debated in the United States in an unprecedented manner. Wellerstein correctly notes that public cynicism toward official secrecy stemmed principally from disillusionment around the Vietnam War and Watergate. Yet this poses a problem for any assessment of how nuclear secrets contributed to wider discussions about security and liberty. “By the end of the 1970s, the politics of secrecy had undergone a transformation in the United States,” acknowledges Wellerstein. “This had less to do with the atomic bomb than other factors” (335). In short, the bomb’s centrality to the story fizzles out. This is where the exceptionalism of nuclear secrets comes into sharpest focus, making it difficult to consider them in relation to the broader culture of national security secrecy. While there are obvious overlaps, these are essentially two distinct types of secrets. The “Restricted Data” categorization—where all information related to nuclear weapons is ‘born secret’—stands apart from the rest of the secrecy regime, which is rooted in classification. Indeed, the key shifts in public opinion came on the back of exposures of classified national security information—journalistic exposés of government surveillance and funding of private groups, leaks like the Pentagon Papers, investigations by Congress, and exposures by CIA whistleblowers[38]—rather than disclosures of Restricted Data.
Wellerstein’s main attempt to place atomic exposures in the zeitgeist revolves around the concept of “anti-secrecy” (335). The notion is a little undeveloped, but he implies that “anti-secrecy” is an intentionally hostile approach that rejects the underlying premise of secrecy. It is “the revolution necessary before contemplating what the new order might be.” Wellerstein seeks to distinguish it from reform—fixing glitches in the system—and transparency, which rebuts secrecy by embracing the opposite (335-6). The skepticism toward mainstream liberal reform efforts is understandable. Even so, the nature of “anti-secrecy” might be clearer if the concept had been developed in closer dialogue with a critical transparency literature that questions the scale and upside of openness.[39] Moreover, the radicalism of “anti-secrecy” is curiously narrated through examples of kooky amateur sleuths trying to publish bomb plans from public sources (chapter 8). There is a reason why computer programmers like Charles “Chuck” Hansen are not widely known in the public sphere, nor did they become a bête noire for US officials in the way that the Pentagon Papers whistleblower Daniel Ellsberg did. Indeed, the implication is that the Hansens of the era took the Ellsbergian impulse of disclosing lies and wrongdoing to an unproductive extreme via “secret seeking” (337, 371), a form of exposure as political action.
Wellerstein depicts “anti-secrecy” as a crude overreaction, in part by minimizing the escalation of secrecy during the Reagan administration. Reagan, he suggests, was not “uniquely secretive” given what had come before (380), and the protests of activists only make sense if they are viewed as registering a cultural shift in the “public discourse” (335). There may be something to this, but it is hard to get a full grasp of the dynamics through the subject of nuclear secrecy. The real action was elsewhere, in the proliferation of new techniques for policing non-nuclear secrecy: mandatory non-disclosure agreements; the 1982 Intelligence Identities Protection Act; the use of polygraphs; reforms to the Freedom of Information Act (FOIA) and classification rules; and prepublication review. These measures, supported by Congress and the courts, were prominent in the struggles over secrecy and transparency. Equating critics with a handful of bomb enthusiasts is somewhat unfair. It also runs the risk of making the transparency activists, rather than the secretive hawks in the Reagan administration, seem to be the new extremists. But to tell that story would require, once again, embedding the history of nuclear secrecy in a broader history of national security secrecy.
Since the 1940s, the idea that there are nuclear secrets has helped to legitimize an extraordinary secrecy regime in the United States. The standard political calculus has set up a need to balance “liberty” and “security”—every advocate of transparency, reform, or any challenge to secrecy needs to take pains to clarify that there are, of course, some things that need to be secret. Few examples illustrate the axiom better than nuclear secrets. By giving us our best account of the nuclear secrecy regime, Wellerstein has laid down an important plank for those who are interested in rethinking such assumptions. Nuclear secrecy, he shows us, was developed in a quotidian, bureaucratic fashion. There may be some things that truly need to be kept secret—launch codes, details of bomb design—and which the public does not need to know in order to conduct a democratic debate about the development of such fearsome weapons. But a whole range of other subjects and information that really are important to the democratic control of nuclear technology have also ended up as ‘secrets.’ Understanding how that happened, and how the line between secrecy and transparency could be redrawn or even reconceptualized, will ultimately require embedding the history of nuclear secrets more firmly in the history of democracy. Those taking up that project will be deeply indebted to Wellerstein’s work, which has done an invaluable service in documenting the complex world of nuclear secrecy and bureaucratized science, and in making its internal processes legible to the rest of us.
Review by Sudha Setty, City University of New York (CUNY) School of Law
Alex Wellerstein offers an engaging history of the secrecy that developed and has been sustained around nuclear technology and weaponry, and asks the difficult and perhaps unanswerable question of how much of our norms, traditions, and values we are willing to sacrifice in the name of fear.[40]
Wellerstein begins by focusing on the prolonged and wrenching internal struggles of the World War II-era scientists who worked on nuclear technology, who had to manage the tension between sharing that information publicly (or at least within some circles of the scientific community), as had been the standard practice for scientists with regard to new discoveries, and bowing to political pressure and existential fear around the use of nuclear weapons by keeping the information secret. This work is illuminating and provides historical context and compelling human stories that help readers outside of the scientific community better understand the tensions that scientists felt at the time and as the Cold War developed. When Wellerstein outlines similar tensions in later decades, we have been primed by the detailed descriptions of the conversations in earlier years to understand that those stories, even if told more briefly, no doubt contain the same complexities and hard choices that he shares elsewhere.
Wellerstein frames his exploration around parallels in the ways in which even arguably necessary secrecy may corrode the ability of scientists to conduct, and trust in, their work, and may likewise corrode the foundations of transparency on which liberal democracies are aspirationally built. These parallels are tantalizing to those of us who are steeped in questions surrounding government secrecy and how this issue impacts civil and human rights, as well as what it means to live in a free society, but they remain only lightly explored in the text. Nonetheless, Wellerstein’s ultimate conclusions—that the story of nuclear secrecy and that of maintaining secrets in a democratic society lead to a “messy story” that is “uncomfortable and often regretted”—seem accurate and drawn from a credible place given his deep knowledge of the issues that he does write about in depth (5). I find Wellerstein’s historical account compelling, and it left me wanting to learn more about the linkages that he alludes to in his framing, which is perhaps more of a compliment to his work than a critique.
As I reflect on the parallels that Wellerstein frames, I focus on those scholars who have detailed the ways in which the professionalization of the military, and the concentration of security decision-making in the executive branch and administrative agencies, increased significantly during the late 1930s and the World War II era,[41] roughly the same era in which scientific developments regarding nuclear technology were being taken into account in the military’s strategic planning process. This concentration of decision-making authority was a planned feature of the rise of the administrative state that came with the New Deal, the implementation of which reflected the public’s experiences with the Great Depression and the concomitant willingness to cede decision-making authority to agencies in which experts could deal with complicated questions that implicated the entire nation more efficiently than state governments, Congress, or the courts could. This new structural arrangement, at a time of extraordinary upheaval in US society, constituted a significant shift in the public understanding of the role of the executive branch, such that being safe meant trusting the federal government to make far more decisions as to what constituted “security” and how to maintain it. It is clear from Wellerstein’s account that nuclear scientists were caught up in this shift, and that their work reinforced arguments in favor of this concentration of decision-making authority.
In other areas of the federal government—we can think about the Department of Agriculture and a significant increase in administrative decision-making authority regarding the wheat market during the Depression[42]—this increased power to regulate did not necessarily lead to an increase in secrecy. But that is not the case with regard to matters of national security. The concentration of security-related decision-making during World War II and the early Cold War era subverted the fundamental democratic principle that the public, the courts, and Congress have the right to information about security decision-making. Again, we see from Wellerstein’s account that the scientific developments around nuclear weapons only strengthened the arguments of those who wanted to keep secret anything linked to nuclear security and, more broadly, national security, despite concerns that state control of scientific knowledge hearkened back to Nazi attempts to control as much information as possible for propaganda purposes (19-20).
Wellerstein’s account of the push from 1945 onward to tie the US government’s own publicity machine to narratives around the need for secrecy to maintain security offers a detailed insight into just how the dynamics of selective withholding and selective sharing of information were deployed. In particular, Wellerstein’s discussion of the mid-1945 publicity blitz organized around the anticipated bombing of Hiroshima and Nagasaki is chilling both for its historical account and for its resonance with the debates over the truthfulness of news in which we find ourselves today. The multiple layers that Wellerstein describes are stunning in terms of their effectiveness and for the Orwellian horrors they evoke: releasing news for propaganda purposes and releasing information in the name of scientific sharing to demonstrate supposed transparency and the values of a liberal democracy, while simultaneously holding back news for security concerns, holding back news to shape a positive narrative around US military decisions, holding back information about the devastating human cost of atomic warfare, and holding back news to avoid scrutiny from the public and from Congress (118-120).
The end stage of World War II and the early Cold War period represented a crucial turning point in public, judicial, and congressional access to national security-related information.[43] The existential fear brought about by the rise of nuclear weapons and the specter of the Soviet Union having its intelligence operations deeply embedded within the United States paralyzed those who might otherwise have objected to this shift in power and transparency,[44] and instead cemented this secrecy in how government has operated since then. Much of the domestic rhetoric surrounding the early Cold War included a deep unease that Soviet interests had permeated American society such that the loyalty of citizens was questionable.[45] Once this concern took root, and despite concerns that the military could use secrecy as a tool to avoid congressional oversight and public or judicial accountability, the argument that access to security-related information must be curtailed to prevent injury to national security became compelling to courts and to Congress.[46] Wellerstein’s account of how this dynamic played out with regard to David Lilienthal, the first chair of the Atomic Energy Commission, vividly illustrates this point. As Wellerstein describes it, the process of Lilienthal trying to navigate the tensions between a desire for some civilian control over the regulation of atomic energy and the ability of scientists to collaborate and share information effectively, while acknowledging the perceived need for secrecy due to national security threats, led to a situation in which lip service was offered to the ideals of collaboration, while the grind of pressure toward compartmentalization of information and secrecy brought to bear by the military was seemingly inexorable (180-192).
Alongside pressure from the military came Congress’s willingness to cede its authority to make national security decisions and to control access to national security-related information. In a series of statutes passed in the early Cold War period, Congress enabled administrative control of the collection and classification process for secret information.[47] Public resistance to these broad classification powers was fleeting. Although Congress established the Commission on Government Security (“the Wright Commission”) in 1955,[48] with the mandate to conduct active oversight of security matters and to “study and investigate the entire Government security program, including…national defense secrets,”[49] the Wright Commission’s only legislative proposals were to amend criminal statutes to enable prosecution of those who made classified information public and to allow for evidence of subversion obtained through wiretaps authorized by the Attorney General to be admissible.[50] Thus, the legacy of the Wright Commission, in terms of its ability to shed light on military decision-making, was only to encourage the expansion and protection of secrecy at the expense of the constitutional interests of free expression and privacy.[51] As Wellerstein points out, the military’s raising of the specter of domestic and international terrorism tied to nuclear weapons in the 1960s and 1970s perpetuated the idea that secrecy was essential to combat the existential threats to the United States (319-322).
Periodic efforts have been made since the post-Watergate era to curtail security-related secrecy, and to increase transparency and accountability against potential government overreach and abuse, but with limited effect (389). As such, security-related secrecy has become fully normalized as the public continues to give primacy to the existential nature of foreign, domestic, and nuclear threats, such that the lack of transparency feels familiar and normal. Yet, Wellerstein rightly argues that we should resist comfort in this space, not only because of the erosion of democratic norms and values, and not only because of the lack of ability for scientists to collaborate freely and productively. What we should remain aware of is the very human costs of unchecked secrecy. A vivid example is United States v. Reynolds,[52] a US Supreme Court case brought by the widows of three civilians killed in the 1948 crash of a military plane during a test flight, who sought compensation from the government.[53] The government asserted the state secrets privilege[54] in response to a request during the discovery phase of litigation for the flight accident report.[55] The Supreme Court, hearing the case in 1953, evaluated whether the flight accident report ought to remain secret and whether the case could go forward without the report. The Court reasoned that “control over the evidence in a case cannot be abdicated to the caprice of executive officers,”[56] but noted that “this is a time of vigorous preparation for national defense”[57] and feared that even the trial court reviewing the flight accident report privately to see if there were any legitimate state secrets involved would unacceptably compromise security interests.[58] The widows’ loss of their spouses, although acknowledged by the Court, was ultimately unpersuasive. Yet, had the Court allowed the trial court to examine the flight accident report, it would have been able to take into account the information that became public when the report was declassified in the 1990s:[59] the military secrets claimed by the government did not exist, but there was evidence that the plane’s lack of standard safeguards likely contributed to its crash—evidence of the negligence alleged in the widows’ lawsuit from the 1950s.[60]
The story of Reynolds is the story of the human cost of a failure to engage honestly in the complex, messy business of parsing necessary secrecy from that which is simply convenient for political or other reasons. By detailing the long-term growth and persistence of secrecy, and how it has sometimes compromised the integrity of scientific inquiry and growth, Wellerstein offers us a powerful reminder that every time we embrace secrecy, we should consider what values and interests we are promoting, and honestly consider how much is lost.
Response by Alex Wellerstein, Stevens Institute of Technology
As it has been some time since my book came out (and I take some credit for being quite behind on my review response!), and given that the scholars in this roundtable review are so essentially favorable and polite, I thought that rather than taking this space to respond to them at length, I would instead primarily use it as a place to reflect on the book I wrote. I am especially appreciative of Eglė Rindzevičiūtė’s introduction.
But first, to respond to the (very charitable and generous) comments by the roundtable reviewers. It is such a remarkable thing to feel “seen,” even if others are not entirely in agreement with what they see. Michael Falcone’s review has an excellent discussion of the book, and I entirely agree even with the parts that could be interpreted as critique, such as his distinction between the “essences” and the “outcomes.” My book gestures at times towards outcomes, but is really concerned with making sense of the essences—the ways in which the nuclear secrecy regime was established, operated, conceptualized itself, and reacted to both internal and external challenges. I gesture towards large-scale outcomes, but I find it hard to disentangle those from the numerous other social and historical forces at work, and perhaps I feared too much that the book could become a loose polemic.
And there is no one more acutely aware than me of the logarithmic pacing of the book that Sam Lebovic and Kaeten Mistry point out: it was something I considered and struggled with from beginning to end, ultimately justifying it on the grounds that, as with many novel historical entities, the early years are when the real “choices” are made that end up governing what options are available later. This is also (all too) conveniently reflected in source access: the book shifts (not subtly) from arguments focused around internal documents towards more public-facing sources. When I was working on the project, it was suggested to me that perhaps I ought to just draw an even more arbitrary line at, say, 1979, but in the end I just could not make my peace with that, for better or worse. I agree that making exact sense of what was an appropriate response and what was an overreaction to the Reagan administration’s secrecy is tricky: my point was not to argue about the truth of the matter (which I think is rather hard to establish at this point—we are still too close to it—and gets into a broader history of government secrecy practices well beyond the nuclear, as they point out), but to talk about how the rhetoric and politics of secrecy/anti-secrecy had created the conditions for an interestingly different response to secrecy than one found before.
As for Jonathan Hunt’s comments on my approach to David Lilienthal, the first chairman of the US Atomic Energy Commission, it is exactly his ideals and failures that make him interesting to me (and his immediate successor, Gordon Dean, by comparison, far less so). Both Lilienthal and Lewis Strauss are interesting to me because they take strong positions; ideologically, I am far more aligned with Lilienthal than Strauss, and it certainly reflects my own predisposed judgment to view Lilienthal’s failures as a comment on the human condition: it is hard to do the expansive and risky things.
To Sudha Setty’s helpful discussion of the Reynolds case, I would just add that it is interesting to contemplate a world in which the state secrets privilege did not exist. What has always been fascinating to me about the Reynolds case, separate from the outright fraud involved (which only compounds, rather than changes, the issue), is that it involved one branch of the government ceding vast power to another, on trust alone. I think that most of our standard lay models of individual and organizational psychology would suggest this to be a very unlikely outcome, and yet, here we are.
And now some reflections on my book, from the vantage point of 2023. At its heart, I see the book as a work of narrative history that is trying to make a few large, “implicit” arguments, and several smaller, “explicit” arguments about its subject, nuclear secrecy. The largest argument is that nuclear secrecy can be historicized as a human activity. This seems perhaps obvious after a history book has been written, but it is a sentiment that has been frequently absent from most discussions of secrecy and nuclear weapons, past and present. Secrecy has often been regarded as a sort of emergent property of the weapons themselves, and as a somewhat transhistorical and technologically-determined force that, once established, was essentially static and ungraspable. The major goal of my book is to argue, by its very existence, that this is not a very useful way to think about nuclear secrecy, and that rather it should be seen as something that has undergone significant change over time, and emerged out of a number of very human decisions made under very human circumstances (an approach, it must be acknowledged, that runs the risk of “humanizing” secrecy too much, in a positive sense, though I try to resist this).[61]
My editor and I hoped that the book would be of interest to people well outside my home discipline of the history of science, and so the amount of theory and jargon was kept to a bare minimum (which suited me just fine). But in my head, the model I ended up with for thinking about government secrecy as a historical subject had three major theoretical components.
The first I tend to call (but not in the book, because of dreaded jargon) the “rhetoric” of secrecy, by which I just mean the goals, justifications, fears, hopes, ideals, and other mental states that are the motivation for secrecy in the first place. What is to be accomplished by secrecy, and why is it to be done? As the book indicates, the answer to this question shifted dramatically over the course of the period studied: from hoping that Nazi scientists would not think about nuclear chain reactions, to keeping Congress ignorant of the Manhattan Project so that its members would not shut it down, to surprising the Japanese in the theatre of war, to the Cold War arms race with the Soviet Union, to fears of nuclear proliferation and terrorism, and so on (this list is not exhaustive, nor are these cases all mutually exclusive). These “rhetorics” also outline what the secret actually is, and the fact that this reality changed historically is an important observation, I thought, in part because it also is what enables the possibility of even the writing of this history, as what was once secret no longer is (an important thing for a history where the great majority of documents consulted were once classified, and I do not have, and have never wanted, a security clearance).
The second is what I call the “practices” of secrecy, and in using this language I am making a definite, if vague, reference to the work of anthropologists and sociologists of “practice” like Erving Goffman, Pierre Bourdieu, and Bruno Latour.[62] The practices I have in mind are simply all of the ways in which people attempt to reify the goals of their rhetoric of secrecy—to make them “real in the world.” So if my goal is to prevent the Nazis from learning about nuclear chain reactions, I might try to convince (as the emigre physicist Leo Szilard did in 1939) scientists who were not Nazis to avoid publishing on the subject of nuclear chain reactions. Or if my secret is the paltry size of the US nuclear stockpile in 1947, I might (as David Lilienthal did) avoid even writing down the number of assembled bombs on a document I was presenting to the president, and instead simply recite the number from memory.
The practices of nuclear secrecy were vast and, again, could be historicized. There were practices that did not exist that came into existence, ones that changed once they existed, and some that were quite temporary (like denying even the existence of Los Alamos). I have an incomplete list of practices employed during the Manhattan Project that I like to show in talks to give a sense of what I mean by the wide variety of practices: self-censorship (by scientists, early on), document control (stamps, guidelines), personnel security investigations and clearances (classifying people), physical security (fences, safes, guards), site isolation (remote facilities), code-names (obfuscation), indoctrination (security “culture”/OPSEC, as well as “oaths”), compartmentalization (“need to know”), censorship (of media, of project personnel), misinformation (deliberate circulation of false information publicly and within the secrecy system), counterintelligence forces (use of G-2, FBI, for investigation and intimidation), black budget (to avoid accountability), and legal punishments (the Espionage Act). These are the ways in which the idea of secrecy has any chance of becoming a reality, because simply desiring something be known by some people and not others is not enough.
Finally, I also find it useful to think in terms of ‘institutions’ of secrecy. Institutions in this sense (heavily influenced by the work of Tim Lenoir and Sheila Jasanoff) are the social organizations necessary to standardize the rhetorics and practices of secrecy, to enforce them (using their own practices), and to essentially elevate these systems above the level of individuals.[63] One could argue that institutions are just another form of practice at some level, but I find it useful to regard them separately, because they end up giving secrecy the sense of continuity and momentum that it has (the idea that once the secrecy gets put into place, it is very hard to undo it, for example). And I found that many of my historical actors working within the US nuclear infrastructure had this sensibility as well.
The combination of these three components—the rhetorics, the practices, and the institutions—I dubbed, without much theoretical fireworks or creativity, the “secrecy regime” (6), and saw the evolving state of this theoretical entity as the subject of my historical study. The advantage of this approach felt clear to me: it gives you something to actually historicize, as opposed to more gauzy notions of secrecy, and it also meant that one could be very precise in one’s analysis and even critique of secrecy. And it also kept it, in my mind, from being a case of “memo-to-memo” policy history, of the sort that Richard Hewlett (pioneer that he was) wrote in his official histories of the U.S. Atomic Energy Commission.[64]
The downside of this approach is that, of course, this is only one way to slice up this history. It was a little joke between my editor and me that the subtitle had to be The History of Nuclear Secrecy in the United States and not A History, as she was well aware that historians these days blanch at the idea of telling ‘the’ story, but that having a solid-sounding title was important to selling books to people other than historians. My approach to this history is that of a historian of science trying to understand how this subject, “nuclear secrecy,” worked and changed over time. So I give a lot more attention to questions of epistemology than, say, a historian of foreign relations might be inclined to do, for better or worse. This also meant that, especially in the later chapters, I had to figure out how exactly to distinguish between nuclear secrecy, government secrecy in general, and the national security state—something that proved all but impossible, because these topics tended to overlap dramatically, potentially hindering my attempts to keep things focused but not myopic.
I also saw my job as representing the aspect of secrecy that is the hardest to see from the outside: how it works, and looks, from the inside. Representing accurately and charitably the mind of the “censor” struck me as an interesting and important function, because most of our public discussions of secrecy are from the perspective of people outside of the secrecy regime. If you work on or are interested in topics that are in any way classified, it is easy to see how tempting it is to imagine all censors as petty, paranoid, nefarious bureaucrats representing a singular “official” view. It is the experience one has, anyway, when being told that the topic you are interested in is not available to you, not because the information does not exist, but because the government has not deemed you worthy of knowing it. And while there are certainly enough petty and paranoid bureaucrats to go around in the Cold War, in the archives I more often found people on the “inside” wrestling with the same concerns held by those on the “outside,” but trying to square them with both the rhetorics of secrecy (in which they often believed earnestly), as well as institutional practices that they could not violate unless they wanted to lose their jobs, go to prison, have their government agency taken over by people they thought were even less charitably minded, etc. This is not to excuse the “censor,” or the nefarious public effects of secrecy, but it is to understand secrecy not purely through the (itself historical) rhetoric of “anti-secrecy,” which sees government secrecy as universally abusive.
So a critique one could make of the book, on the whole (and it has been made, here and there), is that it is far too charitable towards those who developed and operated the secrecy regimes, and does not give enough attention to the harms of secrecy, or to those on the “outside” of the regimes. And, because of the various (sexist, patriarchal, racist, etc.) forces involved especially with Cold War security culture, this also means a story focused largely around middle-aged white men. And there is some truth to this; the questions that interested me most, and my approach to the research, resulted in one book, and one can imagine different questions and approaches resulting in a different book.
I deliberately avoided going too far down the path of the “ills of secrecy” (the “outcomes”) mainly because I felt that position was, if anything, overrepresented in our current (post-Watergate, post-9/11) culture, and my book was more about making sense of how the secrecy regimes developed and work than about the reasons you might want to dismantle them. I like to believe that if your goal is dismantling the secrecy regimes, understanding their history would be a useful first step to practically doing so.
But I did not write the book as a polemic or with an eye towards policy change. For one thing, I felt that there were, again, probably enough of those out there. For another, I did find that the deeper I got into the minds of “the censors,” the harder it was to see simple answers, conceptually or practically, to the problems they faced. It is well and good to complain about excessive government secrecy—I will join most readers on that—but it is harder to figure out how exactly one would go about changing such a thing, without focusing mostly on the very small scale (like better funding for the National Archives’ Freedom of Information Act office). One telling thing that stuck out to me about the anti-secrecy activists is that they often acknowledged the need for some secrecy; it was just that they believed that only they had the wisdom to decide where the line was. To me that rather sums up the problem very well.
It has been interesting to me that even though nuclear weapons have become far more salient in the years since I started working on this project, discussions of nuclear secrecy largely have not. That is, the issues we face in the nuclear world tend not to be framed as questions of knowledge and its control, but in other terms. In some cases, this has stymied discussions of other topics, like the question of presidential control of nuclear weapons, which resurfaced just before, and especially during, the Trump administration. On this topic, I found even experts far too willing to say, essentially, “we just don’t know enough to make reasonable assessments of this threat, or to make recommendations about it.” I happen to think that this is false, but the interesting thing is that I did not notice people using it as an argument for getting rid of that secrecy; it was just accepted as the way of things.
The other area where it came up, fairly recently, was in relation to the classified documents that Donald Trump took with him to Mar-a-Lago post-presidency. At several junctures it has been alleged that these documents contain information relating to nuclear weapons, and this has been highlighted above almost any other possible content of the documents.[65] Any reader of my book, I think, would understand why that is: because nuclear secrets are still taken to mean “the most secret secrets” (even though, often, they have not been, and the vast majority of what is still classified as a “nuclear secret” would be viewed by most as impossibly banal), and this was an irresistibly potent weapon to use against Trump (especially given his endless use of Democratic presidential nominee Hillary Clinton’s e-mails as a rallying point). If anything, I regretted having to say, “well, nuclear secrets are often pretty boring, and the non-nuclear documents classified ‘Top Secret/Sensitive Compartmented Information,’ which likely include information about foreign intelligence sources and methods, are probably more important from a security standpoint.” While I welcomed the many questions I received about the distinction between “Restricted Data” (the “special” legal category of nuclear weapons information in the United States, created in 1946), “Formerly Restricted Data” (a confusingly-named category that refers to still-classified information about nuclear weapons that has been removed from the “Restricted Data” category, usually to facilitate its use by the military), and other classified information, I also felt (more so than usual) as though I was being used to tell a story that wasn’t quite the one I would want to be telling; if Trump has committed crimes with classified information, whether it is nuclear or not is probably not the major issue to be worrying about, and a big part of my book is about historicizing (and to a degree, problematizing) the obsession with nuclear secrets.
I think my book does a good job of creating a framework for the history it is trying to tell, within the constraints of length, time, access to information, and the stories I am particularly interested in and capable of telling. I believe there is quite a lot of historical ground still uncovered, both because of material that is still classified or otherwise held tightly, and because asking different questions than I did would certainly result in different answers.
For better or worse, the discussions of secrecy in science and technology today have tended not to focus on nuclear weapons, but rather on areas like virology, cyberwarfare, artificial intelligence, and more generalized issues regarding educational and industrial espionage by the People’s Republic of China. The history of nuclear secrecy does not give any simple answers to these questions, except for one truism that sometimes still needs repeating: whatever choices are made today will strongly constrain the options that seem available in the future, as these systems are easier to create than to un-create.
[1] Trevor Paglen, Blank Spots on the Map: The Dark Geography of the Pentagon’s Secret World (New York: Dutton, 2009).
[2] John Baylis and Anthony Eames, Sharing Nuclear Secrets: Trust, Distrust and Ambiguity in Anglo-American Relations since 1939 (Oxford: Oxford University Press, 2023); Kate Brown, Plutopia: Nuclear Families, Atomic Cities, and the Great Soviet and American Plutonium Disasters (Oxford: Oxford University Press, 2015); Paul Josephson, Red Atom: Russia’s Nuclear Power Program From Stalin to Today (Pittsburgh: University of Pittsburgh Press, 2005).
[3] Steve Rayner, “Uncomfortable Knowledge: The Social Construction of Ignorance in Science and Environmental Policy Discourses,” Economy and Society 41: 1 (2012): 107-125; Naomi Oreskes and Erik Conway, Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Climate Change (London and New York: Bloomsbury, 2011).
[4] Adriana Petryna, Life Exposed: Biological Citizens after Chernobyl (Princeton: Princeton University Press, 2013); Kate Brown, Manual for Survival: A Chernobyl’s Guide to the Future (New York: W.W. Norton, 2019).
[5] Stefanos Geroulanos, Transparency in Postwar France: A Critical History of the Present (Stanford University Press, 2017).
[6] Christopher Hill, Jonathan Hogg and Raminder Kaur, eds. Fallout: Radioactive Reckonings in the Nuclearocene (Liverpool: Liverpool University Press, forthcoming); Karen Barad, “After the End of the World: Entangled Nuclear Colonialisms, Matters of Force, and the Material Force of Justice,” Theory & Event 22: 3 (2019): 524-550.
[7] Gabrielle Hecht, Being Nuclear: Africans and the Global Uranium Trade (Cambridge: The MIT Press, 2012); Susan Bauer and Tanja Penter, eds., Tracing the Atom: Nuclear Legacies in Russia and Central Asia (London and New York: Routledge, 2022).
[8] David Holloway, Stalin and the Bomb: The Soviet Union and Atomic Energy, 1939-1956 (New Haven: Yale University Press, 1994); Campbell Craig and Sergey Radchenko, The Atomic Bomb and the Origins of the Cold War (New Haven: Yale University Press, 2008); Michael D. Gordin, Red Cloud at Dawn: Truman, Stalin, and the End of the Atomic Monopoly (Picador, 2010).
[9] Una Bergmane, Politics of Uncertainty: The United States, the Baltic Question, and the Collapse of the Soviet Union (Oxford: Oxford University Press, 2023).
[10] Matthew Evangelista, Unarmed Forces: The Transnational Movement to End the Cold War (Ithaca: Cornell University Press, 1999).
[11] Paul Josephson, Nuclear Russia: The Atom in Russian Politics and Culture (London: Bloomsbury, 2023).
[12] Josephson, Nuclear Russia, 49.
[13] For a comparative approach, see Asif Siddiqi, “Soviet Secrecy: Toward a Social Map of Knowledge,” American Historical Review 126: 3 (2021): 1046-1071.
[14] See the special issue “The Life and Death of the Secret,” edited by Lenore Manderson, Mark Davis, Chip Colwell, and Tanja Ahlin for Current Anthropology 56, S12 (2015).
[15] Manderson, Davis, Colwell, and Ahlin, “On Secrecy, Disclosure, the Public, and the Private in Anthropology,” Current Anthropology 56, S12 (2015): 183-190.
[16] Stanley Kubrick, dir., Dr. Strangelove or: How I Learned to Stop Worrying and Love the Bomb, Columbia Pictures, 1964.
[17] Gordin, Red Cloud, 154.
[18] See, for instance, Joseph D. Martin, Solid State Insurrection: How the Science of Substance Made American Physics Matter (Pittsburgh: University of Pittsburgh Press, 2018).
[19] Christopher Nolan, dir., Oppenheimer, Universal Pictures, 2023.
[20] Wellerstein, Restricted Data: The Nuclear Secrecy Blog, http://blog.nuclearsecrecy.com, and NUKEMAP, https://nuclearsecrecy.com/nukemap.
[21] Other key contributors to the literature on secrecy and closed knowledge include Peter Galison (“Secrecy in Three Acts,” Social Research 77 [2010], and the documentary Galison and Robb Moss, dirs., Secrecy [Harvard, 2008]); Kate Epstein, “Intellectual Property and National Security: The Case of the Hardcastle Superheater, 1905–1927,” History and Technology 34, no. 2 (2018), as well as her in-progress book on intellectual property and the development of the U.S. national security state; and Roger Shattuck, Forbidden Knowledge: From Prometheus to Pornography (New York: Harper, 1997).
[22] Langdon Winner, “Do Artifacts Have Politics?,” Daedalus 109:1 (Winter 1980), 130.
[23] David E. Pozen, “The Mosaic Theory, National Security, and the Freedom of Information Act,” Yale Law Journal 115:3 (2005), pp. 629-679, https://www.yalelawjournal.org/pdf/358_fto38tb4.pdf
[24] See, for example, Tim Jordan and Paul Taylor, “A Sociology of Hackers,” The Sociological Review 46:4 (November 1998); and McKenzie Wark, A Hacker Manifesto (Cambridge, Mass.: Harvard University Press, 2004).
[25] Associated Press, “Justices Reject Reporter’s Bid to Protect Source,” Ap.org 2 June 2014, https://www.ap.org/ap-in-the-news/2014/justices-reject-reporters-bid-to-protect-source.
[26] For the modelling of the NSC on Britain’s Committee on Imperial Defense, see Townsend Hoopes and Douglas Brinkley, Driven Patriot: The Life and Times of James Forrestal (Annapolis, MD: Naval Institute Press, 1992), chapter 19.
[27] The Asilomar Conference is well known in science and technology studies. For a brief retrospective, see Alexander Capron and Renie Schapiro, “Remember Asilomar? Reexamining Science’s Ethical and Social Responsibility,” Perspectives in Biology and Medicine 44:2 (2001), 162-169. For more on the history of the conundrums of ethics and the life sciences, see especially Sheila Jasanoff’s work, including the recent Jasanoff, Can Science Make Sense of Life? (New York: Wiley, 2019), and Jasanoff and J. Benjamin Hurlbut, “A Global Observatory for Gene Editing,” Nature (Spring 2018).
[28] Dennis Normile, “China Tightens Its Regulation of Some Human Gene Editing, Labeling It ‘High-Risk,’” Science Insider, 28 February 2019, https://www.science.org/content/article/china-tightens-its-regulation-some-human-gene-editing-labeling-it-high-risk.
[29] The views contained herein are those of the author and do not represent those of the US Air Force, the Air University, or the US Department of Defense.
[30] Arthur Conan Doyle, Silver Blaze (Oxford: Heinemann, 1987).
[31] Stephen I. Schwartz, ed., Atomic Audit: The Costs and Consequences of U.S. Nuclear Weapons since 1940 (Washington, DC: Brookings Institution Press, 1998).
[32] U.S. Atomic Energy Act, Public Law 585, 79th Congress, Chapter 724, 2nd session, S.1717, https://www.atomicarchive.com/resources/documents/deterrence/atomic-energy-act.html (accessed 13 September 2021).
[33] Jacob Darwin Hamblin, The Wretched Atom (New York: Oxford University Press, 2021).
[34] Lowell A. Martin, ed., The Encyclopedia Americana (New York: Americana Corp., 1977); Samuel Glasstone and Philip J. Dolan, The Effects of Nuclear Weapons (Washington, DC: United States Department of Defense and the United States Department of Energy, 1977).
[35] Howard Morland, “The H-Bomb Secret,” The Progressive (November 1979): 3-12; Mario Cacciottolo, “The Streisand Effect: When Censorship Backfires,” 15 June 2012, BBC News, https://www.bbc.com/news/uk-18458567 (accessed 13 September 2021).
[36] For introductions to the debate about atomic spies, see: Ellen Schrecker, “Soviet Espionage in America: An Oft-Told Tale,” Reviews in American History 38: 2 (2010): 355–361; Allen Weinstein and Alexander Vassiliev, The Haunted Wood: Soviet Espionage in America‑the Stalin Era (New York: Modern Library, 2000); John Earl Haynes, Harvey Klehr, Alexander Vassiliev, Philip Redko, and Steven Shabad, Spies: The Rise and Fall of the KGB in America (New Haven: Yale University Press, 2009); Katherine A. S. Sibley, Red Spies in America: Stolen Secrets and the Dawn of the Cold War (Lawrence: University Press of Kansas, 2004); Walter Schneir and Miriam Schneir, Final Verdict: What Really Happened in the Rosenberg Case (Brooklyn, NY: Melville House, 2010).
[37] Classic works on nuclear diplomacy include: Barton J. Bernstein, “The Quest for Security: American Foreign Policy and International Control of Atomic Energy, 1942-1946,” Journal of American History 60:4 (1974); Martin J. Sherwin, A World Destroyed: The Atomic Bomb and the Grand Alliance (New York: Knopf, 1975).
[38] Hugh Wilford, The Mighty Wurlitzer: How the CIA Played America (Cambridge MA: Harvard University Press, 2008); Kathryn Olmsted, Challenging the Secret Government: The Post-Watergate Investigations of the CIA and FBI (Chapel Hill, NC: University of North Carolina Press, 1996); Daniel Ellsberg, Secrets: A Memoir of Vietnam and the Pentagon Papers (New York: Viking, 2002); James Goodale, Fighting for the Press: The Inside Story of the Pentagon Papers and Other Battles (New York: CUNY Journalism Press, 2003); Kaeten Mistry, “The Rise and Fall of Anti-Imperial Whistleblowing in the Long 1970s” and Jeremy Varon, “Winter Soldiers of the Dark Side: CIA Whistleblowers and National Security Dissent,” in Kaeten Mistry & Hannah Gurman, eds., Whistleblowing Nation: The History of National Security Disclosures and the Cult of State Secrecy (New York: Columbia University Press, 2020): 123-186.
[39] See for instance, Michael Schudson & David Pozen, eds., Troubling Transparency: The History and Future of Freedom of Information (New York: Columbia University Press, 2018); Mark Fenster, The Transparency Fix: Secrets, Leaks and Uncontrollable Government Information (Stanford: Stanford University Press, 2017); Pozen, “Transparency’s Ideological Drift,” Yale Law Journal 128:1 (October 2018): 100-165.
[40] Some elements of this review are drawn from a previous work: Sudha Setty, “The Rise of National Security Secrets,” Connecticut Law Review 44 (2012): 1563, 1566-1570.
[41] See Aziz Rana, “Who Decides on Security?,” Connecticut Law Review 44:1, 39–42 (2012) (discussing the professionalization of the military in the 1930s and 1940s).
[42] Wickard v. Filburn, 317 U.S. 111 (1942).
[43] See Wallace Parks, “Secrecy and the Public Interest in Military Affairs,” George Washington Law Review 26:1 (1957), 23 (noting the shift in access to national security-related information following World War II).
[44] See Leo Albert Huard, “The Status of National Internal Security During 1955,” Georgetown Law Journal 44 (1956) 179-180: “The excrescence of international communism and the constant presence of total war, hot or cold, has made the keeping of national secrets an absolute necessity. We must show the world that democracy can be secure without silencing its citizens or suppressing the free expression of their political thought.”
[45] Certainly the activities of the House Un-American Activities Committee, the Senate Permanent Subcommittee on Investigations, and other congressional bodies reflect this concern in the early Cold War era. Eleanor Bontecou, “The Federal Loyalty-Security Program,” Indiana Law Journal 647 (1957), 29; see also Commission on Government Security, Report of the Commission on Government Security 102 (1953) (noting that significant evidence was given that various congressional committees had been infiltrated, and recognizing the “vast powers which hidden Communists could exercise from such a vantage point”). The executive branch also reacted strongly to fears of Communist infiltration. See Exec. Order No. 10450, 18 Fed. Reg. 2489 (27 April 1953) (supplanting E.O. 9835 and mandating federal agencies to investigate whether employees posed a security risk to the nation); Exec. Order No. 9835, 12 Fed. Reg. 1935 (21 March 1947) (setting forth a broad mandate for the Federal Employee Loyalty Program); see also Huard, “The Status of National Internal Security During 1955,” 180 (“We must take steps to remove disloyal persons from positions where national secrets are available to them… [S]ome allege, that the balance between preservation of our form of government and protection of individual rights has been upset…. Some of this criticism can be dismissed as communist-inspired.”).
[46] See Parks (noting a serious concern with “the implications of the withholding of information on the American institutions of civil-military relations with primary emphasis on the informational needs of the non-governmental community and the Congress”) at 30.
[47] For example, Internal Security Act of 1950, 64 Stat. 987, §§ 1006–13 (codified as amended at 50 U.S.C. §§ 831–35 (2006)) (providing limitations and guidelines on who has access to classified information at the National Security Agency); Central Intelligence Agency Act of 1949, 50 U.S.C. § 403g (2006) (holding the Director of National Intelligence accountable for safeguarding intelligence information from disclosure); National Security Act of 1947, 50 U.S.C. § 435 (governing the process of classifying information and accessing classified information); 50 U.S.C. § 403-5d (2006) (limiting the dissemination of privileged information); 50 U.S.C. § 404g (2006) (disallowing intelligence from being shared with the United Nations); 50 U.S.C. § 421 (2006) (punishing individuals who reveal the identity of undercover agents and classified information); 50 U.S.C. § 432 (2006) (allowing operational files of the National Geospatial-Intelligence Agency to not be disclosed or viewed by the public).
[48] Pub. L. 304, 69 Stat. 595, 595 (1955); see also Daniel Patrick Moynihan, “Foreword,” Report of the Commission on Protecting and Reducing Government Secrecy XXXIII (Washington, D.C.: Government Printing Office, 1997).
[49] Pub. L. 304, § 6, 69 Stat. at 596.
[50] Commission on Government Security, Report of the Commission on Government Security, 629. See also Rana, “Who Decides on Security?,” 5 (observing the same congressional enabling of executive power at different times in US history).
[51] Although the legislative proposals of the Wright Commission were not immediately adopted, four decades later Senator Daniel Patrick Moynihan reflected on the work of the Wright Commission: “In retrospect, the importance of the Wright Commission was not what it proposed, but that its proposals were never seriously considered. It had become clear to the nation . . . that even in a time of Cold War, the United States Government must rest, in the words of the Declaration of Independence, on ‘the consent of the governed.’ And there can be no meaningful consent where those who are governed do not know to what they are consenting.” Moynihan, xxxiii-xxxiv (citing David Wise and Thomas B. Ross, The Invisible Government (New York: Random House, 1964), 6) (internal quotations omitted).
[52] 345 U.S. 1 (1953).
[53] Reynolds, 345 U.S. at 2–3.
[54] The state secrets privilege is a common law evidentiary privilege enabling the government to prevent disclosure of sensitive state information during litigation. A court upholding a claim of privilege has the power to shape the litigation according to the perceived degree of harm, with consequences ranging from the denial of a discovery request for a document to the dismissal of a suit. See, for example, Mohamed v. Jeppesen Dataplan, Inc., 614 F.3d 1070, 1073–74 (9th Cir. 2010) (dismissing suit seeking recovery for rendition and torture); El-Masri v. United States, 479 F.3d 296, 300 (4th Cir. 2007) (same).
[55] Reynolds, 345 U.S., 3–4. The government also cited to Air Force Regulation No. 62-7(5)(b), which precluded disclosure of such reports outside the authorized chain of command without the approval of the Secretary of the Air Force. Reynolds, 3–4 & n.4.
[56] Reynolds, 9-10.
[57] Reynolds, 10-11.
[58] Notably, Justice Frankfurter was among the dissenting justices in Reynolds. Although the dissenting justices chose not to write a separate opinion, they cited to the Third Circuit’s opinion in the litigation as reflective of the dissenting justices’ thinking on the subject. Id. at 12 (Black, J., dissenting). The Third Circuit held that the flight accident report should be examined and that the scope of the state secrets privilege ought to be construed narrowly so as to allow the litigation to proceed. Reynolds v. United States, 192 F.2d 987, 998 (1951). Judge Maris offered his view of the philosophical dangers of secrecy in a democracy: “We need to recall in this connection the words of Edward Livingston: ‘No nation ever yet found any inconvenience from too close an inspection into the conduct of its officers, but many have been brought to ruin, and reduced to slavery, by suffering gradual imposition and abuses, which were imperceptible, only because the means of publicity had not been secured.’ And it was Patrick Henry who said that ‘to cover with the veil of secrecy the common routine of business, is an abomination in the eyes of every intelligent man and every friend to his country…’” Reynolds, 995 (internal citations omitted).
[59] Louis Fisher, In the Name of National Security: Unchecked Presidential Power and the Reynolds Case (Univ. Press of Kansas 2006) at 165-66.
[60] Fisher, 181–82. For an in-depth account of the Reynolds case, see generally Fisher (analyzing the case and contextualizing it by assessing its effect on the United States); see also Patrick Radden Keefe, “State Secrets: A Government Misstep in a Wiretapping Case,” New Yorker, 28 April 2008, at 28 (describing the frequency with which state secrets, as outlined in Reynolds, are invoked); cf. Herring v. United States, 424 F.3d 384, 392 (3d Cir. 2005), which held that the United States did not commit a fraud on the court in its representations during the Reynolds litigation.
[61] The classic technologically determinist approach to nuclear politics is Langdon Winner, “Do Artifacts Have Politics?” Daedalus 109:1 (1980), 121-136, which regards the bomb as a “special case” where it is self-evident that “very rigid relationships of authority are necessary in its immediate presence” (131). While I don’t disagree that the bomb, as a technology, generates specific politics, I would note that the exact form of those politics has definitely varied by historical context, and that much of what makes the history of nuclear weapons interesting is that there is considerable lack of consensus on what exactly those necessary politics end up being.
[62] E.g. Erving Goffman, The Presentation of Self in Everyday Life (New York: Anchor Books, 1959); Pierre Bourdieu, The Logic of Practice (Stanford University Press, 1980); and Bruno Latour, Science in Action: How to Follow Scientists and Engineers Through Society (Cambridge, MA: Harvard University Press, 1987).
[63] Esp. Tim Lenoir, Instituting Science: The Cultural Production of Scientific Disciplines (Stanford University Press, 1997), and Sheila Jasanoff, “Ordering Knowledge, Ordering Society,” in Sheila Jasanoff, ed., States of Knowledge: The Co-Production of Science and Social Order (New York: Routledge, 2004), 13-45.
[64] Richard G. Hewlett and Oscar E. Anderson, Jr., A History of the United States Atomic Energy Commission, Volume 1: The New World, 1939-1946 (Pennsylvania State University Press, 1962); Richard G. Hewlett and Francis Duncan, A History of the United States Atomic Energy Commission, Volume 2: Atomic Shield, 1947-1952 (Pennsylvania State University Press, 1969); Richard G. Hewlett and Jack M. Holl, Atoms for Peace and War, 1953-1961: Eisenhower and the Atomic Energy Commission (Berkeley, CA: University of California Press, 1989). I mean this without any disrespect to Hewlett—the pressures on an official historian are substantially different from those on a university professor. See e.g., Richard G. Hewlett and Jo Anne McCormick Quatannens, “Richard G. Hewlett: Federal Historian,” The Public Historian 19:1 (Winter 1997): 53-83, and Barton C. Hacker, “Writing the History of a Controversial Program: Radiation Safety, the AEC, and Nuclear Weapons Testing,” The Public Historian 14:1 (Winter 1992): 31-53.
[65] See, e.g., Graeme Wood, “Not Even the President Can Declassify Nuclear Secrets,” The Atlantic (12 August 2022). For a longer discussion of the issues here as I see them (or saw them based on the initial reporting), see Alex Wellerstein, “Can Trump Just Declare Nuclear Secrets Unclassified?,” Lawfare (18 August 2022): https://www.lawfaremedia.org/article/can-trump-just-declare-nuclear-secrets-unclassified