What is the nature of the responsibility of a nuclear scholar and how can we ensure we are up to the mark?
Given the destructive potential, secrecy, technicality, cost and limits of command and control over nuclear weapons, these are crucial and surprisingly unaddressed questions. The context of Trident renewal and the possible independence of Scotland make them even more pressing. Here, I urge those nuclear scholars among us to broaden our definition of policy-relevant scholarship and to rethink our responsibility vis-à-vis the public. That responsibility must not be confined to communicating the existing terms of the elite policy debate. In other words, I urge us to think beyond the narrow notions of deterrence and non-proliferation, to return to the problem of nuclear vulnerability, and to engage with the public as well as policymakers beyond the terms of the policy debates of the day. Finally, I urge us always to be explicit about the ethical underpinnings of the policies we advocate and to resist the temptation of overconfidence.
Why should we do that? One core reason is that we currently overestimate and underestimate our impact at the same time.
On the one hand, we dramatically overestimate our impact on nuclear weapons policymaking and nuclear weapons policy elites. Most often, we merely provide justifications for existing policies and political agendas and conflate access with impact. That would not be a problem if it did not lead most of us to make compromises in the name of this quest for impact on elites. As a result, we rule out radical courses of action and give up talking about the ethical underpinnings of our recommendations in order to provide answers we regard as acceptable to existing elites. Nuclear scientists during the partial test ban debate offer an early example of this implicit bet on policy elites at the expense of the public, and of its potentially damaging consequences. By choosing to sacrifice their larger ethical concerns and to limit their discourse to technicalities, they were pursuing uncontested scientific authority and direct influence over government policy. In the process, they lost their moral authority, gave up their arguments against fallout, and narrowed the scope of acceptable dissent just as the arms race was starting. Yet this sacrifice gave them neither the uncontested authority they were seeking nor the influence they hoped to have within the government. Similarly, when we forget or choose to sacrifice our responsibility to the public in order to have an impact on elites, we can be sure of what we lose and of what we deprive the public, but we certainly cannot know what we gain from this bet.
On the other hand, we underestimate our authority and responsibility vis-à-vis the public at home and abroad, and how our impact on this broader audience can have an indirect effect on the policy process. Focusing exclusively on influencing a very constrained policy process, we unduly narrow the scope of policy options to which our audiences are exposed and feed the illusion that science alone can determine policy, thereby excluding wisdom and judgement. This in turn constrains the scope of public discussion about nuclear weapons when we could act as imaginative sources of policy alternatives and voices to open a debate. Strangely enough, this attitude also assumes that having an impact on the public will not give us any traction on the policy process. This is an unwarranted assumption given the existing scholarship on how domestic protest groups in the U.S. led the country to enter three sets of arms control negotiations during the Cold War: the Partial Test Ban Treaty negotiations beginning in the late 1950s, the Strategic Arms Limitation Talks (SALT) in the late 1960s, and the Strategic Arms Reduction Talks (START) in the 1980s, as well as the scholarship on the impact of public protest on nuclear weapons policy worldwide.
So why should we give up on our duty of circumspection and creativity vis-à-vis the public in the hope of gaining influence over elites? Since, in most cases, we gain no such influence, we should avoid that pitfall.
Given the messages of overconfidence policymakers often send to the public, recognizing – even emphasizing – the well-documented perils of overconfidence becomes an ethical imperative.
Narrowing the discussion to nuclear non-proliferation and deterrence, as we currently do, is to focus on only two possible policy responses to the broader problem of nuclear vulnerability (of peoples and societies, and not only of the weapons). In a context where there is no foreseeable defence against a nuclear strike, where accidents can happen and escalation might be extremely fast, treating these two responses as the only conceivable courses of action is overconfident and obfuscates a whole set of ethical and political choices. President Dwight D. Eisenhower’s confession to the British Ambassador to the U.S. in December 1959 that he would rather be atomized than communized is one of the most explicit statements of the ethical and political choices underlying any policy option in our world of global nuclear vulnerability. For now, our conversation implicitly turns the problem into part of a pre-ordained solution made of deterrence and non-proliferation, which keeps the ethical choices on which it is based implicit. Going back to the vulnerability problem, and resisting overconfidence in the ability of science alone to dictate policy, instead opens space for the ethical and political debates that the public deserves.
Constrained ‘policy-relevant’ scholarship not only deprives the public of informed commentary that has something to say about the ethical and political underpinnings of existing policy options, not to mention radical and unexplored ones; it also deprives policymakers of an opportunity to consider ideas outside their comfort zone. By bolstering a discourse of constrained, ‘realistic’ policy alternatives, this kind of scholarship implants in the minds of elites the idea that radical alternatives are indeed impossible, even if they themselves might not believe that to be the case.
This blog is an adapted and abridged version of an entry on the H-Diplo/International Security Studies Forum “what we talk about when we talk about nuclear weapons” in which historians and political scientists discussed nuclear weapons related scholarship. It was published in July 2014.
 For the purposes of this short blog entry, I do not distinguish between our students as citizens and the rest of the public.
 By overconfidence I mean the belief that science alone can dictate policy, without prior political and ethical judgements or ethically and politically loaded bets on the future, and the illusion of control.
On the specific overstatement of the impact of nuclear scholars, and of the so-called “wizards of Armageddon” in particular, see Bruce Kuklick, Blind Oracles: Intellectuals and War from Kennan to Kissinger, Princeton: Princeton University Press, 2006; Marc Trachtenberg, “Social Scientists and National Security Policymaking”; and Francis J. Gavin, Nuclear Statecraft: History and Strategy in America’s Atomic Age, Ithaca: Cornell University Press, 2012. Marcus Raskin’s November 1963 critique of the ‘megadeath intellectuals’ remains a powerful expression of it.
On this problem in the U.S. and the U.K., see Steve Smith, “Power and Truth: A Reply to William Wallace,” Review of International Studies 23:4, Oct. 1997, 511-512; Ken Booth, “Discussion: A Reply to William Wallace,” Review of International Studies 23:3, June 1997, 372-373; and Thomas J. Biersteker, “The ‘Peculiar Problems’ of Scholarly Engagement in the Policy Process,” International Studies Review 10:1, 2008, 173-174. Let us remember the widely accepted claim that Soviet scientists had more influence on Soviet leaders because, in an unfree society, groups with access to leaders have less interference to overcome. See Matthew Evangelista, Unarmed Forces: The Transnational Movement to End the Cold War, Ithaca: Cornell University Press, 2002, and Kai-Henrik Barth, “Catalysts of Change: Scientists as Transnational Arms Control Advocates in the 1980s,” in Osiris 21: Global Power Knowledge: Science and Technology in International Affairs, ed. John Krige and Kai-Henrik Barth, 182-206.
This limited impact on policy is not specific to nuclear experts; it has been documented in other subfields of national security policy such as counterterrorism, particularly after 9/11. See Lisa Stampnitzky, Disciplining Terror: How Experts Invented ‘Terrorism’, Cambridge: Cambridge University Press, 2013.
Donald MacKenzie wrote about the ‘titanic effect’ in computer safety in “Computer-Related Accidental Death: An Empirical Exploration,” Science and Public Policy 24:1, 1994: “The safer a system is believed to be, the more catastrophic the accidents to which it is subject.” This notion has recently been applied to nuclear weapons safety and security by Eric Schlosser in Command and Control, New York: Allen Lane, 2013, 313. Dominic D. P. Johnson links overconfidence to the outbreak of war in Overconfidence and War: The Havoc and Glory of Positive Illusions, Cambridge, MA: Harvard University Press, 2004, and the heuristic biases producing overconfidence are described in Daniel Kahneman, Thinking, Fast and Slow, New York: Penguin, 2011, part III.
 For the purposes of this discussion, I put “crisis management” under the heading of deterrence.
Quoted in Ira Chernus, Apocalypse Management: Eisenhower and the Discourse of National Insecurity, Stanford: Stanford University Press, 2008, 186. On the problem of global nuclear vulnerability beyond the Eisenhower case, please see my edited collection in preparation under this title.
We know how constrained and self-censored the inner circles of nuclear weapons policy are. As George Perkovich recently noted: “the pressures to conform to orthodoxy stifle expression of alternate views when one is operating within the system.” George Perkovich, “Do unto Others: Towards a Defensible Nuclear Policy,” Washington DC: Carnegie Endowment for International Peace, 2013, 30.