Apocalyptic imaginaries: Risk and regulation in discourses of military AI and nuclear weapons
ABSTRACT
Discourses on nuclear weapons and military applications of artificial intelligence (AI) portray them either as apocalyptic super weapons posing catastrophic risks, or as indispensable to states’ national survival and the international security architecture. At the same time, debates over and efforts to regulate these weapons have become contested, stalled, or even abandoned. We examine the intersection of both regulatory discourses by asking: How do contemporary apocalyptic discourses about military AI and nuclear weapons shape international security governance of these technologies? Through the concept of apocalyptic imaginaries, we analyze how future-oriented visions capture the simultaneously utopian and dystopian implications of destructive technologies. We identify two cross-cutting apocalyptic imaginaries—exceptionalism and control—that produce specific security governance practices. Our findings reveal how shared apocalyptic imaginaries shape regulatory approaches, increasingly prioritizing risk management and non-proliferation over systemic discussions of disarmament or preventive prohibitions.