The New Nuclear Age: At the Precipice of Armageddon – IST Hosts Book Talk with Author Ankit Panda

November 17, 2025

IST’s Nuclear Policy team hosted international security expert and author Ankit Panda in Palo Alto to learn more about his latest book, which unpacks the trilateral nuclear competition between the United States, China, and Russia. IST CEO Philip Reiner sat down with Ankit for a fireside chat, highlighting the need to collectively identify political and technical solutions to address the nexus of emerging technologies and the new nuclear age.

On October 23, 2025, the Institute for Security and Technology’s (IST) Nuclear Policy team hosted author and international security expert Ankit Panda in Palo Alto, California for a convening on his new book, The New Nuclear Age: At the Precipice of Armageddon. The public event, followed by a private lunch roundtable, connected Bay Area national security experts and technologists with the DC policy community invested in reducing nuclear risk. The event facilitated an open discussion about the future of the global nuclear order and the direction of U.S. nuclear policy to address emerging security challenges. 

Ankit Panda is the Stanton Senior Fellow in the Nuclear Policy Program at the Carnegie Endowment for International Peace in Washington, DC. Ankit is a frequent expert commentator in print and broadcast media around the world on nuclear policy and defense matters. He has consulted for the United Nations in New York and Geneva, and his analysis has been sought by U.S. Strategic Command, Space Command, and Indo-Pacific Command. His engagement with industry leaders working on emerging technologies that affect nuclear stability is an essential component of his policy research, bridging the priorities of technologists with stakeholders in nuclear policy planning. 

The conversation, a fireside chat with IST CEO Philip Reiner, charted the key points of Ankit’s book, which covers the developing trilateral nuclear competition between the United States, China, and Russia. Introducing the book, Philip offered context: “What’s really critical about [this book] at this juncture is, for the longest time…nuclear weapons were really not at the forefront of international security conversations. That has fundamentally shifted,” he said. 

The discussion first characterized the nature of the relationship between Russia and China, where Philip observed a high level of cooperation on “a whole variety of different capabilities, strategy, and command and control,” particularly in the context of the war in Ukraine. 

Ankit countered that the nature of the relationship is based on a degree of mistrust, noting that “as difficult as alliance management is for democracies, it’s actually harder, I would argue, for persistently distrustful authoritarian leaders. Despite these misgivings, U.S. nuclear planners have to consider the possibility, however unlikely, of coordinated nuclear strikes against its homeland.” He noted, “the United States will have to accept a level of risk in its nuclear strategy heading into this new nuclear age that [it’s] been previously unwilling to accept” because of resource constraints in the defense industrial base, its unwillingness to pursue an arms race, and expanding cooperation between nuclear-armed adversaries. 

Given the increasing presence of emerging and disruptive technologies in the nuclear field, Philip questioned whether “AI means anything for strategic stability, fundamentally.” Ankit offered insights on the potential opportunities and risks of how artificial intelligence (AI) affects nuclear stability. He pointed out the “extremes” of AI’s potential to enable states’ Intelligence, Surveillance, and Reconnaissance (ISR) capabilities both to locate mobile nuclear forces, such as submarines and mobile missile launchers, and to carry out “deception, camouflaging, and concealment” of nuclear forces. However, Ankit maintained that he is not convinced this is the future toward which the world is definitively heading. He highlighted innovation’s potential to be destabilizing in itself, saying that “the actual observable effects of the technology in the real world matter less than the expectation of what that technology can accomplish.” 

Philip and Ankit also debated whether emerging technologies could be a stabilizing factor in the nuclear order. Ankit argued that new delivery systems like hypersonic weapons “inflict a condition of vulnerability that renders defense undesirable or infeasible,” which would deter states from escalating to nuclear use. Philip responded that he “wasn’t quite sure if [he] agrees or disagrees.” Even with these potentially destabilizing factors, Ankit asserted that the threshold for the first use of nuclear weapons remains high so long as the stakes are not existential for the survival of the state. He noted, “the [United States] will be hard-pressed to see the first use of nuclear weapons as genuinely advancing its political stakes even if it’s a [tactically] prudent choice.”

This event highlighted the need to collectively identify political and technical solutions to address the nexus of emerging technologies and the new nuclear age, where there are more multipolar dynamics and potential flashpoints for conflict than ever before. In particular, the nuclear policy community and technologists must come together in Track 1.5 workshops, working groups, and exercises to understand national security priorities and identify points of collaboration in order to facilitate decision-making in a crisis and reduce nuclear risk. In addition, as Ankit pointed out, the nuclear community needs to support the next generation of nuclear policy professionals through investments from “government, civil society, and philanthropists.” To support this goal, IST will continue to act as a bridge-builder, facilitating engagements among the national security and technical communities by hosting events like this book talk, as well as workshops and exercises. 

IST operates at the forefront of nuclear risk reduction efforts and the nexus of nuclear weapons and emerging technologies. IST’s AI and NC3 project, supported by Longview Philanthropy, is pioneering action-oriented efforts to explore how the integration of artificial intelligence with nuclear command, control, and communications (NC3) could pose risks, or present opportunities, for nuclear-armed states. The team recently released a report, “Artificial Intelligence in Nuclear Command, Control & Communications: A Technical Primer,” which examines what constitutes ‘novel’ AI in the context of nuclear weapons decision-making. 
