AI Risk Barometer

Measuring national security professionals’ perceptions of AI futures through a technically-informed survey

AI is advancing faster than our ability to govern it. As we move toward artificial general intelligence (AGI)—and possibly superintelligence (ASI)—these systems stand to reshape global security. Despite a growing discourse on AI and security, there remains limited empirical research on how national security actors perceive, assess, and prepare for the technological shifts and emerging risks associated with the potential for AGI/ASI. 

To address this gap, the Institute for Security and Technology (IST), with support from and in partnership with the Future of Life Institute (FLI), is launching the AI Risk Barometer project. The effort is inspired by the work of Nobel Laureate Arthur Compton, who during the Manhattan Project calculated the odds of a potentially catastrophic nuclear accident during a test, a figure now referred to as a “Compton Constant.” Then, as now, there are no clear right answers, and the project seeks to learn from leading national security stakeholders how they view the risks and opportunities of developing ever more powerful AI. This new effort seeks to elucidate AGI and ASI capability thresholds; potential benefits and harms, including a catastrophic AI loss-of-control scenario; timelines; the efficacy of potential governance approaches to mitigate risk; and policymakers’ risk appetites given the tradeoffs.

“National security leaders and AI researchers aren’t speaking the same language about the risks posed by cutting-edge AI. Policymakers need clear-eyed, evidence-based insights on AI risks, and on which governance tools can buy down those risks without stifling innovation.”

AI Risk Barometer Team

Philip Reiner

Chief Executive Officer

Steven M. Kelly

Chief Trust Officer

Mariami Tkeshelashvili

Director for Artificial Intelligence Security Policy

Ritika Verma

Senior Analyst for Artificial Intelligence Security Policy
