Archive: artificial intelligence
Report
NC3 and Crisis Instability: Growing Dangers in the 21st Century
Daryl Press examines the growing threats to nuclear command, control, and communications (NC3) systems around the world, and the link between vulnerable NC3 and strategic instability created by the risky steps nuclear weapons states may take to protect their arsenals during crises or wars.
Tags: accident, accuracy revolution, air defense, alert level, arms race, artificial intelligence, cold war, conventional weapon, crisis instability, data, decision-making, deterrence, escalation, force planning, modernization, NATO, NC3, nuclear, preemptive strike, stability, survivability
October 17, 2019
AAR, Report
Roundtable Discussion: AI and Human Decision Making
On June 29, 2018, Technology for Global Security and the Center for Global Security Research hosted a roundtable discussion on AI and human decision making. The discussion investigated the potential security implications of AI technologies as they are considered for military use, and was attended by a mix of academics, research scientists, venture capitalists, civil society representatives, and industry practitioners.
Tags: artificial intelligence, autonomous weapons, black box, cyber attacks, decision-making, deterrence, disinformation, ethics, military, stability
November 28, 2018
Report
AI and Human Decision Making: AI and the Battlefield
As the 21st-century geopolitical balance shifts in uncertain ways, there is growing eagerness to deploy AI technologies on both the physical and digital battlefields to gain tactical and strategic advantage over adversaries. Based on workshops held by IST and the Center for Global Security Research, "AI and Human Decision-Making" considers the implications of AI and machine learning for the physical and digital battlefield.
Tags: algorithms, arms race, artificial intelligence, black box, decision-making, fog of war, military, wargaming
November 28, 2018
AAR, Report
Assessing the Strategic Effects of Artificial Intelligence
On September 20-21, 2018, the Center for Global Security Research (CGSR) at Lawrence Livermore National Laboratory (LLNL), in collaboration with IST, hosted a workshop to examine the implications of advances in artificial intelligence (AI) on international security and strategic stability.
Tags: artificial intelligence, decision-making, deterrence, instability, modernization, nuclear war, quantum, stability, surveillance
September 20, 2018
AAR, Report
AI and the Military: Forever Altering Strategic Stability
In 2018, IST and the Center for Global Security Research convened workshops to explore the role of AI in the 21st-century security context. Should we be concerned about an "AI arms race"? What are the risks of unintended consequences and strategic surprise driven by AI?
Tags: arms race, artificial intelligence, drone, electromagnetic pulse, machine learning, national security, strategic stability
February 13, 2019