Archive: compliance failure
Report
Navigating AI Compliance, Part 2: Risk Mitigation Strategies for Safeguarding Against Future Failures
"Navigating AI Compliance, Part 2: Risk Mitigation Strategies for Safeguarding Against Future Failures" presents 39 risk mitigation strategies co-created by a multistakeholder working group of experts that aim to avoid institutional, procedural, and performance failures of AI systems.
AI, artificial intelligence, compliance failure, infrastructure protection, institutional failures, performance failures, procedural failures, risk reduction
March 20, 2025
Blog
Patrick J. McGovern Foundation Renews Commitment to Supporting IST’s AI Risk Reduction Efforts
Over the last two years, with the support of the Patrick J. McGovern Foundation, the Institute for Security and Technology (IST) has been on a mission to assess the risks and opportunities associated with the development and deployment of cutting-edge AI foundation models. IST is again excited to announce renewed support from the Patrick J. McGovern Foundation to further advance this vital work.
AI foundation models, artificial intelligence, compliance failure, malicious use, risk reduction
March 18, 2025
Blog
Q&A: Navigating AI Compliance
In this month's newsletter, we sat down with Senior Associate for Artificial Intelligence Security Policy Mariami Tkeshelashvili to learn more about the research process behind IST's latest report "Navigating AI Compliance, Part 1: Tracing Failure Patterns Through History," their findings, and what's next for this effort.
artificial intelligence, case study, compliance failure, GDPR, governance, risk-mitigation
December 17, 2024
Report
How Does Access Impact Risk? Assessing AI Foundation Model Risk Along a Gradient of Access
Uninhibited access to powerful AI models and their components significantly increases the risk these models pose across a range of categories, as well as the ability of malicious actors to abuse AI capabilities and cause harm.
artificial intelligence, bias, capability overhang, compliance failure, foundation model, human out of the loop, LLMs, malicious use, risk-mitigation
December 13, 2023