Artificial Escalation

This work of fiction seeks to depict key drivers that could result in a global AI catastrophe:

- Accidental conflict escalation at machine speeds;
- AI integrated too deeply into high-stakes functions;
- Humans giving away too much control to AI;
- Humans unable to tell what is real and what is fake; and
- An arms race that ultimately has only losers.

The good news is that all of these risks can be avoided. This story does not have to be our fate. Please share this video and learn more at

This video has been informed by a 2020 report from the Stockholm International Peace Research Institute (SIPRI): Boulanin, Vincent, et al., 'Artificial Intelligence, Strategic Stability and Nuclear Risk' (SIPRI, 2020).

The sequel to this video: