Carl Robichaud joins the podcast to discuss the new nuclear arms race, how much world leaders and ideologies matter for nuclear risk, and how to reach a stable, low-risk era. You can learn more about Carl's work here: https://www.longview.org/about/carl-robichaud/

Timestamps:
00:00 A new nuclear arms race
08:07 How much do world leaders matter?
18:04 How much does ideology matter?
22:14 Do nuclear weapons cause stable peace?
31:29 North Korea
34:01 Have we overestimated nuclear risk?
43:24 Time pressure in nuclear decisions
52:00 Why so many nuclear warheads?
1:02:17 Has containment been successful?
1:11:34 Coordination mechanisms
1:16:31 Technological innovations
1:25:57 Public perception of nuclear risk
1:29:52 Easier access to nuclear weapons
1:33:31 Reaching a stable, low-risk era
Adam Gleave, CEO of FAR.AI, discusses post-AGI scenarios, risks of gradual disempowerment, defense-in-depth safety strategies, scalable oversight for AI deception, and the challenges of interpretability, as well as FAR.AI's integrated research and policy work.
Nate Soares discusses his book on the risks of superintelligent AI, arguing that current approaches make AI unpredictable and uncontrollable, and advocating for an international ban on research toward superintelligence.
Tom Davidson discusses the risks of AI-enabled coups, examining how advanced artificial intelligence could facilitate covert power grabs and alter democratic processes, and outlining strategies to mitigate these emerging threats.