Rick Rosner: So, we can think of three major existential threats to humanity: AI run amok, climate change, and nuclear war. And then there’s the more manageable threat of an asteroid impact. Can you think of any other major risks?

Scott Douglas Jacobsen: Supervolcanic eruption. Biological weapons. Solar flares. Gamma-ray bursts, global economic and societal collapse, …
Continue reading Ask A Genius 1296: Existential Risks, AI Alignment, and Global Stability