Stanford Existential Risks Initiative
The Stanford Existential Risks Initiative (SERI) is a collaboration between Stanford faculty and students dedicated to mitigating global catastrophic risks (GCRs). Our goal is to foster engagement from students and professors in meaningful work to preserve the future of humanity by providing skill and knowledge development, networking, and professional pathways for Stanford community members interested in pursuing GCR reduction. Our concrete programming includes a summer research program, speaker events, discussion groups, and Preventing Human Extinction (THINK 65), a Thinking Matters class taught by the initiative's two faculty advisors.
What is a Global Catastrophic Risk?
We think of global catastrophic risks (GCRs) as risks that could cause the collapse of human civilization or even the extinction of the human species. Prominent examples of human-driven GCRs include 1) nuclear war, 2) an infectious disease pandemic engineered by malevolent actors using synthetic biology, 3) hostile or uncontrolled deployments of artificial intelligence, and 4) climate change and other environmental degradation creating biological and physical conditions under which human civilization could not thrive. Other significant GCRs exist as well, and we welcome proposals that address them.
Summer 2020 Projects
SERI’s inaugural cohort of summer undergraduate research fellows presented their work at a symposium on Friday, August 28, from 10 AM to 12 PM PT. We are putting together a compendium of all the projects; in the meantime, check out the abstract book.
SERI ran its inaugural Summer Undergraduate Research Fellowship this past summer, funding 20 undergraduate students to work with a mentor (a faculty member or industry professional) on a 10-week research project dedicated to mitigating global catastrophic risks. Through speaker events, discussion groups, and social events, we also educated the cohort about the broader field of existential risk, built up our community of existential risk-focused researchers, and gave students opportunities to explore career pathways.
Projects from the program include:
- exploring the potential of data visualization to communicate compelling GCR-scale arguments and counter cognitive biases;
- applying models of economic growth to AI-enabled technological growth, in hopes of understanding plausible timelines for transformative AI; and
- designing an ideal building fitted with a variety of engineering non-pharmaceutical interventions to mitigate both seasonal and catastrophic infectious agents.
THINK 65, Preventing Human Extinction, is a class designed for Stanford freshmen interested in exploring topics around global catastrophic risk. The class has been taught for the last two years by SERI’s faculty leaders, Paul Edwards and Steve Luby. Students in THINK 65 engage with plausible scenarios by which catastrophe could occur, as well as with prospective solutions. They also discuss the psychological, social, and epistemological barriers that inhibit society from recognizing and evaluating these threats.
More information about the class can be found here: https://www.stanforddaily.com/2020/06/01/think-65-examines-paths-to-human-extinction/
SERI is a young group, founded in spring 2020, and we have many other plans in the works. Check back soon to find out more, or get in touch at email@example.com!
If you want to learn more about existential or global catastrophic risks, here is a list of resources to get you started: