In the last ten years, there has been an uptick in attention paid to existential threats (threats that could wipe out humanity). This is potentially great news.
Last night, I watched an episode of Elementary, which is one of my favorite television shows. The episode’s plot revolved around existential threats, with a focus on artificial intelligence.
That was enough to get me to write this post.
___
A couple of years ago, I read Nick Bostrom's catalogue of the various threats that might lead to human extinction.
Since then, I’ve maintained a passing interest in the field. I even went to the Singularity Summit.
Over the past few weeks, I’ve been poking around the internet, trying to get more caught up on the field.
The good news: there seem to be a lot of talented people working on these issues.
The bad news: I’ve found very little publicly available data analysis on the issue. I was curious which risks were most likely to occur, which were most solvable by human intervention, and how many resources were currently being devoted to each.
I found very little of this information. Of course, perhaps it exists in secret government departments, or perhaps the research exists and I just did a poor job of finding it.
I did see that the Future of Humanity Institute has launched a Global Priorities Project, which I think aims to answer some of these questions. The Centre for the Study of Existential Risk also seems to be working on the issue. But neither has put out reports that I could find.
Overall, I was pretty surprised at how little easily accessible information was out there.
___
I’d love to see the data I mentioned above (and that I tried to capture in the bubble chart below).
Note: I spent 30 minutes creating this chart. I don’t think I’m right on any of the values I placed on these threats. I just wanted to try and create an easy way to visualize the problem.
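For anyone who wants to tinker with a chart like this themselves, here’s a minimal sketch of how one could be built with Python and matplotlib. The risk names and numbers below are hypothetical placeholders, not estimates of anything; the three dimensions are the ones mentioned above (likelihood, solvability, and resources currently devoted).

```python
# A minimal sketch of a bubble chart for comparing existential risks.
# All names and values here are hypothetical placeholders, not actual estimates.
import matplotlib.pyplot as plt

# Each entry: (risk name, likelihood, solvability, resources devoted)
# Likelihood and solvability are on arbitrary 0-10 scales; resources set bubble size.
risks = [
    ("Artificial intelligence", 4.0, 6.0, 30),
    ("Engineered pandemic",     5.0, 7.0, 60),
    ("Nuclear war",             3.0, 5.0, 80),
    ("Asteroid impact",         1.0, 8.0, 40),
]

names = [r[0] for r in risks]
likelihood = [r[1] for r in risks]
solvability = [r[2] for r in risks]
resources = [r[3] for r in risks]

fig, ax = plt.subplots(figsize=(8, 6))
# Bubble area scaled by the resources currently devoted to each risk.
ax.scatter(likelihood, solvability, s=[20 * r for r in resources], alpha=0.5)

# Label each bubble with its risk name.
for name, x, y in zip(names, likelihood, solvability):
    ax.annotate(name, (x, y), ha="center", va="center", fontsize=9)

ax.set_xlabel("Likelihood of occurring (arbitrary scale)")
ax.set_ylabel("Solvability by human intervention (arbitrary scale)")
ax.set_title("Existential risks: likelihood vs. solvability (bubble size = resources)")
plt.tight_layout()
plt.show()
```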
Does anyone know if such data exists in an easily digestible format?