In the Three-Body Problem trilogy, Liu Cixin builds his novels around the idea that the universe is a Dark Forest: when you’re moving through a dark forest and you hear the rustling of leaves, the optimal reaction is to shoot first.
More fully, the Dark Forest theory of the universe is built upon these first principles:
- The primary goal of each civilization is to survive.
- There are finite resources and space in the universe.
- Civilizations tend to expand.
- Civilizations tend to advance technologically.
- You have no way of truly knowing whether an alien species is peaceful or hostile.
So, if you detect an alien species – what do you do?
Under the Dark Forest theory, you kill them.
The reason you kill them is that even if they’re not hostile now, at some point they will want to survive, will need more resources, and will have more advanced technology – which means they might just kill you.
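The logic above is essentially an expected-value argument: if a first strike guarantees your survival and waiting carries any chance of eventual annihilation, waiting is strictly worse. Here is a toy sketch of that calculation – all the probabilities and simplifying assumptions (a first strike always succeeds, waiting only risks their strike) are illustrative, not claims from the novels:

```python
# Toy expected-value sketch of the Dark Forest first-strike logic.
# All numbers and simplifications below are illustrative assumptions.

def survival_probability(strike_first: bool,
                         p_hostile_eventually: float,
                         p_their_strike_succeeds: float) -> float:
    """Crude model of our civilization's survival odds.

    Assumptions (for illustration only):
    - our first strike always succeeds, so striking first means certain survival;
    - if we wait, we die only if they eventually turn hostile AND
      their strike succeeds.
    """
    if strike_first:
        return 1.0
    return 1.0 - p_hostile_eventually * p_their_strike_succeeds

# Even at modest odds of eventual hostility, waiting is strictly worse:
wait = survival_probability(False, p_hostile_eventually=0.2,
                            p_their_strike_succeeds=0.5)
strike = survival_probability(True, 0.2, 0.5)
print(f"strike first: {strike}, wait: {wait}")  # strike first: 1.0, wait: 0.9
```

Under these assumptions the survival-maximizing civilization always shoots first, no matter how small the chance of hostility – which is exactly the chilling core of the theory.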
I have no idea if the Dark Forest theory accurately describes the first principles of the universe.
But it made me think about something we might be able to understand with greater precision: could the Earth ever become a Dark Forest?
Right now, the Earth is not a Dark Forest largely because of nuclear deterrence.
North Korea might very well recognize that the existence of the United States will likely bring down their regime at some point, but they can’t act on this knowledge because we could respond to any nuclear attack with an attack that wipes them out.
Even for more robust nuclear powers, each side must live with the fact that a massive nuclear war could destroy all of humanity.
Culture also acts against the Earth becoming a Dark Forest. The scaling of large societies has in part been sustained through cultural evolution: we now identify with nation and world instead of just kin, which, presumably, partially mitigates the fifth principle above (lack of trust).
But these conditions are not immutable, so it’s worth considering: how could one-sided deterrence, mutually assured destruction, and trust… end?
Unfortunately, it’s not hard to describe a scenario:
- There is a shortage of a resource necessary to a nation’s survival, which makes securing that resource more important than the benefits of trade.
- This shortage, as well as the already significant cultural differences between existing rival nations (such as USA and China), erode trust.
- Technology advances in a manner that allows a nation launching a first strike to kill all other humans, prevent any return strike, and preserve the attacking nation.
How about this: a water shortage fuels nationalism, which leads to rising animosity between populous nations, and then one of them develops a synthetic virus that instantly kills every human who hasn’t received the vaccine – a vaccine the attacking nation released into its own water supply the week before launching the attack.
I’m not an expert in these issues, so maybe I’ve gotten much wrong.
But if the universe can become a Dark Forest, the Earth probably can too.
If this is true, we’ll need a deterrence system for whatever set of weapons come after nuclear warheads.
But what are the odds that, for every new weapon we develop, we’ll also develop, near simultaneously, an equally strong deterrence system?
They don’t seem high.
Please do let me know where my logic is off.