Category Archives: Human Survival

Could the Earth ever become a Dark Forest?

In his Three-Body Problem trilogy, Liu Cixin builds his novels around the idea that the universe is a Dark Forest – i.e., when you’re moving through a dark forest and you hear the rustling of leaves, the optimal reaction is to shoot first.

More fully, the Dark Forest theory of the universe is built upon these first principles:

  1. The primary goal of each civilization is to survive.
  2. There are finite resources and space in the universe.
  3. Civilizations tend to expand.
  4. Civilizations tend to advance technologically.
  5. You have no way of truly knowing whether an alien species is peaceful or hostile.

So, if you detect an alien species – what do you do?

Under the Dark Forest theory, you kill them.

The reason you kill them is that even if they’re not hostile now, at some point they will want to survive, need more resources, and have advanced technology – which means they might just kill you.


I have no idea if the Dark Forest theory accurately describes the first principles of the universe.

But it made me think about something we might be able to understand with greater precision: could the Earth ever become a Dark Forest?


Right now, the Earth is not a Dark Forest largely because of nuclear deterrence.

North Korea might very well recognize that the existence of the United States will likely bring down their regime at some point, but they can’t act on this knowledge because we could respond to any nuclear attack with an attack that wipes them out.

Even for more robust nuclear powers, each side must live with the fact that a massive nuclear war could destroy all of humanity.

Culture also acts against the Earth becoming a Dark Forest. The scaling of large societies has in part been sustained through cultural evolution: we now identify with nation and world instead of just kin, which, presumably, partially mitigates the 5th aforementioned principle (lack of trust).

But these conditions are not immutable. So it’s worth considering: how could one-sided deterrence, mutually assured destruction, and trust end?


Unfortunately, it’s not hard to describe a scenario:

  1. There is a shortage of a resource necessary to a nation’s survival, which makes securing that resource more important than the benefits of trade.
  2. This shortage, along with the already significant cultural differences between existing rival nations (such as the USA and China), erodes trust.
  3. Technology advances in a manner that allows a nation launching a first strike to kill all other humans, prevent any return strike, and preserve itself.

How about this: there’s a water shortage that fuels nationalism, which leads to rising animosity between populous nations, and then one of them develops a synthetic virus that instantly kills all humans who haven’t received the vaccine – a vaccine that the attacking nation released into its own water supply the week before launching the attack.


I’m not an expert in these issues, so maybe I’ve gotten much wrong.

But if the universe can become a Dark Forest, the Earth probably can too.

If this is true, we’ll need a deterrence system for whatever set of weapons comes after nuclear warheads.

But what are the odds that for every new weapon we develop we’ll also near simultaneously have an equally strong deterrence system?

They don’t seem high.


Please do let me know where my logic is off.

Book Review: Homo Deus


Homo Deus is Yuval Harari’s follow-up to Sapiens, which was excellent.

I. Book Summary 

The Past 

For most of time, humans struggled to overcome three evils: famines, plagues, and wars.

In part because humans really had no good answers to these problems, God became the centerpiece of coping with these evils. It was God’s will, rather than human agency, that was the causal foundation for what happened on Earth.

The Turning Point 

The Enlightenment and the Industrial Revolution changed all this – rationality and science allowed humans to begin taming famines, plagues, and war – which also eroded God’s standing.

The Present 

Together, the emergence of the Enlightenment and the Industrial Revolution – as well as the decline of religion – led to a very turbulent 20th century, where numerous countries and societies experimented with new social structures.

Ultimately, capitalistic welfare states won out on the economic front, and Humanism (seeking meaning by looking inward rather than by following God’s will) is winning out on the social / spiritual front.

Because we’ve made so much progress defeating famine, plagues, and war – we’re now turning our attention to achieving immortality, happiness, and, ultimately, god-like abilities.

The Future

Humanistic capitalism will be threatened by the rise of robots / computers that will undermine the foundations of both humanism and capitalism.

Because machines will become more advanced than us, it won’t make sense for human intuition and reasoning to be the foundation for morality; and because machines will take over the human economy, human-centered capitalism / welfare states will no longer be the optimal way to structure an economy.

The two most likely futures are: techno-humanism (humans become part machine) or data-ism (humans become functionally obsolete and are replaced by intelligent machines that will likely not be conscious).

Harari indicates that techno-humanism would likely collapse on itself pretty quickly and that data-ism is our more likely future.

II. Harari is a Great Writer and Historian

It’s hard not to envy Harari as a writer: he’s logical, funny, insightful, and has an uncanny ability to elucidate complex subjects through pithy one-liners, stories, and thought experiments.

We’d all be a lot smarter if more non-fiction writers wrote with his intelligence.

Harari also does an incredible job of identifying and explaining the drivers of human material and cultural development.

III. Harari Adds Little to Futurism

Most of the main ideas in Harari’s analysis of the future can be found in deeper and more expansive works (by writers along the lines of Ray Kurzweil, Robin Hanson, etc.).

While Harari’s writing and analytical abilities make him a first class historian, these skills do less work in enabling him to make insightful predictions about the future.

Topics I would have thought were obvious candidates for deep exploration – such as a technical analysis of the computing power needed for a singularity-type event, as well as the underpinnings of consciousness – receive very little treatment.

Harari just argues that data-ism will likely occur and that we can’t really predict what that will be like.

I would have loved to read a much deeper analysis of how and when data-ism might occur, as well as some hard thinking about what economics and values might govern this new world.

Sapiens is required reading.

Homo Deus is worth reading, but, unfortunately, it’s not groundbreaking.

Is Our Democracy Good Enough?


The elections this week had me reflecting on democracy.

I find much of electoral politics to be madness.

I also find both parties’ political agendas to be frustratingly incomplete. There are many issues that threaten the future of our country, as well as all of humanity, and these issues only make up a small part of either party’s agenda.

Of course, democracy has many benefits.

Why Democracy is Great

1. Peaceful transitions of power.

2. A general check against government doing extremely awful things.

3. A general willingness to consider expert opinion.

Given our species’ terrible history of self-government, these three benefits should not be minimized.

But it would be highly surprising to me if our current form of democracy is the best our species will ever do.

So How Might Democracy be Improved?

In terms of substance, I think governance would be better if:

1. There was a tighter connection between delivering results and getting re-elected.

2. Policy creation weighted expert opinion much more than the median voter’s opinion.

3. Political agendas were more connected to existential threats facing our nation and humanity itself.

In short: more accountability, better decision-making, and better issue prioritization.

How Might We Structure Government to Deliver these Improvements?

1. We could change what government does. Reducing the role of government in operational activities (and increasing the role of markets) could increase accountability in these areas. Increasing the role of government in existential threat activities (by creating formal departments for these issues) could increase political prioritization of these issues. Or to put it another way: I would trade having a federal department of education for having a federal department of asteroids and volcanoes.

2. We could change how government selects policies. Per Robin Hanson, we could vote on goals and create prediction markets for policy selection. This could capture the power of expert opinion and market accountability while still allowing citizens to set the government’s agenda.

3. We could increase competition amongst governments. Open borders, charter cities, and voluntary annexation policy regimes could all increase innovation and accountability by forcing governments to compete for citizens.

Our Democracy is Not Good Enough

To answer the title of the post, our democracy is not good enough.

Too often, people think that the problem with government is simply that their preferred party doesn’t have full control.

But both parties continuously ignore existential threats to our species.

Additionally, people overemphasize the minor, but real, imperfections of our current system (lobbying, voter registration issues, gerrymandering, etc.).

But making our current structures better at the margins doesn’t seem to address the fundamental weakness of our form of democratic government.

What people don’t spend enough time on is debating how we might fundamentally restructure our democracy to increase the probability that our country will thrive and our species survive.

I’m not an expert in governance, so perhaps the ideas I threw out above wouldn’t really work. Personally, I think they’re worth trying, but I could of course be wrong.

But, despite not being an expert on how to improve our democracy, I do feel confident that our current form of governance is leading to suboptimal outcomes that are due, in part, to governmental structure.

And I will continue to think so until, at the very least, we have a federal department of avoiding extinction.