
>> Even though it's science fiction, it actually seems hard to argue with the theory in the book -- that civilizations must act to eliminate each other or they are overwhelmingly likely to be eliminated themselves.

This is a great conceit for a sci-fi novel (two, actually) but I wouldn't take it too seriously as a real-life argument. To begin with, it's a theory about essentially evolutionary imperatives. It says that every civilisation, regardless of species, location, or any other condition, will eventually reach the same conclusion about the nature of the universe and the rules defining the co-existence of civilisations inside it.

To put it lightly, that is a very long stretch. We have no way of knowing how a non-human civilisation will think, or even whether it will "think" in the same sense that we do (they may be space-faring social insects without any real capacity for conscious thought, say: a classic sci-fi trope). We have no way to know what goals and rules such a civilisation might choose for itself. Some may see it as their duty to protect and help life in the universe, as something unique and irreplaceable. Some might indeed see it as a survival imperative to go out and destroy or conquer other civilisations. Some might choose to develop along a sustainable path that does not end with them exhausting their local systems' resources (eliminating the second axiom of cosmic sociology).

It's a huge universe. There's space for any number of different outlooks on life, the universe and everything. I don't doubt that the Dark Forest theory is one thing that may well have arisen somewhere, somewhen, but to assume that everything that is conscious in the universe will have "developed the hiding gene and the cleansing gene", that's just assuming way, way too much (and first of all, about the way genetic information is transferred).

We don't know anything about other technological civilisations. Caution is well advised, but part of being cautious is not assuming that we understand how they will necessarily think.




Not everyone reaches the conclusions of the Dark Forest.

But those who don't reach it are exterminated.


No.

There are about 100 other parameters whose relative balance determines the game-theoretic solutions to the problem. And as with many things, Cixin Liu handwaves away all that complexity in service of narrative.

Don't confuse fiction with reality.

Look at the Drake Equation as a much simpler example. Even given its simplicity, small tweaks in assumptions produce wildly different projections.
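To see how sensitive it is, here's a rough sketch in Python. The parameter values below are purely illustrative guesses, not measurements; the point is only that nudging a few of the fractional terms swings the result by orders of magnitude.

    # Drake equation: N = R* * fp * ne * fl * fi * fc * L
    # N = number of detectable civilisations in the galaxy
    def drake(r_star, f_p, n_e, f_l, f_i, f_c, lifetime):
        return r_star * f_p * n_e * f_l * f_i * f_c * lifetime

    # Optimistic-ish guesses (illustrative only)
    print(drake(r_star=2, f_p=0.9, n_e=0.5, f_l=0.5, f_i=0.2, f_c=0.5, lifetime=10_000))
    # -> ~450

    # Pessimistic-ish guesses (illustrative only)
    print(drake(r_star=1, f_p=0.5, n_e=0.1, f_l=0.01, f_i=0.01, f_c=0.1, lifetime=1_000))
    # -> ~0.0005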


> There are about 100 other parameters whose relative balance determine game theoretical solutions to the problem.

Name three.

I'm by no means an expert on game theory, but if you accept the two axioms that are quite explicitly stated in the story, "dark forest" seems to follow quite naturally.
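For what it's worth, here's a toy illustration of that argument in Python. The payoff numbers are completely made up; they only encode the two axioms (survival first, finite resources) in the pessimistic way the novel does, where being struck while staying quiet is by far the worst outcome.

    # Row player's payoffs: first entry = our choice, second = their choice.
    # Made-up numbers; only their ordering matters.
    payoffs = {
        ("coexist", "coexist"):    1,  # peaceful sharing of finite resources
        ("coexist", "strike"):  -100,  # we are wiped out
        ("strike",  "coexist"):    2,  # we remove a future competitor
        ("strike",  "strike"):   -10,  # mutual damage
    }

    def best_reply(their_move):
        return max(("coexist", "strike"), key=lambda ours: payoffs[(ours, their_move)])

    print(best_reply("coexist"))  # -> strike
    print(best_reply("strike"))   # -> strike (striking dominates under these payoffs)

Of course, change those numbers (say, make alien ideas valuable or make striking costly and detectable) and the dominant strategy disappears, which is rather the parent's point.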


1. The value of alien ideas being introduced into a culture.

2. The rarity of habitable planetoids.

3. How quickly civilizations spread within their local area.


Not confusing fiction with reality. Simply stating the premise of the novel.

I don't object to anyone handwaving away complexity in service of narrative. Virtually all authors do that. (Two exceptions that come to mind are Tolstoy's War and Peace, and Proust's Remembrance of Things Past.)


In the books, or in the real world? In the real world there are many competing theories about why the universe is silent:

https://en.wikipedia.org/wiki/Fermi_paradox


In the books. It's just the author's proposed answer to the Fermi Paradox. And it's a compelling and terrifying theory.


Only if the means to exterminate them exist, which in the books are rather... speculative, and hinge on the utter idiocy of human civilization.



