  1. avesque 22 hours ago [-]
  2.  
  3. A bit off-topic, but I've kind of struggled mentally since finishing The Dark Forest. Even though it's science fiction, it actually seems hard to argue with the theory in the book -- that civilizations must act to eliminate each other or they are overwhelmingly likely to be eliminated themselves. I'd like to believe it's not true, but so long as any two civilizations are likely to have dramatically different rates of technological advancement and so long as crossing the gaps in space between civilizations takes sufficiently long due to the laws of physics, it seems hard to deny that there might be strong reasons for civilizations to fear each other.
  4. reply
  5.  
  6.  
  7. TeMPOraL 16 hours ago [-]
  8.  
  9. Keep in mind that the dark forest theory hinges on two axioms that will likely be true for interstellar relations while remaining false for terrestrial ones. The two axioms are clearly stated in the books: 1) large communication delay, and 2) unpredictable technological jumps. The idea of the dark forest is that communication delay makes it difficult to build trust, and even if you start in a superior position, it's possible for the other side to leap ahead of you technologically in the middle of those very long talks.
  10. A hypothetical Earth scenario would be an early industrial empire contacting an age-of-sail empire across the ocean via letters sent on whales, only to have the latter suddenly respond with an ICBM.
  11.  
  12. reply
  13.  
  14.  
  15. YeGoblynQueenne 22 hours ago [-]
  16.  
  17. >> Even though it's science fiction, it actually seems hard to argue with the theory in the book -- that civilizations must act to eliminate each other or they are overwhelmingly likely to be eliminated themselves.
  18. This is a great conceit for a sci-fi novel (two, actually) but I wouldn't take it too seriously as a real-life argument. To begin with, it's a theory about essentially evolutionary imperatives. It says that every civilisation, regardless of species, location, or any other condition, will eventually reach the same conclusion about the nature of the universe and the rules defining the co-existence of civilisations inside it.
  19.  
  20. To put it lightly, that is a very long stretch. We have no way to know anything about how a non-human civilisation will think, or even if it will "think" in the same sense that we do (they may be space-faring social insects without a real capacity for conscious thought, say- a classic sci-fi trope). We have no way to know what goals and rules such a civilisation might choose for itself. Some may see it as their duty to protect and help life in the universe, as something unique and irreplaceable. Some might indeed see it as a survival imperative to go out and destroy or conquer other civilisations. Some might choose to develop along a sustainable path that does not end with them exhausting their local system's resources (eliminating the second axiom of cosmic sociology).
  21.  
  22. It's a huge universe. There's space for any number of different outlooks on life, the universe and everything. I don't doubt that the Dark Forest theory is one thing that may well have arisen somewhere, somewhen, but to assume that everything that is conscious in the universe will have "developed the hiding gene and the cleansing gene", that's just assuming way, way too much (and first of all- about the way genetic information is transferred).
  23.  
  24. We don't know anything about other technological civilisations. Cautiousness is well advised, but part of that is not assuming that we understand how they will necessarily think.
  25.  
  26. reply
  27.  
  28.  
  29. beat 21 hours ago [-]
  30.  
  31. Not everyone reaches the conclusions of the Dark Forest.
  32. But those who don't reach it are exterminated.
  33.  
  34. reply
  35.  
  36.  
  37. ethbro 21 hours ago [-]
  38.  
  39. No.
  40. There are about 100 other parameters whose relative balance determines game-theoretical solutions to the problem. And as with many things, Cixin Liu handwaves away all that complexity in service of narrative.
  41.  
  42. Don't confuse fiction with reality.
  43.  
  44. Look at the Drake Equation as a much simpler example. Even given its simplicity, small tweaks in assumptions produce wildly different projections.
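To make that sensitivity concrete, here is a minimal Python sketch of the Drake equation with two made-up parameter sets; every value below is an illustrative assumption, not an estimate from any source.

    # Drake equation: N = R* x fp x ne x fl x fi x fc x L
    # Both parameter sets are illustrative assumptions only.
    def drake(r_star, f_p, n_e, f_l, f_i, f_c, lifetime):
        """Expected number of detectable civilizations in the galaxy."""
        return r_star * f_p * n_e * f_l * f_i * f_c * lifetime

    optimistic  = drake(r_star=2, f_p=0.9, n_e=1.0, f_l=0.5,  f_i=0.5,  f_c=0.5, lifetime=10_000)
    pessimistic = drake(r_star=1, f_p=0.5, n_e=0.1, f_l=0.01, f_i=0.01, f_c=0.1, lifetime=1_000)

    print(optimistic)   # ~2250   - a galaxy full of neighbours
    print(pessimistic)  # ~0.0005 - effectively alone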
  45.  
  46. reply
  47.  
  48.  
  49. TeMPOraL 16 hours ago [-]
  50.  
  51. > There are about 100 other parameters whose relative balance determines game-theoretical solutions to the problem.
  52. Name three.
  53.  
  54. I'm by no means an expert on game theory, but accepting the two axioms that are quite explicitly stated in the story, "dark forest" seems to follow quite naturally.
  55.  
  56. reply
  57.  
  58.  
  59. beat 20 hours ago [-]
  60.  
  61. Not confusing fiction with reality. Simply stating the premise of the novel.
  62. I don't object to anyone handwaving away complexity in service of narrative. Virtually all authors do that. (Two exceptions that come to mind are Tolstoy's War and Peace, and Proust's Remembrance of Things Past.)
  63.  
  64. reply
  65.  
  66.  
  67. YeGoblynQueenne 21 hours ago [-]
  68.  
  69. In the books, or in the real world? In the real world there are many competing theories about why the universe is silent:
  70. https://en.wikipedia.org/wiki/Fermi_paradox
  71.  
  72. reply
  73.  
  74.  
  75. beat 21 hours ago [-]
  76.  
  77. In the books. It's just the author's theory for the Fermi Paradox. And it's a compelling and terrifying theory.
  78. reply
  79.  
  80.  
  81. hoseja 8 hours ago [-]
  82.  
  83. Only if there are means to exterminate them, which in the books are rather... speculative, hinging on the utter idiocy of human civilization.
  84. reply
  85.  
  86.  
  87. 013a 20 hours ago [-]
  88.  
  89. It's an interesting theory. IMO, destroying a civilization is probably harder than communicating with it, and the advantages of keeping one around probably outweigh the advantages of killing it if you're already more technically advanced.
  90. A big point in TDF is that the speed of light limits our ability to communicate with and understand other civilizations. But, by extension, weapons are also bound by that limit. Of course, maybe it's possible to break the speed of light, but then communication could also break this limit.
  91.  
  92. So, if you're a technologically advanced alien civilization and you become aware of humanity, you can send an Envoy or you can send some Nukes (or, of course, you can do nothing and watch). Just within the context of The Dark Forest Theory, there's practically no reason to send Nukes; at worst, you discover that humanity sucks and then you go ahead and nuke us. But life is probably special; maybe humanity would make great trading partners, maybe our brains function differently and can solve problems differently, maybe our natural proclivity for war would make us great allies, or maybe we'd make great slaves for the mines on the Vartoth 12 colony. All of these are options for a civilization a thousand years ahead of us; they can pick what they'd like, but none of them would classify as a result of The Dark Forest theory.
  93.  
  94. There's also a possibility that maybe this hypothetical alien civilization tried that course of action in the past with another civilization, and they were friends for a while, helped each other, then fought in a bloody war, and now that civilization is distrustful, so they destroy any young civilizations while they still easily can instead of even trying to come to terms. But that's not a Dark Forest theory situation; the Dark Forest Theory suggests that it's natural for civilizations to want to kill each other off.
  95.  
  96. reply
  97.  
  98.  
  99. shmed 17 hours ago [-]
  100.  
  101. Cixin's book covers those aspects pretty well. One of the main elements behind his version of the Dark Forest theory is something he calls the "chain of suspicion". He argues that in a world where communication is limited by the speed of light, there's no way to establish a coherent dialogue during first contact between two civilizations that would allow you to completely trust the other party. If you cannot trust that the other party will not destroy you given the chance, then the only way for you to be guaranteed not to be destroyed is to destroy them first. Even if both parties want peace, there's no way for you to convince the other party that you want peace without also giving them the time to destroy you. It's basically a game theory situation where trying to go for peaceful communication is way too risky, and the stakes in play are the survival of your civilization. It's also implied that civilizations that played the "peace" card will simply get eliminated the moment they encounter someone playing the "destroy" card, making you much less likely to actually encounter a peaceful civilization.
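A toy Python sketch of that reasoning, with payoffs chosen purely as illustrative assumptions: survival = 1, destruction = 0, a strike is assumed cheap and unanswerable, and p is the probability that the other side is (or becomes) hostile and capable, which the chain of suspicion says you can never verify to be zero.

    # Toy model of the chain-of-suspicion argument; all numbers are illustrative.
    def expected_payoff(my_move, p_hostile):
        if my_move == "strike":
            return 1.0                   # the other side is gone; nothing left to fear from them
        return 1.0 * (1 - p_hostile)     # stay peaceful: survive only if they never strike

    for p in (0.01, 0.5, 0.99):
        print(p, expected_payoff("peace", p), expected_payoff("strike", p))
    # "strike" is never worse, and strictly better for any p > 0,
    # which is exactly why the peaceful strategy keeps getting weeded out.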
  102. reply
  103.  
  104.  
  105. tomxor 17 hours ago [-]
  106.  
  107. > But life is probably special; maybe humanity would make great trading partners, maybe our brains function differently and can solve problems differently, maybe our natural proclivity for war would make us great allies, or maybe we'd make great slaves for the mines on the Vartoth 12 colony.
  108. Slight digression, and I know you're half joking, but: I always find the believability lacking in sci-fi that suggests a civilisation capable of faster than light travel could possibly be interested in trading commodities or manual labour of another. Such a feat provides unlimited access to the universe, all the materials and energy you could ever need; such a civilisation would likely have basically achieved alchemical transmutation (but for real).
  109.  
  110. Then, the only thing we possess of possible interest would be our (and our planet's / ecosystem's) uniqueness, our information, our culture and our likely very different perspective of the universe... as you put it, "our brains function differently", and they most definitely would, even if only at a cultural level rather than a fundamental level. That is something that cannot be obtained anywhere else in the universe, even with FTL, because it's endemic.
  111.  
  112. reply
  113.  
  114.  
  115. TeMPOraL 16 hours ago [-]
  116.  
  117. > I always find the believability lacking in sci-fi that suggests a civilisation capable of faster than light travel could possibly be interested in trading commodities or manual labour of another.
  118. I think the Stargate universe solved this well; the civilizations capable of building FTL drives (including the one that built the stargates themselves) didn't need to trade much with anyone (one of them did use slave labor from other worlds, but for somewhat plot-justifiable reasons). Humanity used increasing amounts of FTL tech over the series, but those were either stargates or borrowed, stolen or donated tech, not something humans built themselves - and thus humanity did interplanetary trade.
  119.  
  120. > the only thing we possess of possible interest would be our (and our planet's / ecosystem's) uniqueness
  121.  
  122. That's the core point in David Brin's Uplift saga - ecosystems were the most interesting things a species would own (or rather, lease).
  123.  
  124. reply
  125.  
  126.  
  127. hannasanarion 15 hours ago [-]
  128.  
  129. There are several reasons not to fear the Dark Forest, chief among them:
  130. 1. Any species that develops a civilization capable of space travel must necessarily have evolved a strong cooperative spirit, and so is more unlikely to exercise homicide as a first resort.
  131.  
  132. 2. If Dark Forest were true, and there was a murderous civilization that wanted to kill every other, we would all be dead already. You can make an entire galaxy's worth of relativistic kill missiles with the mass of a medium size asteroid.
  133.  
  134. Sterilizing an entire galaxy to make sure that there is never a challenge to your rule can be accomplished in a few thousand years. Without needing any interstellar transit.
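For what it's worth, the mass side of that claim roughly checks out on the back of an envelope; both figures below are order-of-magnitude assumptions rather than anything from the comment.

    STARS_IN_GALAXY  = 2e11   # a couple hundred billion stars (order of magnitude)
    ASTEROID_MASS_KG = 1e18   # roughly a ~100 km rocky asteroid

    print(f"{ASTEROID_MASS_KG / STARS_IN_GALAXY:,.0f} kg per target star")   # 5,000,000 kg - thousands of tonnes each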
  135.  
  136. reply
  137.  
  138.  
  139. 013a 1 hour ago [-]
  140.  
  141. > Any species that develops a civilization capable of space travel must necessarily have evolved a strong cooperative spirit, and so is more unlikely to exercise homicide as a first resort.
  142. It's more accurate to say that HUMANITY would need a cooperative spirit in order to reach interstellar travel. I don't think it's accurate to assume that every species would need this.
  143.  
  144. And, beyond that, TDF addresses this through the assertion that humanity, at multiple turns, didn't want to partake in this Dark Forest ideology; we wanted to make peace and friends, but the other, more advanced civilizations weren't having it. So even if they had a cooperative spirit in the past, that doesn't mean the spirit sticks around when the destruction of your species is at stake.
  145.  
  146. > If Dark Forest were true, and there was a murderous civilization that wanted to kill every other, we would all be dead already. You can make an entire galaxy's worth of relativistic kill missiles with the mass of a medium size asteroid.
  147.  
  148. That's being awfully presumptive.
  149.  
  150. Constructing even a galaxy's worth of kill missiles and launching them wildly like a shotgun isn't a strategy that any logical civilization would adopt. They'd end up destroying tons of empty planets that could have rich resource deposits or be habitable for future colonies, and they'd know that. Maybe there are illogical interstellar civilizations, but it stands to reason that they'd be rarer than logical ones, given that it takes some form of logic to even accomplish space travel.
  151.  
  152. Targeted strikes make more sense. So you have to have a target. Humanity has been broadcasting RF for about 100 years. The Milky Way is 105,000 light years across; at best we're detectable to about 0.095% of just our own galaxy. Maybe we haven't gotten one of these kill missiles because they haven't heard us yet. Or it's on its way.
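The 0.095% figure appears to be the simple linear ratio of our radio horizon to the galaxy's diameter (a volume-based fraction would be far smaller still):

    BROADCAST_RANGE_LY = 100        # roughly how long we've been leaking radio, in light years
    GALAXY_DIAMETER_LY = 105_000

    print(f"{BROADCAST_RANGE_LY / GALAXY_DIAMETER_LY:.3%}")   # 0.095%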
  153.  
  154. But all this requires life to be common; the dark forest theory asserts that the reason life seems rare is that it's common and everyone is just being quiet, but the other possibility is that life is, well, rare, and that's probably more likely. If life is rare, then the dark forest theory kinda goes out the window, because the novelty of finding new life would invariably overwhelm a hypothetical risk that your species has never encountered before.
  155.  
  156. IMO space contact & travel will have one of three outcomes. (1) The light speed limit can be worked around and none of this matters. (2) Life is common, and the Dark Forest Theory is valid between species. (3) Life is rare, humanity will colonize the stars, but a version of the Dark Forest Theory will begin to apply between human colonies separated by hundreds of lightyears. If it's been 150 years since anyone in your colony has ever talked to Earth, then you're not really a colony anymore, but it's obvious to you that Earth won't see it that way. Your colony creates other colonies, and it takes Earth hundreds of years to find out about them, if you decide to even tell them (because what do you owe Earth?). So a thousand years pass and we've got dozens or hundreds of tribes of humans who can't really talk to each other, most of whom don't even know of each other's existence, all developing culture, science, and weapons at independent rates. Really starts to look Dark Forest-esque, doesn't it?
  157.  
  158. reply
  159.  
  160.  
  161. YeGoblynQueenne 21 hours ago [-]
  162.  
  163. Regarding how long it takes to travel between habitable systems- Cixin Liu, like most other hard sci-fi authors, and everybody else who thinks about those things, seems to assume that lightspeed is just as problematic for everyone else as it is for us humans.
  164. And yet we have no way to know that for sure. We have no idea what the average lifespan of an intelligent species is. It may just be that humans are particularly short-lived, among all the species in the galaxy, or the entire universe. If a species has a lifetime of a couple thousand years then interstellar travel, even at sub-light speeds, would be a lot more manageable than it is for us.
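A quick back-of-envelope comparison makes the point; the speed, distance and lifespan below are illustrative assumptions.

    distance_ly  = 100      # a fairly deep reach into the stellar neighbourhood
    speed_c      = 0.1      # cruise at 10% of light speed
    lifespan_yr  = 2_000    # the hypothetical long-lived species above

    trip_years = distance_ly / speed_c
    print(trip_years, trip_years / lifespan_yr)   # 1000.0 years - half of one lifetime
    # For humans (~80-year lives) the same trip spans roughly a dozen lifetimes.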
  165.  
  166. Which, to me, means we can relax a little about the risk of being destroyed by hostile aliens. We don't know what we don't know. Chances are, if they were going to destroy us, they would have already done so in the last million years or so. We're probably a lot less appealing, as a world, than we think we are. Perhaps the universe really hates salt water or oxygen atmospheres? Who's to say?
  167.  
  168. reply
  169.  
  170.  
  171. beat 20 hours ago [-]
  172.  
  173. He makes a lot of interesting hash from the speed of light (like the Dark Domain defense). But he more or less handwaves away the speed of light as a limitation, introducing viable FTL travel in Death's End. (As an aside, one of the things I like about the books is how terror scales... from the understandable, relativistic terror of the Trisolarans to the much more threatening terror of the Dark Forest, to the sheer hopelessness of some species altering the very laws of physics as a weapon.)
  174. reply
  175.  
  176.  
  177. TeMPOraL 16 hours ago [-]
  178.  
  179. > to the much more threatening terror of the Dark Forest, to the sheer hopelessness of some species altering the very laws of physics as a weapon.
  180. Yup. Reading the implication that the universe has only 3 dimensions because dimensionality reduction weapons were used in past wars triggered a small existential crisis in me.
  181.  
  182. reply
  183.  
  184.  
  185. YeGoblynQueenne 10 hours ago [-]
  186.  
  187. Spoiler tags please! For the uninitiated.
  188. Dimensionality reduction? I never thought of those weapons as akin to universal PCA before. Thanks for the image.
  189.  
  190. reply
  191.  
  192.  
  193. Freaky 17 hours ago [-]
  194.  
  195. > Cixin Liu, like most other hard sci-fi authors, and everybody else who thinks about those things, seems to assume that lightspeed is just as problematic for everyone else as it is for us humans.
  196. ... the primary weapon of the Trisolarans in the first book is subatomic ansible faeries.
  197.  
  198. I thought it was kind of funny, because the whole point behind the Dark Forest theory is that it emerges because aliens are so unknowable, there can never be proper trust between them. Yet a relatively young and slowly-developing race is literally more sure of what humanity is doing than we are of our neighbouring countries.
  199.  
  200. "Hard sci-fi", pfft :P
  201.  
  202. reply
  203.  
  204.  
  205. YeGoblynQueenne 10 hours ago [-]
  206.  
  207. +++ SPOILERS +++
  208. Eh, they had inside information (the Earth-Trisolaran Organisation was in contact with them).
  209.  
  210. And it's hard sci-fi because they were AIs made of protons, not actual fairies found in the magical forest of Elthrolien that had to be seduced with promises of space mead.
  211.  
  212. That's how it goes with sci-fi, innit. You can come up with anything you like as long as it's obvious that it's just advanced technology, not magic. Cixin Liu usually manages to weave a couple of natural laws into every impossible thing, so he passes.
  213.  
  214. I was more annoyed by the lightspeed contrails, to be honest. Those really come out of nowhere and are a total literary device that has no basis in anything we know of. Makes the whole endeavour space opera if you ask me - which is not bad in and of itself. But in that case, where's the nuclear energy-sword wielding hero who saves humanity? Disappoint.
  215.  
  216. reply
  217.  
  218.  
  219. Freaky 1 hour ago [-]
  220.  
  221. > And it's hard sci-fi because they were AIs made of protons, not actual fairies
  222. Does mentioning your faeries are made out of atoms stop them being fantasy?
  223.  
  224. > That's how it goes with sci-fi, innit. You can come up with anything you like as long as it's obvious that it's just advanced technology, not magic.
  225.  
  226. Sure, I mean, even Star Wars is still considered within the genre despite it all being completely made-up - but the point of hard sci-fi is you're not just making it up and rubbing science-words on your endless stream of arbitrary plot contrivances.
  227.  
  228. reply
  229.  
  230.  
  231. dmitrygr 21 hours ago [-]
  232.  
  233. Not really, since DFT covers how fast information travels, not biological bodies. Lifespans aren't really part of the equation.
  234. reply
  235.  
  236.  
  237. YeGoblynQueenne 21 hours ago [-]
  238.  
  239. That's not to do with Dark Forest Theory, but why do you say it's about how fast information travels? I take it as a theory about the outlook of technological civilisations with respect to each other.
  240. And yes, lifespans are important. Perhaps not in the books, but in the real world our science and technology advance at least somewhat in every generation, as new scientists and technologists continue the work of their forebears.
  241.  
  242. If human generations lasted a thousand years, it would perhaps take us a lot longer to make the same technological progress that now takes us only a few years - if nothing else because the urgency of making progress would be reduced accordingly.
  243.  
  244. Lifespans may also influence the actual speed at which a species forms thoughts and communicates. Remember the Ents in The Lord of the Rings? A species that lives ten thousand years might take an Earth day to form a meaningful utterance. Technological advancement would take considerably longer for them than for us.
  245.  
  246. reply
  247.  
  248.  
  249. asark 20 hours ago [-]
  250.  
  251. Oh boy, read Blindsight by Peter Watts. Kinda similar idea but more deeply dreadful. Messed me up for days. Now added to the pile labeled "Existentially Horrifying Crap I have to Not Think About to Make It Through Another Day".
  252. reply
  253.  
  254.  
  255. Freaky 17 hours ago [-]
  256.  
  257. Link, along with much of his other work: https://rifters.com/real/shorts.htm
  258. reply
  259.  
  260.  
  261. davesque 20 hours ago [-]
  262.  
  263. I've read Blindsight. It's a good one. Actually, the premise didn't bother me as much as the Three Body trilogy.
  264. reply
  265.  
  266.  
  267. simion314 22 hours ago [-]
  268.  
  269. I also had a similar feeling after finishing the book. This video https://www.youtube.com/watch?v=zmCTmgavkrQ makes some good points, mainly finding problems with the axioms, so it convinced me that the conclusion in the book is probably wrong since the axioms are probably not true.
  270. reply
  271.  
  272.  
  273. ColanR 21 hours ago [-]
  274.  
  275. Sadly, game theory bears out that observation. [1] Trust is difficult to cultivate.
  276. [1] https://en.wikipedia.org/wiki/Prisoner%27s_dilemma
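For readers who don't want to follow the link, a minimal Python sketch of the one-shot prisoner's dilemma shows why defection wins even though both players would prefer mutual cooperation (payoffs are the textbook years in prison, lower is better):

    # One-shot prisoner's dilemma; payoffs are (my years, their years), lower is better.
    YEARS = {
        ("cooperate", "cooperate"): (1, 1),
        ("cooperate", "defect"):    (3, 0),
        ("defect",    "cooperate"): (0, 3),
        ("defect",    "defect"):    (2, 2),
    }

    def my_best_move(their_move):
        """Defecting is strictly better for me whatever the other player picks."""
        return min(("cooperate", "defect"), key=lambda mine: YEARS[(mine, their_move)][0])

    print(my_best_move("cooperate"), my_best_move("defect"))   # defect defect
    # ...yet mutual defection (2, 2) leaves both worse off than mutual cooperation (1, 1).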
  277.  
  278. reply
  279.  
  280.  
  281. notduncansmith 16 hours ago [-]
  282.  
  283. True, but I think there’s more hope when you take superrationality[1] into account.
  284. Multiple actors can independently arrive at compatible cooperative solutions to the survival-and-utility game (which is not zero-sum).
  285.  
  286. Wikipedia says it assumes all other players are also superrational, but I think in practice a successful variant of this strategy is to identify whether other players are superrational and avoid them if not.
  287.  
  288. There is sometimes the option of converting existing self-interested rational players into superrational ones, but that is another subject entirely.
  289.  
  290. [1] https://en.m.wikipedia.org/wiki/Superrationality
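Using the same prison-years convention, a tiny sketch of the superrational move described above: if I trust the other player to reason exactly as I do, only the symmetric outcomes are reachable, so I just pick the better symmetric one.

    # Superrational choice in the one-shot prisoner's dilemma (years in prison, lower is better).
    # Assumes the other player reasons identically, so only (C, C) or (D, D) can occur.
    SYMMETRIC_YEARS = {"cooperate": 1, "defect": 2}

    def superrational_move():
        return min(SYMMETRIC_YEARS, key=SYMMETRIC_YEARS.get)

    print(superrational_move())   # cooperate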
  291.  
  292. reply
  293.  
  294.  
  295. jasaloo 19 hours ago [-]
  296.  
  297. I hear you. Liu's outlook is deeply pessimistic. His evident scientific background seems to be infused with a dash of western, neoliberal ideology: that it's just dog-eat-dog out there, and we're all in it for our own self-interest.
  298. It's a bit ironic considering Liu is Chinese- wasn't China's Ming Dynasty the first to land on America in the early 1400s and historically demonstrate that exploration and discovery weren't necessarily accompanied by colonization/enslavement/destruction?
  299.  
  300. That example gets held up a lot when mainstream historians spin the narrative that Columbus et al enslaved/raped/pillaged because that's just what societies do (while ignoring that it was the debt that the explorers owed to their financiers that drove most of that bad behavior).
  301.  
  302. reply
  303.  
  304.  
  305. westmeal 17 hours ago [-]
  306.  
  307. >that Columbus et al enslaved/raped/pillaged because that's just what societies do (while ignoring that it was the debt that the explorers owed to their financiers that drove most of that bad behavior).
  308. Would you mind elaborating on that a bit? In my totally 'unbiased' and 'truthful' American public school they never mentioned a word of this. I've always wanted to learn more about the real origins of this broken country.
  309.  
  310. reply
  311.  
  312.  
  313. davejohnclark 53 minutes ago [-]
  314.  
  315. I found The Oatmeal's comic about Columbus enlightening (and disturbing): https://theoatmeal.com/comics/columbus_day
  316. reply
  317.  
  318.  
  319. TeMPOraL 16 hours ago [-]
  320.  
  321. Wait until you discover that the story behind Thanksgiving is pretty much a proper alien invasion sci-fi, featuring technologically superior invaders landing in ships, and promptly unleashing biological weapons on the local population.
  322. reply
  323.  
  324.  
  325. westmeal 6 hours ago [-]
  326.  
  327. Well I knew that part. <sarcasm> They did it to promote freedom of course. </sarcasm>
  328. reply
  329.  
  330.  
  331. jbattle 21 hours ago [-]
  332.  
  333. If you are in the market for a good story touching on similar topics, I thought Pandora's Star (the first of a two-book series) was a great read (I bounced hard off the dark forest).
  334. reply
  335.  
  336.  
  337. underwater 19 hours ago [-]
  338.  
  339. I figure that the age of global discovery and colonialism that happened here on Earth is a case study. Fixed resources, different cultures, huge differences in technology. It wasn't great, but also, many distinct cultures and peoples survived.
  340. reply
  341.  
  342.  
  343. TeMPOraL 16 hours ago [-]
  344.  
  345. The age of colonization meets neither of the two axioms underpinning "dark forest" - there was a possibility of establishing trust, and there was no possibility of sudden and unexpected technological leaps by the weaker side.
  346. reply
  347.  
  348.  
  349. mlcrypto 21 hours ago [-]
  350.  
  351. I think that, given the rate of technological growth, an intelligent species can improve to match another faster than the other can travel across space - or perhaps even surpass an isolated travelling group.
  352. reply
  353.  
  354.  
  355. AgentME 21 hours ago [-]
  356.  
  357. Personally, I think it's possible that abiogenesis is so rare that we are the only life within the observable universe. I don't know if that's more or less depressing.
  358. reply
  359.  
  360.  
  361. empath75 22 hours ago [-]
  362.  
  363. People don't always, or even normally, act at the level of civilizations. Why would this apply at the civilization level and not at the town, state, country, or corporate level?
  364. reply
  365.  
  366.  
  367. AgentME 21 hours ago [-]
  368.  
  369. Different groups of people get along because we're all still human, very similar culturally, and understand each other's desires and goals. Things start going badly when there are major social and technological differences. Think of how things went for the natives when America was discovered.
  370. Now amplify it. Imagine if different countries were isolated from each other for millions of years, evolving into different species and advancing technology the whole time, losing track of any commonalities with each other, and were then exposed to each other.