What If We Should NOT Contact Aliens? | Dark Forest

Before we get started, I just wanted to let you know we have some limited time only Solar Eclipse merch for sale at the merch store. I’ll tell you more at the end of the episode.

In 1974 the Arecibo radio telescope beamed a message towards the few hundred thousand stars of Messier 13, a globular cluster near the edge of the Milky Way. It’ll take tens of thousands of years to arrive, so it’s no surprise we haven’t heard back yet. But if there are other civilizations out there, surely we’d expect many of them to have a head start on us—and so they could have been shouting into the void louder and longer than we have. So why haven’t we spotted their efforts to make contact? What if the silence from the stars is a hint that we shouldn’t be so outgoing? What if aliens are deliberately keeping quiet for fear that they might be destroyed?

For over 60 years we have been listening for messages from aliens. Yet we haven’t heard, or seen, anything or anyone. Author David Brin dubbed this the Great Silence. With over five thousand confirmed exoplanet discoveries to date, it seems increasingly likely that there’s life out there. The mismatch between the sheer number of possible origins of life and the apparent absence of any other technological life is known as the Fermi Paradox, which of course we’ve discussed once or twice in the past. But today we’re going to explore one of the more terrifying solutions to the Fermi Paradox—the idea that there are other technological species in our galaxy, but they’re all silent, because those that broke their silence were quickly destroyed.

This is the dark forest hypothesis, and it’s the core idea of Liu Cixin’s book of the same name, the sequel to The Three-Body Problem. Liu Cixin coined the term, although versions of the idea have been proposed by others.

But the imagery of the universe as a dark forest is so chilling that we might as well lean into it.

Imagine you are a lone hunter stalking through dense forest. Visibility is almost zero, so you have no idea what’s hiding over the rise or behind the next tree. You suspect there may be other hunters, but you’re not sure. You carry a deadly weapon and know you could destroy other hunters in an instant … but only if you spot them first. You are forced to assume that they have the same capability, which means they could also destroy you.

So what do you do? Do you call out and hope to find allies? Friends? Or do you keep very still and quiet and hope no one finds you? Or do you keep stalking, ready to destroy any other hunter you find, out of fear that they’ve made the same choice?

The dark forest solution to the Fermi Paradox proposes that almost all civilizations will choose silence—whether a passive silence or a watchful, trigger-happy silence. And those that chose otherwise are no longer with us. It’s certainly a great creepy premise for a sci-fi novel, but the idea is really interesting because it’s based on some reasonably concrete game theory. And also some pretty solid physics.

Let’s look at the scenario again without the forest metaphor. We have two planets that have developed advanced civilizations. Let’s call them the Alicians and the Bobarians, or A and B for short. Both A and B are capable of sending messages across the universe and could, if required, destroy another planet relatively easily and without personal risk. We’ll come back to how.

We can model their interactions in game theory as a sequential game, just like chess—one in which “players” choose their actions in turn. Each civilization acts in response to the other’s last action, leading to a tree of possibilities and some payoff associated with each branch. Each civilization should choose its responses to maximize its expected payoff. For example, if the civilizations make friends, that would be a positive payoff for both—they gain strength in numbers and maybe share technology. If one civilization destroys the other, that would be a relatively neutral payoff for the destroyer, but an extremely negative payoff for the destroyee. How negative? Well, nothing could be worse, so let’s call it an infinitely negative payoff, or an infinite cost.
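To make that concrete, here’s a minimal sketch of the payoff structure in Python. The specific finite numbers are illustrative assumptions, not anything from the episode; only their signs, and the infinitely negative extinction payoff, matter to the argument.

```python
import math

NEG_INF = -math.inf  # extinction: nothing could be worse

# Terminal outcomes as (payoff to A, payoff to B).
# Finite values are made-up placeholders; the structure is the point.
PAYOFFS = {
    "befriend":     (+10, +10),     # strength in numbers, shared technology
    "ignore":       (0, 0),         # nobody gains, nobody gets hurt
    "B_destroys_A": (NEG_INF, -1),  # extinction for A; small resource/guilt cost to B
    "A_destroys_B": (-1, NEG_INF),  # the mirror image
}
```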

So let’s look at one possible scenario. Civilization B intercepts a signal from A, but A doesn't know about B. Maybe the Alicians are actively sending out signals hoping for a response. Or maybe their own radio communications are just really loud—the equivalent of walking noisily through the forest.

So when B notices A, what options do they have? They could ignore A, destroy A, or reply to A.

Let’s say they ignore. A will remain blissfully oblivious of the existence of B, and B will continue its antisocial ways. This scenario has a zero payoff for both A and B. Slightly boring, but at least nobody gets hurt.

What if B takes the second option and destroys A? There’ll be a finite resource cost to destroying a planet—but for an advanced civilization that’s pretty small. And there might even be a resource gain if that solar system had any good loot. Maybe there’s a cost in guilt at annihilating billions of sentient creatures, but the point is that if there’s a cost, it’s finite. For the Bobarians, anyway.

At worst, B experiences a finite negative payoff by taking the destroy option, while A experiences an infinite negative payoff.

Finally, if B chooses to reply to A’s message, A now knows about the existence of B—knowledge A didn’t previously have. A then faces the same three choices that B had, including the option to destroy B.

So of all the options—ignore, destroy, reply—only reply leads to a potential infinite negative payoff for B. If you just sum the potential payoffs of the three branches, it’s clear that B should never reply—assuming the Bobarians care about their own survival.

In fact, if we extend the tree a little further we uncover new outcomes—for example, in the “ignore” branch there’s a possible future in which A discovers the existence of B and gets its turn to play the game.

That means both ignore and reply have the potential for infinite cost … and by that logic a civilization should always choose to destroy any other civilization it detects. So, if we assume that civilizations capable of detecting other civilizations also know game theory, they know the dominant strategy—destroy the other. Or, if they place very heavy weight on the guilt cost, at the very least they will remain very, very quiet. Hence the Great Silence, and a solution to the Fermi Paradox.
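Here’s that reasoning as a tiny maximin calculation, again with made-up finite numbers. Each option maps to the payoffs B could end up with depending on how A responds, and B picks the option whose worst case is least bad.

```python
import math

NEG_INF = -math.inf

# B's possible payoffs for each branch, using the illustrative
# numbers from the earlier sketch.
options = {
    "reply":   [10, NEG_INF],  # A might befriend B ... or destroy it
    "ignore":  [0, NEG_INF],   # A might find B later and destroy it
    "destroy": [-1],           # finite cost, and no fatal branch
}

# Maximin: choose the option with the least-bad worst case.
best = max(options, key=lambda name: min(options[name]))
print(best)  # -> destroy
```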

This is a pretty dark conclusion, so let’s see if we can find flaws in this reasoning in the hope that the universe isn’t as terrifying as the dark forest proposes.

First up, let’s look at some of the assumptions. We said that these civilizations can destroy each other relatively easily. That’s actually a reasonable assumption if both civilizations have concentrated population centers—say, on one or a handful of planets. At the risk of telling you how to destroy planets, perhaps the simplest approach is the relativistic kill vehicle. Any advanced civilization worth its salt should be able to harness, say, a percent of its home star’s energy towards accelerating one or more masses to a good fraction of the speed of light. Send those bodies to the offending world and you at the very least ionize its atmosphere and vaporize its oceans. Moreover, the target planet really has no way to protect itself. The vehicle is traveling at a good fraction of the speed of light, so the target will have little warning before the weapon arrives, and there’s not much they could do about it anyway.
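To get a feel for the numbers, here’s a rough back-of-envelope sketch. The projectile mass, the speed, and the Chicxulub comparison figure are all assumptions chosen for illustration.

```python
import math

C = 3.0e8       # speed of light, m/s
L_SUN = 3.8e26  # solar luminosity, W

def relativistic_ke(mass_kg: float, beta: float) -> float:
    """Kinetic energy of a mass moving at beta * c, in joules."""
    gamma = 1.0 / math.sqrt(1.0 - beta**2)
    return (gamma - 1.0) * mass_kg * C**2

ke = relativistic_ke(1e9, 0.3)  # a million-tonne projectile at 0.3c
print(f"Impact energy: {ke:.1e} J")                 # ~4e24 J
print(f"vs Chicxulub (~1e23 J): {ke / 1e23:.0f}x")  # dozens of dinosaur-killers

# Harvesting 1% of a Sun-like star's output for one year
# yields vastly more energy than that:
YEAR = 3.15e7  # seconds
print(f"1% of a Sun-year: {0.01 * L_SUN * YEAR:.1e} J")  # ~1e32 J
```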

It doesn’t even matter if the Bobarians don’t think the Alicians have the technology to do this. Technological progress can be exponential, and so enormous advances can happen on the timescale of the light travel time between two star systems. Say those systems are 100 light years apart. In 100 years, Moore’s law yields a quadrillion-fold increase in computing power. Other technologies have also been shown to obey exponential improvement rates, albeit with different doubling times. Even if that rate is more staggered, one thing is clear: whatever technological state you perceive your neighbours to be in is probably not the technological state they’ll be in when they receive your reply, or when they find you.
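That quadrillion-fold figure checks out if you assume a roughly two-year doubling time:

```python
# Compounded doubling over the 100-year light travel time,
# assuming a ~2-year doubling period (an idealized Moore's law).
years, doubling_time = 100, 2
factor = 2 ** (years / doubling_time)
print(f"{factor:.2e}")  # ~1.13e+15, i.e. about a quadrillion-fold
```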

The vast distances between the stars are really the driver of the dark forest hypothesis. They ensure that both sides have potential access to the instant kill option, and in general they’re what allows us to approximate this as a sequential game. Civilizations can’t interact with each other in real time—they can’t “feel out” the intentions of the other, or react to perceived hostility. They can only choose their move, knowing that if the other side chooses hostility in response it likely means complete annihilation. Those vast distances also mean you’re not safe even if you send a friendly message—how does the receiver of the message know that you’re truthful, or that you’ll stay friendly over the centuries?

So the physics behind the hypothesis seems to hold together. But there are also assumptions beyond physics—for example, some psychology. We assume that all aliens will ascribe an overwhelmingly large cost, or negative payoff, to the extinction outcome. Seems fair—the “better them than us” philosophy seems mostly universal among humans. And if all species crawled out of the mud via Darwinian processes, then they should all have competitive and self-preserving tendencies.

But it’s also possible that we’re projecting our own primitive psychological tendencies unfairly. We don’t know what value systems advanced aliens might have. Perhaps there’s a psychological transition that essentially all civilizations go through in which they, like, realize the value of all sentience or something. Perhaps the ones that don’t transition exterminate themselves. In that case they may not value their own existence as infinitely higher than that of their neighbours.

There’s also another aspect to the cost analysis besides fear and empathy—and that’s curiosity. Humans did not spread across the globe or invent fire or build great civilizations or discover the laws of physics by each of us placing infinite value on our personal survival. Yes, we sought the resources and strength needed for survival, but we were also curious. And that curiosity proved a huge survival advantage in the end. It’s that interplay—that balance between the desire to be safe and the curiosity to peer over the next horizon, or round the next tree—that kept us walking through the metaphorical forest to reach the metaphorical … I dunno, sun-dappled meadow of the modern world.

Anyone we meet out there will also have curiosity and wariness in different measures. Some will want to know what other life and cultures and minds are like. And maybe that curiosity will outweigh their fear.

This is wild speculation. The point is, we can’t really assume that the payoff calculation will be the same for everyone. Others may be playing a different game.

So, how well are we playing the game? So far, we’ve sent out a few messages here and there. The Arecibo message will reach the M13 globular cluster in something like 27,000 years. We’ve sent a motley array of signals to 30 or so stars that’ll arrive over the next few decades to a few centuries (some have already arrived). It would take a pretty advanced civilization actively looking for signals to notice any of them. Some signals have reached their destinations and there’s been time for a response—for example, the invasion fleet from Altair was due in 2015. Most likely there aren’t advanced civilizations quite that close, and we haven’t been shouting too loudly. We’re walking pretty quietly through the forest, maybe breaking a twig or two. We haven’t really started playing the game yet—at least as far as we know.
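That Altair date is just light-travel arithmetic. Assuming the oft-cited 1983 transmission towards Altair, roughly 16.7 light years away, the earliest possible reply is the send year plus the round-trip time, give or take depending on the distance estimate you use.

```python
# Earliest possible reply from Altair, assuming a 1983 transmission
# and a distance of ~16.7 light years (both assumed figures).
sent_year, distance_ly = 1983, 16.7
earliest_reply = sent_year + 2 * distance_ly  # out and back at light speed
print(round(earliest_reply))  # ~2016, close to the 2015 quoted above
```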

So should we try to contact aliens? Well, maybe it’s wise to be a bit cautious. On the other hand, there’s one last assumption in the dark forest hypothesis—and that’s the idea that any civilization gets to make unilateral decisions that follow the perfect logic of game theory. “We” don’t decide to do things. Individuals decide, and collective action emerges in very complex ways. Even now, individual humans, or at least smallish groups, could start projects that make quite a lot of noise on a fairly short timescale. And maybe we will. And maybe when we meet our first non-human civilization we’ll wish we’d respected the Great Silence. Or maybe the curiosity and empathy of the individuals from both civilizations will win the day. Perhaps we’ll continue through the dark forest side by side and stronger for it, in hope of finding brighter metaphorical landscapes of space time.
