
Wednesday, February 28, 2018

The danger of SETI

Seth Shostak writes:
With all the news stories these days about computer hacking, it probably comes as no surprise that someone is worried about hackers from outer space. Yes, there are now scientists who fret that space aliens might send messages that worm their way into human society — not to steal our passwords but to bring down our culture.

How exactly would they do that? Astrophysicists Michael Hippke and John Learned argue in a recent paper that our telescopes might pick up hazardous messages sent our way — a virus that shuts down our computers, for example, or something a bit like cosmic blackmail: “Do this for us, or we’ll make your sun go supernova and destroy Earth.” Or perhaps the cosmic hackers could trick us into building self-replicating nanobots, and then arrange for them to be let loose to chew up our planet or its inhabitants.
This sounds crazy, but if I believed that there really were advanced intelligent space aliens out there, then I would agree that we should ignore their signals.

Imagine a super-intelligent DNA-based race of beings out there. What would they do? It would be impractical for them to come here. However, they could easily beam their DNA sequences here, and sucker us into synthesizing life forms based on them.

A super-advanced race could program DNA sequences as easily as we program ordinary digital computers. We are centuries away from understanding DNA that well. Even if we understood DNA well enough to create sequences for a desired purpose, we might never understand it well enough to decipher a clever DNA sequence written by someone else.

After all, computer software works that way. We can write programs to do more or less anything we want, but it is also possible to write a program with hidden features that are completely impractical to discover by inspecting the code.
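To make the software analogy concrete, here is a minimal sketch of a cryptographically concealed trigger (my own illustration in Python, not anything from Hippke and Learned's paper; the digest and names are placeholders). Inspecting the source shows that a hidden branch exists, but it does not reveal which input activates it, because finding that input would require inverting the hash.

```python
import hashlib

# Placeholder digest; in a real logic bomb this would be the SHA-256 hash of a
# secret trigger phrase known only to the program's author.
HIDDEN_TRIGGER_DIGEST = "0" * 64

def handle_command(command: str) -> str:
    """Looks like an ordinary command handler under any normal testing."""
    if hashlib.sha256(command.encode("utf-8")).hexdigest() == HIDDEN_TRIGGER_DIGEST:
        return activate_hidden_payload()  # concealed behavior
    return f"ok: {command}"               # documented behavior

def activate_hidden_payload() -> str:
    # Stand-in for whatever the concealed feature actually does.
    return "hidden feature activated"

if __name__ == "__main__":
    print(handle_command("status"))  # prints "ok: status"
```

Any ordinary test suite exercises only the documented branch; the hidden one is protected by the preimage resistance of the hash.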

So the space aliens could send us a DNA sequence. We would be curious, and someone would eventually synthesize it. Maybe we would get creatures that appeared to be ideal slaves. They would be strong and smart and do what they are told. Eventually we would mass-produce them and trust them with responsibility. Then some cryptographically concealed portion of the DNA would get triggered, and the creatures would revolt, exterminate the humans, and start producing clones of the space aliens.

The only way to avoid this scenario is to never listen for the alien signals in the first place. The Search for Extraterrestrial Intelligence should be stopped, as it would be a threat to humanity if it ever succeeded.

4 comments:

  1. But Roger, it's bad science. Ever hear of the relativistic rocket equation? They could never get here (see the equations sketched after these comments). Furthermore, they would already know where we are, if they could send a signal. Otherwise, you get spherical attenuation. We can barely communicate with probes in our own solar system using graph-based turbo codes and MIMO!

  2. Roger,
    You might try reading Wasteland of Flint by Thomas Harlan. It's the first novel in a science fiction series about the idea that very ancient alien civilizations might leave booby traps designed to lure in unwary developing species. The premise of the series is that there are many very ancient civilizations that would look upon our own up-and-coming species as an 'infection' in their territory. Some of these ancient civilizations might even be long gone, with only their machines, tasked with keeping things tidy for their creators, still lurking around looking for 'problematic' life forms to weed out.

  3. That question is tackled, at some length, in 'An Introduction to Planetary Defense: A Study of Modern Warfare Applied to Extra-terrestrial Invasion' (Boan and Taylor). A counter-theory is proposed: Perhaps they're benign, and we ought not fear Greeks bearing gifts, which might come in the form of a cure for cancer. Based on my readings in military history, I'm not on board with what might be termed The Willkommen Approach; but military history also suggests how radically idiotic it would be to stop listening for sounds in the forest. Please stick to what you know, Roger. Thanks in advance! (:

  4. Reiterating my earlier comment a fortiori, given someone's (Carl Sagan et al.'s) bonehead blunder of sending out the Voyager Golden Record.

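For reference on the first comment's objections, here is a sketch of the two effects being invoked (the symbols v_e, m_0, m_1, P_t, S, and d are my notation, not the commenter's): the relativistic rocket equation bounding the velocity change attainable from a given exhaust velocity and mass ratio, and the inverse-square attenuation of an isotropically radiated signal.

```latex
% Relativistic rocket (Tsiolkovsky) equation: even an enormous mass ratio
% m_0/m_1 yields a \Delta v that only approaches c asymptotically.
\[
  \Delta v = c \,\tanh\!\left(\frac{v_e}{c}\,\ln\frac{m_0}{m_1}\right)
\]
% Spherical (inverse-square) attenuation of an isotropic transmitter of
% power P_t: the received flux S falls off as 1/d^2 with distance d.
\[
  S = \frac{P_t}{4\pi d^{2}}
\]
```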