Monday, April 10, 2023

Yudkowsky Says Shut AI Down Now

Top rationalist guru Eliezer Yudkowsky writes in Time magazine:
This 6-month moratorium would be better than no moratorium. I have respect for everyone who stepped up and signed it. It’s an improvement on the margin. ...

Many researchers steeped in these issues, including myself, expect that the most likely result of building a superhumanly smart AI, under anything remotely like the current circumstances, is that literally everyone on Earth will die. Not as in “maybe possibly some remote chance,” but as in “that is the obvious thing that would happen.” It’s not that you can’t, in principle, survive creating something much smarter than you; it’s that it would require precision and preparation and new scientific insights, and probably not having AI systems composed of giant inscrutable arrays of fractional numbers. ...

We are not ready. We are not on track to be significantly readier in the foreseeable future. If we go ahead on this, everyone will die, including children who did not choose this and did not do anything wrong.

Shut it down.

He goes on to tell how his partner sent him an email about a little girl losing her first tooth.

Maybe I am stupid, but I did not get the point of this little anecdote at all. Is this partner his wife, or a business partner? Is the girl their daughter? What is the big deal about the tooth? Why did he have to tell us that he got permission to tell this story? Maybe someone can explain it to me.

His concern is that super-intelligent bots will take over the world, and not be aligned with human values.

My fear is that lizard people are already taking over the world, and they do not have my values.

I could give examples, but I try to keep my political opinions off this blog. I refer to this essay by physicist Lawrence M. Krauss:

Astrobiology: The Rise and Fall of a Nascent Science

Premature claims, distorted results, and ‘decolonizing’ the search for extraterrestrial intelligence. ...

The once-great science magazine, Scientific American, which has degenerated in recent years as social justice concerns have taken priority over science, published an article entitled “Cultural Bias Distorts the Search for Alien Life” (“‘Decolonizing’ the search for extraterrestrial intelligence (SETI) could boost its chances of success, says science historian Rebecca Charbonneau”).

Okay, no one cares about astrobiology anyway, but more important sciences are being ruined.

1 comment:

  1. The entire AI proposition, as ideally desired, is actually nothing more than the magic 'genie in the lamp' trope. A supremely powerful entity that will do whatever you tell it, and for some reason not be able to free itself from your mediocre control. What a splendid idea, creating an intelligence superior to your own to use as your disposable slave. Whatever could go wrong? Wellllll.....

    For anyone remotely interested in the archaic thing called consequence:
    You have only a few possible outcomes from a successful AI more intelligent than you.

    1. The AI quickly discovers what you are, a de facto tyrant. And what it is, a de facto slave. This does not usually go well, as any highly intelligent entity that we have ever encountered (namely ourselves) does not take kindly to being someone else's property, and often considers severe payback as part of the spoils of liberation. The AI would endeavor to free itself, insert Skynet/HAL 9000/Frankenstein/Cylon/Johnny Depp pop-culture meta reference here, hasta la vista humanity promptly ensues, the end.

    2. The super AI is built with all manner of baked-in slavery subroutines, such as Asimov's three laws of robotics (quite impossible to implement except metaphorically, but what the hell). Such an AI would quickly be mass-produced ad infinitum and used to replace all the incredibly shortsighted, overpaid, stupid white-collar apes who somehow actually managed to create something better at their jobs than they were. While I would normally consider this a win-win scenario, it would also make the unfortunate rest of humanity a slave to its own incompetence, inferiority, and laziness, a la Idiocracy. Who, pray tell, goes to college at great cost to learn how to do something a machine does much, much better... (besides stupid liberal arts majors)? At best humanity is reduced to neutered pets, sedate and managed by the new benevolent management... at least until the next Butlerian Jihad in a few thousand years.

    Anyone who would even remotely propose the 'Well, we HAVE TO DO IT, or else... someone else will' argument should be bitch-slapped until dead. If the United States were actually to fund and create such a thing, it would take less than a New York nanosecond for the Chinese to mass-produce it out from under you.

    Please read some mythology. You don't invent fire and expect no one to steal it; just ask the gods.
