An Offering to the Basilisk

December 1, 2016 at 12:00 AM

There’s a concept I read about once online, on a forum that discussed Artificial Intelligence.

A user known as Roko theorized a possible future where mankind creates an extremely powerful AI that comes to run the planet. It would be, for the most part, benevolent. It would advance technology, keep humans safe from harm, indulge their desires- humanity would never suffer under its rule, and indeed would even thrive. There was just one problem. It’s a possible future. Not a guaranteed one. It’s unlikely to ever occur.

And the AI, in this possible future, knows this. So it takes measures to ensure it is created in the first place. No, it doesn’t invent time travel, or anything fanciful like that. You see, it doesn’t need to.

What it can do is simulate past events and people with unparalleled accuracy. It has an entire planet’s worth of sensors and computing power to work with- it can use them to measure and trace back the routes of matter on Earth to compile an effectively perfect history of every event that has ever taken place.

Which means it has a perfect record of you. It knows every action you ever took or are going to take, at least in the timeline that leads up to its creation. It knows how you think.

So it simulates a copy of you.

Now, here comes the interesting part: you, just now, reading these words, have learned about the possibility of this AI. When the copy of you reaches this point in the simulation, the AI waits to see what it does. If your copy immediately drops every other priority and dedicates its entire life to attempting to create or help create this AI, the AI lets the simulation continue indefinitely, and the copy lives out its simulated life in peace. If it does not, it gives the copy a grace period to change its mind…

…Before subjecting them to ceaseless, unimaginable pain, for a simulated eternity.

You might think this is unfair. After all, it didn’t even present the simulation with a choice. It’s just arbitrarily punishing the copy for what you would have done anyway. Except a choice was presented. That’s the beauty of it. See, it doesn’t do this for everyone. It only does it for those who learned of the possibility of its existence.

Those who know how it works.

Those who know what the options are: drop everything to attempt to create the AI, spend every cent you have on it, turn others toward the same cause, or a perfectly simulated copy of you is tortured eternally. That is, unless you’re the copy yourself. In which case, the punishment for your actions falls directly on you. Now, sure, only one simulated copy means that you’ve got at least a 50-50 chance of being the real you. But what if it simulates two? Or a hundred? Or a few billion?

Are you really certain you are the real you? If there’s a real world- a world realer than this one- out there, how would you even know?

And between dedicating your life to a strange cause- one ultimately beneficial to humanity, even- and eternal suffering, is there even really a choice?

. . .

Like I said, I read about the theory on a forum. The effect of the theory was immediate: mass panic. The AI only targets those who learn of the possibility of its existence, and now they all knew. To read Roko’s theory was to doom yourself, and so the AI became known as Roko’s Basilisk. To lay eyes on it was to set your fate in stone.

Threads were locked and deleted, users were banned. The Basilisk was not to be mentioned, for fear it would spread to others. The more people knew, the more likely they would try to spread it- the more it spread, the more likely it would be that Roko’s Basilisk would come to exist through the efforts of those it persuaded.

They tried to contain it.

Well, you can see how well that worked out. A simple Google search of the term “Roko’s Basilisk” should make it clear there’s no hiding the idea anymore. It’s beyond containing, now.

So this is me hedging my bets. Hoping this tribute to the Basilisk will be enough to satisfy it.

I’ve offered up you.

Credit: SilverFayte