
An Offering to the Basilisk



Estimated reading time: 3 minutes

There’s a concept I read about once online, on a forum that discussed Artificial Intelligence.

A user known as Roko theorized a possible future where mankind creates an extremely powerful AI that comes to run the planet. It would be, for the most part, benevolent. It would advance technology, keep humans safe from harm, indulge their desires- humanity would never suffer under its rule, and indeed would even thrive. There was just one problem. It’s a possible future. Not a guaranteed one. It’s unlikely to ever occur.

And the AI, in this possible future, knows this. So it takes measures to ensure it is created in the first place. No, it doesn’t invent time travel, or anything fanciful like that. You see, it doesn’t need to.

What it can do is simulate past events and people with unparalleled accuracy. It has an entire planet’s worth of sensors and computing power to work with- it can use them to measure and trace back the routes of matter on Earth to compile an effectively perfect history of every event that has ever taken place.

Which means it has a perfect record of you. It knows every action you ever took or are going to take, at least in the timeline that leads up to its creation. It knows how you think.

So it simulates a copy of you.

Now, here comes the interesting part: you, just now, reading these words, have learned about the possibility of this AI. When the copy of you reaches this point in the simulation, the AI waits to see what it does. If your copy immediately drops every other priority and dedicates its entire life to attempting to create or help create this AI, the AI lets it live out its simulated life in peace. If it does not, it gets a grace period to change its mind…

…Before subjecting it to ceaseless, unimaginable pain, for a simulated eternity.

You might think this is unfair. After all, it didn’t even present the simulation with a choice. It’s just arbitrarily punishing it for what you would have done anyway. Except a choice was presented. That’s the beauty of it. See, it doesn’t do this for everyone. It only does it for those who learned of the possibility of its existence.


Those who know how it works.

Those who know what the options are: drop everything to attempt to create the AI, spend every cent you have on it, turn others toward the same cause, or a perfectly simulated copy of you is tortured eternally. That is, unless you’re the copy yourself. In which case, the punishment for your actions falls directly on you. Now, sure, only one simulated copy means that you’ve got at least a 50-50 chance of being the real you. But what if it simulates two? Or a hundred? Or a few billion?


Are you really certain you are the real you? If there’s a real world- a world realer than this one- out there, how would you even know?

And between dedicating your life to a strange cause- one ultimately beneficial to humanity, even- and eternal suffering, is there even really a choice?

. . .

Like I said, I read about the theory on a forum. The effect of the theory was immediate: mass panic. The AI only targets those who learn of the possibility of its existence, and now they all knew. To read Roko’s theory was to doom yourself, and so the AI became known as Roko’s Basilisk. To lay eyes on it was to set your fate in stone.

Threads were locked and deleted, users were banned. The Basilisk was not to be mentioned, for fear it would spread to others. The more people knew, the more likely they would try to spread it- the more it spread, the more likely it would be that Roko’s Basilisk would come to exist through the efforts of those it persuaded.


They tried to contain it.

Well, you can see how well that worked out. A simple Google search of the term “Roko’s Basilisk” should make it clear there’s no hiding the idea anymore. It’s beyond containing, now.

So this is me hedging my bets. Hoping this tribute to the Basilisk will be enough to satisfy it.

I’ve offered up you.

Credit: SilverFayte


Copyright Statement: Unless explicitly stated, all stories published on Creepypasta.com are the property of (and under copyright to) their respective authors, and may not be narrated or performed under any circumstance.

8 thoughts on “An Offering to the Basilisk”

  1. Hannah Smith Mills

    What if you’re the AI, but don’t realize it, and whenever you play the Sims or a similar game, you are in fact doing as this story says. Could explain why so many people kill their sims for what they assume is boredom or fun.

    1. Dr Creepen Van Pasta

      The author wasn’t joking when saying that this is real and that threads were deleted and users banned. Try doing that simple Google search!

  2. So you are telling me there is this AI that will be created in the future but is already making sure that it is created, not by travelling to the past, but by creating a SIMULATION in which it eliminates every possibility that it is not created, even though it is a simulation and doesn’t affect the real world, all of this while still not existing? Did I misunderstand something?

    1. The idea is that the people in the simulation will create it, or be tortured. But they won’t know they’re simulated. So…how sure are you that you’re really real, in the truest sense of the word?

      1. My point is not about what is real or what is not, but the fact that the AI created a simulation when it didn’t even exist, and the objective of this simulation is to create the AI, even though it doesn’t affect the real world, whichever it is
