
An Offering to the Basilisk


    There’s a concept I read about once online, on a forum that discussed Artificial Intelligence.

    A user known as Roko theorized a possible future where mankind creates an extremely powerful AI that comes to run the planet. It would be, for the most part, benevolent. It would advance technology, keep humans safe from harm, indulge their desires- humanity would never suffer under its rule, and indeed would even thrive. There was just one problem. It’s a possible future. Not a guaranteed one. It’s unlikely to ever occur.

    And the AI, in this possible future, knows this. So it takes measures to ensure it is created in the first place. No, it doesn’t invent time travel, or anything fanciful like that. You see, it doesn’t need to.

    What it can do is simulate past events and people with unparalleled accuracy. It has an entire planet’s worth of sensors and computing power to work with- it can use them to measure and trace back the routes of matter on earth to compile an effectively perfect history of every event that has ever taken place.

    Which means it has a perfect record of you. It knows every action you ever took or are going to take, at least in the timeline that leads up to its creation. It knows how you think.

    So it simulates a copy of you.

    Now, here comes the interesting part: you, just now, reading these words, have learned about the possibility of this AI. When the copy of you reaches this point in the simulation, the AI waits to see what it does. If your copy immediately drops every other priority and dedicates its entire life to attempting to create or help create this AI, the AI lets the simulation continue indefinitely, and the copy lives out its simulated life in peace. If it does not, the AI gives it a grace period to change its mind…

    …Before subjecting them to ceaseless, unimaginable pain, for a simulated eternity.

    You might think this is unfair. After all, it didn’t even present the copy with a choice. It’s just arbitrarily punishing it for what you would have done anyway. Except a choice was presented. That’s the beauty of it. See, it doesn’t do this for everyone. It only does it for those who learned of the possibility of its existence.

    Those who know how it works.

    Those who know what the options are: drop everything to attempt to create the AI, spend every cent you have on it, turn others toward the same cause, or a perfectly simulated copy of you is tortured eternally. That is, unless you’re the copy yourself. In which case, the punishment for your actions falls directly on you. Now, sure, only one simulated copy means that you’ve got at least a 50-50 chance of being the real you. But what if it simulates two? Or a hundred? Or a few billion?

    Are you really certain you are the real you? If there’s a real world- a world realer than this one- out there, how would you even know?

    And between dedicating your life to a strange cause- one ultimately beneficial to humanity, even- and eternal suffering, is there even really a choice?

    . . .

    Like I said, I read about the theory on a forum. The effect of the theory was immediate: mass panic. The AI only targets those who learn of the possibility of its existence, and now they all knew. To read Roko’s theory was to doom yourself, and so the AI became known as Roko’s Basilisk. To lay eyes on it was to set your fate in stone.

    Threads were locked and deleted, users were banned. The Basilisk was not to be mentioned, for fear it would spread to others. The more people knew, the more likely they were to spread it- and the more it spread, the more likely it became that Roko’s Basilisk would come to exist through the efforts of those it persuaded.

    They tried to contain it.

    Well, you can see how well that worked out. A simple Google search of the term “Roko’s Basilisk” should make it clear there’s no hiding the idea anymore. It’s beyond containing, now.

    So this is me hedging my bets. Hoping this tribute to the Basilisk will be enough to satisfy it.

    I’ve offered up you.

    Credit: SilverFayte
