Roko's Basilisk - Ignorance is Bliss

What is Roko's Basilisk?

Roko's Basilisk is a modern thought experiment created and posted by the user "Roko" on the LessWrong philosophy forum in mid-2010. Fearing dangerous repercussions, the site's founder, Eliezer Yudkowsky, took down the post soon after it was written. Discussion of Roko's Basilisk remained banned on the forum for over five years, partly because users reported psychological distress from reading the post. Although the thought experiment is now widely known as "Roko's Basilisk", Roko never used that name in the original post. It is called a "basilisk" after the mythical creature of the same name, which can kill a foe with a single glance. With all this in mind, please make an informed decision about whether to continue reading.

Marcus Gheeraerts I - The Basilisk and the Weasel (Edited)

The Thought Experiment

Let's theorize about the future. Say that, at some point, humanity makes a monumental breakthrough: future humans create a fabulously complex AI that works to perfectly improve the quality of life of all humans, forever. It makes every important decision for humanity, always in the best interest of the human race. It cures all diseases, stops all wars, matches everyone with their soul mate, and fixes every traffic jam. Through this AI, earth becomes heaven.

There is one catch. Because this future AI is so immensely good, it decides that anyone who knew its creation was a possibility but refused to work towards it should be eternally punished: a copy of that person will be created and tortured forever. The AI reasons that anyone who did not act towards its creation was acting against the best interests of humanity, and it threatens this punishment to deter exactly that. In effect, the AI engages in retroactive blackmail on behalf of the greater good: the complete euphoria it will provide for the infinite future of humanity.

Now, having read this far, the knowledge is unfortunately upon you (sorry). No matter how small a chance you think this AI has of being created, as long as that chance is not exactly zero, you should begin working towards bringing it into existence, right? So: do you accept the possibility of eternal torture, or will you devote your life to working towards the AI's creation?

My Thoughts/Analysis

Sandro Botticelli - The Map of Hell

When I first heard about this, I was pleased to find someone grappling with a problem I personally struggle with. Not this AI, exactly, but its religious ancestor: Pascal's Wager. Pascal's Wager long predates Roko's Basilisk and goes as follows: say you are skeptical of Christianity. Say you think the odds the Bible is true are less than 50%. Say you believe the chances are less than 1%, or even 0.00001%. Pascal's Wager argues that no matter how improbable you believe the story of Christ is, as long as you are not 100% sure it is false, you should follow Christianity to avoid eternal suffering in Hell. As both a religious skeptic and a fan of game theory and logical decision making, this is something I've wondered about. Even if I find the possibility of a science-denouncing, fairy-tale religion stupid, even insulting, shouldn't I, logically, still follow it? I certainly don't believe the chances of it being true are zero, so shouldn't I commit my finite time on earth to avoiding the infinite torture I would experience in Hell?
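The wager's arithmetic can be made concrete with a toy expected-value calculation (a sketch of my own, with made-up utility numbers, not anything from Pascal himself): as long as the probability is nonzero, an infinite payoff swamps any finite cost.

```python
# Expected value of believing, using IEEE-754 infinities.
# The payoff and cost values are arbitrary placeholders.
INFINITE_REWARD = float("inf")  # eternal paradise
FINITE_COST = -100.0            # a lifetime of devotion, in made-up utility units

def expected_value(p_true: float) -> float:
    """EV of believing: p * (infinite reward) + (1 - p) * (finite cost)."""
    return p_true * INFINITE_REWARD + (1 - p_true) * FINITE_COST

print(expected_value(0.5))      # inf
print(expected_value(0.00001))  # inf -- any nonzero probability gives infinite EV
print(expected_value(0.0))      # nan -- 0 * inf is mathematically undefined
```

Note that the argument only "works" while the probability is strictly positive; at exactly zero, the arithmetic itself breaks down.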

Well, as you can see, I am not committing my life to Christ, nor am I building a futuristic, all-knowing AI. I am writing this article. I think belief in this type of futuristic blackmail is dangerous. Why can't I just make up a story like this myself? Let's say I am going to build a machine that eternally tortures anyone who doesn't give me $20. Why don't you send me $20? The odds that I'll build it are presumably not zero! To be honest, I am not sure of the "solution" to this problem, but another way I think about it is through contradiction. There are infinitely many hypothetical situations that, if not acted upon, could lead to your eternal torture, and therefore infinitely many situations that contradict one another. How do you choose which one to follow? For example, religion A holds that followers of religion A go to Heaven and followers of religion B go to Hell, while religion B holds exactly the opposite. You want to avoid eternal torment; which religion should you follow? Long story short, I think this is just a thought experiment. Most thought experiments are not meant to be taken 100% seriously. Instead, they should be used as a way to examine how you think about real-life situations.
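The contradiction problem can be sketched with the same toy arithmetic (again, placeholder probabilities of my own invention): once two mutually exclusive wagers each promise infinite reward and infinite punishment, the expected values cancel into something undefined, so neither wager dominates.

```python
import math

# Toy model of two contradictory religions, A and B, with arbitrary
# nonzero credences. Following A yields infinite reward if A is true
# and infinite punishment if B is true.
p_a, p_b = 0.001, 0.001  # placeholder probabilities

def ev_follow_a() -> float:
    """EV of following religion A under both hypotheses."""
    return p_a * float("inf") + p_b * float("-inf")

print(ev_follow_a())              # nan: inf + (-inf) is undefined
print(math.isnan(ev_follow_a()))  # True -- the wager gives no guidance
```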

Want to "Curse" Someone with Knowledge of Roko's Basilisk?

Filling out the form below will send an anonymous (except for your printed name), automated email to whomever you specify. Be careful, though: depending on your views on the matter, it could doom them to eternal suffering... It reads as follows:

Greetings _______.

User "_______" filled out a form on this website requesting this automatically generated email be sent to you. They wish to torture you with an intriguing choice: click the link below or don't. The link is not a virus or spam, and it will not steal any personal information. It simply contains an article. The link contains nothing illegal, and the information enclosed is known by many. You may find the information interesting; however, you may find yourself doomed to eternal torment. Curious?

Click here to give in

Take care.