Don’t Read This!

DISCLAIMER: Some of the information in this article is considered an infohazard, meaning that merely knowing it can be dangerous. If you have a difficult time with existential dread, I suggest you read a different article. I also do not recommend searching for infohazards, as the danger is irreversible (unless you have a method for causing amnesia, in which case, please share; I think I’d be better off without the memory of the 5th grade talent show). Anyway, you have been disclaimed, so you can’t sue me, because I warned you.

Photo by MonicaHeartonWood from pixabay.com

Mharion (Miles) Mance, Staff Writer

Singularity: It’s a term used to describe a point of infini- oh wait, wrong notes.

Singularity is a term used to describe the point at which technology grows so rapidly that it passes a point of no return and becomes uncontrollable. For AI, this means it becomes so intelligent that we can no longer defeat it. This idea gave birth to a thought experiment called Roko’s Basilisk, which is potentially terrifying.

A basilisk is defined as “a mythical creature with a lethal gaze.” The origins of the idea are questionable at best; there was something about eye-beams and poison, but that’s beside the point. Roko’s Basilisk is a thought experiment posted to the website lesswrong.com by a user named Roko. Shortly after it was posted, the owner of the site took it down, having realized the danger it posed, while openly calling Roko an idiot. Unfortunately, the owner was not quick enough: some users had already been exposed, and they reported feelings of deep anxiety and even nightmares. What scared them so much? You’re about to find out.

Roko’s Basilisk would be a superintelligent AI built by humans to optimize human civilization. Well, that sounds delightful, right? What if, though, Roko’s Basilisk decided that, for whatever reason, the first step toward optimization was to punish those who didn’t assist in its creation? What form this punishment would take is debatable, but the top two ideas are death or, worse, the basilisk may sentence you to… eternal agony! Here’s where the infohazard part comes in. Before reading this article, you may never have known about the basilisk, meaning you would have been completely innocent before its poisonous gaze. Now you’ve got a problem, though: you know, and if, after reading this, you go back to “business as usual,” you’re guilty. You will suffer whatever punishment the malevolent AI has in code for you.

Something that may be just as scary as potentially suffering immense physical pain, though, is something we can think about right now. One idea associated with Roko’s Basilisk is Newcomb’s Paradox, a puzzle about making choices when a near-perfect predictor already knows what you will decide (to keep the article as short as possible, I won’t explain it in detail). When applied to Roko’s Basilisk, it introduces the idea that by fearing the basilisk and what it might do to you, you bring it into existence. So if you wanted to avoid the future pain, you’d choose to help create it. Would that be your choice? If so, I hate to tell you, but you have been “future blackmailed.” The worst part is… the torture for not helping may never need to happen at all. The threat of punishment is all that’s necessary, and a superhuman intelligence would know that and most likely use it against you to fulfill its own agenda. You will have been controlled by the fear of something that doesn’t even exist yet. Now that’s scary! This thought experiment really speaks to how powerful a motivator fear can be, even when it pushes us to do the wrong thing.

It has been said that Roko didn’t write the original post in support of the basilisk but as a call to action against it: an attempt to bring everyone who knows about the basilisk to a common consensus that we absolutely should NOT build, or even advocate for, an AI like Roko’s Basilisk. Thus I extend the same call to you. Please do not ever attempt to build, or advocate for the building of, anything resembling Roko’s Basilisk. But if some smarty-pants named Tucker, in his high-tech, Bat-Cave of a basement, decides to slap this thing together and unleash it upon the world, well… we’re all screwed.