Roko’s basilisk is a thought experiment which states that an otherwise benevolent artificial superintelligence (AI) in the future would be incentivized to create a virtual reality simulation to torture anyone who knew of its potential existence but did not directly contribute to its advancement or development, in order to incentivize said advancement. It originated in a 2010 post on the discussion board LessWrong, a technical forum focused on analytical rational enquiry. The thought experiment’s name derives from the poster of the article (Roko) and the basilisk, a mythical creature capable of destroying enemies with its stare.

While the theory was initially dismissed as nothing but conjecture or speculation by many LessWrong users, LessWrong co-founder Eliezer Yudkowsky reported that some users panicked upon reading the theory, due to its stipulation that merely knowing about the theory and its basilisk made one vulnerable to the basilisk itself. This led to discussion of the basilisk being banned on the site for five years. However, these reports were later dismissed as exaggerations or inconsequential, and the theory itself was dismissed as nonsense, including by Yudkowsky himself. Even after the post was discredited, it is still used as an example of principles such as Bayesian probability and implicit religion. It is also regarded as a simplified, derivative version of Pascal’s wager.

Found out about this after stumbling upon this Kyle Hill video on the subject. It reminds me a little bit of “The Game”.

  • dwindling7373@feddit.it · edited · 5 months ago

    TIL.

    It sounds like it’s mostly a matter that involves not the AI but the people working on it, who may even be working on it because of the fear they were subjected to after being exposed to this revelation (possibly by other people involved with the AI who, coincidentally, are the only ones who could push for such a thing to be included in it!).

    Something something any cult: the paradise/hell, the God/AI, has nothing to do with this and could even not exist at all.

      • dwindling7373@feddit.it · 5 months ago

        No, “The Game” works only as long as you agree to take part in it, to give validity to the empty statement that you are now inevitably playing “The Game”.

        The Basilisk is meant to force that onto you, outside of any arbitrary convention.