ugjka@lemmy.world to Technology@lemmy.world · English · 6 months ago
Somebody managed to coax the Gab AI chatbot to reveal its prompt (infosec.exchange)
185 comments
Corhen@lemmy.world · 6 months ago
Had the exact same thought. If you wanted it to be unbiased, you wouldn't tell it its position on a lot of issues.
melpomenesclevage@lemm.ee · 6 months ago
No but see, ‘unbiased’ is an identity and social group, not a property of the thing.
Seasoned_Greetings@lemm.ee · edited · 6 months ago
No, you see, that instruction “you are unbiased and impartial” is there to relay to the prompter if it ever becomes relevant. It's basically instructing the AI to lie about its biases, not actually instructing it to be unbiased and impartial.