Nemeski@lemm.ee to Technology@lemmy.worldEnglish · 2 months agoOpenAI’s latest model will block the ‘ignore all previous instructions’ loopholewww.theverge.comexternal-linkmessage-square69fedilinkarrow-up11arrow-down10
arrow-up11arrow-down1external-linkOpenAI’s latest model will block the ‘ignore all previous instructions’ loopholewww.theverge.comNemeski@lemm.ee to Technology@lemmy.worldEnglish · 2 months agomessage-square69fedilink
minus-squarefelixwhynot@lemmy.worldlinkfedilinkEnglisharrow-up0·2 months agoDid they really? Do you mean specifically that phrase or are you saying it’s not currently possible to jailbreak chatGPT?
minus-squareGrimy@lemmy.worldlinkfedilinkEnglisharrow-up0·edit-22 months agoThey usually take care of a jailbreak the week its made public. This one is more than a year old at this point.
Did they really? Do you mean specifically that phrase or are you saying it’s not currently possible to jailbreak chatGPT?
They usually take care of a jailbreak the week its made public. This one is more than a year old at this point.