ugjka@lemmy.world to Technology@lemmy.world · English · 6 months ago
Somebody managed to coax the Gab AI chatbot to reveal its prompt (infosec.exchange)
Olgratin_Magmatoe@lemmy.world · English · 6 months ago
Given that multiple other commenters in the infosec.exchange thread have reproduced similar results, and right wingers tend to have bad security, and LLMs are pretty much impossible to fully control for now, it seems most likely that it’s real.