ugjka@lemmy.world to Technology@lemmy.world · English · 6 months ago
Somebody managed to coax the Gab AI chatbot to reveal its prompt (infosec.exchange)
𝙲𝚑𝚊𝚒𝚛𝚖𝚊𝚗 𝙼𝚎𝚘𝚠@programming.dev · English · 6 months ago
Tay was actively being manipulated by malicious users.
AbidanYre@lemmy.world · English · edited 6 months ago
That's fair. I just think it's funny that the well-intentioned one turned into a Nazi, while the Nazi one has to be told, pretty heavy-handedly, not to turn into a decent "person".