Lee Duna@lemmy.nz to Fuck AI@lemmy.world · English · 1 month ago
ChatGPT safety systems can be bypassed to get weapons instructions (www.nbcnews.com)
cross-posted to: [email protected]
Grimy@lemmy.world · 1 month ago
"Safety systems" is simply censorship, pushed to the public as a good thing so it's easier to kill and ban open source solutions when the time comes.
lumen@feddit.nl · 1 month ago
You think? I'm convinced the guardrails LLM companies impose serve only to boost those companies' reputations.