That’s more of a tone thing, which is something AI is capable of modifying. Hallucination is more of a foundational issue, baked directly into how these models are designed and trained, and not something you can just tell it not to do.
@NikkiDimes @Wlm racism is about far more than tone. If you’ve trained your AI - or any kind of machine - on racist data then it will be racist. Camera viewfinders that only track white faces because they don’t recognise black ones. Soap dispensers that only dispense for white hands. Diagnosis tools that only recognise rashes on white skin.
The camera thing will always be such a great example. My grandfather’s good friend can’t drive his fancy 100k+ EV, because the driver-monitoring camera thinks his eyes are closed and the car refuses to move. So his wife now drives him everywhere.
Shit’s racist towards those with Mongolian/East Asian eyes.
It’s a joke that gets brought out every time he’s over.
Oh absolutely, I did not mean to summarize such a topic so lightly; I meant it solely in this very narrow conversational context.
Soap dispensers that only dispense for white hands.
IR was fine; why the fuck do we have AI soap dispensers?! (Please, for “Bob’s” sake, tell me you made it up.)
Yeah, totally. It’s not even “hallucinating sometimes”; it’s fundamentally throwing characters together, which just happen to be true and/or useful some of the time. That’s why I dislike the “hallucination” terminology: it implies that the rest of the time the thing knows what it’s doing.

Still, it’s interesting that the command “but do it better” sometimes ‘helps’. E.g. “now fix a bug in your output” will probably work occasionally. “Don’t lie” is never going to fly with LLMs, though (afaik).
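To make “throwing characters together” concrete, here’s a toy sketch of the autoregressive loop every LLM boils down to. The vocabulary and the `fake_logits` stand-in are made up for illustration (real models score tens of thousands of tokens with a trained network), but the shape of the loop is the point:

```python
import numpy as np

# Toy illustration, not any real model's code: an LLM only ever runs this
# loop -- score every candidate next token, sample one, append, repeat.
# No step in the loop checks facts; fluent-looking output is all it makes.
vocab = ["the", "cat", "moon", "is", "made", "of", "cheese", "."]
rng = np.random.default_rng(0)

def fake_logits(context):
    # Stand-in for a trained network: one (here random) score per token.
    return rng.normal(size=len(vocab))

context = ["the", "moon", "is"]
for _ in range(4):
    logits = fake_logits(context)
    probs = np.exp(logits) / np.exp(logits).sum()  # softmax over the vocab
    context.append(rng.choice(vocab, p=probs))     # sample, don't "know"

print(" ".join(context))  # plausible-sounding, truthful only by accident
```

Nothing in that loop consults reality, which is why “don’t lie” can’t work: the instruction only changes the context the scores are conditioned on, it doesn’t add a truth check.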