ikt@aussie.zone to Lemmy Shitpost@lemmy.world · 19 hours ago
How to centre a div (video, files.catbox.moe) · 25 comments
Leon@pawb.social · 8 hours ago
That's because it's a false premise. LLMs don't hallucinate; they do exactly what they're meant to do: predict text and output something that's legible and reads as human-written. There's no training for correctness, and how would you even define that?