• EpeeGnome@feddit.online
    3 days ago

    I like the comparison, but LLMs can’t go insane because they’re just word-pattern engines. It’s why I refuse to go along with the AI industry’s insistence on calling it a “hallucination” when one spits out the wrong words. It literally cannot have a false perception of reality because it does not perceive anything in the first place.