• Strider@lemmy.world
    2 days ago

    A little out of context. Although tech is my life, it’s never been social media or things like that.

    So in a way I am very much permanently touching grass (had to look that up / language barrier).

The psychological issue with LLMs and hallucination is that humans lose the ability to judge the output, whether you touch grass in between or not.

Verifying all output is almost impossible, and doing so at scale completely defeats the purpose.