• Lfrith@lemmy.ca · 7 hours ago

    I got hallucinations when I tried to use it to find a book I'd read but didn't know the title of. It also hallucinated NBA playoff results with the wrong team winning, and it's gotten basic math calculations wrong.

    It's a language model, so its purpose is to string together words that sound like sentences, but it can't be fully trusted to be accurate. The best it can do is give you sources so you can go straight to the resource and read that instead.

    It's decent at generating basic code that you can test yourself to see if it outputs what you want. But I don't trust it as a resource for information when it's handed me even basic sports facts that were wrong.
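
    (For what it's worth, this is a rough, made-up example of the kind of "basic code" I mean: a tiny pure function plus a couple of asserts, so you're verifying the output yourself instead of trusting it. The function name and numbers are just illustrative.)

    ```python
    # Hypothetical example of the trivial code an LLM is usually fine at:
    # a small, pure function you can sanity-check yourself in seconds.
    def win_percentage(wins: int, losses: int) -> float:
        """Return win percentage as a value between 0 and 1."""
        games = wins + losses
        if games == 0:
            return 0.0
        return wins / games

    # Quick self-test: don't trust the output, check it.
    assert win_percentage(4, 0) == 1.0
    assert win_percentage(2, 2) == 0.5
    assert win_percentage(0, 0) == 0.0
    print("looks right")
    ```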

    • Holytimes@sh.itjust.works · 6 hours ago

      There are three things I've found search engine LLMs to be useful for. The first is searching for laptops, since it's absurdly good at finding weird fucking regional models or odd configurations that aren't on the main pages of most shops.

      Like, my current laptop wasn't on Newegg, Amazon, or even MSI's own shop. It was on a fucking random-ass page on their website that nothing linked to, and it was some weird-ass model that wasn't even searchable.

      The second most useful thing was generating a metric crapload of boilerplate JSON files for a mod (rough sketch of what I mean at the bottom of this comment).

      The third thing is bad DnD roleplaying while I'm bored at work. The hallucinations are an upside lol
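
      (Rough sketch of the JSON boilerplate thing, with the mod name, item names, and fields all made up; the real files depend entirely on the mod. The point is it's the same structure stamped out over and over, which is exactly what an LLM, or a ten-line script, is good for.)

      ```python
      # Made-up example: stamp out one near-identical JSON model file per item.
      # "examplemod", the item names, and the fields are placeholders.
      import json
      from pathlib import Path

      ITEMS = ["copper_ingot", "tin_ingot", "bronze_ingot"]

      out_dir = Path("resources/assets/examplemod/models/item")
      out_dir.mkdir(parents=True, exist_ok=True)

      for item in ITEMS:
          model = {
              "parent": "item/generated",
              "textures": {"layer0": f"examplemod:item/{item}"},
          }
          (out_dir / f"{item}.json").write_text(json.dumps(model, indent=2))
          print(f"wrote {item}.json")
      ```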