• WalrusDragonOnABike [they/them]@reddthat.com · +29 / -1 · 21 hours ago

    Except when you ask it for the meaning of an acronym and it answers with something using totally different letters. Yet people treat it as a source on topics they know so little about that they can't possibly tell it's just spitting out nonsense.

    • Ech@lemmy.ca · +2 · 4 hours ago

      it's just spitting out nonsense

      That’s exactly it. LLMs and their image counterparts have no innate or burgeoning knowledge, contrary to what people tend to assume. Their singular, core function is to generate output from literal random noise, like the static you used to see on TV. So the response to the same question changes because the random noise changed, not because the algorithm learned or reconsidered anything. If you reused the same noise, the answer would be identical. No knowledgeable or self-sufficient AI will ever evolve from that.
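
      A minimal sketch of that point, not how any real model is implemented: the logits, temperature, and tiny vocabulary below are made up for illustration. The only thing it shows is that the random draw (the seed) is what decides which token comes out, so the same seed reproduces the same answer and a different seed can produce a different one.

      ```python
      import math
      import random

      def sample_next_token(logits, temperature, rng):
          """Sample one token id from a softmax over toy logits."""
          scaled = [l / temperature for l in logits]
          m = max(scaled)
          exps = [math.exp(s - m) for s in scaled]
          total = sum(exps)
          probs = [e / total for e in exps]
          # rng.random() is the "noise": the same seed gives the same draw,
          # so the same prompt plus the same seed yields the same output.
          r = rng.random()
          cum = 0.0
          for token_id, p in enumerate(probs):
              cum += p
              if r < cum:
                  return token_id
          return len(probs) - 1

      # Toy logits standing in for a model's prediction over a 4-word vocabulary.
      fake_logits = [2.0, 1.0, 0.5, 0.1]

      print(sample_next_token(fake_logits, 0.8, random.Random(42)))  # some token id
      print(sample_next_token(fake_logits, 0.8, random.Random(42)))  # identical: same seed, same noise
      print(sample_next_token(fake_logits, 0.8, random.Random(7)))   # may differ: different seed
      ```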

    • Diurnambule@jlai.lu · +2 / -1 · 9 hours ago

      It lets you go from zero knowledge on a subject to 0.1 knowledge, which helps people with a bit of sense save a little time at the start of a project. It is still an ecological disaster, though. The people comparing it to the Ring are the closest, I think.

    • trolololol@lemmy.world · +7 · edited · 19 hours ago

      Just like your regular uncle. Or an ultra-right podcaster.

      It’s no surprise LLMs behave like the most vocal and dubiously confident people in the world.