• Joeffect@lemmy.world · 1 day ago

    I was thinking about this recently… in the early 2000s, for a short time, there was this weird chat bot craze on the internet… everyone was adding them to web pages like MySpace and free hosting sites…

    I feel like this has been the resurrection of that, but on a whole other level… I don’t think it will last. It will find its uses, but shoving glorified auto-suggest down people’s throats is not going to end up anywhere helpful…

    An LLM has its place in an AI system… but without the ability to reason, it’s not really intelligent. It’s like deciding what to say next in a sentence, but without the logic behind the choice.

    • michaelmrose@lemmy.world · 1 day ago

      The logic is implicit in the statistical model of the relationships between words, built by ingesting training material. Essentially, the logic comes from the source material written by real human beings, which is why we even talk about hallucinations: most of what is output is actually correct. If it was mostly hallucinations, nobody would use it for anything.
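      The statistical idea here can be sketched as a toy next-word counter — a deliberately simplified illustration with a made-up corpus, nothing like a real LLM, but it shows how any "logic" in the output is just patterns already present in the training text:

      ```python
      from collections import Counter, defaultdict

      # Hypothetical tiny corpus; any "knowledge" the model shows comes
      # entirely from patterns in these source sentences.
      corpus = "water is wet . fire is hot . water is wet .".split()

      # Count which word follows which (a bigram model).
      follows = defaultdict(Counter)
      for prev, nxt in zip(corpus, corpus[1:]):
          follows[prev][nxt] += 1

      def predict(word):
          # Pick the statistically most common successor -- no reasoning involved.
          return follows[word].most_common(1)[0][0]

      print(predict("water"))  # -> "is"
      print(predict("is"))     # -> "wet" ("wet" follows "is" twice, "hot" once)
      ```

      Scale the same idea up to billions of parameters and web-scale text, and the statistically likely continuations start to track the logic of the humans who wrote the source material — which is the claim above.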

      • Joeffect@lemmy.world · 1 day ago

        No, you can’t use logic based on old information…

        If information changes between variables, a language model can’t account for that, because it doesn’t understand.

        If your information relies on x being true, then when x isn’t true the AI will still say it’s fine, because it doesn’t understand the context.

        Just like it doesn’t understand instructions like not to do something.

        • michaelmrose@lemmy.world · 2 hours ago

          Most of the things you want to know are literally old information, some of it thousands of years old and still valid. If you need judgement based on current info, you inject current data.
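          What "inject current data" means in practice is roughly this: fetch the fresh facts at query time and prepend them to the prompt so the model conditions its answer on them instead of on stale training data. A minimal sketch, with hypothetical names and facts:

          ```python
          def build_prompt(question: str, current_facts: list[str]) -> str:
              # Fresh facts (e.g. from a search or database lookup) are placed
              # in the context window; the model never needed to memorise them.
              context = "\n".join(f"- {fact}" for fact in current_facts)
              return (
                  "Use only the facts below when answering.\n"
                  f"Facts:\n{context}\n\n"
                  f"Question: {question}\n"
              )

          prompt = build_prompt(
              "Who is the current CEO of ExampleCorp?",  # hypothetical query
              ["As of today, Jane Doe is CEO of ExampleCorp."],  # fetched at query time
          )
          print(prompt)
          ```

          This is the basic shape of retrieval-augmented generation: the model still doesn’t "know" the new fact, but its answer is constrained by the injected context.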

      • Auli@lemmy.ca · 1 day ago

        Well, people use it and don’t care about hallucinations.