How could an artificial intelligence (as in large-language-model-based generative AI) be better for information access and retrieval than an encyclopedia with a clean classification model and a search engine?

If we add a processing step – where a genAI “digests” perfectly structured data and then, however badly, regurgitates things it doesn’t understand – aren’t we just adding noise?

I’m talking about the specific use-case of “draw me a picture explaining how a pressure regulator works”, or “can you explain to me how to code a recursive pattern matching algorithm, please”.
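(To make the second example concrete: a minimal sketch of the kind of answer such a request might produce – a classic recursive wildcard matcher, where `?` matches any one character and `*` matches any sequence. This is purely illustrative; nothing in the thread specifies the algorithm.)

```python
def match(pattern: str, text: str) -> bool:
    """Recursively test whether text matches pattern ('?' = one char, '*' = any run)."""
    # An empty pattern matches only empty text.
    if not pattern:
        return not text
    if pattern[0] == "*":
        # '*' either matches zero characters (skip the star)
        # or consumes one character of text and tries again.
        return match(pattern[1:], text) or (bool(text) and match(pattern, text[1:]))
    # '?' matches any single character; otherwise the characters must be equal.
    if text and (pattern[0] == "?" or pattern[0] == text[0]):
        return match(pattern[1:], text[1:])
    return False
```

For example, `match("a*c", "abbbc")` is true, while `match("a?c", "ac")` is false – each call peels one element off the pattern, the text, or both, which is what makes the recursion terminate.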

I also understand how it can help people who do not want to, or cannot, make the effort to learn an encyclopedia’s classification plan, or how a search engine’s syntax works.

But on a fundamental level, aren’t we just adding an uncontrollable noise-injection step to a decent, time-tested information flow?

  • NigelFrobisher@aussie.zone · 7 days ago

    If you have to go and fact check the results anyway, is there even a point? At work now I’m getting entirely AI generated pull requests with AI generated descriptions, and when I challenge the dev on why they went with particular choices they can’t explain or back them up.

    • SolOrion@sh.itjust.works · edited · 6 days ago

      That’s why I don’t really use them myself. I’m not willing to spread misinformation just because ChatGPT told me it was true, but I also have no interest in going back over every response and double checking that it’s not just making shit up.