I remember being laughed at in 1998 when I suggested that I shouldn’t have to learn handwriting because I could just type on a computer. I was told that at best I’d still need to learn to write, and at worst computers could turn out to be a fad, since they require electricity, they crash, they break down, etc. Please note that my dislike of writing was heavily influenced by likely having dyspraxia, and by a lot of cheaper pens/pencils being mildly painful to hold.

However, the very same people are now discouraging anything that AI is promised to replace. Don’t draw, just use Dall-E. Don’t code, just use ChatGPT. Don’t play music, just use Suno. Don’t make movies, just wait until it can do them well enough. The music one is often pushed by the same people who absolutely despised electronic music for “not requiring any talent, just pressing buttons”, even though AI music is literally what ignorant rock/metal kids thought electronic music production was. Even one person who criticized me for using amp sims on my PC instead of a wall of tube amplifiers is more favorable than not towards AI music.

I wonder whether those who now discourage art classes in favor of a short lesson on how to prompt an image generator will also discourage writing, given speech-to-text technologies. Maybe the problem is that those don’t use LLMs, but often a more primitive kind of neural network.

And I’m not 100% against new tools. I even use Neural Amp Modeler, sometimes two instances at once, with one loading a Boss HM-2 response for that Swedish chainsaw tone. But these prompt machines are barely more than toys for real professional work, due to the lack of any actual control beyond prompting.

  • zd9@lemmy.world · 1 day ago

    Sorry but you don’t really know what you’re talking about. I think your idea of AI is narrowly limited to LLMs for customer-facing applications, but AI has been and will continue to be used in thousands of applications.

    • aesthelete@lemmy.world · 1 day ago

      No, the common definition of AI has largely changed to refer to LLM chatbots / “generative AI”. You can thank Sam Altman and his butt sucking buddies for that.

      You just want to argue definitions because it tickles your fancy.

      This very community is named “fuck AI” but I doubt you’ll find many people here against OCR (which is technically “AI”).

      • zd9@lemmy.world · 1 day ago

        I literally research AI for a living, and the “common definition” just means whatever you personally think, because to me the common definition of AI is… AI, which includes LLMs, CNNs, LSTMs, multi-modal models, symbolic AI, generative AND discriminative models, etc.

        • aesthelete@lemmy.world · 1 day ago

          Good for you! 👍 Would you like a cookie? Here ya go: 🍪

          It’s not what I personally think, dude. I don’t even like that the definition has shifted, but it has thanks largely to those with a chatbot fetish.

          Again, look at the community you’re in. Do you think we’re here because of Google Translate?

          • zd9@lemmy.world · 21 hours ago

            Yes I would like a cookie, thank you.

            Words have meanings, so all I ask is to be more specific, because otherwise it paints an entire technology with a broad, negative brush.

            • aesthelete@lemmy.world · 16 hours ago

              You simply don’t like that the popular meaning has changed. I don’t really either but it doesn’t matter.

              I’m not a cloud. So go find one of those to yell at instead.