• Honytawk@lemmy.zip · 1 day ago

    AI uses 1/1000 the power of a microwave.

    Are you really sure you aren’t the one being fed lies by con men?
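
    A rough unit check on that claim, as a sketch. The comparison only makes sense as an energy comparison, and the microwave wattage and per-query energy below are assumptions, not figures from this thread:

    ```python
    # Power vs. energy: a microwave is rated in watts (power); an AI query costs watt-hours (energy).
    microwave_power_w = 1_100      # assumed typical microwave draw
    query_energy_wh = 1.0          # assumed per-query energy; published estimates vary widely

    # Seconds of microwave operation that equal one query, under these assumptions.
    seconds_equivalent = query_energy_wh / microwave_power_w * 3600
    print(f"{seconds_equivalent:.1f} s of microwave use per query")   # ~3.3 s
    ```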

    • jimjam5@lemmy.world · 18 hours ago

      What? Elon Musk’s xAI data center in Tennessee (when fully expanded & operational) will draw about 2 GW of power, continuously. That’s as much electrical demand as some entire cities.
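
      For scale, a quick sketch of what a sustained 2 GW draw works out to over a year (the city comparison is an assumption about typical large-city consumption):

      ```python
      # Convert a sustained 2 GW draw into annual energy.
      power_gw = 2.0
      hours_per_year = 24 * 365                      # 8,760 h
      annual_energy_gwh = power_gw * hours_per_year  # 17,520 GWh ≈ 17.5 TWh per year
      print(annual_energy_gwh)
      ```

      That’s on the order of the annual electricity consumption of a large city, assuming the facility actually runs near that draw around the clock.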

      • Blue_Morpho@lemmy.world · 15 hours ago

        Rockstar Games: 6,000 employees
        ~20 kWh per square foot per year: https://esource.bizenergyadvisor.com/article/large-offices
        ~150 square feet per employee: https://unspot.com/blog/how-much-office-space-do-we-need-per-employee/#%3A~%3Atext=The+needed+workspace+may+vary+in+accordance

        18,000,000,000 watt hours per year

        vs

        10,000,000,000 watt hours for ChatGPT training

        https://www.washington.edu/news/2023/07/27/how-much-energy-does-chatgpt-use/
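
        Spelling out the arithmetic behind those two numbers (assuming the 20 kWh per square foot figure is annual, which is how such office benchmarks are usually quoted):

        ```python
        # Rockstar Games office-energy estimate, per year.
        employees = 6_000
        kwh_per_sqft_year = 20       # large-office average from the first link
        sqft_per_employee = 150      # from the second link

        office_kwh_year = employees * sqft_per_employee * kwh_per_sqft_year
        office_wh_year = office_kwh_year * 1_000
        print(office_kwh_year, office_wh_year)   # 18,000,000 kWh = 18,000,000,000 Wh

        # ChatGPT training estimate from the UW article, for comparison.
        training_wh = 10_000_000_000             # 10 GWh
        print(office_wh_year / training_wh)      # 1.8x
        ```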

        Yet there’s no hand-wringing over the environmental destruction caused by 3D gaming.

        • jimjam5@lemmy.world · 7 hours ago

          Semi-non-sequitur argument aside, your math seems to be off.

          I double-checked my quick phone calculations, and using the figures provided, Rockstar Games’ office-space energy use comes out to roughly 18,000,000 (18 million) kWh, not 18,000,000,000 (18 billion).

          • Blue_Morpho@lemmy.world · 4 hours ago

            I put the final answer in watt-hours, not kilowatt-hours, to match. ChatGPT used 10B watt-hours, not 10B kilowatt-hours.
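
            A one-line unit check (same quantity, different units):

            ```python
            # 18 million kWh and 18 billion Wh are the same amount of energy.
            print(18_000_000 * 1_000)   # 18,000,000,000 Wh
            ```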

            • jimjam5@lemmy.world · 3 hours ago

              Ahh, I was wondering where the factor of 1,000 came from.

              Without turning this into a complete shootout, I can kind of see the point of comparing energy usage, but as others have said, with these massive data centers it’s like comparing two similar but ultimately different kinds of beasts.

              Beyond just the energy used in training generative AI models in data centers, there’s also the energy needed to fulfill requests once they’re deployed (24/7, thousands of prompts per second).

              • Blue_Morpho@lemmy.world · 3 hours ago

                there’s also the energy needed to fulfill requests once they’re deployed

                Just like everyone playing the 3D game once it’s finished development and been sold. A few hours of gaming or a few hours of making AI slop photos is the same watts. No one notices the energy when it’s spread across millions of homes rather than centralized in a data center. A few years ago, Nvidia, Microsoft, and others were pushing gaming as a streaming service (the games ran remotely, your keyboard/gamepad input was transmitted to their servers, and the video was streamed back). Those used massive data centers. Yet no one was screaming to stop gaming.
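
                A rough sketch of that comparison; the wattage and per-image figures here are assumptions, and published per-image estimates vary widely:

                ```python
                # Assumed figures: a gaming PC drawing ~350 W, ~3 Wh per generated image.
                gaming_session_wh = 350 * 3        # 3 hours of gaming ≈ 1,050 Wh
                print(gaming_session_wh / 3)       # ~350 images for the same energy
                ```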

                • jimjam5@lemmy.world · 3 hours ago

                  Now it will be both spread-out PCs and large data centers, in combination, consuming energy.

                  And I do remember that phase of game/device streaming! I was a bit skeptical of it all and ended up never using those technologies, but it did lead me to learn about alternatives like Moonlight/Sunshine.

        • Glitterkoe@lemmy.world · 14 hours ago

          And then you have a trained model that requires vast amounts of energy per request, right? It doesn’t stop at training.

          You need obscene amounts of GPU power to run the ‘better’ models within reasonable response times.

          In comparison, I could game on my modest rig just fine, but I can’t run a 22B model locally in any useful capacity while programming.

          Sure, you could argue gaming is a waste of energy, but that doesn’t mean we can’t also argue that asking an AI how long to boil a single egg shouldn’t cost the energy of boiling a shitload of them. Or each time I start typing a line of code, for that matter.
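
          For a sense of why a 22B model doesn’t fit on a modest rig, a quick memory estimate (the precision choices are assumptions):

          ```python
          # Rough VRAM needed just to hold a 22B-parameter model's weights.
          params = 22e9
          gb_fp16 = params * 2 / 1e9     # ~44 GB at 16-bit precision
          gb_q4 = params * 0.5 / 1e9     # ~11 GB at 4-bit quantization, before activations/KV cache
          print(gb_fp16, gb_q4)
          ```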

    • Ace T'Ken@lemmy.ca · 21 hours ago

      Hi. I’m in charge of an IT firm that has been contracted, somewhat unwillingly, to build out one of these data centers in our city. We are currently in the groundbreaking phase, but I am looking at the paperwork and power requirements. You are absolutely wrong about the power requirements unless you mean per query on a light load on an easy plan, but these facilities will be handling millions if not billions of queries per day. Keep in mind that a single user query can also fan out into dozens, hundreds, or thousands of separate queries… Generating a single image takes dramatically more than you are stating.
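
      To illustrate why per-query figures mislead at this scale, a hypothetical sketch; the facility power, request volume, and fan-out factor below are assumptions for illustration, not numbers from my project:

      ```python
      # Hypothetical facility-level numbers (assumptions for illustration only).
      facility_power_w = 500e6            # 500 MW sustained draw
      user_requests_per_day = 100e6       # user-facing requests per day
      fanout = 50                         # internal queries triggered per user request

      wh_per_day = facility_power_w * 24
      print(wh_per_day / user_requests_per_day)             # ~120 Wh per user request
      print(wh_per_day / (user_requests_per_day * fanout))  # ~2.4 Wh per internal query
      ```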

      Edit: I don’t think your statement addresses the amount of water these facilities require either. There are serious concerns that the massive water reservoir and lake near where I live will not even be close to enough.

      Edit 2: Also, we were told to spec for at least 10x growth within the next 5 years, and I don’t think there’s anywhere on the planet capable of meeting that kind of demand, even if the models become substantially more efficient.