• Wildmimic@anarchist.nexus · 12 hours ago

    OP, this statement is bullshit. You can do about 5 million requests for the footprint of ONE flight.

    I'm gonna quote my old post:

    I had the discussion regarding generated CO2 a while ago here, and with the numbers my discussion partner gave me, the calculation showed that the yearly usage of ChatGPT is approx. 0.0017% of our CO2 reduction during the covid lockdowns - chatbots are not what is killing the climate. What IS killing the climate has not changed since the green movement started: cars, planes, construction (mainly concrete production) and meat.

    The exact energy costs are not published, but 3 Wh/request for ChatGPT-4 is the upper limit from what we know (and that's in line with the approximate power consumption of my graphics card when running an LLM). Since Google uses it for every search, they have probably optimized for their use case, and some sources cite 0.3 Wh/request for chatbots - it depends on what model you use. The training is a one-time cost, and for ChatGPT-4 it raises the maximum cost per request to 4 Wh. That's nothing. The combined worldwide energy usage of ChatGPT is equivalent to about 20k American households. This is for one of the most downloaded apps on iPhone and Android - set that against the massive usage and it's clear that saving here is not effective for anyone interested in reducing climate impact, or you have to start scolding everyone who runs their microwave 10 seconds too long.
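    The "20k households" figure can be sanity-checked with some quick arithmetic. This is a rough sketch; the per-request and per-household numbers are ballpark estimates from the discussion above, not official figures:

```python
# Back-of-envelope check: how many requests fit into "20k American
# households" of electricity? All inputs are rough public estimates.
WH_PER_REQUEST = 3.0                  # upper-bound estimate for ChatGPT-4
US_HOUSEHOLD_KWH_PER_YEAR = 10_500    # approximate US average consumption

# Energy budget equivalent to 20,000 households:
total_kwh_per_year = 20_000 * US_HOUSEHOLD_KWH_PER_YEAR  # 210 million kWh

# Requests that budget covers at the upper-bound cost:
requests_per_year = total_kwh_per_year * 1000 / WH_PER_REQUEST
print(f"{requests_per_year / 1e9:.0f} billion requests/year")  # ≈ 70 billion
```

    At roughly 70 billion requests a year, that's around 200 million requests a day spread over a huge user base, which is why the per-user impact is tiny.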

    Even compared to other online activities that use data centers, ChatGPT's power usage is small change. If you use ChatGPT instead of watching Netflix, you actually save energy!

    Water is about the same, although the positioning of data centers in the US sucks. The used water doesn't disappear though - it's mostly returned to the rivers or evaporated. Total water usage in the US is 58,000,000,000,000 gallons (220 trillion liters) per year. A ChatGPT request uses between 10-25 ml of water for cooling. A hamburger uses about 600 gallons of water. 2 trillion liters are lost due to aging infrastructure. If you want to reduce water usage, go vegan or fix water pipes.
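    The hamburger comparison works out like this. A quick sketch using the figures above (the per-burger and per-request numbers are the rough estimates quoted, taking the high end for the request):

```python
# How many ChatGPT requests equal one hamburger's water footprint?
# Inputs are the rough estimates from the post above, not measurements.
LITERS_PER_GALLON = 3.785
hamburger_liters = 600 * LITERS_PER_GALLON   # ~600 gallons per burger
request_liters = 0.025                       # 25 ml per request, high end

requests_per_burger = hamburger_liters / request_liters
print(f"one hamburger ≈ {requests_per_burger:,.0f} requests of cooling water")
```

    So skipping a single burger buys you on the order of 90,000 requests' worth of cooling water.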

    Read up here!

    • brucethemoose@lemmy.world · edited · 7 hours ago

      If you want to look at it another way, if you assume every single square inch of silicon from TSMC is Nvidia server accelerators/AMD EPYCs, every single one running AI at full tilt 24/7/365…

      Added up, it’s not that much power, or water.

      That’s unrealistic, of course, but that’s literally the physical cap of what humanity can produce at the moment.

    • Reygle@lemmy.world (OP) · 9 hours ago

      If you only include chatbots, your numbers look good. Sadly, reality isn't limited to chatbots.

        • Reygle@lemmy.world (OP) · 4 hours ago

          Image/video generation and analysis (them scrubbing the entire public internet) consume far, far more than someone asking an AI "grok, is this true?"

          • lets_get_off_lemmy@reddthat.com · 2 hours ago

            Do you have a source for this claim? I see this report by Google and this MIT Tech Review article, which say image/video generation does use a lot of energy compared to text generation.

            Taking the data from those articles, we get this table:

            | AI Activity | Source | Energy Use (per prompt) | Everyday Comparison |
            |---|---|---|---|
            | Median Gemini Text Prompt | Google Report | 0.24 Wh | Less energy than watching a 100 W TV for 9 seconds. |
            | High-Quality AI Image | MIT Article | ~1.22 Wh | Running a standard microwave for about 4 seconds. |
            | Complex AI Text Query | MIT Article | ~1.86 Wh | Roughly equivalent to charging a pair of wireless earbuds for 2-3 minutes. |
            | Single AI Video (5-sec) | MIT Article | ~944 Wh (0.94 kWh) | Nearly the same energy as running a full, energy-efficient dishwasher cycle. |
            | "Daily AI Habit" | MIT Article | ~2,900 Wh (2.9 kWh) | A bit more than an average US refrigerator consumes in a full 24-hour period. |
            • MangoCats@feddit.it · edited · 1 hour ago

              Another way of looking at this: A “Daily AI Habit” on your table is about the same as driving a Tesla 10 miles, or a standard gas car about 3 miles.

              Edit: 4 AI videos, or a detour to take the scenic route home from work… about the same impact.
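              The Tesla/gas-car equivalence can be sketched out. The EV comparison is direct energy; the gas-car comparison only works on a CO2 basis. All inputs here are rough assumptions (grid intensity, EV efficiency, tailpipe emissions), not figures from the thread:

```python
# CO2/energy equivalence of the 2.9 kWh "Daily AI Habit".
# All constants are rough assumptions for illustration.
habit_kwh = 2.9                  # "Daily AI Habit" from the table above
tesla_kwh_per_mile = 0.29        # rough EV consumption
grid_kg_co2_per_kwh = 0.4        # rough US grid average
gas_kg_co2_per_mile = 0.35       # rough gasoline-car tailpipe figure

tesla_miles = habit_kwh / tesla_kwh_per_mile                        # ≈ 10 miles
gas_miles = habit_kwh * grid_kg_co2_per_kwh / gas_kg_co2_per_mile   # ≈ 3.3 miles
print(f"≈ {tesla_miles:.0f} Tesla miles or {gas_miles:.1f} gas-car miles")
```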

              • lets_get_off_lemmy@reddthat.com · edited · 28 minutes ago

                I like that as well, thank you! Yeah, the “Daily AI Habit” in the MIT article was described as…

                Let’s say you’re running a marathon as a charity runner and organizing a fundraiser to support your cause. You ask an AI model 15 questions about the best way to fundraise.

                Then you make 10 attempts at an image for your flyer before you get one you are happy with, and three attempts at a five-second video to post on Instagram.

                You’d use about 2.9 kilowatt-hours of electricity—enough to ride over 100 miles on an e-bike (or around 10 miles in the average electric vehicle) or run the microwave for over three and a half hours.

                As a daily AI user, I almost never use image or video generation; it's basically all text (mostly in the form of code). So I think this "daily habit" likely wouldn't fit most people who use AI daily, but that was their metric.

                The MIT article also cautions against trying to reverse-engineer energy usage numbers, and says we should encourage companies to release data, because outside estimates are invariably going to be off. Google's technical report affirms this: non-production estimates of AI energy usage run high, because a production system benefits from economies of scale that test setups don't.

                Edit: more context: my daily AI usage, on the extremely high end, is maybe 1,000 median text prompts from a production-level AI provider (code editor, chat window, document editing). That's equivalent to watching TV for about 36 minutes. The average daily TV consumption in the US is around 3 hours.
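                The arithmetic behind that, as a sketch. The TV wattage is my assumption (a large modern set), not a figure from either report:

```python
# 1,000 median text prompts vs. TV time. TV wattage is an assumption.
prompts = 1_000
wh_per_prompt = 0.24     # median Gemini text prompt, per the Google report
tv_watts = 400           # assumed large modern TV

daily_wh = prompts * wh_per_prompt         # 240 Wh
tv_minutes = daily_wh / tv_watts * 60
print(f"{daily_wh:.0f} Wh ≈ {tv_minutes:.0f} minutes of TV")  # → 240 Wh ≈ 36 minutes
```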

      • brucethemoose@lemmy.world · edited · 7 hours ago

        I'm not sure what you're referencing. Imagegen models are not much different, especially now that they're moving to transformers/MoE. Video-gen models are chunky indeed, but more rarely used, and they usually have much smaller parameter counts.

        Basically anything else machine learning is an order of magnitude less energy, at least.