In a study published on Monday by the peer-reviewed journal Patterns, data scientist Alex de Vries-Gao estimated the carbon emissions from electricity used by AI at between 33 million and 80 million metric tons.

That higher figure would put it above last year’s totals for Chile (78m tons), Czechia (78m tons), Romania (71m tons), and New York City (48m tons, including both CO2 and other greenhouse gases).

  • brownsugga@lemmy.world · 17 hours ago

    At what point do people just start sabotaging these things? Data centers and AI may seem like things that exist only virtually, but they have to exist in our physical world as well.

  • bridgeenjoyer@sh.itjust.works · 3 days ago

    Is this actually true?

    A lot of people don’t think there’s a big impact from using LLMs vs. having a gaming PC or heating your house… it’s just another waste of the modern world that humanity will figure out how to manage, like rebuilding the grid (which has to be done anyway).

    Let me clarify: I hate LLMs and every corporation pushing this shit, but let’s make sure we’re getting mad about the correct data.

    • bcovertigo@lemmy.world · 3 days ago

      They acknowledge that they’re forced to estimate because companies are intentionally unclear with the data.

      From the study:

      “The lack of distinction between AI and non-AI workloads in the environmental reports of data center operators means it is possible to assess the environmental impact of AI workloads only by approximating them through data centers’ general performance metrics. Company-wide metrics from the environmental disclosure of data center operators suggest that AI systems may have a carbon footprint equivalent to that of New York City in 2025, while their water footprint could be in the range of the global annual consumption of bottled water. Further disclosures from data center operators are urgently required to improve the accuracy of these estimates and to responsibly manage the growing environmental impact of AI systems.”

      • bridgeenjoyer@sh.itjust.works · 3 days ago

        It’s quite shady. These plants have thousands of meters and transmitters tracking everything, including water usage.

        Why is that data not available to us?

        Also, idgaf if it’s AI related or data center related. It’s all bad in my book. They’re boiling the oceans to serve me a personalized ad.

        Ah, capitalism.

    • real_squids@sopuli.xyz · 3 days ago

      If your gaming PC is using as much power as a house heater, you have a massive problem lol

      I gotta admit though, 300W from my PC (at peak load) is plenty to keep me and my room comfortable. And unlike a heater, you get to play video games at the same time.

      • WorldsDumbestMan@lemmy.today · 2 days ago

        Meanwhile, I can charge my tablet in about an hour on a portable solar charger, it gives off no noticeable heat as far as I can tell, and I rarely need other electronics.

        • Lost_My_Mind@lemmy.world · 3 days ago

          Ah yes. Blast the demons from hell as your thermostat raises your house’s temperature to make it hot as hell in your home.

          That’s called immersion.

    • Blaster M@lemmy.world · 3 days ago

      Last time I saw numbers, the power requirements of 100,000 ChatGPT responses equate to the same energy usage at the server end as one person watching 1 hour of Netflix.

      A gaming PC typically has a GPU that pulls 200-300W when running an AAA game, plus around 90W for a stressed CPU, plus another 100W for other system components. Add 45-60W for your monitor as well.

      Gaming takes a lot of power.

      Running a local LLM on the same GPU, a response takes about 10 seconds of GPU and supporting hardware energy, versus the hours you spend running a game. Gaming is more environmentally damaging than AI is.

      The reason these big scary numbers show up is that all that energy usage is concentrated in one spot. If we added up everyone’s individual gaming habits, it might make data center energy usage look small in comparison.

      The only real difference is that data centers use open-loop liquid cooling instead of air or closed-loop cooling; the latter two are much more environmentally friendly.
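
      For anyone who wants to check the math, here’s the back-of-envelope with those wattages plugged in; the 3-hour session length and the 10-second response time are assumptions for illustration, not measurements.

      ```python
      # Rough energy comparison: one gaming session vs. local LLM responses.
      # Wattages are the figures quoted above; the session length and the
      # 10 s per-response time are assumptions for illustration only.

      GPU_W = 300       # GPU under AAA load
      CPU_W = 90        # stressed CPU
      OTHER_W = 100     # rest of the system
      MONITOR_W = 60    # monitor

      system_w = GPU_W + CPU_W + OTHER_W + MONITOR_W   # ~550 W total draw

      gaming_wh = system_w * 3                  # assumed 3-hour session
      llm_response_wh = system_w * 10 / 3600    # assumed 10 s at full load

      print(f"3 h gaming session : {gaming_wh:.0f} Wh")
      print(f"1 local LLM reply  : {llm_response_wh:.2f} Wh")
      print(f"Replies per session: {gaming_wh / llm_response_wh:.0f}")
      ```

      By those assumed numbers, one evening of gaming draws roughly as much electricity at the wall as a thousand-odd local responses.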

      • [deleted]@piefed.world · 3 days ago

        Last time I saw numbers, the power requirements of 100,000 ChatGPT responses equate to the same energy usage at the server end as one person watching 1 hour of Netflix.

        They use the majority of the water during the training phase, but present only per-query usage numbers for people to fall for, like you’re doing right here.

        That’s like only counting the time a delivery driver spends walking packages to the door and ignoring all the time spent getting the package to the delivery company, sorting it, driving it to the airport, flying it to another city, driving it to the distribution center, sorting it again, and then driving it to your house. Sure, if you only count the walk to the door, it isn’t much time at all!
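
        As a toy version of that accounting: amortize a one-off training cost over lifetime queries and see how the per-query number moves. Every figure below is a made-up placeholder, not a number from the study or any provider.

        ```python
        # Toy amortization: spreading a one-off training cost over lifetime queries.
        # All numbers are made-up placeholders to illustrate the accounting only.

        serving_wh_per_query = 1.0          # assumed per-query serving energy (Wh)
        training_mwh = 50_000               # assumed one-off training energy (MWh)
        lifetime_queries = 10_000_000_000   # assumed queries served over the model's life

        training_wh_per_query = training_mwh * 1_000_000 / lifetime_queries
        total_wh_per_query = serving_wh_per_query + training_wh_per_query

        print(f"Serving only      : {serving_wh_per_query:.2f} Wh/query")
        print(f"Training amortized: {training_wh_per_query:.2f} Wh/query")
        print(f"Total             : {total_wh_per_query:.2f} Wh/query")
        ```

        Counting only the serving line is the “walk to the door” number.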

        • brucethemoose@lemmy.world · 2 days ago

          • For sane models, that’s way overstated. Stuff like GLM 4.6 or Kimi K2 is trained on peanuts, and the GPU time spent on their inference blows the training run away.

          • I have not checked the latest OpenAI/Grok training cost claims. But if any company is spending tens of millions (or hundreds?) on a single training run… that’s just stupid. It means they’re burning GPUs ridiculously inefficiently, for the sake of keeping up appearances. Llama 4 rather definitively proved that scaling up doesn’t work.

          The hype about ever-increasing training costs is a grift to get people to give Sam Altman money. He doesn’t need that for the architectures they’re using, and it won’t be long before everyone figures it out and switches to cheaper models for most usage.

      • cron@feddit.org · 2 days ago

        Those numbers comparing ChatGPT with Netflix do not seem plausible to me.

        Streaming a video is basically sending a file over the internet, while ChatGPT requires multiple GPUs to run.

        I found different numbers online:

        • ChatGPT: 0.3 to 3 Wh per query
        • Netflix: 77 Wh per hour (source)

        By this rough calculation, your figure is off by several orders of magnitude: roughly 25 to 250 ChatGPT queries equal an hour of Netflix, not 100,000.
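
        The arithmetic behind that range, using the figures cited above as inputs (both are rough published estimates, not measurements):

        ```python
        # How many ChatGPT queries equal one hour of Netflix streaming,
        # using the rough per-query and per-hour figures cited above?

        query_wh_low, query_wh_high = 0.3, 3.0   # Wh per ChatGPT query (estimated range)
        netflix_wh_per_hour = 77.0               # Wh per streamed hour (cited estimate)

        low = netflix_wh_per_hour / query_wh_high    # ~26 queries
        high = netflix_wh_per_hour / query_wh_low    # ~257 queries

        print(f"1 hour of Netflix ≈ {low:.0f} to {high:.0f} ChatGPT queries")
        ```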

      • bridgeenjoyer@sh.itjust.works · 3 days ago

        Exactly. I’d rather focus on the dumbing down of society and the destruction of the internet (not to mention fascism and mass surveillance) from AI than nitpick environmental concerns no one cares about.

        • Catoblepas@piefed.blahaj.zone · 3 days ago

          ‘Who cares that the house is on fire? My internet is crap!’ is exactly the dumbing down of society you’re talking about. You can’t use the internet if the planet doesn’t support human life.

          • bridgeenjoyer@sh.itjust.works · 2 days ago

            Very true. I’m just saying there are a lot of other, much worse environmental concerns to go after. Harping on AI for this reason just gets us laughed at like the old man yelling at a cloud.

  • SendMePhotos@lemmy.world · 3 days ago

    Silver lining: the AI companies stress because they’ll run out of freshwater and scramble to find a way to manufacture clean water, which then benefits the people.

    • [deleted]@piefed.world · 3 days ago

      No they fucking won’t.

      Did cigarette companies ever sponsor health initiatives to keep their customers from dying off? Do oil companies do anything to improve the environment? Do large companies polluting waterways do anything to clean them?

      What a ridiculous idea.

      • VeganBtw@piefed.social · 3 days ago

        Calm down, buddy. Exxon’s website tells us that they are trying to “achieve [their] 2030 emission-intensity reduction plans.” You see, they are so keen to help the planet, they made up their own plans and will try to respect them. If that’s not improving our environment, you’re lost my dude 😎.

    • Catoblepas@piefed.blahaj.zone · 3 days ago

      I absolutely would not count on these companies to innovate. They can’t even make a functioning product and they’re supposed to find a new way to make potable water?

      • Blaster M@lemmy.world · 3 days ago

        I mean, we can and do mass-produce potable water from seawater right now, but there’s an ugly problem with the distillation method: seawater has a lot of unwanted stuff in it, and if we just dump the leftover brine back into the ocean, it raises the toxicity locally for that area.

        • Hackworth@piefed.ca · 3 days ago

          Data centers can also use closed-loop cooling, air cooling, immersion cooling, etc.; they’re just using potable water because it’s the cheapest option (for them). But even if they didn’t innovate at all, the high end of that estimate is something like 0.02% of yearly global freshwater withdrawals. As you say, the devastating part is that location constraints determine who bears the externalities.
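
          For a sense of where a share like that comes from, here’s the division with assumed round numbers; neither input is from the study or this thread, they only show the calculation.

          ```python
          # Ballpark: AI water footprint as a share of global freshwater withdrawals.
          # Both inputs are assumed round numbers used only to show the calculation.

          global_withdrawals_km3 = 4000.0   # assumed annual global freshwater withdrawals (km^3)
          ai_water_high_km3 = 0.8           # assumed high-end AI water footprint (km^3 ≈ 0.8 trillion L)

          share_pct = ai_water_high_km3 / global_withdrawals_km3 * 100
          print(f"Share of global withdrawals ≈ {share_pct:.3f}%")   # ≈ 0.02%
          ```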

    • prole@lemmy.blahaj.zone · 3 days ago

      Why the fuck do they need to use fresh water in the first place? Isn’t it for cooling? Why do they need potable water for that?

      • bridgeenjoyer@sh.itjust.works · 2 days ago

        My easy guess from working in piping: corrosion of piping/fittings/sensors.

          Same reason you can’t run hose water in your car radiator. You need to use distilled water mixed with coolant.