The energy consumption of a single AI exchange is roughly on par with that of a single Google search back in 2009. Source. Was using Google search in 2009 unethical?
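For scale, here's a rough comparison under stated assumptions: Google's 2009 blog post put a search at about 0.0003 kWh (0.3 Wh), and published per-query estimates for chat-style models range from roughly 0.3 Wh to a few Wh depending on model and methodology. The numbers below are illustrative, not measurements.

```python
# Back-of-envelope comparison; all figures are assumptions for illustration.
WH_PER_GOOGLE_SEARCH_2009 = 0.3   # Google's 2009 figure: 0.0003 kWh per search
WH_PER_LLM_QUERY_LOW = 0.3        # low-end published estimate for a chat query
WH_PER_LLM_QUERY_HIGH = 3.0       # high-end published estimate

US_HOUSEHOLD_WH_PER_DAY = 29_000  # ~29 kWh/day, rough US household average

for label, wh in [("2009 Google search", WH_PER_GOOGLE_SEARCH_2009),
                  ("LLM query (low est.)", WH_PER_LLM_QUERY_LOW),
                  ("LLM query (high est.)", WH_PER_LLM_QUERY_HIGH)]:
    queries_per_day = US_HOUSEHOLD_WH_PER_DAY / wh
    print(f"{label}: {wh} Wh -> ~{queries_per_day:,.0f} per household-day of electricity")
```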
Total nonsense. ESRGAN was trained on potatoes, and tons of research models are. I fine-tune models on my desktop for nickels' worth of electricity; it never touches a cloud datacenter.
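Quick sanity check on the "nickels of electricity" claim, with plausible but made-up numbers (the wattage, hours, and price are all assumptions, not measurements):

```python
# Rough cost of one desktop fine-tune; every number here is an assumption.
gpu_watts = 350             # assumed draw of a single consumer GPU under load
rest_of_system_watts = 100  # assumed CPU/RAM/fan overhead
hours = 3                   # assumed length of a LoRA-style fine-tune
usd_per_kwh = 0.15          # assumed residential electricity price

kwh = (gpu_watts + rest_of_system_watts) * hours / 1000
cost = kwh * usd_per_kwh
print(f"{kwh:.2f} kWh, about ${cost:.2f} in electricity")  # ~1.35 kWh, ~$0.20
```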
At the high end, if you look past bullshitters like Altman, models are dirt cheap to run and getting cheaper. If BitNet takes off (and a 2B model was just released days ago), inference energy consumption will be basically free and on-device, like video encoding/decoding is now.
Again, I emphasize: it's corporate bullshit giving everything a bad name.
Even if the data is “ethically sourced,” the energy consumption is still fucked.
Depends on what you're producing; running Llama 3.1 locally on a Raspberry Pi doesn't produce any meaningful impact on the climate.
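Ballpark for the Pi case, with assumed numbers (a Pi 5 tops out around 10-12 W under load; the token rate and response length here are guesses):

```python
# Energy for one local LLM response on a Raspberry Pi; all inputs are assumptions.
pi_watts = 12               # assumed whole-board draw under sustained load
seconds_per_response = 120  # assumed: small quantized model, slow token rate
wh_per_response = pi_watts * seconds_per_response / 3600
print(f"~{wh_per_response:.2f} Wh per response")  # ~0.4 Wh
```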