This is a little beside the point, but even in those use cases, LLMs have the fatal flaw of being obscenely resource-intensive. They require huge amounts of electricity and cooling to keep operating. Not to mention most of them are trained on stolen data.
Even when they’re an effective tool for a given task, they’re still not an ethical one.
That’s true; I didn’t touch on those points, but I very much agree. (Yes, even though I occasionally use it myself. It’s easy to ignore the implications of what you’re doing in the moment.)