AI is good for producing low-stakes outputs where validity is nearly irrelevant, or outputs that would be scrutinized by qualified humans anyway.
It often requires massive amounts of energy and of (questionably obtained) pre-existing human knowledge to produce those outputs.
It’s also good at sifting through vast amounts of data and spotting patterns quickly.
But nothing coming out of it should be relied on without some human scrutiny. Even human output shouldn’t be relied on without scrutiny from different humans.