Yup, literally seeing human features in random noise. LLMs can’t think and aren’t conscious; anyone telling you otherwise is either trying to sell you something or has genuinely lost their mind.
I don’t necessarily think they’ve lost their mind. We built a machine that is incapable of thought or consciousness, yes, but is fine-tuned to regurgitate an approximation of it. We built a sentience-mirror, and are somehow surprised that people think the reflection is its own person.
Even more than a sentience mirror, it will lead you into a fantasy realm based on the novels it’s trained on, which often include… AI becoming sentient. It’ll play the part if you ask it.
I’d always thought that philosophical zombies were a fiction. Now we’ve built them.