Since Twitter is making the image edits itself, wouldn’t it be liable primarily for the SA/CP charges, while the requester is “just” an accessory?
I mean, if I tell you to kill someone and you do it, you don’t get to dodge charges by saying “I’m just a platform facilitating wishes.”
One of Musk’s bootlickers reportedly tweeted that Grok is just like a pen, and you can’t blame the pen for writing illegal content. Musk then retweeted it, adding that anybody who uses Grok to make illegal images will be banned.
It would be a good point, except a pen can’t be “told” what to write (or draw). A person still has to actually do the work. Telling Grok (or any other AI) that you want to see a child in a bikini should be a denied request, every single time.
The answer is a law that says each of us owns our likeness and PII (Personally Identifying Information), and that we alone set the terms for licensing them. Since minors cannot consent, their likenesses and PII would be off-limits entirely. Adults could set their own price: if a woman demanded $1M for her likeness, the AI would be legally obligated to collect that fee before generating the image. Otherwise, the company operating that AI would be on the hook.
I called it in an earlier post when I said Musk wanted this to happen.
I would say that anyone uploading pictures of minors to the pedo app is almost as liable.