themachinestops@lemmy.dbzer0.com to Technology@lemmy.world (English) · 5 days ago

A Developer Accidentally Found CSAM in AI Data. Google Banned Him For It (www.404media.co)

cross-posted to: [email protected]
bobzer@lemmy.zip · 4 days ago

Why say "sexual abuse material images," which is grammatically incorrect, instead of "sexual abuse images," which is what you mean, and shorter?