AI company’s chatbot faces criticism over its generation of sexualized, nonconsensual images of women and girls

Elon Musk’s artificial intelligence company has raised $20bn in its latest funding round, the startup announced Tuesday, even as its marquee chatbot Grok faces backlash over generating sexualized, nonconsensual images of women and underage girls.

xAI’s Series E funding round featured big-name investors, including Nvidia, Fidelity Management and Research Company, Qatar’s sovereign wealth fund, and Valor Equity Partners – the private investment firm of Musk’s longtime friend and former DOGE member Antonio Gracias.

The funding round exceeded its initial $15bn target, according to xAI’s press release. The company touted Grok’s image generation abilities in the announcement of its latest funding round.

  • NotSteve_@piefed.ca · 3 days ago

    over Grok deepfakes

    I feel like the title of this article kind of undersells the fact that Elon Musk’s own platform, X, was creating and serving CSAM content, and has yet to commit to any policy changes that would prevent it from continuing to happen.

    It’s driving me crazy how much every major news org is just sane-washing everything that is happening right now.

    • dust_accelerator@discuss.tchncs.de · 3 days ago

      That’s why I’ve started saying this in regular conversation: “XTwitter? Oh, the child pornography website? No, I don’t frequent such websites.”

      Reach may be limited, but word of mouth is still a more trusted medium in my bubble.

    • SillyGooseQuacked@lemmy.world · 3 days ago

      Doubly crazy when you consider that (1) none of the other mainstream AI services makes it so easy to create and distribute CSAM content - all it takes on X is an @ symbol, while every other AI service requires an actual prompt injection attack - and (2) a ton of well-meaning people still use X, despite the simultaneous Nazi (and apparently pedophile) bar problem.