cross-posted from: https://lemmy.world/post/7258145

The tool, called Nightshade, messes up training data in ways that could cause serious damage to image-generating AI models. It is intended as a way to fight back against AI companies that use artists’ work to train their models without the creators’ permission.

ARTICLE - Technology Review

ARTICLE - Mashable

ARTICLE - Gizmodo

The researchers tested the attack on Stable Diffusion’s latest models and on an AI model they trained themselves from scratch. When they fed Stable Diffusion just 50 poisoned images of dogs and then prompted it to generate images of dogs, the output started looking weird: creatures with too many limbs and cartoonish faces. With 300 poisoned samples, an attacker can manipulate Stable Diffusion into generating images of dogs that look like cats.
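The papers describe Nightshade as optimizing small, near-invisible pixel perturbations so an image of one concept (a dog) reads as another (a cat) to the model that trains on it. The authors’ actual optimization is more involved, but a minimal sketch of that general idea, using a CLIP image encoder as a stand-in target model, might look like the following. The filename `dog.jpg`, the perturbation budget, and the step count are all illustrative assumptions, not values from the paper.

```python
# A hedged sketch of concept-misdirection poisoning (NOT the authors' actual
# Nightshade method): nudge a dog image's CLIP embedding toward the text
# embedding for "a photo of a cat" while keeping the pixel change small,
# so the image still looks like a dog to a human but pushes a model trained
# on it toward "cat".
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32").eval()
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

image = Image.open("dog.jpg")  # hypothetical input file
pixels = processor(images=image, return_tensors="pt")["pixel_values"]

# Target: the normalized text embedding for the decoy concept.
with torch.no_grad():
    text_inputs = processor(text=["a photo of a cat"], return_tensors="pt")
    target = model.get_text_features(**text_inputs)
    target = target / target.norm(dim=-1, keepdim=True)

delta = torch.zeros_like(pixels, requires_grad=True)
eps = 8 / 255  # rough perturbation budget in the model's normalized input space
opt = torch.optim.Adam([delta], lr=1e-2)

for _ in range(200):
    poisoned = pixels + delta
    feats = model.get_image_features(pixel_values=poisoned)
    feats = feats / feats.norm(dim=-1, keepdim=True)
    loss = -(feats * target).sum()  # maximize cosine similarity to "cat"
    opt.zero_grad()
    loss.backward()
    opt.step()
    with torch.no_grad():
        delta.clamp_(-eps, eps)  # keep the change visually subtle

# `pixels + delta` is the poisoned sample; per the article, on the order of
# 300 such images sufficed to flip Stable Diffusion's "dog" concept.
```

The key design point is the budget constraint: the attack only works as described if the poisoned images pass as normal artwork, so the perturbation must stay imperceptible while still moving the model’s internal representation.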

  • SereneSadie@quokk.au · 6 days ago

    Right, because artists are all about hoarding their work to themselves and not letting anyone get copies ever.

    They’ll have to DRM it by restricting access to being there in person only, no recording devices whatsoever.

    Clearly that’s the only logical solution left.