How can one configure their Lemmy instance to reject illegal content? And I mean the bad stuff, not just the NSFW stuff. There are some online services that will check images for you, but I’m unsure how they could integrate with Lemmy.

As Lemmy gets more popular, I’m worried nefarious users will post illegal content that I am liable for.

  • snowe@programming.dev · 8 points · 14 hours ago

    You can set up Cloudflare as your CDN and turn on CSAM detection. It will automatically block links to known CSAM from the managed global CSAM hash lists.

    If you want something in addition to that, you can use db0’s plugin that adds in a similar capability.

    • Onno (VK6FLAB)@lemmy.radio · 4 points · edited · 13 hours ago

      From your description it’s unclear: does this also block CSAM that’s physically on your infrastructure, or just links to external content?

      Cloudflare is currently attempting to block LLM bots and doing a shit job at it. I’m guessing that any CSAM blocking would be incomplete at best.

      What happens if something “gets through”, or if non-CSAM content is wrongly blocked? Both materially, as in what actually happens, and legally, since I doubt that Cloudflare would ever assume liability for content on your infrastructure.

      Edit: I thought I’d also point out that this is not the only type of content that could get you into a legal black hole. For example, a post that breaches a suppression order made by a court in Melbourne, Australia, or defamatory content that gets published, etc.

  • Onno (VK6FLAB)@lemmy.radio · 7 points · 14 hours ago

    I am not a lawyer and I don’t play one on the internet.

    To my understanding, the only real prevention is controlling who can have an account on your instance.

    That said, it’s not clear to me how federated content is legally considered.

    The only thing I can think of is running a bot on your instance that uses the API of a service like the ones you mention to deal with such images.
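A minimal sketch of what such a bot’s core check could look like, assuming a locally held hash blocklist. This is a deliberate simplification: real detection services (Cloudflare’s scanning tool, PhotoDNA and the like) match against vetted perceptual-hash lists that you cannot self-host, so a production bot would send each image to the vendor’s API instead of doing this exact-hash comparison locally. All names here are hypothetical.

```python
import hashlib

def is_blocked(image_bytes: bytes, blocked_sha256: set[str]) -> bool:
    """Return True if the image's SHA-256 digest is on the local blocklist.

    Simplified stand-in for a vendor API call: exact hashing only matches
    byte-identical files, whereas real services use perceptual hashing to
    catch re-encoded or resized copies.
    """
    return hashlib.sha256(image_bytes).hexdigest() in blocked_sha256

# Hypothetical blocklist with one placeholder entry (not a real hash list).
blocklist = {hashlib.sha256(b"known-bad-example").hexdigest()}

print(is_blocked(b"known-bad-example", blocklist))  # True
print(is_blocked(b"harmless picture", blocklist))   # False
```

A bot built around this would fetch new images from the instance’s post feed, run the check (or the vendor API call), and remove flagged posts through the instance’s moderation API.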

    Your post is the first one I’ve seen recently that even describes the issue of liability. In my opinion it’s the single biggest concern in the fediverse, and it’s why I’ve never hosted my own instance.