Onno (VK6FLAB)

Anything and everything Amateur Radio and beyond. Heavily into Open Source and SDR, working on a multi-band monitor and transmitter.

#geek #nerd #hamradio VK6FLAB #podcaster #australia #ITProfessional #voiceover #opentowork

  • 9 Posts
  • 822 Comments
Joined 1 year ago
Cake day: March 4th, 2024

  • From your description it’s unclear: does this also block CSAM that’s physically on your infrastructure, or just links to external content?

    CloudFlare is currently attempting to block LLM bots and doing a shit job at it. I’m guessing that any CSAM blocking would be incomplete at best.

    What happens if some content “gets through”, or if non-CSAM content is wrongly blocked? Both materially (as in, what actually happens) and legally, since I doubt that CloudFlare would ever assume liability for content on your infrastructure.

    Edit: I thought I’d also point out that this is not the only type of content that could get you into a legal black hole. For example, a post that breaches a legal ruling, say a suppression order issued by a court in Melbourne, Australia, or defamatory content, etc.


  • I am not a lawyer and I don’t play one on the internet.

    To my understanding, the only prevention is controlling who can have an account on your instance.

    That said, it’s not clear to me how federated content is legally considered.

    The only mitigation I can think of is running a bot on your instance that uses the API of a service like the one you mention to deal with such images; a rough sketch follows below.

    Your post is the first one I’ve seen recently that even describes the issue of liability. In my opinion it’s the single biggest concern in the fediverse, and it’s why I’ve never hosted my own instance.
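
    Something along these lines, as a rough sketch. The scanning service and its response shape are made up for illustration, and the Lemmy endpoints are from memory of the v0.19 HTTP API, so verify both against the actual docs before relying on any of it:

    ```python
    # Rough sketch: poll recent posts on a Lemmy instance and remove any
    # whose linked image a scanning service flags.
    # Assumptions: SCAN_URL is a hypothetical endpoint that accepts an image
    # URL and returns {"flagged": bool}; the Lemmy endpoints and Bearer auth
    # are from the v0.19 HTTP API as best I recall.
    import requests

    LEMMY = "https://example.instance/api/v3"   # your instance (placeholder)
    SCAN_URL = "https://scanner.example/check"  # hypothetical scanning API
    JWT = "..."                                 # token for a mod/admin account

    def recent_posts(limit=20):
        r = requests.get(f"{LEMMY}/post/list",
                         params={"sort": "New", "limit": limit},
                         headers={"Authorization": f"Bearer {JWT}"})
        r.raise_for_status()
        return r.json()["posts"]

    def is_flagged(image_url):
        r = requests.post(SCAN_URL, json={"url": image_url})
        r.raise_for_status()
        return r.json().get("flagged", False)

    def sweep():
        for item in recent_posts():
            post = item["post"]
            url = post.get("url")
            if url and is_flagged(url):
                # Remove the post as a moderator, with a reason for the modlog.
                requests.post(f"{LEMMY}/post/remove",
                              json={"post_id": post["id"], "removed": True,
                                    "reason": "flagged by automated scan"},
                              headers={"Authorization": f"Bearer {JWT}"}
                              ).raise_for_status()

    if __name__ == "__main__":
        sweep()
    ```

    In practice you’d also want to scan comments and avatars, handle pagination and rate limits, and decide what “flagged” means legally (report vs. remove vs. purge), but that’s the general shape.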