• 0 Posts
  • 34 Comments
Joined 1 year ago
Cake day: June 11th, 2023


  • That was probably his stance back when YouTube ad revenue was his main stream of income.

    In 2024 they pay pennies, and his real income is from sponsorships like those dbrand skins and manscaping utilities. And their own merch, of course.

    They’ve been pushing their own media platform (floatplane), so I’m willing to bet this was a bit of a game of chicken with YouTube. YouTube wouldn’t ban one of their biggest channels, and even if they did it’d turn into great publicity for floatplane.

    While I don’t think they’d be able to get a lot of their subscribers over to floatplane completely, I do think they’d be able to pull over lots of random views by having their shorts on Facebook, Instagram and whatever else is trying to mimic TikTok these days.


  • I think you got the point. Criminals use the same services as the rest of us, and CSAM is being used as a pretext to outlaw or bypass end-to-end encryption.

    It’s a noble cause, but it puts all of us in a vulnerable position. As post-communist countries know from past experience, once these measures are in place, the next government will use them for surveillance of all kinds when it’s their turn.

    Yes, I know: if you have nothing to hide, you have nothing to fear. I’m not doing anything illegal on the toilet, but I still prefer to keep the door closed - even if I’m home alone.

    Chat control 1.0 has been voluntarily implemented by the big platforms, but it has not been fruitful: lots of false positives, and not enough resources to look at the true positives. The delegates preparing this have demonstrated poor technical understanding.

    Whistleblowers won’t have confidence in anonymity. A journalist asked the author of the proposal (Ylva Johansson) whether he, as a journalist, would still be able to receive tips from whistleblowers in secrecy. She stumbled in her answer and said that CSAM should be illegal.

    Police and officials are of course exempt from chat control 2.0. Secrecy for me, but not for thee…


  • Even without the privacy concerns, I think it removes the sovereignty of your own computer.

    I decide what code I run on my computer.

    A few years ago I had some peripheral (Bluetooth headphones, I think) that launched Music.app every time I plugged it in. As I don’t use Music.app and there was no way to disable this behaviour, I figured I could just delete the app.

    Nope! Music.app is a system application on a read-only partition shadowed onto your root filesystem. Apparently deleting it is possible if you boot with the partition mounted read-write in developer mode, but you’ll get to do it all over again with every update.


  • No. It comes down to Dijkstra’s old statement: “Testing can only prove the presence of bugs, not their absence.”

    You can prove the logical correctness of code, but an abstract property such as “is there an unknown weakness” is much harder to formalise. The tricky part is coming up with the right constraints to prove.

    Security researchers tend to be on the testing side of things.
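
    Dijkstra’s point can be sketched with a toy example (the function and tests here are hypothetical, not from any real codebase): every assertion in the suite passes, yet the function still misbehaves on an input nobody thought to test.

    ```python
    def clamp(value, low, high):
        """Clamp value into [low, high]. Buggy: assumes low <= high."""
        return max(low, min(value, high))

    # A small test suite -- every assertion passes...
    assert clamp(5, 0, 10) == 5     # in range: unchanged
    assert clamp(-3, 0, 10) == 0    # below range: clamped up
    assert clamp(42, 0, 10) == 10   # above range: clamped down

    # ...yet for the untested case low > high, the function silently
    # returns low instead of rejecting the nonsensical bounds:
    print(clamp(5, 10, 0))  # -> 10
    ```

    The tests prove the presence of correct behaviour on three inputs; they say nothing about the absence of bugs elsewhere.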

    A notable example is how DES got its S-boxes changed between proposal and standardisation. The belief at the time was that the new S-boxes hid some unknown backdoor for the NSA. AFAIK, that has never been proven.