dragonfornicator @partizle.com
Posts 0
Comments 7
Apple Expands Its On-Device Nudity Detection to Combat CSAM — WIRED
  • I am aware that it's local; I just assumed it would also call home.

    My threat model here is based on cases like this: https://www.theverge.com/2022/8/21/23315513/google-photos-csam-scanning-account-deletion-investigation

    And yes, I did see it as a privacy issue, not a censorship one. If this faces pressure to expand toward other content, it could inevitably become a problem comparable to the "Article 13" situation Europe was, or is, facing.

    Generally, blocking specific types of content is a valid option to have, as long as it is an option and the user knows it is one. I just distrust it coming from the likes of Google or Apple.

  • Apple Expands Its On-Device Nudity Detection to Combat CSAM — WIRED
  • It's something that's not talked about, which, given our data-obsessed world, I interpret as "we just do it by default (because nobody will complain, it's normal, yada yada)".

    Besides, it's stated that the scanning itself only happens on your device. If you scan locally for illegal material, it's not really far-fetched that someone gets informed when, for example, CSAM is found on a device. Why else would you scan for it? So at the very least, that information is collected somewhere.

  • Apple Expands Its On-Device Nudity Detection to Combat CSAM — WIRED
  • So more scanning of arbitrary data for sanctimonious reasons, and definitely not for the sake of collecting data. I'm curious what is sent where regarding those scans. There has already been a scandal involving Amazon and its Ring cameras. The software might run on the device, but whatever detection it uses is bound to make mistakes, and who sees the results? Is everything fully automated, or human-verified? I don't know which one would concern me more. That's not even touching on young people taking photos of their own bodies for various reasons. And just because it runs on your device does not necessarily mean that whatever is scanned is never sent anywhere; it just means that the scanning happens on your device.

    Quite frankly, if it weren't so horrible, I'd find the idea of some secret ring inside Apple using that CSAM detection to collect material to sell on the dark net rather interesting. It might make a good plot for a thriller or novel...

  • We're giving Lemmy a try: Welcome to [email protected]
  • First.

    But seriously, I am really curious how this whole shebang will turn out. Some subs will go dark for two days, which will probably not achieve very much. But what about the exodus when the third-party apps go down? How many will just suck it up and use the official app? How many will actually migrate? Will Reddit kneel to the community? Time will tell. Grab some popcorn and enjoy the show!