Apple Expands Its On-Device Nudity Detection to Combat CSAM

Instead of scanning iCloud for illegal content, Apple’s tech will locally flag inappropriate images for kids. And adults are getting an opt-in nudes filter too.
