Apple explains why it won’t scan iCloud Photos for CSAM

Apple has explained why it abandoned plans to roll out a feature that would detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos. The company stated that scanning users’ privately stored iCloud data would create new attack vectors for data thieves, and could open the door to bulk surveillance and other unintended consequences.