Apple delays plans to roll out CSAM detection in iOS 15 after privacy backlash

Apple has delayed plans to roll out its child sexual abuse material (CSAM) detection technology, which it announced last month, citing feedback from customers and policy groups.

The reaction to that announcement has been largely negative. The Electronic Frontier Foundation said this week it has amassed more than 25,000 signatures from consumers opposing the plans. On top of that, close to 100 policy and rights groups, including the American Civil Liberties Union, called on Apple to abandon its plans to roll out the technology.

In a statement Friday morning, Apple said:

“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of child sexual abuse material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

Apple’s so-called NeuralHash technology is designed to identify known CSAM on a user’s device without Apple possessing the image or knowing its contents. Rather than scanning photos after they are uploaded to iCloud, where they are stored encrypted, NeuralHash scans for known CSAM on the user’s device, which Apple claims is more privacy-friendly than the blanket scanning that cloud providers currently use.
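To illustrate the general approach, here is a minimal sketch in Python of on-device matching against a database of known image fingerprints. It is an illustrative toy, not Apple’s system: it uses a simple “average hash” rather than a neural network, assumes the Pillow imaging library is installed, and its distance threshold is a hypothetical parameter.

```python
# Toy sketch of on-device perceptual-hash matching. This is NOT
# NeuralHash: it uses a simple "average hash" fingerprint, and the
# match threshold below is a hypothetical value for illustration.
from PIL import Image


def average_hash(path: str, size: int = 8) -> int:
    """Downscale to grayscale, then set one bit per pixel above the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (1 if pixel > mean else 0)
    return bits


def matches_known(path: str, known_hashes: set[int], max_distance: int = 5) -> bool:
    """Report a match if the image's fingerprint is within a small
    Hamming distance of any entry in the known-hash database."""
    h = average_hash(path)
    return any(bin(h ^ known).count("1") <= max_distance for known in known_hashes)
```

The key property is that the device only ever compares compact fingerprints, so the matching step does not require access to the original images in the database.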

But security experts and privacy advocates have expressed concern that the system could be abused by highly resourced actors, such as governments, to implicate innocent victims, or be manipulated to detect other material that authoritarian nation states find objectionable.

Within weeks of the technology’s announcement, researchers said they were able to create a “hash collision” using NeuralHash, effectively tricking the system into thinking two entirely different images were the same.
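To see why perceptual fingerprints can collide at all, the toy sketch below (again plain Python with Pillow, not NeuralHash) constructs two images whose pixel values differ everywhere yet whose fingerprints are identical; the actual NeuralHash collisions were produced with deliberately crafted adversarial images.

```python
# Toy "hash collision": two visibly different images that share the
# same average-hash fingerprint. Not NeuralHash; for illustration only.
from PIL import Image


def average_hash_of(img: Image.Image, size: int = 8) -> int:
    """Same average-hash scheme as above, over an in-memory image."""
    small = img.convert("L").resize((size, size))
    pixels = list(small.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (1 if pixel > mean else 0)
    return bits


def two_tone(left: int, right: int, size: int = 64) -> Image.Image:
    """An image whose left half is one gray level and right half another."""
    img = Image.new("L", (size, size), left)
    img.paste(right, (size // 2, 0, size, size))
    return img


# Every pixel differs between the two images, but each pixel sits on
# the same side of its image's mean, so the fingerprints are equal.
img_a = two_tone(left=10, right=200)
img_b = two_tone(left=60, right=250)
assert average_hash_of(img_a) == average_hash_of(img_b)
```

Because this kind of hash only records whether each region is brighter or darker than the image average, any two images with the same brightness pattern collide, no matter how different their actual content.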

iOS 15 is expected to arrive in the next few weeks.

This report has been updated with more details about NeuralHash and to clarify that iCloud Photos are encrypted but not end-to-end encrypted.
