Apple delays controversial CSAM detection feature

Image: Apple sign (Shutterstock via Dennis)

The mobile phone giant has backtracked on plans to scan images on its devices


6 September 2021

Apple has delayed its controversial image scanning feature following negative feedback.

The company updated its briefing page on the technology, explaining it was delaying the feature based on the response it received from customers and privacy advocates.

“Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” the company said.

 

The feature, announced in August, would have allowed Apple to scan photos uploaded to iCloud. Photos would be hashed on the user's device and compared against a database of hashes of known Child Sexual Abuse Material (CSAM) images.

This database was originally supposed to come from the National Center for Missing and Exploited Children (NCMEC), but Apple later clarified that it would only match against images that also appeared in databases from clearing houses in multiple countries.

Apple also said a human review would only be triggered once an account passed a threshold of 30 CSAM matches found this way.

The company also announced a second feature that would scan images sent to children in its iMessage app to detect nudity and notify parents.

Both plans provoked concern from privacy groups around the world, which warned that Apple's scanning technology could be repurposed to search for other kinds of imagery, opening users up to government surveillance. The company vowed to refuse government requests to expand the searches.

This week, the Electronic Frontier Foundation delivered a petition protesting the technology, which was slated to be included in the next version of iOS and initially restricted to US users.

The organisation argued that the technology would undermine the end-to-end encryption Apple has touted in its operating systems.

“Apple’s surveillance plans don’t account for abusive parents, much less authoritarian governments that will push to expand it. Don’t let Apple betray its users,” it added.

© Dennis Publishing
