Apple changes plans to scan iPhones for child abuse images after criticism

After a week of criticism over its plans to scan iPhone images to detect child sexual abuse material (CSAM), Apple announced on Friday that it will flag only images supplied by clearinghouses in multiple countries, including the US National Center for Missing and Exploited Children (NCMEC).

The company had previously declined to say how many matched images on an iPhone or computer it would take before the system alerted Apple employees to perform a human review.

In its Security Threat Model Review of Apple's Child Safety Features paper, published last Friday, the iPhone maker said it would begin with an initial match threshold of 30 images, a number that could change later as the system improves.

"We expect to choose an initial match threshold of 30 images," Apple wrote in the paper. "Since this initial threshold contains a drastic safety margin reflecting a worst-case assumption about real-world performance, we may change the threshold after continued empirical evaluation of NeuralHash false positive rates – but the match threshold will never be lower than what is required to produce a one-in-one trillion false positive rate for any given account."
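The threshold mechanism Apple describes can be pictured as a simple gate: an account's matched-image count is tallied, and human review is triggered only once that count reaches the threshold. The sketch below is a toy illustration under assumed names; the real system performs matching cryptographically via private set intersection on NeuralHash fingerprints, whereas this example uses a plain set lookup purely to show the threshold logic.

```python
# Illustrative sketch only: threshold-gated flagging, not Apple's actual
# PSI/NeuralHash protocol. All function and variable names are hypothetical.

MATCH_THRESHOLD = 30  # initial threshold cited in Apple's paper

def count_matches(photo_hashes, known_hashes):
    """Count how many of an account's photo hashes appear in the known set.
    (The real system never reveals individual matches below the threshold.)"""
    return sum(1 for h in photo_hashes if h in known_hashes)

def should_escalate(match_count, threshold=MATCH_THRESHOLD):
    """Escalate an account for human review only at or above the threshold."""
    return match_count >= threshold
```

For example, an account with 29 matches would not be escalated, while one with 30 would, which is how a high threshold builds in the safety margin the paper describes.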

Apple first announced the controversial plan earlier this month, and it was met with global backlash over privacy concerns. While some supported the company, saying the system could help protect children, many, including WhatsApp CEO Will Cathcart and even some of Apple's own employees, criticized the plan as putting user privacy at risk.

"Apple has long needed to do more to fight CSAM, but the approach they are taking introduces something very concerning into the world," Cathcart said last week. "Instead of focusing on making it easy for people to report content that's shared with them, Apple has built software that can scan all the private photos on your phone — even photos you haven't shared with anyone. That's not privacy."

Read more: WhatsApp CEO slams Apple’s plans to scan iPhones for child abuse images

Last week, Reuters reported that Apple employees had exchanged more than 800 messages about the plan in an internal Slack channel, with many expressing concerns.

Written by Maya Robertson

