After being criticized for its plans to scan iPhone images to detect child sexual abuse material (CSAM), Apple announced on Friday that it has delayed the rollout of its CSAM detection technology.
In mid-August, an international coalition of more than 90 policy and rights groups published an open letter to Apple CEO Tim Cook asking the company to cancel its plans. WhatsApp CEO Will Cathcart also took aim at Apple on Twitter, saying the iPhone maker’s plan to scan images for child sexual abuse material is “the wrong approach and a setback for people’s privacy all over the world”.
“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material,” Apple said in a statement on Friday. “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
Apple first announced its plans in early August, saying the software, which had been slated to arrive later this year with iOS 15 and iPadOS 15, would use cryptography and artificial intelligence to scan images as they are uploaded to iCloud Photos.
Each image would be assigned a unique numerical code by a hashing technology Apple calls NeuralHash. If an image’s code matches that of already known CSAM, Apple would review it manually and, if it is confirmed to be illegal, report it to the National Center for Missing and Exploited Children (NCMEC).
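To illustrate the kind of hash-matching flow described above, here is a minimal, purely hypothetical sketch in Python. It is not Apple’s NeuralHash (a perceptual hash whose internals are not detailed here); the hash function, the hash values, and the known-hash set are stand-ins invented for the example.

```python
# Illustrative sketch only: a simplified hash-matching flow, not Apple's NeuralHash.
# The hash values and the known-CSAM hash set below are hypothetical placeholders.
import hashlib

KNOWN_CSAM_HASHES = {
    "a3f91c0e55d2b7aa",  # hypothetical code representing a known image
    "0b77e412c9d8f301",
}


def compute_image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash such as NeuralHash.

    A real perceptual hash maps visually similar images to the same code;
    here a truncated cryptographic digest keeps the example self-contained.
    """
    return hashlib.sha256(image_bytes).hexdigest()[:16]


def flag_for_review(image_bytes: bytes) -> bool:
    """Return True if the image's code matches a known-CSAM code.

    In the design described in the article, a match would trigger a manual
    review, and only content confirmed to be illegal would be reported to NCMEC.
    """
    return compute_image_hash(image_bytes) in KNOWN_CSAM_HASHES


if __name__ == "__main__":
    sample = b"example image bytes"
    print("flagged for manual review:", flag_for_review(sample))
```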
The civil society groups had warned that the software could be used by governments to detect not only child sexual abuse material but also other images they deem “objectionable”, such as photos of political protests.