Apple kills plans to scan iCloud images to detect CSAM

Tech giant Apple has officially cancelled its controversial plans to scan iCloud images to detect child sexual abuse material (CSAM).

The company first announced plans to launch the scanning software in August 2021, but quickly came under fire from advocacy groups, researchers and its own customers, who raised security concerns.

Days after the announcement, an international coalition of more than 90 policy and rights groups wrote an open letter to CEO Tim Cook, urging the iPhone maker to abandon the plans. While Apple said the software would be privacy-focused, critics argued that governments could use it as a surveillance tool, putting people’s privacy at risk.

Following the global criticism, the company said it would “take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features”.

However, it has now decided not to continue with its previous plans, Wired reported.

“After extensive consultation with experts to gather feedback on child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021,” Apple said in a statement. “We have further decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all.”

Written by Maya Robertson

