Apple plans to scan iPhones for child sexual abuse material, fueling privacy concerns

Apple has announced new measures to protect children from predators, including software that will scan images on iPhones in the United States to detect child sexual abuse material (CSAM). The announcement, however, was met with privacy concerns. 

The software, which will arrive later this year with the release of iOS 15 and iPadOS 15, will use cryptography and artificial intelligence to scan images when they are stored in iCloud Photos. 

Images are assigned numerical codes by a hashing technology called NeuralHash. If an image’s code matches the code of already known CSAM, Apple will review the image manually and, if it is confirmed to be illegal, report it to the National Center for Missing and Exploited Children (NCMEC). 
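
To make that flow concrete, here is a minimal sketch of the matching logic in Swift. This is not Apple’s implementation: NeuralHash and the underlying cryptographic protocol are not public, and every name, hash value, and the threshold below are assumptions made purely for illustration.

```swift
// Illustrative sketch only: real NeuralHash codes are not plain strings, and
// Apple's actual protocol uses on-device cryptography rather than a simple set lookup.
struct HashedImage {
    let id: String
    let hashCode: String   // stand-in for a NeuralHash-style code
}

// Placeholder database of codes for already known CSAM (values are made up).
let knownHashDatabase: Set<String> = ["hash-known-001", "hash-known-002", "hash-known-003"]

// Apple describes a threshold of matches before any review; the exact value here is assumed.
let reviewThreshold = 30

// Count how many of an account's images hash to a code in the known database.
func countMatches(in images: [HashedImage]) -> Int {
    images.filter { knownHashDatabase.contains($0.hashCode) }.count
}

// Only when the threshold is exceeded would the account be surfaced for manual review.
func shouldEscalateForReview(_ images: [HashedImage]) -> Bool {
    countMatches(in: images) > reviewThreshold
}

let uploads = [HashedImage(id: "IMG_0001", hashCode: "hash-unknown-xyz")]
print(shouldEscalateForReview(uploads))   // prints "false": no matches, so nothing is flagged
```

The point the sketch tries to capture is that only hash codes are compared, and an account is surfaced for human review only after the match threshold is exceeded.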

Image source: Apple

In its detailed CSAM Detection – Technical Summary report, the company said: 

  • Apple does not learn anything about images that do not match the known CSAM database.
  • Apple can’t access metadata or visual derivatives for matched CSAM images until a threshold of matches is exceeded for an iCloud Photos account.
  • The risk of the system incorrectly flagging an account is extremely low. (In addition, Apple manually reviews all reports made to NCMEC to ensure reporting accuracy.)
  • Users can’t access or view the database of known CSAM images.
  • Users can’t identify which images were flagged as CSAM by the system.

“Apple’s expanded protection for children is a game changer,” said John Clark, president and CEO of NCMEC. “The reality is that privacy and child protection can coexist.”

However, following the announcement, many experts raised concerns about user privacy, warning that the new software could be used by governments to spy on citizens. Some also said that Apple’s decision could push other tech companies to offer similar software. 

“Regardless of what Apple’s long-term plans are, they’ve sent a very clear signal. In their (very influential) opinion, it is safe to build systems that scan users’ phones for prohibited content,” said Matthew Green, a security researcher at Johns Hopkins University.

“Whether they turn out to be right or wrong on that point hardly matters. This will break the dam — governments will demand it from everyone,” he added.

In addition to the CSAM detection system, Apple also announced that the iMessage app will warn children and their parents when they receive or send sexually explicit photos.

Image source: Apple

When children receive this type of content, Apple will automatically blur the photo and ask whether they really want to view it. It will also offer helpful resources and reassure them that it’s okay if they don’t want to view it. 

If they choose to view the photo, or to send similar photos to others, Apple will notify their parents with a message. 
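
For readers who want a more concrete picture, here is a minimal sketch of that decision flow in Swift. It is not Apple’s implementation; the types, the parental-notification toggle, and the step descriptions are assumptions made only to illustrate the behavior described above.

```swift
// Illustrative sketch of the iMessage child-safety flow described in the article.
enum ChildAction { case declined, viewedAnyway }

struct MessageSafetySettings {
    let notifyParents: Bool   // whether parental notification applies (assumed configurable)
}

func handleFlaggedPhoto(action: ChildAction, settings: MessageSafetySettings) -> [String] {
    // The photo arrives blurred, with a warning and helpful resources, before anything else happens.
    var steps = ["Blur the photo",
                 "Ask the child whether they really want to view it",
                 "Offer helpful resources"]
    switch action {
    case .declined:
        steps.append("Keep the photo blurred; no one is notified")
    case .viewedAnyway where settings.notifyParents:
        steps.append("Show the photo and send a message to the parents")
    case .viewedAnyway:
        steps.append("Show the photo without notifying anyone")
    }
    return steps
}

// Example: the child views the photo anyway and parental notification applies.
print(handleFlaggedPhoto(action: .viewedAnyway,
                         settings: MessageSafetySettings(notifyParents: true)))
```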


Green also said that governments could use the new feature to demand that Apple conduct mass surveillance.

“Now Apple has demonstrated that they can build a surveillance system, for very specific purposes, that works with iMessage,” he said. “I wonder how long they’ll be able to hold out from the Chinese government?”

What do you think?

Written by Sophie Blake
