Apple has announced new measures to protect children from predators, including software that will scan images on iPhones in the United States to detect child sexual abuse material (CSAM). However, the announcement was met with privacy concerns.
The software, which will arrive later this year with the release of iOS 15 and iPadOS 15, will use cryptography and artificial intelligence to scan images as they are stored in iCloud Photos.
Images will be assigned unique numerical codes by a hashing technology called NeuralHash. If an image’s code matches the code of a known CSAM image, Apple will review the match manually and, if it confirms the content is illegal, report it to the National Center for Missing and Exploited Children (NCMEC).
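The matching flow described above can be sketched in a few lines of Python. This is only an illustration of the database-lookup idea: NeuralHash is a proprietary perceptual hash that is robust to resizing and re-encoding, so the cryptographic SHA-256 used here is a stand-in, and the function names and sample bytes are invented for the example.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for NeuralHash: a real perceptual hash tolerates minor
    # image edits; SHA-256 here only illustrates the matching flow.
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_database(image_bytes: bytes, known_hashes: set) -> bool:
    # An image is flagged when its hash appears in the known-CSAM database.
    return image_hash(image_bytes) in known_hashes

# Hypothetical hash database (in reality supplied by NCMEC, never by users).
db = {image_hash(b"known-bad-image")}

print(matches_known_database(b"known-bad-image", db))  # True
print(matches_known_database(b"harmless-photo", db))   # False
```

In Apple’s actual design the comparison happens under cryptographic protections (private set intersection), so neither the device nor Apple sees a plain match result for a single image; the sketch above shows only the conceptual lookup.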
In its detailed CSAM Detection – Technical Summary report, the company said:
- Apple does not learn anything about images that do not match the known CSAM database.
- Apple can’t access metadata or visual derivatives for matched CSAM images until a threshold of matches is exceeded for an iCloud Photos account.
- The risk of the system incorrectly flagging an account is extremely low. (In addition, Apple manually reviews all reports made to NCMEC to ensure reporting accuracy.)
- Users can’t access or view the database of known CSAM images.
- Users can’t identify which images were flagged as CSAM by the system.
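The second bullet above describes a threshold scheme: Apple cannot inspect any matched image until an account accumulates enough matches. A minimal sketch of that gating logic, assuming a made-up threshold value (Apple had not published the actual number at announcement):

```python
THRESHOLD = 30  # illustrative value only; the real threshold is Apple's

def account_reviewable(match_count: int, threshold: int = THRESHOLD) -> bool:
    # Until the number of matches exceeds the threshold, the matched
    # images' metadata and visual derivatives remain inaccessible.
    return match_count > threshold

print(account_reviewable(5))   # False: below threshold, nothing is revealed
print(account_reviewable(31))  # True: threshold exceeded, human review begins
```

In the real system this gate is enforced cryptographically (threshold secret sharing), not by a simple counter check; the sketch only conveys the all-or-nothing behavior the bullet describes.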
“Apple’s expanded protection for children is a game changer,” said John Clark, president and CEO of NCMEC. “The reality is that privacy and child protection can coexist.”
However, following the announcement, many experts raised concerns about user privacy, warning that the new software could be used by governments to spy on citizens. Some also said that Apple’s decision could pressure other tech companies to offer similar software.
“Regardless of what Apple’s long term plans are, they’ve sent a very clear signal. In their (very influential) opinion, it is safe to build systems that scan users’ phones for prohibited content,” said Matthew Green, a security researcher at Johns Hopkins University.
“Whether they turn out to be right or wrong on that point hardly matters. This will break the dam — governments will demand it from everyone,” he added.
In addition to the CSAM detection system, Apple also announced that the iMessage app will warn children and their parents when they receive or send sexually explicit photos.
When children receive this type of content, Apple will automatically blur the photo and ask whether they really want to view it. It will also offer helpful resources and reassure them that it’s okay not to view it.
If they choose to view the photo or send similar photos to others, Apple will notify their parents by sending them a message.
Green said the new feature could be used by governments to demand that Apple conduct mass surveillance.
“Now Apple has demonstrated that they can build a surveillance system, for very specific purposes, that works with iMessage,” he said. “I wonder how long they’ll be able to hold out from the Chinese government?”
What do you think?