Apple Introduces New Child Safety Features Across Messages, iCloud, Siri & Search

The CSAM (Child Sexual Abuse Material) detection features are coming later this year in iOS 15, iPadOS 15, and macOS Monterey.

Prajwal Pazare
Mac O’Clock

--

What is CSAM?

CSAM stands for Child Sexual Abuse Material. Apple’s CSAM Detection is designed to accurately identify and report iCloud users who store known CSAM in their iCloud Photos accounts.

Apple is set to release these new child safety features across its platforms, aimed at protecting children online. New tools in Messages will let parents be better informed when their children send or receive sexually explicit content while communicating online. A second system uses cryptographic techniques to detect collections of known CSAM stored in iCloud Photos, so that Apple can provide information to law enforcement. Apple is also updating Siri & Search to offer CSAM-related guidance.

CSAM Detection in iCloud Photos and Communication Safety in Messages use completely different technologies, but both are aimed at the same goal: keeping children safe online.

How does it work in iCloud Photos?

Apple starts with a database of known CSAM image hashes provided by NCMEC and transforms it into an unreadable set of hashes that is securely stored on users’ devices. Before an image is stored in iCloud Photos, an on-device matching process compares the image’s hash against that set using cryptographic techniques, so that matches against known CSAM can be detected without exposing the image itself or revealing the contents of the database.
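
To make the matching idea concrete, here is a minimal Swift sketch. It is a simplification, not Apple’s implementation: the real system uses a perceptual hash (NeuralHash) and private set intersection so the device never learns the match result, while this sketch uses a plain SHA-256 digest, an ordinary set lookup, and hypothetical type names.

```swift
import Foundation
import CryptoKit

// Simplified sketch of on-device matching against a database of known hashes.
// All names are hypothetical; Apple's real system uses a perceptual hash and
// private set intersection rather than SHA-256 and a plain Set lookup.
struct KnownHashDatabase {
    private let knownHashes: Set<Data>   // stands in for the "unreadable" hash set on device

    init(knownHashes: Set<Data>) {
        self.knownHashes = knownHashes
    }

    func contains(_ imageHash: Data) -> Bool {
        knownHashes.contains(imageHash)
    }
}

// Hash an image's bytes. A perceptual hash would tolerate minor edits to the
// image; SHA-256 is used here only as a placeholder.
func imageHash(for imageData: Data) -> Data {
    Data(SHA256.hash(data: imageData))
}

// A stand-in for the "safety voucher" attached to each upload: it records the
// match result without exposing the photo itself.
struct SafetyVoucher {
    let imageHash: Data
    let matchedKnownCSAM: Bool
}

func makeVoucher(for imageData: Data, against database: KnownHashDatabase) -> SafetyVoucher {
    let hash = imageHash(for: imageData)
    return SafetyVoucher(imageHash: hash, matchedKnownCSAM: database.contains(hash))
}
```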

A diagram of how CSAM Detection works in iCloud Photos

Apple says this process is secure and is expressly designed to preserve user privacy, ensuring that it can provide information about criminal activity to the proper authorities without threatening the private information of law-abiding users.

Communication Safety in Messages

Expanding its protection of children online, Apple will also debut a new feature in the Messages app across devices that shows warnings to children and parents when sexually explicit images are sent or received. Messages uses on-device machine learning to analyze image attachments and determine whether a photo is sexually explicit; if it is, the photo is automatically blurred, the child is warned, and parents can be notified.
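
As a rough illustration of that flow (not Apple’s implementation), the sketch below runs an incoming image through a hypothetical on-device classifier and decides whether to blur it and warn the child; the classifier protocol, score threshold, and parental-notification rule are all assumptions.

```swift
import Foundation

// Hypothetical sketch of the Communication Safety decision in Messages.
// The classifier, threshold, and notification rule are stand-ins; Apple's
// on-device model and exact policies are not public.

protocol ExplicitImageClassifier {
    // Returns a score in 0...1, where higher means more likely sexually explicit.
    func explicitnessScore(for imageData: Data) -> Double
}

enum IncomingImageAction {
    case showNormally
    case blurWithWarning(notifyParents: Bool)
}

struct CommunicationSafetyPolicy {
    let classifier: ExplicitImageClassifier
    let scoreThreshold: Double   // assumed cutoff, e.g. 0.8
    let childIsYoung: Bool       // parental notification is meant for younger children

    func action(for imageData: Data) -> IncomingImageAction {
        let score = classifier.explicitnessScore(for: imageData)
        guard score >= scoreThreshold else { return .showNormally }
        // Flagged: blur the photo, warn the child, and optionally notify parents.
        return .blurWithWarning(notifyParents: childIsYoung)
    }
}
```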

An example of how Communication Safety works in Messages

Siri & Search updates

Apple said it will expand guidance in Siri and Spotlight Search across its devices, providing additional resources to help children and parents stay safe online and get help in unsafe situations.

For example, users who ask Siri how they can report CSAM or child exploitation will be pointed to resources for where and how to file a report.

As Apple notes, “these interventions will explain to users that interest in this topic is harmful and problematic, and provide resources from partners to get help with this issue.”
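
A toy Swift sketch of that kind of intervention is below. The query patterns, messages, and resource names are illustrative assumptions, not Apple’s actual lists.

```swift
import Foundation

// Toy sketch of a Siri/Search intervention: match a query against a few
// sensitive patterns and return guidance instead of ordinary results.
// The keywords and resource strings are illustrative assumptions.

struct Intervention {
    let message: String
    let resources: [String]
}

func intervention(for query: String) -> Intervention? {
    let lowered = query.lowercased()

    // Queries about reporting abuse are pointed to where and how to file a report.
    if lowered.contains("report csam") || lowered.contains("report child exploitation") {
        return Intervention(
            message: "Here is where and how to file a report.",
            resources: ["NCMEC reporting resources"]
        )
    }

    // Queries seeking CSAM itself trigger a warning plus help resources.
    if lowered.contains("csam") {
        return Intervention(
            message: "Interest in this topic is harmful and problematic.",
            resources: ["Resources from partner organizations offering help"]
        )
    }

    return nil   // No intervention; fall through to normal results.
}
```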

Apple Privacy Assurances

  • Apple does not learn anything about images that do not match the known CSAM database.
  • Apple can’t access metadata or visual derivatives for matched CSAM images until a threshold of matches is exceeded for an iCloud Photos account (a simplified sketch of this threshold idea follows this list).
  • The risk of the system incorrectly flagging an account is extremely low. In addition, Apple manually reviews all reports made to NCMEC to ensure reporting accuracy.
  • Users can’t access or view the database of known CSAM images.
  • Users can’t identify which images were flagged as CSAM by the system.
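
To illustrate that threshold point, here is a deliberately simplified stand-in: Apple’s design enforces the threshold cryptographically with threshold secret sharing, whereas this sketch just counts vouchers per account with a trusted counter.

```swift
import Foundation

// Simplified stand-in for the match threshold: vouchers accumulate per account
// and only become reviewable once their count exceeds a threshold. In Apple's
// design this is enforced cryptographically (the data can only be decrypted
// past the threshold), not by a trusted counter as shown here.

struct MatchedVoucher {
    let accountID: String
    let encryptedVisualDerivative: Data   // opaque until the threshold is exceeded
}

final class ThresholdGate {
    private let threshold: Int
    private var vouchersByAccount: [String: [MatchedVoucher]] = [:]

    init(threshold: Int) {
        self.threshold = threshold
    }

    // Returns the account's vouchers for human review only once the threshold is exceeded.
    func record(_ voucher: MatchedVoucher) -> [MatchedVoucher]? {
        vouchersByAccount[voucher.accountID, default: []].append(voucher)
        let vouchers = vouchersByAccount[voucher.accountID] ?? []
        return vouchers.count > threshold ? vouchers : nil
    }
}
```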

“At Apple, our goal is to create technology that empowers people and enriches their lives — while helping them stay safe,” the company wrote in a press release.

Users on Reddit are voicing concerns about these features. As one commenter put it, “it’s ridiculous that the more you read it, the more creepy it sounds.”

Thanks for reading. I hope you liked my blog on Apple’s CSAM features. If you did, please consider following and giving some claps.

You can also connect with me on Twitter or LinkedIn.
