Apple Launches New Security System That Detects Child Abuse Images In Your iCloud

The tech company said 'protecting children is an important responsibility'.

Stewart Perrie

Apple has announced a new security system that will scan your iCloud account for child sexual abuse material (CSAM).

The new technology will reportedly use image identification and hashing algorithms to detect known child sexual abuse imagery and other forms of exploitation.

It essentially checks whether any photo in your iCloud library matches one of the known abuse images catalogued by the National Center for Missing and Exploited Children.

If it comes back with a match, Apple can alert the Center, which can then pass the information on to police.
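
At its simplest, this kind of check amounts to computing a fingerprint (hash) for each photo and testing whether it appears in a set of fingerprints of known abuse images. The sketch below is a hypothetical illustration of that idea in Python; it uses an ordinary SHA-256 file hash as a stand-in for Apple's perceptual hashing, and the function names and hash values are invented for the example.

```python
import hashlib
from pathlib import Path

def fingerprint(photo_path: Path) -> str:
    """Return a fingerprint for a photo.

    A plain SHA-256 of the file bytes is used purely as a stand-in here;
    Apple's system derives a perceptual hash so that re-encoded or slightly
    edited copies of the same image still produce a matching fingerprint.
    """
    return hashlib.sha256(photo_path.read_bytes()).hexdigest()

# Hypothetical set of fingerprints of known abuse images (in Apple's design,
# the real list is supplied in hashed form by NCMEC).
KNOWN_CSAM_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def photo_matches_known_set(photo_path: Path) -> bool:
    """Check whether a photo's fingerprint appears in the known-hash set."""
    return fingerprint(photo_path) in KNOWN_CSAM_HASHES
```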

Apple explained: "Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the unreadable set of known CSAM hashes.

"This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result.

"Private set intersection (PSI) allows Apple to learn if an image hash matches the known CSAM image hashes, without learning anything about image hashes that do not match. PSI also prevents the user from learning whether there was a match."

The company said 'protecting children is an important responsibility' and that its efforts will 'evolve and expand over time'.

Apple added that each flagged account will be manually reviewed by a human before it is officially reported.

If a report is made, the tech company said it has the power to shut down the iCloud account pending further investigation.

People will be able to file an appeal if their account is locked, and Apple will also use a threshold system designed to prevent people from being wrongly implicated by the technology.
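
In other words, an account is only escalated once the number of matches crosses a threshold, rather than on a single hit. The snippet below is a purely hypothetical sketch of that behaviour; the threshold value and names are invented, and Apple's described design enforces the limit cryptographically rather than with a plain counter.

```python
# Invented threshold value, used only for illustration.
MATCH_THRESHOLD = 10

def should_escalate_for_review(match_count: int) -> bool:
    """Flag an account for human review only once matches reach the threshold.

    In Apple's described design this gate is enforced cryptographically
    (match data cannot even be read below the threshold); a simple counter
    is used here only to show the intended effect: one or two matches
    alone trigger nothing.
    """
    return match_count >= MATCH_THRESHOLD
```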

Apple hopes the new system will still protect users' privacy; however, some individuals and groups will no doubt be concerned about how far the technology could go.

President and CEO of the National Center for Missing and Exploited Children, John Clark, said: "Apple's expanded protection for children is a game changer. With so many people using Apple products, these new safety measures have lifesaving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material."

Apple will also be rolling out a new feature for Messages that will help protect users from seeing sensitive material.

If a child is part of a family iCloud account, they will get a warning if they receive or try to send something sexually explicit.

When they receive something sensitive, the image will be blurred, and if the child taps 'View Photo' they'll get a pop-up message explaining why the image is considered sensitive.

If they proceed, a family member will be notified of what has happened and prompted to ask their child if they are 'okay'.

When a child tries to send a sexually explicit photo, Apple will again warn them first, and a parent will be notified if a child under the age of 13 sends it anyway.
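
The reported flow can be summarised as a small piece of decision logic. The sketch below is a hypothetical Python illustration of the behaviour as described (blur on receipt, warn before viewing or sending, notify a family member only when the child proceeds, and only for under-13s on sending); the names are invented and this is not Apple's implementation.

```python
from dataclasses import dataclass

@dataclass
class ChildAccount:
    age: int
    in_family_icloud: bool

def handle_sensitive_image(child: ChildAccount, direction: str, proceeds: bool) -> list[str]:
    """Return the actions taken for a sexually explicit image, per the reported flow."""
    if not child.in_family_icloud:
        return []  # the feature only applies to child accounts in a family group

    actions = []
    if direction == "receive":
        actions.append("blur image")
        actions.append("warn child before 'View Photo'")
        if proceeds:
            actions.append("notify a family member")
    else:  # direction == "send"
        actions.append("warn child before sending")
        if proceeds and child.age < 13:
            actions.append("notify parent")
    return actions

print(handle_sensitive_image(ChildAccount(age=11, in_family_icloud=True), "receive", True))
# -> ['blur image', "warn child before 'View Photo'", 'notify a family member']
```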

Featured Image Credit: PA

Topics: News, Technology, Apple