Apple To Notify Parents If Children Send Or Receive Sexual Photos

It's one of a number of new measures aimed at tackling the spread of child sexual abuse material

Jake Massey

Apple has announced that it will notify parents if their children send or receive sexual photos on their iPhones and iPads.

A new feature in the Messages app will warn children and their parents using linked family accounts when sexually explicit photos are sent or received, with on-screen alerts appearing and images blocked from view.

It is hoped the tools will help to protect children.

The tool will reassure children that it is OK if they do not want to view the image, as well as presenting them with helpful resources.

It will also inform them that, as an extra precaution, if they do choose to view the image, their parents will be sent a notification.

Similar protections will be in place if a child attempts to send a sexually explicit image, Apple said.

It's one of a number of new child safety tools designed to protect young people and limit the spread of child sexual abuse material (CSAM).

Among the features is new technology that will allow Apple to detect known CSAM images stored in iCloud Photos and report them to law enforcement agencies.

Guidance in Siri and Search will also point users to relevant resources when they perform searches related to CSAM.

The tools are set to be introduced as part of the iOS 15 and iPadOS 15 software updates in the autumn, initially in the US only, with plans to expand further over time.

Apple said the detection tools have been designed to protect user privacy and do not allow the tech giant to see or scan a user's photo album.

Some people have concerns about the privacy implications of the new tools.

Instead, the system will look for matches securely on the device, based on a database of 'hashes' (a type of digital fingerprint) of known CSAM images provided by child safety organisations.

This matching will only take place when a user attempts to upload an image to their iCloud Photo Library.

Apple said it would only be able to manually review the content to confirm a match, and then send a report to child safety organisations, once a threshold of matches for harmful content had been exceeded.
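
For illustration only, here is a minimal sketch of the kind of on-device matching the article describes: an image's 'fingerprint' is compared against a database of known hashes, and human review is only triggered once the number of matches passes a threshold. The function names, the use of plain SHA-256 and the threshold logic are assumptions made for the example; Apple's actual system is understood to rely on perceptual hashing and privacy-preserving cryptography rather than exact file hashes.

    import Foundation
    import CryptoKit

    // Illustrative sketch only: real CSAM detection uses perceptual hashing and
    // privacy-preserving matching, not a plain SHA-256 of the file bytes.
    // The hash database and threshold below are hypothetical stand-ins.

    /// Hex-encoded SHA-256 digest of an image's raw bytes.
    func fingerprint(of imageData: Data) -> String {
        let digest = SHA256.hash(data: imageData)
        return digest.map { String(format: "%02x", $0) }.joined()
    }

    /// Counts uploads whose fingerprint appears in a known-image database and
    /// flags the account for manual review only once a threshold is exceeded.
    struct UploadScanner {
        let knownHashes: Set<String>   // hashes supplied by child safety organisations (hypothetical)
        let reviewThreshold: Int       // number of matches allowed before human review
        private(set) var matchCount = 0

        mutating func scan(_ imageData: Data) -> Bool {
            guard knownHashes.contains(fingerprint(of: imageData)) else { return false }
            matchCount += 1
            return matchCount > reviewThreshold   // true = escalate for manual review
        }
    }

In practice an exact hash like SHA-256 would miss images that have been resized or re-encoded, which is why systems of this kind use perceptual fingerprints that tolerate such changes.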

The company reiterated that the new CSAM detection tools would only apply to those using iCloud Photos and would not allow the firm or anyone else to scan the images on a user's camera roll.

However, critics argue the tools represent a worrying invasion of privacy.

According to the Daily Mail, Ross Anderson, professor of security engineering at Cambridge University, said: "It is an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of our phones and laptops."

Featured Image Credit: Pexels/Porapak Apichodilok

Topics: US News, Technology, Apple, iPhone