How Apple’s plan to go after child abuse images could affect you – Guide

Tech giant unveils tech to fight child exploitation, but privacy concerns remain.

Apple has long claimed to prioritize user privacy and security, but a new technology designed to detect child exploitation images on iPhones, iPads and Macs has sparked debate over the veracity of these promises.

Apple has announced a new feature for its upcoming software updates that will detect child exploitation images and videos stored on devices. The technology works by converting images into unique codes, known as hashes, which are then checked against a database of known child exploitation content maintained by the National Center for Missing and Exploited Children. If matches are found, Apple may investigate further. No release date has been announced; Apple recently delayed the rollout to make improvements and address privacy concerns.
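For readers who want a concrete picture of what “hash matching” means, here is a minimal Swift sketch of the general idea. It is not Apple’s implementation: Apple uses a perceptual “NeuralHash” and cryptographic matching protocols, while this sketch substitutes an ordinary SHA-256 digest and a plain set lookup, and all of the names below are invented for illustration.

```swift
import CryptoKit
import Foundation

// Illustrative sketch only: Apple's real system uses a perceptual "NeuralHash"
// and cryptographic protocols rather than a plain SHA-256 digest and set lookup.

/// Compute a hex fingerprint of an image's raw bytes.
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// Hypothetical list of fingerprints for known exploitation images, as it might
/// be shipped to a device. Empty here; a real list would come from child safety
/// organizations, never from the user's own photos.
let knownHashes: Set<String> = []

/// Returns true if the image's fingerprint appears in the known-hash set.
func matchesKnownContent(_ imageData: Data) -> Bool {
    knownHashes.contains(fingerprint(of: imageData))
}
```

The key point the sketch illustrates is that only fingerprints are compared; the photo’s content is never interpreted by this step.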

Why is Apple doing this now?

Apple said it has been working to combat child exploitation for some time, citing the more than 65 million reports of child exploitation material that the National Center for Missing and Exploited Children received last year. The tech giant noted this is a dramatic increase from the 401 reports the organization received two decades ago.

Thorn, a non-profit organization that fights child exploitation, estimates that the 65 million files of exploitative material reported last year are only a fraction of what’s actually out there. The head of Thorn noted that US law requires tech companies to report such material if they find it, but does not require them to actively search for it.

Tech giants Facebook, Microsoft, Twitter and Google (including YouTube) are using advanced technologies to detect and remove illegal content from their platforms.

Apple has been scanning emails and other files stored in iCloud since 2019, according to a statement from the company to 9to5Mac.

Apple’s new system stands out for its ability to scan users’ devices, rather than relying on data stored on the tech giant’s servers.

Apple’s new hash-scanning system applies only to photos stored in iCloud Photo Library. Users who don’t want their photos checked can effectively opt out by disabling iCloud Photos: images and videos kept only in the device’s Photos app won’t be hashed.

Can this system be abused?

Apple now faces a critical question: not whether it should combat child exploitation, but whether this particular approach is the right way to do it.

Privacy experts are concerned that Apple’s tools could be repurposed for surveillance, for example if the Chinese government were to add hashes of images from the 1989 Tiananmen Square protests to Apple’s child exploitation detection database.

Apple says it has built safeguards against that kind of abuse. Rather than scanning photos directly, the system compares hash codes against a database of known hashes that is stored on the device itself, not on a remote server. Keeping the database on the device makes it easier for security researchers to audit how the system behaves.

Apple’s head of software engineering, Craig Federighi, told The Wall Street Journal on August 13 that the company wanted a way to find this material in iCloud without looking at users’ photos, unlike other cloud services that scan and analyze every photo on their servers. He stressed that Apple is only matching fingerprints of known child abuse images, not analyzing any other kind of photo.

Apple also outlined a plan to keep its database of known child exploitation material, which is maintained with the National Center for Missing and Exploited Children, accurate and auditable. The company said it will publish a hash - a unique code - of the database each time the database is updated, and that it will only include entries provided by at least two separate child safety organizations. Security researchers can use the published hash to spot any changes to the database, and outside organizations can audit Apple’s systems.
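The auditability idea can be illustrated with a small sketch: derive one digest over the whole database so anyone holding a copy can check it against the value Apple publishes. The construction below (a SHA-256 over sorted entries) and the names are assumptions for illustration, not Apple’s actual scheme.

```swift
import CryptoKit
import Foundation

// Sketch of the auditability concept, not Apple's real construction.

/// Compute one hex digest over a sorted list of entry hashes.
func rootDigest(of entries: [String]) -> String {
    let combined = entries.sorted().joined(separator: "\n")
    return SHA256.hash(data: Data(combined.utf8))
        .map { String(format: "%02x", $0) }
        .joined()
}

/// A researcher could recompute the digest over the on-device copy and compare
/// it with the value published for that database version.
func databaseIsUntampered(localEntries: [String], publishedDigest: String) -> Bool {
    rootDigest(of: localEntries) == publishedDigest
}
```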

Is Apple digging through my photos?

The iconic baby-in-the-bathtub photo has been a popular subject for generations, from parents capturing their own children to DreamWorks’ 2017 animated comedy The Boss Baby making it a recurring joke.

No. Apple’s system converts photos into hash codes and checks them against a database of known child exploitation images; it does not examine the content of the photos themselves. The company says the chance of incorrectly flagging a given account is less than one in one trillion per year.

Apple has also built in a safeguard to ensure that innocent people are not reported to the National Center for Missing and Exploited Children (NCMEC). Whenever an account is flagged by the system, Apple conducts a human review before filing a report with NCMEC, so that errors or attacks on the system don’t lead to false reports.
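Conceptually, the reporting pipeline is gated twice: a match-count threshold and a human review. The sketch below illustrates that gate; the type names are invented, and the threshold of 30 reflects the rough figure Apple has discussed publicly rather than a confirmed constant.

```swift
// Illustrative gate only; names and the exact threshold are assumptions.

struct FlaggedAccount {
    let matchCount: Int              // number of photos matching known hashes
    let humanReviewConfirmed: Bool   // set only after a reviewer verifies the matches
}

let reportingThreshold = 30          // assumed; Apple has described a threshold on the order of 30 matches

func shouldFileReport(for account: FlaggedAccount) -> Bool {
    account.matchCount >= reportingThreshold && account.humanReviewConfirmed
}
```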

Is Apple reading my texts?

No. Apple’s child abuse detection system does not read text messages; it applies only to photos synced to iCloud Photos. The communication safety feature in the Messages app, described below, is a separate and distinct system.

On iPhones signed in to a child’s iCloud account, a new feature in the Messages app will blur sexually explicit images that are sent or received and warn the child before they view them. If the child views the image anyway, parents can be notified. Apple is also providing resources to help kids understand why these images are harmful.
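A rough sketch of that flow, assuming an on-device classifier that scores incoming images, might look like the following. Every name and the score cutoff here are hypothetical; Apple has not published this API.

```swift
// Hypothetical sketch of the Messages safety flow; not Apple's API.

struct IncomingImage {
    let explicitScore: Double        // assumed on-device classifier output, 0...1
}

struct AccountSettings {
    let isChildAccount: Bool         // the feature only applies to children's iCloud accounts
    let notifyParentsOnView: Bool    // parents can opt in to notifications
}

func handle(_ image: IncomingImage, for account: AccountSettings) {
    guard account.isChildAccount, image.explicitScore > 0.9 else { return }

    blurPreviewAndWarnChild()        // blur the image and explain why it was flagged

    if account.notifyParentsOnView {
        notifyParentsIfChildViewsImage()
    }
}

func blurPreviewAndWarnChild() { /* UI work would go here */ }
func notifyParentsIfChildViewsImage() { /* notification delivery would go here */ }
```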

Apple emphasized that the feature works on the device itself and does not give the company access to users’ messages.

This protection only applies to phones signed in with a child’s iCloud account, and it flags any sexually explicit image sent or received in Messages, as opposed to the iCloud Photos system, which checks images against a database of known child abuse material. That should keep ordinary photos of kids from being flagged.

What does Apple say?

Apple asserts that its system is designed with privacy as a priority, featuring safeguards to prevent the company from accessing the contents of users’ photo libraries and to reduce the risk of misuse.

Apple’s Federighi told The Wall Street Journal that miscommunication was to blame for much of the confusion, expressing regret that the company’s messaging was not clearer while emphasizing that Apple stands by its decisions.

Apple also stressed that the scanning feature is entirely separate from its plan to warn children when they receive sexually explicit images. Images flagged in Messages are never checked against, or added to, the child abuse image database; that feature is aimed at educating parents and children, not at detection.

Apple has delayed the release of its new software, citing the need to make improvements based on feedback. No new release date has been announced.

Final note

Apple is taking steps to combat child abuse, and this guide explains how it could affect you. If you have any questions, please reach out. Help spread the word by sharing this article with your friends.