Apple to Scan Photos on All US iPhones For ‘Child Abuse Imagery’ [Tweet]

US iPhone users’ photos will be scanned by Apple’s automated “neuralMatch” system for images of child sexual abuse, according to reports. Experts accused the company of giving up on privacy “to make 1984 possible.”

The Financial Times reported on the plan Thursday, citing anonymous sources briefed on Apple’s plans. The scheme was reportedly shared with some US academics earlier in the week in a virtual meeting.

Dubbed “neuralMatch,” the system will reportedly scan every photo uploaded to iCloud in the US and tag it with a “safety voucher.” Once a certain number of photos – the threshold was not specified – are labeled as suspect, Apple will decrypt them and pass them to human reviewers, who can then contact the relevant authorities if the imagery is verified as illegal, the FT report said. The program is initially intended to be rolled out in the US only.
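To illustrate the flow described above – per-photo vouchers, a match count, and escalation only past a threshold – here is a minimal, purely hypothetical sketch. It is not Apple’s implementation: the names (SafetyVoucher, knownAbuseHashes, matchThreshold) and the use of simple string hashes instead of a perceptual hash are assumptions made for illustration only.

```swift
import Foundation

// Hypothetical sketch of the reported flow; NOT Apple's actual code or algorithm.

struct SafetyVoucher {
    let photoID: String
    let matchesKnownHash: Bool   // result of comparing the photo's hash to a known-image database
}

// Assumed stand-in for a database of hashes of known illegal images.
let knownAbuseHashes: Set<String> = ["a1b2c3", "d4e5f6"]

// Number of flagged photos required before anything is escalated (the real value was not specified).
let matchThreshold = 10

func makeVoucher(photoID: String, photoHash: String) -> SafetyVoucher {
    SafetyVoucher(photoID: photoID, matchesKnownHash: knownAbuseHashes.contains(photoHash))
}

func shouldEscalateForHumanReview(vouchers: [SafetyVoucher]) -> Bool {
    // Per the FT description, only once enough photos are labeled as suspect
    // would they be decrypted and handed to human reviewers.
    vouchers.filter { $0.matchesKnownHash }.count >= matchThreshold
}

// Example: 20 uploads, half of which match a known hash, so the threshold is met.
let vouchers = (1...20).map { i in
    makeVoucher(photoID: "photo-\(i)", photoHash: i % 2 == 0 ? "a1b2c3" : "zzzzzz")
}
print(shouldEscalateForHumanReview(vouchers: vouchers))  // prints: true
```

The point of the threshold in this sketch is that no single match triggers review on its own; only an accumulation of flagged photos does, which matches how the reporting characterizes the system.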

The plan was described as a compromise between Apple’s promise to protect customer privacy and demands from the US government, intelligence and law enforcement agencies, and child safety activists to help them battle terrorism and child pornography.

Researchers who found out about the plan were alarmed, however. Matthew Green, a security professor at Johns Hopkins University, was the first to tweet about the issue in a lengthy thread late on Wednesday.

Forsided, 06.08.2021
