Apple Is Scanning Your Photos
Apple's senior director of global privacy has confirmed that the company scans photos uploaded to iCloud for evidence of illegal activity such as child sexual abuse.
Jane Horvath made the admission while speaking at the Consumer Electronics Show (CES) 2020 in Las Vegas yesterday, according to The Telegraph.
Speaking at the conference, Horvath said that photographs backed up to Apple's online storage service are automatically screened for illicit content.
The company has been criticized by law enforcement agencies for allowing criminals to hide behind layers of protective encryption and for refusing to break into the phones of suspected wrongdoers.
Addressing this issue yesterday in Las Vegas, Horvath said that giving criminals nowhere to hide by scrapping encryption was "not the way we’re solving these issues" but added: "We are utilizing some technologies to help screen for child sexual abuse material."
Exactly which technologies Apple is using to screen its customers' digital photographs, and how long it has been doing so, was not specified.
On the company's website it states: "Apple is dedicated to protecting children throughout our ecosystem wherever our products are used, and we continue to support innovation in this space.
"As part of this commitment, Apple uses image matching technology to help find and report child exploitation. Much like spam filters in email, our systems use electronic signatures to find suspected child exploitation."
Companies including Facebook, Google, and Twitter check for images depicting the sexual abuse of minors with Microsoft’s PhotoDNA system. The system uses hashing technology to check images posted online against a database of previously identified photographs.
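PhotoDNA's hashing algorithm itself is proprietary, but the matching step it enables is simple to sketch. The following minimal Python illustration assumes a plain SHA-256 digest in place of PhotoDNA's perceptual hash, and a hypothetical KNOWN_HASHES set standing in for the database of previously identified photographs:

```python
import hashlib
from pathlib import Path

# Hypothetical database of signatures of previously identified images.
# Real systems such as PhotoDNA use robust perceptual hashes; plain
# SHA-256 is used here only to illustrate the lookup step.
KNOWN_HASHES = {
    "placeholder-digest",  # entries would be hex digests in practice
}

def image_signature(path: Path) -> str:
    """Return a hex digest of the raw image bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def is_flagged(path: Path) -> bool:
    """Flag the image if its signature matches a known entry."""
    return image_signature(path) in KNOWN_HASHES
```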
Paul Bischoff, privacy advocate at Comparitech.com, believes that Apple may be doing something similar.
"Here's what I think is happening: Apple has access to a law enforcement database of child abuse photos. Apple hashes or encrypts those photos with each user's security key (password) to create unique signatures. If the signatures of any encrypted photos uploaded from an iPhone match the signatures from the database, then the photo is flagged and presumably reported to authorities.
"This allows Apple to match photos uploaded to the cloud against the law enforcement database without ever breaking encryption or actually viewing the photos."
If this is indeed the system Apple is using, Bischoff warns that it has a serious flaw.
He said: "If a child abuse photo is cropped or edited, if it's converted to another type of image file, or if it's compressed, then the encrypted signatures won't match up." Source: Information Security Magazine