Tech giant Apple has announced a new system, named NeuralHash, to detect child sexual abuse imagery in the United States. The system will check photos on iPhones before they are uploaded to the company's iCloud service.
In the announcement, made on Thursday, Apple said it can initiate a human review and report the user to law enforcement if the system detects such imagery.
Google, Microsoft, and Facebook already have systems in place that check images against a database of known child sexual abuse imagery.
The system is designed to minimize false positives, the odds of which the company puts at one in one trillion.
The new system introduced by Apple is also designed to catch images of child sexual abuse that have either been edited or are similar to ones known to law enforcement.
Apple believes the new system can balance the need to help law enforcement officials stem child sexual abuse with the protection of user privacy.
Law enforcement agencies in the US maintain a database of child sexual abuse images, translated into hashes that positively identify those images; however, such codes cannot be used to reconstruct the images themselves. Apple's new system will create a hash of each image being uploaded to the iCloud service and compare it against the existing database.
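The hash-and-compare step described above can be sketched in a few lines of code. This is an illustrative toy only, not Apple's actual NeuralHash algorithm: the hashes, the database values, and the bit-difference threshold below are all hypothetical. It shows the general idea behind perceptual hashing, where a match is declared if a photo's hash is within a small number of differing bits of any known hash, so lightly edited copies can still be caught.

```python
# Toy illustration of perceptual-hash matching (NOT Apple's NeuralHash).
# Hashes are modeled as small integers; real perceptual hashes are much longer.

def hamming_distance(a: int, b: int) -> int:
    """Count the bits that differ between two hashes."""
    return bin(a ^ b).count("1")

def matches_database(photo_hash: int, known_hashes: set, max_bits: int = 4) -> bool:
    """Return True if the photo's hash is within max_bits of any known hash.

    The tolerance lets slightly edited copies of a known image still match;
    max_bits=4 is an arbitrary value chosen for this example.
    """
    return any(hamming_distance(photo_hash, h) <= max_bits for h in known_hashes)

# Hypothetical database of hashes of known images (random values for illustration).
database = {0xA3F1, 0x5C2E}

print(matches_database(0xA3F0, database))  # differs by 1 bit -> True
print(matches_database(0x0000, database))  # far from both hashes -> False
```

In a real deployment the threshold and hash length are tuned so that unrelated images almost never collide, which is what Apple's one-in-one-trillion false-positive claim refers to.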
Apple has also promised a human review before sending a user's information to law enforcement agencies.
Images will be checked before they reach the cloud servers. However, the system applies only to photos being uploaded to iCloud; photos that remain solely on the iPhone will not be checked.
Users who feel their accounts were suspended improperly will have the right to appeal, the company said.
"These new safety measures have the lifesaving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material," National Center for Missing & Exploited Children CEO John Clark said in a statement.
Despite the system's apparent benefits, cryptographers suspect it could enable authoritarian overreach.
Matthew Green, a cryptographer at Johns Hopkins University, said the system raises significant concerns. According to him, malicious parties could send seemingly harmless images, carefully designed to trick the system, to individuals in order to frame an innocent person.
Green said there is also a concern about what the technology permits.
"What happens when the Chinese government says, 'Here is a list of files that we want you to scan for," Green said. "Does Apple say no? I hope they say no, but their technology won't say no."
Green said he believes Apple has "sent a very clear signal" that in its view, "it is safe to build systems that scan users' phones for prohibited content."