Apple reignites encryption debate
Apple claims it can flag images of child sexual abuse without weakening encryption, but critics warn the tool could be exploited by others.
Apple’s announcement that it will scan encrypted messages for evidence of child sexual abuse has reignited the debate over online encryption and privacy, raising fears that the same technology could be used for government surveillance.
The iPhone maker said its initiative would “help protect children from predators who use communications tools to recruit and exploit them, and limit the spread of child sexual abuse material.”
The move represents a major shift for Apple, which until recently resisted efforts to weaken the encryption that prevents third parties from viewing private messages.
Apple argued in a technical paper that the technology, developed by cryptography experts, “is secure and is expressly design to preserve user privacy.”
The company said it would have limited access to flagged images, which would be reported to the National Center for Missing and Exploited Children, a nonprofit organization.
However, encryption experts and the private sector have warned that the tool could be exploited for other purposes, potentially opening the door to mass surveillance.
“This kind of tool could be a boon for finding child pornography in people’s phones. But imagine what it could do in the hands of an authoritarian government?” Matthew Green, a cryptographer at Johns Hopkins University, said in a tweet.
Others have warned that the move could be a first step toward weakening encryption and opening “backdoors” that could be exploited by hackers or governments.
“There’s going to be enormous pressure on Apple from governments around the world to expand this capability to detect other types of ‘bad’ content, and significant interest from attackers across the spectrum to find ways to exploit it,” tweeted Matt Blaze, a Georgetown University computer scientist and cryptography researcher.
Blaze said the implementation is “potentially very risky” because Apple has moved from analyzing data on its services to the phone itself and “has potential access to all of your local data.”
In this file photo taken on September 20, 2019, a woman looks at her cellphone as she walks past an advertisement for the new iPhone 11 Pro smartphone at an Apple store in Hong Kong.
Tools to protect children
The new image monitoring feature is part of a series of tools coming to Apple mobile devices, the company said.
Apple’s texting app, Messages, will use machine learning to recognize and alert children and their parents when they receive or send sexually explicit photos, the company said in a statement.
“When receiving this type of content, the photo will be blurred and the child will be notified,” Apple said.
“Apple’s expanded protections for children are a game changer,” said John Clark, president of the nonprofit NCMEC.
The move comes after years of standoffs involving tech companies and law enforcement.
Apple notably resisted a legal effort to weaken iPhone encryption to allow authorities to read messages from a suspect in a 2015 bombing in San Bernardino, California.
FBI officials have warned that so-called “end-to-end encryption,” where only the user and recipient can read messages, can protect criminals, terrorists and pornographers even when authorities have a legal warrant for an investigation.