Apple To Scan iPhones For Child Sexual Abuse Images; Child Protection Groups Welcome Move But Privacy Advocates Are Concerned

New Delhi: Apple Inc. has revealed a new tool that will scan U.S. iPhones for images of child sexual abuse. While child protection groups have welcomed the new technology, many security researchers have warned that it could be misused.

Apple also plans to scan its users’ encrypted messages for sexually explicit content as an additional child safety measure, a move that has alarmed privacy advocates.

The tool, called “neuralMatch”, detects known images of child sexual abuse by scanning photos before they are uploaded to iCloud. If it finds a match, the image will be reviewed by a human; if it is confirmed to be child pornography, the user’s account will be disabled and the National Center for Missing and Exploited Children notified.
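In broad terms, systems of this kind compare a “fingerprint” (hash) of each photo against a database of fingerprints of known abuse images supplied by child-safety organisations. The short sketch below is only a minimal illustration of that matching idea, not Apple’s actual NeuralHash technology: the function names, the use of an exact SHA-256 digest (real systems use perceptual hashes that tolerate visual changes), and the placeholder database are assumptions made here for illustration.

```python
import hashlib
from pathlib import Path

# Illustrative placeholder: in practice, fingerprints of known abuse images
# would be supplied by organisations such as NCMEC, not hard-coded.
KNOWN_ABUSE_FINGERPRINTS: set[str] = set()


def fingerprint(image_path: Path) -> str:
    """Return a hex digest standing in for an image fingerprint.

    Real systems use perceptual hashes that match visually similar images;
    an exact cryptographic hash is used here only to keep the sketch simple.
    """
    return hashlib.sha256(image_path.read_bytes()).hexdigest()


def should_flag_for_review(image_path: Path) -> bool:
    """Flag the image for human review if its fingerprint matches a known one."""
    return fingerprint(image_path) in KNOWN_ABUSE_FINGERPRINTS
```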

The tool has also raised privacy concerns. Matthew Green, a top cryptography researcher at Johns Hopkins University, warned that innocent people could be framed by being sent seemingly harmless images designed to trigger matches, fooling Apple’s algorithm and alerting law enforcement. He said researchers have been able to trick such systems fairly easily.

Tech companies including Microsoft, Google, and Facebook have for years been sharing digital fingerprints of known child sexual abuse images. Apple has used those fingerprints to scan user files stored in its iCloud service, which is not as securely encrypted as on-device data, for child pornography. Apple, however, has been under pressure from authorities for access to information to help investigate crimes such as terrorism or child sexual exploitation, having been one of the first major companies to embrace “end-to-end” encryption.

Researchers are also concerned that the technology could be used for government surveillance, especially of dissidents or protesters. The Washington-based nonprofit Center for Democracy and Technology called on Apple to abandon the changes, which it said effectively destroy the company’s guarantee of “end-to-end” encryption. Scanning messages for sexually explicit content on phones or computers effectively breaks that security, it was quoted as saying by AP.

However, child rights activists and protection groups have praised the tool.

“Apple’s expanded protection for children is a game-changer,” John Clark, the president and CEO of the National Center for Missing and Exploited Children, said in a statement according to an Associated Press report. “With so many people using Apple products, these new safety measures have the lifesaving potential for children.”

Similarly, Julia Cordua, the CEO of Thorn, a nonprofit founded by actors Demi Moore and Ashton Kutcher that uses technology to help protect children from sexual abuse, said that Apple’s technology balances “the need for privacy with digital safety for children.”
