

ACM TechNews

Policy Groups Ask Apple to Drop Plans to Inspect iMessages, Scan for Abuse Images



An open letter from more than 90 policy and rights groups worldwide calls on Apple to drop a plan to scan children's messages for nudity and adults' phones for images of child sex abuse.

The groups expressed concern that the capabilities "will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children."

The campaign, organized by the Center for Democracy & Technology, focuses on concerns over encryption and privacy.

Most of the complaints involve on-device scanning, but the letter also cites changes to iMessage in family accounts that would break its end-to-end encryption.

The letter said, "Once this backdoor feature is built in, governments could compel Apple to extend notification to other accounts, and to detect images that are objectionable for reasons other than being sexually explicit."

From Reuters

Abstracts Copyright © 2021 SmithBucklin, Washington, DC, USA


 
