Apple’s forthcoming feature that will scan iOS devices for images of child abuse is part of an “important mission,” a software vice president at the company wrote in an internal memo. First reported by 9to5Mac, the memo by Sebastian Marineau-Mes acknowledges that the new protections have some people “worried about the implications,” but says the company will “maintain Apple’s deep commitment to user privacy.”
As part of its Expanded Protections for Children, Apple plans to scan images on iPhones and other devices before they are uploaded to iCloud. If it finds an image that matches one in the National Center for Missing and Exploited Children (NCMEC) database, a human reviewer at Apple will confirm whether it contains child pornography. If it does, NCMEC will be notified and the user’s account will be disabled.
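In rough terms, the flow described above amounts to hashing an image on the device and checking it against a database of known hashes before upload. The sketch below is purely illustrative and is not Apple’s implementation: the real system uses a perceptual “NeuralHash” and a blinded, privacy-preserving match, whereas this stand-in uses a plain SHA-256 digest and a hypothetical set of known digests.

```swift
import Foundation
import CryptoKit

// Hypothetical set of digests standing in for the NCMEC hash database.
// Apple's actual system does not expose the database in plain form.
let knownDigests: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
]

// Hex digest of the image bytes: an illustrative stand-in for a
// perceptual hash, which would tolerate resizing and re-encoding.
func imageDigest(of url: URL) throws -> String {
    let data = try Data(contentsOf: url)
    return SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

// Flag an image for human review if its digest matches a known entry.
// In the flow described above, a reviewer confirms the match before
// NCMEC is notified and the account is disabled.
func shouldFlagBeforeUpload(_ url: URL) -> Bool {
    guard let digest = try? imageDigest(of: url) else { return false }
    return knownDigests.contains(digest)
}
```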
The announcement raised concerns among privacy advocates who questioned how Apple could prevent the system from being exploited by bad actors. The Electronic Frontier Foundation said in a statement that “it’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children” and that the system, however well-intended, “will break key promises of the messenger’s encryption itself and open the door to broader abuses.”
According to 9to5Mac, Marineau-Mes wrote in the memo that the project involved “deep cross-functional commitment” across the company that “delivers tools to protect children, but also maintain Apple’s deep commitment to user privacy.”
Apple did not immediately reply to a request for comment Friday.