Apple said Friday that it will make some changes to its plan to have iPhones and other devices scan user photos for child sexual-abuse images. But Apple said it still intends to implement the system after making "improvements" to address criticisms.

Apple provided this statement to Ars and other news organizations today:

"Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."

The statement is vague and doesn't say what kinds of changes Apple will make or even what kinds of advocacy groups and researchers it will collect input from. But given the backlash Apple has received from security researchers, privacy advocates, and customers concerned about privacy, it seems likely that Apple will try to address concerns about user privacy and the possibility that Apple could give governments broader access to customers' photos.

Privacy groups warned of government access

It isn't clear how Apple could implement the system in a way that eliminates its critics' biggest privacy concerns. Apple has claimed it would refuse government demands to expand photo-scanning beyond CSAM. But privacy and security advocates argue that once the system is deployed, Apple likely won't be able to avoid giving governments more user content.

"Once this capability is built into Apple products, the company and its competitors will face enormous pressure, and potentially legal requirements, from governments around the world to scan photos not just for CSAM, but also for other images a government finds objectionable," 90 policy groups from the US and around the world said in an open letter to Apple last month. "Those images may be of human rights abuses, political protests, images companies have tagged as 'terrorist' or violent extremist content, or even unflattering images of the very politicians who will pressure the company to scan for them. And that pressure could extend to all images stored on the device, not just those uploaded to iCloud. Thus, Apple will have laid the foundation for censorship, surveillance and persecution on a global basis."

Apple previously announced that devices with iCloud Photos enabled will scan images before they are uploaded to iCloud. Given that an iPhone uploads every photo to iCloud right after it is taken, the scanning of new photos would happen almost immediately if a user has previously turned iCloud Photos on.

Apple has said it will also add a tool to the Messages application that will "analyze image attachments and determine if a photo is sexually explicit." The system will be optional for parents, who can enable it in order to have Apple devices "warn children and their parents when receiving or sending sexually explicit photos."

Apple initially said it would roll the changes out later this year, in the US only at first, as part of updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey. Apple's promise to "take additional time over the coming months to collect input and make improvements" suggests the scanning system could be implemented later than Apple intended, but the company never provided a firm release date to begin with.

Apple called system an advancement in privacy

As we've previously written, Apple says its CSAM-scanning technology "analyzes an image and converts it to a unique number specific to that image" and flags a photo when its hash is identical or nearly identical to the hash of any that appear in a database of known CSAM. An account can be reported to the National Center for Missing and Exploited Children (NCMEC) when about 30 CSAM photos are detected, a threshold Apple set to ensure that there is "less than a one in one trillion chance per year of incorrectly flagging a given account." That threshold could be changed in the future to maintain the one-in-one-trillion false-positive rate.

Apple has argued that its system is actually an advancement in privacy because it will scan photos "in the most privacy-protecting way we can imagine and in the most auditable and verifiable way possible."

"If you look at any other cloud service, they currently are scanning photos by looking at every single photo in the cloud and analyzing it. We wanted to be able to spot such photos in the cloud without looking at people's photos and came up with an architecture to do this," Craig Federighi, Apple's senior VP of software engineering, said last month.
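The matching scheme Apple describes, reducing each photo to a perceptual hash, flagging a photo whose hash is identical or nearly identical to one in the known-CSAM database, and reporting an account only after roughly 30 matches, can be sketched as below. This is purely illustrative: the hash width, the Hamming-distance cutoff used to stand in for "nearly identical," and the toy hash values are all invented here, since Apple has not published its NeuralHash system at this level of detail.

```python
# Hypothetical sketch of perceptual-hash matching with a reporting threshold.
# All constants and hash values below are invented for illustration only.

REPORT_THRESHOLD = 30   # ~30 flagged photos before an account can be reported
NEAR_MATCH_BITS = 3     # invented cutoff for "nearly identical" hashes

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two fixed-width hashes."""
    return bin(a ^ b).count("1")

def is_flagged(photo_hash: int, known_hashes: set) -> bool:
    """Flag a photo whose hash is identical or nearly identical to a known hash."""
    return any(hamming_distance(photo_hash, h) <= NEAR_MATCH_BITS
               for h in known_hashes)

def should_report(photo_hashes: list, known_hashes: set) -> bool:
    """An account crosses the reporting threshold only after ~30 matches."""
    matches = sum(is_flagged(p, known_hashes) for p in photo_hashes)
    return matches >= REPORT_THRESHOLD

# Toy 16-bit hashes:
known = {0b1010101010101010}
assert is_flagged(0b1010101010101010, known)      # identical hash
assert is_flagged(0b1010101010101000, known)      # 1 bit off: "nearly identical"
assert not is_flagged(0b0101010101010101, known)  # every bit differs: no match
```

The threshold is what drives the claimed one-in-one-trillion yearly false-positive rate for an account: a single near-match on one photo is not enough, and the cumulative probability of 30 independent false matches is far smaller than the probability of any one.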