Apple has responded to the most pressing criticisms surrounding its decision to scan US iPhone photo libraries for known images of child sexual abuse material (CSAM), which was announced late last week.
The tech giant maintained that the technology would not scan users’ iCloud uploads for anything other than CSAM, and that it would reject government requests to “add non-CSAM images to the hash list”.
In a FAQ document, Apple said that its “CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC [National Center for Missing & Exploited Children] and other child safety groups”.
“We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud Photos and we will not accede to any government’s request to expand it,” the tech giant said.
The response comes following concerns that the technology could be exploited by governments to increase surveillance, with many pointing to examples of tech companies working with authoritarian governments. For example, Microsoft, Google, and Qualcomm have all accepted demands from the Chinese state for user data.
Apple also has a history of working with government agencies in the US. In the first half of 2019, the tech giant received a record-high 3,619 requests from the US government seeking user account data to assist law enforcement investigations. Reports show Apple complied with 90% of these requests.
However, by mid-2020, public sector cooperation had become less profitable and faced greater public scrutiny, leading tech giants such as IBM and Amazon to cut ties, at least temporarily, with US law enforcement.
On Monday, Apple maintained that, thanks to the hash technology used to detect the CSAM images, it would be impossible to conduct “targeted attacks against only specific individuals” in order to frame somebody. The tech giant also added that it would conduct “human review before making a report to NCMEC”.
“In a case where the system flags images that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC,” it said.
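To make concrete how this kind of hash matching gates a report, here is a minimal, heavily simplified sketch in Swift. It is not Apple’s implementation: the real system uses a perceptual “NeuralHash” and on-device private set intersection, whereas this sketch substitutes an ordinary SHA-256 digest, and the hash list, threshold value, and function names are purely illustrative.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only. Apple's actual system uses a perceptual
// "NeuralHash" plus cryptographic private set intersection; a plain
// SHA-256 digest stands in here just to show the matching idea.

// Hypothetical database of digests of known CSAM images, as supplied
// by NCMEC and other child safety groups.
let knownHashes: Set<String> = [
    // ...hex digests of known images would go here...
]

// Purely illustrative threshold: an account is only escalated once
// several uploads match, never on a single hit.
let matchThreshold = 3

// Hex-encoded SHA-256 digest of an image's raw bytes (a stand-in for
// the real perceptual hash).
func digest(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Returns true only when enough uploads match the known-hash list.
// Per Apple's FAQ, even a flagged account then goes to human review
// before any report is made to NCMEC; images that match nothing are
// never reported and the account is not disabled.
func shouldEscalateForHumanReview(_ uploads: [Data]) -> Bool {
    let matches = uploads.filter { knownHashes.contains(digest(of: $0)) }
    return matches.count >= matchThreshold
}
```

Because matching is only ever done against a fixed list of digests of already-known images, an image absent from that list cannot trigger a match, which is the basis of Apple’s claim that targeted attacks against specific individuals are not possible under its design.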