Apple’s CSAM detection feature is the subject of Edward Snowden’s latest editorial on Substack, in which the former US intelligence contractor calls the company’s strategy a “tragedy.”
Snowden waves off technical rebuttals of Apple’s CSAM system and gets straight to the point, saying the solution “will permanently redefine what belongs to you and what belongs to them.”
The feature, which is slated to roll out with iOS 15 this fall, will hash user photos being uploaded to iCloud and match them against a database of known CSAM image hashes drawn from at least two different child-safety organizations. Importantly, unlike legacy systems, Apple’s technology does all the processing on the device. This, according to Snowden, “will erase the line that separates which devices work for you and which devices work for them.”
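The on-device hash-and-match step described above can be sketched in a few lines. This is a deliberately simplified illustration: it uses an exact cryptographic hash (SHA-256) and a plain Python set as a stand-in database, whereas Apple’s actual system uses NeuralHash, a perceptual hash tolerant of minor image edits, combined with cryptographic blinding so the device never learns the database contents. All names and the sample byte strings here are hypothetical.

```python
import hashlib

# Hypothetical stand-in for the on-device hash database. Apple's real system
# ships a blinded NeuralHash database, not raw digests like these.
KNOWN_HASHES = {
    hashlib.sha256(b"known-flagged-image-bytes").hexdigest(),
}

def matches_database(image_bytes: bytes) -> bool:
    """On-device check: hash the photo and test membership in the database.

    A True result would, in the real system, attach a "safety voucher" to the
    upload rather than block it outright.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

print(matches_database(b"known-flagged-image-bytes"))  # True
print(matches_database(b"ordinary-vacation-photo"))    # False
```

The key architectural point Snowden objects to is visible even in this toy version: the comparison runs on the user’s own device, before upload, rather than on the server.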
“Once the precedent is established that it is appropriate even for a ‘pro-privacy’ company like Apple to manufacture products that betray its users, Apple itself will lose control over how that precedent is enforced.

“Apple’s implementation of CSAM detection has more to do with branding than with child protection or regulatory compliance, since the feature can be bypassed simply by disabling iCloud Photos uploads. The idea that Apple is introducing the measure to clear a path for end-to-end encryption on iCloud is also unconvincing: even with such encryption it would not matter, as the iPhone would already have built-in monitoring capability.”
Like other critics, Snowden is concerned that governments could abuse the system by forcing Apple to extend the CSAM functionality on the device, or by requiring it to be always active for all users: “There is no fundamental technological limit to how far the precedent set by Apple can be pushed, which means the only restraint is Apple’s all-too-flexible corporate policy, something governments understand all too well.”
Apple, for its part, has said it would not comply with government demands to extend the system beyond its original purpose of detecting known child sexual abuse imagery.