Apple’s decision to postpone the introduction of its controversial client-side scanning (CSS) CSAM-detection system looks like an even better idea amid news that governments already want to use such tools for other forms of surveillance.
A ‘dangerous technology’
In a new report, an influential group of 14 internationally reputed security researchers has called such plans a
“dangerous technology” that expands state surveillance powers. They warn that client-side scanning, if deployed, “would be much more privacy invasive than previous proposals to weaken encryption. Rather than reading the content of encrypted communications, CSS gives law enforcement the ability to remotely search not just communications, but information stored on user devices.”
They join a chorus of civil liberties campaigners, privacy advocates, and tech industry critics who have already warned that the plans threaten basic human rights.
While the system Apple announced seemed well-intentioned, its use of on-device scanning against image databases in the form of numerical hash data raised many concerns. After all, if a device can be scanned for one kind of content, the same mechanism can easily be extended to search for others.
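To see why that extension is so easy, consider a minimal sketch of hash-based matching. Apple's real system uses NeuralHash, a perceptual hash, and cryptographic blinding; the cryptographic hash, function names, and blocklist contents below are simplifications invented for illustration only:

```python
import hashlib

# Hypothetical blocklist of known-image hashes. Apple's actual system uses
# NeuralHash (a perceptual hash robust to resizing and recompression);
# SHA-256 is used here purely as a simplified stand-in.
BLOCKLIST = {
    hashlib.sha256(b"known-flagged-image-bytes").hexdigest(),
}

def scan_image(image_bytes: bytes, blocklist: set) -> bool:
    """Return True if the image's hash appears in the supplied blocklist."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in blocklist

# The device never "understands" the image; it only checks membership in a
# list it cannot inspect. Nothing in the mechanism is specific to CSAM:
# whoever controls the blocklist controls what is searched for.
print(scan_image(b"known-flagged-image-bytes", BLOCKLIST))  # True
print(scan_image(b"an-ordinary-photo", BLOCKLIST))          # False
```

The design choice critics highlight is visible in the sketch: the scanning code and the blocklist are independent, so repurposing the system for other content requires changing only the list, not the software on the device.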
It turns out some governments are working on precisely that. The New York Times reports the latest findings from a group of cybersecurity researchers who have been examining proposals of this kind since before Apple’s announcement.
European Union wants CSS
The researchers say they began looking into the technology prior to Apple’s announcement in response to moves by European Union (EU) leaders to insist on such a system. They think a proposal to mandate such photo scanning in the EU could come as soon as this year and would extend beyond CSAM to include scanning for evidence of organized crime and terrorist activity. That extension of the search domains is a red flag.
The concern is that behaviour seen as ordinary in many nations is criminalized in others. A search for criminal material could easily be extended to become a search for evidence of homosexuality, for example, which is a capital offence in some nations.
Just as the EU could now force Apple to enable its CSAM-scanning system and insist it scan for additional ills, any government, including authoritarian governments, could mandate what is searched for. Apple has said it would resist, but the truth is that it would be unable to do so. It is interesting that one set of crimes not yet proposed for such surveillance includes fraud, tax evasion, and tax avoidance, though such a facility could easily be extended to those domains.
Apple has attempted to characterize the resistance to its original proposals as little more than a confusion of messages. Apologists have tried to downplay the concerns with arguments about how most actions on the internet can already be detected (which rather undermines trust in online payment systems).
Critics say both such excuses seem flawed from a company that prides itself on privacy, particularly in the absence of an internationally agreed bill of digital human rights. Many believe such proposals represent a Pandora’s Box of horrors that leads to unconstrained surveillance and state overreach.
One big issue the latest researchers warn about is that the plan allows a person’s devices to be scanned “without any probable cause for anything illegitimate being done.” Tufts University professor of cybersecurity and policy Susan Landau said:
“It’s extraordinarily dangerous. It’s dangerous for business, national security, for public safety and for privacy.”
University of Cambridge professor of security engineering Ross Anderson added:
“Expansion of the surveillance powers of the state really is passing a red line.”
One door opens, another one gets opened
But for many users, particularly business users, there are greater threats lurking. “As most user devices have vulnerabilities, the surveillance and control capabilities provided by CSS can potentially be abused by many adversaries, from hostile state actors through criminals to users’ intimate partners,” the report warns.
“Moreover, the opacity of mobile operating systems makes it difficult to verify that CSS policies target only material whose illegality is uncontested.”
Effectively, once such a system is in place, it is only a matter of time before criminal entities figure out how to subvert it, whether by extending it to detect valuable personal or business data, or by inserting false positives to target political enemies.