
Apple’s plan for iOS 15 CSAM scanning cannot constitute “safe” surveillance without bulletproof security and an explicit moral framework to guide policy decisions. It has neither.

With its CSAM scanning tool (soon to be pushed to iPhones with iOS 15, and perhaps to Macs with Monterey), Apple has created what it promises is a “safe” surveillance experience. It will catch the absolute worst scumbags on the planet. It launches in the US only. You can turn it off simply by no longer using iCloud Photos. And if your content gets flagged by the system, someone receiving a paycheck from Apple will look at your photo before the police are called. Feel safe?
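For concreteness, the flow Apple describes (match uploads against a list of known CSAM hashes, then surface the account for human review only after a threshold of matches) can be sketched roughly as follows. This is a deliberately simplified illustration: the real system uses perceptual hashing plus cryptographic machinery (private set intersection, threshold secret sharing) that is not modeled here, and all names and values below are my own invention.

```python
# Hypothetical sketch of the described match-then-review flow.
# Apple has cited a review threshold on the order of 30 matches;
# the exact number here is illustrative, not authoritative.
MATCH_THRESHOLD = 30


def flag_for_review(photo_hashes, known_csam_hashes, threshold=MATCH_THRESHOLD):
    """Return True once enough of a user's photo hashes match the
    known-CSAM list, i.e. the point at which a human reviewer at
    Apple would look at the content before any report is made."""
    matches = sum(1 for h in photo_hashes if h in known_csam_hashes)
    return matches >= threshold
```

The key property the sketch captures is the one Apple leans on rhetorically: no single match triggers anything, and a human sits between the automated match and any call to the police.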

That’s where Apple would like the story to end. Notwithstanding the innumerable potential problems with the surveillance tool itself, there are two major prerequisites for such a “safe” system to be able to exist in the first place that Apple fails to meet. The first is very basic: iOS security.

Did you know that, as we speak, owners of up-to-date iPhones are having malicious data pushed to their devices by clients of a company called NSO Group? To refresh everyone’s memory: of the ~70 phones recently examined for Pegasus infection by Amnesty International and Citizen Lab, a staggeringly high percentage were fully compromised. And those phones were just a subset of a very large list of potential targets. The Pegasus infections came via 0-click exploits, meaning no user interaction was required for a phone to be silently hacked.

When Apple contacted the Washington Post at the end of last month, it said there was no fix for the bugs enabling the Pegasus infections, but that the hacks were “not a threat to the overwhelming majority of our users” 🙄 (because most of us presumably aren’t targets of NSO’s clients; very reassuring).

Today it’s malware being remotely installed on people’s phones, but the pushed data could be anything, including other kinds of “illegal content”. Here’s a story from a week ago: some French journalists had malware pushed to their iPhones via 0-click iMessage exploits, and the malware then extracted pictures and contacts from their phones. That personal data was used to frame them in embarrassing (fabricated) scandals meant to end their careers.

Where are the guarantees from Apple that this will stop happening?

Where are the assurances that the bugs being exploited by NSO have been fixed?

Where is the commitment to overhauling the security of iOS and patching critical bugs ASAP?

Apple’s customers deserve a high level of transparency from Apple on the whole NSO/Pegasus affair. For years, NSO (and other exploit vendors) have facilitated the hacking of iPhones, causing incalculable damage to individuals (many of whom are in jail, or worse). Apple should finally show us it has gotten real about stopping this: starting with a press conference, then paying 20x the going rate for 0-day exploits (to break NSO’s business model), then reallocating engineering talent to squashing bugs in its multimedia parsing libraries. I don’t know; something beyond saying it’s “not a threat to the overwhelming majority of our users”.

Not only has Apple recently added a back door to iOS, but it has not even bothered closing the front door.

One cannot reconcile these two things: 1) Apple rolling out an automated, warrantless, opt-out surveillance tool to all US iCloud customers, and 2) iPhone owners around the world having arbitrary data pushed to their devices by powerful nation-state adversaries who want them ruined.

The Pegasus story does not have a bookend. As it stands, it is entirely reasonable to assume that a hacker could push arbitrary data to your phone, including pictures. We have proof (and acknowledgement from Apple) that this is still happening. Given the broken security of Apple devices, rolling out an automated surveillance system is irresponsible and, frankly, exceedingly arrogant.

“Safe surveillance” is also impossible without an explicit moral framework

TL;DR: Apple follows the laws of any given market.

That’s its M.O.: follow all the laws of every country it operates in. That sounds reasonable, until a big, important country passes a law requiring surveillance on different terms, targeting different “illegal” content. Before, Apple could argue that it lacked the capability to surveil its users; now it has shown that it has that capability.

Apple does not know what laws will be passed in the future 🔮. If the scanning backdoor exists, Apple may be compelled to expand it. Apple itself highlighted this very concern in its 2016 fight to prevent the FBI from forcing it to unlock the San Bernardino shooter’s phone. (Credit: Ian Miers.)

Some companies will quit a country when unsavory practices are imposed on them, as Google did when it exited China a decade ago. Apple will follow the law, even when it erodes people’s freedoms. There is a long track record of exactly that happening with Apple’s cooperation, partly thanks to Tim Cook’s mushy stance:

Apple’s CEO Tim Cook said at a Fortune event in 2017, when asked about its compliance with China’s censorship and problematic laws: “Each country in the world decides their laws and their regulations. And so your choice is: Do you participate, or do you stand on the sideline and yell at how things should be? You get in the arena, because nothing ever changes from the sideline.” Apple has been “in the arena” for well over a decade now; it’s time for a scorecard.

Here are at least three glaring issues with Cook’s stated stance:

1) The “you” in “Do you participate?” refers to Apple, so it’s worth noting that Apple as we know it in America is not the Apple doing business in other countries. It’s a lesser, stripped-down version of Apple, without key features like Private Relay, VPN apps in the App Store, etc. As the New York Times reported, Apple even handed over legal ownership of its Chinese customers’ data to another company entirely. That’s certainly not Apple™️. How can Apple hold itself accountable for “doing the right thing” when it’s literally creating shell companies to hand off responsibility for users’ data? Apple 🇺🇸 is very different from Apple 🌎.

2) The spirit of Cook’s statement (that it’s better to be “in the arena”) is only honest if it’s periodically reassessed for efficacy. It’s been over three years: has Apple had a positive effect in troubled countries, as those countries roll out laws that destroy press freedoms, deny recognition of entire groups of people, and prevent the recollection of key events in history? (I’m referring to Apple’s censorship of LGBTQ+ apps, the Taiwanese flag, and all VPN apps, as well as its moves after the “National Security” law was used to dismantle Hong Kong’s press freedoms, and more.) Listen to this great episode of the NYT’s The Daily on Apple and China if you want to know more.

At what point will the company recognize its complicity? Is Apple even willing to consider that it is complicit in destroying freedoms rather than being a force for good in other countries? If it is, it should say so, though there’s no sign it will. Otherwise, if Apple is not willing to accept any accountability beyond “we follow the law”, then not only does its stance of “being in the arena” not hold water; the company is actively enabling authoritarianism. Everyone (and if you’re an Apple employee, especially you) should recognize this.

3) In his statement, Cook denigrates “stand[ing] on the sideline and yell[ing] at how things should be”.
That forfeiture of voice is a huge concession for the Apple community, including the company’s many employees, because, let’s face it: those of us born outside authoritarian countries are on the sidelines, and yelling is our primary tool for influencing US geopolitics. Cook is essentially saying that Apple can’t yell from the sidelines about unjust laws in countries where it also does business. It checks out: Apple never openly criticizes dystopian laws in countries like China. That silence is shameful, and it raises the question: how exactly can you do good by being “in the arena” if you stay silent on unjust policies? The pattern extends to Apple’s workforce, which has been relatively quiet on employee activism (except for work-from-home policies and, more recently, pay-equity disputes). Maybe they should start yelling more about how their company is changing the world.

In summary, unless Apple presents the same privacy-respecting Apple in troubled countries and periodically reassesses whether its presence there is actually doing good, many of us would prefer “yell[ing] from the sidelines”.

Surely Apple’s engineers know the dangers of rolling out any form of warrantless surveillance; they should yell about it.

When Apple’s head of privacy was asked about the CSAM system being rolled out in China, he said (and I kid you not) that it’s “launching only for U.S., iCloud accounts, and so the hypotheticals seem to bring up generic countries”. A complete non-answer. China is not a generic country, and we are not dealing with hypotheticals.

Apple says it will roll out the tool to new countries only after conducting a legal review on a country-by-country basis. Again, this underscores the broader point: what happens when a country demands this kind of scanning and passes a law to require it? What will Apple’s legal review find? Hint: Apple will follow the law.