The Death of Privacy: Apple’s New Surveillance

Published December 1, 2021

[Image: Penitentiary Panopticon plan]

The Panopticon & You

“The strategic adversary is fascism… the fascism in us all, in our heads and in our everyday behavior, the fascism that causes us to love power, to desire the very thing that dominates and exploits us.” ― Michel Foucault

In 1975, French philosopher Michel Foucault wrote about a construct called the panopticon (from the Greek panoptes, “all-seeing”).1 Dorm-room philosophy would never be the same. The actual physical construction of the panopticon itself is worth talking about, but for now the important thing to understand is this: “the panopticon” describes a structure in which all the people living inside could be observed, at any time, by “Watchers” (their ideology is irrelevant) — and that furthermore the people inside are aware of it.2

What this means in practical terms is that everybody inside the panopticon goes about their day with the constant low-level psychic-rumble of knowing that someone — some secret cop, accountable to nothing but the quote unquote state — might be watching, always.

Walking in the street; working at your office; cooking at home; writing anti-establishment-leaning pamphlets in the basement; having sex with a spouse (your own or otherwise).

Always.

Such an environment, argues M. Foucault, would necessarily have a chilling effect on its denizens. Creativity would decline, anxiety and suspicion would rise, and people would be quietly separated from their social groups — and thus more vulnerable to authority. Which makes sense; how does one tell an off-color joke among good friends when a cop could be listening, taking note of any evidence that could be used against you; either right now, or maybe just someday, if you piss the wrong people off? Or if your friend pissed them off? Or their shitty kid? How do The People organize against authoritarian overreach and state-sponsored violence if the very act of organizing will get everyone who shows up slaughtered?

For the purposes of this zine, {your state} is the State, Apple’s the Watcher, and the secret cop is the algorithm. And how do you argue with an algorithm? You might as well argue with long division, you dope.

I should back up.

The Way Things Are (Currently) Done

“Justice must always question itself, just as society can exist only by means of the work it does on itself and on its institutions.”
― Michel Foucault

You have a laptop (let’s be serious: work has a laptop. You have a phone). Your phone has X amount of local storage — that is, actual space for files on your phone. Your laptop/phone is also almost certainly connected to a Cloud — meaning either Apple or Google’s computers, or Microsoft’s, or some combination of all three. Along with however many companies run the apps installed on your device.

At any given time, a portion of your data is actually present on your device, and the rest is on the segment of {COMPANY}’s computer reserved for you (your Cloud). Generally this is a function of what data you’re actively using, although the algorithms, as previously mocked, can be somewhat opaque as to how “actively using” is qualified.

But until now, if one truly cared, some level of control could be exercised over what actually got uploaded to The Cloud at all.

Emails but not photos, for example; homework but not finances.

Or, for a more trenchant example: your travel pics could back up safely to your account — but that member roster of a now-defunct LGBTQ+ organization based in pre-American-departure Kabul would not.
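In code terms, that control was nothing fancier than a sync filter, pointed wherever you liked. A purely illustrative sketch (Python; the folder names are hypothetical, and no vendor exposes exactly this API) of the kind of gate you could, until now, keep between your files and The Cloud:

```python
from pathlib import Path

# Hypothetical per-folder policy: what is allowed to leave hardware you own.
SYNC_THESE = {"Email", "Homework", "Travel Photos"}
NEVER_UPLOAD = {"Finances", "Rosters", "Pamphlets"}

def should_upload(path: Path) -> bool:
    """True only for files under a folder you have explicitly chosen to sync."""
    top_folder = path.parts[0] if path.parts else ""
    return top_folder not in NEVER_UPLOAD and top_folder in SYNC_THESE

# should_upload(Path("Travel Photos/herat_market.jpg"))  -> True
# should_upload(Path("Rosters/members_2019.xlsx"))       -> False
```

The point being: what crossed the wire was your call, enforced on a machine you owned.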

This is important because The Cloud, for anyone that gives a damn, has long been better known as “somebody else’s computer”. A somewhat lesser-known corollary is that somebody else is always going to be more willing to give up your data — your photos, notes, texts, and emails — than you are. If it’s a private company, it’s barely even a choice — if data brokers come knocking, out goes the data while profits go up.3 4 5 Not to mention the Experians of the world.6 And if alphabetical agencies come knocking, well, who wants to spend those fat data broker profits fighting for the rights of people who are — when it really comes down to it — users, and therefore the only truly replaceable resource? Not to mention pissing off the {your state} government. It’s simply not Good Business™.

So long as Apple et al. were scanning only their computers, the computers that Apple themselves own, it was at least theoretically possible to separate sensitive material — anything you wouldn’t want leaked to the media — from the rest, to ensure such data never touched a device you did not own. Which ensured you were (relatively) safe from search and seizure, unreasonable or otherwise. If the data is only stored on devices you personally own, the only legal way to access that data is via court order — a (relatively) high bar. After all, {COMPANY} cannot hand over what {COMPANY} never had, so all can go home happy, except the Feds, which is the happiest part of all.

But if the information is stored on devices to which a private company retains legal access, the easiest legal way to access it is simply to ask.7

The Way Apple Will Do Things Instead

“…the concept of the design is to allow all prisoners of an institution to be observed by a single security guard, without the inmates being able to tell whether they are being watched.”
― Michel Foucault

Apple’s departure from existing orthodoxy boils down to this: instead of scanning iCloud (Apple’s computers), they will scan each and every consumer-level Apple device on the market (your iPhone, your iPad, your MacBook, your Apple TV).

Currently, they claim that the only data subject to scanning will be photos, and even then only those “about to upload to iCloud” (whatever that means). They also claim you can disable the scanning entirely merely by disabling iCloud photos (no small feat, given that the entirety of Apple’s UX is designed to make not using iCloud Photos an exercise in madness).

They claim this will make things “safer”, but what they don’t say is “for Apple”. After all: assume they’re telling the truth and what benefit does this system actually provide? If the only data to be scanned are those photos about to upload to iCloud, then the only real change is that the scanning happens on a device where you are culpable, not Apple.

Already this feels less like safety goggles and a bit more like horse blinders. But then again, all Apple’s going to do is scan photos that would’ve uploaded to the cloud already, right? And this is the same Apple that famously told the FBI to go screw, then built a whole corporate identity out of it, right? So what’s the problem with all that, really?

The Problem With All That, Really

“But the guilty person is only one of the targets of punishment. For punishment is directed above all at others, at all the potentially guilty.”
― Michel Foucault

The problem is that limiting scanning solely to photos is now entirely a question of policy, not capability.

This was the crux of the San Bernardino case: Apple’s argument against the FBI was not that they would not provide the requested information (an ideological stance), but that they could not provide it, since their system architecture (purposely) prevented Apple themselves from decrypting the shooter’s data (a technical stance).

That is to say, Apple happily presents as pro-privacy when they have no skin in the game and a massive PR upside. But saying “we cannot” is very different from “we will not”, especially when it comes to globalized industry, and anyone who says Apple won’t kowtow to government demands has mistaken PR for integrity. There is already clear precedent for future overreach: the backdoor in iCloud for the Chinese Government. And where once there was some precedent in the other direction — Apple laudably refusing to assist the FBI with bending the American Constitution over a wet table — we now find ourselves with an Apple that has purposely engineered themselves out of that defense, now asking us to assume their best intentions.

Panopti-Nonconsensual

“My point is not that everything is bad, but that everything is dangerous, which is not exactly the same as bad. If everything is dangerous, then we always have something to do. So my position leads not to apathy but to a hyper- and pessimistic activism. I think that the ethico-political choice we have to make every day is to determine which is the main danger.”
― Michel Foucault

“Intentions” being quite literally all they can offer. Now that the capability to scan a user’s device on a constant basis (and silently notify the authorities if certain conditions are met) exists, the genie is forever unbottled. And despite Apple’s intentions, the fact of the matter is that to a computer (phone/watch/media player) data=data=data. As far as your iPhone is concerned, the only difference between a torrented movie and a DOCX is which app opens to interpret it.
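To make that concrete, here’s a minimal sketch — Python, purely illustrative; none of these names are Apple’s, and the real pipeline uses perceptual hashing (NeuralHash) and threshold reporting rather than exact digests — of what on-device blocklist scanning amounts to:

```python
import hashlib
from pathlib import Path

# Hypothetical blocklist of "known bad" fingerprints, shipped down with an OS
# update. A plain SHA-256 stands in for Apple's perceptual-hash database, to
# keep the sketch honest about how little the machinery cares what it reads.
BLOCKLIST = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(path: Path) -> str:
    """Fingerprint a file. Nothing here knows or cares what kind of file it is."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def report_match(path: Path) -> None:
    """Stand-in for the 'silently flag this to someone who isn't you' step."""
    print(f"match reported: {path}")

def scan(library: Path, extensions=frozenset({".jpg", ".png", ".heic"})) -> None:
    """Walk a directory and report anything on the blocklist.

    The `extensions` filter is doing all the "only photos" work here:
    drop one default argument and the same loop reads pamphlets,
    rosters, drafts, everything.
    """
    for path in library.rglob("*"):
        if path.is_file() and path.suffix.lower() in extensions:
            if fingerprint(path) in BLOCKLIST:
                report_match(path)

if __name__ == "__main__":
    scan(Path.home() / "Pictures")
```

The `extensions` default is the whole “only photos” promise; the contents of the blocklist are the whole “only CSAM” promise.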

Thus Apple’s CSAM implementation scanning only photos is a question of policy, not capability. Which means the privacy — political, sexual, literary, and comedic — of every iPhone user in the entire world now rests solely on the assumption that Apple’s C-Levels have the chutzpah to look the FBI, NKVD, or Los Angeles County’s very own Executioners in the eyes and say “no” (and then hope that the other mechanisms of government, the ones that allow Apple to do business and use banks and so on, don’t start applying a little market pressure of their own).

Tremendously reassuring, since the government agencies in charge of all this have never abused these sorts of tools before. Equally reassuring is Apple’s unimpeachable record when it comes to the security of their own servers.

And remember: this power comes in the form of an algorithm (mass surveillance; unconstitutional), not targeted investigation by human agents (targeted surveillance; intended by the forefathers). It will process all the photos on every iDevice from iOS 14 on, uncritically. It will seek any input Apple, America, or Vladimir Putin chooses. Algorithms do not make decisions, they do not resist authoritarian overreach.8 Apple intends to irrevocably grant all the governments of the world a shiny new power apparatus: constant on-device scanning based on a secret, unaccountable database. History tells us that once a government has gripped a new power, they are loath to give it up.

And with Apple taking the first bite out of their Biblical namesake, how long until the rest of the FAM&G cool kids jump on that hyperloop along with them? Assuming everyone follows suit in a similar timeframe as to when they all removed headphone jacks (the last remaining analog port), we should assume all our devices will be fully-surveilled by sometime mid-2024.

Indulgences and Their Role in the Reformation

“If you give me six lines written by the hand of the most honest of men, I will find something in them which will hang him.”
― Cardinal Richelieu

A reasonable reaction to the above is that it all seems pretty onerous and inefficient, assuming the goal here is to protect the innocent while still successfully prosecuting a reasonable number of the guilty. A state of affairs which a more distrustful author might suggest forces us to realize that the point of the game isn’t to put someone in jail, but rather to absolve guilt.

Not from anyone in particular, but entirely. Just poof, the guilt evaporates, forever consigned to some emotional equivalent to a non-Einsteinian dimension. No (prosecutable, guillotinable) human beings were responsible for these egregious, unconstitutional mythologies being designed and implemented! It was just “the system”, and everyone knows the system is stacked against us! What can we poor trillion-dollar-market-share-international-corporations do in the face of “the system”!?

As if the system were not a constellation of corporations and governments (I know, repetitive), that are each themselves made up of a constellation of individuals. Somebodies, somewhere, are responsible for this outrage. But the point of the system is to make it impossible for those responsible to be pinpointed — and to allow those responsible to accuse anyone else of the unforgivable-sin-du-jour (communist, loyalist, protestant, etc) in their place. Bread and circuses dispensed from an iron fist.

Remember: taking Apple at their word, the point of this system, this hash database, isn’t to catch active criminals, it’s to catch passive ones (Which, viewed uncharitably, could lead us to several more logical conclusions: perhaps the active criminals — in this case, the producers of child pornography — are already so wealthy, so successful and powerful as to be among the ruling class? Or perhaps the ruling class simply tend towards the production of child pornography? It’s difficult to be certain.9).

If your Elitism Senses™ are tingling, you should know that — “unrelatedly” — Apple reportedly has for many years offered an internal “friends and family” helpline, the idea being that people who personally knew someone at Apple could get a direct line for odder edge cases and the like. Like the modern equivalent of marrying a doctor. The same is true for Facebook, and AMGonna bet the rest have something similar going on.

An (again, uncharitable) way to view this would be that Apple(/Facebook/et al.) believes their employees are simply “better” than their customers. Why else would they receive a direct line to top level customer service, if not because Apple either believes them to be more truthful about their issues, or simply more deserving of having their issues resolved?

Bush and Obama have had more than a few dinners together, and Tim Cook’s had dinners with both. It’s one big club and you ain’t in it.

Which brings us back to:

The Panopticon: American Style

“For the observation that prison fails to eliminate crime, one should perhaps substitute the hypothesis that prison has succeeded extremely well in producing delinquency, a specific type, a politically or economically less dangerous—and, on occasion, usable—form of illegality; in producing delinquents, in an apparently marginal, but in fact centrally supervised milieu; in producing the delinquent as a pathologized subject.”
― Michel Foucault

The panopticon was conceived as a triumph of prison architecture, but only due to a lack of imagination. The advent of social media, of brigading, of flame wars, of cancel culture, and of YouTube radicalization have all brought about a social panopticon, perpetrated by the prisoners upon themselves.

But no less insidious, and far more ruinous, are the ever-extending tendrils of the governmental equivalents, the PRISMs and Northern Lights of the world. Because what rational individual could, with a straight face, reasonably believe that the governments of the world — Five Eyes10 or otherwise — could resist making use of such a tool? Right now, it’s about the children. But soon, it will be about drug dealers. And after drug dealers comes pirated copyrighted material (a perennial favorite), after which we move on to whatever the current administration deems seditious material.

Trump demanded that Twitter provide him with the IPs of his critics, and his DOJ backed him up. Imagine what Trump would have done with something like this.

Now recognize that the next Trump will almost certainly be smarter.

A Picture of the Future

“Who controls the past controls the future. Who controls the present controls the past.”
― George Orwell

This isn’t about you, and it’s not about me. As much as it pains me to admit it, we are all largely irrelevant to this slow motion cultural submission. We are not the point of the exercise.

Our children are the point. And their children, even more so.

Kids born today will grow up with a tenuous-at-best relationship to privacy. While the need for privacy may seem obvious now, that will not always be the case. Where once we knew for certain that the four elements were water, fire, wind and earth, we now cannot conceive of such a worldview. Similarly quaint will seem the idea that we could ever keep anything from The Company.

In fact, why not extend the idea even further: why stop at phones and computers? Why not mandate all future homes come standard with filters on the pipes, the better to “diagnose illness” (and make sure nobody’s doing drugs in the privacy of their own homes). Why not require ID for book purchases, especially the ones we don’t like? Why allow paper books at all, in fact — they’re so much harder to trace.

All of this to say: the average child of today will grow up with the same understanding of privacy that the average adult today has of butter churns — that is to say, they will know it as a relic of a less-evolved past, no longer necessary in this modern age. But unlike the shift from butter-churn to factory-farm production, future children will receive increasingly less privacy, not more. They will come of age knowing that anything they say or do or dare to commit to page — ever — can and will be used against them if and when The Man deems it necessary.

The Only True Sin Is Not Affording Supreme Authority to the State

or: “the only true luxury is buying your way out of the social compact”

I don’t know the solution here. My honest assessment is that this implementation will forever mark The Day That Privacy Died For Real Tho. Apple has always been a company intent upon big swings, and in the last twenty years they’ve invented entire markets and become far and away the most cash-rich company on Earth. Where they go, all others follow.

Smartphones exist because of Apple (@ me, BlackBerry apologizers). Smart watches exist because of Apple. Front-facing cameras are ubiquitous because of Apple. Analog connections — i.e. connections you can mechanically verify have not been tampered with — are on their way out largely due to Apple. The tools with which a totalitarian state observes, categorizes, governs, and punishes have all been designed, manufactured, and promoted by Apple.

And we — the technologically literate, the ones who (are trying to) understand how this all works, all the technical and societal implications — let it happen. In part because of Apple’s privacy branding, so bolstered by San Bernardino. Plus there was the argument that Tim Cook, coming up in a famously heteronormative industry, might even personally understand the importance of privacy in a non-illegal context.

But mostly we let it happen because it was hard. It was hard for most people to understand, and for those who did understand, it was hard to explain to everyone else. And after people figured out how useful Google Analytics could be re: maximizing your user engagement (i.e. sales), well, it got even harder to explain, and then everybody got rich and explaining not only seemed unimportant but counterproductive.

So now here we are a decade or so later with fake news optimized into digital heroin and 99% of the gaming industry reduced to thinly-veiled slot machines with GameGear-esque graphics of Frazettian babes (as if the original handheld evolved with its audience and now ran on dollar bills instead of batteries, but at a similar rate of exhaustion).

We have to draw a line somewhere, and I guess this one is turning out to be mine. Facebook is literally getting people killed on a regular basis. If we don’t figure out some way to effectively legislate against this, then intentionally or not, whether they like it or not — Apple will not be far behind.

-sent from my iPhone


  1. The original purveyor of the panopticon was Jeremy Bentham (1748–1832), but it’s Foucault’s application to all of society that is more pertinent here. Besides: Bentham — being an aristocrat and therefore having little-to-no skin in the antagonistic game of Man v State — saw the panopticon as a potentially positive construct, since he believed that “publicity” would keep the authorities in check which, in the era of Trump, Netanyahu, and Putin, seems a bit rich. Especially coming from the same jagoff who came up with Short Review of the Declaration, and the same penny-ante Howard Roark who dedicated much of his life to strongly advocating for the construction of panopticon-based prisons. What a class act.↩︎

  2. An oft-ignored aspect of the panopticon as originally conceived is that it actually includes the idea that the “Watcher” would be observed along with the prisoners, the overall intention being to make each somewhat accountable to the other. Practically, it makes sense to ignore that bit, because all of human history tells us that the only thing keeping powerful people in check is the potential loss of that power — for which the panopticon does not provide a mechanism. In the panopticon, the power parity is thoroughly of the “some are more equal than others” brand so common to aristocratic thought experiments, when the thinker’s understanding of hardship is roughly “not enough eggs for a flan”.↩︎

  3. Brokered cell location data led to the outing and resignation of a Catholic official - The Verge↩︎

  4. Facebook sued over Cambridge Analytica data scandal - BBC News↩︎

  5. Google+ class action starts paying out $2.15 for G+ privacy violations | Ars Technica↩︎

  6. Experian API Exposed Credit Scores of Most Americans — Krebs on Security↩︎

  7. It is for this reason, for example, that Apple has provided China with a permanent backdoor into the iCloud data of all Chinese citizens. In a country famous for its bureaucracy, Apple no longer has to play Xi’s middleman.↩︎

  8. A practical example: Stanislav Petrov. Stanislav, a man, did not want to end the world in fire (maybe he had a dog), so he chose the most-charitable interpretation of events — and by doing so, saved all of human civilization. Had he been replaced with an algorithm (as he surely has been, since), the Earth would currently be in the early stages of nuclear winter, and we’d all be far too long-since-atomized to notice.↩︎

  9. I have a personal theory that child pornography functions as the 1%-equivalent of being jumped into a gang, i.e. “if you want to be part of the ruling class, the rest of the ruling class requires enough dirt on you that you can never make any real changes — or you’ll be ruined.” Aka the full “Scientology”. No, I have no non-circumstantial evidence to support this, and no this is not a hill I’m willing to die on (for real, just buy me off, plz) and yes I realize how much I sound like a talking tinfoil hat when I say this. But then again, people said that about the government recording all our web history, sooooo….↩︎

  10. The Five Eyes et al. are extremely relevant in this case, since one of Apple’s main “safeguards” is an insistence that multiple countries must independently submit a photo to the hash database for Apple to begin scanning for it. Given that the world’s intelligence agencies have formed the aforementioned agreements specifically to enable spying on their own citizens (contravening Federal law to do so, at least in the US), one can see how it might stretch credulity to assume requiring multiple countries to “cooperate” to be anything other than a digital fig leaf.↩︎

Last modified October 3, 2023  #essay   #criticism 

