Delete Your Account: Why Canceled Is the New Dead

And how then shall we live?

By Charlie Clark

“A person is what happens when there is a family, and a town, a place where you are known. Where every person who knows you holds a small, invisible mirror, and in each mirror, held by family and friends and enemies, is a different reflection…. A person is what happens when you gather all these reflections around a body. So what happens when one by one the people holding those mirrors are taken from you? It’s simple. The person dies.” — Phil Klay, Missionaries

Why is “cancellation” one of the characteristic anxieties of our moment? I’m not asking whether “cancel culture” is good or bad, effective or ineffective; I’m not even asking whether it’s real. I’m asking why it’s so salient to so many people, and why now? How has it become one of the three or four hyperobjects in the Discourse to have its own tidally locked commentariat?

Let me ask you another question: Why have we herded ourselves into an exitless panopticon so that we can compete for status among strangers, where the victories are Pyrrhic and ephemeral but the defeats are total and irreparable?

My seven-month-old daughter has never been shown a screen. As parenting achievements go, I don’t think a screen-free infancy is especially heroic. I expect that as she starts to crawl, then walk, then run—as babble gives way to chatter—the temptation to buy stillness with screen time will grow stronger.

It helps that I’m picky about screen time myself. Like most of you reading this, I do a lot of “work” on a laptop (Gmail, Zoom, Docs, Sheets), but I recently upgraded to a Light Phone. I’ve been off social media for years: no Facebook, Twitter, Instagram, or TikTok—not even a LinkedIn. I follow zero YouTubers, subscribe to zero podcasts. I find reality shows, cable news, and streaming-era “prestige” television all equally unwatchable. I like movies—or I used to like movies. I think if you’re going to use a screen, you probably should go all the way and play a video game (but the best ones were made between 1998 and 2004).

I’m quite content with my quasi-Luddism. But even if I weren’t, the way my daughter reacts when she gets an unintended glimpse of a television, computer, or smartphone would have confirmed me in it. At seven months, she has no reason to associate screens with any particular pleasure. She doesn’t know they can tell stories or respond to her touch; nothing she has accidentally seen has been designed to engage an infant. Yet the screen exercises an instant and intense fascination. Moving images in a glowing box are more immediately interesting than anything else in her world. I catch her twisting around in her high chair to watch me update our budget in Excel.

Many pixels—and even some old-fashioned ink—have been spilled over the rise of the “attention economy.” Michael Goldhaber’s seminal essay predicted that the attention economy would supplant the money economy just as the money economy supplanted the feudal economy of lands and titles, but over the years, the narrative that has won out is that corporations, especially advertisers, are still in it for the money: monopolizing attention for profit. Justin E.H. Smith offers a typical account of the evil plan in The Internet Is Not What You Think It Is: “The new advertisement landscape by contrast is one that functions bidirectionally, monitoring potential customers’ behavior, attentional habits, and inclinations, and developing numerous technological prods and traps that together make it nearly impossible to decide to exit this commercial nexus.”

Sorry, but I don’t buy it. I watched enough South Park in my bad old days to know an Underpants Gnome when I see one. Capitalism has not constructed 7 billion smartphones, the equivalent of an attentional Dyson sphere, just so it can pop in to say, “2 MIT grads built an algorithm to match you with wine.” There’s simply not enough money in advertising to justify an increasingly total conquest of the human experience. People are mostly going to buy what they’re going to buy, and if an advertiser can nudge you from a Ford into a Chevy, that’s great for Chevy, but it’s not why you build a Doomsday Machine.

Watching my seven-month-old stare, enraptured, at a word processor has led me to shift my paradigm. I doubt that Content has anything to do with it. Maybe the basis of our Attention Revolution is the screens and speakers themselves, and the driving factor in media evolution is the growth in the potency of our audiovisual stimulators, culminating (for the time being) with iPhones and AirPods. Maybe we humans just like having our sensory cortexes stimulated. Maybe bright and loud are the point.

That’s the supply side. People have attention and they’re desperate for places to put it (Pascal: “a king without distraction is a man full of wretchedness”). What about the demand side? Goldhaber argued that, by the late ’90s, the market for both material goods and information was essentially saturated, and attention was the last scarce resource to compete for: “The key point about the various forms and avenues of attention is that while we each want it to some extent, it does not arrive in equal measure. This explains why many of us are working harder and harder to get some.” For his part, Smith presents the internet as a quasi-biological outgrowth of our species’s natural loquacity, comparing it to other forms of telecommunication in the animal world like elephant stomps or whale song. In either case, we do not need to postulate an economic motive: it’s as much human nature to seek attention for ourselves as it is to give our attention to the brightest object in view.

We craved distraction, so we built devices so distracting that they sucked all the available attention out of meatspace. We craved attention, so we chased it over the event horizon.

That’s the theory. What does this have to do with cancellation?

We’re afraid to die. Socially, I mean, not biologically. Just as we’re all destined to die one day, we’re also destined to be ignored and forgotten. There’s a little snippet of a sermon I heard as a kid. I don’t even remember which pastor said it or what his larger point was. I just remember him saying, “One day, you’re going to die, and your family and friends are going to get together to talk about you one last time. They’re going to feel sad for about an hour, then they’re going to go home and eat copious amounts of potato salad, and most of them are never going to think about you ever again.” I think about that all the time (especially the part about the potato salad).

In the end, attention is no better than money: you can’t take it with you. Most people don’t know the names of their grandparents’ grandparents. Marcus Aurelius wrote about the one who seeks posthumous fame: “[E]very one of those who remember him will himself also die very soon; then again also they who have succeeded them, until the whole remembrance shall have been extinguished.” That’s social death.

For most of human history, physical death was close at hand. I’m told that “nasty, brutish, and short” probably exaggerates the misery of everyone from paleolithic hunter-gatherers to medieval peasants (with the possible exception of the residents of ancient urban centers), but child mortality was high and life expectancy was low. Social death, on the other hand, was relatively distant. In pre-industrial societies, ties within families and local communities—across generations—were durable as a matter of economic necessity and simple practicality. You can’t unfollow your next-door neighbor; the best you can do is never leave your house, and that’s not an option when you have fields to till. (Another problem solved by modernity.) Perhaps most importantly, other people were the most interesting objects soliciting your attention.

So in the modern world, with our rectangularized survival curve, we’ve pushed awareness of physical death to the fringes of consciousness. Meanwhile, having glued the gaze of father, sister, and friend firmly to their smartphones, we’ve brought social death into the midst of life. We are threatened with having to live out our biological lives unknown, unrecognized, unattended: unpersons. Ergo the redistribution of anxiety: never think about death, fret about cancellation.

It is crucially important to understand, though: canceled is the default condition of modern subjecthood. Unless you belong to a community that somehow resists the centralization of attention, your likely fate is a life of anonymity and isolation. Your only solace is the hope to go viral (on however large or small a scale), to eke out a personal existence from the scraps of attention (Smith: “weak, fleeting, and misdirected”) you can scrounge online.

Smith’s most compelling insight, found both in his book and in the essay which inspired it, is about how this online personhood is shaped:

[I]ndividual readers or consumers are themselves now pushed and pressured to operate online according to the same commercial logic as the companies whose products they are using…. [T]he more you use the internet, the more your individuality warps into a brand, and your subjectivity transforms into an algorithmically plottable vector of activity…. [I]ndividuals will thrive most, or believe themselves to thrive most, in this new system who are able convincingly to present themselves not as subject at all, but as attention-grabbing sets of data points.

There’s something Calvinistic about this self-perception: human nature being what it is and the architecture of the internet being what it is, the individual cannot do other than it does (run an algorithm other than it runs), and yet the individual is held responsible and receives its due penalty or reward.

To fear “cancel culture” (real or imaginary), then, is to fear the Last Judgment: not them which kill the body, but rather them which are able to destroy the soul. In a situation where death is the default, the only thing to fear is permadeath. And this the internet can provide, keeping a permanent record of whatever renders an individual unworthy of attention. How then can we live?

First, you must come to terms with your own social mortality. No one lives forever under the sun, and you should not fear to lose what you cannot keep, which includes your neighbor’s attention. In this vale of tears, you must be prepared to go unseen, unrecognized, misunderstood, or reviled.

Second, the game of online attention is one where the only winning move is not to play. Deep down, you know this. As Smith argues, the internet is an impediment to the sustained attention that produces mutual, intersubjective experience, which in turn yields “transformative moral commitment.” Its defining feature is “the excessively narrow channeling of our cognitive and emotional investment down pathways that are structurally guaranteed to limit or prevent personal transformation.” So go ahead, delete your account. Read a book. Take a walk. Make an IRL friend. In short, attempt some no-filter personhood.

“And they that heard it said, Who then can be saved?” We are most of us, I think, aware of the colossal difficulty of becoming real persons under present conditions. If you’ve dodged family breakdown, belong to a healthy church or other local community, and are otherwise unaffected by our practically universal derangement and despair, then you must feel awful for the rest of us. (Possibly you’re just not paying attention.) In my experience, most of us don’t have a lot of people to hold those “small, invisible mirrors” that Klay talks about in my epigraph; never mind that a lot of us wouldn’t like what they revealed if we did.

So, third, we need a better hope. As a classicist, I love a good false etymology. In a fourth-century treatise on the Trinity, Gregory of Nyssa writes:

[Because] He surveys all things and overlooks them all, discerning our thoughts, and even entering by His power of contemplation into those things which are not visible, we suppose that Godhead, or θεότης, is so called from θέα, or beholding, and that He who is our θεατής or beholder, by customary use and by the instruction of the Scriptures, is called θεός, or God.

While escaping God’s gaze might be a perennial temptation, not unattractive from a certain point of view, God as our beholder offers its own consolations, especially if we have faith that we are not only perfectly seen and perpetually remembered but perfectly loved.

Smith writes, “[T]he special power of attention as a mental faculty is to generate the experience of subjecthood, and to do so bidirectionally.” In this sense, God’s attention to us establishes the possibility of an intersubjective encounter; it opens the way for our attention to God, which is the life of prayer. This same connection is found in the writings of Simone Weil: “Absolutely unmixed attention is prayer. If we turn our minds towards the good, it is impossible that little by little the whole soul will not be attracted thereto in spite of itself” (cf. Smith: “[A]ttention opens up the attender to the object, so that it may as it were go to work on the individual.”). Let us then, knowing that we are attended, attend, which is to say, “watch and pray.”

Charlie Clark is a writer and retractor. He lives in New Hampshire.