Attention-harvesting technologies jeopardize our capacity to govern concentrated power—and ourselves.
The slogan of the Revolutionary War, “Don’t tread on me,” expresses the psychic core of republicanism. In 1776, this spirited insistence on self-rule was directed against King George, who lived in England. But against what should that insistence be directed today?
Platform firms such as Google, Facebook, and Twitter constitute a kind of imperial power that orders everyday life in far-reaching ways. Many feel that they have to pass through the portals that these firms have established in order to conduct the business of life and to participate in the common life of the nation. Sitting atop the bottlenecks of communication that are a natural consequence of “network effects,” their competitive advantage over rivals is positional, much like a classic infrastructure monopoly (think Ma Bell, or a toll road). They are positioned to collect rents from many forms of social intercourse, including some that we did not previously understand under the rubric of “the economy” (such as dating). We pay these rents in the currency of our attention.
The state is something we need to be vigilant against; this is the libertarian intuition. But what is “the state,” in the year 2021? The thing that governs us: Where is it located? Depending on how you answer this question, libertarian prickliness may need to be redirected, based on an updated understanding of where the threats to liberty lie.
Shoshana Zuboff, in her landmark book The Age of Surveillance Capitalism, shows how the attention economy is intimately connected to the data market. The behavioral data that you generate throughout the day—not just your Internet browsing but your movements through the physical world, your shifting web of contacts, the content of your social media posts and uploaded photos, the emotional register of your voice—are used to create predictions by the platform firms. These predictions are then sold on a behavioral futures market (often in real-time auctions, even as your behavior is taking place), to be purchased by any party that has an interest in knowing your established proclivities and current receptivities on various fronts.
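The real-time auctions Zuboff describes can be made concrete with a toy sketch. This is a deliberately simplified second-price auction of the kind historically used in programmatic advertising; all names and dollar figures are illustrative, not any platform’s actual code.

```python
# Toy sketch of a real-time ad auction: a simplified second-price
# (Vickrey) auction. The bidder names and amounts are hypothetical.

def run_auction(bids):
    """bids: dict mapping bidder name -> bid in dollars.
    The highest bidder wins but pays the second-highest bid."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, price

# A user's predicted receptivity ("likely to buy shoes right now")
# is what drives the bids, even as the behavior is taking place:
bids = {"shoe_brand": 2.50, "car_dealer": 0.40, "travel_site": 1.10}
winner, price = run_auction(bids)
# winner == "shoe_brand"; it pays the runner-up's bid of 1.10
```

The second-price rule matters only as a detail of market design; the essential point for the argument is that the auctioned good is a prediction about you.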
The point of having such predictions and fine-grained characterizations is to then intervene and nudge your behavior into profitable channels. These interventions may remain beneath the threshold of your awareness (for example, in the selection and arrangement of banner ads on the webpage you are looking at), but even in such cases, the basic lever by which your behavior is modified is the capture of your attention.
That’s what “content” is for. Algorithms decide, based on your past history, what content should be delivered to you to maximize “time on device.” If you have ever frittered away an hour on YouTube, with its bottomless rabbit hole of recommendations, you know how this works. Our minds are treated as a resource to be harvested at scale, by mechanized means.
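The “time on device” logic can be caricatured in a few lines. Real recommenders are learned statistical models, but the objective function is the one sketched here: from a pool of candidates, always serve whatever is predicted to hold you longest. The video names and watch-time numbers are invented for illustration.

```python
# Toy model of engagement optimization: always recommend the candidate
# with the highest predicted watch time for this user. A caricature of
# real recommender systems, which share the objective but not the code.

def recommend(candidates, predicted_watch_time):
    """Pick the candidate that maximizes predicted watch time (minutes)."""
    return max(candidates, key=lambda c: predicted_watch_time[c])

predicted = {"cat_video": 3.0, "rage_bait": 11.0, "lecture": 7.0}
pool = set(predicted)
queue = []
while pool:  # the "bottomless rabbit hole": keep serving the stickiest item
    pick = recommend(pool, predicted)
    queue.append(pick)
    pool.remove(pick)
# queue == ["rage_bait", "lecture", "cat_video"]
```

Note what the objective omits: nothing in it asks whether the hour was well spent, only whether it was spent on the device.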
Attention is finite and, arguably, the most valuable resource that one has. It determines the contents of our minds, the disposition of our time, and the basic character of our experience. The question of what to attend to is, ultimately, the question of what to value. Because the economy of industrialized attention-harvesting reaches so deep into the human person, the usual categories of economics may not be adequate to parse what is going on—and what our response should be.
The Burden of Self-Regulation
What are we to make of the fact that so many people who use Twitter, Facebook, Instagram, and YouTube also complain bitterly about their own habit of spending too much time on these things? Nobody is forcing anyone to do anything, yet people report that they feel somehow unfree. If we are divided against ourselves, it seems we need to revisit the basic anthropology that underlies the free-market faith.
The view of human beings that prevailed in economics and public policy in the twentieth century held that we are rational beings who gather all the information pertinent to our situation, calculate the best means to given ends, and then go about optimizing our goal-oriented behavior accordingly. But this “rational optimizer” view leaves much out of account, especially the power of habit. (See, above all, William James’s discussion in The Principles of Psychology.) Unlike animals that are adapted by evolution to a fixed ecological niche, with behavioral scripts rigidly encoded in instinct, humans are flexibly adaptable, and the paradox is that this makes us susceptible to a peculiarly human form of unfreedom. Precisely because our brains are so plastic and formable, the grooves that we wear into them through repeated behavior may become deep enough that they function like walls.
In principle, we are free to form whatever habits we choose. But this moment of choice usually occurred long ago and passed without our noticing it. You just wake up one day and find that the patterns of your life are perhaps not ones that you would affirm as choice-worthy in a moment of reflection. Can one understand the compulsive behavior of an addict simply as “preference satisfaction”? Classical economics recognizes external coercion but has no ground on which to distinguish freedom from internal compulsion.
Another fact about human beings, which can probably also be attributed to evolution, is that we are layered. We still have that old lizard brain with its animal appetites, and we have higher capacities that are cultivated only with effort. These layers correspond to a rank order of pleasures. The pleasures of mathematics, for example, or playing the guitar, only become available to one with sustained effort. The learning process is initially unpleasant. To attend to anything in a sustained way requires actively excluding all the other things that grab at our attention. It requires a capacity for self-regulation—what psychologists call the “executive function” of the brain. Self-regulation is like a muscle. The more you use it, the stronger it becomes. But you can’t use it continuously all day long. Like attention, it is a finite resource. In light of these facts, it would seem significant that, for example, pornography is available 24 hours a day on a device that one carries around in one’s pocket. The absence of regulation by the state increases our burden of self-regulation, and this comes with a cost that is “off the books” of economistic thinking.
To subsume such distinctions as that between the pleasures of porn and of mathematics, or between practicing the guitar and watching cat videos, under the generic category of “preference satisfying behavior” is to erase the kind of distinctions that matter to human beings. A determination not to be “paternalistic” about such things expresses an admirable modesty, rooted in good old-fashioned liberal agnosticism about the human good. But if we are too dogmatic about this, the effect is to arrest criticism of powerful commercial entities that operate in terrain that is not yet defended by law, in ways that have already consequentially altered the human landscape.
Big Tech firms speak the dialect of autonomy and market choice with expert fluency in their public-facing pronouncements, even while building systems predicated on a very different, more realistic, picture of human agency in which habit is king.
The innovators of Silicon Valley were faced with competitive economic pressure to increase their share of users’ finite attention, and this translated into a behavioral engineering challenge with its own internal logic, pursued without consideration of how it might impinge on the common good. They created something that, like a virus, has taken on a life of its own.
Consider the discovery that when users contribute their own content on a platform, this increases their “engagement.” Facebook famously conducted large-scale experiments on its users and found that it could induce “emotional contagion.” If one curates users’ news feeds to show items likely to enrage them, this captures their attention. They get angry and spend more time on the platform. They become more active disseminators of Facebook links to others and more active generators of further content. Users organize themselves into self-radicalizing rage-tribes; our politics has gotten channeled into divisions that are, to a significant extent, artifacts of the engagement algorithms by which social media platforms have expanded their footprint in American life.
This has been compared to “gain of function” research in virology, in which the natural features of a virus are manipulated in a laboratory setting to make it more virulent. Social media is initially appealing to us because of our natural sociability (which evolved in face-to-face societies). But, like an engineered virus that escapes the lab, it now propagates beyond anyone’s control.
The engagement algorithms of social media achieve “operant conditioning,” a powerful means of behavior modification first identified by B. F. Skinner. This is an explicitly avowed business model, the foundation of what is called “persuasive design” in Silicon Valley. Many tricks of the trade have been developed in concert with the machine gambling industry (slot machines and video poker terminals). They share an ambition to engineer addiction—and indeed, some of the key players have overlapping CVs. The plasticity of our neural pathways is such that repetition combined with random reinforcement can be used to induce compulsions that are no less real, in physiological and behavioral terms, than the compulsions of substance abuse. The reinforcement here consists of “likes” and retweets and positive comments, each of which gives your brain a little micro-shot of dopamine. What is genuinely novel is the potency and scale that behaviorist conditioning may achieve through machine learning. At some point, the libertarian risks becoming an antiquarian stuck in 1776, or 1980, if he hasn’t updated his assessment of the field of forces.
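The mechanism named here, random (intermittent) reinforcement, can be shown in a small simulation. In Skinner’s terms this is a variable-ratio schedule: each check of the feed pays off unpredictably, with some small probability. The parameters below are illustrative, not empirical.

```python
# Toy simulation of a variable-ratio reinforcement schedule (Skinner):
# each "check" of the feed is rewarded -- a like, a reply -- with small,
# unpredictable probability. Probabilities here are made up.

import random

def simulate_checks(n_checks, reward_prob, seed=0):
    """Return how many of n_checks were rewarded, with each check
    paying off independently with probability reward_prob."""
    rng = random.Random(seed)
    return sum(rng.random() < reward_prob for _ in range(n_checks))

# Rewards arrive rarely and at random -- in the conditioning literature,
# the schedule that produces the most persistent, compulsive responding,
# which is exactly why slot machines use it.
rewards = simulate_checks(n_checks=100, reward_prob=0.1, seed=42)
```

The point of the simulation is the schedule, not the numbers: a payoff that cannot be predicted keeps the checking behavior alive far longer than a reliable one would.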
So perhaps the political calculus must change. As a prudential matter, I may decide that I want the de jure, elected government to fight the de facto, unelected government on my behalf, by regulating the attention economy. I have zero faith in the wise benevolence of those who staff the permanent bureaucracy. But we now have enough accumulated experience to say also that the business model driving Silicon Valley’s efforts to monetize every bit of private headspace has had some serious ill effects.
It would be pleasing to conclude my argument here. But in the last several months, I have found that my own view needs to be updated as well. A newly radicalized state, with a newly militarized determination to suppress dissent, gives one a newfound appreciation for good old-fashioned libertarian vigilance against “the state” as usually understood. In a corresponding inversion, Big Tech now sometimes appears as a rival center of power that could help to keep thought free, if it so chooses.
Social media tribalizes thought, but it also liberates thought from the monopoly power of the propaganda state that operates through the legacy corporate media. The panicked response of the Democratic establishment to this fracturing has been to try to gain control of social media, summoning Jack Dorsey and Mark Zuckerberg for ritual humiliations in Congress. I have no inside knowledge, but it is reasonable to assume that the bargain offered is continued regulatory forbearance on antitrust and Section 230 immunities in exchange for cracking down on dissent. This presents a genuinely disturbing prospect.
Precisely because of its unprecedented power, including power to sway elections, the Valley is in a position to resist the state’s demand that the platform firms suppress facts and argument. Such resistance will require some spine and concern for the common good.
The opposed categories “private sector” and “government” would appear to have little utility for understanding the present; we may need to put down our Milton Friedman and pick up our George Orwell.