I’m going to tell you a few things you already know.
Every time you open your phone or your computer, your brain is walking onto a battleground. The aggressors are the architects of your digital world. Their weapons are the apps, news feeds, and notifications in your field of view every time you look at a screen.
They are attempting to capture your most scarce resource—your attention—and take it hostage for money. In order to succeed, they need to map the defensive lines of your brain, your willpower, and your desire to concentrate on other tasks, and figure out how to get through them.
You’ll lose this battle. You have already. The average person loses it dozens of times per day.
This may sound familiar: In an idle moment, you open your phone to check the time. Nineteen minutes later you regain consciousness in a completely random corner of your digital world: a stranger’s photo stream, a surprising news article, someone dancing on TikTok, a funny YouTube clip. You didn’t mean to do that. What just happened?
This isn’t your fault. It’s by design. The digital rabbit hole you just tumbled down is funded by advertising, aimed at you. Almost every “free” app or service you use depends on this surreptitious process of unconsciously turning your eyeballs into dollars, and they’ve built sophisticated methods of reliably doing so. You don’t pay money for using these platforms, but make no mistake—you are paying for them, with your time, your attention, and your perspective.
These decisions are not made with malice. They are made behind analytics dashboards, split-testing panels, and walls of code that have turned you into a predictable asset, a user that can be mined for attention.
Tech companies and media organizations alike do this by focusing on one oversimplified metric, one that supports advertising as its primary source of revenue. This metric is called engagement, and emphasizing it above all else has subtly and steadily changed the way we look at the news, our politics, and each other.
This addiction to our devices isn’t distinct from the problems we’re facing in our politics; it’s the same system.
For the first time, the majority of information we consume as a species is controlled by algorithms built to capture our emotional attention. As a result, we hear more angry voices shouting fearful opinions and we see more threats and frightening news simply because these are the stories most likely to engage us. This engagement is profitable for everyone involved: producers, journalists, creators, politicians, and, of course, the platforms themselves.
The machinery of social media has become a lens through which society views itself—it is fundamentally changing the rules of human discourse. We’ve all learned to play this game with our own posts and content, earning our own payments in minute rushes of dopamine, and small metrics of acclaim. As a result, our words are suddenly soaked in righteousness, certainty, and extreme judgment.
When we are shown what’s wrong in the world, we feel the desire to correct it. We want to share these transgressions with our networks. If we see more problems, these problems must have perpetrators who are responsible for them. These enemies are now everywhere, and we feel the need to call them out.
The result is a shift in our collective perception. We see a world under threat: a constant moral assault on our values, a poisonous political landscape, and an abrupt narrowing of our capacity for empathy. These new tools are fracturing our ability to make sense, cohere, and cooperate around the deepest challenges facing our species.
Let’s start by unpacking a choice that was made years ago. It was one that most people didn’t think about, a simple purchase that most were excited to make. For me, it was a tool that I was personally confident would improve my day-to-day productivity. After my decision to follow an orange-robed monk into the countryside, it was the second-most significant decision of my life. This seemingly harmless but profoundly consequential choice was my first purchase of a smartphone. The day I opened this glass rectangle and powered it on, I had no idea that it would change my relationship with my brain.
If you fell asleep in 2007 and woke up in the 2020s, you’d find yourself in an episode of The Twilight Zone. Craned necks focusing on glowing rectangles, everyone under some persistent spell. The outside world would be mostly the same, but its inhabitants would be markedly different. Those around you would be speaking an unfamiliar language, obsessed with this luminous box in their hands.
You may remember, hazily, what it was like back in the early aughts. Falling into a book or a long magazine article was easy. Boredom was possible. Long walks by yourself, disconnected from the matrix of urgent knowledge. Not overly worried about missing something important. Not aware of the critical undercurrent of the next looming crisis.
I felt this pain early. My brain chemistry was especially susceptible to this change, particularly vulnerable to the architecture of our phones, our apps, and our feeds. I spent thousands of hours caught within the smartphone-enabled dopamine trap attached to my body. I could feel my daily ability to focus narrowing, dissolving, and diminishing as this extraction of my attention became more efficient.
As my attention waned, at first I was confused: My smartphone was supposed to make me more productive, but I was also losing part of my capacity to focus. Why? I studied my own actions like a clinician, trying to fix, manage, and reconfigure my digital environment so it served me, haltingly taking back control.
Years into this journey, I was officially diagnosed with ADHD. It was a painful realization, and one that began to clarify why my struggles were so intense. I was attempting to cope with the increasing informational demands on my brain, and my brain was particularly vulnerable. ADHD has become one of the more commonplace cognitive disorders of the smartphone era, suggesting that there is a link between our use of these tools and our collective inability to focus.
As the writer and productivity expert Ryder Carroll has described, having ADHD in the smartphone era is much like trying to catch the rain with your hands.
You step outside and you bring your attention to the darkening sky. The first drops fall. You catch one, then another. Soon the storm picks up, and the rain falls faster. You miss a drop, then another. Soon there are so many things raining down on your attention that you don’t know what to focus on. Do you go for the ones coming from a distance, or the ones close to you? The more you frantically deliberate, the more you miss.
For me, it was a slow, painful process of reconstructing my attention. Over a decade I personally built an elaborate Rube Goldberg–type machine to keep myself on track. My personal-focus machine involves a half-dozen browser extensions, news feed blockers, meditation rituals, VPNs, and productivity timers. Each helps me capture a small additional fraction of my attention that would otherwise slip into an infinite digital rabbit hole.
These strategies aren’t always successful. I still can’t escape these tools in my professional and personal lives, and you’re most likely in the same position. My income depends on using them. The success of this book depends on them. No notable figure can live entirely separated from social media without concessions. I must play the game to survive and achieve, and you probably do too.
If you do meet someone without a smartphone (and, I assure you, they still exist), it’s a bit like meeting someone with an obscure medical condition. You’re curious about their life: You want to know, are they okay? You’re proud of them for overcoming their hardships, but you wouldn’t be able to do what they do.
Yet we may be the ones who are strange. I can attest that a chunk of my personal agency has been lost since the advent of smartphones. I can measure the loss in weeks, months, and years of my life. I know I’m not alone.
A 2018 study found that 63 percent of smartphone users say they’ve attempted to limit their usage, with only half of them feeling like they were successful. In 2022, Americans spent nearly five hours a day on mobile devices. For many of us, that’s nearly a third of our waking life. We have let them burn into our sleep, we have allowed them to eat our relationships, and we have made ourselves sick with frustration, anger, exhaustion, and FOMO.
We know it, but we cannot stop.
Attention is zero-sum. We unfortunately have only so many hours in our waking lives, and when we use our attention, it’s gone forever. It’s a nonrenewable moment of our finite existence lost. And increasingly it has become our most scarce resource.
What does the world look like when our attention is pilfered, overwhelmed, and extracted from us? What happens when we collectively lose our ability to focus? What are the net costs to the economy and society? These questions were seldom asked by the early architects of our digital environment.
The trade-off between what these tools are doing for us and what they are doing to us is hard to parse. In spite of the wonders of our seemingly magical connection, parts of our lives are lost. And for many of us, it wasn’t really a choice.
Once enough humans begin using a thing, it becomes the default and expected norm. Today, you can’t easily function in society without a mobile device. Depending on your industry, that may or may not include the usage of social media. Google, Facebook, and Instagram are critical tools for marketers and any business working on the internet.
Use of Twitter is a non-negotiable job requirement for most journalists. Politicians must use social media to reach their constituents. These tools have come to touch almost every part of our public lives. We’ve adapted to these tools so rapidly that the question deserves to be asked: Did we choose this weird world? And if we didn’t, who did?
On a brisk New York evening in 2014, I climbed into an Uber home from a party in Midtown with a handful of friends, on my way back to my apartment on the Lower East Side. I found myself sitting next to a bright-eyed young man with reddish-brown hair and a quiet manner. As we chatted, we discovered we had grown up just miles from each other in California, and had both lived and worked in tech in San Francisco at the same time. I mentioned I was in the early stages of researching this book. He told me he was in the beginning stages of diagnosing a problem he saw unfolding at Google where he worked. We decided to meet up the very next day for lunch.
His name is Tristan Harris. Over Vietnamese food the following afternoon, he told me how, less than a year earlier, he had realized that there was a strange mind-set among product designers in Silicon Valley. He saw a widening gulf between the incentives driving the designers and engineers who created the products, and the best interests of their millions of customers.
Specifically, he noticed that the drive to maximize internal metrics like “time on site” was increasingly in conflict with what was best for users’ attention. Design tweaks like infinite scrolling, intrusive alerts, and other so-called attention hacks were being widely used in the industry to keep people hooked on their products.
He had created a presentation entitled “A Call to Minimize Distraction & Respect Users’ Attention” that he shared internally at Google. The 141-slide deck made a case that the company had a fundamental responsibility to ensure its users didn’t spend their days buried in devices at the expense of their quality of life. After he shared it with a few people, it went viral within the company, and was viewed by thousands of Google employees.
Tristan was articulating a big new idea—one that had yet to be fully fleshed out. He was concerned that distraction was at the core of a fundamental new dilemma for the tech industry. Not a business problem, but a philosophical one. He could see that “choice” itself was becoming a squishy concept when it came to the usage of these tools.
Most entrepreneurs and product designers adhered to the basic tenets of liberal economics: the idea that individuals choose the best products in the marketplace and reward the entrepreneurs who create them. Build a better mousetrap, it’s said, and consumers will choose it over your competitors’.
But in practice, many Silicon Valley product designers were also attentive students of behavioral economics, a field that recognizes that consumers are often predictably irrational, and that there are very clear psychological triggers to make people act in a certain way, regardless of their preferences.
Behavioral economics shows how to bend human decision-making. A classic example is the slot machine—a game that uses irregular payouts to hack the pleasure centers of its players, with the rush of occasional wins netting out to a loss over time. Slot machines use a strategy known to behavioral scientists as intermittent variable rewards to get players hooked, something that Tristan saw product designers emulating in the design of news feeds and push notifications.
Tristan believed that this was about far more than distraction. He sensed that many of the features of these tools were beginning to pull us away from ourselves and into an impulsive advertising-driven dystopia. In his opinion, designers, engineers, and advertising companies alike were preying upon our attention at a steep cost. These costs include degrees of human agency—our literal free will.
Tristan became a close friend, and over the following years his message exploded. After leaving Google in December 2015, he launched Time Well Spent, which became the seed crystal of a broader effort to reform technology to serve human interests. A few years later, he launched the Center for Humane Technology with the goal of mobilizing support for tech that aligns with human values while protecting people’s agency. By 2016 he had become a mainstage TED speaker and was in high demand on the thought-leadership circuit. At that moment, Tristan had carved out a very distinct side of the argument against exploiting our attention.
A few years later, I found myself facing the other side of this issue, quite literally, at a dinner party hosted by John Stossel, a prominent New York journalist and libertarian. John, in the interest of provoking thoughtful conversation, regularly hosted dinner salons focused on controversial issues, believing that through the process of putting conflicting ideas together over dinner, the best perspectives might be revealed through kindly debate.
The man I faced on the other side of this issue was Nir Eyal. Nir has written several books, including Indistractable: How to Control Your Attention and Choose Your Life, which focuses on how to make people immune to the tools and tricks of attention capture. If this sounds positive, it might be viewed as a sort of penance for Nir’s first book, the bestseller Hooked: How to Build Habit-Forming Products, which he used to launch a successful consulting business for companies looking to maximize the capturing of human attention. Nir was a pioneer in disseminating the strategies of gamification and addiction, and his work was widely adopted in Silicon Valley.
Nir has become something like an Anti-Tristan, telling people that these assumptions and fearmongering about human attention and agency are, in his words, bullshit. That the process of deferring responsibility for our choices to tech companies is, in his view, a terrible lie. He believes that removing ourselves from the decision-making process is a ridiculous short-changing of our personal agency. In his view, it’s disempowering and unfair to the individual: a story about how we spend our time that takes away our freedom.
Nir often quotes the French philosopher Paul Virilio: “When you invent the ship, you invent the shipwreck,” referencing the calamitous by-products of every new technology. As a foil, Nir is very effective, and he makes strong points. I can see the logic in both arguments. But he’s wrong in believing that we’re not facing a fundamental crisis at the scale that Tristan has described.
Over the years, Tristan and Nir, through very different means, have come to hold two separate corners of the same truth. As with many debates, a larger story emerges from comparing their central arguments. We are undergoing a broad systemic shift in which our attention is being targeted, extracted, mined, and plundered more effectively than ever before.
But we’re likely to manage this crisis better if we recognize we do have some control and take the steps to change it. If we empower people with a better understanding of what these tools are doing, we can modify our behavior, and we can demand more from the platforms themselves.
Excerpted from Outrage Machine. Copyright © 2023 by Tobias Rose-Stockwell. Reprinted with the permission of Legacy Lit.