The WIRED Guide to Internet Addiction


More than a decade after the first iPhone was released, it suddenly dawned on us that we could be addicted to our smartphones. We'd certainly developed quite the habit: Almost 50 percent of people say they couldn’t live without their phones, which we check every 12 minutes and touch an average of 2,600 times a day.

“Likes are ‘bright dings of pseudo-pleasure’ that can be as empty as they are alluring.”

—Former Facebook engineering manager Justin Rosenstein, creator of Facebook’s Like button

You don’t need to see the stats to know it’s hard to put down your device—the muscle memory of pull-to-refresh, the devil of the red notification on your shoulder, the rush that follows a flood of likes, the Instagram envy, the FOMO, scrolling endlessly by screenlight instead of falling asleep.

Researchers have been warning about the power of persuasive technology for years. But our sense of unease only went mainstream when we learned we were being manipulated. First, fake news and Russian meddling on social media demonstrated that tech platforms and the algorithms that power them could affect real-world behavior, like how you vote. Then a wave of Silicon Valley defectors revealed that social media apps were intentionally designed to trigger hits of dopamine that keep us coming back for more. And some well-publicized books exposed the toll of technology on our mental and physical health.

As suspicions swirled, a realization took shape. Maybe our death grip on our phones, which now occupy five hours of every day, isn’t a personal failing—lack of willpower, rudeness, narcissism. Maybe we’d been duped.

The financial incentive to keep us hooked is clear: Tech companies make money off of our attention, and shareholders measure success by the amount of time a company can keep us “engaged.” When we started focusing our anxieties on the effect that smartphones might have on children, the moral panic was complete.

Except that actual experts are still debating whether “addiction” is the right term for the relationship between humans and smartphones. Some say technology is not a drug like tobacco but rather a behavioral addiction, like gambling. Others say the addiction metaphor is unnecessarily alarmist and that the studies linking depression and smartphone usage only show correlation, not causation. But even major tech companies have acknowledged that their products can make us feel bad and promised to be more mindful of their users—perhaps the best data point yet that our smartphone attachment is cause for concern.

The History of Addictive Technology

Technophobia is at least as old as Socrates, who warned that the written word would weaken our memories. Similar fears about diminished intelligence, information overload, social isolation, increased laziness, or distraction followed the printing press, gramophone, telephone, radio, and TV. The arrival of the always-on Internet was no different. Cyberspace seemed designed to suck you in for hours on end—the Pavlovian conditioning triggered by AOL’s “You’ve Got Mail,” online gambling, online porn, chat rooms, instant messaging. Medical professionals started questioning whether “Internet addiction” should be a real diagnosis in the late 1990s, when America was still stuck on dial-up.

“You’re exploiting a vulnerability in human psychology … [The inventors] understood this, consciously, and we did it anyway.”

—Sean Parker, ex-Facebook president

But smartphones and social apps—those interactive data givers and takers always within reach—are different and more nimble beasts. Software adapts to the data we feed it, catering, perhaps, to our own individual vulnerabilities. The formula for influencing behavior adjusted accordingly.

B. J. Fogg, founder of Stanford’s Persuasive Technology Lab, whose students went on to work for Facebook, Instagram, Uber, and Google, developed a psychological model that combines three factors to prompt a particular behavior: trigger, motivation, and ability. Take Facebook photos, for example: You get a push notification that you’ve been tagged in a photo (trigger), you want to make sure you look OK in the pic (motivation), and you can easily and immediately check the photo on your phone (ability).
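Fogg’s model is often summarized as B = MAT: a behavior happens only when motivation, ability, and a trigger converge at the same moment. Here’s a minimal sketch of that logic; the multiplicative scoring and the threshold value are illustrative assumptions, not Fogg’s actual formula:

```python
def behavior_occurs(motivation: float, ability: float, trigger: bool,
                    threshold: float = 1.0) -> bool:
    """Fogg-style check: a behavior fires only when a trigger arrives
    while motivation and ability together clear an activation threshold."""
    return trigger and (motivation * ability) >= threshold

# The tagged-photo example: decent motivation (look OK in the pic),
# high ability (the phone is already in hand), plus a push notification.
print(behavior_occurs(motivation=0.8, ability=1.5, trigger=True))   # True
print(behavior_occurs(motivation=0.8, ability=1.5, trigger=False))  # False: no trigger, no action
```

The key design insight is the interplay: when an app makes the action effortless (high ability), even weak motivation is enough, so the trigger alone decides whether you act.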

Tricks That Keep You Glued to Your Smartphone

  • Push Notifications
    Alerts that flash across your phone, even when the screen is locked, play to the same desire for social connection as when a friend calls or texts. Except the demand to drop everything and redirect your attention comes from an app, rather than a loved one.

  • Pull-to-Refresh
    Apps are capable of continuously updating, but this slot-machine-like gesture provides the illusion of control and the allure of unpredictable rewards.

  • Variable Rewards
    The uncertainty of what you’ll find when you respond to a notification or pull down to refresh is what keeps you coming back for more.

  • Infinite Scroll
    Without visual cues to indicate an end point, humans don’t know when to stop. We’re looking at you, Facebook, Instagram, and Twitter. And looking … and scrolling … and looking … and scrolling.

  • Autoplay
    Netflix’s autoplay feature, which automatically loads the next episode, is one example of the way that companies encourage you to stay engaged. Uber pulls a similar move with its drivers, by sending them the next fare before the current ride is over.

  • Bright Colors
    App icons and tiny red dots are eye-catching for a reason.

  • Short-term Goals
    Snapchat’s Snapstreaks feature shows the number of days in a row that two people have communicated with each other, prompting an unhealthy obsession for teenage users, who feel compelled to keep the streak alive.

  • Gamification
    Turning something into a game typically involves three elements: points, rewards, and a leaderboard. Fitbit, the wearable that nudged millions of people to complete 10,000 steps a day, has all three.
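The three gamification elements named above can be sketched as a toy step tracker; the class, the user names, and the 10,000-step goal (echoing Fitbit’s well-known default) are illustrative, not Fitbit’s actual API:

```python
from collections import defaultdict

class StepGame:
    """Toy gamification loop: points, a reward at a goal, and a leaderboard."""
    DAILY_GOAL = 10_000  # illustrative goal, echoing Fitbit's default

    def __init__(self):
        self.points = defaultdict(int)
        self.badges = defaultdict(list)

    def log_steps(self, user: str, steps: int) -> None:
        self.points[user] += steps  # points accumulate
        if self.points[user] >= self.DAILY_GOAL and "goal-met" not in self.badges[user]:
            self.badges[user].append("goal-met")  # reward at the goal

    def leaderboard(self):
        # rank everyone by points, highest first
        return sorted(self.points.items(), key=lambda kv: -kv[1])

game = StepGame()
game.log_steps("ana", 7_000)
game.log_steps("ben", 12_000)
game.log_steps("ana", 4_000)
print(game.leaderboard())   # [('ben', 12000), ('ana', 11000)]
print(game.badges["ben"])   # ['goal-met']
```

Each element does distinct psychological work: points give continuous feedback, the badge gives a concrete finish line, and the leaderboard adds social comparison.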

A former student of Fogg’s, Nir Eyal, developed his own model. In his book, Hooked: How to Build Habit-Forming Products, Eyal lays out a four-part process: trigger, action, variable reward, and investment, arguing that negative emotions can be powerful triggers. Boredom, loneliness, frustration, confusion, and indecisiveness cause a slight pain or irritation, prompting us to engage in mindless action to make the negative sensation go away. Positive emotions work too. On an app like Instagram, for instance, the trigger could be the desire to share good news.

The engine driving these feedback loops is the same mechanism that makes slot machines attractive: The uncertainty of what you’ll find when you respond to a notification or pull down to refresh is what keeps you coming back for more. In his book Irresistible: The Rise of Addictive Technology and the Business of Keeping Us Hooked, Adam Alter says the loop is powerful not just because of the occasional wins (like a fave), but because the experience of a recent loss (no faves) is deeply motivating.
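The slot-machine dynamic described here is what behavioral psychologists call a variable-ratio reward schedule: payoffs arrive unpredictably, so every pull might be the one that pays. A toy simulation (the function name and the 1-in-10 hit rate are arbitrary assumptions for illustration):

```python
import random

def refresh_feed(rng: random.Random, hit_rate: float = 0.1) -> bool:
    """One pull-to-refresh: a 'win' (new likes, a juicy post) arrives
    unpredictably, like a slot machine's variable-ratio payout."""
    return rng.random() < hit_rate

rng = random.Random(42)  # seeded so the run is reproducible
pulls = [refresh_feed(rng) for _ in range(50)]
print(f"{sum(pulls)} rewarding refreshes out of {len(pulls)} pulls")
# Wins land at irregular intervals, so every pull *might* pay off --
# which is exactly what makes the loop so hard to quit.
```

Because the schedule never signals when the next reward is due, there is no natural stopping point: each empty refresh is a near-miss rather than a reason to stop.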

There’s nothing inherently nefarious about the models. The same structure can be used to persuade people to make better choices, like the way Fitbit turns fitness into a game or apps that nudge you to meditate. In that light, the power to change behavior doesn’t look so bad, but it still gets at the underlying question: Can persuasive technology override our free will? Fogg himself warned the Federal Trade Commission about the potential political and social consequences of building “persuasion profiles” a year before the iPhone was released. “We can now create machines that can change what people think and what people do, and the machines can do that autonomously,” he testified in 2006. “Whenever we go to a Web site and use an interactive system, it is likely they will be capturing what persuasion strategies work on us and will be using those when we use the service again.”

A few years later, a warning came from a more unlikely source. In 2010, Steve Jobs told The New York Times that his kids hadn’t used an iPad, which was just hitting the market. “We limit how much technology our kids use at home,” he said.

On their own, the crises around fake news and election interference may not have shaken us from our screentime stupor. But a cadre of whistleblowers, who got wealthy off the products they now warn against, revealed the underlying connection. Algorithms value engagement—and content that hits us low on the brain stem, inspiring fear and anger, tends to get more of a reaction. All Russia had to do to sow division was spike Facebook’s News Feed with stories that activated our lizard brain.

“I’ve spent many hours and weeks and months and years thinking about whether anything I’ve done has made a net positive impact on society or humanity at all.”

—Loren Brichter, inventor of the pull-to-refresh mechanism

And, as we are learning, the impact of these sticky algorithms is especially acute on kids: In iGen, published in 2017, Jean Twenge, a psychology professor at San Diego State University, noted that, according to a study she conducted, eighth graders who are heavy users of social media have a 27 percent higher risk of depression. Other experts, like Andrew Przybylski, a psychologist at the Oxford Internet Institute, caution that Twenge’s data shows correlation between depression and social media, not causation. Still, two major Apple shareholders, a hedge fund and a pension fund, cited Twenge’s study in January 2018 when they wrote an open letter to Apple urging the company to assist with more rigorous research into smartphones’ effects on children and to build better controls for worried parents.

The letter says that blaming parents or arguing that the research isn’t definitive misses the point; it cites data from the nonprofit Common Sense Media showing that the average American teenager with a smartphone spends more than 4.5 hours a day on the device, excluding texting and talking. “It would defy common sense to argue that this level of usage, by children whose brains are still developing, is not having at least some impact, or that the maker of such a powerful product has no role to play in helping parents to ensure it is being used optimally.”

The Future of Addictive Technology

Concerns around tech addiction are increasingly complex. What if smartphones and social media don’t just addle our attention span and waste our time, but can also shape and twist what we know and what we believe? As our awareness of the potential danger grows, the tactics used to keep us hooked are advancing in tandem. Artificially intelligent algorithms, armed with an unprecedented amount of personal data, are particularly hard to resist.

For example, YouTube’s algorithms recognized that progressively more extreme content keeps users stuck to their screens, so its autoplay feature recommends increasingly incendiary videos. This led Zeynep Tufekci, a professor at the University of North Carolina, to call the video-sharing site, which now gets more than a billion views a day, “the great radicalizer.”

YouTube’s recommendations to keep us engaged will only get more sophisticated. The company is testing deep neural networks to improve the process, and studies show the changes increased watch time dramatically. Meanwhile, Netflix, which already personalizes thumbnails to get us to watch, is now exploring personalized trailers. The company is reportedly using machine learning and AI to automatically generate trailers from the most compelling scenes in a show, based on individual preferences. If you normally watch rom-coms, for example, it will show you the most romantic moment in an action movie.

Tech companies, facing public pressure on a new front seemingly every week, have at least acknowledged the consumer backlash. After the letter from shareholders, Apple defended itself in a public statement, saying, “We think deeply about how our products are used and the impact they have on users and the people around them. We take this responsibility very seriously and we are committed to meeting and exceeding our customers’ expectations, especially when it comes to protecting kids.” In March, the company launched a page for families and is expected to improve parental controls in the next version of iOS.

“I can’t control [Facebook]. … I can control my decision, which is that I don’t use that shit. I can control my kids’ decisions, which is that they’re not allowed to use that shit.”

—Chamath Palihapitiya, ex-Facebook vice president of user growth

Still, it’s not clear whether companies are really willing to watch engagement numbers drop. For instance, Facebook shareholders tried the same move as Apple shareholders, asking the company to shut down Messenger Kids, its controversial messaging product aimed at children as young as 6, but to no avail.

The tug of war over the phrase “time well spent” is another good indication of the tech industry’s embrace of reform rhetoric over actual reform. Tristan Harris, the former design ethicist for Google, popularized “time well spent” as a contrast to apps like Facebook that “hijack our minds” and distract us from our priorities. And for his annual personal challenge in 2018, Mark Zuckerberg vowed to fix Facebook, including ensuring that time spent on Facebook was “time well spent.” But, as Harris has pointed out, making it easier to ignore the News Feed clashes with Facebook’s business model, in which advertisers (who are the company’s actual, paying customers) want your attention.

“The game is getting attention at all costs. And the problem is it becomes this race to the bottom of the brainstem, where if I go lower on the brainstem to get you, you know, using my product, I win. But it doesn't end up in the world we want to live in. We don't end up feeling good.”

—Tristan Harris, ex-Google design ethicist

To demonstrate that it was tackling the issue, Facebook published a blog post in December 2017 called “Hard Questions: Is Spending Time on Social Media Bad for Us?” (Asking and answering its own questions is the Facebook way.) Facebook acknowledged research showing that passively consuming social media can put you in a worse mood. But it also cited a study conducted at Carnegie Mellon University showing that users who were more actively engaged on Facebook, including sending and receiving personalized messages and comments, reported better psychological well-being and fewer feelings of loneliness or depression. The study, however, was conducted in partnership with Facebook.

The best hope for better practices may be the whistle-blowers themselves. Harris is now the executive director and cofounder of the Center for Humane Technology, which is agitating for change, supported by Common Sense Media, a nonprofit dedicated to helping kids. While users are waiting on Facebook, Apple, and YouTube to act, entrepreneurs are developing tools and services to help with tech addiction—available, where else, at an app store near you.

Learn More

  • The Subtle Nudges That Could Unhook Us From Our Phones
    Instead of using stockpiles of personal information to exploit our tendencies to procrastinate, companies could use the same information to detect and notify users who spend more time than they would like on their phones. For instance, what if typing “Facebook” into your address bar prompted you to select whether the purpose of your visit is a “Quick Break,” “Easy Reading,” or “Organize an Event,” and also showed you how well those purposes had worked out for people in the past?

  • Extra Sticky
    WIRED’s Gadget Lab podcast interviews senior science writer Robbie Gonzalez to discuss the science of technology addiction and how to cope with the information overload from our news feeds.

  • Ethical Tech Will Require a Grassroots Revolution
    Defectors from Google and Facebook have banded together to create the Center for Humane Technology in order to spark a mass movement around “time well spent.” If Silicon Valley and Washington won’t address tech’s unhealthy impact, then maybe the public will.

  • It’s Time for a Serious Talk About the Science of Tech Addiction
    Everything you ever wanted to know about whether or not it’s fair to call your smartphone habit an addiction. And some suggestions for a way forward from the field of nutritional research, which had to reinvent itself after decades of demonizing fat as the root cause of obesity and chronic illness.

  • The Formula for Phone Addiction Might Double as a Cure
    B. J. Fogg, whose Stanford class on persuasive technology became known as “The Facebook Class,” thinks consumers have the power to unhook themselves from their phones. All that’s missing is the motivation. One easy way to start is by turning off notifications or turning your phone to grayscale.

  • Demonized Smartphones Are Just Our Latest Technological Scapegoat
    Technophobia is deeply rooted in our history, at least as far back as Ancient Greece, and the case against smartphones isn’t as airtight as it seems. Perhaps we fear the uncertainty as much as we fear the effects of the smartphone.

  • Can Our Phones Save Us From Our Phones?
    Explore the apps, extensions, and devices that could help you reclaim a few minutes of your day, if not your life, by putting a cap on the number of tabs you can open or interrupting you in the middle of a habit you’re trying to break.

This guide was last updated on April 12, 2018.

Enjoyed this deep dive? Check out more WIRED Guides.
