Last summer, I took a weekend trip to Glenwood Springs with my family. We stayed at a small cabin in a campground and did some hiking, relaxing in the hot springs, and adventuring at a local theme park. Each evening, we spent some time playing card games on the front porch and then went to bed when the sun went down.
Except that, every night, I had already scheduled a block of time that my family knew nothing about. I had to sneak down to the shared bathrooms with my iPhone and spend five minutes connecting to Candy Crush and collecting my daily prizes, boosters I could use later in the game when I was back at home.
It’s a shameful little secret, but it’s also what Tristan Harris refers to as a tech company’s unethical manipulation of my time. Apps use persuasive strategies (such as Candy Crush’s daily rewards) to claim “blocks” of my schedule, which then fuels my “addiction,” or future use of the app. According to Harris, the solution is to embark on a “renaissance in online design.” Harris’s Center for Humane Technology advocates for humane design (technology that protects rather than exploits our vulnerability to tech’s siren call on our attention), for political pressure to hold tech companies accountable, and for a movement to increase self-awareness among digital citizens.

Harris asserts that “we need to acknowledge that we are persuadable.” He perceives a gap in our own self-perception. But I don’t think that this is the knowledge gap most of us experience. We are consumers in American society, and we recognize that we assent to persuasion on a regular basis—from casual conversations about where to go to dinner to the barrage of advertisements we encounter when we turn on the television or log into Facebook. We may not always work through our reactions to persuasion consciously because we’ve automated those processes, but if someone asks me, Why did you start logging into Candy Crush every day?, I can answer, Well, they started giving me a reward every day, and I don’t want to miss a reward. That is a decision I have made, and I can recognize that it is the game’s way of persuading me to keep in contact.
Harris would point out that the game is pursuing its own self-interest, though, not mine. And that’s the key point that he offers, I think—not that we are creatures susceptible to persuasion, but that “technology is not neutral.” One way to look at this issue is that technology is not static but is a vehicle of change, and thus cannot be neutral. Other perspectives focus on the human element behind technology, such that technology is always tainted by the motivations of the hand behind the algorithm.
Those spending their time in digital environments may not understand the programming behind the recommendations for the next video they should watch or the next website they should visit, but every user is still presented with a choice, and that is the gap Harris seems to overlook. Yes, we spend too much time on social media, but it is our choice to engage, to assent to or reject persuasive strategies. Yes, I have spent too many days opening Candy Crush for my next reward, but that was a deliberate choice, and I didn’t feel like less of a person on the days I chose not to open the app.
Addiction is a real problem, both in the digital environment and out in what we call the “real world.” And Harris’s call for humane technology that seeks to protect rather than exploit our attention and time would be valuable. But let’s talk about those opportunities within a conversation that honors the human capacity to make deliberate choices.
Now, I need to run off and grab my next Candy Crush reward—it releases at 5:00.