

Sherry Turkle

Reclaiming Conversation: The Power of Talk in a Digital Age

Nonfiction | Book | Adult | Published in 2015


Parts 5-6 Chapter Summaries & Analyses

Part 5: “The Path Forward” - Part 6: “A Fourth Chair?”

Part 5, Chapter 1 Summary: “The Public Square”

Online, people can choose to see only information and people they agree with. Turkle used to define the digital version of a person as a second self, but now she sees the screen as a “broken mirror” that reflects a distorted version of something authentic.

“A State of Emergency”

Elizabeth, a graduate student who became involved with online politics in 2012, told Turkle that her organization, Invisible Children, condemned Joseph Kony, who used child soldiers in Uganda. The organization made a video that highlighted his savagery and asked anyone who watched to send money. In exchange, each person would receive a sign with Kony’s face on it. Their rationale was that if enough signs were displayed, the growing awareness of Kony’s evil would create enough pressure to end his reign. Soon the video had nearly 100 million views. However, very few people put up the signs.

The video’s massive viewership coupled with its overall inefficacy convinced Elizabeth that online support does not always materialize into real action.

“Friendship Politics: Things to Buy and Click On”

Sharing happy feelings is not the same as real political change. Real politics don’t exist without real meetings and conversations, and millions of online likes may not result in any further action. The millions of likes from the Kony initiative gave Elizabeth the impression that something more substantial was happening. This makes sense, given that research concludes that online connections with strangers are constrained: They are not intimate, even though they may feel like it. The writer Malcolm Gladwell suggests that you can ask very little of people you don’t know online.

“Catastrophe Culture”

When mobile phones first became part of our culture, people often answered the phone at the drop of a hat, thinking it could be an “emergency.” This habit soon began to override considerations of politeness; people answered phones at dinner, during other important conversations, when they were about to go to sleep, and so on. Turkle partly traces the phone-call-emergency association back to the September 11 attacks. People are now inundated with a stream of catastrophic news on their smartphones, and this culture has gradually conditioned people to treat notifications as potential emergencies. However, actual catastrophes require conversations that go beyond naming them catastrophes, and people don’t like facing such conversations, which may be numerous, lengthy, and political.

“Room to Think in a World of Big Data”

Now that search engines, social media sites, and website cookies track our online habits and time expenditures, we are always being watched. Mark Zuckerberg said, “Privacy is no longer a relevant social norm” (301). Internet users either find the lack of privacy tolerable, or they don’t think about it.

“The World Without Privacy”

Turkle’s grandmother took her to the Brooklyn library when she was 10 and explained that in Europe, fascist governments spied on citizens by reading their mail. In America, by contrast, no one had the right to know what Turkle read, and no one could ever blackmail her for it. Turkle finds it hard to explain privacy to her 24-year-old daughter, who grew up with the Internet and does not prioritize it.

“Surveillance Creates the Digital Double”

In the Internet’s early days, people saw it as an adventure and an opportunity to explore. The writer and technology critic Evgeny Morozov cites Internet Explorer’s early slogan as an example: “Where do you want to go today?” (304). Today, the more pertinent question is what we are willing to give, that is, what information about ourselves we are willing to expose. Our digital selves are preserved forever on servers; to be online is to accept surveillance.

“The Self of Self-Surveillance”

The philosopher Jeremy Bentham invented the idea of the Panopticon, a prison in the shape of a spoked wheel. The inmates live in the spokes; the guard is in the hub. The prisoners can never be sure if the guard is watching their spoke, so they must act as if he is.

The philosopher Michel Foucault modernized Bentham’s idea; he believed that power would arise from creating a society that willingly watched itself. While Bentham’s idea applies to unwanted surveillance, Foucault’s applies to the apps to which we willingly feed intimate details of our lives because we are under the impression that these apps care about us, whether by helping us keep in touch with friends or by tracking our fitness.

“Shaped by the System: Living in the Bubble”

Instead of behaving according to the fear of the Panopticon’s surveillance, “we conform because what is shown to us online is shaped by our past interests” (307). Our sense of reality online is restricted to what algorithms show us based on our past choices. Rather than expanding our intellectual horizons, this limits our encounters with new ideas. Meanwhile, the web builds a version of each of us that advertisers and data collectors can target.

A tech writer named Sara Watson told Turkle that online ads were inviting her to participate in an anorexia study. In her view, these ads were suggesting that she had anorexia, which was invasive and inappropriate. It unsettled her that she didn’t know which part of her online history might have given an algorithm the idea that she was anorexic.

“Thinking in Public”

Thoreau’s isolation was meant to avoid, in his words, living “too thickly” (309), by which he meant the onslaught of obligations and outside noise. The modern environment is a massively amplified version of the world Thoreau hoped to escape. Mark Zuckerberg and others like him have elevated the sharing of interests, likes, and dislikes almost to the level of the sacred. People raised with the Internet are more likely to assume that sharing online is not only enriching for themselves but good for society.

“Objects-Not-to-Think-With”

There is a growing body of literature about how we avoid things we would rather not think about. Lana, a recent college graduate, told Turkle that she disliked the constant data collection online but avoided thinking about it. Ironically, her online groups tacitly encouraged silence when it came to substantive conversation. There were few controversial ideas in Lana’s head because, by avoiding thinking about them, she could avoid sharing them. She was relieved, in her words, not to have strong opinions, because she would never derail the momentum of a conversation by introducing anything uncomfortable.

“Vague on the Details”

With each young person she interviewed, Turkle became increasingly aware that they were as expert as their elders at avoiding uncomfortable thoughts. They remained purposefully vague on the details of online privacy because they preferred not to think about it or address the implications of digital surveillance.

During the 2012 US presidential election, a Facebook algorithm studied random precincts and informed users whether and where their friends had voted. Facebook framed this as a study and concluded that social media can influence voter participation. However, law professor Jonathan Zittrain calls this behavior “digital gerrymandering” (314).

“Snowden Changes the Game”

The NSA whistleblower Edward Snowden made it easier for young people to see some of the problems with data monitoring and collection. He helped them question Facebook’s intentions. They were willing to rally around Snowden despite previously ignoring, or glibly dismissing, the lack of privacy online. Snowden started a conversation among young people about the sinister aspects of digital surveillance.

Part 5, Chapter 2 Summary: “The Nick of Time”

Thoreau put his chairs in the corners when he felt overwhelmed by noise and conversation. This is the equivalent of stepping back from online conversation. Campers at a device-free camp talk with Turkle about bonding with counselors. They enjoy who they are at camp, but they can’t sustain that self back at home. At home, everyone else is using technology, so it’s hard not to follow suit.

“Guideposts”

Turkle provides a list of recommendations for those wishing to reclaim conversation. Her suggestions include protecting our attention spans, unitasking rather than multitasking, making time for conversation, speaking with others who hold opposing viewpoints, avoiding narrow or binary thinking, and more. Put into practice, these recommendations would rob phones and other technologies of some of their power.

“Places”

As part of the project of reclaiming conversation, Turkle recommends focusing on the places where conversations can actually take place. Change is not merely a matter of people putting down their phones; they must seek out places where they can hold sustained, regular conversations until those conversations become habitual.

“Public Conversations”

Turkle discusses several people who tell her that they almost always feel alone. She believes that public conversation is an answer. There are no perfect public forums, but conversation is always possible. 

Part 6, Chapter 1 Summary: “The End of Forgetting”

Thoreau liked to take his guests out into nature when he wanted to have a profound conversation. Analogously, Turkle defines the fourth chair as a space for philosophy, “a second nature of our own making, the world of the artificial and virtual” (337). She uses the fourth chair to introduce the idea of people talking to machines.

Advocates of artificial intelligence work toward the day when people can have caring robots as companions, confidants, and friends. This will require yet another definition of conversation. Turkle wonders whether a robot that appears empathetic is genuinely beneficial and whether it could provide real communion.

“A Computer Beautiful Enough That a Soul Would Want to Live in It”

Marvin Minsky was one of the founders of Artificial Intelligence (AI). One of his students described Minsky’s work as “trying to create a computer beautiful enough that a soul would want to live in it” (338). Robots inspire hope for better lives, but Turkle worries that conversation with a robot is not a conversation.

“Simple Salvations”

People tell Turkle that they look forward to the day when Siri, Apple’s AI virtual assistant, will feel like a friend. Apple’s first advertisement for Siri featured a group of celebrities talking to Siri as if “she” were a trusted friend. On a radio panel, Turkle listens to other guests express how much they enjoy talking to Siri. One of them says that if Siri could be taught to behave as a psychiatrist, she would, in some sense, actually be a psychiatrist. People can feel fewer inhibitions when talking to a machine.

“Vulnerability Games”

Computer scientist Joseph Weizenbaum created the ELIZA program in the 1960s. ELIZA responded to users in the manner of a psychotherapist, reflecting their statements back to them. Weizenbaum learned that people wanted to confide in ELIZA even though they knew it was a program.

A 26-year-old man talked with a robot named Kismet. Kismet is an accurate mimic of speech—in both tone and rhythm—and makes eye contact. Its ability to portray facial expressions is uncanny. The man talked to Kismet frequently and told it about his day as if it actually understood. Others reported feelings that Kismet understood them and that they felt genuine warmth from the robot.

A 12-year-old girl named Estelle talked with Kismet during a difficult time in her life. She came away feeling that Kismet disliked her: Due to a glitch, the robot turned away from her while she was speaking. Other children who find Kismet unsatisfactory react aggressively.

“Treating Machines as People; Treating People as Machines”

We treat people like machines, and vice versa. We put people on hold while we check our phones. At an etiquette panel, a busy woman asked if it was okay to ignore a cashier, since her time in line was the only time she had to check her phone. Turkle replied:

Until a machine replaces the man, surely he summons in us the recognition and respect you show a person. Sharing a few words at the checkout may make this man feel that in his job, this job that could be done by a machine, he is still seen as a human being (346).

“Automated Psychotherapy”

People ask Turkle why she cares if people actually get something positive from talking to machines. Turkle believes that a more relevant question is why we don’t all care more about the implications of trusting our emotions to robots instead of other humans. If we believe that machines can provide everything we need, we have less incentive to talk with people.

“There Are No People for These Jobs”

Wired magazine ran a piece called “Better Than Human” (2012). The pro-robot argument emphasized that, by giving us practice relating to them, robots make us more human.

The piece also defines anything a robot can do as not uniquely human. Researchers at MIT imagine robots as friends for the elderly. People say they would prefer a robot worker to a high school dropout who couldn’t get another job. Robots offer risk-free relationships.

“Smart Toys: Vulnerability to the As-If”

Turkle studied children and toys in the 1970s. Children liked the toys, but they preferred people and knew the toys weren’t “real”: The children knew toys didn’t have feelings. Rather than accept this, developers kept working to make toys more emotionally convincing. Turkle thinks children are now starting to believe toys are “real,” and she worries that this is not just a phase in child development. A boy told Turkle that robots never run out of stories; they are emotionally safe and reliable precisely because they have no emotions of their own.

“An Artificial Mentor”

After immigrating to the United States at age eight, 17-year-old Thomas turned to video games for comfort. He told Turkle that he often got advice from video game characters and calibrated his sense of right and wrong from the actions of the characters he met in the games. For instance, he returned a stolen collectible to a friend after someone gave it to him, because he had seen a game character return stolen property.

He described NPCs—non-playable characters—to Turkle. Usually, the NPCs are just programs that perform according to coded scripts, but sometimes the developers can inhabit the NPCs and make them behave however they want. The inability to distinguish between robots and real people (in games) intrigued him. He saw no reason to object to having a robot as a friend.

“From Better than Nothing to Better than Anything”

Children learn about emotion through conversation:

Children need to learn what complex human feelings and human ambivalence look like. And they need other people to respond to their own expressions of that complexity. These are the most precious things that people give to children in conversation as they grow up (356).

If children fail to learn these emotional lessons, they may develop a fear of intimacy, driving them towards robots that appear to feel but are incapable of real conversation.

“Turning Ourselves Into Spectators”

Turkle discusses one of the most memorable moments of her research. She visited an older woman who had lost a child. The woman was speaking to a cute robotic seal. It appeared to pay attention to her. It comforted the grieving woman, but Turkle felt as if they—society—had abandoned the woman. All the people who could have been helping this woman were simply standing by as spectators, hoping the robot would do the work of empathy for them. A robot without understanding had turned sentient humans into spectators in their own lives. Turkle says the moment was “wrenching” for her to watch and that the woman deserved a companion who could actually feel and understand her experiences.

“Finding Ourselves”

At the conclusion of the book, Turkle recounts attending a meeting called “Disconnect to Connect” (360). Participants discussed the pros and cons of our increasing retreat into digital lives. The author worries that we are pretending that technology can free us from a problem that technology created: disconnection. She can think of no better solution than committing to each other, listening, making ourselves vulnerable, and reclaiming our humanity.

Parts 5-6 Analysis

The final two sections apply Turkle’s arguments to society at the macro scale. Consider what it could mean when a figure with as much power, wealth, and influence as Mark Zuckerberg said, “Privacy is no longer a relevant social norm” (301). If he is right, it means either that people no longer care about privacy or that privacy is no longer capable of being relevant. The first case has unsettling implications, ranging from indifference to ignorance and information illiteracy. The second indicts a system that is broken. If privacy ever mattered, then the rise of apps, phones, and data tracking did not render it irrelevant. And yet, even with email, social media, Tinder, and other technologies, people act as though they have privacy: “This is one of the great paradoxes of digital conversation: It feels private despite the fact that you are onstage” (305).

A clue lies in Turkle’s statement that in the past, “[y]ou needed privacy to change your mind about important matters” (302). Lana’s story clearly shows that not everyone wants to have important matters on their mind. In fact, she stated that she is “glad not to have anything controversial on my mind, because I can’t think of any online place where it would be safe to have controversial conversations” (311). If there is no place to have controversial conversations online, and people spend increasing amounts of time online, then it follows that controversial opinions will dwindle, leading to an unproductive ceasefire of ideas. What used to be controversy will exist only in echo chambers.

Rather than reclaiming conversation, people look forward to the day when robots will be their caregivers, therapists, servants, and friends. Rather than teaching coping skills to their children, parents ignore them in favor of their phones, leaving children who cannot regulate their own emotions: “A child alone with a problem has an emergency. A child in conversation with a grown-up is facing a moment in life and learning to cope with it” (331).

It is parents’ responsibility to raise future generations to be resourceful and compassionate. If children do not learn what Turkle considers the most precious of life skills—complex empathy and communication—there is little hope for an optimistic future.

The final section examines the development of artificial intelligence, which removes additional layers of responsibility from people. When Turkle discusses the woman in the grocery store who wants to ignore the cashier to check her phone, she writes:

Until a machine replaces the man, surely he summons in us the recognition and respect you show a person. Sharing a few words at the checkout may make this man feel that in his job, this job that could be done by a machine, he is still seen as a human being (346).

However, if people treat people as if they are machines—increasingly likely with a decrease in empathy—then it matters little whether robots have evolved to the point of being confidants or not.

Despite all of her concerns, Turkle concludes the book on an optimistic note:

The moment is right. We had a love affair with a technology that seemed magical. But like great magic, it worked by commanding our attention and not letting us see what the magician wanted us to see. Now we are ready to reclaim our attention—for solitude, for friendship, for society (361).

However, she has also demonstrated that the lack of conversation has crept into politics, finance, education, parenting, romantic love, empathy, and more. These are the foundational pillars of a healthy society. If people do not choose to step back from technology, that change will not happen. Turkle devotes a substantial part of the book to discussing people who are addicted to technology. These people do not make rational choices; they choose based on appetite and compulsion. Someone like Lana, who does not want to have controversial ideas, seems unlikely to read a book like Reclaiming Conversation: The Power of Talk in a Digital Age, given that its call to action is already controversial to those who use technology to do the work of human conversation.
