The Hedgehog Review: Vol. 20 No. 1 (Spring 2018)

Expose Thyself! On the Digitally Revealed Life

Christine Rosen

When Ross W. Ulbricht, a thirty-year-old California man accused by the US government of running the black-market website Silk Road, was on trial in a federal district court in Manhattan in early 2015, his defense attorney lodged an unusual complaint with the judge. He claimed that prosecutors had failed to include a vital piece of evidence in the case they presented to the jury, one that spoke to his client’s innocence and credibility: a smiley face.1

It turns out that the purported criminal mastermind was, like many of us, a devotee of the emoji, or emoticon. Ulbricht’s case is not the only one in which emoticons have been weighed in the balance. A case that came before the US Supreme Court in 2015 hinged on whether a man who made threats on Facebook against his estranged wife should have his conviction overturned. (He eventually won his case before the Supreme Court.) The man claimed that his threats weren’t serious because they included emoji of a face with its tongue sticking out, a tactic suggesting that perhaps we have entered the era of the Winkie Defense.

Like many inventions, emoji were born of frustration—an inability to “read” the meaning of early e-mail messages that lacked the nonverbal emotional cues that define face-to-face communication. Scott E. Fahlman, a computer scientist at Carnegie Mellon University and the man widely credited as the creator of the first smiley-face emoticon, designed it to convey sarcasm; his fellow computer scientists active on early online bulletin boards were so literal-minded that attempts at jokes often fell flat. The emoticon was intended as a kind of “joke marker” to prevent this.2

Emoji are a form of emotional punctuation, a creative attempt to translate one aspect of the human experience—our emotional lives—to the screen, and in this they have been successful. But they are also a symbol of our broader struggle to reconcile human emotions with the limitations of the technologies through which we express them. The opposite of the cheerful winking emoji is the ANGRY MESSAGE DELIVERED IN ALL CAPS or the harassments of the determined Internet troll that we must endure when our efforts to connect to one another online misfire.

More and more of us now express our emotions through the devices and software we rely on in our everyday lives. E-mail, text messages, social media, video chat—each platform demands of us different expressions of ourselves. And in the near future, we will have a new range of technologies at our disposal—sensors, monitors, and software with the power to track, nudge, persuade, and coerce. What the clock did to time, technologists hope to do to emotion—regulate and regiment it, measure and monitor it. But taming the temperamental beast that is human emotion might prove a challenge that contemporary technology is unfit to take on.

Naked Little Spasms of the Self

At a recent conference on technology, public space, and sociability, I encountered an engaging, thoughtful college student who told me about a female acquaintance who had been a Facebook friend of his since his freshman year. He had never really thought about categorizing their relationship; she was one of hundreds of social connections he maintained in his vast network, a person he might see in passing on campus but whom he had never engaged in conversation.

One day on his way to class, he bumped into her and she recognized him. “We started talking,” he said, “and after a few minutes I realized I was just sweating like crazy. I had no idea why. It wasn’t even hot.” He paused and, with a rueful look, said, “It wasn’t until much later that I realized maybe I was sweating because I kind of liked her and was off my game because she was right in front of me.” He hadn’t recognized his body’s own signals when they flooded him, and it surprised him when he figured out what they meant.

We are physical animals, and we express our emotions in physical ways. We sweat, flinch, and grin. We send unspoken messages to each other during the most fleeting interactions, “naked little spasms of the self,” as sociologist Erving Goffman called them.3 There are many things our bodies understand that we would find it difficult to articulate—some say this feeling is the basis of intuition—and these physical responses are a crucial part of our emotional repertoire. We will always be able to “know more than we can tell,” as Michael Polanyi wrote of this form of “tacit knowledge.”4

Such knowledge is difficult, if not impossible, to measure, which is why so much ambiguity and mystery surround the question of what emotion is. René Descartes attempted to define emotion in terms of six primitive passions common to all human beings: wonder, love, hatred, desire, joy, and sadness. Charles Darwin devoted a great deal of time to studying the expression of emotions in men and animals (much of it gleaned through his observation of facial expressions and body movements). More recently, psychologists such as Jerome Kagan have noted how we infer others’ and our own emotional states through actions, biological reactions, and verbal descriptions.

Neuroscientist Antonio Damasio draws a useful distinction between feelings, which are “inwardly directed and private,” and emotions, which are “outwardly directed and public.” Our feelings, Damasio argues, are “mental experiences of bodily states,” and require a level of consciousness that other animals lack: They require a sense of identity or self.5

“Only with the advent of a sense of self do feelings become known to the individual having them,” Damasio writes.6 Emotions can be consciously acknowledged and unconsciously experienced (that vague feeling of annoyance whose source you can’t identify). They vary in range and intensity depending on the individual (the stoic shedding a single tear versus the weeping and shaking of the hyperventilating diva). Emotions are different from moods but influenced by them, and they play a significant role in our everyday quality of life. They are also crucial to the functioning of social life.

But how well do we understand emotions? “Life is the art of being well deceived,” the nineteenth-century essayist and critic William Hazlitt once remarked, and the greatest deceptions of all are the ones we perform on ourselves. Failures of self-knowledge are commonplace, which is perhaps one of the explanations for why we tend to elevate “reason” over “feelings” when assessing our own and others’ actions. The heavily used Myers-Briggs personality test places feelings on the opposite end of the spectrum from thinking in its typology. “Be reasonable,” we say, when someone invokes their feelings to argue their cause.

Our suspicions about emotions are in part a reaction to fear of manipulation. Emotions can be faked—indeed, entire industries rely on it—and part of being human is navigating others’ emotional signals to discern truth from fiction. We all perform a kind of emotional labor in our daily lives, whether at work or among our social peers, smiling and nodding at something we might actually dislike, or offering the white lie to soothe a distressed friend. Sometimes we take this to extremes. As part of Walt Disney World’s broader “science” of “Guestology” (the company’s term for “the study of the people for whom we provide service”), Disney employees—or Cast Members, as Disney calls them—train in the art of delivering happy feelings to visitors to the company’s theme parks. Cast Members are supposed to remain relentlessly cheerful while performing their Disney duties and are given an arsenal of tactics for doing so.7

But force-feeling happiness while trying to force-feed it to tourists has consequences. A recent study on emotional labor found that most Disney employees who engaged in “surface acting,” faking feelings of happiness while suppressing their genuine feelings of anger and resentment, eventually succumbed to “emotional exhaustion”8—a finding that hasn’t deterred other companies from adopting Disney’s norms.

But Disney is on to something. Although we experience our emotions as unique, they are socially influenced, in some cases so much so that they are subject to a kind of contagion effect. The forced smiles of Disney World employees might really make people feel as though they are having a happier time in the theme park (however unhappy these put-on emotions make Disney employees). In an experiment conducted by Stanley Schachter and Jerome Singer, test subjects were dosed with an unnamed substance (epinephrine, a synthetic form of adrenaline) and either told that heart palpitations were a possible side effect, or told nothing at all. The researchers then placed the subjects in a room with actors who were scripted to behave either with anger or euphoria.

The results suggest the power of emotional influence. Subjects who were not told of the side effect of the drug and were in the room with the angry actor reported feelings of anger; those in the room with the happy actor reported feelings of euphoria. (Those told of the possible side effect of the drug in advance were largely unaffected by the actor’s behavior.) Schachter and Singer concluded that when people experienced “a strong physiological response, the origins of which were not clear, they interpreted their own feelings as either anger or euphoria, depending on the behavior of other people who supposedly were in the same chemical boat.”9 In other words, they took their cues from those around them. Contemporary technology provides countless, compelling opportunities for such emotional contagion, but on a vast (and virtual) landscape.

What Is a Real Emotional Experience?

How can you tell if someone is angry or happy? How do you know when you feel that way yourself? If you suffer from alexithymia, you can’t. The disorder, first identified in the 1970s, describes people who are unable to articulate their own feelings and can’t understand the feelings of others. They tend to have literal-minded dreams, and have trouble reading others’ facial expressions and nonverbal cues. They often run into difficulties at work and in their personal lives because of this emotional awareness deficit. They are, for all intents and purposes, emotionally blind.

Alexithymia is rare. (Most estimates suggest that 2 percent of women and 8 percent of men have mild to severe forms of the condition, which can be treated with behavioral and cognitive therapies.) For most of us, the ability to read other people’s emotional signals and expressions develops gradually and with varying degrees of skill over a lifetime.

But while studying research by Clifford Nass, a Stanford University professor of communication and an expert on multitasking, I was reminded of alexithymia. Nass found that spending a lot of time in mediated environments undermines our ability to read others’ emotions. When Nass showed avid Internet users pictures of human faces or told them stories about people, they had trouble identifying the emotions being expressed. “Human interaction is a learned skill,” Nass concluded, “and they don’t get to practice it enough.”10 If emotions use our bodies as their theater, as Antonio Damasio puts it, what happens when that theater becomes virtual?

As more of our emotional experiences occur online, we expand the quantity of our connections, more easily finding like-minded people with whom to share our feelings. (I should know; I’m a weather geek who spends countless, intensely happy hours burrowing into debates over barometric pressure on weather blogs.) At the same time, we lose the physical cues that define face-to-face interaction and so risk undermining a crucial skill: how to read each other’s intentions and understand others’ feelings. If the vitriolic discussions about climate change on my favorite weather blogs are any guide, this leads many of us to assume the worst about each other’s motives.

Already we have redefined what a “real” emotional experience requires. Consider the sexting scandal that engulfed former New York congressman Anthony Weiner in 2011 and again in 2013. Weiner was caught repeatedly sending sexually explicit images and messages to women other than his wife, activity that led to his resignation from the US House of Representatives and, later, defeat in New York City’s mayoral race. “I never met any of these women,” he said in his defense. “I never was in the same room with them. I never had any physical relationship whatsoever.”11 Although no one asked them, I imagine that many of the women with whom Weiner carried on his racy repartee did believe they had some sort of “relationship” with him, at least enough of one to entrust him with their intimate thoughts and images.

Although we all recognize rationally that there is another human being on the other side of the screen (except when it’s a bot), it’s becoming clear that our use of certain technologies elevates some emotional responses over others. A recent study published in the Journal of Computer-Mediated Communication explored whether incivility online (in the form of reader comments) influenced people’s perceptions of an article, in this instance a neutral explanation of emerging technologies such as nanotechnology.

The results were startling. Rude comments didn’t merely polarize readers; they changed their perception of the article. The researchers noted how “social reprimands such as nonverbal communication and isolation can curb incivility in face-to-face discussion,” but that, by contrast, “the Internet may foster uncivil discussion because of its lack of offline, in-person consequences.” As a result, they argued, this form of online incivility, which they called “the nasty effect,” may impede the “democratic goal” of public deliberation online.12

Emotional contagion of this sort spreads rapidly online. The swift punishment of supposed villains meted out on social media platforms such as Twitter is a particularly virulent form. Our new “digilante” justice provides instant, global retribution for the kinds of mistakes for which people used to be shamed only by their close peers. Consider publicist Justine Sacco, who tweeted a tasteless joke about Africans and AIDS just before boarding a flight to South Africa in 2013. While she was incommunicado aboard the plane, her joke was retweeted and spread rapidly, with Twitter users posting gleeful comments such as “We are about to watch this @JustineSacco bitch get fired. In REAL time. Before she even KNOWS she’s getting fired.” By the time her plane landed she was globally notorious—the number-one trend on Twitter worldwide—and was soon fired from her job.13

Or consider the emotional impact of anonymous apps such as Yik Yak, which allow users within certain geofenced areas to post anonymous comments about whatever strikes their fancy. Not surprisingly, the app is creating challenges in high schools and colleges. Although many of the posts are mundane, others are sexist, violent, racist, and demeaning statements about others. One university professor discovered that students had been using Yik Yak to message back and forth during her lectures, trading sexually degrading insults about her.14 “Ourself behind ourself concealed—Should startle most,” Emily Dickinson wrote, long before the rise of anonymous apps. But how much power to demean do we want our concealed selves to have?

The Virtual Velocity of Now

Our bodies process different emotions at different speeds, while technology favors one velocity: now. A study by researchers at the Brain and Creativity Institute at the University of Southern California used brain-imaging techniques to examine how feelings such as admiration and compassion arise. After participants in the study read real-life stories meant to evoke compassion and admiration, researchers found that it took them six to eight seconds to respond to narratives of others’ pain or expressions of virtue. As one of the researchers noted, “If things are happening too fast, you may not ever fully experience emotions about other people’s psychological states.” If we spend most of our time mediating our emotions through technologies and platforms such as text messaging or Twitter, we might be losing opportunities for reflecting on our feelings. “Our study shows that we use the feeling of our own body as a platform for knowing how to respond to other people’s social and psychological situations,” one researcher said. “These emotions are visceral, in the most literal sense.”15

By contrast, the lives we live on Facebook, Instagram, Yik Yak, and Twitter are virtual, not visceral; they favor immediacy and encourage less productive feelings, such as envy. One study in Germany found a “rampant nature of envy” and other “invidious emotions” among people who were heavy users of Facebook, particularly those who tended to follow other people’s newsfeeds and check others’ profiles. They experienced Facebook as a “stressful environment” that affected their daily lives. “The spread of ubiquitous presence of envy on social networking sites is shown to undermine users’ life satisfaction,” the researchers found, creating a “self-promotion-envy spiral,” in which users were “reacting with even more self-promotional content to the self-promotional content of others.”16 We hear many stories about the positive effects of social networking (and there are many), but new research suggests that the effects on our emotional lives are more complicated.

Coming to grips with these complications is crucial at a time when technology companies are becoming more adept at manipulating our emotions. In an interview with Technology Review, Cameron Marlow, head of Facebook’s Data Science Team, said, “For the first time, we have a microscope that not only lets us examine social behavior at a very fine level that we’ve never been able to see before but allows us to run experiments that millions of users are exposed to.” In January 2012, more than half a million Facebook users unknowingly became test subjects when the company deliberately manipulated their newsfeeds by putting either more or less positive information in them, ostensibly to determine if emotions are “contagious.” (Short answer: yes, although behavioral science had already demonstrated this, and Facebook’s findings showed a very weak contagion effect. Facebook, in any case, was not doing this for science; it was eager to prove to its advertisers its power to manipulate its own users.)17

Similarly, engineers at the dating website OKCupid programmed the site to send its users matches that it claimed were “exceptional” but that were in fact weak—all for the purpose of finding out if users would believe the assessment and pursue the match. Not surprisingly, most did. We are nothing if not suggestible when it comes to love, even if Cupid’s arrow has been replaced by OKCupid’s algorithm. But that algorithm knows a lot about your emotional life. Some OKCupid users were horrified to learn that the site keeps not only every single message they send to a potential date, but also bits of messages they have erased while trying to craft a perfectly pitched response. The users felt, well, used. OKCupid founder Christian Rudder was unmoved by such concerns; as one of his OKCupid blog posts boasted, “We Experiment on Human Beings!”18

In the realm of emotion, Facebook and other social media are like the now-banned pesticide DDT: DDT killed disease-carrying mosquitoes (a good thing), but it also weakened the shells of birds’ eggs, rendering them so fragile that avian embryos could no longer survive inside them. Platforms like Facebook destroy the limitations of time and space, giving us an efficient way to keep in touch with far-flung connections (a good thing), but they weaken other things in the process, such as our willingness to take responsibility for our behavior online, and our willingness to take emotional risks. Writing about her gradual estrangement from talking to people on the telephone, writer Caeli Wolfson Widger noted, “It’s an instinctive preference, seemingly shared by everyone I knew, for the low emotional risk of communicating via words on a screen.”19 “Attention without feeling…is only a report,” the poet Mary Oliver has written.20 What is social media but a real-time report of our experiences, an endless inventory of immediate feelings?

Click-Here Empathy

Empathy, by contrast, is an act of imagination and will. It demands that we try to see the world from another person’s point of view. This is not the same thing as merely looking at or consuming another person’s experience when it is posted online. That feeling is more akin to sympathy or pity (or in some cases, schadenfreude) than empathy. Empathy is also grounded in our physical bodies—our observations of others’ movements and facial expressions. “When we see a stroke aimed, and just ready to fall upon the leg or arm of another person, we naturally shrink and draw back our own leg or our own arm; and when it does fall, we feel it in some measure, and are hurt by it as well as the sufferer,” Adam Smith wrote.21

Today we practice a cunningly efficient form of “click-here” empathy. A few weeks before Christmas last year, I overheard (eavesdropped on, actually) a conversation at a nearby table in the restaurant where I was eating lunch. “Oh, it’s really easy,” one woman was telling another. “You just go to Target’s website and you can gift socks to the homeless.” I have no idea if this was true, but it seemed plausible. One-click charity and other fundraising platforms such as Kickstarter have raised a lot of money for many good causes. But we should acknowledge that our use of these different technologies fosters different levels of emotional engagement, just as reading a novel will generate more empathy than reading a single tweet from a stranger. Going to a homeless shelter or other charity organization and spending time, face to face, talking with and helping the people who stay there is a way of practicing empathy that no amount of Facebook Likes or retweets can offer.

And having access to so much information about people hasn’t translated into a better understanding of them. A study by the University of Michigan Institute for Social Research published in 2011 found that “college kids today are about 40 percent lower in empathy than their counterparts 20 or 30 years ago,” and that the steepest decline had occurred over the preceding decade. One of the reasons for this sharp drop, the researchers said, was the rise of technologically mediated relationships. “The ease of having ‘friends’ online might make people more likely to just tune out when they don’t feel like responding to others’ problems, a behavior that could carry over offline,” one noted.22 Technology doesn’t just do things for us. Our prolonged use of it does things to us. Acknowledging this isn’t a form of technophobic moral panic. It keeps us honest about whether or not the technologies we love to use are helpful or harmful, and in what ways.

Professor Jeff Hancock of Stanford University has spent years studying how our online behavior changes our offline behavior. He tests whether the tenets of social learning theory, which posits that practicing certain behaviors and actions can lead to their performance in real life, also apply to our online behavior. One of his studies asked participants to pretend to be extroverted either in a Microsoft Word document, which researchers told them would not be made public, or on a live blog that would. The bloggers were far more extroverted, leading Hancock to conclude that acting in certain ways online reinforces the behavior and thus makes it more likely to be followed in real life. “Self-presentations in the online, public condition caused participants to shift their identities to become more consistent with their behavior,” he wrote.23 This is excellent news for the stressed-out businessman learning relaxation techniques online, but perhaps not so inspiring for the teenager who spends hours every week pummeling virtual prostitutes to death in the videogame Grand Theft Auto. Even if playing violent games won’t lead people to commit acts of violence (indeed, the violent crime rate is declining even as more and more people play violent video games), it does seem to lead to something else with negative social consequences, something less easily measurable than violent crime rates: an erosion of empathy.

The debate over violent video games is a skirmish in a much bigger war: namely, how we create and enforce social norms now that so many of our interactions have moved online. We hear a lot about cyberbullying, and how online activity both exacerbates it and serves to make such behavior more transparent (and hopefully less common). We hear much less about how our use of technology affects empathy, creativity, and thoughtfulness.

Consider creativity. Evidence from the Torrance Tests of Creative Thinking, which have been administered to American children in kindergarten through twelfth grade for many decades, reveals that creativity has been in steady decline since 1984. One educational psychologist, writing in Creativity Research Journal, found that “children have become less emotionally expressive” as well as “less humorous, less imaginative, less unconventional, less lively and passionate, less perceptive…and less likely to see things from a different angle.”24 Many factors have likely contributed to this decline in creativity, but it has coincided with the decades in which children’s media consumption and technology use rose dramatically.

As for thoughtfulness, do our heavily mediated lives distance us from the emotional impact of our actions? In caring professions such as nursing, there’s some evidence this is the case. In 2012, four students in a college nursing program in Kansas thought it would be funny to post pictures on Facebook of themselves holding a human placenta. All four students were expelled but later reinstated. In California, two nurses were disciplined for posting images on Facebook of a dying man in their care. And two Wisconsin nurses were fired after posting a patient’s x-rays on Facebook; the x-rays showed “a sexual device lodged in a man’s rectum.”25

These nurses and nursing students obviously violated their patients’ privacy, and most nurses are not posting highlights of their patients’ emergency room visits online, but it is worth examining the environment and culture in which nurses have been trained for clues to the behavior of these outliers. The nursing literature includes an increasing number of articles admonishing nurses to be cautious in their use of social media. In recent years, nursing schools have begun substituting simulations for the real-life clinical experiences nurses used to receive during their training. Rather than tend to real patients, nursing students use video games, screen-based simulations, and in some cases sophisticated medical manikins to hone their skills. Massachusetts General Hospital in Boston has a “Sim Man,” as well as simulated birthing and simulated infant training centers, for example.26

As more and more nursing students spend more time with simulators and less with real human patients, however, we need to assess the impact such training has on their ability to empathize. The real-life sick and wounded are people who squirm and smell and yell and are rarely entirely hairless.

Outsourcing Emotional Labor

We live in an era that has enthusiastically embraced the outsourcing of emotional labor. We can hire people to manage our children’s birthday parties or care for elderly relatives; we can outsource even the smallest domestic jobs to people on TaskRabbit and pay them pennies an hour to perform them. When sociologist Arlie Russell Hochschild, who has studied such outsourcing, interviewed an executive assistant about her job, the woman told her, “My boss outsources patience to me.”27

If you don’t have a person to be patient for you, you can probably find a program to do it. Rather than spend time considering the tone and likely impact of something we write, we can outsource it to ToneCheck, a software program that acts as a kind of “emotional spell-check” for your e-mail messages, alerting you to “excessive displays of anger, sadness, or insensitivity.”28 In a highly competitive and unstable economy, where workers feel pressure to squeeze productive moments out of nearly every hour in the day, we begin to measure emotions (like work hours) in terms of their efficiency and usefulness.

Technology companies are keen to have us embrace this new digital-emotional outsourcing. At the 2015 SXSW conference in Austin, Texas, a panel discussion titled “Humanizing Digital: Finding Love in a World of Likes” featured representatives from Google and the advertising industry, as well as a YouTube star who has been posting video logs of his life for years. The panel promised to share “how digital experiences can show empathy, how they anticipate human needs and what actually makes them human.”29 What they discussed, in fact, wasn’t empathy but branding: strategies to help companies monetize ever more areas of private experience, using empathy as a stalking horse to sell you things.

What technology companies realize is that the “elegant instruments of their mutual estrangement,” as E.J. Mishan described new technologies of mediation more than thirty years ago,30 don’t just give us access to new kinds of behavior; they create new behaviors and new emotional connections, from low-level forms of mimicry such as the many Internet crazes that live like mayflies online (“planking,” “owling,” and, my personal favorite, the “Vadering” fad of spring 2013, which saw people imitating Darth Vader’s touch-free chokehold in Star Wars: A New Hope) to new group formations like flash mobs and Twitter mobs. Companies such as Google and Facebook then find ways to monetize these seemingly spontaneous emotional acts.

Or consider “Like-a-Hug,” the “wearable social media vest” designed by students at MIT. As Melissa Chow, one of the inventors, described it, the vest “allows for hugs to be given via Facebook, bringing us closer despite physical distance.” When someone “likes” a photo or status update on the wall of a person wearing the vest, it squeezes the wearer in a hug. This allows us to “feel the warmth, encouragement, support, or love that we feel when we receive hugs,” Chow says.31

But even those of us not eager to don a Facebook-enabled hugging vest have strong emotional connections to our technologies. A study conducted by the University of Maryland’s International Center for Media and the Public Agenda examined the feelings of college students from ten countries, including the United States and China, after they abstained from all technology and media use for twenty-four hours (including all Internet and mobile phone use, as well as online video games). The results are revealing of our emotional attachment to our devices. A student from Argentina said he “felt dead” without his normal diet of media; another, from Lebanon, described the experience as “sickening”; an American claimed she was “itching like a crackhead” for her phone. The researchers found that the craving the students expressed wasn’t only for the information their technologies provided, but also for the physical device itself.32 A recent Credit Suisse Youth Barometer report found that among people in their teens and twenties, smartphones were “more important to them than anything else—even ‘meeting friends,’ ‘Facebook,’ or ‘vacationing abroad.’”33

We are moving away from a world that values empathy with others, and toward one in which so many of our emotional experiences are mediated through technology and software that they might better be viewed as vicarious. After all, it’s so much more efficient to figure out how other people feel by scrolling through their tweets or checking their Facebook page than it is to spend time actually talking to them. As Don DeLillo said of characters in his novel Mao II, “Nobody knows how to feel and they’re checking around for hints.”34

Sensor-ship

Early efforts to create machine intelligence, such as the eighteenth-century Mechanical Turk, which delighted audiences worldwide with its creator’s claim that it played chess, put people inside machines to awe and inspire (and in many cases, fool) others. Today, we put machines on ourselves to better understand our own feelings. Enlightenment scientists urged people to know the world through the use of their own five human senses. Today, technologists think those five senses are no longer sufficient and, in many cases, even lead us astray; with computing power and new sophisticated sensor technology, we can extend our senses. Ours is the era of the “digital sixth sense,” as the CEO of Qualcomm, which makes chips for smartphones, described it.35 It is one that embraces ubiquitous computing, in which technologies aren’t merely smart but also emotionally aware. “Ubiquitous computing…has to embrace emotion as an essential element in pursuing its next level of development,” argues scientist and artificial intelligence researcher Egon L. van den Broek.36 In this future, technology is less an extension of man than an invasion of him, one that uses sensors and monitors to gain a better understanding of the workings of the wearer’s physical body as well as his or her emotional life.

For people (like me) who fret about the deterioration of face-to-face human interaction, researchers like Alex Pentland at MIT have a simple solution: Stop worrying and let the machines do the work. Network science and technologies like sociometers (digital sensors that record movement, gestures, and nonverbal signals) make concern about deteriorating social skills obsolete, they argue. Our devices will soon read our own and others’ signals for us—and they will do it better than we ever did ourselves. Indeed, what Pentland and his colleagues want to create is a kind of Freud Machine that would make the id ever visible, a technology that, “by paying careful attention to the pattern of signaling within a social network,” would allow people to “harvest tacit knowledge that is spread across all of the individual members of the network.”37

The Delphic oracle’s guidance was “Know thyself.” Today, with the help of social networking, it would be “Show thyself.” In the near future, our sensor and network technologies will know us and show us, because, the implicit message suggests, we really can’t trust ourselves. Pentland and his ilk believe that the Enlightenment’s emphasis on individual rationality might have flattered us for a few hundred years, but it was misguided in its assessment of our behavior. We like to think of ourselves as rational individuals, Pentland says, but we are in fact “typically dominated by social network effects,”38 which is to say, we’re an irrational and easily cajoled herd.

To measure our emotions, technologists are devising a host of gadgets and sensors to track our behavior. Pentland’s lab at MIT examines the data from sociometric badges that track individuals’ physical movements using an embedded location sensor. “Just from data such as how much a subject walks around, who they call and when, and how much and when they socialize face-to-face, a user’s personality type and disposable income can be estimated,” Pentland notes. “We can also see when someone is coming down with the flu or is depressed.”39

Most smartphones already feature, or soon will, the same behavioral-capture capabilities as Pentland’s sociometers. Students at Stanford University recently developed a modified gaming controller that uses signals from the body’s autonomic nervous system (which controls breathing and heart rates) to determine players’ emotions. Working with professor Gregory Kovacs (and with underwriting by corporate sponsors such as Texas Instruments), students are also developing sensors to monitor motorists’ alertness and emotions, a boon for car manufacturers and insurance providers. As for the privacy implications of driving a car that is monitoring your emotional state in real time, Kovacs, like most technologists, isn’t troubled. “While some might see it as an invasion of privacy, I think operators of such vehicles should give up some privacy in exchange for the trust of human lives placed in their hands,” he told a reporter.40

In 2013, Bank of America gave its employees sociometric badges so it could study their interactions at work. The sensors’ function was to “measure actual behavior in an objective way,” a human resources executive at the bank told the Wall Street Journal.41 Other companies have performed similar studies measuring employees’ movements, tone of voice, and conversational patterns with coworkers; ironically, given the high-tech nature of these experiments, what companies tend to discover is that their most productive teams are the ones that engage in the most unmediated, face-to-face communication.

Algorithmic Reassurance

The idea that the micropatterns of our behavior, once captured and measured, can give us insight into ourselves is appealing at a number of levels. We want something seemingly value free, objective, and scientific to tell us why we feel the way we do. We are understandably eager to believe that sensors can objectively measure our highly subjective emotional experiences and teach us about ourselves. Sociometers and other sensor technologies perform the same existential function as a horoscope—but instead of the immutability of the stars, we have the rigor of software algorithms to guide our behavior. Like good astrologers, our technologists offer us reassurance about our emotions.

But what kind of world will this be once we have outsourced to our devices and software the job of becoming the technicians of our inner lives, mimicking the technological virtues of efficiency, economy, predictability, and repeatability? Pentland says this will be a “sensible” society where “everything is arranged for your convenience.”42 No need, for instance, for the inefficiencies or embarrassments of a bad second date. Our sensors will signal within moments whether our affection is likely to be returned or not, and we will be able to move on. It will be the embodied version of clicking Facebook’s Like button, effectively outsourcing the work of emotional awareness.

But such an arrangement might also make it more difficult to deal with the things we can’t control, like the passage of time and the physical infirmities that come with it. We cannot always control our circumstances in the way we control the images on our screens. And by relying on our devices in more areas of our lives, we could experience a kind of mass emotional deskilling. Already we’re outsourcing our memories to Facebook, our curiosity to Google, and our sense of direction to our GPS devices. Should we also reject the ideal that we can and should know our emotional lives in an intimate and unmediated way?

The question isn’t hypothetical. Consider the Moodies app, which uses the built-in microphone on your computer or smartphone to analyze “the full spectrum of human emotions” based on its analysis of the tone and inflection of your voice. You can leave the app running all the time, and it promises to deliver a new emotional analysis right to your smartphone every fifteen to twenty seconds. Its developers foresee a future in which everything from the iPhone’s Siri help system to your Internet-connected car uses “emotional analytics” to figure out how you’re feeling in real time. (The Moodies app might not be entirely reliable; one mischievous user, reviewing it on the site apprecs.com, claimed that when he fed it a recording of Hitler’s speech announcing the invasion of Poland, the app interpreted Hitler’s mood as “friendliness.”)43

We need to consider how much of our emotional lives we are prepared to reveal. These technologies are potentially digital wolves in sheep’s clothing; marketed as servants of self-awareness, they could just as easily become technologies of unwanted exposure. As a Google representative told a reporter recently, “We like to say a phone has eyes, ears, skin, and a sense of location…. It’s always with you in your pocket or purse. It’s next to you when you’re sleeping. We really want to leverage that.”44

But such leveraging goes well beyond merely giving you feedback on your own feelings in real time. It has also been the focus of so-called persuasive technologies, software and devices that aim to prod us into performing certain behaviors. Today, we have sensor technologies and sophisticated software programs that allow us to engage in what persuasive technology pioneer B.J. Fogg calls “mass interpersonal persuasion.” These persuaders range from sensor-based technologies (such as sociometers) whose data can be used to encourage or discourage certain behaviors, to features of websites that encourage consumers to act in particular ways, to persuasive games that aim to help people quit smoking or eat healthier food.45

Arguably, our technological persuaders are better than people because they are devilishly persistent, can manage large volumes of information, can offer anonymity—or at least the illusion of it—and have long memories. But consider these devices from the point of view of an individual: Do you really want your cell phone equipped with a sensor that can tell when you are becoming sexually aroused so that your health insurance company can send a helpful text message reminding you to wear a condom?

Or consider the power of technologies that persuade us to do things that aren’t always good for us. In her study of the machine gambling industry, cultural anthropologist Natasha Dow Schüll found that the “enchanting perceptual distortions” that designers program into gambling machines keep gamblers playing long past the point of good sense.46

Tools of persuasion—repetition, invocations of authority, and appeals to emotion—have always been available to us, as any good politician or salesman knows. But persuasive technologies pose new ethical challenges that old methods of persuasion do not, including a lack of transparency. In 1999, Daniel Berdichevsky and Erik Neuenschwander challenged designers of persuasive technologies to voluntarily adopt a kind of golden rule of persuasive technology: “Ask yourself whether your technology persuades users to do something you wouldn’t want to be persuaded to do yourself.”47 They developed eight further principles for persuasive technology design, including one that Facebook recently ignored when it manipulated its users’ newsfeeds and OKCupid trampled on when it altered the romantic matches it sent its users. For a field that claims to care about the outcomes of its “end users”—that is what they call us—persuasive technologists’ approach to ethics calls to mind Marvin Minsky’s widely quoted observation that an ethicist is “someone who sees something wrong with whatever you have in mind.”

Persuasive technologies could just as easily be viewed as subversive technologies—subversive of human intention and judgment, manipulative of human emotions. In Man’s Search for Meaning, Viktor Frankl argued, “Everything can be taken from a human being but one thing: the last of the human freedoms—to choose one’s attitude in any given set of circumstances, to choose one’s own way.”48 Increasingly we use technology to mediate the emotional experiences and feelings that fuel our individual preferences. Doing so, we risk nothing less than that last freedom to choose our own way.

Endnotes

  1. Benjamin Weiser, “At Trial, Lawyers Fight to Include Evidence They Call Vital: Emoji,” New York Times, January 29, 2015.
  2. Scott E. Fahlman, “Smiley Lore 🙂,” http://www.cs.cmu.edu/~sef/sefSmiley.htm. Accessed December 11, 2017.
  3. Erving Goffman, Interaction Ritual: Essays on Face-to-Face Behavior (New York, NY: Pantheon, 1982), 269–70.
  4. Michael Polanyi, The Tacit Dimension (Chicago, IL: University of Chicago Press, 2009), 4.
  5. Antonio Damasio, The Feeling of What Happens: Body and Emotion in the Making of Consciousness (New York, NY: Harcourt, 1999), 36.
  6. Antonio Damasio quoted in “Are Your Emotions Hijacking You?,” Psychology Today, January 2, 2018.
  7. Bruce Jones, “Understanding Your Customers Using Guestology,” Disney Institute, August 21, 2012, https://disneyinstitute.com/blog/2012/08/understand-your-customers-using-guestology/.
  8. Anne Reyers and Jonathan Matusitz, “Emotional Regulation at Walt Disney World: An Impression Management View,” Journal of Workplace Behavioral Health 27 (2012): 139–59.
  9. Stanley Schachter and Jerome Singer, “Cognitive, Social, and Physiological Determinants of Emotional State,” Psychological Review 69 (1962), 379–99. The quotation summarizing the work of Schachter and Singer is from Elliot Aronson, The Social Animal, sixth edition (New York, NY: Macmillan, 2003), 28.
  10. Clifford Nass quoted in Elizabeth Cohen, “Does Life Online Give You ‘Popcorn Brain’?,” The Empowered Patient (blog), CNN, June 23, 2011, http://www.cnn.com/2011/HEALTH/06/23/tech.popcorn.brain.ep/index.html?hpt=he_c1. See also Eyal Ophir, Clifford Nass, and Anthony Wagner, “Cognitive Control in Media Multitaskers,” Proceedings of the National Academy of Sciences 106, no. 37 (2009), 1–5, doi:10.1073/pnas.0903620106.
  11. William Saletan, “Meetless Weiner,” Slate, June 7, 2011, http://www.slate.com/articles/health_and_science/human_nature/2011/06/meetless_weiner.html.
  12. Ashley Anderson, Dominique Brossard, Dietram Scheufele, Michael Xenos, and Peter Ladwig, “The ‘Nasty Effect’: Online Incivility and Risk Perceptions of Emerging Technologies,” Journal of Computer-Mediated Communication 19 (2014), 373–87.
  13. Jon Ronson, So You’ve Been Publicly Shamed (New York, NY: Riverhead, 2015).
  14. Jonathan Mahler, “Who Spewed That Abuse? Anonymous Yik Yak Isn’t Telling,” New York Times, March 9, 2015, https://www.nytimes.com/2015/03/09/technology/popular-yik-yak-app-confers-anonymity-and-delivers-abuse.html.
  15. Mary Helen Immordino-Yang, Andrea McColl, Hanna Damasio, and Antonio Damasio, “Neural Correlates of Admiration and Compassion,” Proceedings of the National Academy of Sciences 106, no. 19 (2009); see also “Can Twitter Make You Amoral?” ScienceDaily, April 14, 2009, https://www.sciencedaily.com/. Quotations are from Carl Marziali, “Nobler Instincts Take Time,” USC News, April 14, 2009, https://news.usc.edu/29206/Nobler-Instincts-Take-Time/.
  16. Hanna Krasnova, Helena Wenninger, Thomas Widjaja, and Peter Buxmann, “Envy on Facebook: A Hidden Threat to Users’ Life Satisfaction?,” Wirtschaftsinformatik Proceedings 2013, paper 92, http://warhol.wiwi.hu-berlin.de/~hkrasnova/Ongoing_Research_files/WI%202013%20Final%20Submission%20Krasnova.pdf.
  17. Cameron Marlow quoted in Tom Simonite, “What Facebook Knows,” MIT Technology Review, June 13, 2012, https://www.technologyreview.com/s/428150/what-facebook-knows/; Facebook study: Adam Kramer, Jamie Guillory, and Jeffrey Hancock, “Experimental Evidence of Massive-Scale Emotional Contagion through Social Networks,” Proceedings of the National Academy of Sciences 111, no. 24 (2014), 8788–90, http://www.pnas.org/content/111/24/8788.full.
  18. Christian Rudder, OKCupid blog, June 27, 2014, https://theblog.okcupid.com/we-experiment-on-human-beings-5dd9fe280cd5.
  19. Caeli Wolfson Widger, “Don’t Pick Up,” New York Times Magazine, October 6, 2013, MM58, http://www.nytimes.com/2013/10/06/magazine/why-i-silence-your-call-even-when-im-free.html.
  20. Mary Oliver (text) and Molly Malone Cook (photographs), Our World (Boston, MA: Beacon Press, 2007). The full quote, “Attention without feeling, I began to learn, is only a report,” is mentioned in Susan Salter Reynolds, “A Time for Us,” Los Angeles Times, January 6, 2008, http://articles.latimes.com/2008/jan/06/books/bk-reynolds6.
  21. Adam Smith, The Theory of Moral Sentiments, part 1 (Boston, MA: Wells and Lilly, 1817), 2–3. First published 1759.
  22. Sarah Konrath, Edward O’Brien, and Courtney Hsing, “Changes in Dispositional Empathy in American College Students Over Time,” Personality and Social Psychology Review 15, no. 2 (2011), 180–98; see also “Empathy: College Students Don’t Have as Much as They Used To,” Michigan News, May 27, 2010, http://ns.umich.edu/new/releases/7724-empathy-college-students-don-t-have-as-much-as-they-used-to.
  23. Amy L. Gonzales and Jeffrey T. Hancock, “Identity Shift in Computer-Mediated Environments,” Media Psychology 11 (2008), 178–79, doi:10.1080/15213260802023433; https://sml.stanford.edu/ml/2008/06/gonzales-mp-identity-shift.pdf.
  24. Kyung Hee Kim, “The Creativity Crisis: The Decrease in Creative Thinking Scores on the Torrance Tests of Creative Thinking,” Creativity Research Journal, 23 (2011), 292, http://dx.doi.org/10.1080/10400419.2011.627805.
  25. Debra Wood, “Social Media: Cautionary Tales for Nurses,” Nursezone, August 16, 2012, https://www.americanmobile.com/nursezone/. On impacts of simulation on patient care, see Michelle Aebersold and Dana Tschannen, “Simulation in Nursing Practice: The Impact on Patient Care,” Online Journal of Issues in Nursing 18, no. 2 (2013), http://nursingworld.org/MainMenuCategories/ANAMarketplace/ANAPeriodicals/OJIN/TableofContents/Vol-18-2013/No2-May-2013/Simulation-in-Nursing-Practice.html. The review of literature on nursing and simulation concludes, “Most of the work has not been rigorously evaluated for its impact on patient care outcomes, which can be challenging to measure.”
  26. Laerdal Medical, Nursing Anne manikin, http://www.laerdal.com/us/NursingAnne.
  27. Arlie Russell Hochschild, “The Outsourced Life,” New York Times, May 6, 2012.
  28. Paola Antonelli, in Paola Antonelli, ed., Jamer Hunt, Alexandra Midal, Kevin Slavin, and Khoi Vinh, Talk to Me: Design and Communication between People and Objects (New York, NY: Museum of Modern Art, 2011); also https://www.moma.org/interactives/exhibitions/2011/talktome/essay/.
  29. Workman Group Communications, e-mail message to author, February 23, 2015; “SXSW Schedule: Humanizing Digital,” 2015, http://schedule.sxsw.com/2015/events/event_IAP36390; “SXSW Panel Picker: Humanizing Digital,” 2015, http://panelpicker.sxsw.com/vote/36390.
  30. Ezra J. Mishan, Economic Myths and the Mythologies of Economics (Atlantic Highlands, NJ: Routledge Revivals, 2011), 180. First published 1986.
  31. John Metcalfe, “A Jacket That Hugs You for Getting Facebook Likes,” The Atlantic’s Citylab, October 4, 2012, http://www.citylab.com/tech/2012/10/jacket-hugs-you-getting-facebook-likes/3494/.
  32. Jacqui Cheng, “Students Face Withdrawal, Distress When Cut Off from the Internet,” Ars Technica (blog), April 6, 2011, https://arstechnica.com/gadgets/2011/04/hand-over-the-gadgets-students-distressed-isolated-without-internet/.
  33. Credit Suisse Youth Barometer, 2012, https://www.credit-suisse.com/media/assets/corporate/docs/news-and-expertise/articles/2012/10/jugendbarometer-2012-en.pdf.
  34. Don DeLillo, Mao II (New York, NY: Penguin Books, 1992), 4.
  35. Don Clark, “Electronics Develop a Sixth Sense,” Wall Street Journal, January 7, 2013.
  36. Egon L. van den Broek, “Ubiquitous Emotion-Aware Computing,” Personal and Ubiquitous Computing, October 15, 2011, https://link.springer.com/article/10.1007/s00779-011-0479-9.
  37. Alex (Sandy) Pentland with Tracy Heibeck, Honest Signals: How They Shape Our World (Cambridge, MA: Bradford/MIT Press, 2008), xii.
  38. Alex Pentland, Social Physics: How Good Ideas Spread—The Lessons from a New Science (New York, NY: Penguin Books, 2014), 59.
  39. Ibid., 219.
  40. Nick Bilton, “Devices That Know How We Really Feel,” New York Times, May 5, 2014.
  41. Rachel Emma Silverman, “Tracking Sensors Invade the Workplace,” Wall Street Journal, March 7, 2013.
  42. Alex Pentland, Honest Signals, 91, 98.
  43. Learn more about the Moodies app at the Beyond Verbal website, http://www.beyondverbal.com/#moodies. The review citing the app’s response to Hitler’s voice can be found here: https://apprecs.com/android/com.bvc.moodies/moodies-emotions-analytics.
  44. Wade Roush, “Inside Google’s Age of Augmented Humanity,” Xconomy, February 28, 2011, https://www.xconomy.com/san-francisco/inside-googles-age-of-augmented-humanity/.
  45. B.J. Fogg, “Mass Interpersonal Persuasion: An Early View of a New Phenomenon,” in Harri Oinas-Kukkonen et al., eds., PERSUASIVE 2008, LNCS 5033 (Berlin, Germany: Springer, 2008), 23–34, http://captology.stanford.edu/wp-content/uploads/2014/03/MIP_Fogg_Stanford.pdf.
  46. Natasha Dow Schüll, Addiction by Design: Machine Gambling in Las Vegas (Princeton, NJ: Princeton University Press, 2012), 96, 108.
  47. Daniel Berdichevsky and Erik Neuenschwander, “Toward an Ethics of Persuasive Technology,” Communications of the ACM 42, no. 5 (1999), 51, https://pdfs.semanticscholar.org/d5b1/dd3118e0aa0757ab8f3e5472b0e6b506976b.pdf.
  48. Viktor Frankl, Man’s Search for Meaning (Boston, MA: Beacon Press, 2006), 66.

Christine Rosen is senior editor of The New Atlantis: A Journal of Technology & Society. She is the author of Preaching Eugenics: Religious Leaders and the American Eugenics Movement and The Extinction of Experience (forthcoming).

Reprinted from The Hedgehog Review 20.1 (Spring 2018). This essay may not be resold, reprinted, or redistributed for compensation of any kind without prior written permission. Please contact The Hedgehog Review for further details.
