Monthly Archives: June 2014

Brother Rat?

The journal Nature Neuroscience recently published an article titled “Behavioral and neurophysiological correlates of regret in rat decision-making on a neuroeconomic task”—not exactly eye-catching, as titles go. Nevertheless, over the next few days the article was cited in dozens of mainstream news sources, including the Washington Post, Time, and Huffington Post.  We might wonder what sparked the public interest in this technical scientific article…until we grasp the upshot of the study: Rats feel regret.

Photo-rat by Banksy in St. Petersburg, Russia  (Courtesy of Wikimedia Commons)

Now that’s something to get people wondering. After all, we think the chasm between us and rats is vast. They’re rodents and a scourge; we’re human beings and the measure of all things. But if rats feel regret, doesn’t that change things? What could be more human than reflecting back on one’s actions, seeing the error of one’s ways, and feeling the pain of an irretrievable bad decision? Perhaps rats are more like us than we thought.

Perhaps. But there’s still no reason to think so, because this study has shown no such thing. Let me explain. The authors of the article—Adam Steiner and David Redish, both neuroscientists at the University of Minnesota—made their case by giving a definition of regret, and then by designing an experiment to test whether rats can satisfy the definition.  The experiment was successful because the rats’ behavior and brain activity satisfied the definition. But the tip-off that the study doesn’t really show what it purports to show comes when Steiner and Redish explain what they mean by regret: “Regret can be defined as the recognition that the option taken resulted in a worse outcome than an alternative option or action would have.”

This is a neuroeconomic condition that can be empirically verified. It’s neuroeconomic because it relates preferences revealed in decision-making to brain activity. It’s empirically verifiable because scientists can test whether a rat’s thinking about past action is tied to its behavior. But it isn’t a definition of regret. After all, one can satisfy this definition without experiencing regret at all.

Examples are legion, but here’s one to make the point. Suppose you frequently walk from your home to a nearby restaurant. Then one day you realize that there is a shortcut you could have been taking—and should have noticed—that shaves ten minutes off your travel time. You recognize that the path you’ve been taking has resulted in a worse outcome than the shortcut would have. You will plan to take the shortcut in the future. And if you’re like me, you might even chuckle at all the time you could have saved by consistently taking the shortcut. But this isn’t regret. Why? Because regret requires an emotional or affective component of feeling bad about the decision one has made.

In a deep irony, while presenting their definition, Steiner and Redish cite a thoughtful article by psychologist Janet Landman, where she defines regret as “a more or less painful cognitive/affective state of feeling sorry for losses, transgressions, shortcomings, or mistakes.” This is a plausible account, but it’s unclear why Steiner and Redish cite it, since they exclude anything affective—anything involving mood or feelings—from their definition.

So what did the study actually reveal?  All that was gathered from the rats’ behavior and brain activity was that they recognized that they had failed to acquire a preferred food item, and that this then influenced their future behavior. Rats learn from their mistakes by thinking about past actions. Fair enough. But there was no evidence that the rats at any point had an affective experience of feeling bad about a past decision. There was no evidence that rats feel regret.

In 1913, psychologist John Watson gave a lecture at Columbia University which sparked the development of behaviorism in psychology. Watson called for a new approach to the study of the mind, proposing that psychologists “never use the terms consciousness, mental states, mind, content, introspectively verifiable, imagery, and the like.”

Further, Watson claimed, “Psychology is a purely objective, experimental branch of natural science which needs introspection as little as do the sciences of chemistry and physics. It is granted that the behavior of animals can be investigated without appeal to consciousness…. The position is taken here that the behavior of man and the behavior of animals must be considered on the same plane; as being equally essential to a general understanding of behavior. It can dispense with consciousness in a psychological sense.”

Over the next sixty years, behaviorism rose and then fell, giving way to the cognitive revolution in the 1970s, and now bearing little connection to today’s neuroscience. Nevertheless, it seems that something like Watson’s sentiment about the relative worthlessness of introspection and conscious experience still persists within the sciences of the mind. After all, when you consider the hurtful words you’ve just said to a loved one, and then feel the pain of wanting the impossible—”I’d like to have that one back”—this is an introspectable episode of consciousness. Regret is a feeling, not merely a connection between neural activity and behavior. But Steiner and Redish are willing to ignore these essential features of regret in favor of giving a thin, neuroeconomic—but empirically verifiable—definition.

Empirical verifiability is great when you can get it.  But the worry here is what might happen to our self-understanding as human beings if we become willing to trade in an understanding of a rich and meaning-laden feature of our nature for, well, something we can share with a rat.

Paul Nedelisky is a postdoctoral fellow at the Institute for Advanced Studies in Culture.

. . . . . . . .

Like The Hedgehog Review on Facebook, follow us on Twitter, and subscribe to our posts via RSS.

Are We Losing the Attention War?

“Like everyone else, I am losing the attention war,” writes columnist David Brooks in today’s New York Times. He goes on to cite a study reporting that 66 percent of American workers are unable to focus on one thing at a time.

Is there any way to confront this growing epidemic of distraction? Brooks points to the ideas of child psychologist Adam Phillips, who believes that we should encourage children to develop their innate capacity for obsession, in the best sense. And for that to happen, Phillips argues, children must feel they are in a safe environment:

“There’s something deeply important about the early experience of being in the presence of somebody without being impinged upon by their demands, and without them needing you to make a demand on them. And that this creates a space internally into which one can be absorbed. In order to be absorbed one has to feel sufficiently safe, as though there is some shield, or somebody guarding you against dangers such that you can ‘forget yourself’ and absorb yourself, in a book, say.”

Phillips’ ideas resonate strongly with the arguments that will be presented in the upcoming summer issue of The Hedgehog Review. The five essays that make up this issue’s theme—”Minding Our Minds”—take on many of the key questions associated with the current attention crisis: Have we lost our ability to focus? What do we mean by attention? Is there a breaking point in the seemingly ceaseless deluge of data, tweets, texts, and emails? Is there any connection between America’s epidemic of Attention Deficit Hyperactivity Disorder and our deepest cultural assumptions and expectations, notably a relentless emphasis on performance in all aspects of life? How is this deeper deficit affecting our humanity? Are there antidotes, cures, solutions—or are we simply in the middle of adjusting to technological upgrades, a transition masquerading as a problem? If drowning in information overload is indeed a problem, as nefarious as pollution, should we consider regulating it—and how?

To grapple with these questions, we have invited several scholars, including Matthew Crawford, Mark Edmundson, and Thomas Pfau, to examine aspects of our attention disorder that seldom receive careful consideration. As they show in various ways, attention may be far less a technological or neurobiological problem than a cultural, ethical, and philosophical one, bound up with our deepest ideas about the human person and the goals or purposes of our lives.

The summer issue will arrive in subscribers’ mailboxes (or inboxes, for those who choose the digital version) in July. To subscribe or renew, click here. 
