Apple Watch and the Quantified Self

Today Apple unveiled its latest technological creation, the Apple Watch, a wearable computer that tracks not only time but your every step, heartbeat, and calorie. With its latest product, Apple adds to the growing array of devices and apps, such as Fitbit, Basis, and MyFitnessPal, that track and record our activities and biostatistics. Given Apple’s commercial influence, the Apple Watch may well turn the nascent Quantified Self (QS) movement into a cultural mainstay delivering “self knowledge through numbers.”

Most QS practices track health-related activities such as calorie intake, exercise, and sleep patterns, but they are increasingly used to document and track experiences of grief, exploration, and productivity. And tracking apps and devices are even making their way into unexpected areas of life experience. Attempts to measure the soul, data point by data point, for example, are increasingly common. Just last January a Menlo Park pastor teamed up with a University of Connecticut sociologist to create SoulPulse, which, as Casey N. Cep explains, is a

technology project that captures real-time data on the spirituality of Americans. SoulPulse attempts to quantify the soul, an unbodied version of what FitBit, the exercise-tracking device, has done for the body. After filling in a brief intake survey on your age, race, ethnicity, education, income, and religious affiliation, SoulPulse contacts you twice a day with questions about your physical health, spiritual disciplines, and religious experiences. Each of the surveys takes less than five minutes to complete.

SoulPulse encourages users to learn about their “spirituality” through the power of big data and digital automation. This may sound crazy, but what’s the difference between tracking your daily prayer life with an app and doing so with another set of repeatable instructions, such as the Benedictine Rule and its set of daily readings and reminders to ponder God?

Many aspects of the QS movement are anything but new. Western cultures have long maintained practices that document behaviors and experiences in order to discipline the self. Capitalism and quantifying the self have been intimately linked for some time. Early merchants developed account logs that allowed them to track the results of their business transactions, understand the consequences of their behavior, and modify it in the future. Perhaps a merchant had purchased too much grain and it spoiled before it could be sold; the following year, he could alter his practice based on this cataloged information. And Frederick W. Taylor’s scientific management theories relied on precise measurements of workers’ efficiency.

And more in the tradition of St. Benedict, people have long kept track of their spiritual lives. Benjamin Franklin dutifully recorded his success in adhering to a list of thirteen virtues each day. Diaries and journals have long been witness not just to bad poetry but to detailed lists of eating and sleeping habits. Weight Watchers, founded in 1963, turned such practices into a business with its point system.

Despite such similarities, tracking devices such as Apple Watch are not the same as eighteenth-century diaries. The former have the potential to revolutionize the health sector and facilitate better care, but what happens when they don’t just give away our desires on Facebook (I like this!) but open up a one-way data stream on our bodies? How long will it take for all that personal data to make its way to our insurance companies? (The now-common annual biometric screenings will seem quaint by comparison.)

Self-reflection and personal development are broad cultural values. But what happens to us when we focus on aspects of ourselves that are easily recorded and converted into numbers? QS enthusiasts advocate for the expansion of tracking devices from the private sphere into the work environment, where they might provide insights on employee selection, promotion, and productivity. How will tracking social and personal behavior, such as how many times one smiles during the day, alter work environments and those who inhabit them?

Digital practices and techniques for tracking and disciplining the self differ from their analogue and print predecessors for several reasons. First, what they can track has expanded: Benjamin Franklin most likely didn’t know the rate of his perspiration. Second, the precision with which data is measured and recorded is continually increasing. Similarly, tracking devices and apps are increasingly frictionless: They do their job with minimal interruption and effort on the part of the user. Finally, the digital format of the data represents a marked difference from records of the past. Many of these tracking devices easily connect to apps and programs that analyze the data, dictating to the individual a pre-programmed assessment of success or failure. The digital nature of the information also makes it easily available and transferable.

These new developments, and the manufacture and dissemination of these technologies and apps through popular and trusted brands such as Apple, are likely to expand the degree to which individuals come to imagine themselves, their bodies, and their habits through and as numbers. As we continue into our quantified future, will these new digital practices alter what it means to be a good person, a successful person, or an efficient person? Will we be able to juke the numbers? Just because the technology is intended to track behavior and facilitate modification of that behavior doesn’t mean that it won’t be put to other purposes. What will we make of our new digital tracking practices and the self that we come to know through numbers?

Claire Maiers is a graduate student in the Department of Sociology at the University of Virginia.

. . . . . . . .

Like The Hedgehog Review on Facebook, follow us on Twitter, and subscribe to our posts via RSS.


The New Anti-Intellectualism

Charges of anti-intellectualism in American life are as old as the Republic. It’s the inevitable consequence of being a bottom-up state and the high degree of pragmatism that comes with it. As Jules Verne wrote of the mid-nineteenth-century United States, “The Yankees are engineers the way Italians are musicians and Germans are metaphysicians: by birth.”

What makes our current moment unique is the fact that this time the fear of ideas isn’t coming from the prairies, the backwaters, or the hilltops. It’s coming from within the elite bastions themselves, those citadels of the urbane and the cosmopolitan. At stake in this revolt is nothing less than the place of quantification within everyday life. Never before has it been so fashionable to be against numerical thinking.

The dean of this new wave is of course Leon Wieseltier, literary editor of The New Republic, who is making something of a crusade of cowing science and quantification into submission:

What [science] denies is that the differences between the various realms of human existence, and between the disciplines that investigate them, are final…. All data points are not equally instructive or equally valuable, intellectually and historically. Judgments will still have to be made; frameworks for evaluation will still have to be readied against the informational deluge.

Wieseltier’s dream is that the world can be neatly partitioned into two kinds of thought, scientific and humanistic, quantitative and qualitative, remaking the history of ideas in the image of C.P. Snow’s two cultures. Quantity is OK as long as it doesn’t touch those quintessentially human practices of art, culture, value, and meaning.

Wieseltier’s goal is as unfortunate as it is myopic. It does a disservice, first, to the humanistic bases of scientific inquiry. Scientists aren’t robots—they draw their ideas from deep, critical reflection using numerical and linguistic forms of reasoning. The idea that you can separate these into two camps would make little sense to most practicing researchers. To be sure, we all know of laughable abuses of numbers, especially when applied to cultural or human phenomena (happiness studies anyone?). This is all the more reason to argue for not sequestering number off from humanistic disciplines that pride themselves on conceptual complexity. Creating knowledge fences only worsens the problem.

But it also has the effect of hardening educational trends in which we think of students as either math and science kids or reading and verbal ones. Our curricula are designed around Wieseltier’s over-simplified binaries, ones that come at the expense of our own human potential. Most important, in my view, is the way this line of thinking lacks precedent in the history of ideas. Some of the intellectual giants of the past, people like Leibniz, Descartes, or Wittgenstein—presumably the folks Wieseltier would admire most—were trained as mathematicians. The history of literature, too, that most prohibited territory of number’s perennial overreach, is rife with quantitative significance. Why are there nine circles of hell in Dante’s Inferno? 100 stories in Boccaccio’s Decameron? 365 chapters in Hugo’s Les Misérables? 108 lines in Poe’s “The Raven”? Not to mention the entire field of prosody: why is so much French classical theatre composed in lines of 12 syllables or English drama in 10? Why did there emerge a poetic genre of exactly 14 lines that has lasted for over half a millennium?

Such questions only begin to scratch the surface of the ways in which quantity, space, shape, and pattern are integral to the human understanding of beauty and ideas. Quantity is part of that drama of what Wieseltier calls the need to know “why”.

Wieseltier’s campaign is just the most robust clarion call of subtler, ongoing assumptions one comes across all the time, whether in the op-eds of major newspapers, the blogs of cultural reviews, or the halls of academe. Nicholas Kristof’s charge that academic writing is irrelevant because it relies on quantification is one of the more high-profile examples. The recent reception of Franco Moretti’s National Book Critics Circle Award for Distant Reading is another case in point. What’s so valuable about Moretti’s work on quantifying literary history, according to the New Yorker’s books blog, is that we can ignore it. “I feel grateful for Moretti,” writes Joshua Rothman. “As readers, we now find ourselves benefitting from a division of critical labor. We can continue to read the old-fashioned way. Moretti, from afar, will tell us what he learns.”

We can continue doing things the way we’ve always done them. We don’t have to change. The saddest part about this line of thought is that it is not just the voice of journalism. You hear the same thing inside academia all the time: it (meaning the computer, or sometimes just numbers) can’t tell you what I already know. Indeed, the “we already knew that” meme is one of the most powerful ways of dismissing any attempt to bring together quantitative and qualitative approaches to thinking about the history of ideas.

As an inevitable backlash to its seeming ubiquity in everyday life, quantification today is tarred with a host of evils. It is seen as a source of intellectual isolation (when academics use numbers, they alienate themselves from the public); a moral danger (when academics use numbers to understand things that shouldn’t be quantified, they threaten to undo what matters most); and, finally, an irrelevance. We already know all there is to know about culture, so don’t even bother.

I hope one day this will all pass and we’ll see the benefits of not thinking about these issues in such either/or ways, like the visionary secretary of Jules Verne’s imaginary “Baltimore Gun Club,” who cries, “Would you like figures? I will give you some eloquent ones!” In the future, I hope there will be a new wave of intellectualism that insists on conjoining these two forms of thought, the numerical and the literal, figures and eloquence. It seems so much more human.
