The Thin Reed of Humanism

Albrecht Dürer, Melencolia I (1514), Städel Museum

Leon Wieseltier is at his cantankerous best in his latest essay “Among the Disrupted.” After two opening paragraphs that are difficult to read as anything but a commentary on the recent demise of the New Republic, Wieseltier returns to his own well-trod turf:

[A]s technologism, which is not the same as technology, asserts itself over more and more precincts of human life, so too does scientism, which is not the same as science. The notion that the nonmaterial dimensions of life must be explained in terms of the material dimensions, and that nonscientific understandings must be translated into scientific understandings if they are to qualify as knowledge, is increasingly popular inside and outside the university, where the humanities are disparaged as soft and impractical and insufficiently new. The contrary insistence that the glories of art and thought are not evolutionary adaptations, or that the mind is not the brain, or that love is not just biology’s bait for sex, now amounts to a kind of heresy.

Wieseltier is reprising many of the themes of his public feud with Steven Pinker in the pages of the New Republic (here, here, and here). More than anything else, that earlier spat and Wieseltier’s latest essay are barometers of our impoverished cultural imagination concerning the relationship among science, the humanities, and technology.

When Wieseltier invokes “scientism,” he’s gesturing toward real concerns about the reductive materialism or naturalism that tends to underlie the work of popular polemicists like Dawkins, Dennett, and Pinker. He is not denying that our world and our selves can, in part, be explained through material mechanisms. I assume he enjoys the benefits of modern medicine like the rest of us.

But terms like “scientism” and “technologism,” however well-intentioned, can obscure more than they clarify. Those who bandy them about presume, as the historian James Schmidt lays out, a number of things. First, they presume that there are different ways of knowing the world. There are limits to a uniquely scientific knowledge. There are some things that cannot be fully explained by modern science. Second, they presume that they can discern what those boundaries are. And, finally, they presume that they can diagnose the deleterious consequences of these illicit boundary crossings.

I’m sympathetic to all three of these premises. But I’m much less confident in our ability to identify where science begins and ends than those who so diligently guard the borders of knowledge exclaiming “scientism!” when they suspect interlopers. Those who invoke “scientism”—and there is a long tradition of its use and occasional abuse as Schmidt has wonderfully documented—put themselves in the position not only of policing the borders of knowledge but also of distinguishing real science, a science that knows its place, from a false science, a science that engages in constant and illicit “border crossing.”

My point is that these ominous-sounding terms are all too often used as polemical cudgels. They refer to an illicit encroachment of one sort of knowledge into what is perceived as the proper sphere of another. Thus, “scientism” ultimately refers to the use of uniquely scientific knowledge within a distinctly non-scientific domain. Any appeal to the biological basis of love would be “scientism.” And very bad. These big, ugly words are, in short, the searchlights of our epistemic border guards.

But if “technologism” and “scientism” refer to types of knowledge that don’t know their place, then what type of knowledge or disposition adjudicates where these boundaries begin and end? For Wieseltier and many others today, it’s humanism. Humanism is the positive correlate of all those other “–isms,” those forms of knowledge that blithely stray beyond their boundaries.

But what is humanism? “For a start,” writes Wieseltier, “humanism is not the antithesis of religion, as Pope Francis is exquisitely demonstrating.” The most common understanding of humanism is that it denotes a pedagogy and a worldview:

The pedagogy consists in the traditional Western curriculum of literary and philosophical classics, beginning in Greek and Roman antiquity and — after an unfortunate banishment of medieval culture from any pertinence to our own — erupting in the rediscovery of that antiquity in Europe in the early modern centuries, and in the ideals of personal cultivation by means of textual study and aesthetic experience that it bequeathed, or that were developed under its inspiration, in the “enlightened” 18th and 19th centuries, and eventually culminated in programs of education in the humanities in modern universities. The worldview takes many forms: a philosophical claim about the centrality of humankind to the universe, and about the irreducibility of the human difference to any aspect of our animality; a methodological claim about the most illuminating way to explain history and human affairs, and about the essential inability of the natural sciences to offer a satisfactory explanation; a moral claim about the priority, and the universal nature, of certain values, not least tolerance and compassion. It is all a little inchoate — ­human, humane, humanities, humanism, humanitarianism; but there is nothing shameful or demeaning about any of it.

Yes, it is all rather “inchoate.” And therein lies the problem.

Wieseltier is correct about the long and admirable lineage of a humanist classical pedagogy, less so about the worldview claim. “Humanism” as a human-centered worldview is a neologism invented not in the Renaissance but in the early nineteenth century as another polemical cudgel, one used to fight the same types of cultural battles that Pinker and Wieseltier have long been waging.

As far as I can tell, humanism, or rather its German cognate Humanismus, was first used in 1808 by the German pedagogue and philosopher F.I. Niethammer (1766–1848). In The Conflict of Philanthropinism and Humanism in Contemporary Theories of Education and Pedagogy, he juxtaposed humanism with philanthropinism, an Enlightenment-era educational theory that regarded the human as a natural being who needed to develop his or her natural capacities. What distinguished “humanism” from more modern forms of education was an underlying concern for, as Niethammer put it, “the humanity [over] the animality” of the human. As a worldview, humanism subordinated the body to reason and defended the autonomy of human nature from the material world. As first used by Niethammer, it was a boundary term; it marked what he took to be the clear line separating the mental from the material, the human from the animal.

When critics invoke “humanism” against “scientism” or “technologism,” they presume to know the proper boundaries of science and technology; they presume that they can readily and forcefully articulate where scientific knowledge ends and humanistic knowledge begins. They assume the role of guardians of our intellectual and ethical world. That’s a heavy burden.

But it’s also a presumption that ignores how much of our knowledge comes from these border crossings. It’s at the margins of our established ways of engaging our world and ourselves that new ways of seeing and imagining what it is to be human so often emerge. We may well need knowledge police and concepts like “scientism” and “humanism” to warn us of charlatans and interlopers, but we should hope that they do so with a little less alacrity and a bit more humility.


Who Needs Captains of Erudition?

Long before “corporatization” became synonymous with higher education, Thorstein Veblen, the early twentieth-century American sociologist who coined the term “conspicuous consumption,” dismissed American universities as little more than “competitive businesses.” In The Higher Learning in America (1918), published just over forty years after Johns Hopkins was founded as America’s first research university, he described the contemporary university as a “business house dealing in merchantable knowledge, placed under the governing hand of a captain of erudition, whose office it is to turn the means in hand to account in the largest feasible output.” The modern American university president wasn’t a scholar, an intellectual, a scientist, or even much of a leader. He was the manager of systems, the chief of a concern, the captain of erudition.

Thorstein Veblen, by Edwin B. Child, 1934. Courtesy of Yale University Art Gallery, Gift of Associates of the Sitter. A protege of J. Laurence Laughlin, the first head of political economy, Veblen began his uneasy passage through the University in 1892.

Botstein and Bard

Leon Botstein, the charismatic conductor of the American Symphony Orchestra and president of Bard College, is no captain of erudition. “Botstein’s voice,” writes Alice Gregory in the New Yorker,

telegraphs a wizardly moral authority. Everyone responds to it, but parents, primed to be proud of their children, are especially susceptible. ‘We live in a time where people don’t really believe in education. That doubt is something we struggle with,’ he said. ‘Your enthusiasm, your determination, your idealism about education gives back to us a reminder of why we should fight for what we do.’

For Botstein, the “quantification of American higher education,” introduced by university administrators who just want to keep their jobs and facilitated by spineless faculty who have given up on the liberal arts, is a moral affront.

Botstein’s earnest and tireless defense of an ideal, however, might just doom this small liberal arts college, 90 minutes north of New York City. Bard, where all those black-clad kids who read Sartre in high school wound up, is the singular creation of Botstein’s will and personality. But in December 2013, Moody’s Investors Service lowered Bard’s credit outlook to “negative.” And now some of its trustees are worried. Susan Weber, a trustee and donor, said:

Everyone says, ‘Oh, he’s the most amazing fund-raiser.’ Well, I wish that were so, because we wouldn’t be so underfunded if he were that amazing. I think he’s good at it—he works hard at it—but his real strength is building an institution.

“But”?  If one word can be said to embody the confusion over the purposes of higher education, that but might be it.

Botstein built an institution with a vision, but only a captain of erudition can, it seems, sustain it.

Weber’s resigned admission of what Bard needs after Botstein has become the assumption of many university boards. University presidents shouldn’t lead national debates or make moral claims; they should alleviate political pressures and mollify the idiosyncrasies of donors. Ours is the age of the competent commander-in-chief—we need accountants, not idealists.

Veblen’s Prescience—in Our Own Backyard

On June 10, 2012, my colleagues and I at the University of Virginia (UVa) learned that Veblen had been all too prescient. Helen Dragas, Rector of UVa’s Board of Visitors, briefly and matter-of-factly informed us that our president had been fired:

On behalf of the Board of Visitors, we are writing to tell you that the Board and President Teresa Sullivan today mutually agreed that she will step down as president of the University of Virginia effective August 15, 2012. For the past year the Board has had ongoing discussions about the importance of developing, articulating and acting on a clear and concrete strategic vision. The Board believes that in the rapidly changing and highly pressurized external environment in both health care and in academia, the University needs to remain at the forefront of change.

Over the following weeks, my colleagues and I, joined by an international audience, speculated about these unspecified “philosophical differences” between President Sullivan and the Board of Visitors; we wondered about the “clear and concrete strategic vision” for which the Rector called. Hadn’t we already been subjected to years of strategic planning?

After ten days of increasing frustration and concern from faculty, students, and alumni, Dragas sent a second email. This one listed a number of “challenges” facing UVa that, Dragas implied, Sullivan had no plan to address: the long-term decline in state funding for public universities, the disruptive effects of new technologies, rising tuition costs, increasing enrollments and an aging faculty (with no money to replace it), increasing demands for faculty and curricular assessment—not to mention the ever-expanding roles that the contemporary university plays as health-care provider, entertainment center, sports venture, industrial and government research center, and, by the way, educator. In short, the university faced a whole host of challenges, none of which were unique to UVa.

UVa President Teresa Sullivan speaks on the steps of the Rotunda after addressing a closed session of the Board of Visitors, June 2012; photo © Norm Shafer

But between June 10 and Sullivan’s ultimate reinstatement on June 26, something else happened on Grounds, something that most stories and accounts of the summer’s events missed in their efforts to chronicle the process. It surprised me at the time, and I still struggle to make sense of it. (Talbot Brewer also tried to make sense of this series of events in the summer issue of The Hedgehog Review.)

For about two weeks, UVa faculty members paid scant attention to the myriad problems that the Rector identified; they didn’t demand political intervention; they didn’t split up into conservative and liberal corners and revive culture-war arguments (the liberal faculty against the conservative administration). For two weeks, my colleagues condemned the Board of Visitors’ actions by making explicitly ethical arguments, arguments grounded in claims about the moral purposes of the university: What the university was and ought to be. Some colleagues defended and invoked an honor code with which we usually engage, if at all, only ironically. Others celebrated founder Thomas Jefferson’s commitment to higher education as a public and democratic good, but without the ironic winks that usually accompany such discussions. There was even an impassioned defense of peer review as an ethical practice. Whatever their particular content, the arguments led to a broad consensus: This wasn’t right, this wasn’t how a university ought to be run.

With our backs to the wall and overcome by the sense that our university was imperiled, we faculty members made arguments that were not, in the first instance, financial, technological, or political. We made normative claims about what a university ought to be. That is, the arguments that my colleagues mustered focused on the moral character and purposes of the university. Faculty were engaged and motivated by a general and rather vague sense that the moral authority of the university had been threatened.

Can We Afford Our Future?

My colleague Siva Vaidhyanathan has continued to make these arguments. Recently, while writing of another attempt to oust a public university president, this time at the University of Texas, Vaidhyanathan defended the increasingly beleaguered notion of the university as a public good:

The tuition increases and the realization that the payoffs from universities are deferred and unquantifiable pushed legislators and “reformers” to demand accountability and radical administrative transformations. This has only served to make it harder for faculty to teach and conduct research. It has made the richest nation in the history of the world act like it can’t afford to believe in its own future, respect its own culture, or foster the experimentation and knowledge that might serve the entire planet.

The university is more than an “inefficient and outdated information delivery system.” It is a public good because it advances, conserves, refines, and shares knowledge for the world. And it does so most basically by forming people who believe that knowledge is a public good.

Leon Botstein may at times be bombastic. And he is always, without question, idealistic. At a moment when the very purposes and values of universities are being reshaped in the name of efficiency and disruption, we don’t need captains of erudition. We need leaders who embody the true ethos of our institutions.


Cultural Critics vs. Social Scientists—They Both Stink

The past few weeks have seen some heady attempts at generalization: first Sam Tanenhaus’s piece on “Generation Nice” and then A.O. Scott on the “Death of Adulthood.” (Potential correlation there?)

The subsequent critiques of both were withering. Tanenhaus’s article proved to be laden with errors, resulting in hilarious corrections by the Times editorial staff. In response to the article’s proof of millennials’ niceness, the editors wrote:

An article last Sunday about the millennial generation’s civic-mindedness included several errors…. Applications to the Peace Corps recently have been in decline with a 34 percent decrease from the peak in 2009, and applications to Teach for America decreased slightly last year; neither organization has seen “record numbers of new college graduates” applying for jobs.

Well done. And the uncorrected remainder apparently relied on citations of studies that cited studies that cited…an ad agency?!

As for Scott (one of the Quant & the Connoisseur’s favorite film critics), his reflections on adulthood’s imminent, if not already occurring, death come from having watched a lot of TV. “Something profound has been happening in our television over the past decade,” Scott announces with a foreboding sense of doom (it’s gotten worse?). And then, in an alliterative jingle that would make even the best nineteenth-century speechwriter wriggle, “It is the era not just of mad men, but also of sad men and, above all, bad men.”

So there you have it: A few shows (Mad Men, The Sopranos, Breaking Bad) have chronicled the decline of white patriarchy, which is a good stand-in for the decline of adulthood, which in turn is a good stand-in for a major shift in “American Culture.” Imagining that all of adulthood, and masculinity in particular (Scott’s real aim), was coming to an end because of a few televisual-fantasies of bad dads like Don Draper ignored, as David Marcus pointed out, a whole lot of other stuff on TV that most people actually watch, like, say, football (or NCIS or NCIS: LA). Masculinity is doing just fine there (by which I mean on display, not as in, oh-so-admirable).

One would think at this point the answer is Big Data to the rescue. Instead of making whopping generalizations based on a few selective examples, turning culture into data can give us a much better view of the “big picture” (preferably as a picture: through that most ubiquitous of contemporary genres, the infographic). If we look broadly, what is “television” telling us and how would we segment it into different groups, for surely it is not telling all of us the same thing?

The problem is, as Marcus pointed out, that the social scientists who traffic in cultural data mining haven’t done much better. Turning culture into data is not a seamless process, nor is its interpretation. We all know this, yet we seem unable to heed it when a juicy headline—in other words, a chance to tell a story—is on offer. Narrative trumps reason in fascinating ways.

The point is not, oh forget it, let’s just let Tanenhaus make it up after all. A good story is a good story and you can’t count culture anyway. The point is we need a lot more work on the work of translating culture into data before we go ahead and start calculating and interpreting. What would be a representative sample of “TV” or “pop-culture”? How would you measure depictions of adulthood or “masculinity” (either as positively or negatively coded)? What is your control set, i.e., what are you comparing this against? And so on.

The real answer is that we need to think more about the process of cultural modeling. How do we model a cultural subset through a data set (a generation, for example, or contemporary television), and how do we model a cultural practice or concept through a particular measurement? These aren’t easy questions, but answering them is the prerequisite for correcting the journalistic just-so stories of cultural criticism. Consider the sketch below.
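To make those modeling decisions concrete, here is a minimal, purely illustrative sketch in Python (my own, not drawn from the post): the shows, the viewership figures, and the hand-coded “troubled patriarch” label are all hypothetical, but the sketch shows how the choice of sample and measurement drives the conclusion.

```python
# Hypothetical sketch: how the "sample of TV" and the coding scheme
# determine the conclusion. All figures and labels are invented for
# illustration; nothing here is real audience data.

# (show, weekly viewers in millions, coded 1 if it depicts a troubled patriarch)
prestige_dramas = [
    ("Mad Men", 2.5, 1),
    ("Breaking Bad", 6.0, 1),
    ("The Sopranos", 9.0, 1),
]
broad_audience = [
    ("NCIS", 20.0, 0),
    ("NCIS: LA", 17.0, 0),
    ("Sunday Night Football", 21.0, 0),
]

def weighted_share(shows):
    """Viewer-weighted share of shows coded as depicting the trait."""
    total_viewers = sum(viewers for _, viewers, _ in shows)
    flagged_viewers = sum(viewers for _, viewers, flag in shows if flag)
    return flagged_viewers / total_viewers

# Sampling only prestige dramas: the "decline of adulthood" looks total.
print(weighted_share(prestige_dramas))                   # -> 1.0
# Adding what most people actually watch changes the picture entirely.
print(weighted_share(prestige_dramas + broad_audience))  # -> roughly 0.23
```

Neither number is a finding; the point is that both depend entirely on prior, arguable decisions about what stands in for “TV” and how a concept like masculinity in decline gets coded.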

This is the time for humanists to jump into the fray, not to put our heads in the sand and say, “You can’t count that!” The challenge is to think about counting culture in more sophisticated ways and so avoid the mythologizing that passes for cultural criticism these days.

 


Quit Lit: Do the Humanities Need the University?


There’s a new genre taking shape on blogs, Twitter, and even in the pages of The London Review of Books: Quit Lit. Just last week, Marina Warner, a creative writing professor and member of the Man Booker Prize committee, explained her decision to resign her position at the University of Essex. In “Why I Quit,” she describes the bureaucratic disciplines of England’s new Research Assessment Exercises, which tabulate and calculate academic labor with the efficiency and mindlessness usually reserved for an assembly plant (and a low-tech one at that).

In a scene she must have embellished by channeling Kafka U., Warner recounts a meeting with her new dean:

A Tariff of Expectations would be imposed across the university, with 17 targets to be met, and success in doing so assessed twice a year. I received mine from the executive dean for humanities. (I met her only once. She was appointed last year, a young lawyer specialising in housing. When I tried to talk to her about the history of the university, its hopes, its “radical innovation,” she didn’t want to know. I told her why I admired the place, why I felt in tune with Essex and its founding ideas. “That is all changing now,” she said quickly. “That is over.”) My “workload allocation,” which she would “instruct” my head of department to implement, was impossible to reconcile with the commitments which I had been encouraged—urged—to accept.

Confused but, more deeply, defeated by this new regime, Warner resigned. But she continued her work for the Man Booker Prize committee, which, as it turns out, has proven rather clarifying.

Among the scores of novels I am reading for the Man Booker International are many Chinese novels, and the world of Chinese communist corporatism, as ferociously depicted by their authors, keeps reminding me of higher education here, where enforcers rush to carry out the latest orders from their chiefs in an ecstasy of obedience to ideological principles which they do not seem to have examined, let alone discussed with the people they order to follow them, whom they cashier when they won’t knuckle under.

As a genre, Quit Lit has a few organizing features. Its form tends to be personal and aggrieved. The university, like those vague but all-powerful institutions in Kafka’s texts, has been overtaken by an alien, usually bureaucratic-statist-inhumane power. And its content tends to be not just about the decline of the university but also about the impending demise of the humanities. By turning universities into vocational schools, we are robbing our children of humanistic forms of thought and the good that ensues. (If scientists wrote prose like humanists, maybe they would be writing about the end of the university and the collapse of science. NPR had a go at Quit Lit this past week in its series on the dramatic cuts in basic science funding and the effects they are having on future generations of scientists.)

As with all literary genres, Quit Lit has its predecessors. Before there were Rebecca Schuman and NeinQuarterly’s Eric Jarosinski, there was another German scholar experimenting in the genre, Friedrich Nietzsche. In 1872, just three years after he landed his first, and only, professorship at the University of Basel without even having finished his dissertation, Nietzsche delivered a series of lectures, On the Future of Our Educational Institutions, in the city museum. Before crowds of more than 300 people, Nietzsche staged a dialogue on the future of German universities and culture between two young students and a cantankerous old philosopher and his slow-witted but earnest assistant.

The grousing philosopher lamented the decline of universities into state-sponsored factories that produced pliant citizens and mindless, “castrated” scholars who cared not a bit for life. By the end of the lectures, it’s difficult to say whether Nietzsche thought there was a future at all for German universities. Nietzsche lasted a few more years in his position, resigning only when ill health forced him to. But he left an oeuvre that looked to the university and saw little but ruin.

As Nietzsche was writing, parts of the German university might not have been in decay, but they were in decline, the humanities in particular. Between 1841 and 1881, enrollment in philosophy, philology, and history within “philosophy faculties,” which comprised the core liberal arts fields, declined from 86.4 percent to 62.9 percent, whereas enrollments in mathematics and the natural sciences increased from 13.6 to 37.1 percent of all students matriculating at German universities. The mood among humanists was often such that they sounded quite a bit like the embattled literature professors of today. In academia, crisis is generally a matter of perception, and even in what now seems like a “golden age” for humanists, there was, in fact, a seismic shift for the humanities.

More recent forms of Quit Lit tend to lack a key feature of Nietzsche’s model, however. Nietzsche never conflated the humanities or humanistic inquiry with the university. For him, humanistic inquiry—and Nietzsche was deeply humanistic as his lifelong commitment to philology attests—transcended the institutional and historically particular shape of universities, which he saw as little more than extensions of a Prussian bureaucratic machine.

In what increasingly seems like a related genre, contemporary academics and intellectuals of all sorts have ostensibly been defending the humanities. But more often than not they actually defend certain forms of scholarship as they have come to be institutionalized, largely in twentieth-century American research universities. Geoffrey Galt Harpham recently produced the most egregious but well-argued example of this tendency with The Humanities and the Dream of America. His basic thesis is that the humanities as they are now practiced were an invention of post–World War II American research universities. Similarly, Peter Brooks’s edited collection The Humanities and Public Life, with its focus on disciplines, scholarship, and the imperatives of the university, inadvertently echoes the same claim. Both books conflate the humanities with their departmental and institutional shapes in universities.

In the measured “yes but” prose of academic speak, Patrícia Vieira gives this spirit of conflation ethical shape in a review entitled “What are the Humanities For?”:

Debates about the “future of the humanities” frequently revolve around the suspicion that the humanities might not have one. Yet despite the direness of this anxiety—an anxiety especially personal for every academic worried about professional choices or mortgage payments—conversations on the topic are often dull, long-faced affairs. Every professor has sat through one or another of these depressing discussions. The conversation proceeds according to a familiar set of pieces: there are passionate apologias of work in philosophy, literature, history, and the arts; veiled criticism of the anti-intellectualism of higher education administrators and society at large; and vague pledges to do more interdisciplinary research and extend a fraternal hand to the social and natural sciences, who remain largely unperturbed by this plight. The whole thing wraps up with the reassuring conviction that, if the humanities go down, they will do so in style (we study the arts, after all), and that truth is on our side, all folded in a fair dosage of indulgent self-pity.

Vieira can’t imagine the future of the humanities beyond the anxieties of professors and the failures of university administrators. All she can muster is a few gentle and inveterately academic admonitions for her authors:

Brooks’s and [Doris] Sommer’s [The Work of Art in the World: Civic Agency and Public Humanities] books coincide in their desire to persuade those skeptical about the importance of the arts and the humanities of their inherent worth. The volumes set out to prove that these disciplines play a crucial role in public life and that they are vital to contemporary culture. Brooks’s collection often falls short of this goal by sliding into fatalistic rhetoric about the doomed future of humanistic scholarship—the very discourse the book attempts to combat—all while ignoring some of the vibrant new research in the field. In contrast, Sommer is overconfident in the power of the arts to tackle thorny socioeconomic and political problems. Both the despondent and celebratory approaches are symptomatic of the beleaguered state of the field, forced to justify its existence based upon technocratic principles that demand immediate results and fast returns. The humanities are constantly compelled to demonstrate practical results or hopelessly admit to lacking a concrete and immediate function, straitjacketed into foreign modes of valuation lifted from the empirical sciences. Neither a dying set of disciplines nor a panacea for social ills, the humanities remain a central form of human enquiry, in that they shed light on and question the tacit assumptions upon which our societies are based, outline the history of these values, and identify alternatives to the status quo.

Despite her attempts to cast the humanities as a form of “human” inquiry, Vieira is writing about a beleaguered and exhausted profession. There are only professors and their disciplines here. And they both are trapped, as Nietzsche would say, in a “castrated” passive voice: “The humanities are compelled ….” There are no agents in this drama, just put-upon, passive professors.

I am not suggesting that we should give up on universities. Universities, especially modern research universities, have long helped sustain and cultivate the practices and virtues central to the humanities. But just as German universities were becoming international paradigms, emulated from Baltimore to Beijing, Nietzsche made a fateful diagnosis. Those practices and virtues could ossify and wither in the arcane and self-justifying bowels of the modern, bureaucratic university. “Human inquiry,” in contrast, would live on.

We may well benefit from an exercise in imagination. Could the humanities survive the collapse of the university? I think so.


Apple Watch and the Quantified Self

Today Apple unveiled its latest technological creation, the Apple Watch, a wearable computer that tracks not only time but your every step, heartbeat, and calorie. With its latest product, Apple joins the growing market of devices and apps—Fitbit, Basis, My Fitness Pal—that track and record our activities and biostatistics. Given Apple’s commercial influence, the Apple Watch may well turn the nascent Quantified Self (QS) movement into a cultural mainstay delivering “self knowledge through numbers.”

Most QS practices track health-related activities such as calorie intake, exercise, and sleep patterns, but they are increasingly used to document and track experiences of grief, exploration, and productivity. And tracking apps and devices are even making their way into unexpected areas of life experience. Attempts to measure the soul, data point by data point, for example, are increasingly common. Just last January a Menlo Park pastor teamed up with a University of Connecticut sociologist to create SoulPulse, which, as Casey N. Cep explains, is

 a technology project that captures real-time data on the spirituality of Americans. SoulPulse attempts to quantify the soul, an unbodied version of what FitBit, the exercise-tracking device, has done for the body. After filling in a brief intake survey on your age, race, ethnicity, education, income, and religious affiliation, SoulPulse contacts you twice a day with questions about your physical health, spiritual disciplines, and religious experiences. Each of the surveys takes less than five minutes to complete.

SoulPulse encourages users to learn about their “spirituality” through the power of big data and digital automation. This may sound crazy, but what’s the difference between tracking your daily prayer life with an app and doing so with another set of repeatable instructions, such as the Benedictine Rule and its set of daily readings and reminders to ponder God?

Many aspects of the QS movement are anything but new. Western cultures have long maintained practices for documenting behaviors and experiences in order to discipline the self. Capitalism and quantifying the self have been intimately linked for some time. Early accounting practices allowed businessmen to understand the consequences of their behavior so that it could be modified in the future. Merchants developed account logs that let them track the results of their business transactions: perhaps they had purchased too much grain and it spoiled before it could be sold; the following year, the same merchant could alter his practice based on this cataloged information. And Frederick W. Taylor’s scientific management theories relied on precise measurements of workers’ efficiency.

And more in the tradition of St. Benedict, people have long kept track of their spiritual lives. Benjamin Franklin dutifully recorded his success in adhering to a list of thirteen virtues each day. Diaries and journals have long been witness not just to bad poetry but to detailed lists of eating and sleeping habits. Weight Watchers, founded in 1963, turned such practices into a business with its point system.

Despite such similarities, tracking devices such as Apple Watch are not the same as eighteenth-century diaries. The former have the potential to revolutionize the health sector and facilitate better care, but what happens when they don’t just give away our desires on Facebook (I like this!) but open up a one-way data stream on our bodies? How long will it take for all that personal data to make its way to our insurance companies? (The now-common annual biometric screenings will seem quaint by comparison.)

Self-reflection and personal development are broad cultural values. But what happens to us when we focus on aspects of ourselves that are easily recorded and converted into numbers? QS enthusiasts advocate for the expansion of tracking devices from the private sphere into the work environment, where they might provide insights on employee selection, promotion, and productivity. How will tracking social and personal behavior, such as how many times one smiles during the day, alter work environments and those who inhabit them?

Digital practices and techniques for tracking and disciplining the self differ from their analogue and print predecessors for several reasons. First, what they can track has expanded. Benjamin Franklin most likely didn’t know the rate of his perspiration. Second, the precision with which data is measured and recorded is continually increasing. Third, tracking devices and apps are increasingly frictionless: They do their job with minimal interruption and effort on the part of the user. Finally, the digital format of the data represents a marked difference from records of the past. Many of these tracking devices easily connect to apps and programs that analyze the data, dictating to the individual a pre-programmed assessment of success or failure. The digital nature of the information also makes it easily available and transferable.

These new developments, and the manufacture and dissemination of these technologies and apps through popular and trusted brands such as Apple, are likely to expand the degree to which individuals come to imagine themselves, their bodies, and their habits through and as numbers. As we continue into our quantified future, will these new digital practices alter what it means to be a good person, a successful person, or an efficient person? Will we be able to juke the numbers? Just because the technology is intended to track behavior and facilitate modification of that behavior doesn’t mean that it won’t be put to other purposes. What will we make of our new digital tracking practices and the self that we come to know through numbers?

Claire Maiers is a graduate student in the Department of Sociology at the University of Virginia.


Humanities in the Face of Catastrophe

Earlier this month, my colleague Bethany Nowviskie asked a group of digital humanities scholars gathered in Lausanne, Switzerland, to consider their work—its shape, its fate, its ends—within the context of climate change and the possibility of our own extinction—ours as in the human species. (How’s that for a dinner-time keynote? Bourbon, please.)

The premise of Nowviskie’s talk, “Digital Humanities in the Anthropocene,” was the fact of climate change and the irrevocable consequences that it could have for life on our planet. What would it mean, she asked, to be a humanities scholar in the anthropocene, a geological epoch defined by human impact on the natural world? What would it mean to practice the humanities within a geological scale of time?

Whatever the fate of the “anthropocene” as a term (its existence and even its inception are still being debated among geologists), the scientists, activists, and scholars who invoke it consider human activity and practices as inseparable from nature. Whether they intend to or not, they thereby challenge basic ideas about the human, culture, and agency that have sustained the humanities for centuries.

The very notion of the anthropocene and the suggestion that humans could cause geological-level change undermine, as Dipesh Chakrabarty puts it, the “age-old humanist distinction between natural history and human history.” From Vico’s famous claim that humans could know only what they have created, while nature remains God’s inscrutable work, to Kant’s understanding of cosmopolitan history as purposive action, Western thinkers have long distinguished between human and natural history. But granting humans a geological agency, however aggregative and delayed, undermines this distinction. What becomes of history and culture when they can no longer be thought of as simply human?

For one thing, we can better conceive of our cultural and historical labor as bound to the contingency, finitude, and flux of nature, and, thus, cultivate a more acute awareness of how extinguishable our labor can be. Here’s Nowviskie speaking to her digital humanities colleagues:

Tonight, I’ll ask you to take to heart the notion that, alongside the myriad joyful, playful scholarly, and intellectual concerns that motivate us in the digital humanities—or, rather, resting beneath them all, as a kind of substrate—there lies the seriousness of one core problem. The problem is that of extinction—of multiple extinctions; heart-breaking extinctions; boring, quotidian, barely-noticed extinctions—both the absences that echo through centuries, and the disposable erosions of our lossy everyday. We edit to guess at a poet’s papers, long since burned in the hearth. We scrape through stratigraphic layers of earth to uncover ways of life forgotten, and piece together potsherds to make our theories about them hold water. Some of us model how languages change over time, and train ourselves to read the hands that won’t be written anymore. Others promulgate standards to ward against isolation and loss. With great labor and attention, we migrate complex systems forward. We redesign our websites and our tools—or abandon them, or (more rarely) we consciously archive and shut them down. DHers [digital humanities scholars] peer with microscopes and macroscopes, looking into things we cannot see. And even while we delight in building the shiny and the new—and come to meetings like this to celebrate and share and advance that work—we know that someone, sooner or later, curates bits against our ruins.

Humanities labor represents, continues Nowviskie, a deeply human striving to communicate “across millennia” and “a hope against hope that we will leave material traces,” in stone, manuscript, print, or digital form.

This is where the strivings of humanities scholars and geologists of the anthropocene intersect. They both rummage about nature for traces of the human. And while for some, such ceaseless searching is driven by an indomitable humanism, a confidence in a human spirit that will always live on, others are haunted by the possibility of our extinction and the end, as Nowviskie puts it, “of so much worldly striving.” But those of us who are in search of a fragmented and sundered human are also motivated by this prospect of extinction.

When Nowviskie exhorts us to “dwell with extinction,” she echoes a humanistic disposition that has long been defined by the specter of loss. Since at least the studia humanitatis of the early modern period, scholars have practiced the humanities in anticipation of catastrophe. Theirs, however, was not a crisis of the guild, anxieties about budget cuts and declining humanities enrollments, but a fear of the radical and irreparable loss of the human record.

Many of the great encyclopedic works of the early modern European scholars were defined, as Ann Blair describes them, by the “stock-piling” of textual information intended to create a “treasury of material.” The early modern masterpieces of erudition—such as Conrad Gesner’s Bibliotheca universalis (1545) or Johann H. Alsted’s Encyclopaedia septem tomis distincta (1630)—were motivated not simply by information-lust but by a deeply cultural conception of the ends of scholarship, a desire to protect ancient learning and what humanists considered to be its integrity and authority. These early humanists, writes Blair, “hoped to safeguard the material they collected against a repetition of the traumatic loss of ancient learning of which they were keenly aware.” This loss had rendered Greek and Roman antiquity inaccessible until its gradual recovery in the fifteenth and sixteenth centuries. Humanist scholars saw encyclopedias and related reference works that collected ancient learning as guarantees that knowledge could be quickly reassembled should all books be lost again. They were busy collecting and curating because they feared another catastrophe.

Similarly, Denis Diderot described the eighteenth-century Encyclopédie as insurance against disaster:

The most glorious moment for a work of this sort would be that which might come immediately in the wake of some catastrophe so great as to suspend the progress of science, interrupt the labors of craftsmen, and plunge a portion of our hemisphere into darkness once again. What gratitude would not be lavished by the generation that came after this time of troubles upon those men who had discerned the approach of disaster from afar, who had taken measures to ward off its worst ravages by collecting in a safe place the knowledge of all past ages!

But even as Diderot promised his contributors future glory, he also acknowledged how fleeting the whole endeavor could be. The Encyclopédie, he lamented, would be irrelevant as soon as it was printed. The traces of the human that the “society of gentlemen” had so diligently collected in print would be out of date upon its first printing. Even print could not stop time.

In the nineteenth century, the famous German classicist August Böckh claimed that all this striving to collect and protect the material traces of the human was exemplified in the science of philology, which he defined as the Erkenntnis des Erkannten, or knowledge of what is and has been known. Our current knowledge is only as good as our past knowledge. And working with the fragmented documents of that past gave philologists an acute sense of how fragile our knowledge and history of ourselves was. The philologist’s attention to and care for the material nature of human culture—its embodiment in documents and texts of all sorts and qualities—was cultivated by a consciousness of how fragile it all was. The only thing that bound the species together was a documentary record always under threat.

Today that documentary record is undergoing a transition, unprecedented in its speed and extent, from printed to digital forms. Some scholars, archivists, and librarians warn that this move could prove catastrophic. Things of our past may be lost and forgotten. But if the long history of the humanities teaches us anything, it is that humanistic work has always been practiced in the face of catastrophe.

In our digital age, the vocation and the disposition of the humanities remains intact. “The work of the humanist scholar,” writes Jerome McGann, “is still to preserve, to monitor, to investigate, and to augment our cultural life and inheritance.” And it’s this disposition that thinking about the humanities in the anthropocene may help us recover.

 

Photograph: The dove and the raven released by Noah, with drowning people and animals in the water beneath the ark. From the Holkham Bible Picture Book, Add MS 47682, England: second quarter of the 14th century, Parchment codex, The British Library, London.


The Ethics of Squirming

Most readers of the Infernal Machine (though you, like us, may have been taking a break from blogging) have probably read something about the controversial Facebook “emotional contagion” study. It seems that a couple of years ago a researcher at Facebook decided to see what would happen if he tweaked the words in the “News Feed” of close to 700,000 users so as to manipulate its “emotional content.” Would users respond on their Facebook pages in step with the manipulations? That is, could Facebook make people feel better if it tweaked its secret algorithms to prioritize “positive” words in the News Feed? (People might spend more time on Facebook if they could!)

The researcher, Adam Kramer, then brought in some university researchers to look at the massive data set. They did some “big data” statistical analyses (of a relatively straightforward type) and added some psychological theory and found this:

When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.

The paper was peer reviewed and published in the prestigious Proceedings of the National Academy of Sciences (PNAS).

The merits of the study itself are highly questionable. My social science colleagues tell me that with such a massive sample size, you are almost always going to arrive at “statistically significant” findings, whether you are measuring “emotional contagion” or compulsive belching. The fact that the statistically significant “effect” was minimal throws more doubt onto the validity of the study. Furthermore, those same colleagues tell me it is doubtful that the study is really measuring “emotional contagion” at all—there are other theories available that would explain why, when Joe posts a “negative” statement, Jane is reluctant to follow with a “positive” one.
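A quick simulation helps make the sample-size point. This is my own illustration, not the study’s data or analysis, and every number in it is invented: with roughly 700,000 users split across conditions, even a vanishingly small difference in average “positive” posting registers as highly statistically significant, which is why the tiny effect size matters more than the p-value.

```python
# Illustrative simulation (not the Facebook data): a negligible true effect
# becomes "statistically significant" once n is in the hundreds of thousands.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 350_000  # users per condition, roughly matching the study's ~700k total

# Hypothetical "positive word rate" per user, differing by 0.02 standard deviations
control = rng.normal(loc=5.00, scale=1.0, size=n)
treated = rng.normal(loc=5.02, scale=1.0, size=n)

t_stat, p_value = stats.ttest_ind(treated, control)
pooled_sd = np.sqrt((treated.var(ddof=1) + control.var(ddof=1)) / 2)
cohens_d = (treated.mean() - control.mean()) / pooled_sd

print(f"t = {t_stat:.2f}, p = {p_value:.3g}, Cohen's d = {cohens_d:.3f}")
# Typical output: p far below 0.001, d around 0.02 -- "significant" but trivial.
```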

But the main controversy surrounds the ethics of the study: Participants were never made aware that they were part of a massive “experimental” study, and Facebook seems to have fudged on the timing of the “data use policy,” inserting the current bit about “data analysis” and “research” after the data was collected. This is out of keeping with common practice in the social sciences, to say the least.

Reviewing responses to the study within the academic community, I’ve noticed quite a lot of squirming. The fact is that university researchers were a major part of this study; Cornell University’s Institutional Review Board (IRB) approved the study; and a prestigious academic journal has published it.

Now everyone, including Facebook’s researcher Adam Kramer, is squirming. Cornell University issued a bland statement clearly intended to wiggle its way out of the issue. PNAS has issued an “editorial expression of concern.” And the new media researchers who work in the same circles as the university-based Facebook researchers have turned to such clichés as don’t “throw the baby out with the bath water” (don’t overreact to university researchers using corporate-owned data on how they attempt to manipulate their users!). They say people should stop “pointing the finger or pontificating” and instead “sit down together and talk.” “We need dialogue,” writes Mary Gray of Indiana University and Microsoft Research, “a thoughtful, compassionate conversation.”

Okay. But perhaps before we have that compassionate sit-down about the ethics of big data social media research, we might take a moment to think about the ethics of squirming. For, in this case, there has been way too much of it. I, for one, think TIME’s television critic James Poniewozik has been clear enough about the study: “Facebook can put whatever it wants in the fine print. That shouldn’t keep us from saying that this kind of grossness is wrong, in bold letters.” I don’t think much more needs to be said about the ethics of this particular study.

But something more does need to be said about the broader ethics of research, which sometimes puts us in uncertain ethical situations. There is something about the will to know, and more about the professionalization of knowledge production, that leaves us more frequently than we would like in tricky ethical territory. Rather than simply relying on an IRB “stamp of approval,” university researchers might stop squirming, take responsibility for their work, and even say they regret it.

Here’s what an ethics of squirming might look like:

(a) Start with full disclosure. Here are the conflicts I am dealing with (or have dealt with); these are the messy issues; here is why I really want to squirm, but I won’t.
(b) Here’s where I (or we) may have been, or indeed were, wrong. Here are our moral regrets. (By the way, Mr. Kramer, making people “uncomfortable” is not a moral regret.)
(c) Here’s why I would or would not do it again.

All in all, the ethics of squirming entails less squirming and more speaking directly to the issues at hand. It means taking responsibility, either by repenting of wrongs and righting them if possible, or justifying one’s actions in public.

All together now:  Mea culpa.


“Open” is not Public

One of the English words most in need of rehabilitation these days is the word “public.” I have to confess that though I hear the word all the time, and think about it regularly, I don’t know how it needs to be rehabilitated. But I am certain it does.

Historically, the most obvious thing one could say about the word “public” was that it is not “private.” That distinction, however, hardly holds anymore, as the “private”—whether in the form of private lives, private capital, or private information—now fills our political and social lives, such that the distinction between the private and the public makes less and less sense. It is not just that the most “public” controversies today—for example, the Donald Sterling debacle—tend to involve the “public” exposure of “private” lives; it is that many of our most pressing political and social problems are inextricably tied to private interests, private choices, and privacy.

But the problem with “public” is not just that the ancient dialectic on which it rests—public versus private—is outmoded. There is also a new meaning of “public” now in wide circulation, operating (as almost all language does) at the level of unexamined common sense. “Public” is now used as a synonym for “open.”  

Let me offer an example. A few weeks back I was invited, with several other faculty, to a meeting with a program officer from a major foundation to discuss ways to make humanistic scholarship more public. What struck me about our conversation, which was quite lively, was that most people in the room seemed to assume that the main barrier to a public life for humanistic scholarship was access or openness. The thinking went like this: University presses, which publish most long-form humanistic scholarship, put up physical and financial barriers to a “public” life for scholarship. That is, print books have limited runs and cost money, sometimes quite a bit of money. Therefore these books sit on library shelves for only a few specialists to read. The solution, many in the room seemed to assume, was to convince university presses to “go digital,” with the funding agency we were meeting with using its money to offset the financial loss presses would incur by going digital.

The problem of a public life for humanistic scholarship was the one the program officer presented. The foundation wanted to figure out how to invest its money in such a way as to help the humanities “go public.” But for virtually everybody in the room this meant “going digital,” making humanities work openly accessible. Openness—or open access—was the assumed key to a public life for humanistic scholarship.

But making something openly accessible does not make it public. To make something accessible or “open” in the way we talk about it today does not entail, on the level of norms, making it legible or debatable, let alone useful to non-specialists. There are millions of studies, papers, and data sets that are openly accessible but that nevertheless do not have a public life. The U.S. government, no less, has invested in various “openness” initiatives over the past two decades. These projects are presented as democratic gestures to the “public,” but they do little more than allow governing agencies to display their democratic credentials and grant a few specialists access to documents and data. To make something open or accessible is not to make it public.

What would it mean to make humanistic scholarship or government data or, for that matter, computing code, truly public? One clue does come to us from antiquity: The Latin word publicus meant “of the people.” Publicus was an attribute, a quality or feature of a person, thing, action, or situation, rather than the thing itself. It did not, mind you, mean that the person, thing, action, or situation was authorized by, or somehow the consequence of, the majority. That is, publicus was not a synonym for “democratic.” Rather, it meant that the person, thing, action, or situation was plausibly representative “of the people,” such that the people could engage it in a useful and productive political manner.

The question of how to make something public, then, is a question of how to endow it with a quality that can be attributed to “the people.” Clearly, this would mean taking seriously matters of “design” or what Cicero called “style,” such that the “public thing” (in Latin, the res publica) is first of all legible (able to be “read” if not fully understood), second of all in some sense subject to discourse and debate (and thus subject to political deliberation), and third of all socially useful. This means thinking long and hard about “the people” themselves: their habits, their tastes, their interests, their schedules, their aptitudes, and so on. While “openness” may in certain circumstances be part of that which builds up to a “public” quality, I would venture to say that openness or access is not even necessary. Cicero, for one, saw “public” life as one that was styled in such a way as to be of the people but not necessarily exposed or “open” to the people. Public speeches, public events, and public figures could—like some public art today—be a bit opaque (legible enough, but not necessarily fully understandable) but still be “of the people.” Moreover, certain information and deliberation may need to be kept secret for good reasons—for example, having to do with the fair administration of justice (think of a jury deliberating behind closed doors)—but that information and deliberation can still be public according to the criteria above.

Indeed, openness is relatively easy; publicity is hard. Making something open is but an act; making something public is an art.

Our digital moment is a triumphal one for "openness." But "open" is not public. If we are really to push the digital into the public, we need to go well beyond questions of access, openness, and transparency and start asking how the digital might take on a quality that is "of the people."

. . . . . . . .


Big Humanities

[Editor’s Note: This is the second installment in The Humanities in Full series.]

Before there was big science or big data, there was big humanities. Until the last third of the nineteenth century, the natural and physical sciences imitated many of the methods and practices of the humanities, especially disciplines like philology, which pioneered techniques in data mining, the coordination of observers, and the collection and sorting of information—what Lorraine Daston terms practices of “collective empiricism.”

One of the most successful and long-lasting projects was led by the Berlin philologist August Böckh. In a proposal to the Prussian Academy of Sciences in Berlin in 1815, Böckh and his colleagues requested funding for a long-term project to collect as completely as possible all Greek inscriptions, printed, inscribed, and holograph. Eventually published in four installments as Corpus Inscriptionum Graecarum between 1828 and 1859, with an index in 1877, Böckh’s project was an organizational feat that relied on the work of hundreds of philologists over decades, and it quickly became a model for German scholarship in all fields. “The primary purpose of a Royal Academy of the Sciences,” he wrote, should be to support the type of work that “no individual can accomplish.” The project collected, stored, preserved, and evaluated data. And in Böckh’s case, the data were Greek inscriptions scattered across the Mediterranean.

Böckh’s Corpus Inscriptionum Graecarum was just a prelude. In his inaugural lecture to the Prussian Academy of Sciences in 1858, Theodor Mommsen, one of Germany’s foremost classical scholars, declared that the purpose of disciplines like philology and history was to organize the “archive of the past.” What Mommsen had in mind, as would become evident in the kinds of projects he supported, was not some abstract archive of immaterial ideas. He wanted scholars to collect data and shape it into meticulously organized and edited printed volumes in which the “archive” would take tangible form. Work on the scales that Mommsen imagined would require international teams of scholars and the “liberation” of scholars from what he dismissed as the “arbitrary and senseless” divisions among the disciplines.

As secretary of the Prussian Academy of Sciences, Mommsen set out to institutionalize his vision of big philology, or what he termed the “large scale production of the sciences” [Grossbetrieb der Wissenschaften]. After securing a three-fold increase in the Academy’s budget, he supported a series of monumental projects. He oversaw the internationalization and expansion of the Corpus Inscriptionum Latinarum, the Latinate counterpart to Böckh’s project that sought to collect all inscriptions from across the entire Roman Empire. It eventually collected more than 180,000 inscriptions and grew to 17 volumes plus 13 supplementary volumes. Mommsen also helped church historian Adolf Harnack secure 75,000 Marks and a 15-year timeline for a project on Greek-Christian Authors of the First Three Centuries, the modest goal of which was to collect all of the hand-written manuscripts of early Christianity. Other projects included a prosopography of ancient Rome funded for a period of ten years.

Looking back on what Mommsen had accomplished for modern scholarship, the German philologist Ulrich von Wilamowitz-Moellendorff wrote:

The large scale production of science cannot replace the initiative of the individual; no one knew that better than Mommsen. But in many cases the individual will only be able to carry out his ideas through large scale production.

Theodor Mommsen, Wikimedia Commons

Figures such as Böckh and Mommsen introduced different scales to knowledge creation and different skill sets to humanistic scholarship. They developed, coordinated, and managed teams of people in order to organize huge sets of texts and data.

But to what end? What was the purpose of all this collecting, organizing, and managing? This was the question that transformed Germany’s most self-loathing philologist into a philosopher. A wunderkind trained in Leipzig, Friedrich Nietzsche was appointed professor of classical philology at the University of Basel at the age of 24, before he had even finished his doctorate. Just as Mommsen was busy assembling the “archive of the past,” Nietzsche began to diagnose modern culture, not to mention himself, as suffering from a bad case of “academic knowledge,” or Wissenschaft.

In We Philologists, Nietzsche excoriated his fellow scholars for abdicating philology’s real task. Ultimately, he argued, philology was not about advancing knowledge or building an “archive of the past.” It was about forming stronger, healthier human beings on the model, or at least the idealized model, of the ancient and classical Greeks. The real philologist was a lover of antiquity, someone who sought to transform himself through an encounter with a superior culture. Every good and worthwhile science, he wrote, should be kept in check by a “hygienics of life”—practices by which whatever one learned could be integrated into how one lived.

Despite his stylized iconoclasm, Nietzsche was a traditional German Grecophile for whom antiquity of the Greek sort was a moral utopia. But he was also a modern scholar struggling to come to terms with the ascendant research university and what we recognize today as its basic elements: the division of intellectual labor, academic specialization, and the constant struggle to integrate new technologies and practices for sifting through and making sense of the past.

Nietzsche’s polemics against big philology were precursors to contemporary anxieties about what might become of the humanities in the digital age. Data—be it 180,000 inscriptions or hundreds of digitized novels—cannot speak for itself, but it is never incoherent. It’s always collected, organized, edited, framed, and given meaning, whether in nineteenth-century printed volumes or twenty-first-century graphs. With his bombast and passions, Nietzsche made the case for values and interpretation at a moment when textual empiricism was ascendant and positivism loomed. “Yes, but how are we to live!” was his constant refrain.

The catch, of course, is that most of us aren’t Nietzsche, though arguably too many contemporary scholars of a strongly critical-theoretical bent aspire to be. Scholarship and knowledge might be better served if many such would-be master interpreters settled for the humble but necessary drudgery of collecting, annotating, and commenting on the “archive of the past,” maintaining cultural inheritances and providing invaluable grist for the equally important job of hermeneutics. We shouldn’t forget that Nietzsche the philosopher, the moral psychologist who diagnosed the ethical ills of modernity, grew out of Nietzsche the philologist, the erudite scholar who reverentially tended ancient traditions and texts.

Nineteenth-century practices of collecting and evaluating data don’t exhaust the work of the humanities, but they highlight a broader history of the humanities in which collecting and evaluating data has been a central and even noble pursuit. Thinking of the humanities and the sciences in terms of what humanists and scientists actually do might help us develop a longer history of the humanities and see continuities that simple polemics only conceal. Nietzsche and his fellow nineteenth-century philologists struggled to reconcile more interpretive methods with historical approaches, to blend pleasure and delight with critical distance, and to temper particularity with timeless value. But Nietzsche represents only one side of the debate. While his critiques of the utopian impulses of big philology were necessary correctives, he ultimately left the university and withdrew to a life in extremis, writing at the edge of lucidity and under the shadow of genius.

. . . . . . . .


The Wall Must Stand: Innovation at the New York Times

“In the future,” writes digital scholar and Twitter wit Ian Bogost, “all news will be about, rather than in, the New York Times.”

That future seemed to arrive last week, and not only with the controversy unleashed by the abrupt firing of executive editor Jill Abramson. Possibly as part of the storm, someone in the newsroom leaked a 96-page document with detailed proposals for bringing the Times more fully into the digital age and—even more important—making the Grey Lady more “Reader Experience”-friendly.

Nieman Journalism Lab’s Joshua Benton calls the report “one of the most remarkable documents I’ve seen in my years running the Lab.” He even tasked three of his staffers with excerpting highlights from it. But the whole thing merits reading by anyone interested in the possible (or inevitable?) future of the news.

Not that there is anything truly new or surprising on any page of “Innovation,” as the study is so grandiloquently titled. Put together by a six-member team led by Arthur Gregg Sulzberger, the publisher’s son and heir apparent, the report is a compendium of ideas, strategies, arguments, and veiled threats familiar to anyone who has worked in or around newsrooms during the last decade or so. From the title on, it buzzes with the kind of jargony nostrums that fuel TED Talks and South by Southwest conferences, from impact toolboxes and repackaging old content in new formats to making journalists their own content promoters and integrating the Reader Experience team with the newsroom to, well, anything else that helps counter the disruptive incursions of new media upstarts like Buzzfeed and Huffington Post.

And why not counter those disrupters? As the report frequently notes, the NYT produces consistently superior content but does an almost-as-consistently inferior job of getting that content out to its readers. In some cases, competitors are even more successful at distributing Times content than the Times itself. It makes little sense to remain satisfied with such a status quo.

But reading the report invites suspicion on at least two counts. The first is quite immediate: How is it possible that these objectives haven’t already been accomplished? (And, in fact, have many of them already been accomplished, as some NYT insiders say, proving once again that many strategic studies merely confirm the direction in which the institution is already heading?) My incredulity arises from the facts presented in the report itself, namely the astonishing number of Times employees already dedicated to Reader Experience activities (which, as the report notes, “includes large segments of Design, Technology, Consumer Insight Group, R&D, and Product”).

The problem, explicitly remarked upon at several points in the report, appears to be turf wars.  Some of this is simply silly, such as the exclusion of Reader Experience people from key editorial meetings or other instances of uncollegial shunning. But I suspect the problem also stems from something far less silly, indeed, from the most fundamental of political-institutional questions: Who, at the end of the day, will be in command of the combined newsroom and Reader Experience personnel? Will it be the editorial leadership or the business leadership?

That question can’t be finessed, fudged, blurred, or deferred. It must be answered forthrightly, because if it isn’t, the very purpose of a serious news organization becomes unclear.

And that leads to my second big concern. What is the real goal of “innovation” at the New York Times? Is it intended primarily to enable the editorial leaders to use and inculcate the best practices of distribution, with additional staff possessing advanced skills in those practices, in order to support and advance strong journalism? Or is it intended primarily to increase the number of Reader Experiences as measured through analytics and other metrics, at the expense, in the long or short run, of the highest quality of journalism? If the former, I am on board—who wouldn’t be?

Which is why the political question must be answered first. If not, and if the new and enhanced newsroom ends up being run by the business side, then decisions will be made that will slowly erode the quality of the journalistic content. If content packagers and social media community managers answer ultimately to publishing executives and not to editors, then they will be able to demand the kind of content—whimsical features, for example, rather than hard reporting—that tends to trend most strongly.  The sad fact is that cute cat stories always sell better than revelations about city hall. The number of hits, likes, or visits will gradually but inevitably determine the editorial agenda.

Overstated? Simplistic? I don’t think so. When the ultimate purpose of a news organization is something that can be evaluated almost exclusively by metrics, then you can be sure you are no longer talking about a news organization.  A media company, perhaps, but not a journalistic one.

The report calls for some breaching of the editorial-publishing (or church-state) firewall. That sounds suspect to me. What it should call for is the migration of some of those Reader Experience departments and personnel to the editorial side of the firewall. The wall itself must stand. Or the journalism will fall.

Jay Tolson is the Executive Editor of The Hedgehog Review.

. . . . . . . .
