Author Archives: Andrew Piper

79 Theses on Technology:
Piper to Jacobs—No Comment

In his 79 Theses, Alan Jacobs hits upon one of the most important transformations affecting the technology of writing today. “Digital textuality,” writes Jacobs in Thesis 26, “offers us the chance to restore commentary to its pre-modern place as the central scholarly genre.” One could remove “scholarly” from this sentence and still capture the essential point: In the interconnected, intergalactic Internet, everything is commentary.

For Jacobs, commentary is about responsiveness and the way we encode ethics into our collective electronic outpourings. Nothing could feel further from the actual comments one encounters online today. As Jacobs points out, “Comment threads seethe with resentment,” not only at what has been written, but at their own secondary status, which reduces them to emotions, or rather to a single emotion. In a world where we imagine writing to be about originality, the comment can only ever be angry. In response, we either turn comments off (as is the case with this blog) or we say “No comment.” Withholding commentary is a sign of resistance or power.

Of course, this was not always the case. Commentary was once imagined to be the highest form of writing, a way of communing with something greater than oneself. It was not something to be withheld or spewed, but involved a complex process of interpretation and expression. It took a great deal of learning.

[Image: Hunayn ibn Ishaq al-'Ibadi, 809?-873 (known as Joannitius), Isagoge Johannitii in Tegni Galeni.]

The main difference between our moment and the lost world of pre-modern commentary that Jacobs invokes is of course a material one. In a world of hand-written documents, transcription was the activity that consumed most of a writer’s time. Transcription preceded, but also informed, commentary (as practiced by the medieval Arab translator Joannitius). Who would be flippant when it had just taken weeks to copy something out? The submission that Jacobs highlights as a prerequisite of good commentary—a privileging of someone else’s point of view over our own—was a product of corporeal labor. Our bodies shaped our minds’ eye.

Not all is lost today. While comment threads seethe, there is also a vibrant movement afoot to remake the web as a massive space of commentary. The annotated web, as it’s called, has the aim of transforming our writing spaces from linked planes to layered marginalia. Whether you like it or not, that blog or corporate presence you worked so hard to create can be layered with the world’s thoughts. Instead of writing up here and commenting down there, it reverses the hierarchy and places annotating on top. Needless to say, it has a lot of people worried.

I personally prefer the vision of “annotation” to commentary. Commentary feels very emulative to me—it tries to double as writing in a secondary space. Annotation by contrast feels more architectural and versatile. It builds, but also branches. It is never finished, nor does it aim to be so. It intermingles with the original text more subtly than the here/there structure of commentary. But whether you call it annotation or commentary, the point is the same—to take seriously the writer’s responsiveness to another person.

Missing from these models is pedagogy. The annotated web gives us one example of how to remake the technology of writing to better accommodate responsiveness. It’s a profound first step, one that will by no means be universally embraced (which should give us some idea of how significant it is).

But we do not yet have a way of teaching this to new (or old) writers. Follow the curricular pathways from the lockered hallways of elementary school to the bleak cubicles of higher education and you will still see the blank piece of paper or its electronic double as the primary writing surface. The self-containment of expression is everywhere. It is no wonder that these writers fail to comment well.

It’s all well and good to say commentary is back. It’s another to truly re-imagine how a second grader or college student learns to write. What if we taught commentary instead of expression, not just for beginning writers, but right on through university and the PhD? What if we trained people to build and create in the annotated web instead of on pristine planes of remediated paper? Now that would be different.

Andrew Piper is Associate Professor and William Dawson Scholar in the Department of Languages, Literatures, and Cultures at McGill University.

. . . . . . . .



Cultural Critics vs. Social Scientists—They Both Stink

The past few weeks have seen some heady attempts at generalization: first Sam Tanenhaus’s piece on “Generation Nice” and then A.O. Scott on the “Death of Adulthood.” (Potential correlation there?)


The subsequent critiques of both were withering. Tanenhaus’s article proved to be laden with errors, resulting in hilarious retractions by the Times editorial staff. In response to the proof of the millennials’ niceness, the editors wrote:

An article last Sunday about the millennial generation’s civic-mindedness included several errors…. Applications to the Peace Corps recently have been in decline with a 34 percent decrease from the peak in 2009, and applications to Teach for America decreased slightly last year; neither organization has seen “record numbers of new college graduates” applying for jobs.

Well done. And the uncorrected rest apparently relied on citations of studies that cited studies that cited…an ad agency?!

As for Scott (one of Q&C’s favorite film critics), his reflections on adulthood’s imminent, if not already occurring, death come from having watched a lot of TV. “Something profound has been happening in our television over the past decade,” Scott announces with a foreboding sense of doom (it’s gotten worse?). And then, in an alliterative jingle that would make even the best nineteenth-century speech writer wriggle, “It is the era not just of mad men, but also of sad men and, above all, bad men.”

So there you have it: A few shows (Mad Men, The Sopranos, Breaking Bad) have chronicled the decline of white patriarchy, which is a good stand-in for the decline of adulthood, which in turn is a good stand-in for a major shift in “American Culture.” Imagining that all of adulthood, and masculinity in particular (Scott’s real aim), was coming to an end because of a few televisual fantasies of bad dads like Don Draper ignored, as David Marcus pointed out, a whole lot of other stuff on TV that most people actually watch, like, say, football (or NCIS or NCIS: LA). Masculinity is doing just fine there (by which I mean on display, not oh-so-admirable).

One would think at this point the answer is Big Data to the rescue. Instead of making whopping generalizations based on a few selective examples, turning culture into data can give us a much better view of the “big picture” (preferably as a picture: through that most ubiquitous of contemporary genres, the infographic). If we look broadly, what is “television” telling us, and how would we segment it into different groups? Surely it is not telling all of us the same thing.

The problem is, as Marcus pointed out, it’s not as though the social scientists who traffic in cultural data mining have done much better. Turning culture into data is not a seamless process, nor is its interpretation. While we all know this, we seem unable to heed this advice given the opportunity for a juicy headline—in other words, given the chance to tell a story. Narrative trumps reason in fascinating ways.

The point is not, oh forget it, let’s just let Tanenhaus make it up after all. A good story is a good story and you can’t count culture anyway. The point is we need a lot more work on the work of translating culture into data before we go ahead and start calculating and interpreting. What would be a representative sample of “TV” or “pop culture”? How would you measure depictions of adulthood or “masculinity” (whether positively or negatively coded)? What is your control set, i.e., what are you comparing this against? And so on.

The real answer is we need to think more about the process of cultural modeling. How do we model a cultural subset through a data set (a generation, for example, or contemporary television), and how do we model a cultural practice or concept through a particular measurement? These aren’t easy questions, but they are the prerequisite for correcting against journalistic just-so stories of cultural criticism.

This is the time for the humanists to jump into the fray, not to put our heads in the sand and say, “You can’t count that!” The challenge is to think of counting culture in more sophisticated ways and so avoid the mythologizing that passes for cultural criticism these days.

 

. . . . . . . .


You must change your graduate program!

Of all the hyperbole surrounding the fate of the humanities today, the problems facing graduate studies seem the least exaggerated. The number of PhDs has far outpaced the number of available full-time jobs. Financial support is both inadequate and unevenly distributed, requiring students to defer earnings over long periods of time, assume unneeded debt, and compete against differently funded peers.

Overspecialization has made knowledge transfer between the disciplines, not to mention between the academy and the rest of the workforce, increasingly difficult. The model of mentorship that largely guides student progress, now centuries old, seems increasingly out of touch with a culture of non-academic work, so that students are ill-prepared to leave the academic track. Time to degree has not only failed to speed up; it increasingly correlates with lower success rates—the longer students stay in the PhD track, the lower their chances of full employment.

[Image: Philosophy seminar table at the University of Chicago.]

The hegemony of the seminar as the only model of learning is also at odds with much recent thinking about learning and intellectual development. Undoubtedly, numerous teachers and students alike would describe at least some, if not many, of their seminar experiences as profoundly uninspiring. Add to that the way we largely operate within a developmental model premised on major phase changes that dot otherwise stable, and largely flat, plateaus. The long intervals between exams and their sink-or-swim nature do little to promote the long-term, incremental development of students as thinkers, writers, or teachers. We think more in terms of a nineteenth-century-inspired model of botanical metamorphosis, with its inscrutable internal transformations, than we do in terms of incremental, cumulative performances.

There are also bio-political aspects to the crisis that have recently been raised: the most intense periods of work and the most intense periods of insecurity overlap precisely with the normal timeframe of human fertility. The PhD model seems downright Saturnian, consuming its own offspring.

What is there to like about this scenario?

Luckily, the MLA has issued a report about graduate education. “We are faced with an unsustainable reality,” the report states. Indeed. And then come the all-too-familiar platitudes. Maintain excellence. More teaching. Shorter time-frames. Innovate. Better connection with non-academic jobs. Advocate for more tenure-track positions.

As the clichés pile up, so do the contradictions. Get out faster, but spend more time teaching. Keep up those rigorous standards of specialization, but do it with haste (and be interdisciplinary about it). No teaching positions available? Look for another kind of job! Persuade the university to create more tenure-track positions—whom do I call for that one?

Will the MLA report lead to changes? It’s doubtful. Sure, we’ll crack down on time to degree without really changing requirements. We’ll spice up course offerings and maybe throw in an independent project or two (digital portfolios!). We’ll scratch our heads and say the PhD would be a great fit for a job in consulting, or in a museum, maybe even Google—and then do absolutely nothing. Five or ten years from now, we’ll talk about a crisis in the humanities, the shrinking of the field, and notice that, once again, there seem to be fewer of us hanging around the faculty club.

Nothing will change because we don’t have to. As long as there are too many graduate students, there is no problem for faculty. And no matter what we say, what we do is always the same thing. Try approaching your department and saying you need to change the scope, scale, content, and medium of the dissertation. You’re in for a fun conversation.

Try implementing a mandatory time-limit with yearly progress reports and consequences for failure and you’ll be barraged with so many exceptions your Agamben will start to hurt. What do employers want from PhDs—good writing skills, general knowledge, analytical capability, facility with numbers, strong work habits, works well with others? Sorry, can’t help you. None of our students has those skills since our program doesn’t emphasize them (but we have a writing center!).

Nothing will change because we don’t have to. We’re so conservative at heart we’d rather die out with our beliefs intact than do anything that might actually better serve the student population. We’ll continue to point to the exceptions without realizing how much they still look like something we didn’t want to change.

The PhD did something once, or rather it did one thing and it did it reasonably well. It still does that one thing, which is now, quantitatively speaking, vastly unnecessary. Some have suggested scaling up to meet the sciences on their own ground (and here at The Infernal Machine, too). I would suggest that we scale down to meet the needs of the world at large. More modular, more flexible, more creative, more varied, more timely, more general, more collaborative, and more relevant. Until we have any proof that our programs are feeders for jobs outside the academy, we’re just failing by another name.

We can either change in substantive ways or pretend to do something else while actually continuing to do the same things we’ve always done. The MLA report looks a lot like the latter and no doubt so will most of the responses to it.

I’m looking forward to next year’s report. How many ways can you play the same tune?

 

 

. . . . . . . .


Semi-Pro—On the New Work-Study Conundrum

With the recent NCAA ruling in the Northwestern football case—in which college players were deemed eligible to unionize—the question of work on campus has reared its ugly head once again. However much the ruling was more narrowly about the behemoth that is NCAA sports, it was yet another sign of the increasing professionalization of all things academic. Even those students who were thought to be playing games are working. Whether it’s graduate students or student-athletes, there is a lot more work going on on campus than people had previously thought. In the good old days, work was something you did on the side while you studied. Now, it might be the only thing you do.


In response to the decision, voices from inside the university were familiarly hypocritical. There’s just too much money at stake not to preserve the myth of the amateur or the apprentice. (In fact, it’s appalling how much money is in play in a system that still largely uses medieval guild models to understand itself.) Voices in the press, on the other hand, were typically nostalgic. Wouldn’t it be nice if this weren’t the case? Isn’t there some way of restoring the work-study balance so we can imagine real student-athletes and not financially indebted apprentices? Remember Greece?

Rather than banish the idea of work like some unclean vermin out of a Kafka story, we should be taking the opportunity to look it more squarely in the face, and not just in the world of college sports. Work is everywhere on campus today. It’s time we accepted this fact and started rethinking higher education accordingly, starting with that most hallowed distinction between working and studying that informs almost everything else we do.

Conflating work and study might seem to many to bring together categories that are traditionally seen as opposites. That’s why we have two words for them after all. But they’re also supposed to be sequentially related. You don’t just do one instead of the other. You are supposed to move from the one to the other.

Imagining work as part of study, rather than its opposite or its aftermath, challenges this sequence, especially our assumptions about how well it generally works. We largely operate with what we could call a mimetic theory of learning, in which students imitate, usually poorly, “real” work, after which at some point they presumably stop simulating and start producing. No one knows when this is supposed to happen or how. The failure rates in graduate school, or the difficulties most undergraduates face in moving from school to jobs, testify to how little this process results in putting people to work. And yet we insist on keeping “work” at bay.

My point is that the recent influx of work’s presence in the academy might in fact be good for learning. With all the anxiety about business models entering into the university, whether it’s students who mostly just play sports, graduate students who spend a lot of time teaching, or faculty who are subject to the market demand of student-consumers, the thing we haven’t considered is the act of learning and the vast majority of students who engage in it. What happens to them when learning looks a lot more like work?

In the humanities, to take the example I’m most familiar with, when students write papers for a class they are writing papers that are like good papers, but usually are not. There is a theory of learning-by-simulation that governs the practice. (I’ll ignore exams since we all know those are just compromises with the devil of overpopulation.) Students’ papers, and in most cases the grown-up ones they are supposed to emulate, are socially worthless. They don’t produce anything. They are poor copies, in Plato’s sense. Students may or may not drop by to pick them up at the end of the semester and, if they don’t, we throw them in the trash. Yes, that’s true, but telling.

But what about the growth! say the pundits. How else are students supposed to reach their full, expressive intellectual capacities without these exercises? Yes, it is like exercise. You practice to simulate the real game. It’s not like those kids playing basketball in sixth grade aren’t imitating their grown-up peers—that’s what learning is.

But here’s the difference. What the NCAA ruling shows us is that by the time of higher education, these students are no longer imitating. It is work in its own right and is being compensated accordingly. (Well, at least the coaches and administrators are being compensated.) Why not have the same expectations for our students? And certainly for graduate students, who are way past the sell-by date of learning by imitating? Why keep alive the pretense of simulation when in practice there is all sorts of work going on and potential for even more of it?

Because we’ve been doing everything on a simulator deck for so long, it’s hard to imagine what this might look like. But try to picture more assignments and activities that aim to solve real-world problems, even at the simplest of levels. Wikipedia pages are an obvious first step. Why not teach composition using this public space? Why not make new knowledge in the process of honing your writing skills and creating a suite of pages around an overlooked cultural feature? Along the way, students would also learn what it means to edit, a skill most of them very much need.

It would also teach them how to relate to other people’s writing. As we move more and more from a book-based world to a digital infrastructure, why not teach students how to contribute to that process? I don’t mean have them correct OCR transcriptions instead of talking in class (too Matrix-y?). But certainly encoding texts and tagging them with more information that can be used by others are activities that require a good deal of thought and critical judgment. At the same time, this information is useful to others. Think of the boon to historians of having more information available in digital archives. Perhaps students might apply their newfound knowledge about the novel (Jane Austen, no doubt) by writing an algorithm that looks for something textually interesting in the myriad pockets and corners of the Internet. There is so much to analyze out there, why not do it rather than pretend to? I’m imagining something like the Maker Movement, though less artisanal in spirit and more analytical and applied.

The detractors will, of course, have seizures about all of this—they will say it overlooks the disinterested nature of learning and undoubtedly marks the end of the life of the mind. My colleagues in the humanities might argue that it smacks of a science system that prioritizes laboratory doing over seminar simulation. (They’ve been doing this for a long time.) Prepping for cubicle life is no way to think about Aristotle, they would insist; whither the critical thinkers? Once those ivory towers are opened to the world of work, there will be no end to it.

This is where I think the either/ors have it wrong. First, I’m not a big fan of pretending. We can pretend that work hasn’t infiltrated campus, but that doesn’t make it so. For those on the outside looking in, like the judicial system, we are in a serious state of denial. Ostrich-politics, as they say in Germany, isn’t a very effective or attractive technique (not least because of one’s embarrassing posture and the rear-end exposure).

Second, and I think more important, we can do this in ways that are intellectually valuable. We need to get over our fear of the simple—simple things, when brought together over time, add up to much more. Witness Wikipedia. You can imagine any number of smaller-scale forms of student-curated information. In fact, this might be a very useful way to frame a new kind of work-study stance. Instead of apprentices simulating their superiors, students are curators making and sharing knowledge simultaneously at different levels of scale.

I think the most compelling reason is also the least appetizing: in the end we must. Higher education has simply become too expensive to justify itself as a pasture for the world’s elite. Or to put it another way, that’s all it is unless we do something differently. It’s time to get to work.

. . . . . . . .


The Unpredictability of Academic Writing

The otherwise foxy Nate Silver made a very hedgehoggish comment recently when he claimed that all op-ed writing was “very predictable” and that “you can kind of auto-script it, basically.”

His comments caused a (very predictable) backlash, with neither Silver nor his critics bothering to substantiate their claims. But they go to the heart of some of the most basic questions about why we read and what we read for. What is the value of unpredictability in writing? Are there certain kinds of writing that are more predictable than others? Are more predictable texts of lesser quality or is it the other way around? After all, we need some predictability in order to make sense of what we are reading, but how much predictability is too much?

Here at the Quant and the Connoisseur we decided to test Silver’s claim (to outfox the fox as it were). We chose to compare the genre of popular book reviews as they appear in an industry standard like the New York Times and those that appear in one of the leading academic journals, the PMLA (Publications of the Modern Language Association).

One of the reasons we chose these examples was to answer a nagging feeling that many of us here in the academy have that literary reviews are some of the most tedious things ever written. What Silver feels about the op-ed is akin to how we feel about contemporary journalistic criticism.

We undertook this exercise also to address recent debates about the “academicness” of academic writing. By “academic” people usually don’t mean nice things (like smart, insightful, or brilliant). They usually mean turgid, jargony, repetitive, and boring. But in all the mud-slinging no one thought to look more broadly at the nature of such writing. Everyone was content to extract a few well-chosen examples of impenetrable prose and be done with it. That’s clever but not very fair.

So to test our feelings and Silver’s claim, we decided to measure the predictability of different texts across our two samples using a common metric from information theory, that of redundancy. Redundancy uses Claude Shannon’s theory of information entropy to measure the density, and therefore the unpredictability, of information (we calculate redundancy as one minus the ratio of the observed entropy to the maximum possible entropy). The standard example is to think about this in terms of language. In English, the probability that you will find an “h” after a “t” is much higher than finding a “z” after a “t,” though this would be reversed for German. The higher the probability of any sequence of letters, the greater the redundancy, because you can guess with increasing accuracy what the next letter will be. If “h” always came after “t” in English (and only h’s came after t’s), we wouldn’t even need to write it. It would be entirely redundant because perfectly predictable.

We can do the same thing for the words of a given text: given any word n, what is the likelihood of guessing word n+1? The greater the likelihood, the more redundant a text is and thus the more predictable. Imagine a text consisting only of the two words “she said,” written 500 times (for a total of 1,000 words). It would have a redundancy of .899, meaning that you could remove just about 9/10 of it and still have all of the information contained in the text. The reverse case, a text of 1,000 different words, would have a redundancy score of 0. No two pairs of words repeat themselves throughout the entire text, so given any word we would have no idea what came next.
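The post doesn’t include the calculation itself, so here is a minimal Python sketch of the measure as described: one minus the observed entropy over the maximum possible entropy, computed here over word pairs. The tokenization and the bigram-based construction are assumptions for illustration, not the authors’ actual code.

```python
import math
import re
from collections import Counter

def redundancy(text):
    """Shannon redundancy: 1 - H / H_max, where H is the entropy of the
    observed word-pair distribution and H_max is the entropy the text
    would have if every pair were distinct. Higher = more predictable."""
    words = re.findall(r"[a-z']+", text.lower())
    bigrams = Counter(zip(words, words[1:]))
    total = sum(bigrams.values())
    if total < 2:
        return 0.0
    h = -sum((c / total) * math.log2(c / total) for c in bigrams.values())
    h_max = math.log2(total)  # maximum entropy: every word pair unique
    return 1 - h / h_max

# The toy example from the text: "she said" repeated 500 times
print(round(redundancy("she said " * 500), 3))  # roughly 0.9
```

By the same formula, a text of 1,000 distinct words scores 0, matching the reverse case described above.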

Applying this measure to a sample of 189 articles taken from our two categories, we found that on average book reviews in the New York Times are significantly more redundant than literary criticism in the PMLA. Here is a boxplot showing the distributions:

[Figure: boxplot of the redundancy distributions for New York Times book reviews and PMLA articles.]

One concern we had is that academic articles tend to be considerably longer than book reviews. When we took just the first 1000 words of each we still found significantly different averages (p = 8.399e-09).

[Figure: boxplot of the redundancy distributions using only the first 1,000 words of each article.]

While this tested the redundancy of the writing in any single article, we also wanted to know whether book reviews tended to sound more like each other in general than academic articles. So where the first score looked at the language within articles, the next score tested language between articles. How similar or dissimilar are book reviews to each other? Do they sound a lot more like each other than academic criticism does to itself? The answer again was a resounding yes.

[Figure: boxplot of between-article similarity for the two groups, using the first 1,000 words of each article.]
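The post doesn’t specify how similarity between articles was measured. One simple and common choice is cosine similarity between word-frequency vectors, which the sketch below assumes purely for illustration; the variable names nyt_reviews and pmla_articles are hypothetical.

```python
import math
import re
from collections import Counter
from itertools import combinations

def word_counts(text):
    """Bag-of-words counts for one article."""
    return Counter(re.findall(r"[a-z']+", text.lower()))

def cosine_similarity(a, b):
    """Cosine similarity between two word-count vectors."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def mean_pairwise_similarity(texts):
    """Average similarity over all pairs of articles within one group
    (assumes the group contains at least two articles)."""
    vectors = [word_counts(t) for t in texts]
    pairs = list(combinations(vectors, 2))
    return sum(cosine_similarity(a, b) for a, b in pairs) / len(pairs)

# Hypothetical usage: a higher mean for the reviews than for the PMLA
# articles would suggest the reviews sound more like one another.
# mean_pairwise_similarity(nyt_reviews) vs. mean_pairwise_similarity(pmla_articles)
```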

What does this all mean? First, it confirms our feelings that journalistic criticism is both more predictable and more homogeneous across different articles. There is a familiarity that is an important aspect of this genre. Some might call it a house style or just “editing.” Others might call it just plain boring. One of the reasons we would argue that academics enjoy reading academic articles (yes, enjoy) is that there is a greater degree of surprise and uncertainty built into the language. Academic articles are information dense: their goal is to tell us new things in new ways. Those of us who read a lot tire more easily of the same old thing. New knowledge requires new ways of saying things.

So rather than rehash tired clichés about the jargony nature of academic writing – itself a form of redundancy! – we might also want to consider one of academic writing’s functions: it is there to innovate, not comfort. To do so you need to be more unpredictable in how you put words together. It’s less soothing, but it also serves an important purpose. It’s the exact opposite of “jargon,” if by that we mean a way of speaking that is repetitive and insular. Academic writing is there to surprise us with new insights.

I am sure many will find this a surprising thing to say.

. . . . . . . .


The New Anti-Intellectualism

Charges of anti-intellectualism in American life are as old as the Republic. It’s the inevitable consequence of being a bottom-up state and the high degree of pragmatism that comes with it. As Jules Verne wrote of the mid-nineteenth-century United States, “The Yankees are engineers the way Italians are musicians and Germans are metaphysicians: by birth.”

What makes our current moment unique is the fact that this time the fear of ideas isn’t coming from the prairies, the backwaters, or the hilltops. It’s coming from within the elite bastions themselves, those citadels of the urbane and the cosmopolitan. At stake in this revolt is nothing less than the place of quantification within everyday life. Never before has it been so fashionable to be against numerical thinking.

The dean of this new wave is of course Leon Wieseltier, editor at The New Republic, who is making something of a crusade to cow science and quantification into submission:

What [science] denies is that the differences between the various realms of human existence, and between the disciplines that investigate them, are final…All data points are not equally instructive or equally valuable, intellectually and historically. Judgments will still have to be made; frameworks for evaluation will still have to be readied against the informational deluge.

Wieseltier’s dream is that the world can be neatly partitioned into two kinds of thought, scientific and humanistic, quantitative and qualitative, remaking the history of ideas in the image of C.P. Snow’s two cultures. Quantity is OK as long as it doesn’t touch those quintessentially human practices of art, culture, value, and meaning.

Wieseltier’s goal is as unfortunate as it is myopic. It does a disservice, first, to the humanistic bases of scientific inquiry. Scientists aren’t robots—they draw their ideas from deep, critical reflection using numerical and linguistic forms of reasoning. The idea that you can separate these into two camps would make little sense to most practicing researchers. To be sure, we all know of laughable abuses of numbers, especially when applied to cultural or human phenomena (happiness studies anyone?). This is all the more reason to argue for not sequestering number off from humanistic disciplines that pride themselves on conceptual complexity. Creating knowledge fences only worsens the problem.

But it also has the effect of hardening educational trends in which we think of students as either math and science kids or reading and verbal ones. Our curricula are designed around just these over-simplified binaries, reinforcing Wieseltier’s argument at the expense of our own human potential. Most important, in my view, is the way this line of thinking lacks precedent in the history of ideas. Some of the intellectual giants of the past, people like Leibniz, Descartes, or Ludwig Wittgenstein—presumably the folks Wieseltier would admire most—were trained as mathematicians. The history of literature, too, that most prohibited territory of number’s perennial overreach, is rife with quantitative significance. Why are there nine circles of hell in Dante’s Inferno? 100 stories in Boccaccio’s Decameron? 365 chapters in Hugo’s Les Misérables? 108 lines in Poe’s “The Raven”? Not to mention the entire field of prosody: why is so much French classical theatre composed in lines of 12 syllables, or English drama in 10? Why did there emerge a poetic genre of exactly 14 lines that has lasted for over half a millennium?

Such questions only begin to scratch the surface of the ways in which quantity, space, shape, and pattern are integral to the human understanding of beauty and ideas. Quantity is part of that drama of what Wieseltier calls the need to know “why”.

Wieseltier’s campaign is just the more robust clarion call of subtler and ongoing assumptions one comes across all the time, whether in the op-eds of major newspapers, the blogs of cultural reviews, or the halls of academe. Nicholas Kristof’s charge that academic writing is irrelevant because it relies on quantification is one of the more high-profile cases. The recent reception of Franco Moretti’s National Book Critics Circle Award for Distant Reading is another good case in point. What’s so valuable about Moretti’s work on quantifying literary history, according to the New Yorker’s books blog, is that we can ignore it. “I feel grateful for Moretti,” writes Joshua Rothman. “As readers, we now find ourselves benefitting from a division of critical labor. We can continue to read the old-fashioned way. Moretti, from afar, will tell us what he learns.”

We can continue doing things the way we’ve always done them. We don’t have to change. The saddest part about this line of thought is that it is not just the voice of journalism. You hear the same thing inside academia all the time. It (meaning the computer, or sometimes just numbers) can’t tell you what I already know. Indeed, the “we already knew that” meme is one of the most powerful ways of dismissing any attempt to bring together quantitative and qualitative approaches to thinking about the history of ideas.

As an inevitable backlash to its seeming ubiquity in everyday life, quantification today is tarnished with a host of evils. It is seen as a source of intellectual isolation (when academics use numbers they are alienating themselves from the public); a moral danger (when academics use numbers to understand things that shouldn’t be quantified they threaten to undo what matters most); and finally, quantification is just irrelevant. We already know all there is to know about culture, so don’t even bother.

I hope one day this will all pass and we’ll see the benefits of not thinking about these issues in such either/or ways, like the visionary secretary of Jules Verne’s imaginary “Baltimore Gun Club,” who cries, “Would you like figures? I will give you some eloquent ones!” In the future, I hope there will be a new wave of intellectualism that insists on conjoining these two forms of thought, the numerical and the literal, figures and eloquence. It seems so much more human.

. . . . . . . .


The internet killed books again

A few years ago I created a column called “deathwatch” (macabre, I know). Its goal was to track all of the real or imagined ways that we thought media could die. Sometimes it was about the strange ways that we personify technologies, sometimes about asking why media change makes us so uncomfortable.


I don’t think I realized then that it is just a part of the culture, a necessary meme through which we make sense of things around us. As strange as it may sound, there is clearly something comforting about eulogizing media technology, like a warm blanket for the overconnected.

Consider the latest incarnations. First there was George Packer’s eloquent and almost wholly factless concern, in The New Yorker, that Amazon is single-handedly ruining books. Buying this idea—and it’s clear from the buzz that everyone did—requires two things. First, ignoring the fact that 200 million more books (meaning items, unique things) were sold this year than the year before, and that since roughly the late seventeenth century there has been no appreciable decline in the output of reading material except in times of war, plague, or famine. Second, believing that the big six publishers are good for books. These are the descendants of the same people of whom an eighteenth-century German satirist once said that they drank champagne from the skulls of authors. Now they are the last line of defense in the preservation of civilized discourse.

The second case came in a recent piece in The Guardian that lamented the decline of author advances for mid-list fiction. Rather than all books, it seems, the internet is particularly bad for “literature”—or at least the kind that is just OK, and from which people apparently have been making a living since the invention of copyright. It fits the broader narrative of the death of the middle these days. Along with the shrinking middle class we now have shrinking publishers’ mid-lists.

It should be pretty clear that both of these scenarios only make sense from the producers’ side, or I should say, certain producers. I’m sure it is a tough time to be a publisher or a mediocre author. For readers, on the other hand, it’s a great time. There is fantastic stuff coming out from niche publishers, like Tomas Espedal’s Against Art (Seagull Books), Merethe Lindstrøm’s Days in the History of Silence (Other Press), or Markus Werner’s Zündel’s Exit (Dalkey Archive Press). There are so many different ways to access books now and so many that used to be hard to get that aren’t. And despite popular opinion, university presses remain committed to producing thoughtful scholarship like Bernd Stiegler’s Traveling in Place: A History of Armchair Travel (Chicago) or the vanguard of ideas, as in this year’s prize-winning book by Sianne Ngai, Our Aesthetic Categories: Zany, Cute, Interesting (Harvard). Good writing, as well as bad, doesn’t depend on machines. Dante didn’t need the printing press or copyright (though he did need his friend Boccaccio), and neither of those has protected us from the likes of Clive Cussler.

The real problem, as everybody knows, is not that the internet is ruining writing. It’s writing. There’s just too much of it. How writers and readers can build sustainable communities without being overwhelmed or lost remains an urgent task. Making sense of the traffic of ideas will inevitably require new finding aids and new tools to sift through all the material and create new social connections. Much as Ann Blair explained in her widely read study, Too Much to Know (Yale), readers in the sixteenth century invented all sorts of new tools and techniques to keep track of and organize the printing press’s output (indices, tables, trees, lists); today we need a whole new range of sorting devices to help us access and find writing—so we can enjoy reading. It’s not that Amazon is killing books. It’s that it isn’t going far enough in facilitating our navigation of the digital deluge.

What I have come to realize is that fears of media change usually mean a threat to someone’s real or perceived authority. Mediation is about control—about who gets to say what to whom. Media change is about shifting that power dynamic. That’s why you usually hear it from the well-heeled. It’s a top-down concern, not bottom-up. Publishers have very nice offices, authors like Packer have very nice advances, and critics have over-indulged in their own charismatic pretenses. The internet continues to put all that in flux. And that is surely a good thing.

Andrew Piper is Associate Professor and William Dawson Scholar in the Department of Languages, Literatures, and Cultures at McGill University. His work explores the application of computational approaches to the study of literature and culture. He directs the Literary Topologies project and is the author most recently of Book Was There: Reading in Electronic Times. Andrew blogs at The Infernal Machine as The Quant & The Connoisseur. You can follow Andrew on Twitter @_akpiper.

. . . . . . . .
