Monthly Archives: March 2014

The New Anti-Intellectualism

Charges of anti-intellectualism in American life are as old as the Republic. It’s the inevitable consequence of being a bottom-up state and the high degree of pragmatism that comes with it. As Jules Verne wrote of the mid-nineteenth-century United States, “The Yankees are engineers the way Italians are musicians and Germans are metaphysicians: by birth.”

What makes our current moment unique is the fact that this time the fear of ideas isn’t coming from the prairies, the backwaters, or the hilltops. It’s coming from within the elite bastions themselves, those citadels of the urbane and the cosmopolitan. At stake in this revolt is nothing less than the place of quantification within everyday life. Never before has it been so fashionable to be against numerical thinking.

The dean of this new wave is of course Leon Wieseltier, editor at The New Republic, who is making something of a crusade to cow science and quantification into submission:

What [science] denies is that the differences between the various realms of human existence, and between the disciplines that investigate them, are final…All data points are not equally instructive or equally valuable, intellectually and historically. Judgments will still have to be made; frameworks for evaluation will still have to be readied against the informational deluge.

Wieseltier’s dream is that the world can be neatly partitioned into two kinds of thought, scientific and humanistic, quantitative and qualitative, remaking the history of ideas in the image of C.P. Snow’s two cultures. Quantity is OK as long as it doesn’t touch those quintessentially human practices of art, culture, value, and meaning.

Wieseltier’s goal is as unfortunate as it is myopic. It does a disservice, first, to the humanistic bases of scientific inquiry. Scientists aren’t robots—they draw their ideas from deep, critical reflection using numerical and linguistic forms of reasoning. The idea that you can separate these into two camps would make little sense to most practicing researchers. To be sure, we all know of laughable abuses of numbers, especially when applied to cultural or human phenomena (happiness studies, anyone?). This is all the more reason not to sequester number from humanistic disciplines that pride themselves on conceptual complexity. Creating knowledge fences only worsens the problem.

But it also has the effect of hardening educational trends in which we think of students as either math and science kids or reading and verbal ones. Our curricula reinforce Wieseltier’s argument, freezing it into over-simplified binaries that come at the expense of our own human potential. Most important, in my view, is the way this line of thinking lacks precedent in the history of ideas. Some of the intellectual giants of the past, people like Leibniz, Descartes, or Ludwig Wittgenstein—presumably the folks Wieseltier would admire most—were trained as mathematicians. The history of literature, too, that most prohibited territory of number’s perennial overreach, is rife with quantitative significance. Why are there nine circles of hell in Dante’s Inferno? 100 stories in Boccaccio’s Decameron? 365 chapters in Hugo’s Les Misérables? 108 lines in Poe’s “The Raven”? Not to mention the entire field of prosody: why is so much French classical theatre composed in lines of 12 syllables or English drama in 10? Why did there emerge a poetic genre of exactly 14 lines that has lasted for over half a millennium?

Such questions only begin to scratch the surface of the ways in which quantity, space, shape, and pattern are integral to the human understanding of beauty and ideas. Quantity is part of that drama of what Wieseltier calls the need to know “why”.

Wieseltier’s campaign is just the more robust clarion call of subtler and ongoing assumptions one comes across all the time, whether in the op-eds of major newspapers, the blogs of cultural reviews, or the halls of academe. Nicholas Kristof’s charge that academic writing is irrelevant because it relies on quantification is one of the more high-profile cases. The recent reception of Franco Moretti’s Distant Reading, winner of a National Book Critics Circle Award, is another good case in point. What’s so valuable about Moretti’s work on quantifying literary history, according to the New Yorker’s books blog, is that we can ignore it. “I feel grateful for Moretti,” writes Joshua Rothman. “As readers, we now find ourselves benefitting from a division of critical labor. We can continue to read the old-fashioned way. Moretti, from afar, will tell us what he learns.”

We can continue doing things the way we’ve always done them. We don’t have to change. The saddest part about this line of thought is that it is not just the voice of journalism. You hear the same refrain inside academia all the time: it (meaning the computer, or sometimes just numbers) can’t tell you what I already know. Indeed, the “we already knew that” meme is one of the most powerful ways of dismissing any attempt to bring together quantitative and qualitative approaches to thinking about the history of ideas.

As an inevitable backlash to its seeming ubiquity in everyday life, quantification today is tarred with a host of evils. It is seen as a source of intellectual isolation (when academics use numbers they are alienating themselves from the public); as a moral danger (when academics use numbers to understand things that shouldn’t be quantified they threaten to undo what matters most); and, finally, as simply irrelevant. We already know all there is to know about culture, so don’t even bother.

I hope one day this will all pass and we’ll see the benefits of not thinking about these issues in such either/or ways, like the visionary secretary of Jules Verne’s imaginary “Baltimore Gun Club,” who cries, “Would you like figures? I will give you some eloquent ones!” In the future, I hope there will be a new wave of intellectualism that insists on conjoining these two forms of thought, the numerical and the literal, figures and eloquence. It seems so much more human.

. . . . . . . .

You Must Unplug Your Life!

Maybe you missed it because you left your iPhone at home, but the fifth annual National Day of Unplugging was celebrated by certain digital malcontents on March 7th. The event’s organizer, Reboot, exhorts us “to unplug and reconnect in real life” for one day from sundown to sundown:

We increasingly miss out on the important moments of our lives as we pass the hours with our noses buried in our iPhones and BlackBerry’s, chronicling our every move through Facebook and Twitter and shielding ourselves from the outside world with the bubble of “silence” that our earphones create.

If you recognize that in yourself—or your friends, families or colleagues—join us for the National Day of Unplugging, sign the Unplug pledge and start living a different life: connect with the people in your street, neighborhood and city, have an uninterrupted meal or read a book to your child.

Reboot wants to change your life. You don’t just need to put down your iPad; you need to remake yourself. This broader injunction to self-reinvention reveals just what “unplugging” and “detoxing” are all about, and it helps us better understand the handwringing about too much information and digital distraction that has been a constant of cultural criticism of late.

The National Day of Unplugging grew out of the group’s Sabbath Manifesto, a ten-point list of edifying imperatives that will, if heeded, make you a better person. Among the many injunctions are “Connect with loved ones,” “Nurture your health,” “Get outside,” and, my favorite, “Drink wine”—all laudable activities that I would encourage you to try out if you haven’t. But the one that tops the list is “Avoid technology.” And this one makes less sense to me. How are we to “avoid” technologies? And why exactly?

Avoiding technology may sound like a noble feat of asceticism, but it’s neither possible nor desirable. Technologies are part of us. They aren’t just fungible tools that we can set aside when we want to be more human. They help constitute what it is to be human. To pretend otherwise is naive and self-defeating. Unplugging from our digital devices, as Casey N. Cep points out,

doesn’t stop us from experiencing our lives through their lenses, frames, and formats. We are only ever tourists in the land of no technology, our visas valid for a day or a week or a year, and we travel there with the same eyes and ears that we use in our digital homeland. That is why so many of those who unplug return so quickly to speak about their sojourns. The ostentatious announcements of leave-taking (“I’m #digitaldetoxing for a few days, so you won’t see any tweets from me!” “Leaving Facebook for a while to be in the world!”) are inevitably followed by vainglorious returns, excited exclamations having turned into desperate questions (“Sorry to be away from Twitter. #Digitaldetox for three WHOLE days. Miss me?” “Back online. What did I miss?”).

The idea of “unplugging” assumes that a brief hiatus from your favorite device or app will have a cleansing effect. But who among us plans on living without our technologies? We all make plans to return under the illusion that we’ve actually done something good for ourselves, that we’ve changed our lives by turning off our iPhone. These earnest efforts at digital detoxing distract from just how enmeshed we are in our technologies. The dream of a world without technologies, however short-lived, is not sustainable. We need practices and norms to help guide us through this era of technological change. We need repetition: practice, practice, practice! One day a year won’t do it.

. . . . . . . .

The Internet Killed Books Again

A few years ago I created a column called “deathwatch” (macabre, I know). Its goal was to track all of the real or imagined ways that we thought media could die. Sometimes it was about the strange ways that we personify technologies, sometimes about asking why media change makes us so uncomfortable.

I don’t think I realized then that it is just a part of the culture, a necessary meme through which we make sense of things around us. As strange as it may sound, there is clearly something comforting about eulogizing media technology, like a warm blanket for the overconnected.

Consider the latest incarnations. First there was George Packer’s eloquent and almost wholly factless concern, in The New Yorker, that Amazon is single-handedly ruining books. Buying this idea—and it’s clear from the buzz that everyone did—rests on two basic premises. First, it ignores the fact that 200 million more books (meaning items, unique things) were sold this year than the year before, and that since roughly the late seventeenth century there has been no appreciable decline in the output of reading material except in times of war, plague, or famine. Second, it depends on your believing that the big six publishers are good for books. These are the descendants of the same people of whom an eighteenth-century German satirist once said that they drank champagne from the skulls of authors. Now they are the last line of defense in the preservation of civilized discourse.

The second case came in a recent piece in The Guardian that lamented the decline of author advances for mid-list fiction. Rather than all books, it seems, the internet is particularly bad for “literature”—or at least the kind that is just OK, and from which people apparently have been making a living since the invention of copyright. It fits the broader narrative of the death of the middle these days. Along with the shrinking middle class we now have shrinking publishers’ mid-lists.

It should be pretty clear that both of these scenarios only make sense from the producers’ side, or I should say, certain producers. I’m sure it is a tough time to be a publisher or a mediocre author. For readers, on the other hand, it’s a great time. There is fantastic stuff coming out from niche publishers, like Tomas Espedal’s Against Art (Seagull Books), Merethe Lindstrøm’s Days in the History of Silence (Other Press), or Markus Werner’s Zündel’s Exit (Dalkey Archive Press). There are so many different ways to access books now and so many that used to be hard to get that aren’t. And despite popular opinion, university presses remain committed to producing thoughtful scholarship like Bernd Stiegler’s Traveling in Place: A History of Armchair Travel (Chicago) or the vanguard of ideas, as in this year’s prize-winning book by Sianne Ngai, Our Aesthetic Categories: Zany, Cute, Interesting (Harvard). Good writing, as well as bad, doesn’t depend on machines. Dante didn’t need the printing press or copyright (though he did need his friend Boccaccio), and neither of those has protected us from the likes of Clive Cussler.

The real problem, as everybody knows, is not that the internet is ruining writing. It’s writing. There’s just too much of it. How writers and readers can build sustainable communities without being overwhelmed or lost remains an urgent task. Making sense of the traffic of ideas will inevitably require new finding aids and new tools to sift through all the material and create new social connections. Much as Ann Blair explained in her widely read study Too Much to Know (Yale), readers in the sixteenth century invented all sorts of new tools and techniques to keep track of and organize the printing press’s output (indexes, tables, trees, lists); today we need a whole new range of sorting devices to help us access and find writing—so we can enjoy reading. It’s not that Amazon is killing books. It’s that it isn’t going far enough in facilitating our navigation of the digital deluge.

What I have come to realize is that fears of media change usually mean a threat to someone’s real or perceived authority. Mediation is about control—about who gets to say what to whom. Media change is about shifting that power dynamic. That’s why you usually hear it from the well-heeled. It’s a top-down concern, not bottom-up. Publishers have very nice offices, authors like Packer have very nice advances, and critics have over-indulged in their own charismatic pretenses. The internet continues to put all that in flux. And that is surely a good thing.

Andrew Piper is Associate Professor and William Dawson Scholar in the Department of Languages, Literatures, and Cultures at McGill University. His work explores the application of computational approaches to the study of literature and culture. He directs the Literary Topologies project and is the author most recently of Book Was There: Reading in Electronic Times. Andrew blogs at The Infernal Machine as The Quant & The Connoisseur. You can follow Andrew on Twitter @_akpiper.

. . . . . . . .

Who’s Afraid of Nate Silver?

On Monday Nate Silver unveiled his revamped 538 blog. Flush with ESPN cash, Silver has collected a team of writers, editors, and “quantitative editors” to carry out an experiment in what he calls “data journalism.”

After Silver left The New York Times, Margaret Sullivan, the Times’ public editor, said that he had “disrupted” the Grey Lady’s culture of journalism, especially its political journalism. Silver’s

entire probability-based way of looking at politics ran against the kind of political journalism that The Times specializes in: polling, the horse race, campaign coverage, analysis based on campaign-trail observation, and opinion writing, or “punditry,” as he put it, famously describing it as “fundamentally useless.” Of course, The Times is equally known for its in-depth and investigative reporting on politics. His approach was to work against the narrative of politics – the “story” – and that made him always interesting to read.

In the ramp-up to the launch of the new 538, Silver took a few shots at “traditional” journalism and its proclivity toward punditry. He singled out the opinion pages of the Times, The Washington Post, and the Wall Street Journal in particular. Op-ed columnists at these bastions of cultural authority thrive on simplicity and a reliance on one idea regardless of the subject:

They don’t permit a lot of complexity in their thinking. They pull threads together from very weak evidence and draw grand conclusions based on them. They’re ironically very predictable from week to week. If you know the subject that Thomas Friedman or whatever is writing about, you don’t have to read the column. You can kind of auto-script it, basically.

It’s people who have very strong ideological priors, is the fancy way to put it, that are governing their thinking. They’re not really evaluating the data as it comes in, not doing a lot of [original] thinking. They’re just spitting out the same column every week and using a different subject matter to do the same thing over and over.

It’s ridiculous to me that they undermine every value that these organizations have in their newsrooms. It’s strange. I know it’s cheaper to fund an op-ed columnist than a team of reporters, but I think it confuses the mission of what these great journalistic brands are about.

Silver’s answer to this is data journalism, and 538 is an experiment in how it might be done.

Some find Silver’s talk about journalism, data, and evidence rather empty. Offended by Silver’s attack on opinion, the cantankerous Leon Wieseltier defends “bullshit,” Silver’s shorthand for op-ed journalism, and asks more simply: what’s wrong with conviction? Not every judgment or opinion is a matter of facts that can be culled from the data.

Many of the issues that we debate are not issues of fact but issues of value. There is no numerical answer to the question of whether men should be allowed to marry men, and the question of whether the government should help the weak, and the question of whether we should intervene against genocide. And so the intimidation by quantification practiced by Silver and the other data mullahs must be resisted. Up with the facts! Down with the cult of facts!

In his excitement to tar Silver as a hard-headed positivist oblivious to the varieties of evidence and argumentation, and to protect us all from the “intimidation by quantification practiced by Silver and the other data mullahs,” Wieseltier makes a hard-and-fast distinction between fact and value.

And this is where Wieseltier’s embrace of commentary through conviction sounds more like commentary through charisma. Silver’s point is not that data are facts that speak from an unmediated truth to those who only have the will to listen. Data is not easily collected, organized, explained, and generalized. Data is more like a process. There’s no such thing as raw data. Data is hard won, theoretically complicated, and wrapped up with questions of value, questions that Wieseltier claims Silver and all his “intimidating” fellow data journalists fear. But that’s just not true. For Silver, the point is that data journalism

isn’t just about using numbers as opposed to words. To be clear, our approach at FiveThirtyEight will be quantitative — there will be plenty of numbers at this site. But using numbers is neither necessary nor sufficient to produce good works of journalism. […] Indeed, as more human behaviors are being measured, the line between the quantitative and the qualitative has blurred.

Silver is trying to figure out what counts as evidence, argument, fact, and value in a digital world awash with easily accessible information. Data, as he puts it, does not have a “virgin birth.”

It comes to us from somewhere. Someone set up a procedure to collect and record it. Sometimes this person is a scientist, but she also could be a journalist.

Wieseltier’s appeals to “conviction” are meant to avoid such complications and force us into the sure, steely arms of the charismatic critic, someone who believes in something and has no patience with “diffidence.”

. . . . . . . .

The Spectacle and the Square

The ancient Greeks gave European civilization three great political technologies: the spectacle, the square, and rhetoric. This long winter we have seen each at work in remarkable ways in Russia and Ukraine. In Sochi we saw pure spectacle in Putin’s Olympics. In Kiev we saw the power of the square, as protestors gathered in Maidan, or Independence Square, to practice rhetoric (well before they clashed with riot police). Now that we may be moving toward a disaster of one kind or another in Ukraine, we would do well to reflect on the present life of these ancient political technologies as they are playing out in Putin’s so-called Eurasia.

The intellectual context here is the work of the Russian political scientist Aleksandr Dugin, whom some describe as the founder of the ideology of Eurasianism, or National Bolshevism, and whose thought has been enormously influential in government circles in Putin’s Russia. Like many other post-cold war intellectuals—most famously Francis Fukuyama—Dugin pronounces “the end of ideology” in The Fourth Political Theory, arguing that of the three great ideologies of the twentieth century—communism, fascism, and liberalism—the last, and the last alone, has won.

The consequence has not only been the defeat of powerful alternate dreams of political and economic order, but the end of liberalism itself as an “ideology”—that is, as a consciously propagated and debated belief system. The end of the cold war, we might say, brought the end of liberal rhetoric. No longer the subject of debate, liberalism is in Dugin’s words “a kind of mandatory given.” The result is “the postmodern metamorphosis of liberalism into the form of postmodernity and globalization,” a kind of global society of the spectacle where peoples everywhere are subject to the logics of consumer capitalism and highly individualized “human rights.”

The “Fourth Political Theory” (i.e., Eurasianism), in Dugin’s view, represents a “crusade” against postmodernity and its co-conspirators. And yet, in a strikingly Foucauldian move, Dugin argues that the “Fourth Political Theory” must not reject postmodernism so much as learn from it and reappropriate it: “The Fourth Political Theory must draw its ‘dark inspiration’ from postmodernity, from the liquidation of the program of the Enlightenment, and the arrival of the society of the simulacra, interpreting this as an incentive for battle rather than a destiny.”

And so we come to Sochi, where an obscure Russian village was transformed into an opulent stage for the spectacle of the Olympics. Whereas for the Greeks the virtue of the spectacle was found in a reprieve from the ordinary order of things, above all battle, for Putin the spectacle was reinvented as a field for his Eurasian ideological battle. To hell with “human rights,” Putin’s Olympics announced (gays were the “symbolic” target here), let’s put on a muscular show. And so, too, we have since been faced with the spectacle of Crimea and the southern territories of Ukraine, where military posturing and “protest tourists” have reappropriated the American-born postmodern war toward the ends of Eurasianism.


So much for the spectacle; what of the square? Since last fall, Kiev’s Maidan, or Independence Square, has been the site of such pressure on the notion of “the end of ideology” and the triumph of the “postmodern era” that, whatever the future violent triumphs of Eurasianism, it will never be able to credibly claim to have raised ideology from its twentieth-century grave. The people on the Maidan have done that. As Timothy Snyder has beautifully written,

What does it mean to come to the Maidan? The square is located close to some of the major buildings of government, and is now a traditional site of protest. Interestingly, the word maidan exists in Ukrainian but not in Russian, but even people speaking Russian use it because of its special implications. In origin it is just the Arabic word for “square,” a public place. But a maidan now means in Ukrainian what the Greek word agora means in English: not just a marketplace where people happen to meet, but a place where they deliberately meet, precisely in order to deliberate, to speak, and to create a political society. During the protests the word maidan has come to mean the act of public politics itself, so that for example people who use their cars to organize public actions and protect other protestors are called the automaidan.

In ancient Greek society, the spectacle and the square were two distinct but complementary political technologies. In Russia and Ukraine, during the course of this long and fateful winter, they have become representative of two, now opposing, political and “ideological” possibilities for the twenty-first century.

. . . . . . . .

Who Should Professors Write For?

Two weeks ago Nicholas Kristof proved yet again that it isn’t news until The New York Times prints it. In his weekly column, Kristof lamented university professors’ self-imposed irrelevance:

Some of the smartest thinkers on problems at home and around the world are university professors, but most of them just don’t matter in today’s great debates.

The most stinging dismissal of a point is to say: “That’s academic.” In other words, to be a scholar is, often, to be irrelevant.

And the primary reason for the gap between scholars and this public that Kristof says they should be writing for: an academic culture “that glorifies arcane unintelligibility,” rewards hyper-specialization, and revels in jargon. Kristof was particularly perplexed that academics continued their cloistered ways when they

have a growing number of tools available to educate the public, from online courses to blogs to social media. Yet academics have been slow to cast pearls through Twitter and Facebook.

I write this in sorrow, for I considered an academic career and deeply admire the wisdom found on university campuses. So, professors, don’t cloister yourselves like medieval monks — we need you!

The reactions from academics across disciplines were swift, mocking, and exasperated. It turns out that professors are writing for all kinds of audiences and in a variety of forms. (Check out these blogs, for instance, on political science, medieval book history, and current events for a taste.) And, in case you hadn’t heard, it’s rather difficult to place a piece in the New Yorker or The New York Times, Kristof’s idea of a public medium.

In addition to these responses by professors, a number of journalists who once aspired to become academics wrote about the frustrations that drove them to leave academia in pursuit of a broader audience. Josh Marshall at Talking Points Memo described a pivotal conversation with his faculty advisor, the historian Gordon S. Wood, at Brown:

Once when I was trying to figure out what I was doing I headed up to Wood’s office to discuss it with him. Wood was generous and kind and always encouraging to me but rather distant as an advisor. At one point in our conversation, he laid it on the line. “You need to decide whether you’ll be satisfied with writing for an audience of two or maybe three hundred people.”

Clearly, the correct answer to this was “yes.” And as Wood said it, then and now I have the sense he thought posing it in this way would get me back on track with a focus on the scholarly community we were a part of. But hearing it so starkly, in my mind my response was something more like, “Holy Crap, no way! That’s definitely nowhere near enough people. And worse yet, I know some of those people. And I definitely don’t want to write for them.”

So Marshall left the academy for a career in journalism, to write about “the great issues of the day” and extricate himself from the intellectual and cultural constraints of thinking and writing as an academic. He left because the incentive structure—what gets you published, what gets you tenured, what gets you promoted—was “geared against engagement with the world outside of academics.” If a professor writes for a blog, publishes in a non-peer-reviewed venue (the New Yorker or The New York Times, for example), or gives a public lecture for the local historical society, she doesn’t get credit for any of it; that is, in the logic of academic advancement, those outward-facing engagements won’t help her life as an academic. Academics write for academics.

Another aspiring-academic-turned-journalist, Joshua Rothman at the New Yorker, expanded on this academic system that produces such bad writing and half-developed intellects and contrasted it to journalism. Both journalism and academia are undergoing broad transformations, but they’re going in different directions, according to Rothman. Cultural forces are pushing journalism toward populism and accessibility, as new digital technologies continue to lower the barriers to entry. Anybody can publish and everybody can be a journalist. And established journalists, and established media, have to work hard to make their voices heard amidst a cacophony of blogs, tweets, and websites. In academia, however, these same forces of cultural and technological change are pushing the academy in the opposite direction, “toward insularity.”

As in journalism, good jobs are scarce—but, unlike in journalism, professors are their own audience. This means that, since the liberal-arts job market peaked, in the mid-seventies, the audience for academic work has been shrinking. Increasingly, to build a successful academic career you must serially impress very small groups of people (departmental colleagues, journal and book editors, tenure committees). Often, an academic writer is trying to fill a niche. Now, the niches are getting smaller. Academics may write for large audiences on their blogs or as journalists. But when it comes to their academic writing, and to the research that underpins it—to the main activities, in other words, of academic life—they have no choice but to aim for very small targets. Writing a first book, you may have in mind particular professors on a tenure committee; miss that mark and you may not have a job. Academics know which audiences—and, sometimes, which audience members—matter.

It won’t do any good, in short, to ask professors to become more populist. Academic writing and research may be knotty and strange, remote and insular, technical and specialized, forbidding and clannish—but that’s because academia has become that way, too. Today’s academic work, excellent though it may be, is the product of a shrinking system. It’s a tightly-packed, super-competitive jungle in there. […] If academic writing is to become expansive again, academia will probably have to expand first.

The knotty insularity of academic writing is not new, however, and neither are the complaints about its irrelevance. But this is not just another nothing-new-under-the-sun observation. The sharp differences between scholarly and popular forms of writing first crystallized at another moment of technological change.

Between 1770 and 1800 in Germany, the number of printed titles increased 150 percent. Many intellectuals and scholars celebrated the increased availability of print as a promise of Enlightenment. More print, they reasoned, meant greater access to it and, thus, greater access to knowledge. Print would inform, cultivate, and enlighten. But, as print continued to expand, others began to doubt its promises. A lot of what was printed didn’t really look like knowledge. It looked more like re-circulated opinions or just plain nonsense. And that was precisely the question: in a world of easily accessible information, how could you tell what was real knowledge and what was junk? How could you filter out all the dross that modern print produced?

One solution was to start writing in a different language, one not so easily commodified and circulated. This was the route that the German philosopher Immanuel Kant chose.

Kant, a professor at the University of Königsberg in the furthest reaches of Prussia, was by no means a bestselling author, but he began his career writing relatively accessible texts in philosophy that were characterized by a free, open style, prone to imaginative digressions. But with the publication of his landmark Critique of Pure Reason in 1781, all that changed. The first reviewers wondered what had happened to the eminently readable Kant. They dismissed the Critique‘s abstruse, impenetrable language as a cover for bad thinking. It was “incomprehensible to the greatest part of the reading public”; it hovered “in the clouds.” Another reviewer went even further and decried Kant’s technical, jargon-filled language as the real philosophical issue. True knowledge, as he put it, should be “popular,” true to experience, and accessible to a broad reading public.

Kant was not impressed. In the introduction to the second edition of the Critique, he confronted his critics by refuting their entire premise. Real philosophy didn’t need to be popular. In fact, the pursuit of popularity and a broad public was philosophy’s problem. Not only did he not regret the Critique’s unpopular language; he wished he had written it in an even more “armour-like” fashion. He wished that he had made it more impenetrable than it already was. But why? Why all the philosophical jargon and technical arguments? Why couldn’t Kant just write in clear, engaging prose?

For Kant, his contemporaries wrote in an easily accessible style not out of philosophical or ethical high-mindedness but out of a desire to sell books. They knew as well as he did that the modern print market had little patience for rigorous, complicated arguments. As the market had expanded over the course of the eighteenth century, so too had its audiences. It no longer catered to a few learned scholars but to an ever-expanding and increasingly diverse public interested more in opinions and memes than in original, complex ideas.

Kant chose to write in a self-consciously “scholastic” style, then, not only because he thought his subject required it, but because he wanted to distinguish his work from what he considered the mindless, commodified books of the popular press.

Kant’s dilemma crystallized the growing divide between scholarly and popular kinds of writing. For many of Kant’s Enlightenment contemporaries, the technological change that the proliferation of print represented carried with it an obvious imperative: all knowledge should be shared broadly. But for others, such as Kant, the imperative to share and write for an ever-expanding public raised a number of concerns. First, just because print was more widely available did not mean that everything that was printed was of good quality. There was a lot of junk, and as yet little sense as to how to filter through it all to find what was actually worth reading. Second, new knowledge meant more specialized knowledge. In order to advance knowledge, scholars had to dig deeper and make ever finer distinctions. All those arcane details that Kant’s critics dismissed as superfluous were necessary if Kant was going to move beyond his philosophical predecessors. Kant didn’t engage in complex, highly technical discussions of Leibniz’s concept of space or Hume’s skepticism for the hell of it. Technical and narrowly focused arguments were the price he, or his readers, had to pay for new knowledge. And he rightly observed that this type of scholarly knowledge conflicted with the imperative to make knowledge widely available, to popularize it. And it certainly did not jibe with the demands of the modern print market.

In response to these two conflicting pressures, to advance knowledge and popularize it, Kant drew a sharp distinction between scholarly and popular knowledge. However technical his philosophy got, he always insisted that knowledge was ultimately practical and oriented towards the world. He simply thought that his critics had gotten it backwards. They demanded popularity on the front end, but Kant insisted that popularity was something that should only be pursued after the hard, rigorous work of the scholar had been finished. “Popularity,” he wrote, should never be the “beginning of science.” Knowledge should be shared more broadly, beyond the guild of scholars, only after it had been generated. If there were no actual knowledge, then there would be nothing to disseminate, nothing to popularize. Kant called the one scholastic knowledge and the other world knowledge.

Our own confusions about the proper language and audience for academics echo Kant’s sharp division between the popular and the scholarly. The problem with the academic system, however, is not simply its specialized, highly technical languages. In a world of increasing complexity, knowledge is advanced through specialization and its technical forms. As knowledge becomes more complex, so too do the concepts and the languages we use to explain it and expand upon it. Kant distinguished scholarly and popular work so that knowledge could advance, but he never considered scholarly work to be an end in itself. It was always oriented, if not subordinated, to broader ends.

Kant embraced the distinction between academic and popular audiences, and between academic and popular writing, as essential to a particular kind of thinking. The question we might ask ourselves today is whether that simple dichotomy holds true in our digital moment. It may be that the newer media are dissolving the distinctions in surprising and productive ways.

. . . . . . . .
