Monthly Archives: May 2014

You must change your graduate program!

Of all the hyperbole surrounding the fate of the humanities today, the problems facing graduate studies seem the least exaggerated. The number of PhDs has far outpaced the number of available full-time jobs. Financial support is both inadequate and unevenly distributed, requiring students to defer earnings over long periods of time, assume unnecessary debt, and compete against differently funded peers.

Overspecialization has made knowledge transfer between the disciplines, not to mention between the academy and the rest of the workforce, increasingly difficult. The model of mentorship that largely guides student progress, now centuries old, seems increasingly out of touch with a culture of non-academic work, so that students are ill-prepared to leave the academic track. Time-to-degree has not only failed to shorten; increasingly it also correlates with lower success rates—the longer students stay in the PhD track, the lower their chances of full-time employment.

Philosophy Seminar Table at University of Chicago

The hegemony of the seminar as the only model of learning is also at odds with much recent thinking about learning and intellectual development. Undoubtedly, numerous teachers and students alike would describe at least some, if not many, of their seminar experiences as profoundly uninspiring. Add to that the way we largely operate within a developmental model premised on major phase-changes that dot otherwise stable, and largely flat, plateaus. The long time between exams or their sink-or-swim nature does little to promote long-term incremental development of students as thinkers, writers, or teachers. We think more in terms of a nineteenth-century-inspired model of botanical metamorphosis, with its inscrutable internal transformations, than we do incremental, cumulative performances.

There are also bio-political aspects to the crisis that have recently been raised, where the most intense periods of work and the most intense periods of insecurity overlap precisely within the normal timeframe of human fertility. This PhD model seems downright Saturnian, consuming its own offspring.

What is there to like about this scenario?

Luckily, the MLA has issued a report about graduate education. “We are faced with an unsustainable reality,” the report states. Indeed. And then come the all-too-familiar platitudes. Maintain excellence. More teaching. Shorter time-frames. Innovate. Better connection with non-academic jobs. Advocate for more tenure-track positions.

As the clichés pile up, so do the contradictions. Get out faster, but spend more time teaching. Keep up those rigorous standards of specialization, but do it with haste (and be interdisciplinary about it). No teaching positions available? Look for another kind of job! Persuade the university to create more tenure-track positions—whom do I call for that one?

Will the MLA report lead to changes? It’s doubtful. Sure, we’ll crack down on time to degree without really changing requirements. We’ll spice up course offerings and maybe throw in an independent project or two (digital portfolios!). We’ll scratch our heads and say the PhD would be a great fit for a job in consulting, or in a museum, maybe even Google—and then do absolutely nothing. Five or ten years from now, we’ll talk about a crisis in the humanities, the shrinking of the field, and notice that, once again, there seem to be fewer of us hanging around the faculty club.

Nothing will change because we don’t have to. As long as there are too many graduate students, there is no problem for faculty. And no matter what we say, what we do is always the same thing. Try approaching your department and saying you need to change the scope, scale, content, and medium of the dissertation. You’re in for a fun conversation.

Try implementing a mandatory time-limit with yearly progress reports and consequences for failure and you’ll be barraged with so many exceptions your Agamben will start to hurt. What do employers want from PhDs—good writing skills, general knowledge, analytical capability, facility with numbers, strong work habits, works well with others? Sorry, can’t help you. None of our students has those skills since our program doesn’t emphasize them (but we have a writing center!).

Nothing will change because we don’t have to. We’re so conservative at heart we’d rather die out with our beliefs intact than do anything that might actually better serve the student population. We’ll continue to point to the exceptions without realizing how much they still look like something we didn’t want to change.

The PhD did something once, or rather it did one thing and it did it reasonably well. It still does that one thing, which is now, quantitatively speaking, vastly unnecessary. Some have suggested scaling up to meet the sciences on their own ground (and here at The Infernal Machine, too). I would suggest that we scale down to meet the needs of the world at large. More modular, more flexible, more creative, more varied, more timely, more general, more collaborative, and more relevant. Until we have any proof that our programs are feeders for jobs outside the academy, we’re just failing by another name.

We can either change in substantive ways or pretend to do something else while actually continuing to do the same things we’ve always done. The MLA report looks a lot like the latter and no doubt so will most of the responses to it.

I’m looking forward to next year’s report. How many ways can you play the same tune?


. . . . . . . .



Big Humanities

[Editor’s Note: This is the second installment in The Humanities in Full series.]

Before there was big science or big data, there was big humanities. Until the last third of the nineteenth century, the natural and physical sciences imitated many of the methods and practices of the humanities, especially disciplines like philology, which pioneered techniques in data mining, the coordination of observers, and the collection and sorting of information—what Lorraine Daston terms practices of “collective empiricism.”

One of the most successful and long-lasting projects was led by the Berlin philologist August Böckh. In a proposal to the Prussian Academy of Sciences in Berlin in 1815, Böckh and his colleagues requested funding for a long-term project to collect as completely as possible all Greek inscriptions, printed, inscribed, and holograph. Eventually published in four installments as Corpus Inscriptionum Graecarum between 1828 and 1859, with an index in 1877, Böckh’s project was an organizational feat that relied on the work of hundreds of philologists over decades, and it quickly became a model for German scholarship in all fields. “The primary purpose of a Royal Academy of the Sciences,” he wrote, should be to support the type of work that “no individual can accomplish.” The project collected, stored, preserved, and evaluated data. And in Böckh’s case, the data were Greek inscriptions scattered across the Mediterranean.

Böckh’s Corpus Inscriptionum Graecarum was just a prelude. In his inaugural lecture to the Prussian Academy of Sciences in 1858, Theodor Mommsen, one of Germany’s foremost classical scholars, declared that the purpose of disciplines like philology and history was to organize the “archive of the past.” What Mommsen had in mind, as would become evident in the kinds of projects he supported, was not some abstract archive of immaterial ideas. He wanted scholars to collect data and shape it into meticulously organized and edited printed volumes in which the “archive” would take tangible form. Work on the scales that Mommsen imagined would require international teams of scholars and the “liberation” of scholars from what he dismissed as the “arbitrary and senseless” divisions among the disciplines.

As secretary of the Prussian Academy of Sciences, Mommsen set out to institutionalize his vision of big philology, or what he termed the “large scale production of the sciences” [Grossbetrieb der Wissenschaften]. After securing a three-fold increase in the Academy’s budget, he supported a series of monumental projects. He oversaw the internationalization and expansion of the Corpus Inscriptionum Latinarum, the Latinate counterpart to Böckh’s project that sought to collect all inscriptions from across the entire Roman Empire. It eventually collected more than 180,000 inscriptions and grew to 17 volumes plus 13 supplementary volumes. Mommsen also helped church historian Adolf Harnack secure 75,000 Marks and a 15-year timeline for a project on Greek-Christian Authors of the First Three Centuries, the modest goal of which was to collect all of the hand-written manuscripts of early Christianity. Other projects included a prosopography of ancient Rome funded for a period of ten years.

Looking back on what Mommsen had accomplished for modern scholarship, the German philologist Ulrich von Wilamowitz-Moellendorff wrote:

The large scale production of science cannot replace the initiative of the individual; no one knew that better than Mommsen. But in many cases the individual will only be able to carry out his ideas through large scale production.

Theodor Mommsen (Wikimedia Commons)


Figures such as Böckh and Mommsen introduced different scales to knowledge creation and different skill sets to humanistic scholarship. They developed, coordinated, and managed teams of people in order to organize huge sets of texts and data.

But to what end? What was the purpose of all this collecting, organizing, and managing? This was the question that transformed Germany’s most self-loathing philologist into a philosopher. A wunderkind trained in Leipzig, Friedrich Nietzsche was appointed professor of classical philology at the University of Basel at the age of 24, before he had even finished his doctorate. Just as Mommsen was busy assembling the “archive of the past,” Nietzsche began to diagnose modern culture, not to mention himself, as suffering from a bad case of “academic knowledge,” or Wissenschaft.

In We Philologists, Nietzsche excoriated his fellow scholars for abdicating philology’s real task. Ultimately, he argued, philology was not about advancing knowledge or building an “archive of the past.” It was about forming stronger, healthier human beings on the model, or at least the idealized model, of the ancient and classical Greeks. The real philologist was a lover of antiquity, someone who sought to transform himself through an encounter with a superior culture. Every good and worthwhile science, he wrote, should be kept in check by a “hygienics of life”—practices by which whatever one learned could be integrated into how one lived.

Despite his stylized iconoclasm, Nietzsche was a traditional German Grecophile for whom antiquity of the Greek sort was a moral utopia. But he was also a modern scholar struggling to come to terms with the ascendant research university and what we recognize today as its basic elements: the division of intellectual labor, academic specialization, and the constant struggle to integrate new technologies and practices for sifting through and making sense of the past.

Nietzsche’s polemics against big philology were precursors to contemporary anxieties about what might become of the humanities in the digital age. Data—be it 180,000 inscriptions or hundreds of digitized novels—cannot speak for itself, but it is never incoherent. It’s always collected, organized, edited, framed, and given meaning, whether in nineteenth-century printed volumes or twenty-first century graphs. With his bombast and passions, Nietzsche made the case for values and interpretation at a moment when textual empiricism was ascendant and positivism loomed. “Yes, but how are we to live!” was his constant refrain.

The catch, of course, is that most of us aren’t Nietzsche, though arguably too many contemporary scholars of the strongly critical-theoretical bent aspire to be. Scholarship and knowledge might be better served if many such would-be master interpreters settled for the humble but necessary drudgery of collecting, annotating, and commenting on the “archive of the past,” maintaining cultural inheritances and providing invaluable grist for the equally important job of hermeneutics. We shouldn’t forget that Nietzsche the philosopher, the moral psychologist who diagnosed the ethical ills of modernity, grew out of Nietzsche the philologist, the erudite scholar who reverentially tended ancient traditions and texts.

Nineteenth-century practices of collecting and evaluating data don’t exhaust the work of the humanities, but they highlight a broader history of the humanities in which collecting and evaluating data has been a central and even noble pursuit. Thinking of the humanities and the sciences in terms of what humanists and scientists actually do might help us develop a longer history of the humanities and see continuities that simple polemics only conceal. Nietzsche and his fellow nineteenth-century philologists struggled to reconcile more interpretive methods with historical approaches, to blend pleasure and delight with critical distance, and to temper particularity with timeless value. But Nietzsche represents only one side of the debate. While his critiques of the utopian impulses of big philology were necessary correctives, he ultimately left the university and withdrew to a life in extremis, writing at the edge of lucidity and under the shadow of genius.

. . . . . . . .


The Wall Must Stand: Innovation at the New York Times

“In the future,” writes digital scholar and Twitter wit Ian Bogost, “all news will be about, rather than in, the New York Times.”

That future seemed to arrive last week, and not only with the controversy unleashed by the abrupt firing of executive editor Jill Abramson. Possibly as part of the storm, someone in the newsroom leaked a 96-page document with detailed proposals for bringing the Times more fully into the digital age and—even more important—making the Grey Lady more “Reader Experience”-friendly.


Nieman Journalism Lab’s Joshua Benton calls the report “one of the most remarkable documents I’ve seen in my years running the Lab.” He even tasked three of his staffers with excerpting highlights from it. But the whole thing merits reading by anyone interested in the possible (or inevitable?) future of the news.

Not that there is anything truly new or surprising on any page of “Innovation,” as the study is so grandiloquently titled. Put together by a six-member team led by Arthur Gregg Sulzberger, the publisher’s son and heir apparent, the report is a compendium of ideas, strategies, arguments, and veiled threats familiar to anyone who has worked in or around newsrooms during the last decade or so. From the title on, it buzzes with the kind of jargony nostrums that fuel TED Talks and South by Southwest conferences, from impact toolboxes and repackaging old content in new formats to making journalists their own content promoters and integrating the Reader Experience team with the newsroom to, well, anything else that helps counter the disruptive incursions of new media upstarts like Buzzfeed and Huffington Post.

And why not counter those disrupters? As the report frequently notes, the NYT produces consistently superior content but does an almost-as-consistently inferior job of getting its content out to its readers. In some cases, competitors are even more successful at distributing Times content than the Times itself. It makes little sense to remain satisfied with such a status quo.

But reading the report invites suspicion on at least two counts. The first is quite immediate: How is it possible that these objectives haven’t already been accomplished? (And, in fact, is it possible that many of them have, as some NYT insiders say, proving once again that many strategic studies merely confirm the direction in which the institution is already heading?) My incredulity arises from the facts presented in the report itself, namely the astonishing number of Times employees already dedicated to Reader Experience activities (which, as the report notes, “includes large segments of Design, Technology, Consumer Insight Group, R&D, and Product”).

The problem, explicitly remarked upon at several points in the report, appears to be turf wars.  Some of this is simply silly, such as the exclusion of Reader Experience people from key editorial meetings or other instances of uncollegial shunning. But I suspect the problem also stems from something far less silly, indeed, from the most fundamental of political-institutional questions: Who, at the end of the day, will be in command of the combined newsroom and Reader Experience personnel? Will it be the editorial leadership or the business leadership?

That question can’t be finessed, fudged, blurred, or deferred. It must be answered forthrightly, because if it isn’t, the very purpose of a serious news organization becomes unclear.

And that leads to my second big concern. What is the real goal of “innovation” at the New York Times? Is it intended primarily to enable the editorial leaders to use and inculcate the best practices of distribution, with additional staff possessing advanced skills in those practices, in order to support and advance strong journalism? Or is it intended primarily to increase the number of Reader Experiences as measured through analytics and other metrics at the expense, in the long or short run, of the highest quality of journalism? If the former, I am on board—who wouldn’t be?

Which is why the political question must be answered first. If not, and if the new and enhanced newsroom ends up being run by the business side, then decisions will be made that will slowly erode the quality of the journalistic content. If content packagers and social media community managers answer ultimately to publishing executives and not to editors, then they will be able to demand the kind of content—whimsical features, for example, rather than hard reporting—that tends to trend most strongly.  The sad fact is that cute cat stories always sell better than revelations about city hall. The number of hits, likes, or visits will gradually but inevitably determine the editorial agenda.

Overstated? Simplistic? I don’t think so. When the ultimate purpose of a news organization is something that can be evaluated almost exclusively by metrics, then you can be sure you are no longer talking about a news organization.  A media company, perhaps, but not a journalistic one.

The report calls for some breaching of the editorial-publishing (or church-state) firewall. That sounds suspect to me. What it should call for is the migration of some of those Reader Experience departments and personnel to the editorial side of the firewall. The wall itself must stand. Or the journalism will fall.

Jay Tolson is the Executive Editor of The Hedgehog Review.


. . . . . . . .


The Humanities in Full: Polemics Against the Two-Culture Fallacy

The New Republic does not like the digital humanities. Following Leon Wieseltier’s earlier diatribes, Adam Kirsch recently warned that the digital humanities and their “technology” were taking over English departments. Kirsch posed some reasonable questions: Are the digital humanities a form of technological solutionism? No, notwithstanding the occasionally utopian strand. Are the digital humanities “post-verbal”? With all their graphs, charts, and network visualizations, do they aspire to a discourse of mere pictures and objects? No and no. With all their generously funded projects, are they embracing the “market language of productivity to create yet another menacing metric for the humanities?” A good question that deserves thoughtful responses (here and here).

But Kirsch’s essay isn’t really about the digital humanities. It’s about the humanities more broadly and Kirsch’s truncated and ahistorical vision of what they ought to be. The problem with the digital humanities, he writes, is that they go against the “nature of humanistic work.” And their errant ways

derive from a false analogy between the humanities and the sciences. Humanistic thinking does not proceed by experiments that yield results; it is a matter of mental experiences, provoked by works of art and history, that expand the range of one’s understanding and sympathy. It makes no sense to accelerate the work of thinking by delegating it to a computer when it is precisely the experience of thought that constitutes the substance of a humanistic education. The humanities cannot take place in seconds. This is why the best humanistic scholarship is creative, more akin to poetry and fiction than to chemistry or physics: it draws not just on a body of knowledge, though knowledge is indispensable, but on a scholar’s imagination and sense of reality. Of course this work cannot be done in isolation, any more than a poem can be written in a private language. But just as writing a poem with a computer is no easier than writing one with a pen, so no computer can take on the human part of humanistic work, which is to feel and to think one’s way into different times, places, and minds.

Kirsch pits the technologically unadorned humanities that produce subjective experiences against the technology-dependent sciences that produce mere facts. This simple, and false,  dichotomy manages to slight both at once, and to obscure more than it clarifies.

In fact, this humanities-sciences dichotomy is relatively recent. And as it turns out, the humanities itself is a relatively recent term, seldom used before the nineteenth century. The OED lists the first use as 1855, in a reference to music as one of “the humanities.” Google’s Ngram keyword search shows a marked increase in the prevalence of the term around 1840, just as the natural and physical sciences were becoming ascendant in universities.

Today’s distinction between the digital humanities and the humanities proper has its longer history in these nineteenth-century divisions. For well over a century now, one of the dominant notions of the humanities, in the academy at least, has been one that casts them squarely against the natural sciences. And this conception of the humanities, which has since gained wider influence in the culture at large, was first articulated by the late nineteenth-century German scholar Wilhelm Dilthey. Dilthey distinguished the human sciences [Geisteswissenschaften] from the “natural” sciences [Naturwissenschaften].

The Geisteswissenschaften, claimed Dilthey, studied the inner workings of mental facts – that is, the internal processes of human experience (Kirsch’s beloved mental experiences). For this internal realm, the freedom and autonomy of the subject were central and, thus, the primary objects of inquiry.

The natural sciences, by contrast, studied material processes governed by natural laws and the mechanisms of cause and effect. For Dilthey, humanities scholars don’t count, measure, or seek patterns; they seek to understand what motivates canonical historical figures who produce works of art and other artifacts of culture (Kirsch’s struggle to understand, not just explain, Auerbach’s Mimesis, for example). The human sciences explain phenomena from within, the natural sciences from without.

Dilthey’s efforts to distinguish sharply between the two forms of inquiry were in large part meant to resist the rising influence of the natural sciences in nineteenth-century German universities and, above all, the influence of positivism: the notion that we can have knowledge only of phenomena (the only possible knowledge is of an endless series of facts). Like Dilthey’s, Kirsch’s embrace of a very particular and limited notion of the humanities is reactionary. But whereas Dilthey feared the pervasive and corrosive effects of positivism, Kirsch fears the utopian delusions of technological solutionism.

These simple oppositions—the humanities versus the sciences—confuse more than they enlighten and, in a timeless irony, produce deeply anti-humanistic polemics. They also ignore the historical fact that the humanities, and humanistic inquiry more broadly, have been concerned not only with particular human artifacts (one painting, one poem, one piece of music) but also, as the Dutch scholar Rens Bod recently put it, with the patterns and principles that help us make sense of and enjoy those artifacts.

The kinds of things that humanists actually do when they engage products of human creativity, their practices, have always been bound up with efforts to make connections and generalize. From Lorenzo Valla’s careful and methodical debunking of the Donatio Constantini (On the Donation of Constantine) in 1440 to Erich Auerbach’s Mimesis in 1946, humanists of all kinds have relied on particular notions of method, evidence, verification, and argument, just as the natural and physical sciences have relied on intuition and creativity.

We need a history and vision of the humanities capacious enough to see the humanities not as a particular method or set of disciplines but as a disposition, as a way of engaging the world. What follows in subsequent posts is a series of short, polemical (in the best sense, I hope) steps toward such a history.


. . . . . . . .


Semi-Pro—On the New Work-Study Conundrum

With the recent NCAA ruling in the Northwestern football case—in which college players were deemed eligible to unionize—the question of work on campus has reared its ugly head once again. However much the ruling was more narrowly about the behemoth that is NCAA sports, it was yet another sign of the increasing professionalization of all things academic. Even those students who were thought to be playing games are working. Whether it’s graduate students or student-athletes, there is a lot more work going on on campus than people had previously thought. In the good old days, work was something you did on the side while you studied. Now, it might be the only thing you do.


In response to the decision, voices from inside the university were familiarly hypocritical. There’s just too much money at stake not to preserve the myth of the amateur or the apprentice. (In fact, it’s appalling how much money is in play in a system that still largely uses medieval guild models to understand itself.) Voices in the press, on the other hand, were typically nostalgic. Wouldn’t it be nice if this weren’t the case? Isn’t there some way of restoring the work-study balance so we can imagine real student-athletes and not financially indebted apprentices? Remember Greece?

Rather than banish the idea of work like some unclean vermin out of a Kafka story, we should be taking the opportunity to look it more squarely in the face, and not just in the world of college sports. Work is everywhere on campus today. It’s time we accepted this fact and started rethinking higher education accordingly—starting with that most hallowed distinction between working and studying that informs almost everything else we do.

Conflating work and study might seem to many to bring together categories that are traditionally seen as opposites. That’s why we have two words for them after all. But they’re also supposed to be sequentially related. You don’t just do one instead of the other. You are supposed to move from the one to the other.

Imagining work as part of study rather than its opposite or its aftermath challenges this sequence, especially our assumptions about how well it works. We largely operate with what we could call a mimetic theory of learning, in which students imitate, usually poorly, “real” work, after which at some point they presumably stop simulating and start producing. No one knows when this is supposed to happen or how. The failure rates in graduate school, or the difficulties most undergraduates face in moving from school to jobs, testify to how little this process results in putting people to work. And yet we insist on keeping “work” at bay.

My point is that the recent influx of work into the academy might in fact be good for learning. With all the anxiety about business models entering into the university, whether it’s students who mostly just play sports, graduate students who spend a lot of time teaching, or faculty who are subject to the market demand of student-consumers, the thing we haven’t considered is the act of learning and the vast majority of students who engage in it. What happens to them when learning looks a lot more like work?

In the humanities, to take the example I’m most familiar with, when students write papers for a class they are writing papers that are like good papers, but usually are not. There is a theory of learning-by-simulation that governs the practice. (I’ll ignore exams since we all know those are just compromises with the devil of overpopulation.) Students’ papers, and in most cases the grown-up ones they are supposed to emulate, are socially worthless. They don’t produce anything. They are poor copies, in Plato’s sense. Students may or may not drop by to pick them up at the end of the semester and, if they don’t, we throw them in the trash. Yes, that’s true, but telling.

But what about the growth! say the pundits. How else are students supposed to reach their full, expressive intellectual capacities without these exercises? Yes, it is like exercise. You practice to simulate the real game. It’s not like those kids playing basketball in sixth grade aren’t imitating their grown-up peers—that’s what learning is.

But here’s the difference. What the NCAA ruling shows us is that by the time of higher education, these students are no longer imitating. It is work in its own right and is being compensated accordingly. (Well, at least the coaches and administrators are being compensated.) Why not have the same expectations for our students? And certainly for graduate students, who are way past the sell-by date of learning by imitating? Why keep alive the pretense of simulation when in practice there is all sorts of work going on and potential for even more of it?

Because we’ve been doing everything on a simulator deck for so long, it’s hard to imagine what this might look like. But try to picture more assignments and activities that aim to solve real-world problems, even at the simplest of levels. Wikipedia pages are an obvious first step. Why not teach composition using this public space? Why not make new knowledge in the process of honing your writing skills and creating a suite of pages around an overlooked cultural feature? Along the way, students would also learn what it means to edit, a skill most of them very much need.

It would also teach them how to relate to other people’s writing. As we move more and more from a book-based world to a digital infrastructure, why not teach students how to contribute to that process? I don’t mean have them transcribe OCR instead of talking in class (too Matrix-y?). But certainly encoding texts and tagging them with more information that can be used by others are activities that require a good deal of thought and critical judgement. At the same time, this information is useful to others. Think of the boon to historians to have more information available in digital archives. Perhaps students might apply their new-found knowledge about the novel (Jane Austen, no doubt) by writing an algorithm that looks for something textually interesting in the myriad pockets and corners of the Internet. There is so much to analyze out there, why not do it rather than pretend to? I’m imagining something like the Maker Movement, though less artisanal in spirit and more analytical and applied.

The detractors will, of course, have seizures about all of this—they will say it overlooks the disinterested nature of learning and undoubtedly marks the end of the life of the mind. My colleagues in the humanities might argue that it smacks of a science system that prioritizes laboratory doing over seminar simulation. (The sciences have been doing this for a long time.) Prepping for cubicle life is no way to think about Aristotle, they would insist—whither the critical thinkers? Once those ivory towers are opened to the world of work, there will be no end to it.

This is where I think the either/ors have it wrong. First, I’m not a big fan of pretending. We can pretend that work hasn’t infiltrated campus, but that doesn’t make it so. For those on the outside looking in, like the judicial system, we are in a serious state of denial. Ostrich-politics, as they say in Germany, isn’t a very effective or attractive technique (not least because of one’s embarrassing posture and the rear-end exposure).

Second, and I think more important, we can do this in ways that are intellectually valuable. We need to get over our fear of the simple—simple things, when brought together over time, add up to much more. Witness Wikipedia. You can imagine any number of smaller-scale forms of student-curated information. In fact, this might be a very useful way to frame a new kind of work-study stance. Instead of apprentices simulating their superiors, students are curators making and sharing knowledge simultaneously at different levels of scale.

I think the most compelling reason is also the least appetizing: in the end we must. Higher education has simply become too expensive to justify itself as a pasture for the world’s elite. Or to put it another way, that’s all it is unless we do something differently. It’s time to get to work.

. . . . . . . .


Frank Gehry and the Enigma of American Monumentalism

Pity Frank Gehry? The great architect of the anti-monumental, so wily and free and unpredictable in form, with so little obvious care for function, got outflanked by the Eisenhower Memorial complex. Pity Frank Gehry? Heroic in his anti-heroic innovations, replete with rejections of the old-world style, he could not overcome the dignified decorum of D.C.’s National Mall. Though he, as much as anybody, seems to understand the modernist enigma that was President Eisenhower, he could not solve the enigma of the American national memory.

In case you hadn’t heard (and it is quite likely you hadn’t), Gehry’s design of the Eisenhower Memorial for the National Mall—unanimously selected by the Eisenhower Memorial Commission in 2010—was rejected in April by the National Capital Planning Commission, and this after Congress refused to fund the project in January.

Since the concept was first presented, Gehry’s memorial design has been the target of fiery criticism by an assortment of people, from columnists to politicians to architects to historians to members of the Eisenhower family. The objections have been manifold, but typically have revolved around three issues: the design overplays Eisenhower’s childhood roots in Kansas relative to his wartime heroism; its scale and size are too grandiose; and its style is “modernist.”


Gehry’s design for the Eisenhower Memorial was selected by the Eisenhower Memorial Commission in 2010. (Credit: Eisenhower Memorial Commission)

Susan Eisenhower, the former president’s granddaughter, led the charge against the focus on Kansas from the early days of the design’s public release. “Memorials in Washington speak for the nation as a whole,” she was quoted as saying. “The nation is not grateful because he grew up in rural America. He defeated Nazi Germany. He led us through tumultuous times in war and peace.” Of course, Kansans would differ!

Then there’s the size issue. The New Yorker’s Jeffrey Frank wrote:

From the start, something has been a little off with this undertaking, beginning with its preposterous scale. Dwight D. Eisenhower was certainly not a modest man; he had his share of vanities. Even late in life, he was very much the incarnation of a five-star general. But he was not someone who showed off. He didn’t swan around with his medals when he was a soldier, and, as President, he was willing to let others take the credit (as well as the blame) for Administration actions.

That Gehry’s design would require a bloated budget only magnified the complaint. As Frank pointed out, again in The New Yorker, “The revised design has more man than boy, but the memorial, which so far has cost more than sixty-two million dollars—a sum that would have appalled the fiscally austere Eisenhower—has still produced little more than acrimony.”

With regard to its “modernist” style, Roger Lewis, professor emeritus of architecture at the University of Maryland, wrote in The Washington Post:

Among the most strident complainers have been those condemning the aesthetic style of the memorial. Generally disliking architectural modernism, they strongly prefer a classically inspired memorial design. These critics argue that civic structures employing a modernist or avant-garde vocabulary just don’t belong in the nation’s capital, where so many classically derivative government edifices, museums and monuments have been built. To them, selecting Gehry was a colossal error.

The protests against Gehry’s style grew so strident as to produce a full-fledged “Toward a New Eisenhower Memorial” campaign under the auspices of the National Civic Art Society, a group advocating “a traditional artistic counterculture” to oppose “a postmodern, elitist culture that has reduced its works of ‘art’ to a dependence on rarified discourse incomprehensible to ordinary people.” The group has, among other things, come near to conspiratorial sensationalism,  suggesting the memorial commission fixed the design competition, and offering “rarely seen” and “shocking” photos of the “tapestry” Gehry would include in his design. Gehry, it is clear from the group’s website, is a threat to the nation: on it, they pit “Frank Gehry in His Own Words” (e.g. “I’m confused as to what’s ugly and what’s pretty”) against  “Eisenhower in His Own Words” (e.g. “What has happened to our concept of beauty and decency and morality?”).

The “Toward a New Eisenhower Memorial” campaign, whose motto is “We’d like what Ike would’ve liked,” has held a competition for an alternative design, offering neoclassical columns, arches, and even an obelisk to properly memorialize the late president. Unfortunately, they would not grant me permission to show one of the countermemorial designs in this post. Justin Shubow, the president of the organization, told me that the National Civic Art Society owns exclusive rights to the image I wanted to post (a neo-classical arch with “Eisenhower” inscribed across the top), and that they would not permit me to display it on The Infernal Machine. He further explained that the competition was run on a small budget, and that many of the entries were by students, seemingly attempting to explain the less-than-stellar quality of the designs. To see some of the countermemorial designs, including the one I wanted to show, you can go here, here, and here.

Oh boy, Gehry (and more than one of his employees) must be thinking. And rightly so. For regardless of his taste, Eisenhower was without doubt the first “modernist” U.S. president, the architecton of missiles, missile silos, massive interstate systems, efficient underground backyard nuclear bomb shelters, consumer capitalism, space monkeys, and so on. There is, even today, a hollowed-out mountain out West, built in the 1950s with great effort and at great expense, in which sits, at the end of a labyrinth of tunnels, the “Eisenhower bedroom.” It’s one of many places the president could sleep in relative peace in case of a nuclear war. Now that’s what I would call preposterous scale! That, we might even say, is modernism!

At the same time, in the spirit of modernist enigma, Eisenhower more than any other president in modern history preached the incapacity of matter and form to adequately symbolize ideals. “No monument of stone,” General Eisenhower declared after World War II regarding the soldiers who had died, “no memorial of whatever magnitude could so well express our respect and veneration for their sacrifice as would perpetuation of the spirit of comradeship in which they died.” As president, Eisenhower was guided by the idea that America and the rest of the world needed to get beyond old European ways, which had produced disaster after disaster, and enter into a new age of spirit.

This spirit, moreover, was the spirit of the future, not the past. As Eisenhower declared at the Metropolitan Museum of Art in 1946 of the United States, “Now we enter on an era of widened opportunity for physical and spiritual development, united in a determination to establish and maintain a peace in which the creative and expressive instincts of our people may flourish.” Two years later, 1948, the abstract painter Barnett Newman sounded the same theme in “The Sublime is Now”: “We are reasserting man’s natural desire for the exalted, for a concern with our relationship to the absolute emotions. We do not need the obsolete props of an outmoded and antiquated legend. . . We are freeing ourselves of the impediments of memory, association, nostalgia, legend, myth, or what have you, that have been the devices of Western European painting.”

To be sure, Eisenhower more than once expressed his distaste for, and his inability to understand, modernist art. But it was Eisenhower’s C.I.A. that propagated modernist art across the globe as a distinctly liberal and American art form, and the president’s basic ideas about America were a kind of optimistic version of avant-garde ideas: productivity, spirit, opportunity, freedom, faith . . . these were the watchwords of the Eisenhower presidency. “Tradition” was not.

But what of General Eisenhower, the old-style hero of D-Day? When Eisenhower decided to run for president he put his general’s uniform in the closet and assumed, instead, the mantle of the “ordinary” American, the man from Abilene, the main subject of Gehry’s original design. Hyper-aware of the disastrous militarism of Europe, President Eisenhower constantly tried to deflect attention from his own heroic persona (by purposefully fumbling through press conferences and making regular references to golf, among other things). He, instead, in an ongoing polemic against FDR and other “iconic” leaders (read, Hitler!), tried to direct national attention away from the monumental achievements of great men like himself and toward the greatness of American ideals, ideals that, like the spirit of the sacrifice of American soldiers in World War II, could not be adequately represented by monumental leaders or colossal obelisks. That missiles remain one of the great colossal leftovers of the Eisenhower era, and that Eisenhower remained, even after putting his general’s uniform in the closet, the most military-minded of all twentieth-century American presidents (helping hold the world hostage to a nuclear arms race, fighting covert wars through the C.I.A., and building the military-industrial complex that he would warn against in his famous Farewell Address), well, this is the modernist enigma that was Eisenhower.

It’s hard to memorialize an enigma. That’s what Frank Gehry has been trying to do. It’s the sort of challenge artists like Gehry love to tackle; it’s a task well suited to a (post)modernist. But Gehry has run up against an even more powerful enigma, that of American national monumentalism, which has repeatedly returned to “traditional” neoclassical forms of old-world Europe to give expression to the sublime spirit of the new world. So go ahead and pity Frank Gehry.

. . . . . . . .
