Tag Archives: history

All for the Now—or the World Well Lost?

Costume designs for the libation bearers, London Globe production of “Oresteia,” 2015.

Riding with my middle-school daughter a while back, I heard one of her favorite pop songs on the radio and called it to her attention; she sighed and said, “Oh, Mom, that’s so two minutes ago.” Apparently, the song was popular last year and, therefore, no longer qualifies as one of her current favorites. Then she added, in exasperation, that the expression itself was outdated. It was popular a year ago, back when she and her friends used to parody the lingo of certain “popular, mean girls” from film or TV. So, a year ago is now “old”? Or two minutes? The whole conversation made me wonder: What kind of sense of the past do our children grow up with today, and how does it shape our attitudes toward history?

That question emerged in a different way when my son started high school this year. As an academic observing his orientation, I was keenly interested in this introduction to the curriculum. Of all the things I learned, however, the most surprising was that his curriculum requires only one year of history to graduate. Three and a half years of physical education are required. Three to four years of English are essential, as are three years of math. But students at my son’s school can graduate with only one year of history, and US history at that. Even in his first-year English course, where students are required to read only three literary works during the entire academic year, two of the three were written in the last sixty years. In other words, there’s not much distant history in his English curriculum either.

This also squares with trends at the small liberal arts college where I teach. Enrollment in history courses is down. The history department’s faculty allocation has recently been cut. Even in the English department, where enrollment numbers are strong this year, our historically oriented Renaissance literature line is being suspended due to budgetary adjustments, no doubt to make way for faculty positions in programs like biochemistry, molecular biology, and business. This means that my department will soon be without a permanent member who specializes in the period of the greatest flowering of literature in English.

And this dearth of expertise in the historical humanities is evident across the College. When I count the total number of pre-nineteenth century historical humanities positions at my college, considering fields such as art history, philosophy, theater, and religion, I find that only five percent of all full-time, permanent faculty members have expertise in such areas.

Is it any wonder then that young people often have a limited sense of the past, unable to place even watershed events such as the Protestant Reformation or the French Revolution or identify major historical time periods? Not long ago, my son returned home from middle school to boast that, unlike his peers who were hooked on techno-pop, he’d suddenly become interested in “more medieval music”—“You know, Mom, like Simon and Garfunkel, the Beatles, ELO.” I’ll give him a pass for being only twelve at the time, but I’d suggest that this historical illiteracy is more common—and more costly—than we might think.

Why should teaching the past matter? It matters because teaching any pre-modern culture exposes students to ways of being that may be alien to them, a form of ontological diversity just as important as the more familiar kinds we hear so much about today. Many years ago, in a lecture at my college, the classicist Danielle Allen argued that education is fundamentally about knowing the foreign. Like Allen, I share that conviction and, in my own courses, daily ask students to explore the foreign battlefields of Homeric Troy or to inhabit the psychological terrain of Augustine. Both the Iliad and the Confessions offer examples of imaginative mindscapes as foreign to many students as any far-flung land they might visit on a study-abroad trip. And such foreign intellectual encounters, so familiar in early literature and history courses, help students cultivate virtues such as empathy and tolerance.

Tracing the decline and fall of the Roman Empire, distant as it may be, reveals the dangers of overreaching imperial powers, the perils of resources stretched thin, and the consequences of growing economic disparities—none of which are problems confined only to the ancient world. As the historian Timothy Snyder observes in his brief wonder of a book On Tyranny, “Americans today are no wiser than the Europeans who saw democracy yield to fascism, Nazism, or communism in the twentieth century. Our one advantage is that we might learn from their experience.”

Although Aeschylus’s Oresteia brilliantly dramatizes the triumph of democratic processes of justice over vendetta-style retribution, it also displays the pernicious roots of patriarchy, with the Olympian gods themselves legitimizing male rule over female, as Apollo exculpates Orestes by claiming that the mother isn’t really the parent, only the seed bed, while Athena chimes in, professing to “stand by the man” despite being a woman. Likewise, The Merchant of Venice, Shakespeare’s comedy that turns on an act of mercy, also illuminates darker themes such as anti-Semitism and ethnic stereotyping.

History also teaches us that the pursuit of knowledge is often a digressive process. Unlike the natural sciences, where learning is generally linear, with experimentation and research yielding new insights that replace previous conclusions, humanistic knowledge proceeds haltingly. In the natural sciences, one often concludes that new knowledge is better than old knowledge. In the humanities, we value the ancient, the antique, the quaint, and the outmoded, all in the interest of thickening and enriching our understanding of human life.

While much of that life has involved regrettable episodes, history reminds us of what it means to be questing and creative and to transcend the limits of our human predicament, as Julian of Norwich or Galileo or Mary Rowlandson once did. Studying the past can also ease the sense of isolation that many young people in contemporary America report as their greatest fear. Further, today’s younger generation may learn resilience, courage, and fortitude through imaginative engagement with the people of the past.

I have been haunted by lines from a poem I recently read in Carmen Giménez Smith’s Cruel Futures, a poem that playfully extols “Disorder in exchange/for embracing the now.” Although Giménez Smith’s short poem vindicates that disorder by focusing on personal rather than collective, historical knowledge, those lines have left me wondering about the public implications of such an “exchange.” When, as a society, we “embrace the now” at the expense of the past, what sort of disorderly deal might we be making? I’m thinking here of, for example, the generally low level of civic participation in the United States. Might this indicate that we have become complacent about our history, forgetting the arduous efforts of a small group of patriots and visionaries, preferring instead the promises of Silicon Valley entrepreneurs and charismatic “thought leaders”?

In the academic world where I work, I often hear “That is the way of the past; this is the way of the future,” as if the past were to be regarded as a mere discarded image, disconnected from the priorities of the omnipresent “now.” As educators, we ought to remain wary of such facile dismissals of the past and be vigilant in refuting this kind of chronological snobbery, to borrow a phrase from C.S. Lewis and Owen Barfield. The wisdom and well-being of our young people and our civilization depend on historical knowledge. Otherwise, we may one day find ourselves victims of a “cruel future,” one in which ignorance of past problems condemns us to inevitable repetition of them, and where blindness about historical insights prevents us from seeing wiser paths forward.

Carla Arnell is associate professor of English and chair of the English Department at Lake Forest College.

. . . . . . . .



Monumental Woes

Unite the Right rally attendees. Picture taken by author.

No matter how ready you think you are to see an actual Klansman, you aren’t. Not that the Klansman is easy to see. Standing on tiptoe several rows back in the crowd, I can glimpse some of the white robe, which is more than enough for me. Someone else tells me that when she got close enough to see she began to cry. It sounds dramatic, she adds, apologetically.

The Klansmen—around fifty of them—are here in Charlottesville on an early July Saturday to protest the imminent removal of statues of Robert E. Lee and Stonewall Jackson and the renaming of the respective parks that contained them from Lee to Emancipation and from Jackson to Justice.

For them, this event is a sign of their decline. Back in 1921, a few months before the statue of Jackson that’s overlooking this whole affair was unveiled, the local paper proudly announced that “the fiery cross, symbolic of the Isvisible [sic] Empire and of the unconquerable blood of America, cast an eerie sheen upon a legion of white robed Virginians as they stood upon hallowed ground and renewed the faith of their fathers.… The Ku Klux Klan has been organized in this city.” Their members were, as the article says, “Charlottesville’s leading business and professional men.”

But what is the Klan now? An image of itself, surely. These people aren’t community leaders by any stretch. At first glance, the entire struggle now is over images: statues, white hoods, and Confederate flags. Removing the statues is as symbolic as keeping them—a gesture toward Charlottesville’s black population that seems to fall just short of actual material aid. (In fact, though it hasn’t dominated the news, the city has also passed an equity package, which, among other things, has dedicated around four million dollars to developing the African American Heritage Center, public housing, and educational opportunities.)

Still, there’s an undeniable electric shock that comes from seeing a Klansman; the image has power. There’s something real here, you think. Those white robes still have power.

There’s something real here was precisely what I didn’t think about a month later, when I first started watching a live video of Unite the Right ralliers preparing to march across UVA grounds with torches. Unite the Right is here, like the Klan, to protest the removal of the monuments and to agitate for “white rights.”

If anything, I expected one of the fidgeting young men—maybe the one with a tiny swastika pin on his polo shirt—to ask himself, “What am I doing here?” and take off. The situation is undeniably comic. But as they continue to march with their risibly misappropriated bamboo Tiki torches chanting, “You will not replace us” (and, sometimes, “Jews will not replace us”), they quickly become less funny. When they surround the woman who is recording the video I’m watching and my screen goes black, they’re not funny at all.

The next day, many of the Unite the Right ralliers show up at Emancipation Park carrying little wooden shields. I snap a picture of one man with a shield that says DEUS VULT in one hand and a Confederate flag in the other. When the rally is ordered to disperse practically before it can even start, one rally attendee begins to yell at white counter-protestors: “Y’all are all hypocrites!” He makes eye contact with me as he says it. Given the other options on the table, there are worse things.

These people, too, don’t seem altogether real. More dangerous, to my eyes, are the private militia members who have come to the rally heavily armed and looking ready for combat. They view themselves, as one tells me, as the self-appointed keepers of the peace. But one of the kids behind a wooden shield is James Alex Fields, and in a few hours he’ll ram a car into a crowd of people on Charlottesville’s pedestrian mall, killing one counter-protestor, thirty-two-year-old Heather Heyer, and injuring nineteen others. It doesn’t get more real than that.

. . . . . . . .


Where Moth and Rust Doth Corrupt


Vinton Cerf, 2007. By Joi Ito (Flickr) [CC BY 2.0], via Wikimedia Commons.

“We are nonchalantly throwing all of our data,” said Dr. Vinton Cerf, a vice president of Google, in a recent address to the American Association for the Advancement of Science, “into what could become an information black hole without realizing it…. In our zeal to get excited about digitizing we digitize photographs thinking it’s going to make them last longer, and we might turn out to be wrong.” And if we do turn out to be wrong, Cerf continues, then we will probably leave few historical records behind us. Our excessively documented age will simply disappear.

Cerf’s comments might come as a surprise to some. When we express fears about our digital records, much of our concern focuses on permanence—the idea that, for instance, an unkind or embarrassing action could linger on and haunt us long beyond any reasonable amount of time. Looking at the recent hiring and firing of Ethan Czahor, Jeb Bush’s tech officer, many draw the obvious moral that it is unwise to tweet. “Everyone of Czahor’s generation and younger,” says Caitlin Dewey of The Washington Post, “will come with some kind of digital dirt, some ancient tweet to be deleted or beer-pong portrait to untag.” This is, of course, true.

But it’s also true that those tweets don’t stay accessible as long as we might think. Without requesting my archives or searching for specific terms, I can’t access my own tweets past a certain point (last I checked, mid-December). For those who use Twitter, as I sometimes do, as a way of taking notes, this discovery can be very unwelcome. The archive still exists, of course. I can get to it if I want. But as Cerf points out, it may be that in three or four years that archive, downloaded to my computer, will exist in a form I’m unable to open; and Twitter may no longer exist to provide it to me in a format that I can use.

These tweets are no great loss to anyone, of course, not even to me, and in their contents they do not even rise to the level of the mildly scandalous. All the same, it came as a shock to realize how many small notes of mine were functionally inaccessible to me. In fact, thinking into the future with our current technology is a difficult task in many respects. You can, if you so desire, write an email to the future, which is by itself a fairly draining spiritual exercise. But are you really so confident that you know the email address you’ll use in the future, let alone that you’ll be using email at all?


. . . . . . . .


The End(s) of History


Historia, Nikolaos Gyzis (1892). Wikimedia Commons.

Samuel Moyn, a professor of history and law at Harvard University, thinks history is in trouble, big trouble, and that its difficulties have been a long time in the making. Setting forth his reasons in a recent review-essay in The Nation, “The Bonfire of the Vanities,” he has started a minor skirmish in the larger debates over the parlous state of the humanities. Many of Moyn’s fellow historians have taken sharp issue with his argument, including Princeton University’s Anthony Grafton, whose mentor, the Italian historian Arnaldo Momigliano, represents what Moyn believes are the inveterate weaknesses of the discipline: namely, an excessive and antiquarian fealty to fact, and a discomfort with theory. Grafton, who has engaged in a fierce Facebook exchange with Moyn, hardly needs my support, but I too find reasons to quibble—and some reasons to disagree strongly—with Moyn’s attempted take-down.

First, the quibble: Is history—and let’s just leave it at historical work produced by professional academicians—really at such a low ebb? Moyn’s assessment, though he puts it in the words of the authors he is reviewing, is in the sweeping affirmative:

Today, historians worry that they have lost their audience, and their distress has made the search for the next trend seem especially pressing. At the beginning of her new book, Writing History in the Global Era, Lynn Hunt remarks that “history is in crisis” because it can no longer answer “the nagging question” of why history matters. David Armitage and Jo Guldi, in their History Manifesto, concur: in the face of today’s “bonfire of the humanities,” and a disastrous loss of interest in a topic in which the culture used to invest heavily (and in classes that students used to attend in droves), defining a new professional vocation is critical. History, so often viewed as a “luxury” or “indulgence,” needs to figure out how to “keep people awake at night,” as Simon Schama has said. Actually, the problem is worse: students today have endless diversions for the wee hours; the trouble for historians is keeping students awake during the day.

This professional anxiety may loom large for a certain part of the history professoriate—namely, academics who rely on theoretical trends to make up for their deficiencies in the craft. But does it pertain to the better historians of the last half-century who have had a large claim on the attention and interest of the educated public? Moyn names Simon Schama. But he could also name Gordon Wood and Joseph Ellis in American history, or Robert Darnton and Lynn Hunt in European history, or Peter Brown on late antiquity, or Roy Foster on modern Irish history.

As far as declining student interest goes, it afflicts other areas of the humanities as much as it does history. In addition to a broad flight to STEM and more practical business-related studies, a general erosion and shallowing of attention is at least as much the problem as the intellectual poverty of trend-sniffing historians who have run out of theories to rest their facts on. More significantly, though, Moyn’s attack aims far higher and deeper than at the mediocrities of the field.

. . . . . . . .


Compared to What?

Rows of people at the movies on their phones.

(credit: iStock)

Rutgers University professor Keith Hampton, profiled in a recent New York Times Magazine article, challenges the claims of fellow social scientists such as MIT’s Sherry Turkle that digital technologies are driving us apart:

Hampton found that, rather than isolating people, technology made them more connected. “It turns out the wired folk — they recognized like three times as many of their neighbors when asked,” Hampton said. Not only that, he said, they spoke with neighbors on the phone five times as often and attended more community events. Altogether, they were much more successful at addressing local problems, like speeding cars and a small spate of burglaries. They also used their Listserv to coordinate offline events, even sign-ups for a bowling league. Hampton was one of the first scholars to marshal evidence that the web might make people less atomized rather than more. Not only were people not opting out of bowling leagues — Robert Putnam’s famous metric for community engagement — for more screen time; they were also using their computers to opt in.

For Hampton, what debates and research about the effects of digital technologies on our lives so often lack is historical perspective.

“We’re really bad at looking back in time,” Hampton said, speaking of his fellow sociologists. “You overly idealize the past. It happens today when we talk about technology. We say: ‘Oh, technology, making us isolated. We’re disengaged.’ Compared to what? You know, this kind of idealized notion of what community and social interactions were like.” He crudely summarized his former M.I.T. colleague Sherry Turkle’s book “Alone Together.” “She said: ‘You know, today, people standing at a train station, they’re all talking on their cellphones. Public spaces aren’t communal anymore. No one interacts in public spaces.’ I’m like: ‘How do you know that? We don’t know that. Compared to what? Like, three years ago?’”

Although the merits of Hampton’s particular study can be debated, he makes an important point when he asks simply, “compared to what?” Those who make arguments about technology’s deleterious effects on our ability to converse with one another, to pay attention, or to read closely usually presume some way that we ought to talk to each other, that we ought to attend to a given object or event, or that we ought to read.

And maybe these critics are right; perhaps we ought to carry on in the ways they presume we should. But appealing to history to make these normative claims is a much trickier move. History is fraught and full of bad conversation, distraction, and poor readers.

. . . . . . . .
