All for the Now—or the World Well Lost?

Costume designs for the libation bearers, London Globe production of “Oresteia,” 2015.

Riding with my middle-school daughter a while back, I heard one of her favorite pop songs on the radio and called it to her attention; she sighed and said, “Oh, Mom, that’s so two minutes ago.” Apparently, the song was popular last year and, therefore, no longer qualifies as one of her current favorites. Then, she added, in exasperation, that the expression itself was outdated. It was popular a year ago, back when she and her friends used to parody the lingo of certain “popular, mean girls” from film or TV. So, a year ago is now “old”? Or two minutes? The whole conversation made me wonder: What kind of a sense of the past do our children grow up with today, and how does it shape our attitudes toward history?

That question emerged in a different way when my son started high school this year. As an academic observing his orientation, I was keenly interested in this introduction to the curriculum. Of all the things I learned, however, the most surprising was that his curriculum requires only one year of history to graduate. Three and a half years of physical education are required. Three to four years of English are essential, as are three years of math. But students at my son’s school can graduate with only one year of history, and US history at that. Even in his first-year English course, where students are required to read only three literary works during the entire academic year, two of the three were written in the last sixty years. In other words, there’s not much distant history in his English curriculum either.

This also squares with trends at the small liberal arts college where I teach. Enrollment in history courses is down. The history department’s faculty allocation has recently been cut. Even in the English department, where enrollment numbers are strong this year, our historically oriented Renaissance literature line is being suspended due to budgetary adjustments, no doubt to make way for faculty positions in programs like biochemistry, molecular biology, and business. What this means is that my department will soon be without a permanent member who specializes in the period of the greatest flowering of literature in English.

And this dearth of expertise in the historical humanities is evident across the college. When I count the total number of pre-nineteenth-century historical humanities positions at my college, considering fields such as art history, philosophy, theater, and religion, I find that only five percent of all full-time, permanent faculty members have expertise in such areas.

Is it any wonder then that young people often have a limited sense of the past, unable to place even watershed events such as the Protestant Reformation or the French Revolution or identify major historical time periods? Not long ago, my son returned home from middle school to boast that, unlike his peers who were hooked on techno-pop, he’d suddenly become interested in “more medieval music”—“You know, Mom, like Simon and Garfunkel, the Beatles, ELO.” I’ll give him a pass for being only twelve at the time, but I’d suggest that this historical illiteracy is more common—and more costly—than we might think.

Why should teaching the past matter? It matters because teaching any pre-modern culture exposes students to ways of being that may be alien to them, a form of ontological diversity just as important as the more familiar kinds we hear so much about today. Many years ago, in a lecture at my college, the classicist Danielle Allen argued that education is fundamentally about knowing the foreign. I share that conviction and, in my own courses, daily ask students to explore the foreign battlefields of Homeric Troy or to inhabit the psychological terrain of Augustine. Both the Iliad and the Confessions offer examples of imaginative mindscapes as foreign to many students as any far-flung land they might visit on a study-abroad trip. And such foreign intellectual encounters, so familiar in early literature and history courses, help students cultivate virtues such as empathy and tolerance.

Tracing the decline and fall of the Roman Empire, distant as it may be, reveals the dangers of overreaching imperial powers, the perils of resources stretched thin, and the consequences of growing economic disparities—none of which are problems confined only to the ancient world. As the historian Timothy Snyder observes in his brief wonder of a book On Tyranny, “Americans today are no wiser than the Europeans who saw democracy yield to fascism, Nazism, or communism in the twentieth century. Our one advantage is that we might learn from their experience.”

Although Aeschylus’s Oresteia brilliantly dramatizes the triumph of democratic processes of justice over vendetta-style retribution, it also displays the pernicious roots of patriarchy, with the Olympian gods themselves legitimizing male rule over female, as Apollo exculpates Orestes by claiming that the mother isn’t really the parent, only the seed bed, while Athena chimes in, professing to “stand by the man” despite being a woman. Likewise, Shakespeare’s The Merchant of Venice, a comedy that turns on an act of mercy, also illuminates darker themes such as anti-Semitism and ethnic stereotyping.

History also teaches us that the pursuit of knowledge is often a digressive process. Unlike the natural sciences, where knowledge and learning are generally linear—experimentation and research yielding new insights that replace previous conclusions—humanistic knowledge proceeds haltingly. In the natural sciences, one often draws the conclusion that new knowledge is better than old knowledge. In the humanities, we value the ancient, the antique, the quaint, and the outmoded, all in the interest of thickening and enriching our understanding of human life.

While much of that life has involved regrettable episodes, history reminds us of what it means to be questing and creative and to transcend the limits of our human predicament, as Julian of Norwich or Galileo or Mary Rowlandson once did. Studying the past has been shown to ease the feelings of isolation that many young people in contemporary America report as their greatest fear. Further, today’s younger generation may learn resilience, courage, and fortitude through imaginative engagement with the people of the past.

I have been haunted by the lines from a poem I recently read in a book, Cruel Futures by Carmen Giménez Smith, that playfully extols “Disorder in exchange/for embracing the now.” Although Smith’s short poem vindicates that disorder by focusing on personal rather than collective, historical knowledge, those lines have left me wondering about the public implications of such an “exchange.” When, as a society, we “embrace the now” at the expense of the past, what sort of disorderly deal might we be making? I’m thinking here of, for example, the generally low level of civic participation in the United States. Might this indicate that we have become complacent about our history, forgetting the arduous efforts of a small group of patriots and visionaries, preferring instead the promises of Silicon Valley entrepreneurs and charismatic “thought leaders”?

In the academic world where I work, I often hear “That is the way of the past; this is the way of the future,” as if the past were to be regarded as a mere discarded image, disconnected from the priorities of the omnipresent “now.” As educators, we ought to remain wary of such facile dismissals of the past and be vigilant in refuting this kind of chronological snobbery, to borrow a phrase from C.S. Lewis and Owen Barfield. The wisdom and well-being of our young people and our civilization depend on historical knowledge. Otherwise, we may one day find ourselves victims of a “cruel future,” one in which ignorance of past problems condemns us to inevitable repetition of them, and where blindness about historical insights prevents us from seeing wiser paths forward.

Carla Arnell is associate professor of English and chair of the English Department at Lake Forest College.

. . . . . . . .


The Art of the Possible

Detail from “The Effects of Good and Bad Government,” Caleb Ives Bach (1985).

What is “reality”? One answer: If I punch the wall, I hurt myself; if I step out the window, I fall. These are the principles I can accommodate myself to or manipulate or (for a short, inglorious period) choose to defy for some doomed reason or another.

Another answer comes from the first: Reality sets the bounds of the possible, the terms of debate, the imaginative limits we need to work under. Thus for politics, that art of the possible, reality says that there are winners and losers, that on certain issues, maybe all issues, we’re dealing with a zero-sum game: your health or theirs, your safety or theirs, your children or theirs. There’s only so much space, so many chairs, so much goodwill to go around. Everybody’s hands are tied; no one is ever really responsible.

I’ll admit that, in this second sense, I find I’m tired of reality, a shifting and twisting declaration of what cannot be argued with or challenged, one that comes down to this: Things are as good as they can be; they stand to get worse if you agitate about that fact too much; and perceived reality is the only reality worth discussing (if you feel your hands are tied, does it matter whether or not they are?). Leibniz proposes in his Theodicy that the best of all possible worlds requires some of us to do evil, to fail, and to struggle. In the grandest understanding of space and time, if all could be encompassed and understood, that might be true enough. Politically, however, it’s a little much to swallow.

. . . . . . . .


Introducing the Summer issue: Identities—What Are They Good for?

Identity is too much with us late and soon. It figures prominently in clashes over diversity, multiculturalism, political correctness, offensive speech, “deplorable” voters, and arrogant elites. In our overheated politics of recognition, “Check your privilege!” has become the rebuke of choice, aimed at silencing the opinions of those whose obliviousness to their entitlement is itself a giveaway of their advantaged social status. Those so accused—cisgender white males being prime suspects—in turn accuse their critics of playing identity politics to curtail free speech.

Identities are multiform, of course. Some are given or imposed, and some are elected. Some are acquired, while some are discarded. Some have to do with skin color; others, with ethnicity or religion, region or nation, gender or age, class or profession, disability or differing ability. Identities usually come in packages, and no matter how we assemble them, or how they are assembled for us, we are all, to use the current term of art, intersectional. We assume and wear our identities—in sum or part—proudly or shamefully, arrogantly or modestly. For some, identity explains much of who they are; for others, it explains very little and may even obscure who they believe they are.

Given its current importance, the struggle for recognition among our ever-proliferating identity groups might seem to be a peculiarly modern obsession. But even in the old regimes, with their static social hierarchies, the need for recognition was powerful. Recognition was pursued and attained largely on the field of honor, in daily efforts to fulfill the duties and obligations of one’s place in the divinely ordained social order.

As the old regimes were replaced by modern democratic states with growing social mobility, the concern with honor gave way to a new universalist politics that insisted upon dignity for all citizens, including equal rights and entitlements. But if the modern age did not give rise to the politics of recognition, it did give birth, as the philosopher Charles Taylor explains, to the “conditions in which the attempt to be recognized can fail.” It did so because, along with the new universalist politics, there arose a related but sometimes conflicting politics of difference, concerned precisely with winning recognition for one or more particular groups against the neglect, exploitation, or assimilationist pressures of the dominant group. The recurring collisions between these two modes of politics have produced some of the sharpest—and even the most violent—civil struggles within modern democratic states.

But the longevity and occasional ferocity of struggles arising from demands for equal rights, on one hand, and the recognition of difference, on the other, have brought relatively little light to the phenomenon of identity itself. How do we judge the adequacy, efficacy, or value of various forms of identity in our struggle to find not only equal rights and privileges but also meaning and community?

That is the question that animates the thematic essays of the present issue of The Hedgehog Review, and though the answers range widely, they collectively provide an entry point for a deeper, possibly less fraught discussion of what separates humanity into tribes (defined by what are often extremely fine distinctions) and what may yet bring us together in a more capacious humanism that embraces universalist principles while respecting and protecting differences. As the historian Jackson Lears wrote not long ago in the London Review of Books, “Identity politics in America was a tragic necessity. No one can deny the legitimacy or urgency of the need felt by women and minorities to have equality on their own terms, to reject the assumption that full participation in society required acceptance of the norms set by straight white males. Yet even as the public sphere grew more inclusive, the boundaries of permissible debate were narrowing.”

While Lears writes from the left and is largely concerned with the way our current form of identity politics has displaced a concern with class and economic equality, voices of the right and center have joined him in criticizing this coercive narrowing of political debate. (See, for example, Walter Benn Michaels’s The Trouble with Diversity, Mark Lilla’s The Once and Future Liberal, Asad Haider’s Mistaken Identity, and Francis Fukuyama’s forthcoming Identity: The Demand for Dignity and the Politics of Resentment.) But escaping the grip of identity politics will require an honest reckoning with the historical and contemporary realities that continue to fuel the politics of difference, whether in the emergence of a new racism visible in soaring rates of African American incarceration or in the ever-accumulating incidents of male aggression against women. And, yes, we must also heed the identity-based grievances of those “angry white males” (and quite a few females) who came together in surprisingly wide support of an uncivil anti-politician promising to make America great again.

Of one thing we can be certain: Identity politics begets more identity politics. Any hope of overcoming that politics must begin with a willingness to listen to those who cleave to identity for the very solidarity and confidence that may free them, ironically, from the more limiting, indeed punitive, aspects of an identity. Are there more commodious forms of identity, including a rekindled and truly civic nationalism, that can bring not just tolerance but a sense of mutuality across some of the most politically heated identity divides? It is an irony—perhaps even a tragic one—that the only way out of the identity trap is through it. How we negotiate that irony is one of the distinctive challenges of our modern condition.

* * *

We will be releasing a select number of essays and reviews from this issue on a rolling basis during the coming weeks, starting with these three:

What Makes Me Black? What Makes You White? by W. Ralph Eubanks

In with the Out Crowd: Contrarians, Alone and Together by Steve Lagerfeld

Virtue Signaling by B.D. McClay

The entire issue, already on its way to subscribers, includes thematic contributions from Mary Townsend, Deirdre Nansen McCloskey, Benjamin Aldes Wurgaft, Phil Christman, S.D. Chrostowska, and James McWilliams along with standalone works by Witold Rybczynski, Becca Rothfeld, and Johann N. Neem as well as six book reviews. Browse the table of contents here and subscribe—if you haven’t yet—here.

. . . . . . . .


Minority Report: Futurologists Need Historians

The 1936 film “Things to Come” brought together H.G. Wells, producer Alexander Korda, and designer and director William Cameron Menzies for a prescient look at the future. (Cover art, Criterion Collection)


Does the name Archibald M. Low ring a bell? The next time you look at your smartphone or watch your television, think of the Professor. He had a role, both speculatively and in actual development, in numerous innovations during the first half of the twentieth century. Low (1888–1956) didn’t invent the lithium battery or wireless telephony, but he foresaw the concepts that serve as the basis for a host of devices, from pocket telephones to television to drones. A prolific author, he wrote more than 40 books on scientific discoveries designed to nurture the public’s interest in science and engineering. The uncredentialed Low, who adopted the moniker “Professor” much to the chagrin of his academic peers, was also an ardent futurist with a social conscience. For example, tormented by London’s noise level, he studied the Tube to find ways to ameliorate its clattering and clanking, hoping to encourage more people to try out this newfangled public transportation. He also believed that residential housing should be modernized and simplified, even be made movable. I admire his panache, his ability to popularize specialized knowledge, and his futurism tempered with social consciousness. His critics labeled him an eccentric and a hack, and they derided his penchant for publicity. But Low’s scientific work was serious: He is considered the father of radio guidance systems, designed a forced induction engine, and wrote about the field that would become known as astronautics. Like his close contemporary H.G. Wells, Low is considered a storyteller first and scientist second, indispensable qualities, one might argue, for a futurist.

Futurists are the topic of a new book, A History of the Future: Prophets of Progress from H.G. Wells to Isaac Asimov, by Peter Bowler, emeritus professor at Queen’s University in Belfast and a historian of science, whom I recently interviewed. Bowler’s book focuses on thinkers and writers in the decades around the turn of the century who invited the public into the laboratory. Many of these scientists and authors were, in essence, futurologists whose work revolutionized notions of progress and continues to mark our lives to this day. But, as Bowler reminds us, while prognosticating about the future can be liberating, it is also a cautionary tale. At the dawn of the twentieth century, the West viewed itself as triumphant, superior in culture, politics, economics, and science. The idea that Western society was destined to advance from strength to strength inspired pundits and prognosticators, especially those in science, to demonstrate their progressive nature to the people at home and abroad. (It would take a catastrophic war and depression to prove otherwise.) Bowler observes that the West, especially its futurologists, would have been better served by recalling their Shakespeare: “What’s past is prologue.”

In our interview, Bowler spoke about what he refers to as “this huge swell of suspicion and criticism concerning the applications of science.” He calls on the scientific community to remember its obligation not only to the field but also to the public: “I think it’s extremely important because without that sense of outreach, scientists risk allowing wild predictions to hold sway. They need to be invigorated to think about the wider applications of what they do. The best way of doing that is by encouraging them to enter into a dialogue with the people who are going to be affected by what they’re doing.”

I would suggest opening this dialogue by sending today’s new breed of futurologists copies of Bowler’s book. Among the first would be Elon Musk. Who else but the SpaceX maverick would have the wit—and the means—to send his own red Tesla into space with the radio playing David Bowie? His aerospace manufacturing and space transport firm has a list of firsts that has made it the envy of NASA. Musk subscribes to futurology, to be sure, but he would do well to read Bowler’s passage on the engineer and physicist Robert Goddard, who moved his lab to Roswell, New Mexico, to escape the liquid fuel explosion prohibitions in his native Massachusetts. (My grandmother, Edubijen Garcia, cooked for Goddard and his wife in the mid-1930s and once had to prepare New England-style baked cod for Goddard’s guests, Charles Lindbergh and Harry Guggenheim.) Goddard had friends in high places, but his work was still subjected to scorn and false reports, as when Russian newspapers reported in 1924 that a Goddard rocket had taken a man to the moon, a propaganda story ginned up to stimulate Soviet scientists. Chastened, Goddard tempered his dreams about space travel and even refused to join the American Rocketry Society; like his friend H.G. Wells, he had been stung by public misunderstanding and condemnation. Musk, and even NASA, might take note, realizing that grand schemes can result in spectacular (and expensive) public failures that call into question the wisdom of space exploration in a time of more pressing earthbound problems.

Another person to whom I’d send a copy of Bowler’s book might be Tom Silva, host of Ask This Old House (PBS), a sister show of the popular This Old House. Recently, Silva visited San Francisco’s Autodesk where designers use lasers and sophisticated milling tools to fabricate furniture pieces in wood and metal. Silva seemed impressed by what the laser cutter could do, but knowing of his accomplishments as a master craftsman and builder, I found his enthusiasm uncomfortable to watch. He may not be a futurist, but he is like many of us, straddling the fence between the past and the future. As Bowler notes, this tension over making things by automation or by human hands has plagued us for over a century and is unlikely to end any time soon. I am reminded of Buckminster Fuller’s caution that “humanity is acquiring all the right technology for all the wrong reasons.”

As a young man, Bowler told me, he had been an avowed technophile. Now in his seventies, he admits to being wary of social media and the Internet. “You have to ask,” he says, “is this particular technology or pursuit worth my time?” Or, more to the point: Is the Internet really worth it? “I get a sense people are using all this wonderful technology,” Bowler told me, “but they’re also getting enslaved by it because of what they have to do to keep up to date with all the stuff that’s being thrown at them.” Perhaps the historian has it right, especially in light of recent revelations that Facebook allowed a third party to mine its users’ data during the 2016 presidential election, news that caused its stock price to dive and led prominent users such as Elon Musk to delete their accounts in protest. Bowler seems to be voicing the growing concern that we have a moral obligation to consider our plunge into the digital abyss.

Today, with the entrance of private entrepreneurs into the field of space exploration—NASA recently announced that it would partner with Musk’s SpaceX to put an international space station in lunar orbit—the space race takes on a new urgency. Who knows what benefits might be enjoyed by future generations because one individual had the means and the vision to join forces with NASA? Thinkers like Bowler might not only applaud the desire for human flourishing, but also caution that the competing interests or diverging aims of those involved in such a partnership could spoil all the best intentions. In order to calculate fully the benefits and the risks, it might be time to consult a historian like Bowler.

J.N. Campbell is an independent scholar, writer, and editor in Houston, Texas. He is the co-author with Steven M. Rooney of A Time-Release History of the Opioid Epidemic (Springer), due out this summer. His email is campbelln5@yahoo.com.

. . . . . . . .


The Limits of Corporate Activism

Money to Burn, Victor Dubreuil

“The work that religion, government, and war have failed in must be done by business.” This proclamation appears in the 1928 book Business as Civilizer, written by an enthusiastic advertising executive, Earnest Elmo Calkins. Calkins saw in the success of early twentieth-century corporations signs of a new era commencing: America’s “business millennium.” It is not difficult to see the relevance of Calkins’s proclamation today. While Florida’s state legislature has now passed a gun control bill in the wake of the February 14 school shooting in Parkland, the most tangible preventative measures have so far been undertaken by businesses.

As Derek Thompson notes in The Atlantic, more than twenty companies have now cut ties with the National Rifle Association. Two national retailers—Walmart and Dick’s Sporting Goods—elected not to wait out new gun control legislation but announced they would voluntarily be ending gun sales to customers under 21. “We don’t want to be part of this story any longer,” explained Dick’s Sporting Goods CEO Edward Stack in an interview on CNN. Anti-gun activists have now turned their pressure toward new corporate targets, trying to get Apple, Amazon, and FedEx to break ties with the NRA.

These forms of corporate activism have become familiar political stories in the age of Trump. Established corporate leaders and Silicon Valley entrepreneurs have found themselves moving off the political sidelines and weighing in on highly contentious social issues. As several political commentators observed, President Trump’s economic advisory councils have lost far more members to principled and openly dissenting resignations than his “Evangelical advisory board,” which has remained remarkably intact. The prophetic moral authority previously resting on clergy—particularly amidst the Civil Rights Movement of the 1960s—may now rest on our corporate leaders.

Yet this isn’t completely new. Calkins, in Business as Civilizer, was already in 1928 cheering on corporations’ expansion of power into other realms. The efficiency of business, he argued, is sufficient evidence that the “despised business man” can be entrusted to do what government and other institutions have failed at. But this marriage of power, profits, and politics raised as many questions then as it does now. Who ultimately controls a politically “woke” corporate sector? And whose interests are ultimately served?

. . . . . . . .


Introducing the Spring Issue: The Human and the Digital

Person using laptop, overhead view. (Digital Composite)

Are we marching to Estonia?

It might seem so. According to Nathan Heller in the New Yorker, the small Baltic republic is well on its way to transforming itself “from a state to a digital society.” Under the aegis of e-Estonia, as the nation’s government-led project is called, virtually every service the state deals with, from education to health care to transportation, is being “digitally linked across one platform, wiring the nation.” Savings and efficiencies amounting to two percent of the country’s GDP have already been realized, and cutting-edge innovations, from driverless cars to an elaborately decentralized system of personal data, are changing the way 1.3 million Estonians (and some 28,000 registered e-residents) conduct business and lead their lives.

Whether you see it as utopia or dystopia, Estonia’s digitopia is where modern societies appear to be heading. Yet as the contributors to this issue ask, how well prepared are we humans for life under the ever-ramifying digital dispensation? Do we even begin to consider what we might be risking when we opt for, or succumb to, the ease, efficiency, and beguilements of online life?

The thread running through the essays in The Human and the Digital, our latest issue, is that we as yet poorly grasp the many perverse effects of the kind of dominion promised by our embrace of the new digital dispensation. To some degree, we are what we make. But when what we make makes us in ways that we fail to understand, the human at the core of culture grows dangerously fragile.

We will be releasing a select number of essays and reviews from this issue on a rolling basis during the coming weeks, starting with the following two:

The full issue, already on its way to subscribers, includes thematic contributions from Christine Rosen, Alan Jacobs, and Leif Weatherby, along with standalone works by Charlie Tyson, Jonathan D. Teubner, S.D. Chrostowska, and Greg Jackson. Browse the table of contents here, and subscribe—if you haven’t yet—here.

. . . . . . . .


Privilege

The abolition of privileges, at the Monument to the Republic, Paris. Via Wikimedia Commons.

We hear it said a lot these days: white privilege, male privilege, cisgender privilege. It suggests an advantage that is in some way illegitimate. The concept acquired greater sharpness for me recently while reading Simon Schama’s Citizens: A Chronicle of the French Revolution. Under the ancien régime, ennobled families were granted privilege in the literal sense; that is, they answered to a different set of laws (privus: private, leges: laws). In particular, they were exempt from taxation. Making matters worse, one could buy into this arrangement through the purchase of “venal offices,” which granted one the same immunities. One might become an inspector of cheeses, for example. It really was that ridiculous. Such positions proliferated as the fiscal crisis of the 1780s deepened; the sale of offices was a way for the crown to finance its present needs through the sacrifice of future tax revenue. Those who purchased offices were entered, along with their descendants, into the lists of noble families, permanently exempt from the tax burden.

Meanwhile, one of many forms of taxation that peasants were subject to was the corvée (literally, “drudgery”): The men of a village would be rounded up to perform some public works project such as the building of a road, and for whatever reason this tended to happen during the harvest, just when their labor was most needed at home. It was a bitter injustice.

Obviously, the whole system of privilege was parasitical. It was also quite different from what we mean today when we speak of privilege. According to current usage, it means something like good fortune. In a polemical discussion of education, for example, it will be said that a child who grows up with two parents is “privileged,” from which we are meant to infer that there is something illegitimate about the source of his relative calm and competence.

But it’s not as though such advantages make him a parasite on society. For us, the meaning of the term is reversed. If you are privileged, it means you are expected to contribute more, not less, than someone who is “underprivileged.” But at the same time, your being in a position to do so may be subject to the same resentment that was directed at the privileges of the ancien régime. From the perspective of eighteenth-century usage, it looks as though the point of recasting any advantage as “privilege” is to suggest that all inequality of condition is illegitimate, based on an underlying injustice.

But what this injustice consists of is usually not elaborated. If one presses for details (and this is already a breach of etiquette), the reasons offered are often tendentious. The term privilege is used not to make a case but rather to convey a mood. Why is there so much political opportunity to be had by deploying this mood as a weapon? What accounts for our susceptibility to being cowed by it, or indeed to indulging it ourselves, this fuzzy indignation? In particular, we need to account for the fact that accusations of privilege are most prominent among…well, the privileged. (For example, Ivy League students.) Hold that thought.

. . . . . . . .


Are Honor Codes Still Necessary?

Plaque commemorating the 150th anniversary of the University of Virginia honor system

HONOR PLEDGE
On my honor as a student, I have neither given nor received aid on this assignment/examination.

In the midst of the University of Virginia’s stately grounds, anchored by the Rotunda, Jefferson’s temple to the triumph of reason, it is easy to overlook the small plaques on the walls of UVA classrooms. The inscription they carry is central to Jefferson’s university: “On my honor as a student, I have neither given nor received aid on this assignment/examination.” Desks in the library carry notices with a shortened version of the statement: “On My Honor.”

Besides reminding students of the university’s Honor Code—and its enforcement body, the historic, student-run Honor System—notices like these serve a more elementary (and obvious) purpose: to encourage individuals to refrain from dishonest behavior. A number of studies have demonstrated that moral standards are not self-executing, and that even the most upright person benefits from reminders. In one experiment in 2008, subjects who recited the Ten Commandments before taking a test cheated significantly less. There may be a consensus that dishonesty is bad, but that is not enough; the standards must be kept top of mind.

Like so many other aspects of our society, the idea of teaching honor and integrity has undergone a change, especially in the area of higher education. Colleges have elected to address student cheating in a variety of ways. At Hamilton College, peer proctoring is the norm; at Haverford College, cheaters must apologize to the entire student body via email; at Middlebury College, touting the honor code is a way to sell the college experience to visiting parents but not something strongly enforced. UVA’s honor code is considered effective but, according to its detractors, excessively punitive (one strike and you’re out, also known as single sanction).

In fact, there have been calls to abandon university honor codes since at least the 1930s. According to a 1983 article in the New York Times, the Johns Hopkins administration could not make the code work and didn’t bother to replace it—an associate dean was quoted as saying “The old procedure just wasn’t working.”

A Culture of Omertà

But the new one isn’t working either. During my tenure at Johns Hopkins University, where I earned a PhD in mathematics and served as a teaching assistant in undergraduate math courses for three years, I witnessed a shocking array of dishonest acts. In one semester, more than half of my calculus students copied answers verbatim from the solution bank Chegg, a website that advertises “Homework Help.” (Ironically, Chegg was not helpful at all; in many cases, its solutions were wildly off.) During exams, I often saw eyes wandering onto neighbors’ papers, and I intercepted numerous smuggled notes. One student made six visits to the bathroom during a two-hour final exam.

More troubling yet was the unspoken culture of omertà. Every attempt I made to reform the culture in the math program was stymied and every complaint was rebuffed. My fellow graduate students opposed my plan to inform the authorities of suspected incidents of academic misconduct, citing a reluctance to “go around the professors.” The faculty members insisted on “settling the matter internally,” which apparently meant keeping the misconduct quiet. And when I wrote a letter to the dean—I did not demand punitive action; I only requested an investigation—I encountered resistance.

In one specific case, four students submitted homework containing a problem they had copied from Chegg. One of them admitted wrongdoing. The university’s dean overseeing student conduct declined to prosecute. In fact, this dean failed to include accounts of the incidents in school records, a blatant violation of university protocol, and, in the case of one of the four students, refused to take the case to trial despite that student’s prior offense. (University policy mandates adjudication with an ethics board in the case of recidivism.) The dean’s proffered reason was that the student had already suffered consequences and, amazingly, that not every answer had been copied. From the dean’s email: “the Professor…has offered [the student] a zero on the assignment given [the student] only used an online source for one question.” But the real reason seemed to be that “The student is a senior and ready to graduate.”

If my university had had an honor system, the tasks of gathering evidence, tracking down the accused, arranging face-to-face meetings, explaining the charges, selecting a sanction, and eliciting confessions would not have fallen to an inexperienced teaching assistant like me. Rather, these functions would have been handled by trained, unbiased groups with a host of resources available to them. Without an honor code, my university quite simply lacked the infrastructure to pursue complaints.

On Duty at the Panopticon

The UVA honor system arose after the death of professor John A.G. Davis in 1840. Davis was attempting to quell a disturbance on the Lawn when he was shot by a student. The incident was alarming to both students and faculty, and the honor code was introduced in 1842 to ease tensions. Later, after the Civil War, the code tended to serve Southern notions of gentlemanly honor, but it still remained student-enforced. UVA has changed much since Southern gentlemen were expelled for cheating at cards, but the honor code is still integral. As a recent student chair of the honor committee noted: “The honor system was first created at a time in our history when the University was small and homogenous, a long way from the large, diverse institution we have become. It’s a cornerstone of a place with a long, complicated and sometimes unsavory past. But…the honor system was not intended only for the age in which it was established. The truth is, the fundamental values that the honor system was founded to promote—integrity and trust—are more relevant today than ever.”

A common criticism of honor codes is that integrity ought to be a given. An explicit statement of morality should not be necessary. But UVA’s honor code—in fact, the honor code at any university—is predicated on the belief that students are essentially virtuous and that the mission of the university and the individual’s own flourishing are best served when accountability for misconduct rests not with professors but with peers.

Rather than begin by accepting students’ fundamental orientation toward virtuous behavior, Johns Hopkins cultivated an atmosphere of mutual distrust between professors and students, one ripe for rampant cheating. In fact, things devolved into a near-police state. Students were routinely required to show school photo-identification cards and to sign their names before handing in their tests. One fellow TA compared exam proctoring to duty in a Panopticon. The climate was harsh—almost militaristic—but even these drastic measures were ineffective. Students were never expected to acknowledge the simple premise that they were there to learn and that their work should be their own. And so cheating persisted.

“As confidence in our social institutions collapses,” UVA professor Chad Wellmon recently observed, “the university is, at least in theory, ideally suited to be a beacon of public discourse and democratic and intellectual ideals and virtues.” UVA English professor Michael Suarez says, “Honor calls us to be honorable to each other, not merely by not committing transgressions, but also by doing reverence to the other in our midst.” The answer, it seems, is that, yes, honor codes can offer a better model for moral formation in the modern university.

Benjamin Diamond earned his PhD in mathematics from Johns Hopkins University in 2017.

. . . . . . . .
