Category Archives: Commentary

All for the Now—or the World Well Lost?

Costume designs for the libation bearers, London Globe production of “Oresteia,” 2015.

Riding with my middle-school daughter a while back, I heard one of her favorite pop songs on the radio and called it to her attention; she sighed and said, “Oh, Mom, that’s so two minutes ago.” Apparently, the song was popular last year and, therefore, no longer qualifies as one of her current favorites. Then she added, in exasperation, that the expression itself was outdated. It was popular a year ago, back when she and her friends used to parody the lingo of certain “popular, mean girls” from film or TV. So, a year ago is now “old”? Or two minutes? The whole conversation made me wonder: What kind of sense of the past do our children grow up with today, and how does it shape our attitudes toward history?

That question emerged in a different way when my son started high school this year. As an academic observing his orientation, I was keenly interested in this introduction to the curriculum. Of all the things I learned, however, the most surprising was that his curriculum requires only one year of history to graduate. Three and a half years of physical education are required. Three to four years of English are essential, as are three years of math. But students at my son’s school can graduate with only one year of history, and US history at that. Even in his first-year English course, where students are required to read only three literary works during the entire academic year, two of the three were written in the last sixty years. In other words, there’s not much distant history in his English curriculum either.

This also squares with trends at the small liberal arts college where I teach. Enrollment in history courses is down. The history department’s faculty allocation has recently been cut. Even in the English department, where enrollment numbers are strong this year, our historically oriented Renaissance literature line is being suspended due to budgetary adjustments, no doubt to make way for faculty positions in programs like biochemistry, molecular biology, and business. What this means is that my department will soon be without a permanent member who specializes in the period of the greatest flowering of literature in English.

And this dearth of expertise in the historical humanities is evident across the college. When I count the total number of pre-nineteenth-century historical humanities positions at my college, considering fields such as art history, philosophy, theater, and religion, I find that only five percent of all full-time, permanent faculty members have expertise in such areas.

Is it any wonder then that young people often have a limited sense of the past, unable to place even watershed events such as the Protestant Reformation or the French Revolution or identify major historical time periods? Not long ago, my son returned home from middle school to boast that, unlike his peers who were hooked on techno-pop, he’d suddenly become interested in “more medieval music”—“You know, Mom, like Simon and Garfunkel, the Beatles, ELO.” I’ll give him a pass for being only twelve at the time, but I’d suggest that this historical illiteracy is more common—and more costly—than we might think.

Why should teaching the past matter? It matters because teaching any pre-modern culture exposes students to ways of being that may be alien to them, a form of ontological diversity just as important as the more familiar kinds we hear so much about today. Many years ago, in a lecture at my college, the classicist Danielle Allen argued that education is fundamentally about knowing the foreign. I share Allen’s conviction and, in my own courses, daily ask students to explore the foreign battlefields of Homeric Troy or to inhabit the psychological terrain of Augustine. Both the Iliad and the Confessions offer examples of imaginative mindscapes as foreign to many students as any far-flung land they might visit on a study-abroad trip. And such foreign intellectual encounters, so familiar in early literature and history courses, help students cultivate virtues such as empathy and tolerance.

Tracing the decline and fall of the Roman Empire, distant as it may be, reveals the dangers of overreaching imperial powers, the perils of resources stretched thin, and the consequences of growing economic disparities—none of which are problems confined only to the ancient world. As the historian Timothy Snyder observes in his brief wonder of a book On Tyranny, “Americans today are no wiser than the Europeans who saw democracy yield to fascism, Nazism, or communism in the twentieth century. Our one advantage is that we might learn from their experience.”

Although Aeschylus’s Oresteia brilliantly dramatizes the triumph of democratic processes of justice over vendetta-style retribution, it also displays the pernicious roots of patriarchy, with the Olympian gods themselves legitimizing male rule over female: Apollo exculpates Orestes by claiming that the mother isn’t really the parent, only the seed bed, while Athena chimes in, professing to “stand by the man” despite being a woman. Likewise, Shakespeare’s The Merchant of Venice, a comedy that turns on an act of mercy, also illuminates darker themes such as anti-Semitism and ethnic stereotyping in its treatment of Shylock.

History also teaches us that the pursuit of knowledge is often a digressive process. Unlike the natural sciences, where knowledge and learning are generally linear, with experimentation and research yielding new insights that replace previous conclusions, humanistic knowledge proceeds haltingly. In the natural sciences, one often draws the conclusion that new knowledge is better than old knowledge. In the humanities, we value the ancient, the antique, the quaint, and the outmoded, all in the interest of thickening and enriching our understanding of human life.

While much of that life has involved regrettable episodes, history reminds us of what it means to be questing and creative and to transcend the limits of our human predicament, as Julian of Norwich or Galileo or Mary Rowlandson once did. Studying the past can also ease the feelings of isolation that many young people in contemporary America report as their greatest fear. Further, today’s younger generation may learn resilience, courage, and fortitude through imaginative engagement with the people of the past.

I have been haunted by the lines of a poem I recently read in Carmen Giménez Smith’s Cruel Futures, a poem that playfully extols “Disorder in exchange/for embracing the now.” Although Smith’s short poem vindicates that disorder by focusing on personal rather than collective, historical knowledge, those lines have left me wondering about the public implications of such an “exchange.” When, as a society, we “embrace the now” at the expense of the past, what sort of disorderly deal might we be making? I’m thinking here of, for example, the generally low level of civic participation in the United States. Might this indicate that we have become complacent about our history, forgetting the arduous efforts of a small group of patriots and visionaries, preferring instead the promises of Silicon Valley entrepreneurs and charismatic “thought leaders”?

In the academic world where I work, I often hear “That is the way of the past; this is the way of the future,” as if the past were to be regarded as a mere discarded image, disconnected from the priorities of the omnipresent “now.” As educators, we ought to remain wary of such facile dismissals of the past and be vigilant in refuting this kind of chronological snobbery, to borrow a phrase from C.S. Lewis and Owen Barfield. The wisdom and well-being of our young people and our civilization depend on historical knowledge. Otherwise, we may one day find ourselves victims of a “cruel future,” one in which ignorance of past problems condemns us to inevitable repetition of them, and where blindness about historical insights prevents us from seeing wiser paths forward.

Carla Arnell is associate professor of English and chair of the English Department at Lake Forest College.

. . . . . . . .



The Art of the Possible

Detail from “The Effects of Good and Bad Government,” Caleb Ives Bach (1985).

What is “reality”? One answer: If I punch the wall, I hurt myself; if I step out the window, I fall. These are the principles I can accommodate myself to, or manipulate, or (for a short, inglorious period) choose to defy for one doomed reason or another.

Another answer follows from the first: Reality sets the bounds of the possible, the terms of debate, the imaginative limits we need to work under. Thus for politics, that art of the possible, reality says that there are winners and losers, that on certain issues, maybe all issues, we’re dealing with a zero-sum game: your health or theirs, your safety or theirs, your children or theirs. There’s only so much space, so many chairs, so much goodwill to go around. Everybody’s hands are tied; no one is ever really responsible.

I’ll admit that, in this second sense, I find I’m tired of reality, that shifting and twisting declaration of what cannot be argued with or challenged. It comes down to this: things are as good as they can be; they stand to get worse if you agitate about that fact too much; and perceived reality is the only reality worth discussing (if you feel your hands are tied, does it matter whether or not they are?). Leibniz proposes in his Theodicy that the best of all possible worlds requires some of us to do evil, to fail, and to struggle. In the grandest understanding of space and time, if all could be encompassed and understood, that might be true enough. Politically, however, it’s a little much to swallow.

. . . . . . . .


The Limits of Corporate Activism

Money to Burn, Victor Dubreuil

“The work that religion, government, and war have failed in must be done by business.” This proclamation appears in the 1928 book Business the Civilizer, written by an enthusiastic advertising executive, Earnest Elmo Calkins. Calkins saw in the success of early-twentieth-century corporations signs of a new era commencing: America’s “business millennium.” It is not difficult to see the relevance of Calkins’s proclamation today. While Florida’s state legislature has now passed a gun control bill in the wake of the February 14 shooting at Marjory Stoneman Douglas High School in Parkland, the most tangible preventative measures have so far been undertaken by businesses.

As Derek Thompson notes in The Atlantic, more than twenty companies have now cut ties with the National Rifle Association. Two national retailers—Walmart and Dick’s Sporting Goods—elected not to wait for new gun control legislation, announcing that they would voluntarily end gun sales to customers under 21. “We don’t want to be part of this story any longer,” explained Dick’s Sporting Goods CEO Edward Stack in an interview on CNN. Anti-gun activists have now turned their pressure toward new corporate targets, trying to get Apple, Amazon, and FedEx to break ties with the NRA.

These forms of corporate activism have become familiar political stories in the age of Trump. Established corporate leaders and Silicon Valley entrepreneurs have found themselves moving off the political sidelines and weighing in on highly contentious social issues. As several political commentators observed, President Trump’s economic advisory councils have lost far more members to principled and openly dissenting resignations than his “Evangelical advisory board,” which has remained remarkably intact. The prophetic moral authority previously resting on clergy—particularly amidst the Civil Rights Movement of the 1960s—may now rest on our corporate leaders.

Yet this isn’t completely new. Calkins, in Business the Civilizer, was already in 1928 cheering on corporations’ expansion of power into other realms. The efficiency of business, he argued, is sufficient evidence that the “despised business man” can be entrusted to do what government and other institutions have failed at. But this marriage of power, profits, and politics raised as many questions then as it does now. Who ultimately controls a politically “woke” corporate sector? And whose interests are ultimately served?

. . . . . . . .


Are Honor Codes Still Necessary?

Plaque commemorating the 150th anniversary of the University of Virginia honor system

HONOR PLEDGE
On my honor as a student, I have neither given nor received aid on this assignment/examination.

In the midst of the University of Virginia’s stately grounds, anchored by the Rotunda, Jefferson’s temple to the triumph of reason, it is easy to overlook the small plaques on the walls of UVA classrooms. The inscription they carry is central to Jefferson’s university: “On my honor as a student, I have neither given nor received aid on this assignment/examination.” Desks in the library carry notices with a shortened version of the statement: “On My Honor.”

Besides reminding students of the university’s Honor Code—and its enforcement body, the historic, student-run Honor System—notices like these serve a more elementary (and obvious) purpose: to encourage individuals to refrain from dishonest behavior. A number of studies have demonstrated that moral standards are not self-executing and that even the most upright person benefits from reminders. In one 2008 experiment, subjects who were asked to recall the Ten Commandments before taking a test cheated significantly less. There may be a consensus that dishonesty is bad, but that is not enough; the standards must be kept top of mind.

Like so many other aspects of our society, the teaching of honor and integrity has changed, especially in higher education. Colleges have elected to address student cheating in a variety of ways. At Hamilton College, peer proctoring is the norm; at Haverford College, cheaters must apologize to the entire student body via email; at Middlebury College, touting the honor code is a way to sell the college experience to visiting parents but not something strongly enforced. UVA’s honor code is considered effective but, according to its detractors, excessively punitive (one strike and you’re out, the so-called single sanction).

In fact, there have been calls to abandon university honor codes since at least the 1930s. According to a 1983 article in the New York Times, the Johns Hopkins administration could not make the code work and didn’t bother to replace it—an associate dean was quoted as saying “The old procedure just wasn’t working.”

A Culture of Omertà

But the new one isn’t working either. During my tenure at Johns Hopkins University, where I earned a PhD in mathematics and served as a teaching assistant in undergraduate math courses for three years, I witnessed a shocking array of dishonest acts. In one semester, more than half of my calculus students copied answers verbatim from the solution bank Chegg, a website that advertises “Homework Help.” (Ironically, Chegg was not helpful at all; in many cases, its solutions were wildly off.) During exams, I often saw eyes wandering onto neighbors’ papers, and I intercepted numerous smuggled notes. One student made six visits to the bathroom during a two-hour final exam.

More troubling yet was the unspoken culture of omertà. Every attempt I made to reform the culture in the math program was stymied, and every complaint was rebuffed. My fellow graduate students opposed my plan to inform the authorities of suspected incidents of academic misconduct, citing a reluctance to “go around the professors.” The faculty members insisted on “settling the matter internally,” which apparently meant keeping the misconduct quiet. And when I wrote a letter to the dean—I did not demand punitive action; I only requested an investigation—I encountered resistance.

In one specific case, four students submitted homework containing a problem they had copied from Chegg. One of them admitted wrongdoing. The university’s dean overseeing student conduct declined to prosecute. In fact, this dean failed to include accounts of the incidents in school records, a blatant violation of university protocol, and, in the case of one of the four students involved, refused to take the case to trial despite that student’s prior offense. (University policy mandates adjudication with an ethics board in the case of recidivism.) The dean’s proffered reason was that the student had already suffered consequences and, amazingly, that not every answer had been copied. From the dean’s email: “the Professor…has offered [the student] a zero on the assignment given [the student] only used an online source for one question.” But the real reason seemed to be that “The student is a senior and ready to graduate.”

If my university had had an honor system, the tasks of gathering evidence, tracking down the accused, arranging face-to-face meetings, explaining the charges, selecting a sanction, and eliciting confessions would not have fallen to an inexperienced teaching assistant like me. Rather, these functions would have been handled by trained, unbiased groups with a host of resources available to them. Without an honor code, my university quite simply lacked the infrastructure to pursue complaints.

On Duty at the Panopticon

The UVA honor system arose after the death of professor John A.G. Davis in 1840. Davis was attempting to quell a disturbance on the Lawn when he was shot by a student. The incident was alarming to both students and faculty, and the honor code was introduced in 1842 to ease tensions. Later, after the Civil War, the code tended to serve Southern notions of gentlemanly honor, but it remained student-enforced. UVA has changed much since Southern gentlemen were expelled for cheating at cards, but the honor code is still integral. As a recent student chair of the honor committee noted: “The honor system was first created at a time in our history when the University was small and homogenous, a long way from the large, diverse institution we have become. It’s a cornerstone of a place with a long, complicated and sometimes unsavory past. But…the honor system was not intended only for the age in which it was established. The truth is, the fundamental values that the honor system was founded to promote—integrity and trust—are more relevant today than ever.”

A common criticism of honor codes is that integrity ought to be a given: An explicit statement of morality should not be necessary. But UVA’s honor code—in fact, the honor code at any university—is predicated on the belief that students are essentially virtuous and that the mission of the university and the individual’s own flourishing are best served when accountability for misconduct rests not with professors but with peers.

Rather than begin by accepting students’ fundamental orientation toward virtuous behavior, Johns Hopkins cultivated an atmosphere of mutual distrust between professors and students, one ripe for rampant cheating. In fact, things devolved into a near-police state. Students were routinely required to show school photo-identification cards and to sign their names before handing in their tests. One fellow TA compared exam proctoring to duty in a Panopticon. The climate was harsh—almost militaristic—but even these drastic measures were ineffective. Students were never expected to acknowledge the simple premise that they were there to learn and that their work should be their own. And so cheating persisted.

“As confidence in our social institutions collapses,” UVA professor Chad Wellmon recently observed, “the university is, at least in theory, ideally suited to be a beacon of public discourse and democratic and intellectual ideals and virtues.” UVA English professor Michael Suarez says, “Honor calls us to be honorable to each other, not merely by not committing transgressions, but also by doing reverence to the other in our midst.” The answer, it seems, is that yes, honor codes can offer a better model for moral formation in the modern university.

Benjamin Diamond earned his PhD in mathematics from Johns Hopkins University in 2017.

. . . . . . . .


Shame: An Argument for Preserving “Those” Monuments

Two of the Clark Mills equestrian statues of Andrew Jackson, Lafayette Park, Washington, DC (left) and Jackson Square, New Orleans (right); photos: Leann Davis Alspaugh


On July 4, around 8 am, the French Quarter was wild with heat. I walked up St. Peter’s and took a left on Bourbon, where street cleaners hosed off the previous evening’s bacchanalia of regret. At Canal, I went left and by the time I reached St. Charles my glasses were fogged with humidity. I crossed Poydras and went to Camp Street. From there, I went right and my pulse quickened, anticipating the famous absence I’d traveled here to witness. I was making this walk well after the press had left town and well before white supremacists terrorized Charlottesville, Virginia, to experience the empty plinth where a statue of General Robert E. Lee once stood.

But then my geography got rusty. I was expecting to see the conspicuous display of emptiness about two blocks straight ahead. My body tensed in anticipation. But crossing Andrew Higgins Street, I looked right to make sure all was clear, and it was in that nanosecond that I unexpectedly got a direct view of the nothingness that was indeed something and—a reaction I don’t typically have—I gasped.

The image moved me: Robert E. Lee, that icon of the Confederacy, that bronze statuesque symbol that once lorded several stories over New Orleans, was, after 132 years, gone, relegated (for now) to municipal storage. And there I stood, a white person who, by virtue of my whiteness, benefits daily from the legacy of slavery, and took in this poignantly empty column, feeling the power of history in a way I’d never before felt it.

Weeks earlier, with rare eloquence, Mitch Landrieu, the mayor of New Orleans, drove home the emotion in a remarkable speech. The Times-Picayune called it “one of the most honest speeches on race” delivered by “a white southern politician.” Landrieu, in the aftermath of the statue’s removal from Lee Circle, explained to a city that’s 62 percent black: “These statues are not just stone and metal. They are not just innocent remembrances of a benign history. These monuments purposefully celebrate a fictional, sanitized Confederacy; ignoring the death, ignoring the enslavement, and the terror that it actually stood for.” A lot of people said it, and I agreed with them: Amen.

And so there it was: a seamless convergence of media, morality, and message. The removal of a city’s offensive Confederate-themed statues, a speech that will be anthologized, the humility of a public figure, a frank look at the reality of racism, and now this eerie lone column, a stark and unifying exclamation point on a Southern landscape. And yet, in spite of myself, something in my gut told me that General Lee should have stayed.

The Problem with Jackson

Before leaving the French Quarter for Lee Circle, I spent a few moments in Jackson Square contemplating the lone statue of Andrew Jackson. As an historian, I knew Jackson fairly well. I knew he was a slaveholder. I knew he was a man who built his identity around killing Indians. I knew that his reputation as an ethnic cleanser helped fuel the push to remove him from the twenty-dollar bill.

Knowing all this, I wondered how this swaggering crusader for racial purity still sat lionized atop his rearing horse, tipping his hat to the city he saved at the Battle of New Orleans, the city that, as it purged its obvious symbols of the Confederacy, refused—as Landrieu did—to include in that purge a figure who helped make the Confederacy possible.

There’s no question that removing a Confederate-era statue—a monument put in place to remind blacks that they would never have equal rights—is a symbolic expression of justice. My own reaction to Lee’s absence proved it. But the persistence of Jackson led me to realize something was wrong. It made me wonder if there might be something too easy in the symbolism of Lee’s removal, an ease that exonerated white progressives from doing something far more challenging and consequential for the cause of racial justice than tearing down statues, spitting on them, and sending out virtue signals on Instagram.

After my Jackson-to-Lee walk, I met with Richard Marksbury at a coffee shop near Tulane University. Marksbury, sixty-six and white, is a cultural anthropologist who directs the university’s Asian Studies Program. Of all the arguments marshaled against the statue removals, Marksbury’s stood out for their rigor and the manner in which he delivered them—not as a caveat-generating academic, but as an activist affiliated with the all-volunteer Monumental Task Committee, a group founded in 1989 to “restore, repair, and forever maintain all the monuments located in the city.”

Marksbury’s case was this: The white citizenry of New Orleans agreed in 1884 to celebrate Robert E. Lee by erecting a monument to his legacy. Even if that choice was, in Landrieu’s words, on “the wrong side of history and humanity,” it was made without ambiguity by racists interested in furthering the myth of the Lost Cause. That fact alone—history left the monument there as a kind of primary source for us to interpret—legitimates its right to stay put. “If something is there for 130 years,” Marksbury said, “it’s just part of the landscape.”

I thought, no—not valid. The notion that a memorial should be preserved because, at some point in time, an empowered group of citizens deemed an evil ideology worthy of memorializing only seems reasonable if history is apolitical, unemotional, and entirely relegated to the past. But history is none of those things. Infused in the heated politics of daily life, history is what left me in shock in the shadow of Lee’s empty pedestal. History is what turned Charlottesville into a war zone. History burns those who get close.

But Marksbury, if only in an indirect way, had a point. He directed my attention to Audubon Park. There, he explained, “you will find a statue of the Buffalo Soldiers.” He said, “Do you know what those soldiers did to the Native Americans? They mutilated them. So, what about the feelings of Native Americans? If you’re going to take down Robert E. Lee, you’ve got to take down the Buffalo Soldiers.”

And as for Jackson, he noted that when Take ’Em Down Nola—the organization dedicated to removing New Orleans’s racially offensive monuments—demonstrated to have Jackson removed, they were absolutely right to do so. “Landrieu,” he said, “could have appealed to the emotions of the Native American community.” But he “remained silent.” It was a silence that kept ringing in my ears.

Sloppy History

Marksbury’s argument does not condemn the removal of Confederate-themed monuments. It condemns inconsistency. One can argue that the NOLA removals were history in the making and that, in time, the moral logic underscoring that approach would be applied equally to other symbols of racism—including Andrew Jackson and many others. That would be good (if extremely ambitious) history. But that’s not what was happening in New Orleans. The mayor and city council removed Lee and other Confederates while explicitly refusing to touch the image of Jackson. It was sloppy history.

Politicians can get away with that. But professional historians cannot. When I exchanged emails with Victoria Bynum, author of several books on the myth of the Lost Cause as well as The Free State of Jones: Mississippi’s Longest Civil War (which inspired a 2016 Hollywood movie), she was adamant that the public expression of history be scrupulously accurate and consistent. “I so fervently want the true history of the Civil War understood at the popular level,” she wrote. “And it saddens me that so many Americans, and not just Southerners, actually believe that the Civil War was not caused by slavery.”

Of course, she’s right. But was removing statues of Confederate generals the right way to achieve historical accuracy in public space? (Bynum, for the record, suggested the monuments go into a museum.) Again, it could be. If we honestly intended to take the logic underscoring Lee’s removal to its necessary extreme, then we might get on with the massive project of de-anthologizing the public landscape of all racist vestiges. Or, acknowledging the difficulty of consistency on this point, we might instead rethink the logic behind statue removals altogether.

From the Bottom Up

One transformation that has touched the entire historical profession over the past two generations is the idea that we should do history “from the bottom up.” What kind of history was done in New Orleans when the statues came down? In a sense, it was top down. You had a white man who, largely through his own initiative and the power of his position as mayor, swept historical markers from their pedestals. Landrieu’s speech was grand. But shouldn’t skepticism be stoked when a May 26, 2017, editorial predicts that “as Abraham Lincoln’s remarkable 1860 Cooper Union Speech about slavery propelled the little-known Illinois lawyer toward the Republican Party’s presidential nomination, so might Landrieu’s Gallier Hall speech prompt Democrats to give the Louisiana mayor a closer look”? We should ask: Who tangibly benefits when Lee goes missing and General Jackson—of the Battle of New Orleans fame—stays put?

Three other Confederate monuments also came down around the time of the Lee statue removal, leading some lesser-known citizens to attempt a bottom-up approach of their own. News reports called their behavior criminal acts of vandalism. But one might more charitably label them interpretations of public history made by the disenfranchised. At the base of the Robert E. Lee monument, someone spray-painted the phrase “white supremacy is a LIE” in sharp black letters. There we go, I thought.

Such a brutally accurate interpretation—obviously illegal and, if allowed to run amok, pointless—was, in its singularity of expression and incisive moral commentary, a far greater challenge to the myth of the Lost Cause than the nothingness that now rests on the pedestal. Plus, the motives in this case were clear—to bring truth to the monument—and nobody’s political prospects were improved in the process.

With that tag, truth spoke to power because the embarrassing emblem of that horrible power remained in place to be witnessed and interpreted. Certainly, we can take a cue from the vandals and find ways to demonize these relics with appropriate levels of scorn—new explanatory plaques come to mind—rather than sending them crashing once and for all to the pavement. And—more to the point—certainly there could be greater benefits for racial justice and historical understanding by engaging in ongoing interpretations of what these monuments mean in the here and now.

Forgetting How to Feel Shame

While riding in an Uber in New Orleans, I passed several streets named after slaveholders (or those who condoned slaveholding)—Henry Clay, Zachary Taylor, Thomas Jefferson, Napoleon, Washington. Prompted by this observation, I asked my driver, an African American business owner in his forties, what he thought about the statue removals. He paused and looked at me hard in the rearview mirror. “Taking those statues down was a bad idea because they reminded white people what was done to us.” Then he added: “We are not educated.”

It took me a moment to realize what he meant by “we” and “educated,” but what he was saying was that white people don’t know how to feel shame. We haven’t been taught how to confront the troubled history and legacy of slavery in a way that demands our sustained discomfort and puts us at risk in public space. True, by wishing the statues away, we justifiably honor the crushed feelings African Americans experience when living amidst monuments that once honored slavery. But less justifiably, by wishing these statues away we also ease the guilt of progressive whites who, for altogether different reasons, also hate looking up to Lee, Jackson, and, dare one say it, Mr. Jefferson.

Don’t worry about me, my Uber driver was saying. Worry about you. He wanted, in essence, whites to swallow a healthy dose of shame, and to bring that struggle to bear on our thinking about racial justice. However paradoxically, the white supremacist thugs who marched through Charlottesville only intensified the imperative. They further demanded that the rest of us, as we witness (and die from) their violent hatred, connect the awful racism of the past to that of the present through a bridge paved with shame, the kind of shame that, from the bottom up, can overwhelm the utter lack of it that currently swaggers at the top of American politics.

If that becomes the goal we choose to pursue with our remaining Confederate monuments—and I cannot think of a better way to use public history—then we might take a note from the New Orleans vandals and begin to add to, rather than subtract from, the existing textual landscape.

That is exactly what the civil rights lawyer, MacArthur Foundation fellow, and founder of the Equal Justice Initiative (EJI), Bryan Stevenson, is doing in Montgomery, Alabama. EJI has marked Montgomery with a series of historical plaques acknowledging the warehouses used in the city’s slave trade. This effort, in addition to EJI’s current project to build a national memorial dedicated to lynching victims, defies the city’s antiquated markers to the Confederacy (of which there are more than fifty). And what do you think Stevenson wants whites to feel when staring at lists of the lynched? Not a sense of ease. Not a sense of relief.

Before justice and history merge on the landscape, they will first have to merge in our hearts. Without shame, this cannot happen. Taking on shame is a process that will inevitably ask whites not only to feel that emotion, but also to live in it, and to harness it for the cause of righteousness. And if that’s what we’re in for, if that’s what must happen for us to inch toward true racial reconciliation, then moving Confederate monuments out of sight becomes less an act of racial justice than yet another expression of the same white privilege that got us into this mess to begin with.

James McWilliams is a professor of history at Texas State University and the author of A Revolution in Eating: How the Quest for Food Shaped America and Just Food: Where Locavores Get It Wrong and How We Can Truly Eat Responsibly.

 

. . . . . . . .


Missing Michael Cromartie

Flyer from Protestants and Other Americans United for Separation of Church and State (1960). Via Wikimedia Commons.

Michael Cromartie was a rare figure in public life. An evangelical Christian, he devoted much of his work at the Washington-based Ethics & Public Policy Center to shedding light on issues that too often fueled the angriest culture-war disagreements over the place of religion in the public square. Until his recent death after a long struggle with cancer, he was rightly hailed as a bridge builder between journalism and religion. Twice annually, he hosted the Faith Angle Forum, which, as Ross Douthat explained in a eulogistic column for the New York Times, invited “prominent journalists, members of one of America’s most secular professions, into extended conversation with religious leaders, theologians and historians, the best and brightest students and practitioners of varied faiths.” In a tribute on the website RealClearPolitics, journalist Carl Cannon wrote that “Cromartie did more to ensure that American political journalism is imbued with religious tolerance, biblical literacy, historical insight, and an ecumenical spirit than any person alive.”

I found myself missing Cromartie as I watched (and participated in) the reaction to New York Times reporter Laurie Goodstein’s description of the religious community of Professor Amy Coney Barrett, nominated by President Trump to the U.S. Court of Appeals for the Seventh Circuit. (Barrett’s hearing before the Senate Judiciary Committee garnered some attention after Senator Dianne Feinstein opined: “The dogma lives loudly within you.”)

Goodstein’s article has many problems, but what made me think of Cromartie was what the article and some responses to it revealed about the deep misunderstandings and biases of some of America’s more prominent religion journalists about some of the most basic practices of millions of American religious believers. These kinds of misunderstandings are all the more troubling at a time when the words and actions of our president have exacerbated divisions in our nation.

. . . . . . . .


Charlottesville Daze

A makeshift memorial to the victims of the car attack at the Unite the Right rally.

A friend in Boston writes to ask if I know of anyone “commenting with particular insight” on what unfolded in Charlottesville last weekend. “No” is my tersely emailed reply, but it is less a reasoned response to the quality of the commentary I have read so far than a visceral disgust with the evil that resulted in three deaths, many injuries, and a deep disturbance of the peace not only in my hometown but in my nation. Even critical commentary confers dignity—the dignity of reasoned consideration—upon its subject, but the subject in this case is a moral enormity distinguished only by its lack of civility and civilized virtues, and therefore undeserving of any civil consideration.

I claim no vatic powers when I say I saw this coming—clearly, though not for the first time, on the morning when the man who is incapable of clear moral utterance was elected to the highest office of our land. As I wrote to a friend that morning, “I never knew how much I loved my country until now, when I see how vulnerable it is.” I say this without partisan rancor; friends of all partisan stripes have shared similar sentiments with me. And I know, more to the point, that the culture that made possible the election of this supremely hollow man was shaped by forces associated as much with progressivism and liberalism as with conservatism and reaction. Is it any surprise that this man with no real party affiliation, this man without qualities apart from self-aggrandizing, self-dramatizing need, took three days to name the evil forces—above all, the white supremacist racism of Nazis, neo-Confederates, and alt-right thugs—behind the senseless deaths and destruction of last weekend?

The fish rots from the head, runs an old adage. But it does not really describe America’s current condition. The rot is general through the body politic. The current president is a mirror—a funhouse mirror, perhaps—in which we see, and now must recognize, our own disfigured selves.

We can do much better. We must do much better.

Jay Tolson is editor of The Hedgehog Review.

. . . . . . . .


What Is Innocence Worth?


In its recent Nelson v. Colorado decision, the Supreme Court affirmed what might have seemed to require no formal affirmation—namely, that a person whose criminal conviction is overturned on appeal is entitled to the return of any fees, court costs, or restitution paid to the state as a result of the conviction. Previously, the state of Colorado required an exonerated defendant to file a separate civil suit and prove actual innocence by clear and convincing evidence before funds would be repaid. Having a conviction overturned on a mere legal technicality would not suffice for financial recovery. The central question in the case—which was decided seven to one in favor of the petitioners, with Justice Clarence Thomas dissenting—concerned due process.

While it was notable that the Supreme Court took up such a seemingly self-evident case, the Court did not address the question of compensation for periods of wrongful incarceration. Justice Ginsburg, writing for the majority, explained that the “[petitioners] seek restoration of funds they paid to the State, not compensation for temporary deprivation of those funds. Petitioners seek only their money back, not interest on those funds for the period the funds were in the State’s custody.” Justice Ginsburg continued: “Just as the restoration of liberty on reversal of a conviction is not compensation, neither is the return of money taken by the State on account of the conviction.” She made clear what compensation is and what it is not: The return of something wrongfully taken is not compensation, and neither is release from a prison in which one was held for no lawful reason in the first place. Compensation is something more—an award for loss, suffering, or injury.

. . . . . . . .
