The Hedgehog’s Array: October 24, 2014

Notable reads from last week:

“Ben Bradlee, legendary Washington Post editor, dies at 93,” Robert G. Kaiser
“For Benjamin Bradlee, journalism was more than a profession—it was a public good vital to our democracy.”

“One-Fifth of Detroit’s Population Could Lose Their Homes,” Rose Hackman
“Many families could stay put for just a few hundred dollars, if only they knew how to work the system.”

“The Money Midterms: A Scandal in Slow Motion,” Evan Osnos
“The elections on November 4th are on pace to be the most expensive midterms in history (even adjusting for inflation).”

“Speed Kills,” Mark C. Taylor
“Fast is never fast enough.”

“Inside Twitter’s Ambitious Plan to Kill the Password,” Casey Newton
“Can some powerful new features reset Twitter’s relationship with developers?”

“What is the Value of Toleration?” Piers Benn
“Is the defense of free speech and toleration merely another name for indifference?”

“The International-Student Revolving Door,” Albert H. Teich
“Foreign students shouldn’t have to prove they’ll go home after graduating to get a visa.”

“Obama Talks Up Net Neutrality, But Could Do More to Defend It,” Brendan Sasso
“Obama has avoided taking a position on the most controversial piece of the net-neutrality debate: what authority the FCC should use to enact new open-Internet regulations.”


Our own Jay Tolson went on Charlottesville’s WTJU to talk about the “War on Poverty” and preview the fall issue of The Hedgehog Review, “Thinking About the Poor”—on newsstands November 1! Listen to the interview here! Subscribe to The Hedgehog Review today!


“What is Liberal Education For?”: A Conference Postmortem

Charles Dickens at a reading, Charles A. Barry (1867).


The liberal education conference has now come and gone. My own panel went well (Inside Higher Ed has a brief write-up here), though I think I left with the same question I had going in, namely: Are there any truthful instrumental arguments to be made for liberal education?

Inevitably, an event like this involves some preaching to the choir. When asked if he considered any arguments against liberal education worth taking seriously, for instance, Andrew Delbanco said: “No.” Well, that’s a problem. Even if it were true—and I’m not sure it is—it plays well only to an audience of people who have set aside three or four days to go to a liberal education conference. And amid the repeated declarations that liberal education is every positive superlative—the most useful, and so on—the meaning of the term becomes a little obscured.

This is why instrumental arguments continue to interest me. If I stand up and say, for instance, that no true instrumental arguments exist, and the Vice President of St. John’s College tells Inside Higher Ed that I am wrong (as she did), it’s clear that despite going through basically the same motions and reading the same materials in the same structured program of study, she and I have emerged with very different conceptions of what we’re doing and probably are talking about very different things.

In other words, if you’ve assembled the choir, it might be good to focus on the internal philosophical disagreements that are coming into play. There was never, so far as I could tell, a panel that meant to address head-on what a liberal education was at all. But someone who has a Straussian perspective on liberal education will disagree with someone who has a Catholic perspective, even if both of them are willing to quote Cardinal Newman’s Idea of a University. The programs at St. John’s College and Thomas Aquinas College are in many ways identical, but do the schools consider themselves to be doing the same thing?

A Connoisseur of Vanishing Acts: An Interview With John Rosenthal

The Hedgehog Review: Our forthcoming fall issue includes a photo essay with the work of, among others, American photographer John Rosenthal. John, please describe your background and how you came to photography.

John Rosenthal (JR): My route to photography was so circuitous I can hardly follow it. In my twenties, I taught literature at the University of North Carolina–Greensboro and UNC–Chapel Hill and, during the summers, acted in stock theaters. In 1970, after the Kent State killings, I was one of the leaders of the strike that closed down the university. To the administration, I became persona non grata, and, in the fall, I was told that my lectures were being monitored. Who needed it? So I quit.

Coney Island, New York, 1974; courtesy of the artist


My then-wife and I moved to Rethymnon, Crete, where I borrowed a camera and began to photograph everything—the people, the children, the rocks, the sheep, the fog over the Mediterranean. I didn’t know what I was doing, but I loved it. When we returned to America, I bought a Pentax and, when it got dark, I set up a darkroom in my kitchen. One night, I watched a new photograph rising up out of the developer. It was a photograph of two men, one of them shirtless, standing in front of a small fire on the beach at Coney Island (left). Behind them, in the distance, a Ferris wheel and a roller coaster seemed to drift in the mist. I thought, okay, this is good, call yourself a photographer.


THR: Are there any photographers whose work has made a deep impact on you? Have you encountered any particular mentors or teachers?

JR: Literature brought me to photography, but not right away. I had to learn somewhere that what you see isn’t all there is, and I learned it by reading. Even though Faulkner’s Absalom, Absalom! is a book, it created in my mind images that were brighter and more distilled than anything my eyes could see. But more specifically, in the late 1960s, my late friend and mentor, Jean Morrison, a poet, photographer, and teacher, sat me down and said he wanted to show me something. Then he handed me a book of Cartier-Bresson’s photographs, saying “Don’t look fast. Don’t assume you know what you’re looking at. They’re complicated.” I knew nothing about the art of photography. I’d never heard of Henri Cartier-Bresson or Robert Frank or Diane Arbus. Probably, I’d seen Ansel Adams’s photographs of Yosemite, but I didn’t care about them.

Now, I found myself looking at a 1938 black-and-white photograph [Cartier-Bresson’s On the Banks of the Marne] of two plump, middle-aged couples sitting on the bank of the Marne, enjoying a picnic. Dirty plates, an empty bottle of wine, newspapers, a picnic basket, forks. In the river, two boats were tethered to the shore by two poles. In one of the boats, three fishing rods were propped up, dangling their lines in the river. A woman in a skirt and slip was chewing on a chicken bone. One of the men, in suspenders, was refilling his wine glass, his face with its Charlie Chaplin moustache turned sideways to the camera. The other three were turned away from the camera, facing the river. The photograph’s composition was as relaxed as the picnic, but it managed to convey a culture, a society, a landscape, and, above all, the texture of friendship. The photograph, which was both an inward and outward fact, both a metaphor and itself, was a poetic act of consciousness. That year I learned to read the complex language of a photograph, and that opened up Robert Frank’s America, Diane Arbus’s creepy wonderment, and the beautiful elusiveness of Eugène Atget.


THR: You have said, “To be a photographer is to be a connoisseur of vanishing acts.” Please say a little more about this evocative statement.

JR: Well, when I first began to photograph in lower Manhattan, I found myself drawn to things like bottles of seltzer water stacked in wooden crates, dusty bread-shop windows, Ukrainian men playing backgammon in Tompkins Square, movie marquees on 42nd Street, a ship in the window of an Italian seamen’s club on Mulberry Street. There was nothing self-conscious or intentionally documentary about these photographs. They were the city I’d fallen in love with. To me, New York’s dynamic urban beauty was equal to the views at any number of litter-free national parks.

Then the city sanitized itself, real-estate prices soared, and a lot of New York disappeared to make room for the yuppies. Only then did I realize that what I’d been photographing was the imperiled city.

It turned out, to my surprise, that those early photographs are now considered documents. New York the way it used to be. Unlike Ansel Adams’s High Sierra mountains, which will stay put for a very long time, my photographs of New York in the 1970s deal almost exclusively with landscapes and moods that have largely vanished.


THR: Whether deliberate or unintentional, a photograph almost always conveys a particular story or connotation of a subject. What are your views on the ethics of photography?

JR: Photographing people, strangers especially, can be a very tricky thing to do, ethically tricky, even if it’s now a universal cellphone activity. And photographing pre-adolescent children as if they were seducing the camera brings the problem to a darker level. I think a clever person with a camera can be very dangerous. A photograph can extract people from the flow of their lives (and to some people that flow is everything). It can crop them from the lively space in which they live and have their being. A photograph can also secretly juxtapose people and objects in a highly suggestive way. Sometimes that’s a form of cruelty. I recall a photograph I saw many years ago—I won’t say who took it—of a woman in a mink coat staring into a glittering jewelry store window on Madison Avenue. She may have been idling away her time, as the rich often do, or she may have been returning home from a hospital visit to a friend who was ill. Her expression was haughty. The mink coat made it so. The photographer, of course, knew nothing about this woman, but she had turned her into a symbol of the bored rich. She’d played into a collective hunch about women in mink coats on Madison Avenue, and many viewers undoubtedly nodded their heads at this faux profundity.

Of course, there are many occasions on which a stranger is the person you photograph, but that’s because they’ve already been reduced. They are holding a sign. They are angry. They want attention badly. And sometimes strangers simply want or need a photographer to tell their story. But, generally speaking, we need to be careful about what our photographs claim to know. The knowledge is often, as Susan Sontag once pointed out, “unearned.”

I rarely photograph people anymore.


THR: You have described how people see a divide between the verbal and the visual. As both a teacher—and one for whom literature led to photography—and a visual artist, you would seem to straddle this divide. Is this divide real? Why or why not?

Gaspé Peninsula, Canada, 2003; courtesy of the artist


JR: Frankly, I think the verbal/visual thing is an empty distinction that exempts writers from looking at Hopper’s paintings and painters from reading Faulkner’s books. But we need both kinds of artists in our lives! Yet I know that photography is connected to storytelling in a way that painting isn’t. I recognize that. Photographs—if we are to know the mind of the photographer and not just the cleverness of his image—need to exist in some kind of continuum, which can often be transformed into a narrative. Think of Robert Frank’s influential book of photographs The Americans (1958), a purely visual poem describing an America haunted by its own loneliness. It’s worth noting that Jack Kerouac wrote the introduction for The Americans.


THR: You have described photography as “the deep surprise of living in the ordinary world.” How have you overcome complacency or habit in order to remain in a state of wonder?

JR: Nowadays I think “wonder” is more of a capacity than a state of being. If you remain in a state of wonder, how could you develop any sort of wit? You’d end up like that terrible innocent, Harold Skimpole, in Bleak House.

Coronado, Ocala, Florida, 1986; courtesy of the artist


But that doesn’t mean that there aren’t seasons of wonder. Becoming a photographer in the early seventies was like living in that season. My first marriage was ending in North Carolina, I’d quit teaching, I had no money, and here I was, walking around Tompkins Square on the Lower East Side, looking for the right photograph. It was as if I were wandering through an undiscovered country, not exactly lost, even though I had no idea what I’d encounter on the next block. I mean, suddenly my job consisted of looking at things and photographing them in such a way that someone else would say “Yes!” What a wonderful thing to do! Of course, aesthetic and moral questions were a kind of energy. What should I photograph? Should I look at it widely or narrowly? What are the limits of intrusion? How do I learn to slow down enough to truly look? Not having developed my own way of looking at things, I pretended, at least half the time, that I was Cartier-Bresson, and I looked for images that would contain his kind of information. Sometimes I’d see something that I liked, a bar on a corner with sunlight falling sideways on the street, and I’d just wait around for something else to happen, like a dog running by. Then the dog would run by, and I’d take the photograph, and I’d take a deep breath because I knew I was getting it.

Of course later on, when I was finally taking my own photographs, not Cartier-Bresson’s, I exchanged that early wonder for patience and know-how. That was necessary.

But the capacity for wonder doesn’t go away. In 2007, when I saw the Lower 9th Ward [in New Orleans] for the first time, I felt the same way I’d felt in the 1970s in lower Manhattan. Here was a story that hadn’t been told. Other stories had been told about it, but not the one I wanted to tell. Once again, I was on fire.


THR: So your work in New Orleans’s Lower 9th Ward after Hurricane Katrina brought you back to the fundamental reasons for becoming a photographer. What were those fundamentals? How did you approach this project knowing that so many people were visiting the city with more prurient intent as “disaster tourists”?

JR: When my wife and I visited New Orleans in February 2007, I had no intention of photographing the Lower 9th Ward or, for that matter, any of the breached levee zones. We were there to see a city that was, once again, opening up its doors. All I knew about the Lower 9th Ward—and I learned it while the Lower 9th was filling up with water—was that the media invariably linked it to poverty and crime. Then, the Lower 9th was underwater and people drowned, and now, in 2007, it was uninhabited. I’d seen the disaster and post-disaster photographs. End of story. New Orleans wasn’t my city.

St. Rose Missionary Baptist Church, 9th Ward, New Orleans, 2007; courtesy of the artist


But when a local friend drove me through what remained of the Lower 9th Ward, I was shocked. Not just by its eerie silence and emptiness, although that was shocking enough. No, it was more of a mental surprise: That the Lower 9th wasn’t anything like what I’d been led to believe. It wasn’t a slum; it was a working-class neighborhood full of bungalows. There was an elementary school named after Louis Armstrong. There were churches, once full of worshipers, with wooden doors falling off their hinges. There were abandoned homes, baking in bacteria, homes that had once been lovingly tended. You could still see that. But such care was pointless now, tragically so. People, I saw, had loved this place, and soon everything was going to be demolished. The only sound I heard in the Lower 9th Ward was the rumble of dump trucks and the crunching of wood.

The collapse of the distinction between “us” and “them” is the beginning of real documentary work. It is also one of the journeys consciousness is required to take. If you don’t take it, you’re probably a propagandist.

I decided to archive the loss, to memorialize it, before everything was gone. Photographs do that very well. They’re a fine, if modest, consolation. They testify to what has been and what will be no more, and this testimony matters. I hoped my photographs would tell a story about possession and loss, community and separation. I’d photograph only the evidence. Small things. This was someone’s home. These were the bulletin boards in someone’s kindergarten class. Everything I photographed then has since been carted away.

In 2008, on the third anniversary of Katrina, the New Orleans African-American Museum held an exhibit of my Lower 9th Ward photographs. I believe that if the photographs had pretended to know more than they had a right to know, had tried, say, to capture the sorrowful faces of the dispossessed, the invitation would never have been extended.


THR: How do you reconcile photography’s mechanical or technical aspects with its potential for expressiveness? Has there ever been an instance in which your expertise as a photographer has failed to capture the moment you saw with the naked eye?

JR: Well, one’s proficiency—as a photographer using technical equipment—improves over time. You learn what you’re doing, and the odds of capturing what you want improve. Your timing gets better. You can anticipate, if not always what Cartier-Bresson called “the decisive moment,” then at least the indecisive moment you’ve been looking for. I know this sounds a little crazy, but on the deepest level of expression, I don’t think it matters if you’re holding a pen or a camera. “I’m an artist,” John Lennon said, “and if you give me a tuba, I’ll bring you something out of it.”

Expertise, however, is different from proficiency, especially in the arts, where it is only approximate. Creativity, of the highest sort, has its seasons. It waxes and wanes. A photographer with expertise, one who consistently produces expert images, is probably a wedding photographer. My goal as an artist is to avoid predictability and to create fresh images. So I have to keep finding a way to slide away from my own expertise, and to realign my sights.

Some photographers take the same photograph over and over, and that requires a kind of expertise. It’s a way of making a living.


THR: Can you tell us about what sort of equipment you use? Do you prefer traditional film or digital cameras? If you use a digital camera, have there been any modifications in your methods or your approach?

JR: Well, I should start off by saying that I’ve been shooting with a digital camera for a while now. Probably out of necessity. I spend as much time working on digital prints as I used to spend in the darkroom, but now I don’t have to stand on my bad left foot.

In my case, switching from film to digital was a matter of convenience, and that’s about it. Even though I am using a new technology, the reasons why I take photographs haven’t changed. The digital camera is, really, just a camera, and the world I want to photograph is the same old world. The old challenge remains unchanged: to use my camera to disclose some sort of hidden meaning that lies below our common awareness. A poet’s task, neither more nor less. So I trained myself to look closely for the little thing that nobody was paying attention to, the quiet thing that didn’t want to give away its secret importance. An unmade bed. A chessboard in Tompkins Square after a rainstorm. Something you might walk right by.

I guess I have faith that the actual world, as it is, is enough. It’s my guiding principle. I think that if I move things around in my photographs, arrange expressions, say, or digitally create a dream effect, then I won’t meet the criterion of perception that I’ve set for myself. I want to distill reality, not modify it with software.

Of course I’m describing only one approach to image-making—one that I inherited from a certain time and place. It’s just the way I do things. It’s no better or worse than a hundred other ways of considering and making photographs. It’s just mine.

John Rosenthal’s New Orleans photographs will appear in AFTER: The Silence of the Lower 9th Ward (Safe Harbor Books, forthcoming 2015). See more of his work at

Cowardice and Ebola

A Kikwit, Zaire clinic (1995).


Not long into his new book Cowardice: A Brief History, Chris Walsh, associate director of Boston University’s writing program, notes that his subject has received surprisingly little direct consideration in Western letters and scholarship. Humanity responds to cowards and cowardly acts with an appalled reticence. “Let’s not speak of them,” Dante’s Virgil says dismissively of the hundreds of cowards crowding Hell’s vestibule. The philosopher Kierkegaard, who gave more thought to the subject, observed that “there must be something wrong with cowardliness, since it is so detested, so averse to being mentioned, that its name has completely disappeared from use.”

Not quite completely, of course, but a certain obliquity does characterize most literate reflections on the subject, even though it’s crucial to our conduct—not only in war but in those moments of choice that call for moral courage. Consider the current debate over the appropriate response to the Ebola virus as it spreads beyond its epicenter in West Africa. Does the discourse of cowardice, and its antonyms bravery and courage, play any role in this debate? Should it? Are we being evasive—even cowardly—in refusing to see the debate in those terms as well as in medical-epidemiological or national security ones? Perhaps courage and cowardice raise moral questions that we would rather ignore.

Bringing cowardice into the discussion certainly doesn’t simplify matters. Even if we accept Walsh’s working definition of a coward—“someone who, because of excessive fear, fails to do what he is supposed to do”—we run into problems. One person’s excessive fear is another’s reasonable fear, and defining our duty in any given situation invites qualifications and assorted objections.

Sure, we can easily point to the brave healthcare workers who have thoroughly acquitted themselves in fulfilling their duty, in many cases losing their lives or otherwise going far beyond duty’s call. The British healthcare worker who returned to Africa after recovering from the disease fills us with the highest admiration, yet the extent of his courage may cause some to wonder whether it exceeds reasonable bounds. If we explain away such courage by attributing it to perverse or even vainglorious motives, we may be admitting we hold an idea of duty that lacks firm or clear obligations. Uncomfortable as such an admission may be, we need to understand our own standards of duty so that we know when we may be shirking them.

The question of our own cowardice grows more discomfiting when we consider the most contentious proposed response to the spread of the disease: the imposition of a travel ban that would curtail international commercial flights for people from Liberia and Sierra Leone. There are opposing arguments about the efficacy of such a ban, of course. But is one side more clearly courageous and the other more clearly cowardly?

Again, answers are not easy. First, where is one’s primary duty here—to one’s national security and well-being, or to humanity and the greater global good? We who want to participate fully in a globalized world must know that with its benefits come responsibilities. Is it not cowardice to shirk such responsibilities?

It may be another form of cowardice to rush to answers that are too easy and too quick. Are there, for example, better ways of limiting the movement of infected people than by imposing restrictions on all people who share their nationality? Have we discussed and explored those alternatives? If we say our primary and unqualified duty is to our own national good, are we certain that such a ban furthers that good in the short or long terms?

Courage and cowardice enter into these questions again when we examine the motives for our answers. “Kinds of cowardice can conflict,” notes Walsh, adding that “excessive fear of being or seeming cowardly can lead to cowardice.” These potential conflicts should bear heavily on the minds of our political leaders, who sometimes show more concern for appearances than for substance. Are those political leaders who are now calling for a ban merely pandering to a fearful base or taking yet another opportunity to bash whatever President Obama calls for? Conversely, could President Obama be so fearful of the disapproval of the international community that he might refuse to make certain choices? Is it cowardly not to be cruel to be kind? After all, imposing a ban on the most afflicted nations might move the elites of those nations (the citizens most likely to avail themselves of international air travel) to devote more resources and effort to containing the contagion, in the way that Nigeria’s political elite did.

The discourse of cowardice may not bring easy answers to dilemmas so vexing as this one, but if it brought a little more honesty, that would be no small contribution. The somewhat paradoxical problem with facing up to cowardice, as Walsh’s excellent book shows, is that it usually requires great courage.


The Hedgehog’s Array: October 17, 2014



Interesting reads from last week:

“5 Guidelines for Living in a Pluralist Society,” John D. Inazu
“[All of our] common ground tells us surprisingly little about who we are as a people, what our goals should be, or what counts as progress.”

“Realism Is a Figure of Speech,” Joe Fassler
Vikram Chandra: “Realism is not something that is transparent. It’s not just the glass through which we see. It is also a figure of speech.”

“What if Black America Were a Country?” Theodore R. Johnson
“Essentially, what we’re witnessing is a nation that is comparable in certain ways to a regional power existing in the state of Disparistan (or, perhaps, Despairistan).”

“Salon Culture: Network of Ideas,” Andrian Kreye
“For more than a century now the salon as a gathering to exchange ideas has been a footnote of the history of ideas. With the advent of truly mass media this exchange had first been democratized, then in rapid and parallel changes diluted, radicalized, toned down, turned up, upside and down again.”

“Wikipedia, a Professor’s Best Friend,” Darius Jemielniak
“I am a professor who not only recommends that my students use Wikipedia but also encourages them to edit and develop it. However, I am in the clear minority in academia.”

“My Childhood Friend, the ISIS Jihadist,” Jakob Sheikh
“In Amir’s world, the heroes are Sunni extremists fighting for a global Islamic Caliphate. The enemies are the infidels. Weeds to be removed from the face of the earth.”

“S.E. Hinton and the Y.A. Debate,” Jon Michaud
“The author who changed the way that books for teens were written and published has seen her own work go from the spinning wire display rack near checkout to an online marketplace accessible while you wait for your morning latte.”

From our archives:

Mark Edmundson urges us to “Pay Attention!”


The Morality of Food—Then and Now


U.S. Food Administration poster, circa 1917; this poster appeared in several different languages so that the food conservation effort would reach the many immigrant groups in America; Wikimedia Commons

Eating healthy, supporting local farmers, buying fair trade coffee—aren’t these all practices we can feel good about? Well, perhaps to a point.

If the consumption of food in our time has become something of a moral matter, sometimes involving self-righteousness and even coercion, it’s helpful to learn that we didn’t arrive here overnight. The moral American foodie has a history, part of which is told by Michigan State University historian Helen Zoe Veit in her 2013 book Modern Food, Moral Food.

Veit’s focus is eating habits in early twentieth-century America, a time when the culture was strongly shaped by rising immigration, changing views of gender and race, greater access to public education, and, perhaps most of all, the advent of the Great War. Soon after the United States entered the war, Herbert Hoover’s Food Administration instituted a number of national programs aimed at conserving food at home so more could be transported abroad to aid the Allies. Among Hoover’s initiatives were Wheatless Monday, Meatless Tuesday, and Porkless Saturday. Some fourteen million American families, according to Veit, or seventy percent of the total, joined the effort, signing pledge cards and proudly proclaiming their membership in the Food Administration.

This effort worked primarily through voluntarism and a widespread enthusiasm for patriotic gestures on a grand scale. (There were also plenty of snoops who reported the noncompliant.) It also helped that Americans became increasingly receptive to meatless foreign cuisine, especially Italian pasta and other vegetable-only dishes. Indeed, pasta was ideal for food conservation because it was made from durum semolina rather than the bread wheat needed for the war effort. Housewives boasted of their sacrifice and ingenuity in the kitchen, and publishers churned out cookbooks on fleshless eating and “the rational dietary.” Cannily, the Food Administration, public leaders, and the media further reinforced the effort by connecting voluntary fasting with democratic self-determination, individual self-control, national willpower, and moral urgency.

It is interesting to contrast the motivations behind the World War I food conservation efforts with those behind today’s food-related activism. We are all foodies now, it seems. New concepts like fair trade or responsibly sourced foods vie with the return of traditional practices such as canning and pickling. Whether you require a social agenda with your fruits and vegetables or just a wide selection of cereal and chips, there are grocery chains to suit every shopper. In the university town where I live, there are three national chains, three well-known specialty grocers, several brave independent markets, weekly farmers’ markets, and a number of “provisioners”—not to mention big box retailers with grocery departments. This unprecedented array of choices would have had early twentieth-century food reformers sputtering in consternation at our pampered and picky appetites.

Where has this food revolution taken us? While it has opened up our eating and shopping choices, it has also reintroduced us to the moral aspect of food. And this being a media-saturated age, morality often piggybacks on star-power. First Lady Michelle Obama wants to combat the sense of failure encountered by overweight children with her Let’s Move program and its laudable, if grandiose, ambition of “[solving] the challenge of childhood obesity within a generation, so that children born today will grow up healthier and able to pursue their dreams.” Ex-Beatle Paul McCartney asserts that eating meat contributes to greenhouse gases (presumably he means not just the methane emissions of cattle and other livestock, but also the carbon footprint of the meat industry in general). Desperate Housewife Eva Longoria wants to demolish the gender bias of food by opening SHe, a female-oriented steakhouse featuring “he-cuts,” “she-cuts,” and “we-cuts.” Surely, no one would want to ruin the dreams of children, destroy the atmosphere, or oppress women simply in the name of freedom of eating?

And then there are the food glitterati: celebrity chefs with their own cable shows and stars puffing the latest fad diets. The obligatory book contracts and the cross-country appearance tours ensure that all Americans will have access to what it takes to cook like the Naked Chef while staying sexy on the Beyoncé diet. Much of this celebrity food activism is aimed at improving public health and the environment, but it often verges on coercion, both psychological and, increasingly, legal—think Michael Bloomberg and his battle against trans fats and soft drinks. Today’s message about moral eating behavior is a rather muddled one, equal parts accessibility, self-expression, self-control, and overeager civic-mindedness.

Oddly, we find ourselves at a time when food activism is both radical and mainstream. The label “food elitist” or “foodocrat” may be hurled as an insult or worn as a badge of honor. The foodocracy in which we now find ourselves gives people permission to lecture others, often with overbearing rudeness, about what makes some food choices superior and others inferior. You may get your latte just the way you want it at Starbucks, but don’t expect the same accommodation in the boutique coffee house in a gentrifying neighborhood. In our game of organic one-upmanship, we have all become complicit in encouraging the kind of food elitism that also strips us of what one blogger called “gustatory freedom.”

Then comes the recent story of Meatless Mondays at schools in the Sarasota County (Florida) School District. This program, part of Johns Hopkins University’s Monday Campaigns, dedicates the first day of the week to different health initiatives: quitting smoking, exercise, safe sex, and eating right. Meatless Monday drew its inspiration from the meatless and wheatless programs of the World War I-era Food Administration. Just as the wartime food conservators strove to convince Americans that living without meat was the patriotic choice, so Florida school officials hope to convince schoolchildren that a bun without a burger is worthwhile for “personal health and for the health of the planet.” The difference this time around is that the international menu choices in Florida—hummus, fiesta taco salad, and spaghetti marinara—will be an easy sell to today’s kids. Much has changed since 1911, when the Van Camp Company, hoping to convince consumers that foreign food was good, advertised that its canned spaghetti was made with the same sauce as its pork and beans.

What hasn’t changed is the use of food as propaganda. Whether little Johnny in Florida buys into Meatless Monday for his health, for the environment, or just because he forgot to bring a sandwich from home, he will have his food choices limited—and dictated—by those in authority over him. The lesson here is that others know better than you do, and they will pass legislation (or institute programs) to convince you of it. Perhaps we are what we eat after all.

Leann Davis Alspaugh is managing editor of The Hedgehog Review.

Recognizing the Adult in the Mirror

“The Life and Age of Man: Stages of Man’s Life, from the Cradle to the Grave,” by James Baillie

Richard Linklater’s Boyhood was this summer’s critical hit, achieving for a time a much-coveted 100 percent fresh rating at Rotten Tomatoes. (It now sits at 99 percent.) The film follows a young Texan named Mason as he grows from a quiet child to a disaffected teen, ending when he becomes a college freshman. All of the time passed is real—Mason grows up over twelve years, which is how long the film took to make—and Linklater fills the background with references to remind the audience that these twelve years have passed for them, too. Everybody’s gotten older.

But nobody is growing up. Boyhood is a “coming of age” story only in the most formal sense. There’s no age to come into, no adulthood to achieve, and no adults to be found. Mason’s life is full of older people who burden him with clichéd advice, but they, too, are merely drifting from one event to another without really knowing why. As Mason’s mother sends him off to college, she unloads her self-pity, telling Mason that raising him was her last “milestone” and that now all she has left is waiting for death. “I thought there would be more,” she says. But if Mason has learned anything from his elders, it’s not to expect even that much.

This aimlessness is made pointedly clear in a scene in which Mason visits his step-grandparents for his fifteenth birthday. From them, he receives a suit, a Bible, and a gun. They are meant to be signs of his adulthood. But we know, and Mason knows, that he will never touch any of them. They are simply relics of an old way of being in the world, and not one that he wants or even can choose.

Boyhood did not come up in A.O. Scott’s recent essay for the New York Times Magazine, “The Death of Adulthood in American Culture.” It should have. As if to reinforce Scott’s claim that there are no models for adulthood, Boyhood offers a sustained look at what that might mean—and it’s anything but attractive.

Yet as compelling as Boyhood is, there’s also something false about it. Mason seems predestined to his own disaffected life in a way that real people generally aren’t. Similarly, although Scott’s essay is usefully provocative, there’s something missing in it as well—namely, any clear articulation of the idea of adulthood that he claims has been lost. Continue reading

“What is Liberal Education For?”: A Preview

Aristotle teaching Alexander the Great, Charles Laplante (1866).

This week, I’ll be presenting a paper at “What is Liberal Education For?,” a conference being held at St. John’s College, Santa Fe. Lasting three days, it will have some twenty-eight panels and include presentations by scholars such as Boston University’s Christopher Ricks, Institute for Advanced Studies in Culture fellow and author Matthew Crawford, and philosopher Roger Scruton, whose lecture “Architecture and Aesthetic Education” will close the proceedings.

Here is the conference’s statement of purpose:

We raise this question [What is Liberal Education For?], recognizing that liberal education and the great tradition of the American liberal arts college have been put on the defensive of late. Small colleges across the nation have to make their case to students, to their parents, and to the public more urgently than ever. The causes of this crisis have been analyzed extensively: there is an emerging consensus that the rapid growth of consumerism amidst new economic challenges, and the fragmentation of general studies driven by professional training and specialization in the universities, have led us to undervalue drastically the humane goals of liberal studies. These causes are themselves symptomatic of a deeper crisis in our time, a crisis of uncertainty and disorientation affecting every field of human endeavor—scientific, social, intellectual, artistic, and spiritual. Precisely in response to this crisis, liberal education can reaffirm its relevance and purposes.

My own panel, “Liberal Education: Changing Conversations,” focuses on the rhetorical arguments for liberal education. My paper, “Liberal Education in a Specialized Age,” considers the case that can be made for unspecialized education in an economy that—on the surface at least—demands specialization and views education as job training.

There are reasons to suspect that this narrative is untrue, or at least extremely incomplete—witness the rise of the service economy. But I think it is true that we take for granted that specialization is a good and that education ought to accommodate the marketplace by helping students to specialize sooner and more adeptly. We take these things for granted even if the facts around us aren’t bearing them out. Continue reading