A Connoisseur of Vanishing Acts: An Interview With John Rosenthal

The Hedgehog Review: Our forthcoming fall issue includes a photo essay with the work of, among others, American photographer John Rosenthal. John, please describe your background and how you came to photography.

John Rosenthal (JR): My route to photography was so circuitous I can hardly follow it. In my twenties, I taught literature at the University of North Carolina–Greensboro and UNC–Chapel Hill and, during the summers, acted in stock theaters. In 1970, after the Kent State killings, I was one of the leaders of the strike that closed down the university. To the administration, I became a persona non grata, and, in the fall, I was told that my lectures were being monitored. Who needed it? So I quit.

Coney Island, New York, 1974; courtesy of the artist

My then-wife and I moved to Rethymnon, Crete, where I borrowed a camera and began to photograph everything—the people, the children, the rocks, the sheep, the fog over the Mediterranean. I didn’t know what I was doing, but I loved it. When we returned to America, I bought a Pentax and, when it got dark, I set up a darkroom in my kitchen. One night, I watched a new photograph rising up out of the developer. It was a photograph of two men, one of them shirtless, standing in front of a small fire on the beach at Coney Island (left). Behind them, in the distance, a Ferris wheel and roller coaster seemed to drift in the mist. I thought, okay, this is good, call yourself a photographer.

 

THR: Are there any photographers whose work has made a deep impact on you? Have you encountered any particular mentors or teachers?

JR: Literature brought me to photography, but not right away. I had to learn somewhere that what you see isn’t all there is, and I learned it by reading. Even though Faulkner’s Absalom, Absalom! is a book, it created in my mind images that were brighter and more distilled than anything my eyes could see. But more specifically, in the late 1960s, my late friend and mentor, Jean Morrison, a poet, photographer, and teacher, sat me down and said he wanted to show me something. Then he handed me a book of Cartier-Bresson’s photographs, saying “Don’t look fast. Don’t assume you know what you’re looking at. They’re complicated.” I knew nothing about the art of photography. I’d never heard of Henri Cartier-Bresson or Robert Frank or Diane Arbus. Probably, I’d seen Ansel Adams’s photographs of Yosemite, but I didn’t care about them.

Now, I found myself looking at a 1938 black-and-white photograph [Cartier-Bresson’s On the Banks of the Marne] of two plump, middle-aged couples sitting on the bank of the Marne, enjoying a picnic. Dirty plates, an empty bottle of wine, newspapers, a picnic basket, forks. In the river, two boats were tethered to the shore by two poles. In one of the boats, three fishing rods were propped up, dangling their lines in the river. A woman in a skirt and slip was chewing on a chicken bone. One of the men, in suspenders, was refilling his wine glass, his face with its Charlie Chaplin moustache turned sideways to the camera. The other three were turned away from the camera, facing the river. The photograph’s composition was as relaxed as the picnic, but it managed to convey a culture, a society, a landscape, and, above all, the texture of friendship. The photograph, which was both an inward and outward fact, both a metaphor and itself, was a poetic act of consciousness. That year I learned to read the complex language of a photograph, and that opened up Robert Frank’s America, Diane Arbus’s creepy wonderment, and the beautiful elusiveness of Eugène Atget.

 

THR: You have said, “To be a photographer is to be a connoisseur of vanishing acts.” Please say a little more about this evocative statement.

JR: Well, when I first began to photograph in lower Manhattan, I found myself drawn to things like bottles of seltzer water stacked in wooden crates, dusty bread-shop windows, Ukrainian men playing backgammon in Tompkins Square, movie marquees on 42nd Street, a ship in the window of an Italian seamen’s club on Mulberry Street. There was nothing self-conscious or intentionally documentary about these photographs. They were the city I’d fallen in love with. To me, New York’s dynamic urban beauty was equal to the views at any number of litter-free national parks.

Then the city sanitized itself, real-estate prices soared, and a lot of New York disappeared to make room for the yuppies. Only then did I realize that what I’d been photographing was the imperiled city.

It turned out, to my surprise, that those early photographs are now considered documents. New York the way it used to be. Unlike Ansel Adams’s High Sierra mountains, which will stay put for a very long time, my photographs of New York in the 1970s deal almost exclusively with landscapes and moods that have largely vanished.

 

THR: Whether deliberate or unintentional, a photograph almost always conveys a particular story or connotation of a subject. What are your views on the ethics of photography?

JR: Photographing people, strangers especially, can be a very tricky thing to do, ethically tricky, even if it’s now a universal cellphone activity. And photographing pre-adolescent children as if they were seducing the camera brings the problem to a darker level. I think a clever person with a camera can be very dangerous. A photograph can extract people from the flow of their lives (and to some people that flow is everything). It can crop them from the lively space in which they live and have their being. A photograph can also secretly juxtapose people and objects in a highly suggestive way. Sometimes that’s a form of cruelty. I recall a photograph I saw many years ago—I won’t say who took it—of a woman in a mink coat staring into a glittering jewelry store window on Madison Avenue. She may have been idling away her time, as the rich often do, or she may have been returning home from a hospital visit to a friend who was ill. Her expression was haughty. The mink coat made it so. The photographer, of course, knew nothing about this woman, but she had turned her into a symbol of the bored rich. She’d played into a collective hunch about women in mink coats on Madison Avenue, and many viewers have undoubtedly nodded their heads at this faux profundity.

Of course, there are many occasions in which a stranger is the person you photographed, but that’s because they’ve already been reduced. They are holding a sign. They are angry. They want attention badly. And sometimes strangers simply want or need a photographer to tell their story. But, generally speaking, we need to be careful about what our photographs claim to know. The knowledge is often, as Susan Sontag once pointed out, “unearned.”

I rarely photograph people anymore.

 

THR: You have described how people see a divide between the verbal and the visual. As both a teacher—and one for whom literature led to photography—and a visual artist, you would seem to straddle this divide. Is this divide real? Why or why not?

Gaspé Peninsula, Canada, 2003; courtesy of the artist

JR: Frankly, I think the verbal/visual thing is an empty distinction that exempts writers from looking at Hopper’s paintings and painters from reading Faulkner’s books. But we need both kinds of artists in our lives! Yet I know that photography is connected to storytelling in a way that painting isn’t. I recognize that. Photographs—if we are to know the mind of the photographer and not just the cleverness of his image—need to exist in some kind of continuum, which can often be transformed into a narrative. Think of Robert Frank’s influential book of photographs The Americans (1958), a purely visual poem describing an America haunted by its own loneliness. It’s worth noting that Jack Kerouac wrote the introduction for The Americans.

 

THR: You have described photography as “the deep surprise of living in the ordinary world.” How have you overcome complacency or habit in order to remain in a state of wonder?

JR: Nowadays I think “wonder” is more of a capacity than a state of being. If you remain in a state of wonder, how could you develop any sort of wit? You’d end up like that terrible innocent, Harold Skimpole, in Bleak House.

Coronado, Ocala, Florida, 1986; courtesy of the artist

But that doesn’t mean that there aren’t seasons of wonder. Becoming a photographer in the early seventies was like living in that season. My first marriage was ending in North Carolina, I’d quit teaching, I had no money, and here I was, walking around Tompkins Square on the Lower East Side, looking for the right photograph. It was as if I was wandering through an undiscovered country, not exactly lost, even though I had no idea what I’d encounter on the next block. I mean, suddenly my job consisted of looking at things and photographing them in such a way that someone else would say “Yes!” What a wonderful thing to do! Of course, aesthetic and moral questions were a kind of energy. What should I photograph? Should I look at it widely or narrowly? What are the limits of intrusion? How do I learn to slow down enough to truly look? Not having developed my own way of looking at things, I pretended, at least half the time, that I was Cartier-Bresson, and I looked for images that would contain his kind of information. Sometimes I’d see something that I liked, a bar on a corner with sunlight falling sideways on the street, and I’d just wait around for something else to happen, like a dog running by. Then the dog would run by, and I’d take the photograph, and I’d take a deep breath because I knew I was getting it.

Of course later on, when I was finally taking my own photographs, not Cartier-Bresson’s, I exchanged that early wonder for patience and know-how. That was necessary.

But the capacity for wonder doesn’t go away. In 2007, when I saw the Lower 9th Ward [in New Orleans] for the first time, I felt the same way I’d felt in the 1970s in lower Manhattan. Here was a story that hadn’t been told. Other stories had been told about it, but not the one I wanted to tell. Once again, I was on fire.

 

THR: So your work in New Orleans’s Lower 9th Ward after Hurricane Katrina brought you back to the fundamental reasons for becoming a photographer. What were those fundamentals? How did you approach this project knowing that so many people were visiting the city with more prurient intent as “disaster tourists”?

JR: When my wife and I visited New Orleans in February 2007, I had no intention of photographing the Lower 9th Ward or, for that matter, any of the breached levee zones. We were there to see a city that was, once again, opening up its doors. All I knew about the Lower 9th Ward—and I learned it while the Lower 9th was filling up with water—was that the media invariably linked it to poverty and crime. Then, the Lower 9th was underwater and people drowned, and now, in 2007, it was uninhabited. I’d seen the disaster and post-disaster photographs. End of story. New Orleans wasn’t my city.

St. Rose Missionary Baptist Church, 9th Ward, New Orleans, 2007; courtesy of the artist

But when a local friend drove me through what remained of the Lower 9th Ward, I was shocked. Not just by its eerie silence and emptiness, although that was shocking enough. No, it was more of a mental surprise: That the Lower 9th wasn’t anything like what I’d been led to believe. It wasn’t a slum; it was a working-class neighborhood full of bungalows. There was an elementary school named after Louis Armstrong. There were churches, once full of worshipers, with wooden doors falling off their hinges. There were abandoned homes, baking in bacteria, homes that had once been lovingly tended. You could still see that. But such care was pointless now, tragically so. People, I saw, had loved this place, and soon everything was going to be demolished. The only sound I heard in the Lower 9th Ward was the rumble of dump trucks and the crunching of wood.

The collapse of the distinction between “us” and “them” is the beginning of real documentary work. It is also one of the journeys consciousness is required to take. If you don’t take it, you’re probably a propagandist.

I decided to archive the loss, to memorialize it, before everything was gone. Photographs do that very well. They’re a fine, if modest, consolation. They testify to what has been and what will be no more, and this testimony matters. I hoped my photographs would tell a story about possession and loss, community and separation. I’d photograph only the evidence. Small things. This was someone’s home. These were the bulletin boards in someone’s kindergarten class. Everything I photographed then has since been carted away.

In 2008, on the third anniversary of Katrina, the New Orleans African-American Museum held an exhibit of my Lower 9th Ward photographs. I believe that if the photographs had pretended to know more than they had a right to know, had tried, say, to capture the sorrowful faces of the dispossessed, the invitation would never have been extended.

 

THR: How do you reconcile photography’s mechanical or technical aspects with its potential for expressiveness? Has there ever been an instance in which your expertise as a photographer has failed to capture the moment you saw with the naked eye?

JR: Well, one’s proficiency—as a photographer using technical equipment—improves over time. You learn what you’re doing, and the odds of capturing what you want improve. Your timing gets better. You can anticipate, if not always what Cartier-Bresson called “the decisive moment,” then at least the indecisive moment you’ve been looking for. I know this sounds a little crazy, but on the deepest level of expression, I don’t think it matters if you’re holding a pen or a camera. “I’m an artist,” John Lennon said, “and if you give me a tuba, I’ll bring you something out of it.”

Expertise, however, is different from proficiency, especially in the arts, where it is only approximate. Creativity, of the highest sort, has its seasons. It waxes and wanes. A photographer with expertise, one who consistently produces expert images, is probably a wedding photographer. My goal as an artist is to avoid predictability and to create fresh images. So I have to keep finding a way to slide away from my own expertise, and to realign my sights.

Some photographers take the same photograph over and over, and that requires a kind of expertise. It’s a way of making a living.

 

THR: Can you tell us about what sort of equipment you use? Do you prefer traditional film or digital cameras? If you use a digital camera, have there been any modifications in your methods or your approach?

JR: Well, I should start off by saying that I’ve been shooting with a digital camera for a while now. Probably out of necessity. I spend as much time working on digital prints as I used to spend in the darkroom, but now I don’t have to stand on my bad left foot.

In my case, switching from film to digital was a matter of convenience, and that’s about it. Even though I am using a new technology, the reasons why I take photographs haven’t changed. The digital camera is, really, just a camera, and the world I want to photograph is the same old world. The old challenge remains unchanged: to use my camera to disclose some sort of hidden meaning that lies below our common awareness. A poet’s task, neither more nor less. So I trained myself to look closely for the little thing that nobody was paying attention to, the quiet thing that didn’t want to give away its secret importance. An unmade bed. A chessboard in Tompkins Square after a rainstorm. Something you might walk right by.

I guess I have faith that the actual world, as it is, is enough. It’s my guiding principle. I think that if I move things around in my photographs, arrange expressions, say, or digitally create a dream effect, then I won’t meet the criterion of perception that I’ve set for myself. I want to distill reality, not modify it with software.

Of course I’m describing only one approach to image-making—one that I inherited from a certain time and place. It’s just the way I do things. It’s no better or worse than a hundred other ways of considering and making photographs. It’s just mine.

John Rosenthal’s New Orleans photographs will appear in AFTER: The Silence of the Lower 9th Ward (Safe Harbor Books, forthcoming 2015). See more of his work at www.johnrosenthal.com.

Cowardice and Ebola

A Kikwit, Zaire clinic (1995).

Not long into his new book Cowardice: A Brief History, Chris Walsh, associate director of Boston University’s writing program, notes that his subject has received surprisingly little direct consideration in Western letters and scholarship. Humanity responds to cowards and cowardly acts with an appalled reticence. “Let’s not speak of them,” Dante’s Virgil says dismissively of the hundreds of cowards crowding Hell’s vestibule. The philosopher Kierkegaard, who gave more thought to the subject, observed that “there must be something wrong with cowardliness, since it is so detested, so averse to being mentioned, that its name has completely disappeared from use.”

Not quite completely, of course, but a certain obliquity does characterize most literate reflections on the subject, even though it’s crucial to our conduct—not only in war but in those moments of choice that call for moral courage. Consider the current debate over the appropriate response to the Ebola virus as it spreads beyond its epicenter in West Africa. Does the discourse of cowardice, and its antonyms bravery and courage, play any role in this debate? Should it? Are we being evasive—even cowardly—in refusing to see the debate in those terms as well as in medical-epidemiological or national security ones? Perhaps courage and cowardice raise moral questions that we would rather ignore.

Bringing cowardice into the discussion certainly doesn’t simplify matters. Even if we accept Walsh’s working definition of a coward—”someone who, because of excessive fear, fails to do what he is supposed to do”—we run into problems. One person’s excessive fear is another’s reasonable fear, and defining our duty in any given situation invites qualifications and assorted objections.

Sure, we can easily point to the brave healthcare workers who have thoroughly acquitted themselves in fulfilling their duty, in many cases losing their lives or otherwise going far beyond duty’s call. The British healthcare worker who returned to Africa after recovering from the disease fills us with the highest admiration, yet the extent of his courage may cause some to wonder whether it exceeds reasonable bounds. If we explain away such courage by attributing it to perverse or even vainglorious motives, we may be admitting we hold an idea of duty that lacks firm or clear obligations. Uncomfortable as such an admission may be, we need to understand our own standards of duty so that we know when we may be shirking them.

The question of our own cowardice grows more discomfiting when we consider the most contentious proposed response to the spread of the disease: the imposition of a travel ban that would curtail international commercial flights for people from Liberia and Sierra Leone. There are opposing arguments about the efficacy of such a ban, of course. But is one side more clearly courageous and the other more clearly cowardly?

Again, answers are not easy. First, where is one’s primary duty here—to one’s national security and well-being, or to humanity and the greater global good? We who want to participate fully in a globalized world must know that with its benefits come responsibilities. Is it not cowardice to shirk such responsibilities?

It may be another form of cowardice to rush to answers that are too easy and too quick. Are there, for example, better ways of limiting the movement of infected people than by imposing restrictions on all people who share their nationality? Have we discussed and explored those alternatives? If we say our primary and unqualified duty is to our own national good, are we certain that such a ban furthers that good in the short or long terms?

Courage and cowardice enter into these questions again when we examine the motives for our answers. “Kinds of cowardice can conflict,” notes Walsh, adding that “excessive fear of being or seeming cowardly can lead to cowardice.” These potential conflicts should bear heavily on the minds of our political leaders, who sometimes show more concern for appearances than for substance. Are those political leaders who are now calling for a ban merely pandering to a fearful base or taking yet another opportunity to bash whatever President Obama calls for? Conversely, could President Obama be so fearful of the disapproval of the international community that he might refuse to make certain choices? Is it cowardly not to be cruel to be kind? After all, imposing a ban on the most afflicted nations might move the elites of those nations (the citizens most likely to avail themselves of international air travel) to devote more resources and effort to containing the contagion, in the way that Nigeria’s political elite did.

The discourse of cowardice may not bring easy answers to dilemmas so vexing as this one, but if it brought a little more honesty, that would be no small contribution. The somewhat paradoxical problem with facing up to cowardice, as Walsh’s excellent book shows, is that it usually requires great courage.

 

The Hedgehog’s Array: October 17, 2014

Interesting reads from last week:

“5 Guidelines for Living in a Pluralist Society,” John D. Inazu
“[All of our] common ground tells us surprisingly little about who we are as a people, what our goals should be, or what counts as progress.”

“Realism Is a Figure of Speech,” Joe Fassler
Vikram Chandra: “Realism is not something that is transparent. It’s not just the glass through which we see. It is also a figure of speech.”

“What if Black America Were a Country?” Theodore R. Johnson
“Essentially, what we’re witnessing is a nation that is comparable in certain ways to a regional power existing in the state of Disparistan (or, perhaps, Despairistan).”

“Salon Culture: Network of Ideas,” Andrian Kreye
“For more than a century now the salon as a gathering to exchange ideas has been a footnote of the history of ideas. With the advent of truly mass media this exchange had first been democratized, then in rapid and parallel changes diluted, radicalized, toned down, turned up, upside and down again.”

“Wikipedia, a Professor’s Best Friend,” Darius Jemielniak
“I am a professor who not only recommends that my students use Wikipedia but also encourages them to edit and develop it. However, I am in the clear minority in academia.”

“My Childhood Friend, the ISIS Jihadist,” Jakob Sheikh
“In Amir’s world, the heroes are Sunni extremists fighting for a global Islamic Caliphate. The enemies are the infidels. Weeds to be removed from the face of the earth.”

“S.E. Hinton and the Y.A. Debate,” Jon Michaud
“The author who changed the way that books for teens were written and published has seen her own work go from the spinning wire display rack near checkout to an online marketplace accessible while you wait for your morning latte.”

From our archives:

Mark Edmundson urges us to “Pay Attention!”

 

The Morality of Food—Then and Now

U.S. Food Administration poster, circa 1917; this poster appeared in several different languages so that the food conservation effort would reach the many immigrant groups in America; Wikimedia Commons

Eating healthy, supporting local farmers, buying fair trade coffee—aren’t these all practices we can feel good about? Well, perhaps to a point.

If the consumption of food in our time has become something of a moral matter, sometimes involving self-righteousness and even coercion, it’s helpful to learn that we didn’t arrive here overnight. The moral American foodie has a history, part of which is told by Michigan State University historian Helen Zoe Veit in her 2013 book Modern Food, Moral Food.

Veit’s focus is eating habits in early twentieth-century America, a time when the culture was strongly shaped by rising immigration, changing views of gender and race, greater access to public education, and, perhaps most of all, the advent of the Great War. Soon after the United States entered the war, Herbert Hoover’s Food Administration instituted a number of national programs aimed at conserving food at home so more could be transported abroad to aid the Allies. Among Hoover’s initiatives were Wheatless Monday, Meatless Tuesday, and Porkless Saturday. Some fourteen million Americans, according to Veit, or seventy percent of the population, joined the effort, signing pledge cards and proudly proclaiming their membership in the Food Administration.

This effort worked primarily through voluntarism and a widespread enthusiasm for patriotic gestures on a grand scale. (There were also plenty of snoops who reported the noncompliant.) It also helped that Americans became increasingly receptive to meatless foreign cuisine, especially Italian pasta and other vegetable-only dishes. Indeed, pasta was ideal for food conservation because it was made from semolina flour rather than the wheat needed for the war effort. Housewives boasted of their sacrifice and ingenuity in the kitchen, and publishers churned out cookbooks on fleshless eating and “the rational dietary.” Cannily, the Food Administration, public leaders, and the media further reinforced the effort by connecting voluntary fasting with democratic self-determination, individual self-control, national willpower, and moral urgency.

It is interesting to contrast the motivations behind the World War I food conservation efforts with those behind today’s food-related activism. We are all foodies now, it seems. New concepts like fair trade or responsibly sourced foods vie with the return of traditional practices such as  canning and pickling. Whether you require a social agenda with your fruits and vegetables or just a wide selection of cereal and chips, there are grocery chains to suit every shopper. In the university town where I live, there are three national chains, three well-known specialty grocers, several brave independent markets, weekly farmers’ markets, and a number of “provisioners”—not to mention big box retailers with grocery departments. This unprecedented array of choices would have had early twentieth-century food reformers sputtering in consternation at our pampered and picky appetites.

Where has this food revolution taken us? While it has opened up our eating and shopping choices, it has also reintroduced us to the moral aspect of food. And this being a media-saturated age, morality often piggybacks on star-power. First Lady Michelle Obama wants to combat the sense of failure encountered by overweight children with her Let’s Move program and its laudable, if grandiose, ambition of “[solving] the challenge of childhood obesity within a generation, so that children born today will grow up healthier and able to pursue their dreams.” Ex-Beatle Paul McCartney asserts that eating meat contributes to greenhouse gases (presumably he means not just the methane emissions of cattle and other livestock, but also the carbon footprint of the meat industry in general). Desperate Housewife Eva Longoria wants to demolish the gender bias of food by opening SHe, a female-oriented steakhouse featuring “he-cuts,” “she-cuts,” and “we-cuts.” Surely, no one would want to ruin the dreams of children, destroy the atmosphere, or oppress women simply in the name of freedom of eating?

And then there are the food glitterati: celebrity chefs with their own cable shows and stars puffing the latest fad diets. The obligatory book contracts and the cross-country appearance tours ensure that all Americans will have access to what it takes to cook like the Naked Chef while staying sexy on the Beyoncé diet. Much of this celebrity food activism is aimed at improving public health and the environment, but it often verges on coercion, both psychological and, increasingly, legal—think Michael Bloomberg and his battle against trans fats and soft drinks. Today’s message about moral eating behavior is a rather muddled one, equal parts accessibility, self-expression, self-control, and overeager civic-mindedness.

Oddly, we find ourselves at a time when food activism is both radical and mainstream. The label “food elitist” or “foodocrat” may be hurled as an insult or worn as a badge of honor. The foodocracy in which we now find ourselves gives people permission to lecture others, often with overbearing rudeness, about what makes some food choices superior and others inferior. Having your latte the way you want it may happen in Starbucks, but don’t expect the same accommodation in the boutique coffee house in a gentrifying neighborhood. In our game of organic one-upmanship, we have all become complicit in encouraging the kind of food elitism that also strips us of what one blogger called “gustatory freedom.”

Then comes the recent story of Meatless Mondays at schools in the Sarasota County (Florida) School District. This program, part of Johns Hopkins University’s Monday Campaigns, dedicates the first day of the week to different health initiatives: quitting smoking, exercise, safe sex, and eating right. Meatless Monday drew its inspiration from the meatless and wheatless programs of the World War I-era Food Administration. Just as the war-time food conservators strove to convince Americans that living without meat was the patriotic choice, so Florida school officials hope to convince schoolchildren that a bun without a burger is worthwhile for “personal health and for the health of the planet.” The difference this time around is that the international menu choices in Florida—hummus, fiesta taco salad, and spaghetti marinara—will be an easy sell to today’s kids. Much has changed since 1911 when the Van Camp Company advertised that its canned spaghetti was made with the same sauce as its pork and beans in order to convince consumers that foreign food was good.

What hasn’t changed is the use of food as propaganda. Whether little Johnny in Florida buys into Meatless Monday for his health, for the environment, or just because he forgot to bring a sandwich from home, he will have his food choices limited—and dictated—by those in authority over him. The lesson here is that others know better than you and they will pass legislation (or institute programs) to convince you of it. Perhaps we are what we eat after all.

Leann Davis Alspaugh is managing editor of The Hedgehog Review.

Recognizing the Adult in the Mirror

“The Life and Age of Man, Stages of Man’s Life, from the Cradle to the Grave,” by James Baillie

Richard Linklater’s Boyhood was this summer’s critical hit, achieving for a time a much-coveted 100 percent fresh rating at Rotten Tomatoes. (It now sits at 99 percent). The film follows a young Texan named Mason as he grows from a quiet child to a disaffected teen, ending when he becomes a college freshman. All of the time passed is real—Mason grows up over twelve years, which is how long the film took to make—and Linklater fills the background with references to remind the audience that these twelve years have passed for them, too. Everybody’s gotten older.

But nobody is growing up. Boyhood is a “coming of age” story only in the most formal sense. There’s no age to come into, no adulthood to achieve, and no adults to be found. Mason’s life is full of older people who burden him with clichéd advice, but they, too, are merely drifting from one event to another without really knowing why.  As Mason’s mother sends him off to college, she unloads her self-pity, telling Mason that raising him was her last “milestone” and that now all she has left is waiting for death. “I thought there would be more,” she says.  But if Mason has learned anything from his elders, it’s not to expect even that much.

This aimlessness is made pointedly clear in a scene in which Mason visits his step-grandparents for his fifteenth birthday. From them, he receives a suit, a Bible, and a gun. They are meant to be signs of his adulthood. But we know, and Mason knows, that he will never touch any of them. They are simply relics of an old way of being in the world, and not one that he wants or even can choose.

Boyhood did not come up in A.O. Scott’s recent essay for the New York Times Magazine, “The Death of Adulthood in American Culture.” It should have. As if to reinforce Scott’s claim that there are no models for adulthood, Boyhood offers a sustained look at what that might mean—and it’s anything but attractive.

Yet as compelling as Boyhood is, there’s also something false about it. Mason seems predestined to his own disaffected life in a way that real people generally aren’t. Similarly, although Scott’s essay is usefully provocative, there’s something missing in it as well—namely, any clear articulation of the idea of adulthood that he claims has been lost.

“What is Liberal Education For?”: A Preview

Aristotle teaching Alexander the Great, Charles Laplante (1866).

This week, I’ll be presenting a paper at “What is Liberal Education For?,” a conference being held at St. John’s College, Santa Fe. Lasting three days, it will have some twenty-eight panels and include presentations by scholars such as Boston University’s Christopher Ricks, Institute for Advanced Studies in Culture fellow and author Matthew Crawford, and philosopher Roger Scruton, whose lecture “Architecture and Aesthetic Education” will close the proceedings.

Here is the conference’s statement of purpose:

We raise this question [What is Liberal Education For?], recognizing that liberal education and the great tradition of the American liberal arts college have been put on the defensive of late. Small colleges across the nation have to make their case to students, to their parents, and to the public more urgently than ever. The causes of this crisis have been analyzed extensively: there is an emerging consensus that the rapid growth of consumerism amidst new economic challenges, and the fragmentation of general studies driven by professional training and specialization in the universities, have led us to undervalue drastically the humane goals of liberal studies. These causes are themselves symptomatic of a deeper crisis in our time, a crisis of uncertainty and disorientation affecting every field of human endeavor—scientific, social, intellectual, artistic, and spiritual. Precisely in response to this crisis, liberal education can reaffirm its relevance and purposes.

My own panel, “Liberal Education: Changing Conversations,” is focused on the rhetorical arguments for liberal education. My paper, “Liberal Education in a Specialized Age,” considers the case that can be made for unspecialized education in an economy that—on the surface at least—demands specialization and views education as job training.

There are reasons to suspect that this narrative is untrue, or at least extremely incomplete—witness the rise of the service economy. But I think it is true that we take for granted that specialization is a good and that education ought to accommodate the marketplace by helping students to specialize sooner and more adeptly. We take these things for granted even if the facts around us aren’t bearing them out.

After Strange Gods: Peter Thiel and the Cult of Startupism

Peter Thiel at TechCrunch50 (2008). Image from Wikimedia Commons

In the humanities departments of the university where I live and work, the word “corporate” is an epithet of disdain, and “entrepreneurship” is code for “corporate.” My fellow humanists tolerate the business school because it provides fuel for the English composition classes that keep us tenured radicals employed.

A confession: I used to share that outlook myself. But the experience of working alongside actual entrepreneurs and CEOs of various stripes shattered my comfortable assumptions. Not only did I find that entrepreneurs are willing to take risks that I would never hazard; I also learned that many are keenly interested in the world of ideas, theory, and “big picture” thinking. Indeed, such philosophically inclined entrepreneurs excel at practical wisdom—what Aristotle called phronesis—precisely because their imaginations have been nourished by contemplation. They are philosophers of a kind I will never be.

Prominent among these philosopher-entrepreneurs is Peter Thiel. A co-founder of PayPal and Palantir, he has become a Silicon Valley guru, the contrarians’ contrarian. His new book, Zero to One: Notes on Startups, or How to Build the Future, began as notes taken by an admiring student, Blake Masters, who took Thiel’s course at Stanford. It is an ambitious book—it could even be described as Machiavelli’s The Prince as re-imagined for our startup age—and it puts Thiel’s command of the philosophical canon on prominent display. How many other business books at the airport bookstore draw on Hegel, Nietzsche, Aristotle, John Rawls, and René Girard?

Zero to One is aphoristic, biting, forthright, and at times, in the spirit of Machiavelli, ruthless. Thiel unapologetically commends the pursuit of monopoly (“the more we compete, the less we gain”), and then counsels noble lies to hide its achievement. He casts aspersions on the bureaucracies of existing organizations: “Accountable to nobody,” he writes, “the DMV is misaligned with everybody.” And he calls out bad ideas, particularly those coupled with shoddy execution. His take-down of failed federal investment in clean technology is well worth the cost of the book.

Thiel’s intellectual reach is anything but modest. He offers a sociology of creativity, a grand theory of human civilization, and even a sort of theology of culture—though it is not quite clear whom he casts as God. Indeed, it’s over the grandiosity and hubris of Thiel’s claims that I find myself parting ways with the more fawning reviews of his book. I realize that creative risk-taking requires a healthy dose of self-confidence that can often come across as arrogance. What worries me, though, is not his confident dispensing of practical wisdom but the hubristic evangelizing for what might be called startupism.

A cult of creative innovation, startupism has four notable  features, beginning with the outsized role it accords to human creativity. As early as page two of the book, Thiel tells us “humans are distinguished from other species by our abilities to work miracles. We call these miracles technology.” His emphasis on the creative power of human making is laudable and timely, though not particularly new. (Thiel should add Giambattista Vico to his reading list.) What’s unique to startupism is the “miraculous,” god-like powers Thiel attributes to us mortals: “Humans don’t decide what to build by making choices from some cosmic catalogue of options given in advance; instead, by creating new technologies, we rewrite the plan of the world.” We command fate. “A startup is the largest endeavor over which you can have definite mastery. You can have agency not just over your own life, but over a small and important part of the world. It begins by rejecting the unjust tyranny of Chance. You are not a lottery ticket.”

Second, the creativity celebrated by startupism blurs the old distinction between Creator and creature. What Thiel calls “vertical” or “intensive” progress isn’t 1+1 development; truly creative, intensive progress is a qualitative advance from 0 to 1. I believe the Latin for that is creatio ex nihilo. (And “[t]he single word for vertical, 0 to 1 progress” is…you guessed it…“technology.”)

Third, as you might expect, startupism has its own ecclesia: the new organization founded by a noble remnant who have distanced themselves from the behemoths of existing institutions. “New technology,” Thiel observes, “tends to come from new ventures” that we call startups. These are launched by tiny tribes that Thiel compares to the Founding Fathers and the British Royal Society. “[S]mall groups of people bound together by a sense of mission have changed the world for the better,” he explains, because “it’s hard to develop new things in big organizations, and it’s even harder to do it by yourself.” We shouldn’t be surprised, then, that “the best startups might be considered slightly less extreme kinds of cults.” The successful startup will have to be a total, all-encompassing institution: our family, our home, our cultus.

Finally, in startupism, the founder is savior. Granted, Thiel—following Girard—is going to talk about this in terms of scapegoating in a long, meandering chapter that aims to associate successful Silicon Valley geeks with pop stars and other people we like to look at. But it’s not just that founders are heroes in their companies. The scope of their impact is much wider: “Creative monopolists give customers more choices by adding entirely new categories of abundance to the world. Creative monopolies aren’t just good for the rest of society; they’re powerful engines for making it better.” But to get there, Thiel says, “we need founders.” No founders; no progress. Steve Jobs, hear our prayer.

Thiel offers genuine, authoritative insight into entrepreneurship and the dynamics of a startup organization. When he tacitly suggests that society derives its crucial and even salvific dynamism from the startup, I become both skeptical and nervous. Can startups contribute to the common good? Without question. Are startups going to save us? Not a chance.

Thiel’s hubris stems from a certain parochialism. Startupism is a Bay Area mythology whose plausibility diminishes by the time you hit, say, Sacramento. The confident narrative of progress, the narrow identification of progress with technology, and the tales of 0 to 1 creationism are the products of an echo chamber. This chamber fosters hubris among the faithful precisely because it shuts out competing voices that might remind them of the deeper and wider institutional, intellectual, and even spiritual resources on which they depend and draw. We are makers, without question, but we are also heirs. We can imagine a different future, but we have to deal with a past that was created by others before us.

Thiel, and the New Creators like him (and get ready for a slew of parroting Little Creators coming in their wake), have drunk their own Kool-Aid and believed their own PR. It’s why all the sentences that begin “At PayPal…” grow tiresome and make you wonder why someone who developed a new mode of currency exchange thinks he brought about the new heaven and the new earth ex nihilo. One can applaud Thiel’s elucidation of creativity and innovation while deploring the (idolatrous) theology in which he embeds it. We need startups. We can do without startupism.

James K. A. Smith is a professor of philosophy at Calvin College, where he holds the Byker Chair in Applied Reformed Theology and Worldview. He is the author of Who’s Afraid of Postmodernism? and How (Not) to Be Secular: Reading Charles Taylor, among other books. Smith is also the editor of Comment magazine.

Looking Beyond the New Numbers on Poverty

The U.S. Census Bureau has released the income and poverty figures for Americans in 2013, and they offer at least a glimmer of light in an otherwise quite somber landscape. The glimmer is the poverty rate, which fell from 15 percent in 2012 to 14.5 percent in 2013, with statistically significant improvements occurring, notably, among children and Hispanics.

Left: Native American man and child with American flag by Lauri Lyons/Photonica World, Getty #102271790

The gloom—although it’s not really news at this point—comes from the absence of the kind of improvements Americans have been looking for since the end of the 2008–2009 recession. The number of poor people (45.3 million) is basically unchanged from the numbers of the two previous years. And while the situation for many children may be improving, the poverty rates for youth transitioning into adulthood (ages 18–24) and for young adults (ages 25–34) were no better than those of the previous year—and this despite rising education levels among these two key cohorts of the workforce. Education is still America’s great engine of upward mobility, of course, but high-school degrees and even college diplomas are not helping as many Americans move in the right direction. (And indeed the debts incurred through hefty college loans are leaving many struggling grads in dire financial straits.)

Extrapolations from the data reveal further disturbing trends. According to analysis by the Brookings Institution, for instance, the number of poor people in the nation’s 100 largest metropolitan areas has grown by 10 million between 2000 and 2013, with the suburban poor population growing twice as fast as the number of urban poor. Brookings researchers Elizabeth Kneebone and Natalie Holmes summarize the pattern this way:

Four years into the recovery, America’s metro areas—like the nation overall—had achieved only modest progress toward reducing poverty to its pre-recession levels. Where gains did occur, they tended to happen in big cities, further accelerating a long-term trend in the suburbanization of U.S. poverty and the challenges that accompany it.

But more troubling than all of this is an underlying economic trend that puts more and more individuals and families on the edge of poverty: the continued flattening of real wage growth for most American workers. Indeed, since 2000, according to the Economic Policy Institute, real wage growth has been negative for those in the bottom 30 percent of wage distribution, a reality that puts many workers only one missed paycheck away from falling below the poverty line.

Numbers and trends such as these are common fodder for the debate over who’s winning or losing the War on Poverty, a campaign launched fifty years ago by President Lyndon B. Johnson. For reasons I discussed in an earlier blog post, the overworked war metaphor has itself become a problem, whether used by liberals in dedicating themselves to the struggle against poverty or by conservatives in describing the failure of Big Government to defeat it.

In addition to creating unrealistic goals and expectations, the war metaphor ends up placing even more stigmas on the poor, stigmas that are used by some to blame the poor for their poverty and by others to deny them responsibility or agency. The stigmas also contribute to the related and reflexive view that poverty is exclusively a problem of others, not equally one of the larger society or our political economy.

How, then, do we think beyond the numbers, the metaphors, the stale single-cause explanations? The fall issue of The Hedgehog Review invites readers to consider the ways we have come to think about, act on behalf of, depict, or judge those people whose only truly common distinction is their limited—sometimes desperately limited—means. We ask readers to question whether their own assumptions and perceptions are themselves contributing to an increasingly divided commonwealth, in which zero-sum thinking seems to be the guiding principle of our political economy.

One metaphor often used in discussing the poor—the trope of the “invisible poor”—receives much-needed scrutiny from several contributors to the fall issue, including literary scholar John Marsh. Tracing a long and distinctive line of authors who saw themselves as exposing the hidden realities of poverty in order to enlighten and encourage the sympathy of better-off readers, Marsh questions whether the project of class discovery didn’t quickly run its legitimate course. After all, despite repeated fictional and journalistic exposés, the alienation between the well off and the poor seemed only to grow. A writer such as James Agee, in Let Us Now Praise Famous Men, for example, struggled with feelings of bad faith, worrying that his writing was becoming, in effect, “poor-nography,” an exploitation of the very people whose lives he was trying to document and enter into.

Exposing the invisible poor may have become an exhausted moral and literary trope, but it is hard to abandon it when the reality of the last four decades has been a growing separation between the classes, with less interaction and fewer places of intersection between the middle and upper classes, on one side, and the less well off, on the other. Conservative social critic Charles Murray in Coming Apart describes a radical divide between what he terms the new upper class and the new lower class, a divide that has effectively destroyed a common civic culture through which values were once broadly inculcated and reinforced.

Looking beyond that widening divide, several of our authors describe the places and institutions where the poor, whether working or not, are finding support and community, making connections, pulling themselves up by the bootstraps. But as noted author and educator Mike Rose explains, many of the strongest institutional supports for the less well off—the public library among them—are themselves in a precarious condition. Today, too many Americans simply fail to appreciate the multiple benefits that a well-supported local library brings to people who have very little cultural, intellectual, and even social infrastructure on which to rely.

Social research and resulting public policies themselves contribute to misunderstandings about the poor and poverty, argues historian Alice O’Connor. By trying to locate a single pathology or set of root causes (whether cultural, biological, or even neurological), American social analysts repeatedly advocate policies that aim to “fix” people while neglecting or even ignoring the wider and deeper social and economic forces that make poverty, in the minds of some people, an acceptable and even necessary component of an efficient, competitive economy. In short, says O’Connor, “poverty research has become caught up in a paradox of its own making—of diminishing insight into the problem of poverty amid more, and more intimately detailed, data about the poor.”

O’Connor also cites sociologist Mark Rank’s recent observation that economic hardship has become a widespread, diversified, and even “mainstream” experience affecting a majority of Americans during the course of their lives. In a first-person account, critic and novelist William McPherson explores his own descent into such hardship, recounting some of the ways that poverty complicates almost every aspect of daily existence. Anthropologists Michael and Ines Jindra, in their report from the world of the safety net, provide a glimpse into the many predicaments and problems that drive a diverse assortment of people to seek help from various independent, nonprofit assistance agencies. They also bring to light the concerns and challenges of the people who work in these agencies. “Taken together,” the authors say, “these and other nonprofits provide a window on some of the colliding and commingling subcultures that make up the kaleidoscopic world of the poor.”

Photographic depictions of the poor have also played a crucial role in America’s discourse on poverty, and the fall issue of THR honors that body of work through a photo gallery and essay that features the work of many of our outstanding photographers and documentarians. These are images that capture the granular details of hard lives, attesting as much to strength and resilience as to struggle and weariness.

“Thinking About the Poor,” the fall issue of The Hedgehog Review, will be available November 1 at select Barnes & Noble bookstores or by ordering online. Subscribe in print ($25 one year) or digital ($10).