Tag Archives: culture

All for the Now—or the World Well Lost?

Costume designs for the libation bearers, London Globe production of “Oresteia,” 2015.

Riding with my middle-school daughter a while back, I heard one of her favorite pop songs on the radio and called it to her attention; she sighed and said, "Oh, Mom, that's so two minutes ago." Apparently, the song was popular last year and, therefore, no longer qualifies as one of her current favorites. Then she added, in exasperation, that the expression itself was outdated. It was popular a year ago, back when she and her friends used to parody the lingo of certain "popular, mean girls" from film or TV. So, a year ago is now "old"? Or two minutes? The whole conversation made me wonder: What kind of a sense of the past do our children grow up with today, and how does it shape our attitudes toward history?

That question emerged in a different way when my son started high school this year. As an academic observing his orientation, I was keenly interested in this introduction to the curriculum. Of all the things I learned, however, the most surprising was that his curriculum requires only one year of history to graduate. Three and a half years of physical education are required. Three to four years of English are essential, as are three years of math. But students at my son’s school can graduate with only one year of history, and US history at that. Even in his first-year English course, where students are required to read only three literary works during the entire academic year, two of the three were written in the last sixty years. In other words, there’s not much distant history in his English curriculum either.

This also squares with trends at the small liberal arts college where I teach. Enrollment in history courses is down. The history department's faculty allocation has recently been cut. Even in the English department, where enrollment numbers are strong this year, our historically oriented Renaissance literature line is being suspended due to budgetary adjustments, no doubt to make way for faculty positions in programs like biochemistry, molecular biology, and business. What this means is that my department will soon be without a permanent member who specializes in the period of the greatest flowering of literature in English.

And this dearth of expertise in the historical humanities is evident across the college. When I count the total number of pre-nineteenth-century historical humanities positions at my college, considering fields such as art history, philosophy, theater, and religion, I find that only five percent of all full-time, permanent faculty members have expertise in such areas.

Is it any wonder, then, that young people often have a limited sense of the past, unable to place even watershed events such as the Protestant Reformation or the French Revolution or identify major historical time periods? Not long ago, my son returned home from middle school to boast that, unlike his peers who were hooked on techno-pop, he'd suddenly become interested in "more medieval music"—"You know, Mom, like Simon and Garfunkel, the Beatles, ELO." I'll give him a pass for being only twelve at the time, but I'd suggest that this historical illiteracy is more common—and more costly—than we might think.

Why should teaching the past matter? It matters because teaching any pre-modern culture exposes students to ways of being that may be alien to them, a form of ontological diversity just as important as the more familiar kinds we hear so much about today. Many years ago, in a lecture at my college, the classicist Danielle Allen argued that education is fundamentally about knowing the foreign. Like Allen, I share that conviction and, in my own courses, daily ask students to explore the foreign battlefields of Homeric Troy or to inhabit the psychological terrain of Augustine. Both the Iliad and the Confessions offer examples of imaginative mindscapes as foreign to many students as any far-flung land they might visit on a study-abroad trip. And such foreign intellectual encounters, so familiar in early literature and history courses, help students cultivate virtues such as empathy and tolerance.

Tracing the decline and fall of the Roman Empire, distant as it may be, reveals the dangers of overreaching imperial powers, the perils of resources stretched thin, and the consequences of growing economic disparities—none of which are problems confined only to the ancient world. As the historian Timothy Snyder observes in his brief wonder of a book On Tyranny, “Americans today are no wiser than the Europeans who saw democracy yield to fascism, Nazism, or communism in the twentieth century. Our one advantage is that we might learn from their experience.”

Although Aeschylus's Oresteia brilliantly dramatizes the triumph of democratic processes of justice over vendetta-style retribution, it also displays the pernicious roots of patriarchy, with the Olympian gods themselves legitimizing male rule over female, as Apollo exculpates Orestes by claiming that the mother isn't really the parent, only the seed bed, while Athena chimes in, professing to "stand by the man" despite being a woman. Likewise, Shakespeare's The Merchant of Venice, a comedy that turns on an act of mercy, also illuminates darker themes such as anti-Semitism and ethnic stereotyping.

History also teaches us that the pursuit of knowledge is often a digressive process. Unlike the natural sciences, where knowledge generally advances in linear fashion, with experimentation and research yielding new insights that replace previous conclusions, humanistic knowledge proceeds haltingly. In the natural sciences, one often draws the conclusion that new knowledge is better than old knowledge. In the humanities, we value the ancient, the antique, the quaint, and the outmoded, all in the interest of thickening and enriching our understanding of human life.

While much of that life has involved regrettable episodes, history reminds us of what it means to be questing and creative and to transcend the limits of our human predicament, as Julian of Norwich or Galileo or Mary Rowlandson once did. Studying the past can also ease the feelings of isolation that many young people in contemporary America report as their greatest fear. Further, today's younger generation may learn resilience, courage, and fortitude through an imaginative engagement with the people of the past.

I have been haunted by lines from a poem I recently read in Carmen Giménez Smith's Cruel Futures, lines that playfully extol "Disorder in exchange/for embracing the now." Although Giménez Smith's short poem vindicates that disorder by focusing on personal rather than collective, historical knowledge, those lines have left me wondering about the public implications of such an "exchange." When, as a society, we "embrace the now" at the expense of the past, what sort of disorderly deal might we be making? I'm thinking here of, for example, the generally low level of civic participation in the United States. Might this indicate that we have become complacent about our history, forgetting the arduous efforts of a small group of patriots and visionaries, preferring instead the promises of Silicon Valley entrepreneurs and charismatic "thought leaders"?

In the academic world where I work, I often hear “That is the way of the past; this is the way of the future,” as if the past were to be regarded as a mere discarded image, disconnected from the priorities of the omnipresent “now.” As educators, we ought to remain wary of such facile dismissals of the past and be vigilant in refuting this kind of chronological snobbery, to borrow a phrase from C.S. Lewis and Owen Barfield. The wisdom and well-being of our young people and our civilization depend on historical knowledge. Otherwise, we may one day find ourselves victims of a “cruel future,” one in which ignorance of past problems condemns us to inevitable repetition of them, and where blindness about historical insights prevents us from seeing wiser paths forward.

Carla Arnell is associate professor of English and chair of the English Department at Lake Forest College.

. . . . . . . .


“Putting the Soul to Work”: Reflections on the New Cognitariat

What color is your parachute? Image via Wikimedia Commons.

I don't know how careers are seen in other countries, but in the United States we are exhorted to view them as the primary locus of self-realization. The question before you when you are trying to choose a career is to figure out "What Color Is Your Parachute?" (the title of a guide to job searches that has been a perennial best seller for most of my lifetime). The aim, to quote the title of another top-selling guide to career choices, is to "Do What You Are."

These titles tell us something about what Americans expect to find in a career: themselves, in the unlikely form of a marketable commodity. But why should we expect that the inner self waiting to be born corresponds to some paid job or profession? Are we really all in possession of an inner lawyer, an inner beauty products placement specialist, or an inner advertising executive, just waiting for the right job opening? Mightn’t this script for our biographies serve as easily to promote self-limitation or self-betrayal as to further self-actualization?

We spend a great deal of our youth shaping ourselves into the sort of finished product that potential employers will be willing to pay dearly to use. Beginning at a very early age, schooling practices and parental guidance and approval are adjusted, sometimes only semi-consciously, so as to inculcate the personal capacities and temperament demanded by the corporate world. The effort to sculpt oneself for this destiny takes a more concerted form in high school and college. We choose courses of study, and understand the importance of success in these studies, largely with this end in view.

. . . . . . . .


Beethoven and the Beef Jerky Maker

Klaus Kammerichs, Beethon (1986). Material: concrete, after Joseph Karl Stieler's 1819 portrait of Beethoven. Location: Beethovenhalle, Bonn. By Hans Weingartz (own work) [CC BY-SA 2.0 de], via Wikimedia Commons

A few years ago, a Boston-based recording studio was entrusted with an especially challenging remastering project. As box after box of reel-to-reel tapes, the classical music archives of RCA’s Living Stereo label, began arriving, the studio engineers had to figure out how to get usable material off the fragile tapes and transfer it to compact discs. The tapes had been stored properly, but the glue that held the magnetic medium on the acetate tapes had become sticky. Unspooling the tapes in order to thread them through a playback machine could destroy them entirely.

Introduced in the 1950s, the Living Stereo label brought stereo recordings in the form of long-playing records (LPs) into the mainstream. With a postwar boom in home record players with stereo sound reproduction capability, music lovers could enjoy affordable LPs featuring some of the world’s greatest orchestras in the comfort of their own living rooms. Living Stereo, along with rival CBS Masterworks, changed more than just how people listened to music. Heard on radio programs and used as music education tools, these recordings set new musical standards for amateurs and professionals all over the world. Conductors like Charles Munch and violinists like Jascha Heifetz became household names whose interpretations have continued to have an indelible impact on music.

In the early 2000s, many record companies began to dig deep into their back catalogs, looking for a way to monetize their holdings. Transferring Living Stereo archives to CDs would give audiophiles greater access to historic recordings and provide a profitable boost to the record companies. But none of that would happen if the reel-to-reel tapes couldn't be rescued. Finding a machine to play the tapes on wasn't a problem—the studio in Boston is a veritable museum of audio technology. But even unspooling the sticky tapes carefully by hand could cause irreparable damage. The solution turned out to be a humble beef jerky maker. The appliance's round shape is exactly the size of a tape reel, and its dehydration settings reach just the right temperature to "cook" a tape without damaging it. The studio went on to transfer scores of Living Stereo tapes, bringing Beethoven, Brahms, Bartók, and a host of great orchestras back from acetate oblivion.

. . . . . . . .


What Is It About Culture?

Wood engraving after a drawing by Jules Gaildrau, 1857. Old Book Illustrations.

When the word culture was selected as Merriam-Webster’s word of the year for 2014, we at The Hedgehog Review took notice. Culture, after all, is our game, and the fact that more and more people are apparently puzzling over its meaning struck us as a matter of some, well, cultural interest.

Merriam-Webster's editors base their selection solely on which word sees the biggest increase in lookups on the dictionary's website over the course of the year. So what was it about culture that occasioned so much lexical befuddlement in 2014? The editors tried to explain:

Culture is a big word at back-to-school time each year, but this year lookups extended beyond the academic calendar. The term conveys a kind of academic attention to systematic behavior and allows us to identify and isolate an idea, issue, or group: we speak of a “culture of transparency” or “consumer culture.” Culture can be either very broad (as in “celebrity culture” or “winning culture”) or very specific (as in “test-prep culture” or “marching band culture”).

Searching for deeper significance, Joshua Rothman at The New Yorker saw the uptick in interest as a sign that people were finding the word "unsettling," and possibly more negative than positive in its connotations. "The most positive aspect of 'culture'—the idea of personal, humane enrichment—now seems especially remote," Rothman surmised. "In its place, the idea of culture as unconscious groupthink is ascendant."

If culture has acquired a “furtive, shady, ridiculous aspect”—from the ritualized business and bureaucratic uses of the word (“corporate culture”) to the trivially commercial (“celebrity culture”) to the identity-oriented (“gay culture”) to the sinister behavioral (“rape culture”)—Rothman has found at least one thing to cheer about all this culture talk:

“Culture” may be pulling itself apart from the inside, but it represents, in its way, a wish. The wish is that a group of people might discover, together, a good way of life; that their good way of life might express itself in their habits, institutions, and activities; and that those, in turn, might help people flourish in their own ways.

In these less-than-joyous times, one wants to endorse the positive wherever one can find it. But Rothman comes to his optimism a little prematurely. The proliferating uses of culture may indeed suggest a growing, if inchoate, popular awareness that cultures, in the deepest sense, are those symbolic "webs of significance" (in anthropologist Clifford Geertz's words) that provide humans with meaning and moral order. But we suspect this awareness grows out of a troubled sense of what is, at once, so powerful and insufficient about the deep culture of modernity: namely, its almost exclusive celebration of the autonomous individual loosed from all strong commitments and guided only by his or her (consumerist) appetites and preferences. Social critic Philip Rieff famously dubbed ours an "anti-culture," and its deficiencies, including its therapeutic ethos, partly account for the multitude of mini-cultures that have emerged from it. These mini-cultures, based on everything from ethnic identities to hobbies to life-stages, do indeed "identify and isolate an idea, issue, or group." But the affiliations and ties provided by such mini-cultures are ultimately fragile and contingent. None challenge the sovereignty of the individual and his or her elective affinities.

There are those who celebrate the proliferation of cultures, believing that a hundred flowers blossoming are far preferable to a single hegemonic garden. But it seems to us that a true culture—broad but not monolithic, sustained through deep commitments and the cultivation of virtues, but neither static nor rigidly hierarchical—is the only thing that has the potential to connect, and sometimes even unify, fractious humans living in a shared society and polity. Without such a culture, we are reduced to arbitrating our differences solely through such mechanisms as bureaucratic process or the law (and, in the latter case, placing an increasingly heavy, if not impossible, burden on the law’s finite resources).

The proliferation of mini-cultures within our larger anti-culture is but one feature of the modernity that we examine, in one way or another, in each issue of The Hedgehog Review. In the forthcoming spring issue (appearing around March 1), we examine the contemporary "culture of transparency"—in which everything about us is revealed, and everything about us is used in tracking, appealing to, and even shaping us—and its place in our increasingly insistent "information culture." Those two related mini-cultures merit a gimlet-eyed examination lest we accept their implicit meanings and moral claims without recognizing their limited and possibly dehumanizing consequences.

Jay Tolson is editor of The Hedgehog Review.

. . . . . . . .


Miss Manners and Mr. Manspreader

Emily Post (author of Etiquette), 1923. Image from Wikimedia Commons.

Let's get this out of the way: Whoever came up with "manspreading" should never be allowed to coin another word again, no matter how sorry he or she is. It's a terrible word, and I cannot endorse it, but somebody made it—so here we are.

The "manspreader" is a man who sprawls in his seat on the bus or on the subway, thus taking up two or three seats instead of one. Such men have become the target of a recent campaign by New York's Metropolitan Transportation Authority, whose "stop the spread" posters take their place alongside helpful reminders to stand up for the elderly and to refrain from molesting others. The phenomenon, for the record, is real enough: The manspreader takes his place alongside many subway pests, like "the person who leans against the subway pole" (also a target of this campaign, along with "women who put on makeup on the subway"). Until they began to be publicly shamed, these men did not have a friend in the world, at least so long as they were on the subway.

Now, however, they have many friends—but, on the whole, not very good ones. They range from Rich Cromwell at The Federalist ("The Rabid Equality Crowd Finally Outright Admits They Hate Testicles") to David Covucci at BroBible to Katherine Timpf at National Review. All of these writers share a contempt for the entire idea of the MTA campaign, which they interpret, variously, as an act of aggression against men and an elaborate performance of feminist grievance. "Stop the spread," rather than being an appeal for good behavior, seeks, in their view, to impose a "rabid equality." Such demands for "equality," in other contexts, are seen as the speech of "bullies," "neo-Victorians," and "puritans."

Puritans? Maybe. Insofar as these feminist critics believe that most of our actions take place in an ethical sphere, they are puritanical in the most praiseworthy sense. Those who care about social mores—a description that fits at least a few of these columnists—share with them, at the very least, an interest in manners. And manners, in America, are also a Puritan endeavor: Arthur Schlesinger Sr. opens his history of American etiquette books, Learning How to Behave, with a discussion of Puritan legal codes. Today, however, when considering how to behave in public, one is more likely to turn to a source a little nearer to hand: Emily Post's Etiquette, now in its eighteenth edition.

. . . . . . . .


Color Commentary

Pantone’s color campaign for its 2015 Color of the Year, Marsala (photo: Pantone)

To introduce Marsala, the 2015 Color of the Year, Pantone Color Institute executive director Leatrice Eiseman pulled out all the stops: “Marsala enriches our mind, body and soul, exuding confidence and stability. Marsala is a subtly seductive shade, one that draws us in to its embracing warmth.”

Who knew that a color could do so much?

Pantone, the color house used by graphic designers and color-trend followers, makes a big splash every year by naming the shade to watch. Immediately after the announcement, clothing manufacturers, makeup companies, home décor designers, and fashionistas set to work blogging and chatting about the new possibilities of glamor and sensation opened up by the color.

Pantone's own website introduced Marsala, a deep burgundy red, earlier this month with the full fashion-magazine treatment. A series of photographs incorporating color-coordinated clothing, makeup, food, and fabrics showcases the obligatory gorgeous models—exotic men with stubble, beautiful women with chunky accessories—all cavorting in a to-die-for apartment drenched in the shades of wine. One photo even shows a wine-colored playscript cover of "A Midsummer Marsala Dream."

In early December, the Wall Street Journal featured a Marsala mash-up including ties, jeans, handbags, necklaces, plates, and—for the girl who has everything—wine-colored mascara and brow enhancer. Naysayers like Tanya Basu in The Atlantic found little comfort or warmth in a color that made her think of dried blood or rust. Pantone’s models may be sampling wine and pomegranate seeds, but Basu saw industrial carpets and dorm rooms. And she was not alone in her low regard for the color. The Cut blogger Kathleen Hou pouted that the color was “icky” and “makes you want to go to Olive Garden.”

The decision to tout one color as The Color is not taken lightly. Colors and their names convey a range of emotions, make intangible impressions, and create market possibilities. Pantone's name for the chosen color is as important as the shade itself—even though the names can be mildly confusing. Mimosa, the pick of 2009, was a bright buttery yellow, while Honeysuckle, a feminine pink, was the color for 2011. The honeysuckle flower can also be yellow, while mimosa flowers are typically pink—did that make these colors tough to sell? Last year's color, Radiant Orchid, a lavender-pink shade, was a hot seller in the spring, but when fall arrived, what was to be done with all those Radiant Orchid-colored toasters? Turquoise, the color of 2010, demonstrated truth in advertising, a blue-green shade close to that of the gemstone. Yet those with fond memories of rusty Chili Pepper, the color for 2007, might be forgiven for seeing its resemblance to Marsala.

The history of color is closely tied to cultural moments. The arrival of the Spanish in the Americas in the sixteenth century led to the export of massive quantities of cochineal, an insect that, when ground up, yields a durable and brilliant scarlet. The ladies of Spain quickly clamored for all things bright red and pink (which they usually purchased with silver coins minted from colonial American ore). In the 1850s, the color mauve was discovered by a young chemist who was trying to synthesize artificial quinine. The residue from one of his experiments became the world's first aniline dye, guaranteed not to fade with time and washing. Queen Victoria wore a mauve gown to her daughter's wedding, and Empress Eugénie of France cooed that the color matched her eyes—and an epidemic of "mauve measles" swept Europe. As cultural historian Simon Garfield noted in his 2001 book on the history of mauve, the color's popularity led to burgeoning interest in the practical applications of chemistry and advances in the fields of medicine, weaponry, perfume, and photography. Mauve became indelibly associated with the elaborate, overstuffed décor of the Victorian period; when mauve returned in the 1980s, it was billed as "dusty rose," a name much more congenial with that era's other favorite color: hunter green.

Marsala is no mauve, but it does reflect our present cultural mood. Commentators have often noted that during times of economic and political instability, people seek out ways to control their immediate environments. This nesting instinct, combined with a dramatic increase in the supply of cheap consumer goods, has led furniture makers, interior designers, and fabric designers to swath and cushion homes in piles of pillows, deeply padded furniture, and colorful appliances. Our image-saturated age creates consumers eager to translate the flickering screen into a pleasing palette of color and texture in their homes. In the closely tied areas of fashion and makeup, the last few years have been characterized by alternating waves of luxury and austerity. The flames of indulgence are first fanned and then banked back in the name of simplicity and conservation (or what in fashion is known as "vintage," itself a term that also applies to wine). Pantone's color campaign plays on this dynamic by drunkenly mixing leather and lace, tweed and organza with floral prints, stripes, and damask in a kind of lost Marsala weekend.

Inevitably, Marsala evokes food and drink. To quote Pantone's Eiseman again: "It [the color] has an organic and sophisticated air." Marsala is both earthy and complex, which, not accidentally, are words also used to describe wine.

But the greatest source of Marsala's current authority is its well-marketed ubiquity. Pantone's ability to dictate mood by coloring our clothing, our walls, our accessories, even our coffee-makers is powerful indeed. Interestingly, the last time a similar shade swept the fashion world, it was called oxblood. This non-Pantone wannabe had its moment, but its Oxbridge connotations of privilege just didn't give it staying power. Or maybe it was a lack of official Pantone status. From our public appearance to how we feather our nests, Marsala could prove to be just as powerful as the Sicilian fortified wine for which it is named.

Leann Davis Alspaugh is managing editor of The Hedgehog Review.

. . . . . . . .


Lyndon Johnson’s War

President Johnson’s 1964 State of the Union address. (Credit: LBJ Library photo by Cecil Stoughton)

Few would dispute that America's war on poverty—declared 50 years ago by President Lyndon B. Johnson in his State of the Union Address—is still a long way from over. With 15 percent of Americans today living below the poverty line, only four percentage points fewer than when Johnson launched his campaign, many might even agree with Ronald Reagan's stinging assessment that "poverty won."

The stricter nonpartisan truth is that no war on poverty could have been won any more than it could have been lost. This is not to deny that many of Johnson’s Great Society programs—from expanded food stamps and Medicaid to Head Start and job-promoting tax cuts—did much to improve the quality of poor Americans’ lives. They even made many of those lives possible, and that is no small accomplishment.

Yet for all that he achieved, Johnson did his cause no good when he framed it as a war. To be sure, war metaphors are a common staple of political rhetoric, used to mobilize popular support for worthy campaigns, from J. Edgar Hoover’s war on crime to George W. Bush’s war on terror. But their use often has unintended consequences, as scholars and pundits have frequently observed. The war metaphor creates the expectation of an eventual conclusion, ideally a victory, something that is almost impossible to achieve in dealing with intractable social and existential conditions. It also implies the existence of a clearly defined enemy, but complex systemic problems have elusive and problematic bogeymen. (If our capitalist, free-enterprise system, with its inevitable winners and losers, is partly the problem, do we really want to “defeat” it?) And by implying a clear-cut struggle, the metaphor can demonize those who differ with proposed strategies for victory. Oversimplification, disappointed expectations, frustration, political polarization, and a general weariness have been just a few of the unfortunate outcomes of this protracted metaphorical war.

President Lyndon B. Johnson greets a resident of Appalachia during his Poverty Tour of Appalachia. (Credit: LBJ Library photo by Cecil Stoughton)

But arguably the most perverse consequence of Johnson’s rhetoric has been the gradual, almost imperceptible stigmatization of the very people the war was intended to help. If 50 years of fighting haven’t eradicated the problem of poverty, then, many people conclude, isn’t it possible that poor people themselves are the problem? However simplistic, the logic of that conclusion comports all too well with a range of stereotypes, misconceptions, half-truths, and prejudices: The poor are different. Their characters are deficient. Helping them only makes things worse.

At the very least, this blaming-the-victim syndrome has eroded the confidence of poor Americans, convincing many that they are failures. More broadly, it has contributed to something like the disappearance of the poor, in both figurative and concrete ways. Figuratively, people struggling at the bottom of the economic ladder became faceless as they were subsumed under the category of poverty, losing their individuality, distinctiveness, and humanity in the process. More concretely, they increasingly disappeared into their own distinct worlds, growing up, attending schools, working, suffering illnesses, and dying in places that are cut off and separate from those of the better off.

Such a separation, for both tangible and intangible reasons, makes it even harder for the least well off to improve their condition. They live in neighborhoods and communities with schools and other public accommodations that lag well behind those found in the wealthier precincts, which in itself hugely complicates the daily business of getting on. But the lack of face-to-face contact also results in a growing values divide, one that conservative social analyst Charles Murray takes pains to describe in his latest book, Coming Apart (2012).

While Murray focuses on the harm this does to the poor, it also has a debilitating effect on the middle classes and indeed on all Americans. It does so by eroding their sense of the common good, of social solidarity and trust, the absence of which allows a brutal sort of zero-sum thinking to prevail. We see the effects of this declining solidarity in the most obvious ways. It is widely acknowledged, for example, that growing income inequality makes many of the middle class fearful of falling into poverty themselves. But so far that rising concern has failed to produce the political will to mitigate the worst effects of inequality on those who earn the least, even by such modest measures as boosting the federal minimum wage to the inflation-adjusted level of 1964.

The poor are different in only one respect: They have less money. Poverty will not go away. Nor will any war defeat it. But the plight of poor Americans will be less crushing, and less hurtfully defining, if they are seen as part of a shared body, as equals, as deserving of decent and respectful consideration as any other part of that body. The best way to honor Johnson's idealism is to declare his war over and then to rededicate ourselves to forging a true "one out of many."

. . . . . . . .


Is the Distracted Life Worth Living?

Philosophy is something close to a national pastime in France, a fact reflected not just in the celebrity status of its big thinkers but also in the interest its media show in the subject. So perhaps it's not surprising that several French publications recently sent correspondents, interviewers, and even philosophers to the Richmond, Virginia, motorcycle repair shop of Matthew Crawford, mechanic, philosopher, and senior fellow at the University of Virginia's Institute for Advanced Studies in Culture.

Matthew Crawford

They came not only to follow up on points raised in Crawford's most recent and best-selling book, Shop Class as Soulcraft: An Inquiry into the Value of Work, but also to draw him out on some of the themes of his forthcoming book on the subject of attention, and particularly the cultural dimensions of what might be called our universal Attention Deficit Disorders. For Crawford, the two books advance a common concern with mental ecology under the conditions of modernity, and with how the challenges to that ecology might be countered by a restored regard for, and a renewed cultivation of, the disciplines, practices, and rituals that once gave meaning to everyday life and work.

Jean-Baptiste Jacquin of Le Monde asked Crawford what he means when he says his next book will treat the political economy of attention.  Crawford’s reply (with apologies for my own translation):

Political economy concerns itself with the way certain resources are shared and distributed. Now, attention is an extremely important resource, as important as the time we each have at our disposal. Attention is a good, but it is rapidly depleted by a public space saturated with technologies that are dedicated to capturing it…. The book I am writing is a warning against the massification of our spirit. To have any intellectual originality, you must be able to extend a line of reasoning very far. And to do that, you have to protect yourself against an array of external distractions.

Jacquin pressed Crawford on what specific things people might do to counter the endless demands being put on our attention.  Having a fuller cultural consciousness of the problem is one thing that may help, Crawford suggested.  And engaging in activities that structure our attention is another:

I think manual work, almost any form of manual work, is a remedy. Cooking, for example. To prepare a fine meal requires a high level of concentration. Everything you do at each stage of preparation depends directly on the activity itself and on the objects, the ingredients.

In a dialogue with French philosopher Cynthia Fleury arranged by Madame Figaro, Crawford got into the question of autonomy and its connections with attention:

We have a vision of autonomy that is overly liberal, almost a caricature of itself, in that we take it to imply a kind of self-enclosure. Attention is precisely the faculty that pulls us out of our own head and joins us to the world. Attention, perhaps, is the antidote to narcissism….

The ironic and toxic result of advertising and other information saturating the environment is, Crawford explained, to isolate the self, to flatter it with delusions of its autonomy and agency. Children grow up pressing buttons and things happen, he elaborated, but they never acquire real mastery over the world of things. They can only make things happen by clicking buttons. "And there you have it," said Crawford, "an autonomy that is autism."

An even more intensive discussion of manual work, contrasted with the abstract, symbol-manipulating work that employs more and more people, appears in the November issue of Philosophie Magazine, with Crawford exchanging thoughts with philosopher Pascal Chabot, author of Global Burn-out (2013). Crawford nicely summed up what might be lost to all those symbol-manipulators who think of themselves as masters of the universe even as they lose a fundamental knowledge of their world:

What anthropology, neurobiology, and common sense teach us is that it's difficult to penetrate to the sense of things without taking them in hand…. It is not through representations of things but by manipulating them that we know the world. To say it another way, what is at the heart of human experience is our individual agency: our capacity to act on the world and to judge the effects of our action…. But the organization of work and our consumerist culture increasingly deprive us of this experience. American schools, beginning in the 1990s, dismantled shop classes—which for me had been the most intellectually stimulating classes—in favor of introductory computer classes, thus fostering the idea that the world had become a kind of scrim of information over which it was sufficient to glide. But in fact dealing with the world this way makes it opaque and mysterious, because the surface experience doesn't require our intervention but instead cultivates our passivity and dependence. That has political consequences. If you don't feel you can have a real effect on the world, then you don't believe you have any real responsibility for it. I believe that the depoliticization we are witnessing in the modern world comes from this sense of a lack of agency. The financial crisis is another alarming symptom of the problem: A trader makes a choice that will have an effect in three years and thousands of miles away. The consequences of his action are a matter of indifference to him. By contrast, repairing a motorcycle doesn't allow you to have that kind of detachment. If it doesn't start, your failure jumps out at you and you know who is responsible. In teaching you that it is not easy to ignore consequences, manual work provides a kind of moral education which also benefits intellectual activity.

The Hedgehog Review will take up the subject of attention in its summer issue, and Crawford will be one of the featured contributors.

. . . . . . . .
