
All for the Now—or the World Well Lost?

Costume designs for the libation bearers, London Globe production of “Oresteia,” 2015.

Riding with my middle-school daughter a while back, I heard one of her favorite pop songs on the radio and called it to her attention; she sighed and said, “Oh, Mom, that’s so two minutes ago.” Apparently, the song was popular last year and, therefore, no longer qualifies as one of her current favorites. Then she added, in exasperation, that the expression itself was outdated. It was popular a year ago, back when she and her friends used to parody the lingo of certain “popular, mean girls” from film or TV. So, a year ago is now “old”? Or two minutes? The whole conversation made me wonder: What kind of a sense of the past do our children grow up with today, and how does it shape our attitudes toward history?

That question emerged in a different way when my son started high school this year. As an academic observing his orientation, I was keenly interested in this introduction to the curriculum. Of all the things I learned, however, the most surprising was that his curriculum requires only one year of history to graduate. Three and a half years of physical education are required. Three to four years of English are essential, as are three years of math. But students at my son’s school can graduate with only one year of history, and US history at that. Even in his first-year English course, students are required to read only three literary works during the entire academic year, and two of the three were written in the last sixty years. In other words, there’s not much distant history in his English curriculum either.

This also squares with trends at the small liberal arts college where I teach. Enrollment in history courses is down. The history department’s faculty allocation has recently been cut. Even in the English department, where enrollment numbers are strong this year, our historically oriented Renaissance literature line is being suspended due to budgetary adjustments, no doubt to make way for faculty positions in programs like biochemistry, molecular biology, and business. What this means is that my department will soon be without a permanent member who specializes in the period of the greatest flowering of literature in English.

And this dearth of expertise in the historical humanities is evident across the college. When I count the total number of pre-nineteenth-century historical humanities positions at my college, considering fields such as art history, philosophy, theater, and religion, I find that only five percent of all full-time, permanent faculty members have expertise in such areas.

Is it any wonder then that young people often have a limited sense of the past, unable to place even watershed events such as the Protestant Reformation or the French Revolution or identify major historical time periods? Not long ago, my son returned home from middle school to boast that, unlike his peers who were hooked on techno-pop, he’d suddenly become interested in “more medieval music”—“You know, Mom, like Simon and Garfunkel, the Beatles, ELO.” I’ll give him a pass for being only twelve at the time, but I’d suggest that this historical illiteracy is more common—and more costly—than we might think.

Why should teaching the past matter? It matters because teaching any pre-modern culture exposes students to ways of being that may be alien to them, a form of ontological diversity just as important as the more familiar kinds we hear so much about today. Many years ago, in a lecture at my college, the classicist Danielle Allen argued that education is fundamentally about knowing the foreign. I share that conviction and, in my own courses, daily ask students to explore the foreign battlefields of Homeric Troy or to inhabit the psychological terrain of Augustine. Both the Iliad and the Confessions offer examples of imaginative mindscapes as foreign to many students as any far-flung land they might visit on a study-abroad trip. And such foreign intellectual encounters, so familiar in early literature and history courses, help students cultivate virtues such as empathy and tolerance.

Tracing the decline and fall of the Roman Empire, distant as it may be, reveals the dangers of overreaching imperial powers, the perils of resources stretched thin, and the consequences of growing economic disparities—none of which are problems confined only to the ancient world. As the historian Timothy Snyder observes in his brief wonder of a book On Tyranny, “Americans today are no wiser than the Europeans who saw democracy yield to fascism, Nazism, or communism in the twentieth century. Our one advantage is that we might learn from their experience.”

Although Aeschylus’s Oresteia brilliantly dramatizes the triumph of democratic processes of justice over vendetta-style retribution, it also displays the pernicious roots of patriarchy, with the Olympian gods themselves legitimizing male rule over female, as Apollo exculpates Orestes by claiming that the mother isn’t really the parent, only the seed bed, while Athena chimes in, professing to “stand by the man” despite being a woman. Likewise, Shakespeare’s The Merchant of Venice, a comedy that turns on an act of mercy, also illuminates darker themes such as anti-Semitism and ethnic stereotyping.

History also teaches us that the pursuit of knowledge is often a digressive process. Unlike the natural sciences, where knowledge generally advances in a linear fashion, with experimentation and research yielding new insights that replace previous conclusions, humanistic knowledge proceeds haltingly. In the natural sciences, one often draws the conclusion that new knowledge is better than old knowledge. In the humanities, we value the ancient, the antique, the quaint, and the outmoded, all in the interest of thickening and enriching our understanding of human life.

While much of that life has involved regrettable episodes, history reminds us of what it means to be questing and creative and to transcend the limits of our human predicament, as Julian of Norwich or Galileo or Mary Rowlandson once did. Studying the past can also ease the feelings of isolation that many young people in contemporary America report as their greatest fear. Further, today’s younger generation may learn resilience, courage, and fortitude through an imaginative engagement with the people of the past.

I have been haunted by lines from a poem I recently read in Carmen Giménez Smith’s Cruel Futures, a poem that playfully extols “Disorder in exchange/for embracing the now.” Although Giménez Smith’s short poem vindicates that disorder by focusing on personal rather than collective, historical knowledge, those lines have left me wondering about the public implications of such an “exchange.” When, as a society, we “embrace the now” at the expense of the past, what sort of disorderly deal might we be making? I’m thinking here of, for example, the generally low level of civic participation in the United States. Might this indicate that we have become complacent about our history, forgetting the arduous efforts of a small group of patriots and visionaries, preferring instead the promises of Silicon Valley entrepreneurs and charismatic “thought leaders”?

In the academic world where I work, I often hear “That is the way of the past; this is the way of the future,” as if the past were to be regarded as a mere discarded image, disconnected from the priorities of the omnipresent “now.” As educators, we ought to remain wary of such facile dismissals of the past and be vigilant in refuting this kind of chronological snobbery, to borrow a phrase from C.S. Lewis and Owen Barfield. The wisdom and well-being of our young people and our civilization depend on historical knowledge. Otherwise, we may one day find ourselves victims of a “cruel future,” one in which ignorance of past problems condemns us to inevitable repetition of them, and where blindness about historical insights prevents us from seeing wiser paths forward.

Carla Arnell is associate professor of English and chair of the English Department at Lake Forest College.

. . . . . . . .



A Philosopher Who Matters

Detail from The Death of Socrates, 1787, Jacques-Louis David, Metropolitan Museum of Art, New York.

At the close of his trial, Socrates made a plea to his accusers: “This much I ask from them: When my sons grow up, avenge yourselves by causing them the same kind of grief that I caused you, if you think they care for money or anything else more than they care for virtue, or if they think they are somebody when they are nobody.” You can’t seek Sophia and Mammon, Socrates warned. Fortunately, most philosophers don’t have to worry about this temptation. Trust me: Nobody’s getting rich dissecting syllogisms or parsing Hegel.

Unless you’re Charles Taylor. This week the Canadian philosopher was awarded the inaugural million-dollar Berggruen Prize for “ideas that shape the world”—what people are describing as the Nobel in philosophy. (In fact, this is Taylor’s second million-dollar prize; he received the Templeton Prize in religion in 2007.)

The award is well-deserved. Taylor is almost without peer (although I could imagine Jürgen Habermas also receiving this prize), and his work certainly exemplifies what the prize seeks to recognize: that ideas do indeed shape the world. So what is it that distinguishes Taylor’s work and has attracted this kind of attention? I think there are several features of his ongoing contribution that stand out.

. . . . . . . .

Like The Hedgehog Review on Facebook, follow us on Twitter, and subscribe to our posts via RSS.

Lessons from the Ring—Then and Now

Photograph by Mike Powell, Allsport Concepts; Getty Images.

Years ago, I had the honor of interviewing David Mamet, who, in addition to being a fine playwright, is a longtime practitioner of the martial arts. After our conversation, I asked him to give me one piece of advice I might pass along to my students. He said, “Tell them to pick some physical art—ballet, boxing, judo, yoga, whatever—and to stick with it. It will make them feel grounded and better able to deal with adversity and rejection in this world.” By moving your body in a certain way, he was saying, you will shape the way you feel and who you are.

Philosophy professors (including me) assume that we learn to negotiate such challenges only by reflecting on them. It is as though we have become oblivious to the lessons we can learn on the path leading from the body to the brain. Once, I confided an emotional problem to a yoga teacher. She replied, “The answer to the problem is just to breathe.” At the time, I was deeply and rather unreflectively committed to the belief that it is only by thinking that we can solve problems. The yoga teacher’s words awakened me to something I should have known already.

After all, I had been training boxers for decades, learning and imparting some of the lessons Carlo Rotella writes about so eloquently in Cut Time: An Education at the Fights:

The deeper you go into the fights, the more you may discover about things that would seem at first blush to have nothing to do with boxing. Lessons in spacing and leverage, or in holding part of oneself in reserve even when hotly engaged, are lessons not only in how one boxer reckons with another but also in how one person reckons with another. The fights teach many such lessons…about getting hurt and getting old, about distance and intimacy…boxing conducts an endless workshop in the teaching and learning of knowledge with consequences.

In the sweat-and-blood parlor of the boxing ring, young people deal with feelings they seldom get controlled practice with, such as anxiety and anger. And make no mistake—the kind of people we become is largely determined by the way we negotiate those dreadnought emotions.

. . . . . . . .


After Strange Gods: Peter Thiel and the Cult of Startupism

Peter Thiel at TechCrunch50 (2008). Image from Wikimedia Commons

In the humanities departments of the university where I live and work, the word “corporate” is an epithet of disdain, and “entrepreneurship” is code for “corporate.” My fellow humanists tolerate the business school because it provides fuel for the English composition classes that keep us tenured radicals employed.

A confession: I used to share that outlook myself. But the experience of working alongside actual entrepreneurs and CEOs of various stripes shattered my comfortable assumptions. Not only did I find that entrepreneurs are willing to take risks that I would never hazard; I also learned that many are keenly interested in the world of ideas, theory, and “big picture” thinking. Indeed, such philosophically inclined entrepreneurs excel at practical wisdom—what Aristotle called phronesis—precisely because their imaginations have been nourished by contemplation. They are philosophers of a kind I will never be.

Prominent among these philosopher-entrepreneurs is Peter Thiel. A co-founder of PayPal and Palantir, he has become a Silicon Valley guru, the contrarians’ contrarian. His new book, Zero to One: Notes on Startups, or How to Build the Future, began as notes taken by Blake Masters, an admiring student in Thiel’s course at Stanford. It is an ambitious book—it could even be described as Machiavelli’s The Prince reimagined for our startup age—and it puts Thiel’s command of the philosophical canon on prominent display. How many other business books at the airport bookstore draw on Hegel, Nietzsche, Aristotle, John Rawls, and René Girard?

Zero to One is aphoristic, biting, forthright, and at times, in the spirit of Machiavelli, ruthless. Thiel unapologetically commends the pursuit of monopoly (“the more we compete, the less we gain”), and then counsels noble lies to hide its achievement. He casts aspersions on the bureaucracies of existing organizations: “Accountable to nobody,” he writes, “the DMV is misaligned with everybody.” And he calls out bad ideas, particularly those coupled with shoddy execution. His take-down of failed federal investment in clean technology is well worth the cost of the book.

Thiel’s intellectual reach is anything but modest. He offers a sociology of creativity, a grand theory of human civilization, and even a sort of theology of culture, though it is not quite clear whom he casts as God. Indeed, it’s over the grandiosity and hubris of Thiel’s claims that I find myself parting ways with the more fawning reviews of his book. I realize that creative risk-taking requires a healthy dose of self-confidence that can often come across as arrogance. What worries me, though, is not his confident dispensing of practical wisdom but the hubristic evangelizing for what might be called startupism.

A cult of creative innovation, startupism has four notable features, beginning with the outsized role it accords to human creativity. As early as page two of the book, Thiel tells us “humans are distinguished from other species by our abilities to work miracles. We call these miracles technology.” His emphasis on the creative power of human making is laudable and timely, though not particularly new. (Thiel should add Giambattista Vico to his reading list.) What’s unique to startupism are the “miraculous,” god-like powers Thiel attributes to us mortals: “Humans don’t decide what to build by making choices from some cosmic catalogue of options given in advance; instead, by creating new technologies, we rewrite the plan of the world.” We command fate. “A startup is the largest endeavor over which you can have definite mastery. You can have agency not just over your own life, but over a small and important part of the world. It begins by rejecting the unjust tyranny of Chance. You are not a lottery ticket.”

Second, the creativity celebrated by startupism blurs the old distinction between Creator and creature. What Thiel calls “vertical” or “intensive” progress isn’t 1+1 development; truly creative, intensive progress is a qualitative advance from 0 to 1. I believe the Latin for that is creatio ex nihilo. (And “[t]he single word for vertical, 0 to 1 progress” is…you guessed it…“technology.”)

Third, as you might expect, startupism has its own ecclesia: the new organization founded by a noble remnant who have distanced themselves from the behemoths of existing institutions. “New technology,” Thiel observes, “tends to come from new ventures” that we call startups. These are launched by tiny tribes that Thiel compares to the Founding Fathers and the British Royal Society. “[S]mall groups of people bound together by a sense of mission have changed the world for the better,” he explains, because “it’s hard to develop new things in big organizations, and it’s even harder to do it by yourself.” We shouldn’t be surprised, then, that “the best startups might be considered slightly less extreme kinds of cults.” The successful startup will have to be a total, all-encompassing institution: our family, our home, our cultus.

Finally, in startupism, the founder is savior. Granted, Thiel—following Girard—is going to talk about this in terms of scapegoating in a long, meandering chapter that aims to associate successful Silicon Valley geeks with pop stars and other people we like to look at. But it’s not just that founders are heroes in their companies. The scope of their impact is much wider: “Creative monopolists give customers more choices by adding entirely new categories of abundance to the world. Creative monopolies aren’t just good for the rest of society; they’re powerful engines for making it better.” But to get there, Thiel says, “we need founders.” No founders; no progress. Steve Jobs, hear our prayer.

Thiel offers genuine, authoritative insight into entrepreneurship and the dynamics of a startup organization. When he tacitly suggests that society derives its crucial and even salvific dynamism from the startup, I become both skeptical and nervous. Can startups contribute to the common good? Without question. Are startups going to save us? Not a chance.

Thiel’s hubris stems from a certain parochialism. Startupism is a Bay Area mythology whose plausibility diminishes by the time you hit, say, Sacramento. The confident narrative of progress, the narrow identification of progress with technology, and the tales of 0 to 1 creationism are the products of an echo chamber. This chamber fosters hubris among the faithful precisely because it shuts out competing voices that might remind them of the deeper and wider institutional, intellectual, and even spiritual resources on which they depend and draw. We are makers, without question, but we are also heirs. We can imagine a different future, but we have to deal with a past that was created by others before us.

Thiel, and the New Creators like him (and get ready for a slew of parroting Little Creators coming in their wake), have drunk their own Kool-Aid and believed their own PR. It’s why all the sentences that begin “At PayPal…” grow tiresome and make you wonder why someone who developed a new mode of currency exchange thinks he brought about the new heaven and the new earth ex nihilo. One can applaud Thiel’s elucidation of creativity and innovation while deploring the (idolatrous) theology in which he embeds it. We need startups. We can do without startupism.

James K. A. Smith is a professor of philosophy at Calvin College, where he holds the Byker Chair in Applied Reformed Theology and Worldview. He is the author of Who’s Afraid of Postmodernism? and How (Not) to Be Secular: Reading Charles Taylor, among other books. Smith is also the editor of Comment magazine.

. . . . . . . .


Is Nothing Truly Alive?

 

Theo Jansen’s Strandbeest. (Wikimedia Commons)

There is no such thing as life.

That is the provocative claim made by Ferris Jabr, an associate editor at Scientific American, in a recent op-ed in The New York Times. At first blush, his claim sounds ridiculous. I know I’m alive. So there’s one example of life. Surely Jabr knows that he himself is alive. And we all see hundreds of examples of living things every day. So why exactly does Jabr think there is no such thing as life?

Jabr makes his case this way: “What is life? Science cannot tell us. Since the time of Aristotle, philosophers and scientists have struggled and failed to produce a precise, universally accepted definition of life.” Since we don’t have a definition of life, he continues, how can we talk about living things?  He points out that science textbooks describe living things by picking out features that living things often have. Such lists usually point to organization, growth, reproduction, and evolution. If something has all or most of these features, then it’s probably alive.

However, Jabr explains that these textbook lists fail miserably as definitions of life.  We can find things that are organized, display growth, reproduce, and evolve, and yet are not alive.  And for some things—viruses, for example—we can’t figure out whether they’re alive or not.

He continues: “Why is it so difficult for scientists to cleanly separate the living and nonliving and make a final decision about ambiguously animate viruses?”  Jabr has an explanation: “Because they have been trying to define something that never existed in the first place…Life is a concept, not a reality.”

But here Jabr has gone astray. From the fact that life lacks a definition, he concludes that there is really no such thing as life. This is an invalid inference, for a concept can lack a definition and yet still pick out something real.

Here’s an easy example: redness. Redness doesn’t have a definition. If you don’t believe me, take a stab at defining it. It’s a color—sure. It’s not blue, or yellow, or black, or any of the colors that aren’t red. Neither is it some particular wavelength of light; that’s what causes us to experience redness, but that isn’t what redness is. But this isn’t helping—none of these distinctions tell us what redness is. Redness is its own special thing, and nothing besides redness itself accounts for what it is. Nevertheless, redness is real. It’s a real thing whose concept doesn’t have a definition. The concept of redness is what is called a primitive concept.  It helps define other things, but nothing else defines it. It’s an unexplained explainer.

Redness isn’t the only primitive concept.  There are plenty of others.  For example: the concept of being part of something, the concept of possibility, the concept of goodness, the concept of being identical to something, to name a few.  But most importantly for the matter at hand, others have researched the very issue Jabr’s talking about—the failure of philosophical and scientific efforts to define life—and have given good reasons to think that the concept of life is primitive.  Perhaps most notably, Michael Thompson, a philosopher at the University of Pittsburgh, has made this case in his profound and influential book Life and Action (2008).

Where does all this leave Jabr’s argument?  The absence of a definition for a concept in no way suggests that the concept lacks real instances.  And life certainly seems to have real instances.  So it looks as though we should continue to accept the reality of life and simply recognize that it can’t be defined. Jabr’s case turns out to be less than compelling.

But so what? What’s the real-world significance of arguing in a New York Times op-ed that life doesn’t exist? More than we might initially think. To see what I’m getting at, let’s suppose for the moment that Jabr is right. Jabr illustrates the upshot of his claim about the non-existence of life by comparing things we ordinarily think of as living with certain artifacts, in particular the life-like handiwork of Dutch artist Theo Jansen. Jansen’s Strandbeest are wind-propelled mobile structures that resemble gigantic, many-footed arthropods. Jabr’s conclusion is that “Recognizing life as a [mere] concept is, in many ways, liberating. We no longer need to recoil from our impulse to endow Mr. Jansen’s sculptures with ‘life’ because they move on their own. The real reason Strandbeest enchant us is the same reason that any so-called ‘living thing’ fascinates us: not because it is ‘alive,’ but because it is so complex and, in its complexity, beautiful.”

If life isn’t real—if life is just a sort of beautiful complexity—then the distance between artifacts like the Strandbeest and things we normally consider living is removed. With this distance removed, we are free to see the Strandbeest as “alive.” Jabr thinks his conceptual innovation has brought enchantment to artifacts.

But there is a dark flip-side to this argument.  For with the loss of distance between life and mere elegant complexity, we are also free to see genuinely living things as mere complex artifacts. When a complex artifact—say, a watch—has outlasted its practical usefulness or lost its aesthetic value, there’s no barrier to it being scrapped or thrown away. Of course, we are ordinarily much more hesitant to treat living creatures in this way. Why this is so is a complicated question, but it is in part because we recognize that living creatures possess a mysterious value in virtue of being alive.  Anyone who has seen an animal die learns this, watching the animating spark fade away.

However, if we lose the distance between life and mere complexity, will we feel a heightened sense of loss when we discard a watch?  Or will we merely be less inclined to believe in the strange yet precious value of living creatures?  Jabr thinks he is bringing enchantment to artifacts.  We should worry he is disenchanting the living.

Paul Nedelisky received his PhD in philosophy from the University of Virginia in 2013 and is now a Postdoctoral Fellow at the Institute for Advanced Studies in Culture, where he is working on a book about science and morality.

 

. . . . . . . .


Is the Distracted Life Worth Living?

Philosophy is something close to a national pastime in France, a fact reflected not just in the celebrity status of its big thinkers but also in the interest its media show in the subject. So perhaps it’s not surprising that several French publications recently sent correspondents, interviewers, and even philosophers to the Richmond, Virginia, motorcycle repair shop of Matthew Crawford, mechanic, philosopher, and senior fellow at the University of Virginia’s Institute for Advanced Studies in Culture.

Matthew Crawford

They came not only to follow up on points raised in Crawford’s most recent and best-selling book, Shop Class as Soulcraft: An Inquiry into the Value of Work, but also to draw him out on some of the themes of his forthcoming book on the subject of attention, and particularly the cultural dimensions of what might be called our universal Attention Deficit Disorders. For Crawford, the two books advance a common concern with mental ecology under the conditions of modernity, and with how the challenges to that ecology might be countered by a restored regard for, and a renewed cultivation of, the disciplines, practices, and rituals that once gave meaning to everyday life and work.

Jean-Baptiste Jacquin of Le Monde asked Crawford what he means when he says his next book will treat the political economy of attention.  Crawford’s reply (with apologies for my own translation):

Political economy concerns itself with the way certain resources are shared and distributed. Now, attention is an extremely important resource, as important as the time we each have at our disposal. Attention is a good, but it is rapidly depleted by a public space saturated with technologies that are dedicated to capturing it…The book I am writing is a warning against the massification of our spirit. To have any intellectual originality, you must be able to extend a line of reasoning very far. And to do that, you have to protect yourself against an array of external distractions.

Jacquin pressed Crawford on what specific things people might do to counter the endless demands being put on our attention.  Having a fuller cultural consciousness of the problem is one thing that may help, Crawford suggested.  And engaging in activities that structure our attention is another:

I think manual work, almost any form of manual work, is a remedy. Cooking, for example. To prepare a fine meal requires a high level of concentration. Everything you do at each stage of preparation depends directly on the activity itself and on the objects, the ingredients.

In a dialogue between Crawford and French philosopher Cynthia Fleury arranged by Madame Figaro, Crawford got into the question of autonomy and its connections with attention:

We have a vision of autonomy that is overly liberal, almost a caricature of itself, in that we take it to imply a kind of self-enclosure. Attention is precisely the faculty that pulls us out of our own head and joins us to the world. Attention, perhaps, is the antidote to narcissism….

The ironic and toxic result of advertising and other information saturating the environment is, Crawford explained, to isolate the self, to flatter it with delusions of its autonomy and agency. Children grow up pressing buttons and things happen, he elaborated, but they never acquire real mastery over the world of things. They can only make things happen by clicking buttons. “And there you have it,” said Crawford, “an autonomy that is autism.”

An even more intensive discussion of manual work, contrasted with the abstract, symbol-manipulating work that employs more and more people, appears in the November issue of Philosophie Magazine, with Crawford exchanging thoughts with philosopher Pascal Chabot, author of Global Burn-out (2013). Crawford nicely summed up what might be lost to all those symbol-manipulators who think of themselves as masters of the universe even as they lose a fundamental knowledge of their world:

What anthropology, neurobiology, and common sense teach us is that it’s difficult to penetrate to the sense of things without taking them in hand…. It is not through representations of things but by manipulating them that we know the world. To say it another way, what is at the heart of human experience is our individual agency: our capacity to act on the world and to judge the effects of our action…. But the organization of work and our consumerist culture increasingly deprive us of this experience. American schools, beginning in the 1990s, dismantled shop classes—which for me had been the most intellectually stimulating classes—in favor of introductory computer classes, thus fostering the idea that the world had become a kind of scrim of information over which it was sufficient to glide. But in fact dealing with the world this way makes it opaque and mysterious, because the surface experience doesn’t require our intervention but instead cultivates our passivity and dependence. That has political consequences. If you don’t feel you can have a real effect on the world, then you don’t believe you have any real responsibility for it. I believe that the depoliticization we are witnessing in the modern world comes from this sense of a lack of agency. The financial crisis is another alarming symptom of the problem: A trader makes a choice that will have an effect in three years and thousands of miles away. The consequences of his action are a matter of indifference to him. By contrast, repairing a motorcycle doesn’t allow you to have that kind of detachment. If it doesn’t start, your failure jumps out at you and you know who is responsible. In teaching you that it is not easy to ignore consequences, manual work provides a kind of moral education which also benefits intellectual activity.

The Hedgehog Review will take up the subject of attention in its summer issue, and Crawford will be one of the featured contributors.

. . . . . . . .
