Portrait of America’s Young Adults: Wary but Optimistic

According to a Pew Research Center survey, 55 percent of Millennials have posted a selfie on a social media site, compared with 24 percent of Gen Xers, 9 percent of Boomers, and 4 percent of the Silent Generation.

Generational snapshots sometimes confound us in the ways actual photographs do. The players in a team photo taken right after losing to their perennial cross-town rival are inexplicably smiling. Why? Well, it turns out they barely lost to a team from whom they’d expected a thorough trouncing.

A similar mystery arises from the recent survey of the Millennial generation, the cohort of young American adults who today range in age from 18 to 33. According to the Pew Research Center’s report, “Millennials in Adulthood,” these Millennials are relatively detached from religious and political institutions, are dealing with greater economic challenges (high levels of student debt, unemployment, and stagnant wages), are less inclined to rush into marriage, and are more prone to distrust other people than were the young adults of the three preceding generations.

Pew social trends graph: Millennials Upbeat about Their Financial Future

Yet for all those indicators suggesting a fundamental wariness toward the world, the Millennials are curiously optimistic about their future, more so than the members of the three previous generations of Americans: the Baby Boom, Generation X, and the Silent Generation.

Pew doesn’t offer explanations for this seeming disconnect, though it does conjecture that the racial diversity of the Millennials—the most racially diverse generation in American history, thanks largely to the influx of Asian and Hispanic immigrants—has something to do with their lack of social trust. (“A 2007 Pew Research Center analysis found that minorities and low-income adults had lower levels of social trust than other groups,” the report notes.)

Pew also speculates that the social and political turbulence of the 1960s and 1970s made the Boomers as young adults more pessimistic about the future than today’s Millennials are. If so, it’s interesting to note that economic insecurity may be less troubling to young adults than those other kinds of instability.

Still, apart from the absence of that negative factor, where does the striking optimism of the Millennials come from? Why are they more optimistic than the Silent Generation that came before the Boomers and also more optimistic than the Gen Xers who came after the Boomers?

Two possible answers: first, the very factor possibly contributing to low levels of social trust—namely, the fact that many of the Millennials are immigrants, or the sons and daughters of immigrants—may account for their resilient optimism. To these young Americans, the future still looks brighter than it did in the countries they or their parents came from. They cling to the American Dream more easily than do those Americans who have seen the dream gradually lose its promise.

The other possibility: young Americans are optimistic because they derive support and solace not from traditional institutions like churches and neighborhoods but from the virtual worlds they frequent and even at times seem to inhabit. (“They have taken the lead,” Pew notes, “in seizing on the new platforms of the digital era—the internet, mobile technology, social media—to construct personalized networks of friends, colleagues and affinity groups.  They are ‘digital natives’—the only generation for which these new technologies are not something they’ve had to adapt to. Not surprisingly, they are the most avid users.”) While digital communities may be “weak,” in terms of levels of commitment and affiliation, they represent worlds of seemingly limitless possibility, including the entrepreneurial possibilities associated with the new, and particularly social, media.

Can such optimism endure?  Only the next generational snapshot will tell.


Measuring Virtue in the Audit Society

Can virtue be measured? That was the question before a conference held at Oriel College, Oxford, in January, sponsored by the Jubilee Centre for Character and Values of the University of Birmingham. I didn’t attend, but the nearly 40 papers delivered at the conference are available online at the Centre’s website. Reading through some of the papers and the paper abstracts, I was struck by two things. First, while some of the presenters raised basic philosophical questions about the very idea of measuring virtue, many, after a caveat or two about complexity, confidently said yes: virtue or character can be scientifically measured.

 

A new phrenology? (Courtesy of Wikimedia Commons)

Second, I was struck by how little they agreed on how to do it, which one might have expected to undermine their confidence. Granted, the various speakers were not all addressing the same aspect of the elephant—some were concerned with measuring only a single virtue, some were asking about measuring character education programs, some offered instruments to measure how individuals view moral concepts, and so on—yet everyone seemed to have his or her favorite method or construct, and virtually no two were the same. The empirical assessment of virtue/ethics/character seems to be a thriving industry. So far, though, it has produced no strong consensus.

Perhaps the most interesting thing about the conference was the very question of measuring virtue. Who is asking this question and why now? Certainly one demand for measurement is coming from the new science of morality that is going great guns in psychology, neuroscience, and other fields. More generally, though, and feeding the science, are the imperatives of the “audit society” and the multiplication, over the past few decades, of formal “checking up” systems that were created in the name of accountability. These systems have spread far and wide, from government to education to medicine and beyond, and they require measurable indicators that are verifiable.

Michael Power, author of The Audit Society (1997), defines in an earlier paper a key aspect of verifiability: “that attribute of information which allows qualified individuals working independently of one another to develop essentially similar measures or conclusions from an examination of the same evidence, data or records….” This normally quantitative attribute is one thing in the context of a financial audit, where money is the medium, or quality assurance of, say, light bulbs, where readily visible criteria of success or failure exist and preexist the audit process. In these contexts, auditing and certifying are a secondary monitoring of compliance or performance.

However, it is quite another matter in contexts where the performance of doctors or teachers or professors is at issue. In these complex contexts—and this has implications for something as qualitative, relational, and holistic as virtue—activities that are not easy to quantify must first be “made auditable” by finding some feature of them to measure or rate. While this need not be a bad thing—it might, for instance, force professionals to think seriously about their goals—it can take on a life of its own.

The very act of creating measures and benchmarks and rating scales can badly distort the nature of the thing being audited, throwing off all sorts of unintended consequences. Far from a merely derived and neutral activity, auditing and performance measurement can construct a system of knowledge and then re-shape the organizational environment to make that system successful. More germane to virtue is the distinct possibility that because the disposition itself is not readily amenable to verifiable, non-subjective measurement, what will be quantified is simply some aspect that is easy to count, often a crude and not very meaningful aspect at that. This aspect, because verifiable and thus more tractable and “real,” then gets confused with the thing itself. Virtue becomes, as one of the speakers at the Oxford conference argued, “what virtue tests test.”

I recently heard a social scientist argue that when it comes to measuring morality any measure is better than none, an at-least-we’re-counting-something view which Power also observes in his research. But surely, in light of the dynamics of real-world assessment practices, such a facile view is deeply mistaken. Only a very good measure is better than none.

 

Democraphobia

The “end of history” thesis appears to have come to its final end in recent weeks. Certainly, the once-heralded spread of democracy and liberal values throughout the world is now looking far from inevitable.

Even before recent discouraging developments in—take your pick—Crimea (phony ballots and voter suppression before Anschluss), Turkey (farewell to Twitter, amid other suppressions of a free press), or Venezuela (jailed mayors and slain students), trend lines were not encouraging.  Freedom House, the reliable global monitor of such matters, reports 8 straight years of more declines in political liberties and civil rights worldwide than gains. Unfree and partly free countries now outnumber free ones 107 to 88. So much for Hegel (and Fukuyama), at least for the next half century or so.

What was so emphatically depressing about those Crimea ballots (which allowed select voters the “choice” between joining Russia directly or joining it indirectly) was their dramatic illustration of another Freedom House finding: that “modern authoritarians” are suppressing all opposition even while maintaining the outward trappings of legality and democratic process (though quietly and insistently dismantling or dominating institutions that guarantee real pluralism, including legislatures, the judiciary, police and security forces, the media, civil society, and even the economy).

The Crimea nastiness focused the world’s attention on this new form of “managed democracy” because we saw it brazenly employed in a transnational land grab that violated most widely accepted principles of international law and national sovereignty. And if it could happen in Crimea (and tomorrow in other parts of southern and eastern Ukraine), why can’t our current crop of smooth autocrats use managed democracy to acquire whatever territories they set their eyes on? President Hamid Karzai, who harbors notions of bringing Pashtun regions of Pakistan into Afghanistan, has already endorsed Crimea’s annexation as the valid exercise of the principle of self-determination, and he’s not even thought to be an autocrat.

But if the world is turning into a bleak stage for the cynical manipulation and abuse of democratic principles for undemocratic, illiberal, or simply self-aggrandizing ends, then the United States cannot hold itself entirely blameless. We haven’t exactly been burnishing the image of democracy lately. Our recent governmental dysfunctions, often driven by thoroughly unprincipled partisanship, have given people around the world good reason to think that democracy may not be a model system of reasonable, efficient, or even particularly virtuous governance. The rolling back of voting-rights protections in certain states and the imposition of new voting requirements in others raise questions about the depth of our commitment to core practices of democracy. And the growing power of money in politics has raised concerns about a drift toward patrimonial capitalism and even oligarchy.

All that said, reports of the death of American democracy are greatly exaggerated. Our system has come through other depressions, gilded ages, and even, as in the years preceding the Civil War, crippling bouts of political gridlock. What makes our current shortcomings so problematic and worrisome is that they now come under the intense scrutiny of friends and foes around the world, the former counting on us to serve as a model, the latter hoping we fail.

Even worse, we appear to be doing our very best to convince the world through our own cultural exports that our foes’ fondest wishes are coming true. Speaking at the Institute for Advanced Studies in Culture a few weeks ago, Martha Bayles, author of Through a Screen Darkly: Popular Culture, Public Diplomacy, and America’s Image Abroad (a section of which is excerpted in the current issue of The Hedgehog Review), made a sobering point about the huge popularity of the Netflix series House of Cards in China. A dark drama about corruption, intrigue, and murder in the highest corridors of power in Washington, the show particularly appeals to elite Chinese viewers who seem to take comfort in the fact that political life in America appears to be at least as rotten as their own.

House of Cards may be an extreme case, but as Bayles shows in her timely book, the decline of America’s public diplomacy efforts and institutions—which once vigorously promoted our strongest civic and political ideals—means that popular culture exports are now the main shapers of our image abroad. And when not glorifying violence, crime, or casual sex, most of these exports depict a people largely cut off from sustaining ties with family or community, completely absorbed in preening narcissism and selfish consumerism. So this, both friends and foes must think, is what American democracy hath wrought! Needless to say, the picture inspires neither emulation nor respect.

No, we can’t blame the world’s growing democracy deficit on Hollywood and other engines of American popular culture production. After all, television and film depictions of contemporary American society are not entirely caricatures. But we must at least recognize how little we do to correct the distorted picture of what our nation holds most dear. And how doing so little costs us, and the world, so much.

In Defense of the Misunderstood Hedgehog

While we welcome the arrival this week of Nate Silver’s new 538 blog, and in fact, defend his focus on “data journalism” here on our sister blog The Infernal Machine, it is our 21st-century age-of-the-brand duty to come to the defense of our namesake, the hedgehog, recently maligned.

To catch you up, the 538 blog has adopted the fox as its mascot, drawn from the same Archilochus quotation that runs in the front of every Hedgehog Review issue:

The fox knows many things, but the hedgehog knows one big thing.

Silver explains why the fox mentality appealed, expanding the notion of “many things” to “many different forms of data journalism.” (The fact that the fox is trendy and cute can’t hurt.)

As to where the fox’s nemesis, the hedgehog, comes out in all of this, New York Magazine has already inquired. Silver’s response:

So if you all are the foxes, who’s a hedgehog? 
Uhhhh, you know … the op-ed columnists at the New York Times, Washington Post, and Wall Street Journal are probably the most hedgehoglike people. They don’t permit a lot of complexity in their thinking. They pull threads together from very weak evidence and draw grand conclusions based on them. They’re ironically very predictable from week to week. If you know the subject that Thomas Friedman or whatever is writing about, you don’t have to read the column. You can kind of auto-script it, basically.

It’s people who have very strong ideological priors, is the fancy way to put it, that are governing their thinking. They’re not really evaluating the data as it comes in, not doing a lot of [original] thinking. They’re just spitting out the same column every week and using a different subject matter to do the same thing over and over.

Our humble hedgehog is not easily offended. It even agrees that this definition loosely and somewhat carelessly follows the intellectual typology laid out by Isaiah Berlin in his important essay, “The Hedgehog and the Fox.” (Others have also pondered the origins of the fox and hedgehog.) Still, our editors have in mind quite different understandings of the hedgehog when they set about their work. In the past, in fact, each issue tackled one theme, exploring a single topic of broad cultural significance from a variety of angles. While in more recent times we have added a number of non-thematic essays, the subject of cultural change is still our “one big thing.”

We will go further and admit we share something more substantial with other philosophical hedgehogs, something quite out of step with fashionable postmodern attitudes.  That is, we believe in the truth—and, even more, that pursuing it is essential to the pursuit of the good.

If we and our contributors can take up the hunt with the nimbleness of foxes, so much the better. And while we imagine that our hedgehog is not even on the radar of the new fox on the block, we suspect we two have more in common than a philosophical Greek might have thought.

Is Nothing Truly Alive?

 

Theo Jansen’s Strandbeest. (Wikimedia Commons)

There is no such thing as life.

That is the provocative claim made by Ferris Jabr in a recent op-ed appearing in The New York Times. Jabr is an associate editor at Scientific American, and at first blush, his claim sounds ridiculous. I know I’m alive. So there’s one example of life. Surely Jabr knows that he himself is alive. And we all see hundreds of examples of living things every day. So why exactly does Jabr think there is no such thing as life?

Jabr makes his case this way: “What is life? Science cannot tell us. Since the time of Aristotle, philosophers and scientists have struggled and failed to produce a precise, universally accepted definition of life.” Since we don’t have a definition of life, he continues, how can we talk about living things?  He points out that science textbooks describe living things by picking out features that living things often have. Such lists usually point to organization, growth, reproduction, and evolution. If something has all or most of these features, then it’s probably alive.

However, Jabr explains that these textbook lists fail miserably as definitions of life.  We can find things that are organized, display growth, reproduce, and evolve, and yet are not alive.  And for some things—viruses, for example—we can’t figure out whether they’re alive or not.

He continues: “Why is it so difficult for scientists to cleanly separate the living and nonliving and make a final decision about ambiguously animate viruses?”  Jabr has an explanation: “Because they have been trying to define something that never existed in the first place…Life is a concept, not a reality.”

But here Jabr has gone astray.  He concludes from the fact that life doesn’t have a definition that there is really no such thing as life.  But this is an invalid inference. For a concept can lack a definition and yet still be a real thing.  

Here’s an easy example: redness. Redness doesn’t have a definition. If you don’t believe me, take a stab at defining it. It’s a color—sure. It’s not blue, or yellow, or black, or any of the colors that aren’t red. Neither is it some particular wavelength of light; that’s what causes us to experience redness, but that isn’t what redness is. But this isn’t helping—none of these distinctions tell us what redness is. Redness is its own special thing, and nothing besides redness itself accounts for what it is. Nevertheless, redness is real. It’s a real thing whose concept doesn’t have a definition. The concept of redness is what is called a primitive concept.  It helps define other things, but nothing else defines it. It’s an unexplained explainer.

Redness isn’t the only primitive concept.  There are plenty of others.  For example: the concept of being part of something, the concept of possibility, the concept of goodness, the concept of being identical to something, to name a few.  But most importantly for the matter at hand, others have researched the very issue Jabr’s talking about—the failure of philosophical and scientific efforts to define life—and have given good reasons to think that the concept of life is primitive.  Perhaps most notably, Michael Thompson, a philosopher at the University of Pittsburgh, has made this case in his profound and influential book Life and Action (2008).

Where does all this leave Jabr’s argument?  The absence of a definition for a concept in no way suggests that the concept lacks real instances.  And life certainly seems to have real instances.  So it looks as though we should continue to accept the reality of life and simply recognize that it can’t be defined. Jabr’s case turns out to be less than compelling.

But so what? What’s the real-world significance of arguing in a New York Times op-ed that life doesn’t exist? More than we might initially think. To see what I’m getting at, let’s suppose for the moment that Jabr is right. Jabr illustrates the upshot of his claim about the non-existence of life by comparing things we ordinarily think of as living with certain artifacts, in particular the life-like handiwork of Dutch artist Theo Jansen. Jansen’s Strandbeest are wind-propelled mobile structures that resemble gigantic, many-footed arthropods. Jabr’s conclusion is that “Recognizing life as a [mere] concept is, in many ways, liberating. We no longer need to recoil from our impulse to endow Mr. Jansen’s sculptures with ‘life’ because they move on their own. The real reason Strandbeest enchant us is the same reason that any so-called ‘living thing’ fascinates us: not because it is ‘alive,’ but because it is so complex and, in its complexity, beautiful.”

If life isn’t real—if life is just a sort of beautiful complexity—then the distance between artifacts like the Strandbeest and things we normally consider living is removed. With this distance removed, we are free to see the Strandbeest as “alive.” Jabr thinks his conceptual innovation has brought enchantment to artifacts.

But there is a dark flip-side to this argument.  For with the loss of distance between life and mere elegant complexity, we are also free to see genuinely living things as mere complex artifacts. When a complex artifact—say, a watch—has outlasted its practical usefulness or lost its aesthetic value, there’s no barrier to it being scrapped or thrown away. Of course, we are ordinarily much more hesitant to treat living creatures in this way. Why this is so is a complicated question, but it is in part because we recognize that living creatures possess a mysterious value in virtue of being alive.  Anyone who has seen an animal die learns this, watching the animating spark fade away.

However, if we lose the distance between life and mere complexity, will we feel a heightened sense of loss when we discard a watch?  Or will we merely be less inclined to believe in the strange yet precious value of living creatures?  Jabr thinks he is bringing enchantment to artifacts.  We should worry he is disenchanting the living.

Paul Nedelisky received his PhD in philosophy from the University of Virginia in 2013 and is now a Postdoctoral Fellow at the Institute for Advanced Studies in Culture, where he is working on a book about science and morality.

 

The Debate Over Nudging

Think of this post as a little nudge to reflect further on “nudging.”

To wit: A recent post on this blog by Charles Mathewes and Christina McRorie, “Human Freedom and the Art of Nudging,” sparked three thoughtful replies on the blog Political Theology, each representing a different philosophical camp.

In the original post, Mathewes and McRorie point out that “nudging,” the practice of influencing your behavior by, say, placing the sugar-loaded cereal on a lower or higher shelf, does not impede your freedom, as some have contended. They write:

The issue, then, is not between freedom and tyranny. The issue is whether we will choose to consciously and deliberately shape those forces, or rather let them be determined by purely economic factors, as is the current status quo, such as in the case of the eye-level Kellogg’s cereal…. That is, the choice is not between a paternalistic “bureaucrat in Washington DC” and “you,” or between being “nudged” or manipulated by someone else or having your own innocent agency; the choice is between having the nudger be responsive to political leaders whom you put in power and the nudger be, say, some advertising executive over whose decisions you never have any say.

The responses on the Political Theology blog:

First, Hunter Baker and Micah Watson take the classical conservative viewpoint, with their post, “It Matters Who is Doing the Nudging.”

Then, Roland Boer offers the Marxist stance in “Nudging: Can Reform Make a Better Society?”

Finally, Kevin Vallier chimes in with “Reasonable Libertarian Worries about Nudging.”

Now, Mathewes and McRorie are back with two replies, “A Response to the Responses; or, a Note of Clarification about Nudges, Paternalism, and Agency” Part I and Part II.

The last word, at least for now:

The question before us now is not, “Should we engage in nudging on behalf of the public?” In light of the fact our lives are constantly being nudged—both by government and the very shape of the markets in which we swim every day—the question is instead, “How ought we to use the tools we have at hand to reflectively order our lives together so as to best promote the common good?” In this way, discussions over nudging and the practical impact of our public policies can bring to the fore fundamental questions about the nature of human freedom, and our common life together.

 

The Culture War and America’s Image Abroad

How do people around the world see America?

In different ways, of course, depending on a host of factors. These include levels of economic and social development, religion, politics, manners, and mores.

Whatever they think about America, though, most people today get their image of the United States, including its values, from America’s popular culture exports. Some would argue that this composite picture, emerging largely from American films, TV shows, and music, is a funhouse mirror reflection of American reality, a picture that may attract some people in some parts of the world but that is just as likely to trouble and offend many others in other parts.

"Through A Screen Darkly" by Martha BaylesAmerica did not always think that its image should be entrusted solely to its popular culture machine. For a time, and quite successfully, it devoted considerable resources to advancing its values and principles through the institutions and practices of public diplomacy, including the United State Information Agency, assorted USIA- and State Department-sponsored cultural programs, student exchanges, and American libraries.

But at the end of the Cold War, America’s pursuit of public diplomacy fell victim to the collapse of a fragile domestic consensus that transcended partisan and even deeper cultural divides within American society. The culture war that emerged with clarity and force then greatly complicated and arguably destroyed our faith in government-supported and government-directed efforts to win hearts and minds.

That, at least, is part of the story that critic and author Martha Bayles tells in her new and valuable book, Through a Screen Darkly: Popular Culture, Public Diplomacy, and America’s Image Abroad, just published by Yale University Press. Bayles’s book is already sparking discussion, and for good reason. Here is part of an admiring review that appeared in The Weekly Standard:

Bayles understands that the golden age of American public diplomacy is over. The Cold War audience yearned to be free; our mission was to ensure that they were well-informed and to urge them to be hopeful yet patient. Today’s audience has far more in common with its rulers than did the peoples of the Warsaw Pact, who were subject to an alien Communist regime. And today’s regimes can reassert their authority by mobilizing against a common threat to ruler and ruled: a godless, rootless America. Our gospel of freedom and individual possibility has little purchase in places where familiarity with our popular culture demonstrates that the outcome of our gospel is loathsome.

Bayles’s genius here is not just in dissecting the pathology of the pop-culture mind, but in revealing its effects on the world at large—in matters of war, peace, freedom, and human relations. She is also open to the idea that the entertainment industry’s distortions and libels have a degree of truth to them. And that’s the bad news: America’s image, as distorted in Hollywood’s mirror, may be telling us something unlovely about ourselves.

A segment adapted from that book appears in the new spring issue of The Hedgehog Review, and Bayles herself addressed some of her book’s points at the Institute for Advanced Studies in Culture on March 5.

What Public Universities Owe the Public

Graduates (Credit: iStockphoto.com)

A great deal has lately been made of the widening inequality in America and its various effects, not only on the poor but on those struggling to remain in the middle class. Unlike aristocracies, modern liberal democracies are designed to avoid the rule of the few who have a monopoly on wealth and power. Yet modern democracies accept market economies that introduce disparities in wealth and power and the class differences that go with them. The challenge for such democracies is to allow for inequalities while constraining and mitigating their worst effects, at least to the extent that citizens from the various classes can see themselves as parties to a social contract underwriting the principle of a common good.

Our Constitution, our civil religion, and our republican traditions were all an attempt to articulate such a contract.  For all citizens, regardless of their class, to feel that they are in it together with citizens of other classes, there must be a reasonable belief in the possibility of social and economic mobility on the basis of effort, character, and ability. But this is not enough. In general, citizens of a stable modern democracy should be able to believe that their class position is a reasonable reflection of their efforts, character, and ability. It would be going too far to suggest, as did John Winthrop, the first governor of the Massachusetts Bay Colony, that God ordained inequality of wealth and power so “that every man might have need of others, and hence they might be all knit more nearly together in the bonds of brotherly affection.” But ideally most citizens of a democratic commonwealth should feel at least minimal ties of affection—including a sense of gratitude and mutual obligation—with fellow citizens of all classes.

This kind of bond is the forgotten element in the American dream, and unfortunately, it is as forgotten in public higher education as in the institutions of wealth accumulation. The undeniable fact is that “the academic class,” the graduates, the faculty, and the administrators of public universities, are a privileged class of citizens. That being the case, what should less-advantaged citizens feel toward the academic class? Should those who are educationally, economically, and politically less advantaged have reasons to be grateful for the advantaged status of graduates, faculty, and administrators of prestigious public universities? And if the academic class of prestigious public universities is obligated to those less-advantaged educationally, economically, and politically, what is it that they owe them?

Unfortunately, and for a variety of reasons, public universities, especially the more prestigious ones, have lost any real sense of what their functions are as democratic institutions. Ask a postal worker what the U.S. Postal Service is for, a soldier what the U.S. military is for, a firefighter what the fire department is for, or a nurse what a public hospital is for, and you will get fairly straightforward answers. Ask a graduate, a faculty member, an administrator, or a board member of a prestigious public university what a public university is for in a modern liberal democracy and you will too often get little more than a string of clichés. When a carpenter forgets what his hammer is for, it is time either to help him remember or to fire him. The same is true for administrators, faculty, and students at universities.

One of the functions of public universities is to provide access to quality education in order to facilitate social mobility and to mitigate the effects of class disparities. Given this, one would think that academic leaders would be centrally guided by the question, what kind of education should graduates be required to have before they get their degrees and the economic and political advantages that go with them? What kind of core curriculum should be required of public universities to ensure that they fulfill their proper role in a modern liberal democracy and therefore merit public support and funding? That these questions are not being asked in regard to curricular issues is symptomatic of the rudderless nature of public higher education, especially in regard to liberal arts education.

Of course, one of the public interests served by public universities is to provide job training in the professions. In doing so, public universities also serve to provide economically accessible opportunities for citizens to advance their private interests through career advancement. In this way, public universities are part of an effort to garner the best talent for the professions and to serve class mobility in a competitive economy of a democratic society. To serve these functions, public higher education must be of high quality and economically accessible to qualified students, regardless of their class background.

Another private interest served by public universities is more directly related to liberal arts education: namely, to provide economically accessible opportunities for citizens to enrich their lives through the study of the arts and sciences. The kind of class mobility essential to a modern democracy is not simply a matter of economic mobility but also of educational mobility, that is, the accessible journey from a limited exposure to the varieties of human experience, creativity, and inquiry to a life enriched by an expansive exposure to all of these things. Even those already economically advantaged may still hunger for a different kind of advancement that only a well-conceived liberal arts education can provide.

Woodcut of the College of William & Mary (Credit: The American Cyclopædia / Wikimedia Commons)

But something is still missing if we think that providing “opportunities” for citizens is the central function of a public university: namely, education for the responsibilities of citizenship. If economically accessible opportunities for training in the professions and opportunities for liberal arts education for personal enrichment were the only interests served by public universities, the curriculum could be determined by pure market considerations: that is, by whatever job training is needed and desired in the economy, and by whatever students and professors choose to think is personally enriching. But these are not the only interests served by public universities in a modern democracy, and it is a fatal mistake to think that they are more basic than the interest in educating responsible citizens.

The idea of the social contract envisioned by the Framers included the idea that the advantages of privilege come with the responsibility of citizenship and the education required for it. This should be the guiding thought in the overall design of a public university, especially in regard to the curriculum.

So, then, what do graduates of a public university have an obligation to know (and faculty to teach) before they are granted a degree that opens the doors to the privileges they will enjoy and the positions they will occupy? Some maintain that the central task of liberal arts education should focus on “critical thinking skills and creativity” rather than “content knowledge and memorization.” This is the current trend or fad, but the distinction between critical thinking and content knowledge is a canard, a cover for squishy curricula designed for the convenience of students, faculty, and administrators.

How can graduates think critically and responsibly about the relevance of historical knowledge to current issues facing our society when the curricula of their universities allow them to avoid studying American history beyond the high-school level? And what about the ability of such advantaged graduates to think critically about how Americans can relate to other people in the world, when they are allowed to avoid studying the world’s major religions and humanistic traditions? More basically, what about the ability of these graduates to understand our own form of government, the rule of law in America and its effects on everyone, when they are allowed to avoid studying the U.S. Constitution and the historical debates over the major cases in Supreme Court history? What about the ability of privileged graduates to think critically about the economic future of our society and the relationship between the government and the economy when they are allowed by their universities to avoid studying economics and the major schools of economic theory?

And shouldn’t less advantaged citizens, in return for their gratitude and regard, expect the graduates of public universities to have the critical thinking skills necessary to evaluate whether the government is investing well in science or whether global warming is scientific fact or hoax, to address concerns about the relationships between science and religion, and to discriminate between the sound use of statistical reasoning in social science and its misuse in political propaganda detrimental to the general welfare? If so, how could the curriculum of a public university allow students to avoid studying any “hard” laboratory science, any biology and natural selection, or any rigorous social science and statistics?

In general, how can university graduates be ignorant of these and other things and reliably have the ability to think critically regarding the public good and the social contract? Of course, there are students who take a deep interest in these subjects, but the issue is not about what students might take an interest in but about what privileged students have an obligation to learn and what privileged faculty have a responsibility to teach.

None of these questions inform discussion on curricula at most public universities, especially the more prestigious ones. The College of William and Mary in Virginia, where I taught for my entire career, just passed a new curriculum under which a student can graduate without ever taking a course in American history, the world’s religions, economics, government, philosophy, or (possibly) natural science. Instead, students choose courses from vaguely designed “domains” that have only a nominal connection to the kind of substantive courses that once made up a rigorous liberal education. The fact is that neither the old nor the new curriculum at William and Mary was designed to require graduates to have any of the content knowledge mentioned in the questions above or the critical skills that go with such knowledge. This is not the kind of public university Madison and Jefferson rightly had in mind as essential to democracy, and there is no sense in which it can be called progressive. There is no whitewashing the fact that a curriculum like that of the College of William and Mary, “the Alma Mater of a Nation,” turns a public university into a club for privileged faculty, administrators, and students. This is a pernicious abuse of privilege and academic freedom that exacerbates the decline in the humanities and only weakens the social contract.

Unfortunately, public light seldom shines in the darker recesses of academia. When government abandons public universities to the political world of private fundraising and university politics, it commercializes the curriculum, creates a class of itinerant administrators on their way up, feigns oversight through ineffectual boards, and sacrifices the needs of democracy at the altar of what privileged faculty members want to sell and what privileged students want to buy. If we care about our public universities, our social contract with each other, and our democracy, we must insist that this status quo does not serve the public interest. We, the public, must demand excellence in what matters most in education.

George W. Harris is Chancellor Professor of Philosophy Emeritus at the College of William and Mary and the author of Reason’s Grief: An Essay on Tragedy and Value (Cambridge University Press, 2006 and 2012).