The Hedgehog Review

The Hedgehog Review: Vol. 17 No. 2 (Summer 2015)

On the Value of Not Knowing Everything

James McWilliams


In January 2010, while driving from Chicago to Minneapolis, Sam McNerney played an audiobook and had an epiphany. The book was Jonah Lehrer’s How We Decide, and the epiphany was that consciousness could reside in the brain. The quest for an empirical understanding of consciousness has long preoccupied neurobiologists. But McNerney was no neurobiologist. He was a twenty-year-old philosophy major at Hamilton College. The standard course work—ancient, modern, and contemporary philosophy—enthralled him. But after this drive, after he listened to Lehrer, something changed. “I had to rethink everything I knew about everything,” McNerney said.

Lehrer’s publisher later withdrew How We Decide for inaccuracies. But McNerney was mentally galvanized for good reason. He had stumbled upon what philosophers call the “Hard Problem”—the quest to understand the enigma of the gap between mind and body. Intellectually speaking, what McNerney experienced was like diving for a penny in a pool and coming up with a gold nugget.

The philosopher Thomas Nagel drew popular attention to the Hard Problem four decades ago in an influential essay titled “What Is It Like to Be a Bat?” Frustrated with the “recent wave of reductionist euphoria,”1 Nagel challenged the reductive conception of mind—the idea that consciousness resides as a physical reality in the brain—by highlighting the radical subjectivity of experience. His main premise was that “an organism has conscious mental states if and only if there is something that it is like to be that organism.”2

If that idea seems elusive, consider it this way: A bat has consciousness only if there is something that it is like for that bat to be a bat. Sam has consciousness only if there is something it is like for Sam to be Sam. You have consciousness only if there is something that it is like for you to be you (and you know that there is). And here’s the key to all this: Whatever that “like” happens to be, according to Nagel, it necessarily defies empirical verification. You can’t put your finger on it. It resists physical accountability.

McNerney returned to Hamilton intellectually turbocharged. This was an idea worth pondering. “It took hold of me,” he said. “It chose me—I know you hear that a lot, but that’s how it felt.” He arranged to do research in cognitive science as an independent study project with Russell Marcus, a trusted professor. Marcus let him loose to write what McNerney calls “a seventy-page hodgepodge of psychological research and philosophy and everything in between.” Marcus remembered the project more charitably, as “a huge, ambitious, wide-ranging, smart, and engaging paper.” Once McNerney settled into his research, Marcus added, “it was like he had gone into a phone booth and come out as a super-student.”3

When he graduated in 2011, McNerney was proud. “I pulled it off,” he said about earning a degree in philosophy. Not that he had any hard answers to any big problems, much less the Hard Problem. Not that he had a job. All he knew was that he “wanted to become the best writer and thinker I could be.”

So, as one does, he moved to New York City.

McNerney is the kind of young scholar adored by the humanities. He’s inquisitive, open-minded, thrilled by the world of ideas, and touched with a tinge of old-school transcendentalism. What Emerson said of Thoreau—“he declined to give up his large ambition of knowledge and action for any narrow craft or profession”—is certainly true of McNerney.

Without pretense, he says things such as “I love carrying ideas in my head, turning them over, and looking at them”; “I have my creative process mapped out”; and “I’m currently incubating a hunch and making it whole.” He uses the phrase “connect the dots” quite a bit and routinely evaluates the machinery of his own mind.

My favorite example: “I have this insane problem—if I read 300 pages of anything I’ll find something to write about, but the more I read the less novel the idea seems and the less I want to write about it.” Such are the phenomena that preoccupy him. After our first phone conversation, he called me a “new soul mate” before signing off. Ever mindful, ever curious, he is a poster boy for the humanities.

The Mosh Pit of Thought

But it’s unclear how much longer the humanities can nurture the Sam McNerneys of the world. Even at Hamilton—a solid liberal arts college—McNerney was, by his own assessment, something of a black sheep. As he indulged the life of the mind, grappling earnestly with timeless philosophical problems, his friends prepared themselves for lucrative careers in law, medicine, and finance. They never criticized his choice—“They never said, ‘You’re going to live a shitty life, Sam,’” he told me—but they didn’t rush to join him in the mosh pit of thought, either.

McNerney’s curiosity—deeply reflective of a humanistic temperament—has led him headlong into a topic (the Hard Problem) that has the potential to permanently alter the place of the humanities in academic life. If, after all, Nagel is proven wrong—that is, if subjectivity is in fact reducible to an identifiable network of neural synapses—what is the point of investigating the human condition through a humanistic lens? If what it is like to be human, much less a bat, turns out to be empirically situated in the dense switchboard of the brain, what happens to Shakespeare, Swift, Woolf, or Wittgenstein when it comes to explaining ourselves to ourselves?

It’s perhaps because of this concern that Nagel’s famous essay stays famous, playing a rearguard role in philosophy seminars throughout the country. By challenging the very notion of a biological understanding of consciousness, by positing individual consciousness as an existential reality that defies objectification, “What Is It Like to Be a Bat?” breathes continual life into a phenomenon—the inner subjectivity of experience—that science has yet to illuminate with empirical exactitude.

The critic Richard Brody noted as much when he wrote in The New Yorker in 2013 that “the ideas that Nagel unfolds ought to be discussed by non-specialists with an interest in the arts, politics, and—quite literally, in this context—the humanities.” He continued:

If Nagel is right, art itself would no longer be merely the scientist’s leisure-time fulfillment but would be (I think, correctly) recognized as a primary mode of coming to grips with the mental and moral essence of the universe. It would be a key source of the very definition of life. Aesthetics will be propelled to the forefront of philosophy as a crucial part of metaphysical biology…. The very beauty of Nagel’s theory—its power to inspire imagination—counts in its favor.4

And thus Nagel’s essay—and the humanities—abide.

Humanities vs. STEM

Behind Brody’s optimism there’s a much-discussed backstory. You’ve heard it repeated like a mantra: The humanities are in crisis. They’re dying. There is a mass exodus of literature and history and anthropology majors. Some say this is overhyped angst. As a history professor who has seen twenty years of change, I disagree. It’s real. A shift is underway in higher education, and that shift, in many ways, mirrors—or at least is a microcosm of—the frenzied quest to solve the mystery of the Hard Problem.

The numbers don’t deny it. Nationally, the number of students majoring in the humanities has fallen substantially since 1970.5 At Stanford, 45 percent of the faculty is trained in the humanities, but only 15 percent of students major in humanities fields.6 At Yale, between 1971 and 2013 the proportion of humanities majors dropped from 53 percent to 25 percent among women, and from 37 percent to 21 percent among men. Meanwhile, economics has skyrocketed as a preferred major, with the number of economics majors growing almost threefold at traditionally humanities-inclined institutions such as Brown University.7 The acronym STEM—science, technology, engineering, mathematics—is now part of every university’s lingua franca.

It hardly helped the humanities when a 2012 Georgetown University study found that students in non-technical majors had unemployment rates ranging from 8.9 to 11.1 percent, while graduates in engineering, science, education, and health care had an overall unemployment rate of 5.4 percent.8 It is for good reason that the top five majors at Duke are now in biology, public policy, economics, psychology, and engineering.9 English majors must cringe a little bit when they learn that STEM majors make $32,000 more the year they enter “the real world.”10

Plausible explanations for the withering of the humanities run the gamut. Writing in The New Criterion, Mark Bauerlein, a professor of English at Emory University, blames old-school identity politics: “The minute professors started speaking of literary works as second to race and queerness, they set the fields on a path of material decline.”11 (Cranky.) The essayist Arthur Krystal points to the rise of postmodern theory—particularly deconstructionism—as a culprit. After a “defamiliarized zone of symbols and referents” gutted Western thought, he explains, the result was “the expulsion of those ideas that were formerly part of the humanistic charter.”12 (Stodgy.) But by far the loudest and most controversial response to the crisis in the humanities comes from William Deresiewicz. In his recent book Excellent Sheep, he highlights the scourge of “credentialism” among “entitled little shits.” He asks, “Do young people still have the chance, do they give themselves the chance, to experience the power that ideas have to knock you sideways?”13 Run ragged by status-driven career agendas, they seem not to. They seem pre-channeled, alienated from the whole notion of ideas for ideas’ sake. “Nothing in their training,” Deresiewicz writes of his pressure-cooked subjects, “has endowed them with a sense that something larger is at stake.”14

Perhaps an even deeper reason for the humanities’ shrinking status is the intensification of a certain and perhaps temporary habit of mind among today’s undergraduates, a habit that Nagel reminds us has severe limitations: the fierce adherence to quantification. Why this turn has happened at this point in time is difficult to say, but it seems fairly certain that a renewed faith in the power of radical empiricism—not to mention the economic advantages it can confer when judiciously applied post-graduation—has decisively lured students out of the humanities and into fields where the defining questions are reducible to just the facts, thank you.

Empiricism Run Amok?

Those left in the wake of this trend scratch their heads and prove the rule. Consider the experience of Logan Sander, a Princeton freshman majoring in comparative literature. In tenth grade, Sander wrote a paper on Kurt Vonnegut’s Slaughterhouse-Five in her English class. As with McNerney, the impact of the experience unexpectedly transformed her ambitions. She quickly fell in love with literature and began to read and study fiction with an inspired urgency, reveling in more questions than answers, seeking insights rather than data.

After graduating first in her class at Southview High School in Sylvania, Ohio, Sander was invited to an event celebrating valedictorians from twenty-five public high schools. There she discovered, a little to her dismay, that she was the only valedictorian planning to pursue a non-scientific field of study. Today, Sander has discovered a dynamic humanistic bubble at Princeton, but still, she observes, “there’s this assumption that if you’re pursuing the humanities you’re not likely to get a good job or make a lot of money.” She remains committed to literature. “Those in it seem to really love what they’re doing,” she told me. Plus, she wondered, “Since when is the value of the humanities based on money?”15

Of course, there’s nothing inherently wrong with an undergraduate shift toward empiricism. Nor is there a problem with pursuing money. Gathering and mapping and deploying objective data will produce everything from a cure for cancer to an app for identifying the finest coffee within a ten-block radius to the best fertilizer for sub-Saharan farmers (and probably has, as far as I know). The quality of human life will surely improve because of such endeavors, and those pursuing them are bound to live accomplished and well-remunerated lives. The world will be better off for their contributions. But…

As Sander says, “The humanities…touch the inner parts of our minds and souls the way technology cannot.” Indeed, what about the inner parts of our minds? Our souls?! What about the shimmering but elusive beauty of subjective experience? What about those things you can’t measure or convey?

When it comes to these questions, it’s worth wondering if empiricism hasn’t run amok in the halls of academe. After all, if we go about the business of being ambitious humans armed with an empiricism that grasps and gobbles up and conquers everything up to and including consciousness, it seems reasonable to wonder if we’ll lose something essential to the precarious project of being human—something such as humility. If nothing else, Nagel’s challenge reaffirms the value of humility.

I have no hard proof for this thesis, but I think there’s something to it: Knowing that there are things we don’t know—and may never know—has a humbling effect on the human mind. Humility is a form of modesty that asks us to accept ambiguity. Ambiguity, in turn, is ultimately what brings us together to explore the mysteries of existence through the wonder-driven endeavors we lump under that broad umbrella known as the humanities. If we knew it all, if we understood what it was like to be a bat, probably even Logan Sander would not be a comparative literature major.

In a way, to catch consciousness, to close the mind-body gap, would be to eliminate that humility. It would be to answer most of the big questions—to collapse the umbrella and move into a post-human world. And that might sound great to logical positivists and atheists and neurobiologists. But as the essayist Charles D’Ambrosio reminds us, “Answers are the end of speech, not the beginning.”16

Are we really ready to stop talking?

Correlating Consciousness

Christof Koch is the chief scientific officer at the Allen Institute for Brain Science, in Seattle. In the world of neurobiology, he’s a big deal. If Nagel led the effort to popularize the Hard Problem of consciousness, Koch leads the effort to solve it. In April 2013, President Obama introduced the US Brain Research Through Advancing Innovative Neurotechnologies (BRAIN) Initiative. With $60 million a year going to the Allen Institute, Koch is poised to play a pivotal role in the scientific effort to locate the mind in neurobiological space.

Of all the things he said to me in the course of a lively conversation last November, this, delivered with risible enthusiasm, stood out the most: “At some point, you will know what it’s like to be a bat.”

Koch’s faith in empiricism is pure. His reality is deeply physical. Driving his neurobiological approach to consciousness is a deep conviction that science, which he calls “humanity’s most reliable, cumulative, and objective method for comprehending reality,” is a project that “should also help us explain the world within us.” What Nagel identifies as the elusive subjectivity of experience, Koch, in his recent book Consciousness: Confessions of a Romantic Reductionist, refers to as “properties of the natural world.” What Nagel says we’ll never find, Koch, through the swagger of science, insists we’ll own.

Koch first explored the mind-body problem while doing postdoctoral work at MIT in the early 1980s. It was a time in his life, he explained, “when I was young and brash and naive and didn’t like the idea that something can’t be solved.” When he moved to the California Institute of Technology in 1986, he developed a lifelong intellectual and personal friendship with the biologist Francis Crick (of DNA double-helix fame), a man whom he came to admire as “a reductionist writ large.” Koch came under Crick’s wing and thrived.

The two men worked closely together on the Hard Problem, postulating what they called “neural correlates of consciousness.” They defined these as “the minimal neural mechanisms jointly sufficient for any one specific conscious percept.”17 When Crick died in 2004, Koch carried the torch down the path of hard empiricism, more confident than ever that “the weird explanatory gap between physics and consciousness” could be closed to produce “a complete elucidation of consciousness.”18

Koch’s current working hypothesis is elegant. A précis of it might go something like this: Consciousness begins and ends with neural information. The mind is inextricably bound up with verifiable information zipping through the brain in the form of synaptic liaisons among skittering dendrites. The integration of that information lays the foundation of consciousness. Through an integrated information theory, pioneered by the University of Wisconsin’s Giulio Tononi, one can viably posit a unified consciousness from the seemingly endless causal interactions within the relevant parts of the brain. Because one can theoretically calculate the extent of this integration, one can feasibly identify consciousness. More so: One can also measure it.

Koch is quick to caution that this is all a postulate. Nothing has been finalized in the frenzied quest to map out consciousness. “We cannot yet calculate the state of awareness for even the simple roundworm with current computers, let alone deal with the complexity of the human brain,” he writes.19 At this point in time, in short, we still have no idea what it’s like to be a bat.

But that doesn’t mean we someday won’t. What Koch has unequivocally discovered in his impassioned search for an empirical understanding of consciousness is something critical within himself: utter perseverance. He won’t quit. “My aim in life,” he told me, “is to have a grand view of how this all works.” Not to do so, he explained, would be “scandalous.”

Evaluating Koch’s mission in light of the humanities, it’s tempting to assess his project in zero-sum terms—that is, as an endeavor that seeks to replace imagination with information, creativity with concreteness, subjectivity with objectivity, the soul with the body. It’s tempting, in other words, to see the BRAIN Initiative as a hubristic endeavor to reduce the beautiful messiness of humanistic creativity to the neatness of a mathematical equation.

Koch’s aggressive rhetorical defense of science does little to discourage such a concern. He writes:

If we honestly seek a single, rational, and intellectually consistent view of the cosmos and everything in it, we must abandon the classical view of the immortal soul. It is a view that is deeply embedded in our culture; it suffuses our songs, novels, movies, great buildings, public discourse, and our myths. Science has brought us to the end of our childhood. Growing up is unsettling to many people, and unbearable to a few, but we must learn to see the world as it is and not as we want it to be. Once we free ourselves of magical thinking we have a chance of comprehending how we fit into this unfolding universe.20

Magical thinking? This passage is enough to make the humanist cringe. It’s time to grow up? Be done with childish things? We recall our innocent love for the blues of Robert Johnson, and we think, “Leave my songs alone!” We consider how stunned we feel reading the last paragraph of Cormac McCarthy’s Suttree, and we think, “Don’t touch my novels!” We pause and re-read Mark Strand’s reference to “an understanding that remains unfinished,” and we say, “Stay away from my poetry!” We enter the Rothko Chapel and ride a wave of new-agey spiritualism, and think, “Leave my architecture in peace.” We ponder Nagel’s question, tapping the spirit of the Beatles, and we say to ourselves, “Let it be.”

But the reductionists are on hand to throw cold water on our mystical musings: The majority of today’s philosophers of mind defend Koch’s empiricism. (“We’d better figure out what it’s like to be a bat,” Georgetown philosophy professor Bryce Huebner told me.) Koch and his colleagues refuse to accept the phrase “This will never be answered.” (Koch told me that skeptics have uttered such nonsense throughout history.) The humanities are slowly ceding ground to the “neuro-humanities” and the “digital humanities.” The federal government is spending hundreds of millions of dollars to map the circuitry of the mind. And the majority of today’s best and brightest undergrads are allegedly “excellent sheep,” moving zombie-like toward STEM-related pursuits and jobs making apps and managing hedge funds.

Chances are good, in other words, that nothing will be left alone. And so we have to ask: Will the humanities sink under the weight of science? And if not, how will they respond? What will be their role?

Can Wonder Save the Humanities?

The fear pervading the humanities these days evokes what the historian of science Lorraine Daston has recently called “the paradox of wonder.” She describes it in the most eloquent terms:

The marvel that stopped us in our tracks—an aurora borealis, cognate words in languages separated by continents and centuries, the peacock’s tail—becomes only an apparent marvel once explained. Aesthetic appreciation may linger…but composure has returned. We are delighted but no longer discombobulated; what was once an earthquake of the soul is subdued into an agreeable frisson.… The more we know, the less we wonder.21

The world’s greatest scientific thinkers have always been a little bamboozled in the face of this relationship. Descartes believed that wonder was a necessary spur to scientific knowledge as well as a seductive stimulus that could turn turbulent and addictive. Bacon similarly understood wonder to be “a dangerous passion,” deeming it a form of “broken knowledge” that should only be sampled in small doses and as a means to fixing that break.

Since the days of Descartes and Bacon, the sensation of wonder—at least in the world of contemporary science—has evolved into either an embarrassing illumination of ignorance (think astrology) or an affirmation of an Occam’s razor-style scientific explanation (think natural selection). The generally pejorative connotation of wonder, however, is such that, as Daston explains, “humanists are even more chary [than scientists] of expressing wonder.”22 Wonder, in other words, might be integral to the humanistic worldview, but as a state of mind, it’s on the ropes.

Christof Koch, for all his empiricism, suggests how it might make a comeback. Hints that Koch approaches life with a profound attunement to its wondrous manifestations, and that such wonder underscores his science, come through in his personal website. There, you can find an illustrated narrative of long solo hikes on the John Muir Trail, multi-hour trail runs in the San Gabriel Mountains, and grueling rock-climbing excursions in Yosemite Valley. There, you can read dreamy commentary such as “At night, I would contemplate the high alpine sky above me and the moral compass within me” or “I discovered the Zen of marathon running.”23

It’s evident that Koch is keyed in to the subtle tremors of human experience, the kinds that give you butterflies and make your heart race—the kinds that the classical humanist is loath to reduce to an algorithm. When we spoke, Koch lowered his voice almost to a whisper as he said, “I find myself in this wonderful universe that’s so conducive to life.”

The subtitle of his book Consciousness confirms this unexpected impulse animating his worldview: Confessions of a Romantic Reductionist. At the end, he poignantly confronts the inevitability of his own death—“the significance of my personal annihilation”—and delivers an assessment that reifies the romantic part of reduction. After “facing down an existential abyss of oblivion and meaninglessness within me,” he writes, he underwent “an unconscious process of recalibration,” and arrived here:

I returned to my basic attitude that all is as it should be. There is no other way that I can describe it: no mountaintop conversion or flash of deep insight, but a sentiment that suffuses my life. I wake up each morning to find myself in a world full of mystery and beauty. And I am profoundly thankful for the wonder of it all.24

“The wonder of it all.” This grabbed me. Could wonder save the humanities? Reading history and literature has taught me that all ideologies ultimately hit and crumble upon the brick wall of reductionism. It has taught me that if ideologies proved completely true, something essential about humanity would be lost. It has taught me that the pursuit of knowledge (scientific or otherwise) has meant different things to different people at different times. Francis Bacon, for one, would never have entertained an ambiguity-crushing version of existence of the kind espoused by today’s reductionists. So why should we?

Not to be glib, but why should there be only one version of scientific truth today? Could it be that Koch and his cohort have hijacked and linearized the entire idea of science itself? Has the desire for reductionism reduced science to a single paradigm that marginalizes any phenomenon—irony, for instance—that resists the explanatory powers of brain mapping? And might it not be the very job of the humanities to bring back these intangibles to the center of humanity and celebrate them and ask science to broaden its horizons to accommodate mystery?

I wondered. And then I reconnected with Sam McNerney.

After moving to New York, he got a small loan from his dad—“How many dads would do that?”—moved in with his girlfriend, and started to blog about the mind, the brain, business, and philosophy. He ran out of money. “It sucked,” he said. But he stuck to his ambition: becoming the best thinker and writer he could be. After a lot of online writing, he received a call from a big publishing company asking if he was interested in writing an in-house blog about business books. He was.

When I called him in early December, his job with the publishing company had run its course and he was back in his tiny Manhattan apartment, writing freelance. “I finally have time to think!” he said. We discussed an article he was working on about how technology structures information. “Technology advances,” he said, “and we complain about information overload. But it turns out we’re really good at organizing information. And when we organize information really well, something gets sacrificed.” He wondered, “Is information too organized?” He wondered, “What gets sacrificed?”

The answer he suggested is a word I hadn’t heard in a while, but, given all the research I’d been doing for this article, it initiated a kind of convergence: “serendipity.” Serendipity, indeed. Serendipity is so beautifully slippery. It situates itself between Nagel and Koch, empiricism and the humanities, wonder and science. Serendipity, when you get right down to it, is at the beating heart of wonder. It accounts for McNerney’s epiphany, Sander’s love of literature, my own distrust of reducibility, and Koch’s comfort with his own “annihilation.” Serendipity interrupts the linear view of science.

McNerney brought up Darwin. Darwin, he explained, would never have read Malthus’s 1798 essay on population growth, and thus would never have developed his theory of natural selection, had he been “searching for birds on Google.” The comparative looseness of information, the unchanneled nature of investigation, joined with Darwin’s innate curiosity, is what led to one of the most unifying explanations of physical existence the world has known. What sparked it all was pure serendipity.

In our age of endlessly aggregated information, the ultimate task of the humanities may be to subversively disaggregate in order to preserve that serendipity. After all, a period of confusion inevitably precedes the acquisition of concrete knowledge. It’s a necessary blip of humbling uncertainty that allows for what McNerney describes as “the call and response” between disparate ideas. As long as that gap exists, as long as a flicker of doubt precedes knowledge, there will always be room for humanistic thought—thought that revels in not knowing. As long as that gap exists, we will not be reduced to the moral equivalent of computers.

“There’s a big benefit to not knowing the answer to a question for a long time,” McNerney said toward the end of our conversation. “The trick is knowing enough but not too much, not so much that you kill that sense of wonder.”


  1. Thomas Nagel, “What Is It Like to Be a Bat?,” The Philosophical Review 83, no. 4 (October 1974): 435.
  2. Ibid.
  3. Russell Marcus, e-mail message to author, November 28, 2014.
  4. Richard Brody, “Thomas Nagel: Thoughts Are Real,” The New Yorker online, July 16, 2013.
  5. “Bachelor’s Degrees in the Humanities,” Humanities Indicators, American Academy of Arts and Sciences. Accessed April 21, 2015.
  6. Tamar Lewin, “As Interest Fades in the Humanities, Colleges Worry,” The New York Times, October 30, 2013.
  7. Sarah Sachs, “Economics Sees Growing Pains as Students Look for ‘Marketable Skills’,” The Brown Daily Herald, April 5, 2013.
  8. Anthony P. Carnevale et al., “Not All College Degrees Are Created Equal,” report from Georgetown University’s Center on Education and the Workforce, 4. Accessed April 21, 2015.
  9. Duke University, Office of News and Communications. Accessed April 21, 2015.
  10. Susan Adams, “Majoring in the Humanities Does Pay Off, Just Later,” Forbes, January 22, 2014.
  11. Mark Bauerlein, “Humanities: Doomed to Lose?,” The New Criterion, November 2014.
  12. Arthur Krystal, “The Shrinking World of Ideas,” The Chronicle of Higher Education, November 21, 2014.
  13. William Deresiewicz, Excellent Sheep: The Miseducation of the American Elite and the Way to a Meaningful Life (New York: Free Press, 2014), 111.
  14. Ibid., 13.
  15. Logan Sander, telephone interview with author, November 25, 2014.
  16. Leslie Jamison, “Instead of Sobbing, You Write Sentences: An Interview with Charles D’Ambrosio,” The New Yorker online, November 26, 2014.
  17. Christof Koch, Consciousness: Confessions of a Romantic Reductionist (Cambridge, MA: MIT Press, 2012), 42.
  18. Ibid., 27.
  19. Christof Koch, “A Theory of Consciousness,” Scientific American Mind. Accessed April 21, 2015.
  20. Koch, Consciousness, 152.
  21. Lorraine Daston, “Wonder and the Ends of Inquiry,” The Point. Accessed April 22, 2015.
  22. Ibid.
  23. Christof Koch, personal website. Accessed April 21, 2015.
  24. Koch, Consciousness, 161.

James McWilliams is a professor at Texas State University and the author of A Revolution in Eating: How the Quest for Food Shaped America and Just Food: Where Locavores Get It Wrong and How We Can Truly Eat Responsibly. His work has appeared in Harper’s, The New York Times, The New Yorker online, and The Paris Review.

Reprinted from The Hedgehog Review 17.2 (Summer 2015). This essay may not be resold, reprinted, or redistributed for compensation of any kind without prior written permission. Please contact The Hedgehog Review for further details.

Who We Are

Published three times a year by the Institute for Advanced Studies in Culture, The Hedgehog Review offers critical reflections on contemporary culture—how we shape it, and how it shapes us.
