Monthly Archives: August 2014

Algorithms Rule


The Internet killed expertise. Or so claims public-policy specialist Tom Nichols in a recent essay lamenting the demise of a cultural fixture.

I fear we are witnessing the “death of expertise”: a Google-fueled, Wikipedia-based, blog-sodden collapse of any division between professionals and laymen, students and teachers, knowers and wonderers—in other words, between those of any achievement in an area and those with none at all. By this, I do not mean the death of actual expertise, the knowledge of specific things that sets some people apart from others in various areas. There will always be doctors, lawyers, engineers, and other specialists in various fields. Rather, what I fear has died is any acknowledgment of expertise as anything that should alter our thoughts or change the way we live.

For Nichols, technologies such as Google and Wikipedia betoken a loss of epistemic authority, that is, what counts as authoritative knowledge in our digital age. What legitimates one form of knowledge over another? Which sources of knowledge are to be trusted? Which not? What practices and scholarly habits, techniques, and institutions render knowledge authoritative or worthy? In our digital age, these questions seem almost quaint, throwbacks to a less free and democratic age.

Nichols isn’t alone in divining the collapse of expertise and authority in our digital age. But other, more sanguine observers celebrate it and the liberating promises of digital technologies. “Neither the Internet nor the WWW,” writes Cathy Davidson, a professor at the City University of New York Graduate Center, “has a center, an authority, a hierarchy, or even much of a filter on the largest structural level.” With the advent of digitally supported learning, “conventional modes of authority break down.” Digital technologies will liberate us from the constraints of traditional forms of epistemic authority. There will be no filters in the digital future to come.

Davidson’s messianic hopes, like Nichols’s cultural despair, mistakenly suppose that there can somehow be a vacuum of epistemic authority. But, in truth, forms and functions of epistemic authority, be they the disciplinary order of the research university or Wikipedia’s fundamental principles or “Five Pillars,” are themselves filtering technologies, helping us to orient ourselves amid a surfeit of information. They help us discern and attend to what is worthwhile. Google searches point us in the direction of some resources and not others. Technologies are normative, evaluative structures that make information accessible, manageable, and, ultimately, meaningful. It is not a question, then, of the presence or absence of epistemic authority; it is a question of better or worse forms of epistemic authority. Expertise and cultural authority are still with us. But now they may be more spectral, embodied not in the university don but in the black-boxed algorithm.

If the Internet and the World Wide Web lack, as Davidson puts it, a “centralized authority” and a “filter,” they do so only on the most abstract level. Our daily interactions with the Web are made possible by a host of technological constraints and filters. People access and engage information through technologies that allow them to select, filter, and delimit. Web browsers, hyperlinks, blogs, online newspapers, and the computational algorithms of Facebook, Google, and financial institutions help us turn terabytes of data into something more scalable, that is, something that can be made useful to an embodied person. These now-ubiquitous technologies help us to sort, to Google a needle in the haystack—and in so doing, they have become central media for the way we experience the world.

We are living in an age of algorithmic authority. Algorithms filter our music choices, track our purchasing decisions, find our airline tickets, and help us withdraw money from an ATM. They are ubiquitous. They are shaping who we are and who we want to become. But we are only beginning to ask about our algorithmic selves. How can we, from the outside, learn about these algorithms and the ways they increasingly organize our very selves?

Authority hasn’t vanished. It has just assumed different, more latent forms. As Evgeny Morozov puts it,

The reason to fear Facebook and its ilk is not that they violate our privacy. It is that they define the parameters of the grey and mostly invisible technological infrastructure that shapes our identity.

We can’t free ourselves from our technologies; digital detoxes are palliative exercises. But we can try to get to know our new algorithmic selves.

Credit: Photo montage with algorithm and Caspar David Friedrich’s Wanderer Above the Sea of Fog (1818)
