Harnessing Big Data to Democratic Ends

Seattle skyline at night, 2002. Wikimedia Commons.
Part 3 of the series Thriving Cities in a World of Big Data 

It’s easy to be afraid of Big Data. (Like Big Brother or Big Tobacco, it’s coming for us.) It’s even easier to be excited about it. As Noah Toly noted in his recent posts on Common Place, the same qualities of Big Data can inspire both utopian dreams and dystopian fears. But what can Big Data do for, or to, democracy?

In 1992, the United Nations Conference on Environment and Development (UNCED) called for governments and NGOs to “develop and identify indicators of sustainable development in order to improve the information basis for decision-making at all levels.” The hope was to craft new, large-scale statistical resources that would help assess and craft policies. The UNCED’s goal was twofold: first, to “bridge the data gap” that exists at all levels of government on key environmental and economic issues; second, to improve “information availability” in order to assure that data be accessible to all decision makers and managed securely and openly.  UNCED hoped not only to improve elite decision making but also to democratize sustainable development practices.

The data the UNCED proposed to track would serve democracy in a “broad sense,” helping individuals and institutions at both the international and grassroots levels to engage with the pressing questions of our time. But while obviously helpful to the functioning of democratic societies, these large-scale statistical measures also present problems for them. For example, elites can use such data to support what appear to be their interests alone—a truth borne out by metrics such as GDP, which can be used to monitor and inform the economic power of the wealthy without reflecting the well-being of the population as a whole.

Indeed, assorted measurements and data have long been used by governments and political elites to control populations, going back to the first efforts by monarchs to require thorough census data on their subjects. As political scientist and anthropologist James Scott shows in Seeing Like a State, the transparency that statistical measures give to complex political phenomena can also make citizens more “legible” to, and thus controllable by, political elites. Big Data thus not only abets surveillance but can also bring the politically “illegible” into the fold by forcing their normalization.

Then, too, Big Data can be used to assert the sufficiency of statistical fact, thereby sometimes curtailing robust or fully informed democratic debate. Take recent partisan arguments pitting economic stimulus against austerity. Neither side is prepared to engage in a conversation over the data itself. Both sides claim to be in possession of the facts, the left asserting that stimulus will lead to sustained economic growth, the right that austerity is the only route to the same destination. In this case, as in others, conversations beginning with the assertion of absolute facts tend to end either in stalemate (as in recent debates about the federal budget in America) or in technocracy, where the statistician is favored over and against the popular will of the people (as with Italy’s Monti government).

Can we have the benefits of Big Data without the drawbacks? Is there a way to harness the democratic power of information while also promoting democratic open-mindedness and popular empowerment? The work of geographer Meg Holden, who studied the development and implementation of a regional environmental impact index called Sustainable Seattle (S2), is useful here. Holden’s study of S2 shows how complex phenomena such as urban sustainability and climate change can be made subjects of political debate through statistical measures.

Holden shows that grassroots attention to indicator development and application allowed the S2 project to bridge existing learning gaps between local politics and dispersed economic, ecological, cultural, and institutional phenomena. Rather than shifting knowledge of large-scale phenomena outside democratic debate, S2 promoted “social learning.” Residents of Seattle could (and did) use its findings to promote better democratic debates.

The demands S2 placed on its developers were many. Among other things, they had to become statistical experts in indicator development while finding measurements that meaningfully correlated to ecological questions. They had to be marketers who could advertise their project and findings to their community. And finally, they had to lobby in support of their findings in order to have an impact on local politics. While Holden shows that S2 was imperfect, the responsiveness of its developers to broad public concerns makes it a model for those hoping to harness Big Data for democratic ends.

Above all, the S2 project leaders recognized the limits of Big Data. Its developers acknowledged the imperfect nature and sources of data, the limitations inherent in its processing, the necessity of packaging findings, and the need to bring findings to all audiences and institutions. In the long run, such chastened optimism and humility may prove to be the most helpful lessons of all.

Callum Ingram is a graduate student in the Department of Politics at the University of Virginia. His dissertation research focuses on the use of urban space and architecture by democratic social movements.

Editor’s note: For more on this topic, subscribe to receive the spring issue of The Hedgehog Review, “Too Much Information.”
