Monthly Archives: February 2015

Why the New Flows of Capital Matter for Cities

Prior to the rise of the car and the trucking industry, cities were the best places for investment. They provided access to markets through ports, rivers, and railroads. They had large pools of unskilled labor living near factories, and they were relatively dense, making business easier for firms. Significantly, capital, the sociologist Zygmunt Bauman claims, was “heavy” and enmeshed in place:

Routinized time tied labor to the ground, while the massiveness of the factory buildings, the heaviness of the machinery and, last but not least, the permanently tied labor ‘bonded’ the capital. Neither capital nor labor was eager, or able, to move.

Urban centers were hubs of industry, fostering, in the words of political scientist Douglas Rae, a “civic fauna.” The rich and the poor lived close together and intermingled by participating in common civic projects. Although hardly utopias—cities struggled with public health problems, pollution, and ethnic and racial antagonism—the flow of capital through cities created jobs and a rich cultural infrastructure. But as transportation and communication technology advanced, urban investment slowed, moving away from the expensive real estate and high taxes of the city toward greener pastures in the country. As a result, many cities over the last century experienced massive unemployment and high crime as populations followed the flow of capital to middle-class enclaves in the suburbs.

Yet, today, our cities again are seeing fresh investment due to new emerging economic sectors in knowledge and technology. Instead of building factories, investments in the burgeoning knowledge economy focus on human capital, innovation, and lighter technologies. These new sectors have resulted in job growth in software and pharmaceutical development, biotech, digital entertainment, and financial innovation, among other fields.

Urban centers are prime locations for these knowledge-based industries. The reason, according to Professor Dana Silver, is that “[Cities] spur innovation by facilitating face-to-face interaction, they attract talent and sharpen it through competition, they encourage entrepreneurship, and they allow for social and economic mobility.” As these sectors continue to grow, capital is again flowing back into cities. Yet cities must recognize that the form of capital has changed, and with that transformation come not only opportunities but also new challenges.

As labor shifted from working with machines and metal to generating new ideas and innovations, capital became highly mobile and disruptive—where once capital was “heavy,” it is now “light.” “The disembodied labor of the software era,” says Bauman, “no longer ties down capital: it allows capital to be extraterritorial, volatile, and fickle. Disembodiment of labor augurs weightlessness of capital.” The rapid movement of money allows a city to react quickly, investing in nascent tech industries. New, successful businesses can spring up overnight in seemingly any place.

However, even in an era of highly mobile capital, capital is not moving just anywhere, but to very specific destinations. Though the death of distance has long been heralded, place matters more than ever in the knowledge economy. Cities such as San Jose, San Francisco, New York, Washington D.C., Raleigh-Durham, Seattle, and Austin are places of intense knowledge-based economic growth. Each of these cities contains a cluster of similar high-tech industries that serve to reinforce the vitality of the region.

Although cities are again becoming important economic engines, their revival may not be evenly felt across the country. Innovation hubs take time to develop, and certain areas have historical or geographic advantages. Silicon Valley, for example, owed much of its early growth to being a site of Cold War research, and more than thirty years passed before it became a world technology center.

As they attempt to harness this growth, smaller and less-established cities will encounter growing pains. They may experience something closer to what I found working at a successful education tech company. A decade after it was founded in a small city, the company was purchased by a private equity firm. The investors moved the company to a larger, more innovative metropolitan area, laying off more than one hundred employees—a considerable blow to the local job market.

Innovation sectors require a highly educated and technologically sophisticated population. Such a population takes vast resources to develop, and its emergence carries potentially tremendous social consequences. As Tyler Cowen argues:

Th[e] imbalance in technological growth will have some surprising implications. For instance, workers more and more will come to be classified into two categories. The key questions will be: Are you good at working with intelligent machines or not? If you and your skills are a complement to the computer, your wage labor and market prospects are likely to be cheery. Ever more people are starting to fall on one side of the divide or the other. That’s why average is over.

The new flows of capital are not only changing cities economically, but also socially. In a Slate article, economist Robert Frank argues “top salaries have been growing sharply in virtually every labor market because of two factors—technological forces that greatly amplify small increments in performance and increased competition for the services of top performers.” This economic environment can deepen existing economic divides and exacerbate social tensions within a city. High-growth superstar cities such as San Francisco are struggling with exactly this problem.

In a rush to grow, cities may overlook those who are not part of the new economy. Worries over gentrification include more than simply displacing low-income families—these same people may be shut out of access to greater economic opportunity. The knowledge economy can disproportionately reward skilled individuals and the sectors that employ them, while creating a sharp economic divide felt locally and nationally.

Unsurprisingly, many cities would rather have the problems of booming San Francisco than those of struggling Rust Belt cities like Detroit, Michigan. Cities such as Columbus, Ohio, are developing knowledge indicators that can help target new tech industries. Richard Florida’s conception of the “Creative Class” (which encourages cities to appeal to artists and other creative workers as a path to economic growth) has been enthusiastically received across the country. As innovation sectors continue to grow, cities have again become investment targets, a trend that brings economic as well as urban revitalization. But with opportunity come challenges. For a city to thrive, governments and business leaders will need to grapple with the seismic economic changes underway. Their metrics and laws will need to be updated to capture this shift. More importantly, they will need to be vigilant to ensure that the new flows of capital benefit not just a select few, but everybody.

Stephen Assink is curator and manager of Common Place. He is also a member of the Principal Investigator team for the Thriving Cities Project.

. . . . . . . .

Like The Hedgehog Review on Facebook, follow us on Twitter, and subscribe to our posts via RSS.


Reflecting on “Data” and “Big Data” for Cities

The supercomputer Arctur-1, 2012. By Arctur (Own work) [CC BY-SA 3.0], via Wikimedia Commons.

Part 4 of the series Thriving Cities in a World of Big Data 

Given the rapid pace of city growth and the concurrent demand for better infrastructure and services, pressure on city leaders and managers to make smart policy and planning decisions around investment has never been greater. Limited public budgets, demands for open participatory government, and aging and deteriorating infrastructure also add to the complexities of achieving prosperous, sustainable, resilient, and inclusive cities. This increasingly complex planning environment is driving the demand for data on cities.

The massive collecting and sorting of information known as Big Data is responding to this need and becoming a necessary and useful tool for city leaders. However, in order to create broader knowledge of cities, Big Data must be contextualized and complemented by standardized, comparative city metrics, driven by the demands of city leaders themselves. Standardized indicators reported by cities, such as those in the new international standard ISO 37120, Sustainable Development of Communities – Indicators for City Services and Quality of Life, are needed to provide a more complete picture of city performance to inform decision making.

ISO 37120 was published in May 2014 by the International Organization for Standardization (ISO). It defines and establishes methodologies for a comprehensive set of indicators that enable a city of any size, in a developed or developing economy, to track and measure its social, economic, and environmental performance in relation to other cities.

Standardized indicators can help reframe Bent Flyvbjerg’s question, “Where are we going?,” as “Where ought we be going?” For cities, standardized indicators are important for benchmarking, guiding investments, and tracking progress. Cities are positioned to benefit from this type of data precisely because standardization enables city-to-city learning and the exchange of best practices. Such data also empowers citizens by making them more informed about their city’s service delivery, with the end goal of improving quality of life.

ISO 37120 represents a critical shift in thinking when it comes to city data. It provides cities and stakeholders with a standardized approach and a global framework for third party verification of city data.

Noah Toly points to an example from Anthony Townsend on how Big Data helped to locate residents in Chicago who were vulnerable to extreme weather events. While Toly points out that Big Data may be able to provide this type of information, he argues that it is less likely to inform decision makers on why these people are vulnerable and what should be done to mitigate their risk. This is where standardized metrics can complement Big Data by helping to track cities’ readiness to respond to extreme weather events and other risks. For this reason, in addition to ISO 37120, another standard for indicators on urban resilience is now being considered by the ISO. Risk and resilience indicators reported by cities will complement, and in fact inform, Big Data on risk from extreme weather events and help track how neighborhoods will be affected.

The first ever certification system and Global Registry for ISO 37120 has been developed by the World Council on City Data (WCCD). The WCCD, launched in May 2014 at the Global Cities Summit in Toronto, has been established to take this critical data agenda forward. The organization is coordinating all efforts on city data to ensure a consistent and comprehensive platform for standardized urban metrics through ISO 37120 and future standards under development.

Back to the question at hand: Big Data can be a tool for government. However, with regards to the concerns that Toly raises, it should not be the only tool. Big Data should be one of many datasets that cities turn to in order to ensure that cities are in fact headed in the right direction when it comes to sustainable planning for the future.

Patricia McCarney is President and CEO of the World Council on City Data (WCCD), and a Professor of Political Science and Director of the Global Cities Institute (GCI) at the University of Toronto.

. . . . . . . .


Harnessing Big Data to Democratic Ends

Seattle skyline at night, 2002. Wikimedia Commons.

Part 3 of the series Thriving Cities in a World of Big Data 

It’s easy to be afraid of Big Data. (Like Big Brother or Big Tobacco, it’s coming for us.) It’s even easier to be excited about it. As Noah Toly noted in his recent posts on Common Place, the same qualities of Big Data can inspire both utopian dreams and dystopian fears. But what can Big Data do for, or to, democracy?

In 1992, the United Nations Conference on Environment and Development (UNCED) called for governments and NGOs to “develop and identify indicators of sustainable development in order to improve the information basis for decision-making at all levels.” The hope was to craft new, large-scale statistical resources that would help assess and shape policies. The UNCED’s goal was twofold: first, to “bridge the data gap” that exists at all levels of government on key environmental and economic issues; second, to improve “information availability” in order to ensure that data be accessible to all decision makers and managed securely and openly. UNCED hoped not only to improve elite decision making but also to democratize sustainable development practices.

The data the UNCED proposed to track would serve democracy in a “broad sense,” helping individuals and institutions at both the international and grassroots levels to engage with the pressing questions of our time. But while obviously helpful to the functioning of democratic societies, these large-scale statistical measures also present problems for them. For example, elites can use such data to advance what appear to be their interests alone—a truth borne out by metrics such as GDP, which can be used to monitor and inform the economic power of the wealthy without reflecting the well-being of the population as a whole.

Indeed, assorted measurements and data have long been used by governments and political elites to control populations, going back to the first efforts by monarchs to compile thorough census data on their subjects. As political scientist and anthropologist James Scott shows in Seeing Like a State, the transparency that statistical measures give to complex political phenomena can also make citizens more “legible” to, and thus controllable by, political elites. Big Data thus not only abets surveillance, but can also bring the politically “illegible” into the fold by forcing their normalization.

Then, too, Big Data can be used to assert the sufficiency of statistical fact, thereby sometimes curtailing robust or fully informed democratic debate. Take recent partisan arguments pitting economic stimulus against austerity. Neither side is prepared to engage in a conversation over the data itself. Both sides claim to be in possession of the facts, the left asserting that stimulus will lead to sustained economic growth, the right that austerity is the only route to the same destination. In this case, as in others, conversations beginning with the assertion of absolute facts tend to end either in stalemate (as in recent debates about the federal budget in America) or in technocracy, where the statistician is favored over the popular will of the people (as with Italy’s Monti government).

Can we have the benefits of Big Data without the drawbacks? Is there a way to harness the democratic power of information while also promoting democratic open-mindedness and popular empowerment? The work of geographer Meg Holden, who studied the development and implementation of a regional environmental impact index called Sustainable Seattle (S2), is useful here. Holden’s study of S2 shows how complex phenomena such as urban sustainability and climate change can be made subjects of political debate through statistical measures.

Holden shows that grassroots attention to indicator development and application allowed the S2 project to bridge existing learning gaps between local politics and dispersed economic, ecological, cultural, and institutional phenomena. Rather than shifting knowledge of large-scale phenomena outside democratic debate, S2 promoted “social learning.” Residents of Seattle could (and did) use its findings to promote better democratic debates.

The demands S2 placed on its developers were many. Among other things, they had to become statistical experts in indicator development while finding measurements that meaningfully correlated to ecological questions. They had to be marketers who could advertise their project and findings to their community. And finally, they had to lobby in support of their findings in order to have an impact on local politics. While Holden shows that S2 was imperfect, the responsiveness of its developers to broad public concerns makes it a model for those hoping to harness Big Data for democratic ends.

Above all, the S2 project leaders recognized the limits of Big Data. Its developers acknowledged the imperfect nature and sources of data, the limitations inherent in its processing, the necessity of packaging findings, and the need to bring findings to all audiences and institutions. In the long run, such chastened optimism and humility may prove to be the most helpful lessons of all.

Callum Ingram is a graduate student in the Department of Politics at the University of Virginia. His dissertation research focuses on the use of urban space and architecture by democratic social movements.

Editor’s note: For more on this topic, subscribe to receive the spring issue of The Hedgehog Review, “Too Much Information.”

. . . . . . . .
