In June I took the general exams for my PhD program, which involved a one-week written and oral test on topics related to my chosen fields — urban information systems and democratic land use planning. This means over the past year I’ve plowed through much of the literature on urban modeling from the 1950s to the present day. As a result, I’ve been feeling acute déjà vu reading about the latest efforts by IBM and others to model “smart” cities, presented as a new frontier for cities devoid of any previous research.
For example, here is a description of an IBM project announced this week:
> This problem–if you can’t measure it, you can’t manage it–combined with the impulse to improve cities by models, is driving both IBM’s “smarter city” strategy and the nascent “urban systems” movement, which seek to apply complexity science to cities. IBM … today announced the latest plank in its smarter city platform: an “app” containing 3,000 equations which collectively seek to model cities’ emergent behavior. IBM also revealed its first customer, the City of Portland, Oregon. Systems Dynamics for Smarter Cities, as the app is called, tries to quantify the cause-and-effect relationships between seemingly uncorrelated urban phenomena. What’s the connection, for example, between public transit fares and high school graduation rates? Or obesity rates and carbon emissions? To find out, simply round up experts to hash out the linkages, translate them into algorithms, and upload enough historical data to populate the model.
Here is a description of Jay Forrester’s 1969 book Urban Dynamics. (An MIT professor emeritus, Forrester is known as the founder of System Dynamics.)
> In this controversial book, Jay Forrester presents a computer model describing the major internal forces controlling the balance of population, housing, and industry within an urban area. He then simulates the life cycle of a city and predicts the impact of proposed remedies on the system. Startling in its conclusions, this book became the basis of a major research effort that has influenced many government urban-policy decisions.
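To make concrete what this kind of system dynamics model looks like, here is a minimal stock-and-flow sketch in the Forrester style. The stocks (population, housing, jobs) echo Urban Dynamics, but every rate and coefficient below is invented for illustration — this is not Forrester’s actual model:

```python
def simulate(years=50, dt=1.0):
    """Toy Forrester-style stock-and-flow model. Stocks change through
    flows that depend on ratios between stocks, creating feedback loops.
    All rates and coefficients are invented for illustration."""
    population, housing, jobs = 100_000.0, 40_000.0, 50_000.0
    history = []
    for _ in range(int(years / dt)):
        # Feedbacks: migration responds to job availability, out-migration
        # to housing scarcity, construction and job growth to both.
        job_ratio = jobs / population
        housing_ratio = housing / population
        in_migration = population * 0.05 * job_ratio
        out_migration = population * 0.04 * (1 - housing_ratio)
        construction = housing * 0.03 * job_ratio
        demolition = housing * 0.015
        job_growth = jobs * 0.02 * housing_ratio
        job_decline = jobs * 0.01

        population += dt * (in_migration - out_migration)
        housing += dt * (construction - demolition)
        jobs += dt * (job_growth - job_decline)
        history.append((population, housing, jobs))
    return history

final_pop, final_housing, final_jobs = simulate()[-1]
```

The point of the exercise is the feedback structure: every flow depends on ratios between stocks, so small changes in the coefficients can produce qualitatively different long-run trajectories. Scaling this pattern up to thousands of equations changes the size of the model, not its epistemology.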
The contemporary smarter cities discourse seemed to start as merely a marketing ploy, but recently its proponents have sought a more substantial foundation. Perhaps there is more under the surface, but so far all I have seen is warmed-over systems modeling or systems optimization of the kind invented in the 1950s and 1960s. If the promoters of these methods hope for contemporary relevance, they must explain why — and how — the severe challenges these approaches face in a democratic society can be overcome.
Perhaps the best-known article in this field is Douglass Lee’s 1973 “Requiem for Large-scale Models” (PDF), in which Lee, then a freshly minted Berkeley PhD, laid out the “seven sins” of the early generation of large-scale models (which included Forrester’s urban dynamics model): hypercomprehensiveness, grossness, hungriness, wrongheadedness, complicatedness, mechanicalness, and expensiveness. Just as importantly, he described desirable characteristics for city models:
- Transparency (“‘Black-box’ models will never have an impact on policy other than possibly through mystique, and this will be short lived and self-defeating.”)
- Balance between theory, objectivity, and intuition (“large-scale modeling has been significantly lacking in theory”)
- Start with a particular policy problem that needs solving, not a methodology that needs applying
- Build only very simple models
These recommendations reflect two fundamental differences between cities and other complex systems: randomness and democracy. These underlying theoretical challenges face any would-be urban modeler, from hacktivist to corporate consultant, engaged in the “battle for control of smart cities,” described by Anthony Townsend in a 2010 report and in his forthcoming book.
Urban systems aren’t just complex systems; they’re highly random ones, subject to internal and exogenous shocks (gas prices, hurricanes, Justin Bieber concerts) that are almost impossible to model, let alone predict. Most concerning, contradictory theories describe these models’ most important variable: human behavior. Each of these theories — economics’ utility maximization, sociology’s social norms — has some validity but limited explanatory power.
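A toy Monte Carlo makes the point about shocks concrete: even a modest random disturbance each year, compounded over a planning horizon, swamps a long-range point forecast. (All numbers here are invented for illustration.)

```python
import random

def forecast(years=30, shock_sd=0.03, seed=None):
    """Project a quantity (say, transit ridership) growing 2% per year,
    hit by a small random exogenous shock each year. Illustrative only."""
    rng = random.Random(seed)
    value = 100.0
    for _ in range(years):
        value *= 1.02 + rng.gauss(0, shock_sd)
    return value

runs = [forecast(seed=s) for s in range(1000)]
spread = max(runs) / min(runs)  # ratio of best- to worst-case outcome
```

Across a thousand runs the best and worst outcomes typically differ by a factor of two or more — and this is with shocks that are small, independent, and normally distributed, assumptions real cities routinely violate.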
Second, the promise of urban optimization must be reconciled with democratic government. IBM has been running ads in which their employees boast of all the good things they are doing — tracking food for safety or reducing crime. Every time I see them, I think about priorities and trade-offs. Who decided these were the right priorities for resources? Individually these are laudable goals, but they can only be judged in context. Only a democratically legitimate government can determine whether money is well spent on a food- or crime-tracking system, versus other pressing concerns like education, health care, and infrastructure.
This post is not a critique of using data and analytical methods in urban policy. To the contrary, I think they’re as needed as ever, and I have been working with MAPC on a scenario modeling platform. There very well may be analytical innovations, like cellular automata, genetic algorithms, or complexity theories, that could be applied to create useful urban models. However, new technologies and new buzzwords do not eliminate the long-running theoretical and practical challenges of using models to improve urban life, or the importance of learning from history.
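For a flavor of what the cellular automata route looks like, here is a toy urban-growth automaton — the rule and parameters are invented for illustration, not taken from any published model: an empty cell develops with some probability for each developed neighbor.

```python
import random

def grow(grid, rng, p=0.3):
    """One step of a toy urban-growth cellular automaton: an empty cell
    develops with probability p per developed neighbor (wrapping edges).
    Rule and parameters are invented for illustration."""
    n = len(grid)
    new = [row[:] for row in grid]
    for i in range(n):
        for j in range(n):
            if grid[i][j]:
                continue
            developed_neighbors = sum(
                grid[(i + di) % n][(j + dj) % n]
                for di in (-1, 0, 1) for dj in (-1, 0, 1)
                if (di, dj) != (0, 0)
            )
            if any(rng.random() < p for _ in range(developed_neighbors)):
                new[i][j] = 1
    return new

# Seed one developed cell in the center and run a few steps.
size, rng = 11, random.Random(0)
grid = [[0] * size for _ in range(size)]
grid[size // 2][size // 2] = 1
for _ in range(5):
    grid = grow(grid, rng)
developed = sum(map(sum, grid))
```

Even this trivial rule produces sprawl-like patterns that depend sensitively on p and the initial seed — a useful thinking tool, but a long way from a validated planning model, which is precisely the distinction the history above warns us to keep in mind.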