Posted: April 6th, 2010 | Author: Rob Goodspeed | Filed under: Government, Infrastructure, Public Policy, Urban Development | Comments Off
The always-interesting Witold Rybczynski has a provocative piece up on Slate arguing that the failure of government-led urban planning means that "in a democracy, a vision of the future city will best emerge from the marketplace." I don't disagree with his observation that private organizations and real estate developers have taken the lead in shaping our cities; however, I don't believe it follows that the government has no role whatsoever.
Such an argument erases the many ways governments are deeply involved in planning urban spatial structure: designing and operating streets and other infrastructure, regulating urban land markets through enforcement of property rights and zoning, shaping the location and character of development through wetlands and other environmental regulations, subsidizing and shaping the housing finance system, and establishing and enforcing building codes and standards, to just name a few.
Although Rybczynski is right that the government has largely withdrawn from the business of directly engaging in architecture and urban design (and that's a good thing), the lesson isn't that government should (or will) withdraw completely. The stark contrasts in quality of life between well- and poorly-governed cities illustrate just how important these more subtle processes of planning remain. His argument reminds me of Peter Montgomery's thoughtful analysis of Jane Jacobs's The Death and Life of Great American Cities. Although her critique of Robert Moses and heavy-handed modernist city planning is important, Montgomery argues that her celebration of the urbanity of her neighborhood omits the government processes that establish the framework of urban life (zoning, the subway system, urban services, etc.). In this way it can be read as a neoconservative tract, writing out the role of government. (In addition, Montgomery argues she ignores corporations, class and race divisions, and metropolitan equity.)
To be fair, Rybczynski does stress the importance of government for "management" and "little plans," and to a degree I'm just rejecting his definition of planning. But the point I hope to make is that the "urban visions" created by real estate developers aren't a pure product of the market, but are derived from government-determined transportation systems, zoning, and metropolitan spatial structure.
The more interesting and accurate conclusion to draw from the failures of modernist city planning is to consider which forms of government planning are still active and desirable. In this sense, Rybczynski’s article is a bit behind the times. The tremendous interest in high speed rail, urban transit, green building codes, the government’s role in wind power and broadband, and housing finance regulation has reminded us of the central role of government in shaping our cities. Hopefully this will be the legacy of the Obama era: that the choice between government and the market is a false dichotomy. Because the two are mutually dependent, addressing public problems (such as city planning, and yes, health care) requires attention to the design of each.
> Witold Rybczynski – Don’t Plan On It: Centralized city planning is not the answer to the problems facing America’s cities.
Posted: April 1st, 2010 | Author: Rob Goodspeed | Filed under: Urban Development | Tags: Census, Data | Comments Off
Although a majority of Americans have already returned their Census forms, technically today is the “Census Day” for the purposes of determining where people should be counted. How is your community doing? Take a look on the Census Bureau’s nifty participation rate map, where you can get a widget for the national participation rate or any county, city, or tract in the country.
Posted: October 28th, 2009 | Author: Rob Goodspeed | Filed under: Energy, Sustainability, Technology, Urban Development | Tags: Models | 4 Comments »
Computer modeling is a powerful tool for analyzing complex urban systems. Indeed, for decades metropolitan-scale transportation planning has been informed by increasingly sophisticated computer models. In addition, models are commonly used to study all types of infrastructure systems, the urban environment, and even the possible locations of future urban growth. In fact, I'm building an attractiveness model for future residential development in South Florida in a class this semester.
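For readers curious what such a model involves, attractiveness (or suitability) models are often built as a weighted overlay of spatial factors. The sketch below is purely illustrative: the factor names, weights, and grid values are invented for this post, not taken from the actual class project.

```python
import numpy as np

# Hypothetical 3x3 raster grids (values normalized to 0..1) for each factor.
# Factor names, weights, and values are invented for illustration.
transit_access = np.array([[0.9, 0.7, 0.2],
                           [0.8, 0.5, 0.1],
                           [0.6, 0.4, 0.0]])
flood_risk = np.array([[0.1, 0.2, 0.8],
                       [0.0, 0.3, 0.9],
                       [0.2, 0.5, 1.0]])
existing_density = np.array([[0.5, 0.6, 0.1],
                             [0.7, 0.4, 0.2],
                             [0.8, 0.3, 0.1]])

weights = {"transit": 0.5, "flood": 0.3, "density": 0.2}

# Weighted overlay: cells score high on benefits and low on risks.
attractiveness = (weights["transit"] * transit_access
                  + weights["flood"] * (1.0 - flood_risk)  # invert: less risk is better
                  + weights["density"] * existing_density)

best = np.unravel_index(np.argmax(attractiveness), attractiveness.shape)
print(best)  # (row, col) of the cell most attractive for new development
```

In practice the grids would be real raster layers (transit stops, flood zones, parcel data) and the weights would come from statistical estimation or stakeholder input rather than being picked by hand.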
However, models can have insidious effects. They excel when applied to deterministic systems, where the rules are static and known, but often fail when applied to systems with arbitrary or random characteristics. Even more troubling, models can impede decision-making by hiding their assumptions, introducing bias into the simulation.
In this light, let's consider a simple model developed by the oil company Chevron. Their "Energyville" game is located on the Will You Join Us website, now being promoted through magazine and TV ads that position the company as one interested in finding energy "solutions" and using it "wisely." Energyville is presented as a neutral challenge: "What energy sources will power your city?" A disclaimer reminds the user that the assumptions are "based on The Economist Intelligence Unit's assessment of global facts and trends obtained from numerous credible sources." The warning observes the game makes many simplifications, acknowledging "global forces and technological developments may change current and future assumptions." The game's aesthetics show a clear influence of the popular SimCity.
Launching the simulation, I begin by placing some wind turbines in the city. After installing three turbines, the limit is reached: "Geographical and other constraints prevent Wind power from providing any more power to Energyville." Next, I turn to solar panels. After placing just two, I get this error: "Unavailable! Solar panels are still too cost prohibitive and inefficient to provide any more power to Energyville." The only remaining renewable energy source is a massive conventional-looking hydroelectric dam. After installing it on the river, most of my city's electrical needs are met.
All except for the ever-important petroleum. No Better Place-style electric car network is possible here. A message quickly appears, "Warning! Low on fuels," saying I need petroleum for airplanes, vehicles, and mass transit. Only once I put a huge petroleum platform in the ocean could I proceed to the next level.
Before level 2 begins, the simulation presents a policy choice: should I adopt energy efficiency measures that will improve environmental quality and “security,” while placing a tax on economic output? Round two is similar, with a couple surprises. First, my wind farms are in trouble:
Ironically, my attempt at developing renewable sources was thwarted by the very global warming I am concerned with! Next, my solar program is in trouble:
I’m a bit confused by this one. After all, all energy costs money. Solar panels can only be “too costly” if cheaper alternatives are available. What these options are – continued petroleum, nuclear, or some other source, is not explained. In this round my fossil fuels are unaffected by catastrophe.
Repeated plays revealed that other game paths have other possible events. In one case, solar panels become more attractive to homeowners due to net metering policies, and two events actually make wind power even more attractive, due to vaguely specified improved technology or other benefits. If you invest in nuclear, it warns that uranium may increase in price due to global demand. Once, a terrorist attack in the Middle East tightened the oil supply. But petroleum price and supply rarely play a role in the problem, despite historical evidence as recently as the summer of 2008 that they can be subject to major price volatility. (Incidentally, I think the game was created in 2007.)
In the end, is this a fair simulation? Despite the capricious nature of some of the factors, most of the assumptions are probably reasonable. Presumably Chevron is too savvy to deliberately plant obvious biases in the assumptions; nevertheless, I'm sure a serious energy wonk could find plenty to quibble with. However, like too many models, Energyville doesn't clearly reveal its underlying assumptions or allow the user to question or manipulate them. Although the limits placed on the speed at which alternative fuels can be rolled out are probably derived from mainstream sources, history shows that change, whether beneficial or catastrophic, can be surprisingly rapid.
This means Energyville misses a major educational opportunity, despite its appearing on an educational blog (the only Technorati link to the site). The Flash interface makes it impossible to copy text and contains no links to external sources; the "about" page lists dozens of unlinked articles, reports, and websites; and no assumption is presented as contested. Although thousands of players may learn a few facts embedded in the game, or gain a vague sense of the benefits and limitations of various energy sources, it doesn't support serious examination of and debate about energy technology or policy. But maybe that's the point.
Posted: October 15th, 2009 | Author: Rob Goodspeed | Filed under: Technology, Urban Development | Tags: collaboration, Urbanism and Planning | 2 Comments »
Over at my MIT webspace I just launched a database of web tools for participation and collaboration in planning. I created it mostly to help myself keep track of all the technology and consultants in this area, and also because of my dissatisfaction with existing databases. It’s not meant to be all-encompassing, just cover the important tools and some of the more innovative projects out there. Comments and suggestions are welcome!
> Web Tools for Participation and Collaboration in Planning
Posted: October 12th, 2009 | Author: Rob Goodspeed | Filed under: Technology, Urban Development | Tags: Data, open cities, open data | 6 Comments »
Last week’s Open Cities conference, sponsored by the Rockefeller Foundation and Next American City, brought together a diverse group to discuss the role of new media in shaping urban policy. One of the major topics discussed was the emerging trend of cities establishing data catalogs where a wide range of datasets and feeds are made available, often with the explicit goal of enabling private apps that will use the data to create value. Washington, D.C.’s data catalog is a national leader, and San Francisco, Boston, and others not far behind. (Through sheer coincidence, New York City announced their BigApps contest during the conference.) In addition to the city-led programs, a host of other sources — from Google Transit to Data.gov — are making urban data more available than ever.
Within government, data can be a powerful tool for management and service delivery. Baltimore’s CitiStat and its emulators have shown the power of data to focus on the bottom line for easily quantified government services and policies. Applications for e-management within government are many, and today’s New York Times story on IBM’s Smarter Cities initiative describes several.
Outside of government, the case is less clear. Some at the conference questioned whether governments should expend their limited resources on finding, cleaning, and publishing data. I think this debate is largely won. The costs of hosting data have dropped precipitously, most of the datasets have already been paid for with taxpayer money, and the resulting apps really do seem to create new value for city residents. Less clear, however, is whether disclosing data to the public will have any impact on urban policy.
It is this deeper question that lurks in the background of conversations about data: although more and more may be available, does it influence urban policy or planning? A conference attendee who works for the mayor of a major East Coast city suggested as much at one point: in his opinion, the city was driven by politics, not data.
On the one hand, data seems badly needed in planning. Urban planners analyze data to understand trends, and every city plan contains detailed tables, charts, and data analysis. Outside government, community development corporations and nonprofits are also frequent data users: for grant applications, for advocacy, and for exploring trends in urban neighborhoods. In fact, hundreds of government planners, nonprofit employees, community activists, and citizens came to the conference I helped organize here in Boston last summer titled "Data Day: Using Data to Drive Community Change."
However, the cynic will retort that there are "lies, damned lies, and statistics." Certainly, government planners and activists need data, the argument goes, but only to support their particular agenda or policy. Taken to the extreme, this jaded view says you can find statistics to back up any belief.
This wasn’t always the case. In fact, for a brief period in the 1960s there was a great deal of interest about the possibility of establishing “social indicators” analogous to economic indicators. Just as economic indicators, such as unemployment rate, are used to determine economic policy, social indicators would guide social policy. Judith Innes in her 1975 book Social Indicators and Public Policy argued social indicators could be created, but must rely on a consensus understanding of definitions and measurement. The book’s fascinating history of the unemployment rate shows how the measurement has responded to cultural values about who to count. Despite thousands of books and articles on indicators in the late 60s and early 70s, the movement didn’t take off as expected. Defining social indicators was value-laden, collecting social data expensive, and focusing on data seemed irrelevant to a turbulent, problem-filled world. It’s little wonder when the second edition of Innes’ book appeared in 1990 it was re-titled Knowledge and Public Policy.
Although they fall short of her definition of an indicator, many government datasets do provide a common framework for discussion and analysis, and perhaps even guide policy creation. Although the datasets are often imperfect, their flaws are well known to all users. In the 1990s, a number of "indicators" projects emerged, organized into the National Neighborhood Indicators Partnership. Generally based in nonprofits or foundations, these projects took advantage of new technology and plentiful government data to track measures of their choosing. (At MAPC, I worked closely with the Boston affiliate, the Boston Indicators Project.)
Today, thanks to rapidly evolving technology, more urban data is available than ever. Its role is equally ambiguous: it is simultaneously in demand by diverse users for advocacy, government service delivery, and perhaps the crafting of urban policy. At the conference, federal officials reminded the group that the Obama administration is interested in evidence-based governance, and President Obama even elevated the former architect of the D.C. data catalog, Vivek Kundra, to be the nation's first Chief Information Officer. In an interesting way, perhaps during times of concern for the public interest we are more likely to view data as a shared resource for deliberation and discussion of new policies and plans. We may be in a new era of data availability, but as always what matters isn't the numbers themselves, but how we view them.
Posted: August 5th, 2009 | Author: Rob Goodspeed | Filed under: Transit, Transportation, Urban Development | Tags: High Speed Rail, Passenger Rail | 1 Comment »
Advocates for passenger rail in America are excited. The stimulus bill provided $8 billion for high speed rail construction, California has passed a bond for nearly $10 billion to build a system in that state, and other projects from Florida to Chicago are moving forward. The federal government is planning to issue the grants to "jump-start" development of a national system this fall. In this climate, Harvard economist Edward Glaeser today posted the second entry in a series in which he attempts an economic analysis of high speed rail.
“Personally, I almost always prefer trains to driving,” Glaeser wrote in his introductory post to the series last week, but quickly added, “the public must be wary every time our leaders decide to spend billions of our tax dollars,” citing economists who have long argued passenger rail is rarely worth its cost.
The result of the first analysis is unsurprising: an unfavorable finding from a conventional cost-benefit evaluation. Some will quibble with the assumptions or methodology, and they are very rough. (Ryan Avent posted a scathing critique on Streetsblog earlier today; the Transport Politic also evaluated the assumptions.)
However, I’m not going to attack the math. I’m not overly worried about the outcome of a cost-benefit analysis calculation, because we almost never make big transportation infrastructure decisions through this kind of analysis. We built the interstate highway system because it was deemed a suitable national goal. Cities are building light rail transit because they want it for a variety of reasons. Certainly, economic analysis informs many specific aspects of the system design, such as weighing possible routes, deciding on service levels, and sometimes selecting the mode. However, experienced transportation planners know because of the future’s uncertainty, even the most rigorous analysis can be wrong. And on many of the big questions, such as what mode of transportation to invest in and where it should go, are decided through the political process regardless of whether they make economic or practical sense. Finally, even the most brilliant and accurate cost-benefit analysis is only meaningful if the actual cost is somewhere close to the estimate cost. It turns out creating accurate estimates — and ensuring the project is built for a similar price — is very difficult.
For large infrastructure projects like high speed rail, accurate cost estimates almost never happen. According to Bent Flyvbjerg's Megaprojects and Risk, the cost overruns can be truly spectacular:
| Project | Cost overrun (%) |
| --- | --- |
| Boston Central Artery/Tunnel Project (Big Dig) | 196 |
| Great Belt rail tunnel, Denmark | 110 |
| Shinkansen Joetsu rail line, Japan | 100 |
| Washington Metro, USA | 85 |
| Channel tunnel, UK, France | 80 |
| Paris-Auber-Nanterre rail line | … |
That means the total actual cost of Boston's Big Dig was nearly three times the original estimate. (A 100 percent cost overrun means the actual cost was double the estimate.) On average, across the 258 projects studied in the book, actual costs were 45 percent higher than estimated.
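The arithmetic behind these percentages is simple enough to sanity-check:

```python
def actual_cost(estimate, overrun_pct):
    """Final cost implied by a percentage cost overrun on an estimate."""
    return estimate * (1 + overrun_pct / 100.0)

print(actual_cost(100, 100))  # a 100% overrun doubles the estimate: 200.0
print(actual_cost(100, 196))  # nearly triple, as with the Big Dig
print(actual_cost(100, 45))   # the 258-project average overrun from the book
```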
The causes of what the authors call this "calamitous history" are many and diverse. For the list above, I selected a variety of projects, from different times and cultures, to show it's not uniquely an American problem. They're not even all government-led projects; most notably, the Channel tunnel was a private initiative. Private ownership does help, but cost overruns for a selection of private transport projects in the book run from 80% for the Channel tunnel to 15% for France's Pont de Normandie bridge.
Much of the rest of the book is a detailed discussion of the causes for error in the creation of estimates, and possible reforms to improve the efficiency of megaproject construction. On the cost estimate side, the authors argue project studies rarely incorporate sufficient accounting of the inevitable risk, and are often skewed to meet political needs.
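One way to bake that risk accounting into an estimate, in the spirit of the authors' argument, is to budget from the distribution of past overruns on comparable projects rather than from a single point estimate. The overrun factors below are invented for illustration:

```python
import random

random.seed(0)

# Hypothetical overrun factors (actual cost / estimated cost) observed on
# past comparable projects; invented numbers for illustration.
historical_factors = [1.05, 1.15, 1.3, 1.45, 1.5, 1.8, 1.96, 2.1]

base_estimate = 1_000  # engineer's point estimate, $M

# Bootstrap a distribution of plausible final costs from the history.
samples = sorted(base_estimate * random.choice(historical_factors)
                 for _ in range(10_000))
p80_budget = samples[int(0.8 * len(samples))]
print(p80_budget)  # a budget that history suggests suffices ~80% of the time
```

The point of the exercise is that the risk-adjusted budget sits well above the point estimate, which is exactly the gap the book argues project promoters routinely ignore.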
On the construction efficiency side, the authors propose four basic “instruments of accountability”:
- The involvement of risk capital: At least one-third of the project’s budget should come from private investors with no sovereign guarantee – i.e., they would lose money if the project goes over budget or has lower revenue than anticipated.
- Explicit formulation of the regulatory regime: identifying all associated costs, even non-obvious ones like access ramps and stations; creating independent environmental review and other necessary oversight committees; and defining the project's financial and decision-making structures. They suggest a state-owned-enterprise or build-operate-transfer approach, which places the government at arm's length and minimizes the conflict of interest that arises when the government acts both as an active booster and as guardian of public-interest objectives.
- Performance specifications: define performance targets for technical and environmental goals instead of prescribing the specific approach.
- Transparency: enhanced transparency and public involvement to scrutinize plans and minimize opposition.
Our conventional project planning system could stand to gain from some of these reforms, and now is the time to incorporate them into the selected high speed rail projects. I still believe it's important to try to create accurate estimates of total costs and benefits, and to bring them into the decision-making process. However, since cost-benefit models are limited by uncertainty and often disregarded by the political system, we must also focus our attention on how to complete projects in the most cost-effective way possible.
> Edward Glaeser series: Part 1, Part 2
> Flyvbjerg, Bruzelius, and Rothengatter: Megaprojects and Risk
Posted: August 3rd, 2009 | Author: Rob Goodspeed | Filed under: Technology, Urban Development | Tags: APA, Planning, Social Media | 1 Comment »
Over the past year I've become involved in the American Planning Association's Technology Division, an interesting group of academics and professionals focused on technology in planning. This summer, together with PB's Steve Chiaramont, I became co-editor of the Division's newsletter, Planning and Technology Today.
Our first issue contains stories on GIS, regional scenario analysis, and the use of social media to engage citizens in planning. Check it out online or download the PDF.