Posted: December 19th, 2011 | Author: Rob Goodspeed | Filed under: Boston, Transit | Tags: fares | 1 Comment »
Boston’s subway plays a critical role in the city. Despite a fare increase in 2007 and a dedicated portion of the state’s sales tax, in recent years the agency’s tight budget (driven partly by labor, health care, and energy costs) has prevented needed maintenance and upgrades. With many of the system’s cars nearing the end of their operating lives, it is only a matter of time before service reliability and safety degrade further. For these reasons, the MBTA is expected to begin the process of raising fares in January.
In my view, political unwillingness to raise fares has resulted in a situation where the safety, comfort, and convenience of riders are threatened. Fares should be raised, and the additional revenue used for maintenance and upgrades to tracks and train cars. Many of the T’s riders are middle and upper class — and can afford fares that are closer to the true cost of the service. However, I also support creating discounted fares and passes for low-income residents.
To be clear, this is a second-best solution to creating new broad-based taxes that are less sensitive to economic cycles than the sales tax (such as those used in Paris). However, political resistance to creating or increasing taxes, along with the widespread “user pays” principle in U.S. transportation, means fares will remain a significant source of revenue. As described below, my analysis suggests increasing fares in Boston may reduce ridership slightly, but would not trigger the “death spiral” of declining ridership and revenue sometimes seen in other cities.
A common objection to raising fares is that it will encourage more people to drive to work. According to an analysis I completed for a class project last spring, the system’s relatively low price elasticity means this effect will be muted for most of the core subway system. This low elasticity is probably due to the high cost of parking in most of Boston and the low existing fares (compared to the price of other transportation options).
In a class paper, I investigated the effect of increasing the MBTA fare to a new flat rate of either $2.00 or $2.25, or of implementing the distance-based or peak fare structure used by Washington, D.C.’s WMATA Metrorail. Below are links to a presentation summary and the original paper.
To do the study, I applied elasticities estimated by the state’s Central Transportation Planning Staff (CTPS) after a 2007 fare restructuring. Using Automated Fare Collection system data, I inferred where passengers traveled from the pattern of morning and evening boardings. I found current riders pay different prices depending on fare type:
- CharlieCard fare: $1.70
- Monthly LinkPass average: $1.13
- Systemwide average: $1.26
An important finding is the very low average price paid by LinkPass holders for each trip. Because of the relatively low elasticity of subway trips and the popularity of passes, which results in low effective fares, increasing fares would generate substantial additional revenue but could also decrease ridership somewhat. Note that I assume all riders would pay the new fare; if a subsidy for pass holders remains, the magnitudes would be smaller.
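The mechanics behind these projections can be sketched with a constant-elasticity demand model. The ridership figure and the elasticity value below are illustrative placeholders, not the CTPS estimates used in the paper:

```python
def project_fare_change(riders, avg_fare, new_fare, elasticity):
    """Project ridership and revenue under a new fare, assuming a
    constant-elasticity demand curve: Q2 = Q1 * (P2/P1)**elasticity."""
    new_riders = riders * (new_fare / avg_fare) ** elasticity
    return new_riders, new_riders * new_fare

# Illustrative numbers only: 450,000 daily trips at the $1.26 systemwide
# average fare, raised to a $2.00 flat fare, with an elasticity of -0.25.
riders, revenue = project_fare_change(450_000, 1.26, 2.00, -0.25)
old_revenue = 450_000 * 1.26

print(f"ridership change: {riders / 450_000 - 1:+.1%}")
print(f"revenue change:   {revenue / old_revenue - 1:+.1%}")
```

With a low elasticity, the ridership loss is modest while revenue grows substantially, which is the intuition behind the no-death-spiral finding.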
The other scenarios tested were a peak-of-peak-only surcharge ($0.20) and WMATA-style distance-based fares.
The paper also considers equity, although it is difficult to analyze without rider-level data. Distance-based fares would dramatically increase total fares for riders at outlying stations.
For comparison, I completed a quick survey of the cash fare for large, center-city subway and light rail systems (as of August 2011). The results are uniformly higher than the average price paid by MBTA riders, and all except Los Angeles are higher than the $1.70 paid by most per-trip riders in Boston.
- New York City – $2.25
- Chicago – $2.25
- Salt Lake City (LR) – $2.25
- Denver (LR) – $2.25 – $5
- Miami – $2
- Philadelphia – $2.00
- Washington, D.C. – $1.95 – $5
- Dallas (LR) – $1.75 – $5
- Los Angeles – $1.50
Interestingly, I may have been too conservative in my analysis, since some of the flat fares discussed here are even higher than those I tested. I hope the results of the analysis mentioned in this article are made public, and that compromise options such as low-income subsidies are put on the table for consideration.
Posted: December 16th, 2011 | Author: Rob Goodspeed | Filed under: Technology | Tags: GIS, municipal data, open government | 3 Comments »
My first peer-reviewed journal article was published this month by the Journal of the Urban and Regional Information Systems Association (URISA), an open access journal published by a leading geographic information systems (GIS) professional organization. Titled “From Public Records to Open Government: Access to Massachusetts Municipal Geographic Data,” it reports the result of a public records request for GIS data I submitted to all 351 Massachusetts municipalities. Here is the abstract:
Increasingly, citizens are demanding access to raw data from governments to hold public officials accountable, look up facts, conduct analysis, or create innovative applications and services. Cities and towns create data using geographic information systems such as layers describing parcels, zoning, and infrastructure that are useful for a wide range of purposes. Through a public records request to all 351 Massachusetts municipalities, this paper investigates whether these data are accessible to citizens in practice. Some response was received by 78.6 percent of the municipalities. Two municipalities refused access to all electronic records. Many others charged fees ranging up to $453 or placed legal restrictions on the data through licensing that could chill or prohibit creative reuses of the information through emerging technologies. Other practical barriers limited public access to data, such as limited resources, government officials’ limited technical knowledge, and outsourcing to private vendors. A followup survey among municipalities that did not respond to the request was conducted to determine if they had GIS systems or data policies, and this information was collected for 80.3 percent of the municipalities. Finally, the paper discusses the legal, policy, and technical steps that can be taken by governments to move from a “public records” to an “open government” paradigm for transparency of government data. The policy recommendations for municipalities include publishing GIS data for free online and with minimal legal restrictions.
The paper started as a class project for the MIT class “Ethics and Law on the Electronic Frontier” I took in Fall 2010. This research is related to the work I did at the City of Boston, where in 2010 I developed an open government strategy proposing how the city could use technology to achieve transparency, participation, and collaboration goals.
Goodspeed, Robert. 2011. “From Public Records to Open Government: Access to Massachusetts Municipal Geographic Data” (PDF). Journal of the Urban and Regional Information Systems Association 23(1): 21-32.
> What Government Data Should be Transparent?
> Public Sector Innovation: Learning from History (on local GIS technology adoption)
Posted: September 30th, 2011 | Author: Rob Goodspeed | Filed under: Detroit, Public Participation, Urban Development | No Comments »
In June I published an op-ed in the Detroit News describing my research on urban renewal in Detroit in the 1940s. I concluded with the observation:
The voices of citizens affected by renewal must be heard. Dramatic, large-scale projects can have harmful and unexpected consequences. The history of urban planning has shown success occurs through a careful process of building consensus, detailed analysis and cooperative action.
In response, Marja Winters, the city’s deputy director of planning and development, wrote an editorial arguing the process has been highly participatory, involving 28 citywide meetings and 10,000 citizens, with large numbers of participants agreeing they had had the opportunity to share ideas and opinions.
(As an aside: She objected to a line which read “The plan calls for closing neighborhoods, cutting services and cultivating new industries.” I agree with her criticism: the words aren’t mine, but those of a Detroit News editor. The manuscript I submitted read: “The Detroit Works Project — Mayor Bing’s roadmap for the city’s future — has proposed dramatic solutions: closing neighborhoods, cutting services, and cultivating new industries.”)
I have not attended the Detroit Works public meetings or examined the process, so I cannot critique it in detail. The first major policy initiative coming out of the process was announced in July, but amounted to the selection of some priority areas for city services. The proposal left some puzzled. Where was the grand vision, or bold proposals? Perhaps there is no need for “planning” at all, just better urban management? (See “Questions dog Detroit Works plan: Advocates want to see long-term strategy“)
This situation and Winters’ article raise interesting questions: Is all participation alike? Can the design of the process affect the outcome? What models exist for planning for “shrinking cities”?
It is common for major urban plans or policies to be developed through quite elaborate processes. For example, I collected this diagram that was circulated in the early stages of the Imagine Austin Comprehensive Plan:
In general, their design is left to professionals who draw on experience. Most process designs address several aspects: problem definition, deliberation and participation, analysis, policy design, and decision-making. Under each of these, details include:
- The number, type, membership, and mission of committees
- What expertise and analysis are required, and how they are involved
- The timing, nature, and purpose of broader participation such as meetings, surveys, and online engagement
- How decisions will be made.
One of the clearest descriptions of how processes are designed for local contexts comes from Barbara Faga’s book Designing Public Consensus. After several case studies, the book presents the following public process plan as a starting point:
This way of thinking is not unique to urban planning. As the field of risk assessment has become embroiled in value-laden controversies, experts have had to reassess their approaches. In 1996, leaders in the field proposed an analytic-deliberative model that tightly links the needed analysis with involvement from affected parties.
Perhaps the most common process theory for large-scale planning is scenario planning (PDF), adopted from methodologies invented by the private sector for corporate planning. Although providing guidance for how thoughtful “scenarios” can be used to consider options for the future, scenario planning’s participatory logic is underdeveloped.
The crowning achievement of process thinking in public policy may be the consensus building approach (CBA), a method for resolving dilemmas often associated with Larry Susskind, an MIT professor of urban studies and planning. This negotiation methodology has strict requirements for the nature of the problems where it can be applied, how stakeholders are identified and included, and how negotiation should move forward. However, it’s not clear how this approach, designed to intervene in acrimonious public debates about clear problems or decisions, applies to the problem of urban planning.
If there is an art to process design, can there be a science? It is rarely studied, for a variety of reasons. First is the argument that process doesn’t matter: the outcome may be the same regardless of what is done, or the decisions that matter may be made elsewhere, by powerful elected officials or market actors. Second, from a social science perspective, studying processes is maddeningly difficult: there are too many confounding variables and no clear outcomes to measure. What would you measure, and how? For this reason, many descriptive case studies steer clear of specific details. Lastly, analyzing processes requires a different form of knowledge than is found in most research. Instead of theory that describes reality, we need a theory of what would happen given a certain sequence of events or actions.
Theory aside, how do you plan for Detroit? A good process would focus first on the goal. What is the “problem” in Detroit, anyway? It could be too much land, too few jobs, high crime, or a lack of revenue for government services. Although they are related, tackling any one means clarifying what the priorities are.
The most direct precedent for Detroit is the Youngstown 2010 project in Youngstown, Ohio. This process involved large-scale participation and produced a vision and plan, adopted by the city council, that anticipates significant changes to accommodate a permanently reduced population. Here is the process diagram from Faga’s book:
Where Detroit Works, or any other large-scale planning in Detroit, should go depends on what local stakeholders seek to accomplish. Although any process must be locally tailored, process designers aren’t starting from scratch. The models described above can be used to design a process that reflects local values and the practical needs to involve the public, conduct detailed analysis, and come to agreement on solutions to public problems.
Posted: September 9th, 2011 | Author: Rob Goodspeed | Filed under: Government, Technology, Urban Development | Tags: cybernetics, smarter cities | 1 Comment »
Periodically I come across an old article that seems very relevant to the present, such as the article about public sector innovation I posted in January.
The ongoing expansion, and declining cost, of sensors and computing technologies has sparked renewed interest in using them to solve persistent urban problems. A similar wave of interest occurred during the early history of digital computing. In his influential 1948 book, Norbert Wiener popularized the term “cybernetics” to refer to the emerging science of communication and control in organized systems. If the city is an organized system, then cybernetics in city hall would mean creating information feedback loops to be used by the manager (or “actuator”) to minimize the effects of disturbances and maximize achievement of urban goals. Sound familiar? It should: IBM inked a multimillion-dollar deal to open a real-time “public information management center” in Rio de Janeiro (right) as part of its smarter cities initiative, and Wired magazine is keeping up a drumbeat about the power of feedback loops.
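The cybernetic loop can be caricatured in a few lines of code: a sensor measures the gap between some urban indicator and a goal, and an “actuator” corrects a fraction of it each cycle despite random disturbances. The indicator, gain, and disturbance range below are all invented for illustration:

```python
import random

random.seed(0)  # make the disturbances reproducible

goal = 100.0    # target value of some urban indicator
state = 60.0    # current measured value
gain = 0.5      # how aggressively the actuator responds

# Each cycle: a random disturbance perturbs the system, the sensor
# measures the gap, and the actuator corrects a fraction of it.
for step in range(50):
    disturbance = random.uniform(-5, 5)
    state += disturbance
    error = goal - state      # feedback: the measured shortfall
    state += gain * error     # control action by the "actuator"

# Despite the shocks, the loop holds the indicator near the goal.
print(f"final state: {state:.1f}")
```

The toy works precisely because the goal is fixed and the disturbances are bounded; Savas’s article, discussed next, is about why neither holds in city government.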
In an astute article published in Science in 1970, E.S. Savas considered the challenges this approach might face in the real world of New York City government. I don’t doubt the importance of real-time control for management tasks like transportation system management and emergency response, but the article describes some important challenges such a system would face if applied more broadly. Savas described how the five elements of the cybernetic loop would play out in the city: (1) dynamics of urban government, (2) information system, (3) administration, (4) goal setting, and (5) disturbances.
1. Dynamics of city government: The election cycle faced by big city mayors would limit the range of solutions considered, resulting in smaller goals and visible acts, which “may be more symbolic than effective.” Government itself is very slow-moving and one solution — delegating power — may have unintended consequences.
2. Information system: Arguably, much more information is available today than in 1970 about what’s happening in the city. But another crucial input is as tricky as ever: gauging the will of the people.
3. Administration: Making a decision is one thing, but implementing it requires an administration with appropriate personnel and structure, a well-known weakness of big-city bureaucracies.
To Savas’s list I would add the structure of government: not only are city governments organized in anachronistic ways, but the article also omits the fragmentation of powers. In Boston, for example, in addition to municipal fragmentation itself, separate entities manage many utilities, the transit system, parks, etc.
4. Goal setting: Identifying a common set of goals may be impossible. The chief executive can use judgment, but it is for good reason that power is delegated to elaborate systems of commissions, boards, and advisers on many topics.
5. Disturbances: These are unpredictable, often external to the city, and often not visible to the public (who set the goals) until it is too late to prevent their impact (e.g., climate change).
There are, in general, two responses to most of these concerns. Savas himself took one approach: give up on city government and advocate for privatization of service delivery. Presumably the cold logic of the profit motive would sweep away administrative, regulatory, and decision-making quirks of city governments. The other approach is to attempt to reform the government. In fact, IBM staff have admitted the “challenges” that will face a contemporary agenda for cybernetics. I think the need for contemporary urban government reorganization and reform is acute in many cities, but interest in it seems limited.
Notably, neither of these approaches truly addresses the challenges posed by the short time-horizon of elected officials, difficulty setting goals or forming consensus, and unpredictable disturbances. These three point to the need for planning to solve urban problems: a multi-stakeholder process involving analysis, deliberation, and solution design that both forges a consensus about the definition of a public problem and crafts a desired solution. It seems to me that in the face of the enormity of the challenges we face we need both smart planning and an efficiency-driven smart cities movement willing to push for reform but respectful of democratic systems.
> E.S. Savas in Science magazine, 1970: “Cybernetics in City Hall“
Posted: September 1st, 2011 | Author: Rob Goodspeed | Filed under: Urbanism and Planning | Tags: Data | No Comments »
I just posted a new article on the Planetizen blog: “The Coming Urban Data Revolution“:
Historically, data sources for urban planning have remained relatively stable. Planners relied on a collection of well-known government-produced datasets to do their work, including statistics and geographic layers from federal, state and local sources. Produced by regulatory processes or occasional surveys, the strengths and limitations of these sources are well known to planners and many citizens. However all this is beginning to change. Not only has the U.S. Census Bureau’s American Community Survey introduced a bewildering variety of data products, all with margins of error, three interrelated categories of new data are growing rapidly: crowdsourced, private, and “big” data.
Posted: August 12th, 2011 | Author: Rob Goodspeed | Filed under: Urban Development | Tags: smarter cities, urban modeling | 5 Comments »
In June I took the general exams for my PhD program, which involved a one-week written and oral test on topics related to my chosen fields — urban information systems and democratic land use planning. This means over the past year I’ve plowed through much of the literature on urban modeling from the 1950s to the present day. As a result, I’ve been feeling acute déjà vu reading about the latest efforts by IBM and others to model “smart” cities, presented as a new frontier for cities devoid of any previous research.
For example, here is a description of an IBM project announced this week:
This problem–if you can’t measure it, you can’t manage it–combined with the impulse to improve cities by models, is driving both IBM’s “smarter city” strategy and the nascent “urban systems” movement, which seek to apply complexity science to cities. IBM … today announced the latest plank in its smarter city platform: an “app” containing 3,000 equations which collectively seek to model cities’ emergent behavior. IBM also revealed its first customer, the City of Portland, Oregon. Systems Dynamics for Smarter Cities, as the app is called, tries to quantify the cause-and-effect relationships between seemingly uncorrelated urban phenomena. What’s the connection, for example, between public transit fares and high school graduation rates? Or obesity rates and carbon emissions? To find out, simply round up experts to hash out the linkages, translate them into algorithms, and upload enough historical data to populate the model.
Here is a description of Jay Forrester’s 1969 book Urban Dynamics. (An MIT professor emeritus, Forrester is known as the founder of system dynamics.)
In this controversial book, Jay Forrester presents a computer model describing the major internal forces controlling the balance of population, housing, and industry within an urban area. He then simulates the life cycle of a city and predicts the impact of proposed remedies on the system. Startling in its conclusions, this book became the basis of a major research effort that has influenced many government urban-policy decisions.
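Forrester-style system dynamics boils down to stocks updated by flows whose rates depend on other stocks. A toy version of a housing-population loop, with made-up coefficients, shows the mechanics, and how strongly the conclusions depend on those coefficients:

```python
# Two stocks: population and housing. In-migration is attracted by
# housing slack; construction responds to crowding. Every coefficient
# here is invented for illustration -- which is exactly the fragility
# Lee's critique (below) targets.
population, housing = 100.0, 120.0

for year in range(30):
    slack = housing - population
    in_migration = 0.02 * max(slack, 0.0)                 # people chase vacancy
    construction = 0.01 * max(population - 0.8 * housing, 0.0)  # crowding spurs building
    population += in_migration
    housing += construction

print(f"population after 30 years: {population:.1f}")
print(f"housing after 30 years:    {housing:.1f}")
```

Change either coefficient and the “startling conclusions” change with it; nothing in the model itself says which values are right.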
The contemporary smarter cities discourse seemed to start as a mere marketing ploy, but recently its proponents have sought a more substantial foundation. Although there may be more under the surface, so far all I have seen is warmed-over systems modeling or system optimization of the type invented in the 1950s and 1960s. If the promoters of these methods hope for contemporary relevance, they must explain why, and how, the severe challenges these approaches face in a democratic society can be overcome.
Perhaps the best-known article in this field is Douglass Lee’s 1973 “Requiem for Large-scale Models” (PDF), in which Lee, then a freshly minted Berkeley PhD, laid out the “seven sins” of the early generation of large-scale models (which included Forrester’s urban dynamics model): hypercomprehensiveness, grossness, hungriness, wrongheadedness, complicatedness, mechanicalness, and expensiveness. Importantly, he also described desirable characteristics for city models:
- Transparency (“‘Black-box’ models will never have an impact on policy other than possibly through mystique, and this will be short lived and self-defeating.”)
- Balance between theory, objectivity, and intuition (“large-scale modeling has been significantly lacking in theory”)
- Start with a particular policy problem that needs solving, not a methodology that needs applying
- Build only very simple models
These recommendations reflect two fundamental differences between cities and other complex systems: randomness and democracy. These underlying theoretical challenges face any would-be urban modeler, from hacktivist to corporate consultant, engaged in the “battle for control of smart cities,” described by Anthony Townsend in a 2010 report and in his forthcoming book.
Urban systems aren’t just complex systems; they’re highly random ones, subject to internal and exogenous shocks that are almost impossible to model, let alone predict (e.g., gas prices, hurricanes, Justin Bieber concerts). Most concerning, contradictory theories describe these models’ most important variable: human behavior. These theories, such as economics’ utility maximization and sociology’s social norms, each have some validity but limited explanatory power.
Secondly, the promise of urban optimization must be reconciled with democratic government. IBM has been running ads in which its employees boast of all the good things they are doing, such as tracking food for safety or reducing crime. Every time I see them, I think about priorities and trade-offs. Who decided these were the right priorities for resources? Individually they achieve laudable goals, but they can only be judged in context. Only a democratically legitimate government can determine whether money is well spent on food- or crime-tracking systems versus other pressing concerns like education, health care, and infrastructure.
This post is not a critique of using data and analytical methods for urban policy. To the contrary, I think they’re as needed as ever, and I have been working with MAPC on a scenario modeling platform. There very well may be analytical innovations, like cellular automata, genetic algorithms, or complexity theories, that could be applied to create useful urban models. However, new technology and new buzzwords do not eliminate the long-running theoretical and practical challenges of using models to improve urban life, or the importance of learning from history.
Posted: July 22nd, 2011 | Author: Rob Goodspeed | Filed under: Transportation | Tags: cycle tracks, cycling, hubway | Comments Off
Boston has been slow to join the urban bicycling renaissance. In this very strong-mayor city, Mayor Menino made a public about-face in 2007. After long neglecting bicyclists, he hired a “bike czar” and the city began adding bike racks and lanes. The mayor himself even bought a bike for neighborhood rides.
With the introduction of a new bike-sharing system (called Hubway) at a noon press conference at city hall on Tuesday, July 26th, Boston joins other innovative U.S. cities like Minneapolis and Washington, D.C., which have followed the lead of Paris and Montréal in rolling out large-scale lending systems.
However, the cantankerous response in the local press illustrates the forces resisting progressive transportation planning here. Instead of encouraging bikes, one columnist writes, the city should ban them, focusing on the roads that were “built for cars.” An equally exaggerated rejoinder suggested banning cars instead. An article in this week’s paper contains critiques of the initial station locations, as well as the company’s patient explanation that it has had more success with a measured roll-out of a dense network centered on active downtown neighborhoods.
Ironically, surrounding cities like Cambridge may well be better biking cities, with flatter terrain and more extensive bike lane networks. Since the system was procured through a regional planning agency (where, I should note, I have worked and still consult), adding new institutions or cities will be a seamless process, theoretically expanding to any of the 101 cities and towns surrounding Boston where sharing might make sense.
However, what’s missing is a broader discussion of what 21st-century bike infrastructure the city, and region, could use. These changes include some of the policy issues and off-road trails discussed in the 2007 MAPC Regional Bicycle Plan, but others will involve detailed changes to our streets. Ironically, these could mean both more and fewer lines and signals on the streets. It’s a well-known finding that some streets that seem dangerous are not, so long as certain conditions are met. In other cases, separated infrastructure is safer, or required to encourage cycling. However, these changes cost money and space, an even scarcer commodity on many city streets. The Globe captured the dilemma, quoting researcher Pucher and the city’s bike director Freedman:
“Every study and survey of cyclists points to the fact that the only way you are going to have women, children, and seniors cycling in any mass numbers is by providing cycle facilities completely segregated from traffic, with timed lights at intersections that let riders cross without being worried by being hit,’’ Pucher said in a telephone interview.
Freedman, a former Olympic cyclist, said establishing truly protected lanes is not a matter of technology. “It is a question of space. The real issue is a public process of communities deciding what we want. Do we keep the priority for cars or do we start making compromises that integrate bikes? We’ve made tremendous strides, but there is a price and the price is space.’’
I’m not a cycling specialist, so I will leave it to others to discuss what exact changes are needed in Boston. During a trip to the Netherlands last year, like many American visitors I was amazed by the highly developed cycling infrastructure. It led me to ask: if we follow some of the steps taken there, what changes are possible? These photos were taken in The Hague, Delft, and Rotterdam in July 2010.
1. Bike Parking
The most basic amenity is a place to park your bike at the office or while running errands. Bicycle racks are spreading in Boston, but good parking also means well-designed, convenient storage spaces inside buildings and at transit hubs.
2. Cycle Tracks
In many European cities, busy streets have separated bicycle and walking areas, both separated from the vehicle lanes. In the case above, special accommodation is made for the bus stop. MIT has installed a cycle track on Vassar Street.
3. Bike Signals
Where a crossing is inevitable, a signaled intersection may be needed. If you look closely above, you can see the red light is in the shape of a bike.
4. Smart Intersections
Good intersections have carefully thought-out interactions that balance demands between modes, take into account desire lines, and provide subtle guides to encourage safety and efficiency. This corner gracefully accommodates bicycles turning left onto the street cycle track. The best example in the Boston area is a special bike crosswalk at Harvard, allowing cyclists to safely cross a busy street.
5. Multimodal Connections
The pyramid structure is above the entrance to a train station. In the foreground, a bicycle track links to the broader neighborhood network. The Southwest Corridor Park is a start in this area for Boston.
6. Bike Freeways
To the right is the entrance ramp to a limited-access freeway; to the left, the accompanying bicycle freeway.
As the number of cyclists continues to expand in Boston and other U.S. cities, more and more of these European-style interventions may be needed to ensure safety and promote the expansion of the ultimate green transportation mode.
> Boston Bikes
> MAPC Bicycling Projects