When I first heard about Baltimore’s CitiStat program, which uses city data to “provide timely, reliable services to Baltimore’s residents,” I envisioned a public-sector version of an executive dashboard. The mayor (the program started under Martin O’Malley and continues under Sheila Dixon) would have data at their fingertips through some kind of computer interface, evaluating the performance of city government and sending directions off to the various city administrators.
I was wrong. The award-winning program is less about technology and more about creating a management approach that directly links timely data to decision makers and holds administrators responsible for performance. It’s remarkable how little technology it actually involves. According to a Center for American Progress briefing, “CitiStat uses basic Microsoft Office programs — such as PowerPoint for presentations and Excel to gather data — as well as geographic information system, or GIS, mapping software from ESRI’s ArcView unit, which costs less than $1,000.” The program cost $285,000 to set up and around $400,000 a year to run (mostly salaries), but is estimated to have saved the city over $350 million since its creation in 1999.
Facing a declining population, limited financial resources, and rampant absenteeism and inefficiency in city government when he took office, former Mayor Martin O’Malley told a conference he didn’t create CitiStat to win awards, but simply “to survive.” Adapting a system called CompStat, invented by the New York City Police Department, the city created a special office and hearing room. Every two weeks, participating city agencies submit data on predetermined metrics, such as days employees were absent or potholes fixed. City analysts write an 8- to 12-page memo for the mayor and cabinet, using the reported data, field research, and interviews with key staff. At the hearing, photos, charts, and data illustrating problem areas are displayed and discussed with the agency head responsible. In the program’s parlance, the tenets are:
- Accurate and Timely Intelligence Shared by All
- Rapid Deployment of Resources
- Effective Tactics and Strategies
- Relentless Follow-up and Assessment
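The review cycle these tenets describe — agencies reporting metrics every period, analysts flagging problem areas for the biweekly hearing — can be sketched in miniature. This is an illustrative sketch, not Baltimore’s actual system; the agency names, metrics, and the 10% threshold are all invented for the example.

```python
# Minimal sketch of a CitiStat-style review: agencies submit metric values
# each period, and an analyst pass flags metrics that have worsened past
# a threshold so they can be raised at the biweekly hearing.

# Invented sample data: {agency: {metric: (previous_period, current_period)}}
reports = {
    "Public Works": {"potholes_filled": (410, 385), "absentee_days": (52, 71)},
    "Recreation":   {"absentee_days": (30, 29)},
}

def flag_problems(reports, threshold=0.10):
    """Return (agency, metric, fractional_change) for metrics that worsened.

    Kept deliberately simple: metrics named *_days are treated as
    lower-is-better (a rise is bad); everything else as higher-is-better
    (a drop is bad). Changes beyond `threshold` in the bad direction
    are flagged.
    """
    flagged = []
    for agency, metrics in reports.items():
        for metric, (prev, curr) in metrics.items():
            change = (curr - prev) / prev
            lower_is_better = metric.endswith("_days")
            worsened = change > threshold if lower_is_better else change < -threshold
            if worsened:
                flagged.append((agency, metric, round(change, 3)))
    return flagged

print(flag_problems(reports))
# → [('Public Works', 'absentee_days', 0.365)]
```

A real implementation would of course pull these numbers from the submitted spreadsheets, but the point of CitiStat is that even this much automation is optional; the accountability comes from the hearing, not the software.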
The CitiStat model has inspired many similar efforts around the country. O’Malley, since being elected governor of Maryland, has been busy developing systems for Maryland state government and for management of the Chesapeake Bay. He told a conference recently, “It’s my belief that the lessons we are learning in Maryland can be applied to any government, at any level, anywhere in the world.”
The copycats are trying to do exactly that, with varying degrees of success. In the Boston area, the Rappaport Institute sponsored an event in 2003 titled “Bringing CitiStat to Massachusetts,” and the cities that have adopted the program most completely since then are Somerville, Springfield, and Amesbury. The two major limitations of Baltimore’s CitiStat are the lack of any technical infrastructure for internal information management and the lack of public transparency through the Internet. Technically, the system requires emailing spreadsheets back and forth (there are templates on the website), and the resulting data and meetings are not shared online in a timely way (the most recent agency reports are from October, and the all-important analyst memoranda are not posted).
The cities of Boston and Washington, D.C. are innovating in the areas of performance indicators and data availability, but neither has put the whole package together: CitiStat’s management intensity combined with robust and transparent data architecture.
The Boston About Results (BAR) program compiles quarterly performance measures drawing from a database of electronic records:
BAR collects data on hundreds of performance measures from a wide range of departments in a centralized system that integrates a department’s mission, strategies, measures and resources. The data is used by city officials to identify trends, raise questions and devise new management practices to constantly improve city services.
Absent are any independent city employees scrutinizing the data and conducting investigations, or any description of the management mechanism whereby Mayor Thomas Menino holds anyone accountable for performance. A dry PDF published quarterly means the data is not as timely as Baltimore’s biweekly review, but it does provide an interesting window into city agency performance.
Washington, D.C. has become a leader in publishing data in multiple formats, including XML files ready for real-time publishing and analysis by third-party software. The Office of the Chief Technology Officer data portal contains over 274 datasets, which the office hopes will prove a “catalyst ensuring agencies operate as more responsive, better performing organizations.” To encourage creative use of all this data, the OCTO recently wrapped up an “Apps for Democracy” contest for the best mashups built on city data sources. Presumably some of these data sources could be used by the city or a third party to evaluate city performance (service requests and police reports are published, but not evaluated for trends).
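To illustrate how a third party might consume such a feed, here is a sketch that tallies service requests by type from an XML document. The schema (`<request>` elements with `type` and `status` attributes) is invented for the example; D.C.’s actual feeds have their own formats, and a real consumer would fetch the document from the OCTO portal rather than embed it.

```python
# Sketch: counting service requests by type from an XML feed.
# The XML schema below is hypothetical, not the OCTO portal's real format.
import xml.etree.ElementTree as ET
from collections import Counter

sample_feed = """
<serviceRequests>
  <request type="pothole" status="closed"/>
  <request type="pothole" status="open"/>
  <request type="streetlight" status="open"/>
</serviceRequests>
"""

def count_by_type(xml_text):
    """Tally <request> elements by their type attribute."""
    root = ET.fromstring(xml_text)
    return Counter(req.get("type") for req in root.iter("request"))

print(count_by_type(sample_feed))
# → Counter({'pothole': 2, 'streetlight': 1})
```

Trend evaluation of the kind the post describes would just be this tallying repeated over time, which is exactly the sort of analysis the city itself is not yet doing with its published data.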
The use of specially defined metrics to measure government performance is necessary because government cannot readily be measured in the terms the private sector uses: profitability. The systems must be unique because what is measured, the specific services offered and the associated performance expectations, is defined by the voters.
One potential pitfall of this management approach is the limits of quantitative data (that is, assuming you can avoid the typical pitfalls cataloged in the Rappaport Institute brief linked below). Although the absentee rate and the number of potholes fixed can be counted easily enough, a host of government functions are not easily counted. Looking at the city of Boston’s performance data, it struck me that one of the agencies missing from the system was the Boston Redevelopment Authority, the planning and urban renewal agency for the city. If it were included, what metrics should be measured? Plans produced? Property values increased? The quality of plans produced? How should success in coordinating private investment to create quality neighborhoods be evaluated?
Urban development is certainly not the only partly qualitative function of urban government, just one of interest to me. It should be remembered that an earlier generation of reformers also sought to make government more efficient; Robert Moses himself concocted a scheme to evaluate the performance of city employees early in his career. The danger is that the drive for performance or efficiency can sideline worthy but difficult-to-quantify government functions, or create a management framework where normative issues, such as the values informing goal setting itself, remain unexamined.
> Baltimore CitiStat
> D.C. Office of the Chief Technology Officer
> Boston About Results
> Rappaport Institute: “The Seven Big Errors of PerformanceStat”
This is super interesting stuff.
I noticed that the BAR page you call out in the post links to a private Flickr image.
Good post, Rob. You’re completely right in the final part about quantifying the unquantifiable. O’Malley came to Annapolis with the idea that if you relentlessly apply STATing to different functions of government, you’ll get higher efficiency. I think this strategy is especially ill-suited in the case of BayStat. It’s very hard to get on-the-ground results when you have two dozen counties with independent planning authority and 5.5 million people making individual land management decisions. At least BayStat produced a good website that is great at informing the public. Unfortunately, like many government websites, it will probably be forgotten by the next administration.
More recently the governor tried to implement STATing through his GreenPrint initiative, which applies an ecological scoring system to allocate state land preservation dollars. It’s very easy to get lost in the numbers, the process, and the mapping. There is also too heavy a reliance on GIS layers, which are only as accurate as the person who made them. Good conservation opportunities are missed, and the bureaucracy of the ranking system can delay acquisitions by months. Ironically, the same day the governor announced the initiative, he also announced $56 million in land acquisitions for properties that scored poorly under GreenPrint’s ecological scoring system. Basically, in the end they are buying the properties they probably would have bought anyway without the STATing.