More and more governments are publishing data feeds, whether of news, alerts regarding public services, or even administrative data. In the UK, the “Mash the State” project has the goal of encouraging every local unit of government to publish a news RSS feed.
Inspired by the project, Steve Clift asked “what web feeds should government websites provide?” on the Democracies Online listserv.
1. What’s New – a comprehensive feed of all new pages/documents across the site/agency, posted or updated online.
2. Upcoming Public Meetings – upcoming meetings, with links to available meeting documents.
3. Press Releases
I added a few more, from the more specific perspective of urban planning:
– 311 service requests
– A geocoded feed of project proposals at various stages of the development review process (site plan review, zoning variance, etc.)
– Feeds specific to the process of creating particular plans or policy documents (a feed for the comprehensive plan, a downtown revitalization plan, etc.)
– Geocoded feeds of recently issued permits, by type (building or construction permits, parade or public-space-use permits, liquor licenses, etc.)
– Real-time data on urban systems, such as traffic or transit alerts.
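To make the geocoded-feed idea concrete, a permits feed could be nothing more exotic than RSS 2.0 with GeoRSS point elements. The sketch below is hypothetical – the feed contents, categories, and coordinates are invented for illustration – but it shows how little structure a third-party tool would need, using only Python's standard library:

```python
import xml.etree.ElementTree as ET

# A hypothetical geocoded permits feed: plain RSS 2.0 plus GeoRSS points.
FEED = """<?xml version="1.0"?>
<rss version="2.0" xmlns:georss="http://www.georss.org/georss">
  <channel>
    <title>Recently Issued Permits (sample)</title>
    <item>
      <title>Building permit: 123 Main St</title>
      <category>building</category>
      <pubDate>Mon, 05 Oct 2009 09:00:00 GMT</pubDate>
      <georss:point>43.0389 -87.9065</georss:point>
    </item>
    <item>
      <title>Liquor license: 456 Oak Ave</title>
      <category>liquor</category>
      <pubDate>Tue, 06 Oct 2009 14:30:00 GMT</pubDate>
      <georss:point>43.0420 -87.9100</georss:point>
    </item>
  </channel>
</rss>"""

GEORSS = "{http://www.georss.org/georss}"

def permits(xml_text, permit_type=None):
    """Yield (title, category, lat, lon), optionally filtered by permit type."""
    root = ET.fromstring(xml_text)
    for item in root.iter("item"):
        category = item.findtext("category")
        if permit_type and category != permit_type:
            continue
        lat, lon = item.findtext(GEORSS + "point").split()
        yield item.findtext("title"), category, float(lat), float(lon)

# Pull just the building permits, ready to drop onto a map.
for title, category, lat, lon in permits(FEED, permit_type="building"):
    print(title, lat, lon)
```

Once a city publishes something like this, anyone can filter by type, plot items on a map, or track issuance over time without any special arrangement with the agency.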
Dan Knauss thought the question itself was off-base: “What is needed is a querying syntax like Apache’s Lucene with output options in a number of different XML schemas. Then you can pull whatever you want from a database in any format that’s provided.” He points to this project in Milwaukee that works off a dataset fed by government email lists, because they don’t offer RSS.
What data feeds do you think the government should publish?
I was just at a lecture/class by Archon Fung from the Kennedy School today, and 311 service came up. He brought up an interesting point: the problem with 311, from a citizen’s perspective, is that you have no idea how many other people have put in the same call, when others put in a call (if ever), or when you can expect service. He mentioned a new technique in which you post pictures from your cell phone to a site that geocodes service requests and neatly categorizes them, all dated, so you can see how responsive government is. He mentioned a city (maybe in Florida, or California; I can’t remember) that is testing the idea. It would do a lot for transparency, that’s for sure.
Jeremy, thanks for the comment. I agree – I think most 311 is still done one-to-one; governments should be thinking about ways to connect citizens to each other to solve public problems.
What you describe could be a potential use of a live 311 feed produced by a government – sifting through it to see how many duplicates there are and whether it is possible to evaluate completion rates.
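As a sketch of that sifting, suppose each 311 request in the feed carried a category, coordinates, and a status (the field names and records below are made up for illustration). Duplicates could then be estimated by grouping nearby requests of the same type, and a crude completion rate computed from statuses:

```python
from collections import Counter

# Hypothetical 311 requests as (category, lat, lon, status) tuples;
# in practice these would be parsed from a live feed.
requests = [
    ("pothole",  43.03891, -87.90651, "open"),
    ("pothole",  43.03893, -87.90658, "open"),    # same block: likely duplicate
    ("graffiti", 43.04200, -87.91000, "closed"),
    ("pothole",  41.88000, -87.63000, "closed"),
]

def group_key(category, lat, lon, precision=3):
    # Rounding to 3 decimal places lumps together requests within
    # roughly a city block, a very rough proxy for "same problem."
    return (category, round(lat, precision), round(lon, precision))

duplicates = Counter(group_key(c, la, lo) for c, la, lo, _ in requests)
completion = sum(1 for r in requests if r[3] == "closed") / len(requests)

print(duplicates.most_common(1))  # the most-reported (category, location)
print(completion)                 # crude completion rate: 0.5
```

A real analysis would want proper spatial clustering and request timestamps, but even this much answers Fung’s question: how many people reported this, and has anything been done?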
Good point by Fung. I’ve read some of his stuff from a friend who’s in law school, so it’s great to see more signs of a generation rising up that gets the groundwork of “gov2.0.”
My point was the same as Sir Tim Berners-Lee’s in his recent TED talk: give us all the data, raw, whole, and complete, now. All public information/data that’s digital should be in queryable databases so it’s structured; then you can futz with the way output is structured forever after.
Incidentally, that neighborhood services and crime mapping application for Milwaukee was spawned because of the tedium and inefficiency of trying to watch all the individual items for a whole neighborhood and getting redundant material since the email output is based on .5 mile (max.) radii from address-based centerpoints. That particular instance of it requires 24 centerpoints to cover two strategic planning areas–a rather small part of the city. And it took hours to set that up due to the poor interface the City has for it.
PS: you may be thinking of the #2 finalist in the TC50 startup contest, CitySourced.com, which is setting up a 311 system (with mobile apps) for San Jose. It’s a nationwide service that took a couple of weeks and under $100k to build. There is also SeeClickFix.com.
Regarding the ability of the public to evaluate completion rates, don’t forget that this is wanted by government administrative/budgeting offices too! Don’t underestimate how many bureaucratic divisions have resisted technology, or abused it, so that very little evaluative data on their performance can be had easily. In many cases it’s probably just not available, which is evidently the case in Milwaukee based on its current proposed budget and plans for a unified call center that is in large part a 311 system.
Dan, while I agree that pushing the data-access question even farther upstream, from an RSS feed to the database itself, is conceptually the right approach, I am concerned that in practice the community that would know how to write the proper queries is just too small. This came up earlier on my post on government data; see Mari’s comment.
Rob, that is the wrong way to look at it. It’s a patronizing and paternalistic view of the public: “you can’t handle open access, so we’ll build you a bunch of filters, fixed contexts, and special interfaces to guide you.” If governments open their data, capable people will use it and build independent applications and interfaces for broader audiences. This has already happened in the real estate business, and EveryBlock is another prime example of what happens when government data is simply made available.
Government entities should open their data to the maximum extent. If they want to build more guided, restricted interfaces to hand-hold the public at the public’s expense, they can do that, but others who do this on their own for public-interest and/or commercial reasons will likely deliver a better product (more valuable and relevant to the general public) that is not funded by taxes.
See the principles of open data from the Independent Government Observers Task Force.
See also “Government Data and the Invisible Hand” by David G. Robinson, Harlan Yu, William P. Zeller, and Edward W. Felten, Yale Journal of Law and Technology (2009):
If the next Presidential administration really wants to embrace the potential of Internet-enabled government transparency, it should follow a counter-intuitive but ultimately compelling strategy: reduce the federal role in presenting important government information to citizens. Today, government bodies consider their own websites to be a higher priority than technical infrastructures that open up their data for others to use. We argue that this understanding is a mistake. It would be preferable for government to understand providing reusable data, rather than providing websites, as the core of its online publishing responsibility.
Rather than struggling, as it currently does, to design sites that meet each end-user need, we argue that the executive branch should focus on creating a simple, reliable and publicly accessible infrastructure that exposes the underlying data. Private actors, either nonprofit or commercial, are better suited to deliver government information to citizens and can constantly create and reshape the tools individuals use to find and leverage public data. The best way to ensure that the government allows private parties to compete on equal terms in the provision of government data is to require that federal websites themselves use the same open systems for accessing the underlying data as they make available to the public at large.
Dan, the blog post and discussion I linked to are all about the “Government Data and the Invisible Hand” article.
Feeds may be somewhat guided, but they are not nearly as limiting as putting data on plain HTML websites – any structured data, like XML or RSS, can be sliced and diced by third-party apps.
Dan – CitySourced.com from San Jose was EXACTLY what Fung referred to. Thanks.