Posted: March 28th, 2011 | Author: Rob Goodspeed | Filed under: Social Networking, Technology | Tags: platforms, social construction theory, sociotechnical systems | 3 Comments »
It seems that every day the word ‘platform’ becomes more ingrained in the way we think about online tools to do good and address public problems. The ubiquity of the term may be due to its fundamental ambiguity, which it shares with other terms like ‘sustainability’ and ‘participation.’
In an incisive article on the subject last year, Tarleton Gillespie analyzed how the word “platform” was used by major players like Flickr, YouTube, and Google. (I mentioned his article previously but will summarize the thesis here.) In the article, he points out the contradictory ways the companies use the term as part of a rhetorical strategy to serve their interests. On the one hand, as platforms they argue for limits to legal liabilities for actions of their users. On the other, as a platform of opportunity for advertisers, they define and enforce restrictions on users’ speech and activities. He concludes “the discourse of the ‘platform’ works against us developing such precision, offering as it does a comforting sense of technical neutrality and progressive openness.”
However, as we consider how to apply innovative online technologies to community engagement or governance activities, talk of ‘platforms’ is troubling from another point of view as well.
Discussions of sociotechnical systems argue humans are just as important as the technical artifacts. An extensive literature on usability and systems development has developed a nuanced understanding of any system as a composite of technical and social components. As a simple example, what an expert user can do with a laptop is much different from what a grandparent can do upon first receiving one. In a larger case, one theorist argues “the remarkably low accident rates in commercial air transport, for example, reflect the success of vigilant organizations, legal apparatus, and social learning about accidents as much as they demonstrate the quality of aircraft design and maintenance.” Malcolm Gladwell’s fascinating discussion of air safety in Outliers describes how improving air safety often entails new social rules, such as banning idle chatter in the cockpit during key times, not simply technical ones.
Just as it obscures the internal tensions between different interests, the term “platform” alienates us from this more contextual view of technology. We often jump to the position that solving the problem entails designing the platform, implying it is a neutral system equally usable by any visitor. In reality, according to the theory proposed here, solving any problem involves modifying or creating both social and technical components. We are dimly aware of a first-mover advantage in a “space,” but much less aware of the process of creating a useful system. In fact, social construction theory argues technologies are mutually constructed between system designers and engineers and users. Internet “platforms” such as Facebook and Twitter are both powerful independent companies and participants in a subtle dialog with their users about how their systems should evolve. The simplest examples are how Twitter has incorporated hashtags and @replies into its technical architecture, and how Facebook has gone through a well-publicized dance about how to manage the news feed, privacy settings, and even whether you can delete your account.
Of course, this links directly to broader debates about the merits (and measurement) of investments in physical versus social infrastructures. Although that debate can never be fully resolved, the purpose of this post is to temper technical enthusiasm with a more nuanced view of the origin and evolution of a new category of sociotechnical systems: online platforms.
Posted: January 14th, 2011 | Author: Rob Goodspeed | Filed under: Technology, Uncategorized, Urbanism and Planning | Tags: Planning 2.0, Urbanism and Planning | 1 Comment »
I’m helping plan this conference at MIT in April. We opened registration and announced the call for papers today.
REGISTRATION INFORMATION & CALL FOR PRESENTATIONS AND PAPERS
Friday, April 8, 2011
11:30 AM – 6:00 PM
Location: MIT Building 9
New technologies are transforming how we communicate, expanding access to data and information, and revolutionizing how we understand and navigate our cities. Join a diverse group of practitioners, scholars, students, and citizens for a half-day conference on the impact of these changes on the field of urban planning. Held one day before the start of the American Planning Association’s National Conference (also in Boston), this will be an opportunity to meet innovators from around New England and across the nation.
The event will include discussion of urban modeling, urban sensing for planning, planning support systems, meeting technology, social media and Web 2.0 tools, and gaming for participation.
Register using the following link. Registration is free:
Participants have four options for presentations:
– Lightning Talks – presenters will have 20 slides, 20 seconds per slide, advance automatically.
– Paper Session – Presentation of a paper, submitted two weeks before the conference. Papers should be 5-10 pages.
– Presentation Session – Presentation without a formal paper, A/V materials optional.
– Idea Session – A facilitated conversation on a topic. Will be finalized on the day of the conference.
If you would like to present, submit the presenter name(s), presentation type, and proposed presentation title to rob.goodspeed at gmail.com by Friday, February 25. The timeline for presentations is below.
Friday, 2/25 – Title and Abstracts due for presenters
Monday, 2/28 – Accepted presenters notified
Monday, 3/28 – Papers and final presentation titles due
Friday, 4/8 – Conference day
For more information see the conference website:
Or contact planningtech at mit.edu
Posted: January 10th, 2011 | Author: Rob Goodspeed | Filed under: Technology | Tags: #crowdsourcedcities, GIS innovation | 4 Comments »
At a conference I attended in December on the “Future of the Crowdsourced City,” a major topic of discussion was how city governments — or other city organizations — could embrace new technologies.
Although nearly 20 years old, I thought this article was remarkably relevant for debates today about how to foster technology-enabled innovation in local governments. Titled “Implementing GIS for Planning: Lessons from the History of Technological Innovation,” the article was written by noted planning scholar Judith Innes and a former student, David Simpson, and published in the Journal of the American Planning Association in 1993.
Observing that GIS had recently become widespread in government, the article poses the question of how planners can adapt this technology to their unique needs. In order to answer the question, they argue planners should approach GIS as a socially constructed technology. Contrary to the view that all innovation is produced by an inventor and recognized by a market, this view stresses innovation resulting from an iterative, nonlinear process. Therefore innovation is “only integrated into practice through mutual adaptation between users’ practices and technology’s capabilities.” Following a study by Bikson about organizations adopting computer systems, they argue successful introduction of new technology requires several factors: an organizational mission to implement the technology, training programs and rewards for employees who learn to use it, and user participation in development.
Introducing technology alone may not be enough: what is needed is creating a culture of innovation. The article identifies five principles for innovation success from Rogers (1983):
- observability of benefits
- relative advantage
- ability to make small trials
- compatibility (with community’s culture)
- low complexity
Does your innovation meet these criteria? The authors conclude that although GIS violates these conditions more often than not, they are cautiously optimistic about the ability of planners to “develop strategies that will encourage transformations of planning practice in response to the opportunities that GIS offer.”
Without going into a full discussion, the prognosis was prescient. Although GIS has emerged as a discrete technology and profession, planners have influenced its development and have created new tools to fit their needs, such as web-based data viewers or specialized analysis tools. In a recent exchange, several scholars argued the field had lost some control over the technology, while others answered that new Web 2.0 technologies may actually be much better suited to the collaborative nature of planning practice than older technologies were.
Although new technologies today are radically different from those that spawned these theories, following them seems reasonable advice for success. However, they also caution against naive assumptions that technology leads to organizational transformation. According to the principle of compatibility, if an innovation is not compatible with the prevailing organizational culture, winning acceptance may require nothing less than cultural change.
JAPA: “Implementing GIS for Planning: Lessons from the History of Technological Innovation”
Posted: October 17th, 2010 | Author: Rob Goodspeed | Filed under: Government, Technology | Tags: crowdsourcing, Gov 2.0 | Comments Off
Lately I’ve been involved in a lot of conversations about crowdsourcing in the public sector. Although they’re sometimes confused, in general I think there are two types we can talk about: crowdsourcing policy (or ideas) and public goods (tangible work or services). This is a topic included in my Open Government Strategy for the City of Boston.
The best analysis of private sector crowdsourcing of ideas is this recent article in the Sloan Management Review. The researchers analyze four crowdsourcing projects: Linux, Wikipedia, Innocentive, and Threadless. By breaking down the organization of each case, they make clear these projects are not a utopian creative free-for-all, but instead carefully constructed sets of rules and practices that combine forms of decision-making, creativity, and incentives in new ways to create new ideas. For example, Wikipedia relies on the decisions of editors for disputed articles, Threadless users vote for the best ideas, and on Innocentive, businesses pick the winners. In each case the rewards to the contributors differ, but they exist even when non-monetary, often in the form of “love” or “glory.”
Two examples of policy crowdsourcing are Peer to Patent and Next Stop Design. Peer to Patent opens patent applications, with the permission of the applicant, to a pilot system which allows the public to contribute to the research on “prior art.” The idea is that by allowing experts to contribute to this process, they can accelerate the work of the Patent Office in determining which ideas deserve patents. The project was founded by Beth Simone Noveck, a professor at New York Law School who leads the Obama administration’s open government initiative. The project is successful because it enables topic experts to conveniently contribute information that expedites the official process. However, it remains a voluntary pilot project and has not been taken to scale for the entire government.
The Next Stop Design project, launched by researcher Daren Brabham, solicited designs for a Salt Lake City bus stop from around the world. Daren, now a professor at UNC Chapel Hill, wrote a PhD dissertation about public sector crowdsourcing. If you can access it, he lays out his approach in a recent article in Planning Theory. He argues that crowdsourcing can replace conventional approaches of citizen participation:
In essence, any urban planning project is predicated on a problem. Typically that problem is how best to accommodate changing populations with different infrastructure, all while considering the interests of residents, developers, business owners, and the environment. If a problem can be framed clearly, and if all the data pertaining to a problem can be made available, then that problem can be crowdsourced.
Since I’d argue most planning projects involve multiple, contested problems, I’m not sure crowdsourcing can replace a host of existing theory and approaches. However, where the problem contains a significant design element, and the boundaries are noncontroversial (such as a bus stop), it may be an excellent strategy.
Finally, what about crowdsourcing public goods themselves? In the words of Tim O’Reilly, can government be a “platform for greatness”? Last month I argued such thinking ignored the realities of government: power is divided between agencies, it’s run by politicians, and most people may not agree this is the way to go to begin with. The problems seem more surmountable at the local level. Mitch Weiss, the Boston mayor’s chief of staff, raised the issue in a provocative talk at the Rappaport Institute titled “How ‘Peer-Produced’ Government Can Help Fill Potholes, Save Cities, and Maybe Even Rescue Democracy.” I worked with him last summer, and I think their initiatives to release data and improve citizens’ ability to communicate with government have been very positive. However, I’m not sure the city will ever coordinate peer-produced services.
Even if we can overcome the formidable institutional and political barriers, there are good reasons why governments may never be directly involved in facilitating the peer-production or crowdsourcing of public goods. I encountered a good explanation of why this summer at the iGov Research Institute. Bas Kotterink, a researcher with the Netherlands research organization TNO, proposed the following hypothesis in a presentation:
Governments are not geared for co-creation. Instead, they should facilitate and monitor user and company-led innovation of public tasks with a more proactive role in democracy (inclusion) and enforcement, protecting basic human values such as privacy and dignity.
He argued that since the rules of private and government action are so different, initiatives at either extreme are the most able to produce public goods. Mixtures of the two – such as some e-participation initiatives – are doomed to fail. However, he stakes out an important role for government. Governments can ensure minimum standards for key services by punishing offenders and enforcing regulations, or by providing services themselves when market failures occur. They can promote data standards and access to public data (such as in apps competitions). They can define and protect standards of individual privacy. Although they may not directly produce certain public goods as in the past, governments will continue to play a critical role we are only beginning to understand.
Posted: September 22nd, 2010 | Author: Rob Goodspeed | Filed under: Technology | Tags: Boston, Gov 2.0, open data, open government | 2 Comments »
This post is the first of a two part series on my work creating an open government strategy for the City of Boston this past summer.
During his campaign for the presidency, Barack Obama often mentioned expanding civic participation. Solving our toughest problems, he argued, would require action by both government and regular citizens. “The most important thing we can do right now,” he said in February 2008, “is to reengage the American people in the process of governance.”
Once elected, to realize this vision of a more participatory government, President Obama turned to the very tool his campaign had exploited so successfully to win: the Internet.
Issued his first day in office, Obama’s Memorandum on Transparency and Open Government called for the Federal government to “harness new technologies to put information about their operations and decisions online and readily available to the public,” and directed the newly-appointed Chief Technology Officer to coordinate the implementation of this vision through an Open Government Directive. The policy described three types of activities: transparency, participation, and collaboration. The wide-ranging efforts sparked by the memorandum and the subsequent directive include experiments with online commenting and discussion, a new government data portal, and targeted programs to capture citizen ideas and expertise.
The Open Government Directive made the intended purpose of the initiatives clear: for Federal agencies to gain knowledge and effectiveness without ceding any authority. The purpose of collaboration is to improve effectiveness by coordinating private actors with Federal goals. Participation is about capturing ideas and expertise, not delegating power to citizens. Only transparency has political overtones, but only in the most abstract sense: it is supposed to improve “accountability.”
If citizens want to be engaged in a more substantive way in the governance of the nation, it seems they should look elsewhere. That place is often cities and towns, where more complex forms of citizen participation are not only possible but commonplace. Although cities have experimented with publishing data and using technology to engage their citizens, they have existing political structures and limited technical resources. How can they translate the principles of open government into specific initiatives? Can the potential for greater citizen influence be realized at the local level? In short, what does it take to become an “open” city?
An Open Government Strategy for Boston
I attempted to answer this question last summer as a Rappaport Public Policy Fellow at the City of Boston, working with Nigel Jacob, Chris Osgood, and Mitch Weiss in the Mayor’s Office, and Chief Information Officer Bill Oates in the Department of Innovation and Technology. The project involved two components:
- An open government strategy (describing what to do and why)
- A technical assessment of two specific recommendations (data and ideas portals).
This post covers the first component; I plan to post the second shortly.
Although I considered alternate frameworks, I decided to keep the Obama Administration’s proposal that “open government” should include the categories of transparency, participation, and collaboration. However, each category is tailored to the unique characteristics of local government, and Boston in particular. In addition, since each level requires greater government resources to implement, I argued higher levels should be focused on topics identified as priority areas by the city’s elected officials. Although I view open government as creating a more participatory democracy, it exists in combination with the representative legitimacy of elected officials.
The City of Boston already publishes extensive amounts of information through its website. My report identified 19 data “tools” allowing citizens to query data in city databases, 7 datasets already published online, and 30 GIS layers routinely released in response to inquiries. However, few of these are published in “raw” formats easily consumed by citizens or data intermediaries, who can re-use the data to create citizen-facing apps and services. In addition, legislative information is available but not easily navigated or searched.
Therefore I proposed following the lead of other cities by creating a central public data portal to host data intended for the public. Such a portal would make data easy to find, encourage consistent metadata, and make data available to developers in a controlled way through an Application Programming Interface (API). This system should serve multiple users: researchers seeking raw data, developers seeking an API, the general public who wants limited navigation functions, and all users with the ability to comment and provide feedback on data usefulness and quality. Separate from the data portal, I also contributed to the process of evaluating legislation management systems, which serve both the City Clerk’s need for organization and citizen’s desire for access to legislative information.
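To make the idea of “controlled access through an API” concrete, here is a minimal sketch of the catalog such a portal might maintain: records with consistent metadata, queryable by keyword or format. The field names and dataset entries are hypothetical illustrations, not drawn from Boston’s actual systems.

```python
# A minimal sketch of a data portal catalog. The metadata fields
# (title, agency, format, updated) and the example datasets below
# are hypothetical, not Boston's real schema.

CATALOG = [
    {"title": "Building Permits", "agency": "Inspectional Services",
     "format": "csv", "updated": "2010-08-01"},
    {"title": "Crime Incidents", "agency": "Police Department",
     "format": "json", "updated": "2010-07-15"},
    {"title": "Assessing Parcels", "agency": "Assessing Department",
     "format": "shapefile", "updated": "2010-06-30"},
]

def search(catalog, keyword=None, fmt=None):
    """Return metadata records matching a keyword and/or format.

    This stands in for the kind of query an API endpoint
    (e.g. a hypothetical /datasets?q=permits&format=csv)
    might answer for a developer."""
    results = []
    for record in catalog:
        if keyword and keyword.lower() not in record["title"].lower():
            continue
        if fmt and record["format"] != fmt:
            continue
        results.append(record)
    return results

# Developers would hit the API; the general public could browse the
# same records through a simple web front end built on it.
print([r["title"] for r in search(CATALOG, fmt="csv")])
```

The point of the sketch is that consistent metadata is what makes a single catalog serve all four audiences at once: the same records back raw downloads, API queries, and a browsable front end.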
As in most cities, Boston’s city employees interact with citizens on a daily basis. I proposed augmenting the existing approaches with an online feedback portal. Similar to HUD’s Ideas in Action website, it would include forums dedicated to specific topics where citizen input is desired. Organizing participation through a central website could result in improved transparency, consistency, and satisfaction for citizens and city officials alike. Of course, the process of crafting plans and policies will always involve committees, meetings, hearings, and perhaps someday mechanisms like participatory budgeting. The online portal would serve as a low-commitment and understandable starting point for more complex forms of citizen participation.
Finally, building on top of transparency and participation is the most complex form of citizen engagement, collaboration. I proposed several types:
- Application competitions or initiatives to encourage private developers to create citizen-facing apps that use city data.
- Innovation, analysis and visualization challenges to encourage creativity to re-think existing processes, or explore complex datasets.
- Formalizing a variety of ways to collaborate with academic researchers to produce reports, policy analysis, urban designs and plans, and other products at minimal cost to the city.
Of course, taking these steps will involve significant organizational and technical change for the city. My next post will summarize the technical options for implementing the two specific recommendations, the data portal and ideas website. However, as with the Federal government, open government isn’t about technology per se, but how it is used. In that spirit, my report concludes with a brief section on the theories of “targeted transparency” and participatory democracy. Although the product of a summer of discussions with city officials, this strategy is nothing more than a loosely theorized set of recommendations about how technology can achieve some modest goals of improving access to information and communication with the existing government. It does not consider whether it’s possible to re-imagine government in a more fundamental way. For that, I look forward to a lively discussion with Mitch Weiss on Thursday.
Read the report: Open Government Strategy for the City of Boston (pdf)
Other posts: Open Government Reading List, What Government Data Should Be Transparent?, What is Government 2.0?
Posted: September 13th, 2010 | Author: Rob Goodspeed | Filed under: eGovernment, Government, Technology | Tags: Gov 2.0 | 2 Comments »
One of the most visible supporters of technical innovation in government recently has been Tim O’Reilly. Perhaps best known for popularizing the term “Web 2.0,” O’Reilly’s media company publishes popular software manuals and organizes industry-leading conferences for Internet entrepreneurs. In the past few years, he has increasingly turned his attention to applying innovative internet technology to government, organizing in 2009 the inaugural Gov 2.0 Summit and Expo in Washington, D.C., events that bring together high-ranking government officials and technology gurus.
O’Reilly’s agenda includes nothing less than the complete transformation of government. The internet has unleashed tremendous creativity through Web 2.0 websites, he reasons, so why can’t similar results be organized for government? The argument is presented as a chapter in an edited volume published by O’Reilly Media last May titled Open Government. The chapter, titled “Government as a Platform,” is also available online, and summarizes the argument he’s made in many blog posts and lectures:
Web 2.0 was not a new version of the World Wide Web; it was a renaissance after the dark ages of the dotcom bust, a rediscovery of the power hidden in the original design of the World Wide Web. Similarly, Government 2.0 is not a new kind of government; it is government stripped down to its core, rediscovered and reimagined as if for the first time.
And in that reimagining, this is the idea that becomes clear: government is, at bottom, a mechanism for collective action. We band together, make laws, pay taxes, and build the institutions of government to manage problems that are too large for us individually and whose solution is in our common interest.
Government 2.0, then, is the use of technology—especially the collaborative technologies at the heart of Web 2.0—to better solve collective problems at a city, state, national, and international level.
For too long government has been nothing more than a vending machine, O’Reilly argues, dispensing services to citizens in exchange for taxes. When we didn’t like what it produced, we resorted to shaking the machine — political protest. What we should be doing, O’Reilly argues, is creating a government which enables collective action, and captures the energy and innovation of the marketplace. In short, government should be a “platform of greatness,” coordinating and empowering individuals to serve the public interest.
The concept has caught on in some circles, embraced by groups like New York City’s Open Planning Project, a nonprofit dedicated to open source mapping software, open data, and democratizing the planning process, which included O’Reilly in a recent film about the value of publishing transit data. O’Reilly showed this film during his opening remarks at this year’s Gov 2.0 Summit, which concluded earlier this month. However, the opening struck a sober tone. After the enthusiasm of the first event, achieving Government 2.0 is “harder than it appears,” he conceded. However, O’Reilly said he still believes “Gov 2.0 answers the debate we’ve been having whether government is too big or too small … and creates the possibility of doing less and getting more.”
Indeed, technical innovations are slowly filtering into government. With open standards and the growth of sophisticated free and open-source technology, more and more proprietary and difficult-to-use vendor products are finally feeling healthy competition. Government data has the potential for improving journalism, access to services, and the evaluation of policy.
However, if we are to follow the metaphor to its logical conclusion, to truly reinvent government along different lines, what obstacles might we face? How might the lessons of Wikipedia, Facebook, and YouTube be applied to the ancient art of government? Unpacking these obstacles may help explain why the path of government reform is a difficult one.
1. What’s a “Platform” Anyway?
In his thoughtful recent article “The Politics of ‘Platforms’” Tarleton Gillespie argues web companies use the word platform in a variety of ways. To regulators, they’re merely neutral platforms, not responsible for the views expressed by participants and exempt from regulation. To users, they’re privileged platforms subject to detailed terms of service and censorship of offensive content. To other media companies, they’re lucrative platforms for profit. He concludes, “in other words, [these examples] represent an attempt to establish the very criteria by which these technologies will be judged, built directly into the terms by which we know them.”
In addition, all of these private “platforms” have some kind of internal governance that sets the rules for participants. Whether groups of editors on Wikipedia or a corporate board, none institutionalizes a type of governance anywhere near the complexity of real government. In fact, most are basically benevolent dictatorships, with CEOs held accountable by market forces. And if they can establish a monopoly, they’re restrained only by goodwill (such as “don’t be evil”) and any applicable laws. This issue brings us to the second obstacle.
2. The Federalist System
From outside of government, it’s easy to assume government has the power to do whatever it wishes so long as the elected officials agree and can obtain sufficient funds. Not true. Public power is carefully and deliberately divided among a bewildering array of states, agencies, municipalities, districts, and quasi-public entities. As an example, some of the best ways to curb harmful externalities (like carbon dioxide emissions) are taxes. In Massachusetts, cities cannot create new taxes without the approval of the state legislature. Period. The federal government often seeks to reform education. The only problem? Schools are run by local school boards. Federal education policy has a variety of carrots and sticks at its disposal, but only local school districts can control every aspect of schooling or implement radically innovative new programs. This brings us to the next obstacle: who’s in charge.
3. Who’s in Charge?
It’s easy to think Federal agencies are out there advancing, say, transportation or health and human services in a general way. To the contrary, they operate under specific legislative guidelines. At the local level, although more policy entrepreneurship is possible, it always occurs under the watchful eye of lawyers and is generally subject to legislative intervention. In fact, in theory elected or appointed officials run the whole operation of government. Implementing Government 2.0 must therefore involve the hard work of crafting detailed proposals, lobbying, and promotion used by any interest group. Which leads to the most incorrigible force of resistance of all.
4. We the People
O’Reilly is confident his vision of Government 2.0 transcends ideology. I’m not so sure. Any proposal for how government should operate is inherently ideological. His is no different. It includes a celebration of the market, belief in the power of individual creativity, and a desire to get government out of the business of providing direct services. In these ways it can be said to resemble neoliberalism quite closely, although perhaps with the assumption that collective action outside of the market is necessary. This ideology may seem appealing to technologists, but a host of Americans may think otherwise. Leftists may prefer the old vending machine (where we can ensure the quality of public services), and conservatives may want to continue to shrink government. Even if it costs less, they might argue, we shouldn’t be tackling some problems through government at all.
The goal of this post is not to deflate the momentum of Government 2.0 advocates, but to temper their enthusiasm with some realism. Publishing open data about transit service does seem somehow new for government. Yet we should never lose sight of what’s happening: a marginal increase in convenience for citizens, and some modest profits for software developers. Organizing a distributed, crowdsourced alternative to the subway? If it were even possible, this would require the cooperation of multiple government agencies, breaking union contracts, re-writing state law, and convincing everyday citizens an alternative to the existing one-agency system is desirable.
For these reasons, achieving public benefits through technology is often easier to organize completely outside of government. For example, a grassroots movement to clean up Estonia in one day was very successful, but nearly impossible to imagine under the guidance of a government agency. (What about liability? What about union rules? Did the legislature authorize everything properly?)
In Boston, we’ve anxiously awaited real-time arrival data for buses and trains. However, I’m not sure how that relates to the health of the underlying service, with billions of dollars of debt and backlogged maintenance. Until we can figure out how to use technology to tackle those problems (Crowdsourced railcar maintenance? DIY track inspections?) Government 2.0 will remain a buzzword and not a true reform movement.
Posted: May 11th, 2010 | Author: Rob Goodspeed | Filed under: Government, Technology | Tags: Gov 2.0, Government Data, Transparency | 3 Comments »
At an event I attended in March, Massachusetts’ Chief Information Officer Anne Margulies raised a simple yet profound issue. Although it is committed to open data, the Commonwealth had yet to figure out which datasets to post online through its new data portal mass.gov/data.
Plenty of transparency advocates would say the answer should be “all of it.” However, I think this answer is unsatisfactory for a couple of reasons. First, Massachusetts faces very real resource constraints. Administrative data is managed by hundreds of legacy systems across over 100 independent agencies. Many of these systems contain personal or otherwise sensitive data that precludes throwing open the doors, and releasing the rest requires time to create public reporting scripts. Second, the “free it all” position overlooks the government’s role as data collector. Plenty of information is collected and released merely as a public service: environmental data, population statistics, etc. Instead of just focusing on making paper records digital, we should discuss the larger issue: what types of information should governments make available?
I think there are several basic categories of data government should release. Each has its own logic, and a review of the categories can emphasize the multiple purposes of transparency.
1. Data “About the World” To Inform Research and Policy Debate
For a variety of reasons, governments often collect some of the most accurate and up-to-date descriptive data about communities. This includes a vast array of geographic data, school and testing data, demographic data, employment and economic statistics, and more. It should be released primarily because it enhances our ability to create good policy, or our collective understanding more generally.
2. Data Released to Improve Service Delivery
Some data should be released because it improves access to government services. This includes cases where the data itself is the service (e.g., research reports), but also includes more technical forms such as transit system data, government facility locations, and service details.
3. Data to Help Hold Government Accountable
A host of budget, voting, and performance data should be released to hold government accountable. However, metrics produced internally as part of stat-type programs introduce the problem of mixed motives. Why would governments want to release data that can be used against them? This problem can be partially avoided by separating data collection from operations within the government organization. This concern also raises the important issue of presenting information in accurate ways, including metadata about definitions and collection methodology.
4. Data to Change Private Decisions to Achieve Policy Goals
In their book Full Disclosure, Archon Fung, Mary Graham, and David Weil argue that many transparency policies fall into the new category of “targeted transparency.” Including mortgage reporting requirements, nutrition labels, and automobile crash ratings, these efforts make information available with the deliberate intention of achieving a public objective by influencing private decisions. These policies succeed when they provide people the facts they want in the “times, places, and ways that enable them to act.” The authors stress these aren’t limited to policies seeking economic changes, but also include campaign finance reporting laws, which work through political channels. Although implemented with the intention of reaching end users, the ease with which citizens can access this data varies widely. Some data are readily available, but governments rely heavily on intermediaries to analyze and present more complex (and politically charged) data like the toxics release inventory or mortgage lending data from banks.
5. Data Posted to Improve Access Within or Across Government
Although it’s rarely discussed, I think an important use of available data is to help break down barriers within and between government agencies. This use will remain largely unplanned so long as our governments are separated into layers and silos. This purpose explains why so much of the data on the HUDUser website is specific to certain policies or programs: the intended users are state and local governments and nonprofits, not the general public.
What do you think? Are these the right categories, or have I omitted something important?