It seems that every day the word ‘platform’ becomes more ingrained in the way we think about online tools to do good and address public problems. The ubiquity of the term may be due to its fundamental ambiguity, which it shares with other terms like ‘sustainability’ and ‘participation.’
In an incisive article on the subject last year, Tarleton Gillespie analyzed how the word "platform" was used by major players like Flickr, YouTube, and Google. (I mentioned his article previously but will summarize the thesis here.) He points out the contradictory ways the companies use the term as part of a rhetorical strategy to serve their interests. On the one hand, as platforms they argue for limits to legal liability for the actions of their users. On the other, as platforms of opportunity for advertisers, they define and enforce restrictions on users' speech and activities. He concludes: "the discourse of the 'platform' works against us developing such precision, offering as it does a comforting sense of technical neutrality and progressive openness."
However, as we consider how to apply innovative online technologies to community engagement or governance activities, talk of 'platforms' can be troubling from another point of view as well.
Discussions of sociotechnical systems argue that humans are just as important as the technical artifacts. An extensive literature on usability and systems development has developed a nuanced understanding of any system as a composite of technical and social components. As a simple example, what an expert user can do with a laptop is very different from what a grandparent can do upon first receiving one. On a larger scale, one theorist argues that "the remarkably low accident rates in commercial air transport, for example, reflect the success of vigilant organizations, legal apparatus, and social learning about accidents as much as they demonstrate the quality of aircraft design and maintenance." Malcolm Gladwell's fascinating discussion of air safety in Outliers describes how improving air safety often entails new social rules, such as banning idle chatter in the cockpit during key phases of flight, not simply technical ones.
Just as it obscures the internal tensions between different interests, the term "platform" alienates us from this more contextual view of technology. We often jump to the position that solving the problem entails designing the platform, implying it is a neutral system equally usable by any visitor. In reality, according to the theory proposed here, solving any problem involves modifying or creating both social and technical components. We are dimly aware of a first-mover advantage in a "space," but much less aware of the process of creating a useful system. In fact, social construction theory argues that technologies are mutually constructed between system designers and engineers on one side and users on the other. Internet "platforms" such as Facebook and Twitter are both powerful independent companies and participants in a subtle dialogue with their users about how their systems should evolve. The simplest examples are how Twitter has incorporated hashtags and @-replies into its technical architecture, and how Facebook has gone through a well-publicized dance over how to manage the news feed, privacy settings, and even whether you can delete your account.
Of course, this links directly to broader debates about the merits (and measurement) of investments in physical versus social infrastructures. Although that debate can never be fully resolved, the purpose of this post is to temper technical enthusiasm with a more nuanced view of the origin and evolution of a new category of sociotechnical systems: online platforms.