Tarleton Gillespie, Microsoft Research; Cornell University
Mike Ananny, University of Southern California
As social media platforms inseparably embed themselves in social life, a paradox emerges: they become both more powerful and harder to see. Critical inquiry into platforms and their relevance to society, when it occurs, is often colonized by the perspectives of platform makers themselves: the values that motivated them, and the cultures and ideologies that sustain those values in design, dominate the ways of understanding a platform’s significance. These systems, people, and values are too often out of reach of scholars whose job is ‘to make explicit, orderly, consistent, and open to critical analysis, these “orientations” that are usually taken for granted by empirical researchers’ (Calhoun, 1995, p. 5). We are left needing new, empirically grounded critical modes of inquiry to describe and critique the invisible power of platforms.
We propose one such avenue by combining tools drawn from the study of infrastructures and exceptions. Information infrastructures, the ‘scaffolding in the conduct of modern life’ (Bowker & Star, 1999, p. 47), are the largely unseen social and technological forces governing public action. They are embedded, transparent, taken-for-granted, ruled by unquestioned standards, and visible only when seen as failing (Star & Ruhleder, 1996). Infrastructures want to persist.
Exceptions are the conceptual complements to infrastructures. They represent breaks in either the prescriptive rules governing action or the descriptive rules illustrating patterns (Schauer, 1991b). Exceptions show how some actors have power ‘to change rules and to avoid their constraints’ (Schauer, 1991a, p. 873). Exceptions aim to elide. Eventually, the patchwork of ‘bottom-up, relational processes’ out of which exceptions emerge can lead to the ‘institutionalization of new rules’ (Colyvas & Maroulis, 2015, p. 601) that can re-orient platforms, vacate guiding principles, re-structure power relations, or suggest the need for entirely new organizational forms.
To understand the power and invisibility of platforms, then, we must study how they change. When the ‘rules’ governing platforms (everything from algorithmic codes to social norms to regulatory regimes) bend or break, some platform dynamics are seen as immutable, essential, and uncontroversial (infrastructural), while others appear malleable, contingent, and contestable (exceptional). Fissures thus become diagnostic: ways to see which aspects of platforms are ‘inside’ a domain of power, beyond the reach of critics, and stable; which are ‘outside’ and changeable only through exogenous events and critical agents; and which sets of social, technological, and organizational arrangements become the ‘obligatory points of passage’ through which changes must flow (Law & Bijker, 1992).
We focus our inquiry on two types of platform exceptions: reversals (when norms of acceptability, government oversight, and commercial pressure cause social media platforms to voluntarily overturn their own normative principles) and conversions (when social media platforms adjust policies, revise algorithms, and alter patterns of use in order to standardize exceptions into new platform procedures).
When do platforms refuse participation? The governance of social media platforms is founded on a contradiction: making an open space for speech and participation by deleting content and suspending users. Platforms are now in a pitched battle against trolls, pornographers, harassers, and racists, seeking not just to identify and inhibit them, but to position them as misusers of otherwise progressive and healthy information infrastructures (Brunton, 2013).
There is no more powerful glimpse into this tension than the handling of extremist content. Terrorist organizations employ social media in powerful ways: to circulate their beliefs, lure new recruits, and spread fear and uncertainty. From one view, they undermine the principles animating social media: open debate, democratic participation, global understanding. From another, they respond to the very same invitation to connect and participate that social media have long offered: express your beliefs without intermediaries, connect to those you otherwise could not, form lasting and meaningful global bonds. They are, in the extreme, power users.
> In 2008, calls from prominent U.S. and U.K. politicians convinced YouTube to remove a handful of ‘terrorist training videos,’ retain others, defend its protection of free speech, and alter the way users flagged videos with extremist content.
> Twitter and Facebook have inconsistently responded to ISIS-produced videos of beheadings, struggling not only to discern what is newsworthy from what is harmful, but to account for the fact that non-extremist users choose to circulate these videos.
> Recently, the U.S. and several European governments have called upon Silicon Valley companies to ‘do more’ in response to ISIS. Their proposals reveal how terrorism pushes these companies to reverse their stated philosophies (Gillespie, 2010): removing users for political speech, tracking users without their consent, feeding data to governments, algorithmically manipulating results, specifically altering their index, and blurring content and advertising.
Major social media platforms now promise to inhibit extremist content in a variety of ways: all forms of governance using the Internet’s infrastructure (DeNardis, 2012) against reprehensible users. This is ‘normal discipline in the age of crisis’ (Cohen, 2006, p. 19). These power users test these infrastructures and their institutional fortitude: when does an intermediary choose to become a barrier? Under what conditions is a provider of information willing to stop providing information and upend its claimed ideals in the process?
When do private media platforms behave less like ‘noxious markets’ (Satz, 2010) and more like public services? In certain circumstances, ostensibly private online environments of algorithmically optimized peer-to-peer exchange shift from marketplaces that eschew external regulation to platforms that look or act like public service platforms for distributing non-rivalrous and non-excludable goods and services (Baker, 2002). For example:
> During a 2014 hostage crisis in downtown Sydney, a social media outcry forced Uber to refund customers and change the algorithm that charged riders a minimum of $100 and four times the standard rate to escape from danger. And before a severe snowstorm forecast in 2015, Uber capped its New York City surge pricing to 2.8 times the normal rate and donated all proceeds to the American Red Cross.
> After many AirBnB hosts spontaneously offered free or low-cost accommodation to New Yorkers displaced by Hurricane Sandy in 2012, AirBnB created a ‘disaster response’ program that waived platform fees, encouraged hosts to offer listings for free, and partnered with cities to identify new potential AirBnB hosts.
> After Google’s crowdsourced traffic routing app Waze directed large numbers of drivers onto narrow, remote, and normally quiet Los Angeles streets, residents complained to Google and the City, and filed fake collision reports in an attempt to trick the app’s routing algorithm. Google refused to change the app, saying that its algorithm helped commuters cope with the real culprit: traffic jams caused by LA’s poor urban planning.
What drives these instances of discovering or designing for public interests on private platforms? In each case, there was an exceptional conversion (Ananny, 2015) in which a platform’s social or technological features changed in response to exogenous events, external actors, or normative rationales. The platform shifted from being purely a place for peer-to-peer exchange of private goods to being an environment for discovering, debating, and trading in public goods. These conversions entailed some mix of algorithm redesign, policy revision, and norm change. But why do only some moments force conversions, what types of thinking about public goods do they reveal, and how are these logics tied to platform design?
In reversals and conversions we see evidence of ‘exceptions’: moments when a platform’s dominant values change because some event or actor successfully calls that platform to task and pierces the protective discursive claims with which it surrounds itself. Using these cases of reversal and conversion, we explicate the idea of ‘exceptional platforms,’ read contemporary platform dynamics for evidence of such changes, and suggest that these concepts might be used to trace the interplay between the dominant and subversive forces that govern platforms, and thereby to explain how and why platforms change.
Ananny, M. (2015). From noxious to public? Tracing ethical dynamics of social media platform conversions. Social Media & Society, 1(1), 1-3. doi: 10.1177/2056305115578140
Baker, C. E. (2002). Media, markets, and democracy. Cambridge, UK: Cambridge University Press.
Bowker, G. C., & Star, S. L. (1999). Sorting things out: Classification and its consequences. Cambridge, MA: The MIT Press.
Brunton, F. (2013). Spam: A shadow history of the internet. Cambridge, MA: The MIT Press.
Calhoun, C. (1995). Critical social theory. London, UK: Blackwell.
Cohen, J. E. (2006). Pervasively distributed copyright enforcement. Georgetown Law Journal, 95(1), 1-48.
Colyvas, J. A., & Maroulis, S. (2015). Moving from an exception to a rule: Analyzing mechanisms in emergence-based institutionalization. Organization Science, 26(2), 601-621. doi: 10.1287/orsc.2014.0948
DeNardis, L. (2012). Hidden levers of internet control: An infrastructure-based theory of internet governance. Information, Communication & Society.
Gillespie, T. (2010). The politics of ‘platforms.’ New Media & Society, 12(3), 347-364.
Law, J., & Bijker, W. E. (1992). Postscript: Technology, stability, and social theory. In W. E. Bijker & J. Law (Eds.), Shaping technology/building society: Studies in sociotechnical change (pp. 290-308). Cambridge, MA: MIT Press.
Satz, D. (2010). Why some things should not be for sale: The moral limits of markets. Oxford, UK: Oxford University Press.
Schauer, F. (1991a). Exceptions. The University of Chicago Law Review, 58(3), 871-899.
Schauer, F. (1991b). Playing by the rules: A philosophical examination of rule-based decision-making in law and in life. Oxford, UK: Oxford University Press.
Star, S. L., & Ruhleder, K. (1996). Steps toward an ecology of infrastructure: Design and access for large information spaces. Information Systems Research, 7(1), 111-134.