The Internet, Policy & Politics Conferences

Oxford Internet Institute, University of Oxford

Nicolas Suzor: The responsibilities of platforms: a new constitutionalism to promote the legitimacy of decentralized governance

Nicolas Suzor, Queensland University of Technology

The ways in which platforms are governed _matter_. Platforms mediate the way people communicate, and the decisions they make have a real impact on public culture and the social lives of their users. The extent to which this is true is obscured by the discursive work undertaken by platforms to distance themselves from the suggestion that they do any 'governing' at all (Gillespie 2010). But platforms, of course, are not neutral. Their architecture (Lessig 2006) and algorithms (Gillespie 2014) shape how people communicate and what information is presented to participants. Their policies and terms of use are expressed in formally neutral terms, but the powers they provide are carefully wielded and selectively enforced (Humphreys 2007). Their ongoing governance processes are shaped by complex socio-economic forces (van Dijck & Poell 2013) and the interplay of emergent social norms (Taylor 2006).

Platforms are also increasingly being co-opted in public regulatory projects. Nation states around the world are coming to the realization that the only effective and scalable way to regulate the actions of people on the internet is through online intermediaries. Copyright law provides the most developed example: notice and takedown procedures under the US Digital Millennium Copyright Act are almost a de facto rule of large western platforms, and Google alone now receives over 65 million takedown notices for its search engine results from copyright owners (Google Inc, 2016). Building on the success of copyright law, governments are increasingly requiring online intermediaries to do more to respond to privacy claims, to disclose information about their users, and to block access to content they deem objectionable or unlawful.

Civil society groups, too, are seeing some success in influencing the governance of private networks. The discourse is increasingly framed in the recognition that private firms 'should address adverse human rights impacts with which they are involved' (United Nations, 2011). This new language of 'responsibility' (Ruggie, 2008) has been adopted by disparate global groups of state and non-state actors in debates over freedom of speech, rights of individual privacy, and rights to be free from harassment and abuse (Citron, 2014; UNESCO, 2014). Pressure on intermediaries is steadily mounting from all sides, including civil society groups that are actively lobbying for intermediaries to resist obligations that would limit freedom of speech (e.g. Kiss, 2014; IPRC 2014; Global Network Initiative 2012).

This paper presents a review of the legal terms and conditions of fifteen of the largest English-language social media platforms. Each contract was analyzed to identify the extent to which it provided protections for the interests of users. In all cases examined, the terms of service provided broad, unfettered discretion to platform owners. Like constitutional documents, terms of service grant powers; but unlike constitutions, they rarely limit those powers or regulate the ways they are exercised.

I argue that a new constitutionalism is needed to protect substantive and procedural rights in a decentralized regulatory environment. Existing conceptions of constitutionalism, which delineate the appropriate limits of regulatory power, are insufficient in this context (Black, 1996). Because terms of service are thought of as private contractual bargains, the law has no established language through which to understand where the limits on private platform governance ought to be drawn (Suzor, 2011). In legal terms, the discretion of the platform owner is practically absolute. The language of constitutional rights (freedom of speech and association, requirements of due process and natural justice, rights to participate in the democratic process) has almost no application in the 'private' sphere; constitutional law applies only to 'public' actions (Berman, 2000).

Recognizing that intermediaries always exercise some degree of regulatory control over their networks, I argue that some level of decentralization of governance is both inevitable and desirable. It will become increasingly important, however, to ensure that private platforms enforce rules in a manner that is regular, transparent, equally and proportionately applied, and fair (Fitzgerald, 2000; Suzor, 2011). This paper seeks to progress this debate by providing a framework to evaluate the legitimacy of platform governance in practice. I propose that the legitimacy of the ways in which the users of platforms are governed should be evaluated against the principles of the rule of law. In particular, I suggest that we should care deeply about the extent to which private governance is consensual, predictable, equal, and fair. The evaluation of terms of service presented in this paper on these metrics provides an important starting point for reconceptualising limits on governance power in the platform society.


Black, J. (1996). Constitutionalising Self-Regulation. The Modern Law Review, 59(1), 24–55. doi:10.1111/j.1468-2230.1996.tb02064.x

Citron, D. K. (2014). Hate Crimes in Cyberspace. Cambridge, Massachusetts; London, England: Harvard University Press.

Fitzgerald, B. F. (2000). Software as Discourse: The Power of Intellectual Property in Digital Architecture. Cardozo Arts & Entertainment Law Journal, 18, 337.

Gillespie, T. (2010). The politics of 'platforms'. New Media & Society, 12(3), 347–364. doi:10.1177/1461444809342738

Gillespie, T. (2014). The Relevance of Algorithms. In T. Gillespie, P. J. Boczkowski, & K. A. Foot (Eds.), Media technologies: essays on communication, materiality, and society (pp. 167–193). Cambridge, Massachusetts: The MIT Press.

Global Network Initiative. (2012). Principles. Available at: [Accessed June 2, 2015].

Google Inc. (2016, February 29). Copyright Removal Requests – Google Transparency Report. Retrieved March 1, 2016, from

Humphreys, S. (2007). 'You're In Our World Now': ownership and access in the proprietary community of an MMOG. In S. V. D. Graaf & Y. Washida (Eds.), Information communication technologies and emerging business strategies (p. 76). Hershey, PA: Idea Group Pub.

IPRC. (2014). Charter of Human Rights and Principles for the Internet. Internet Rights and Principles Coalition. Available at: [Accessed February 18, 2015].

Kiss, J. (2014, March 12). An online Magna Carta: Berners-Lee calls for bill of rights for web. The Guardian. Retrieved from

Lessig, L. (2006). Code (Version 2.0). New York: Basic Books.

Ruggie, J. (2008). Protect, Respect and Remedy: A Framework for Business and Human Rights. Innovations: Technology, Governance, Globalization, 3, 189.

Suzor, N. (2011). The role of the rule of law in virtual communities. Berkeley Technology Law Journal, 25, 1819.

UNESCO. (2014). Fostering Freedom Online: the Role of Internet Intermediaries. Retrieved from

United Nations. (2011). Guiding Principles on Business and Human Rights. Retrieved from

van Dijck, J., & Poell, T. (2013). Understanding Social Media Logic. Media and Communication, 1(1), 2–14.