Sunday, September 25, 2022
Kulick: Corporations as Interpreters and Adjudicators of International Human Rights Norms – Meta's Oversight Board and Beyond
Andreas Kulick (Univ. of Tuebingen) has posted Corporations as Interpreters and Adjudicators of International Human Rights Norms – Meta's Oversight Board and Beyond (The Law and Practice of International Courts and Tribunals, forthcoming). Here's the abstract:
Social media platform corporations such as Meta (Facebook) and Twitter find themselves in the position of having to interpret international human rights norms, in particular Article 19 of the International Covenant on Civil and Political Rights (ICCPR), which protects freedom of expression. Millions or even billions of content moderation decisions must be taken on these platforms each day, decisions that affect users' human rights interests. Since content moderation is integral to the technical and commercial set-up of these platforms, corporate decision-making vis-à-vis human rights, and thus corporate interpretation of international human rights norms, is inevitable. Yet corporations are flawed interpreters. While they act, like a court or tribunal, as triadic decision-makers, they do not share the neutrality, impartiality, and independence of a court or tribunal. In particular, they are accountable to their shareholders and pursue commercial interests when moderating content. This article grapples with the theoretical and doctrinal implications of flawed but inevitable corporate human rights interpretation. Taking as a case study the early practice of the Oversight Board, a body established by Meta, Inc. (Facebook) to tackle the 'hard cases' of content moderation, it lays bare the pitfalls and challenges of corporate human rights interpretation. In the end, I offer a few suggestions to remedy a phenomenon that seems here to stay, at least for as long as an important part of public discourse is channelled through social media platforms.