Columbia Global Freedom of Expression seeks to contribute to the development of an integrated and progressive jurisprudence and understanding of freedom of expression and information around the world. It maintains an extensive database of international case law. This is its newsletter covering recent developments in the field.

Community Highlights and Recent News

Special Collection on the Case Law on Freedom of Expression: The Decisions of the Oversight Board from the Perspective of International Human Rights, by Joan Barata, presents a general overview of the decisions adopted by Meta’s Oversight Board (OSB) since its creation, particularly regarding the use of international human rights law to interpret the meaning and scope of the values and community standards governing Meta’s products. Barata offers critical insights into the way the Board uses international human rights standards originally created to govern the relationship between the individual and the State, and reflects on the nuances and adaptations introduced by the OSB when examining selected content moderation decisions. The main purpose of the publication is to serve as a tool for everyone interested in the challenges associated with the growing phenomenon of respect for, and application of, human rights by private technology companies. Case analyses of all the decisions to date will be published in the following weeks, and the first five are featured below.

  • View the panel discussion, “International Human Rights Law as the Basic Framework of Meta’s Oversight Board Decisions,” about the paper launch here. 

● Register Now! Laughing Matters? Humor and Free Speech in the Digital Age. Join Global Freedom of Expression, Temple Law School, and the University of Groningen for a symposium sponsored by the Dutch Research Council to explore the legal boundaries of humor and free speech in the digital age. The day-long event will showcase interdisciplinary collaboration between practicing lawyers, legal scholars, and humanities-oriented humor researchers who are working to map the juridical handling of humor across different regions. Aiming to set a foundation for further collaboration, this symposium will feature a series of short presentations on current issues and ongoing projects, followed by an open Q&A at the end of each panel. Friday, October 14, 2022. More information and registration here.

● AI Law and Policy Diploma Course. The Centre for Communication Governance (CCG) at the National Law University (NLU) Delhi is launching an eight-month online diploma course, curated by CCG and NLU experts, to explore legal, public policy, socio-political, and economic aspects of artificial intelligence: What opportunities and challenges does AI present? How should AI be governed domestically and globally? How should data protection principles apply to AI systems? The CCG invites lawyers, policy professionals, scholars, and graduates from various disciplines, as well as corporate, government, civil society, and media representatives, to participate in the course. Registration is open until October 20, 2022, 11:59 PM IST.

Decisions this Week

The Oversight Board
Oversight Board Case of Knin Cartoon
Decision Date: June 17, 2022
The Oversight Board overturned Meta’s original decision to leave up a Facebook post in which ethnic Serbs were depicted as rats. Although Meta eventually removed the content, its position shifted over time: it initially considered that the post did not infringe the company’s Hate Speech Community Standard; it later decided the post infringed the “spirit” but not the letter of the Standard, since the policy did not prohibit attacks against groups identified implicitly by a protected characteristic; and it finally decided the post did infringe the letter of the policy. In its decision, the Board found that the post breached both the Hate Speech Community Standard and the Violence and Incitement Community Standard. It found the post dehumanizing and hateful, and concluded it could contribute to a climate in which people feel justified in attacking ethnic Serbs. In the Board’s view, removing the content from the platform was necessary to address the severe harms posed by hate speech based on ethnicity, and aligned with Meta’s human rights responsibilities.

Oversight Board Case of “Two-Buttons” Meme
Decision Date: May 22, 2021
The Oversight Board overturned Facebook’s (now Meta) decision to remove a comment on Facebook that included an adaptation of the “two buttons” meme. The meme depicted a sweating cartoon character, with the Turkish flag substituted for his face, in front of two buttons with corresponding statements in English: “The Armenian Genocide is a lie” and “The Armenians were terrorists that deserved it”. Facebook considered that the line “The Armenians were terrorists that deserved it” violated the company’s Hate Speech Community Standard. After analyzing the content as a whole, the Board considered the comment satirical in nature: rather than mocking or discriminating against Armenians, the post criticized, and raised awareness of, the Turkish government’s contradictory denialism of the Armenian genocide. The Board likewise considered that Facebook’s restriction of the user’s freedom of expression was neither necessary nor proportionate under international human rights standards, since the removed content did not endorse hateful speech against Armenians; on the contrary, it criticized such speech.

Oversight Board Case of Mention of the Taliban in News Reporting
Decision Date: September 15, 2022
The Oversight Board overturned Meta’s original decision to remove a Facebook post from a news outlet page reporting a positive announcement from the Taliban regime in Afghanistan on women and girls’ education. The case originated in January 2022 when a popular Urdu-language newspaper in India reported on its Facebook page that Zabiullah Mujahid, a member of the Taliban regime in Afghanistan and its official central spokesperson, announced that schools and colleges for women and girls would reopen in March 2022. Meta removed the post, imposed “strikes” against the page administrator, and limited their access to certain Facebook features, because the company determined the post violated the prohibition on praising a designated terrorist group in its Dangerous Individuals and Organizations Community Standard. However, after the Board selected the case for review, Meta determined that this was an enforcement error: the content fell within the Dangerous Individuals and Organizations Community Standard’s policy exception for reporting and, thus, should not have been removed.

Oversight Board Case of Depiction of Zwarte Piet
Decision Date: April 13, 2021
The Oversight Board upheld Facebook’s (now Meta) decision to remove specific content that violated the express prohibition on posting caricatures of Black people in the form of blackface contained in its Hate Speech Community Standard. The case originated after a Facebook user in the Netherlands shared a post including text in Dutch and a 17-second-long video on their timeline featuring two adults portraying Zwarte Piet, a traditional Dutch Christmas character, with their faces painted black and wearing Afro wigs under hats and colorful Renaissance-style clothes. In its decision, the Board considered that while Zwarte Piet represents a cultural tradition shared by many Dutch people without apparent racist intent, the use of blackface was widely recognized as a harmful racial stereotype. A majority of the Board saw sufficient evidence of harm to justify removing the content. They argued that allowing such posts to accumulate on Facebook would help create a discriminatory environment for Black people that would be degrading and harassing. They believed that the impacts of blackface justified Facebook’s policy and that removing the content was consistent with the company’s human rights responsibilities.

Oversight Board Case of Armenians in Azerbaijan
Decision Date: January 28, 2021
The Oversight Board upheld Facebook’s (now Meta) decision to remove a post on Facebook in which a user, in the accompanying text, used the word “taziks” (“wash bowl” in Russian), a play on “azik,” a slur or derogatory term for Azerbaijanis. The user also claimed that Azerbaijanis had no history compared to Armenians. Facebook deleted the post, arguing that it breached the company’s Hate Speech Community Standard. The Board agreed with Facebook, considering that the post, uploaded amidst a recent armed conflict between Armenia and Azerbaijan, was meant to dehumanize its target. The Board likewise considered that Facebook’s removal of the content was a restriction that complied with international human rights standards on freedom of expression, including that the limitation was both necessary and proportionate.

Teaching Freedom of Expression Without Frontiers

This section of the newsletter features teaching materials focused on global freedom of expression which are newly uploaded on Freedom of Expression Without Frontiers.

Journalism and Whistleblowing: An Important Tool to Protect Human Rights, Fight Corruption, and Strengthen Democracy

This brief comes as part of the UNESCO series World Trends in Freedom of Expression and Media Development. It looks at the relationship between journalism and whistleblowers as mutually beneficial and an important tool to protect human rights, fight corruption, and strengthen democracy. The paper provides a survey of legal definitions and protections for whistleblowers in jurisdictions around the world. It concludes with good practices and recommendations for improving the protection of whistleblowing by supporting laws, regulations, technologies, and training.

Post Scriptum

Scholarship on the Oversight Board

The Facebook Oversight Board’s Human Rights Future by Laurence R. Helfer and Molly K. Land in the Duke Law School Public Law & Legal Theory Series argues that many observers, by comparing Meta’s Oversight Board to domestic courts, arrive at incorrect assessments of it. The authors suggest that the Board is better viewed as “a de facto human rights tribunal.” As such, they “examine the human rights origins of the Oversight Board, its strategies for pressuring Meta to improve its content moderation policies, and how it is extending human rights norms to private social media companies.” While describing some of the Board’s achievements in its first two years as impressive, the authors offer recommendations to make it more effective and better able to navigate emerging challenges.

“Inside the Making of Facebook’s Supreme Court,” by Kate Klonick in The New Yorker compares Facebook’s Oversight Board to “a private Supreme Court.” Klonick reports on the eighteen months of the Board’s development, from the initial workshops with experts to the Board’s first rulings. The article also gives a brief history of Facebook’s content moderation decisions and reconstructs how the idea to establish the Board came about. Klonick’s reporting draws on extensive interviews that reveal the controversy surrounding the Board’s structure, role, and decisions.

“How to Judge Facebook’s New Judges. The social media company’s search for consistent rules has been long, winding, and entirely self-defeating,” by Jacob Mchangama in Foreign Policy Magazine reviews the context within which Facebook’s Oversight Board emerged: the controversy over “not removing enough and also removing too much content,” the company’s constantly changing community standards, and its lack of transparency. Mchangama argues the Oversight Board has the potential to make Facebook’s content moderation practices more legitimate and proposes several principles that could effectively guide the Board’s decision-making and strengthen free speech.

But Facebook’s Not a Country: How to Interpret Human Rights Law for Social Media Companies, by Susan Benesch in the Yale Journal on Regulation focuses on how international human rights law could apply to social media content regulation but argues each of its provisions must first receive interpretation tailored to the context of private companies. Benesch sets out “to fill some of the gap” and explains Articles 19 and 20 of the International Covenant on Civil and Political Rights in relation to social media companies.

Applying International Human Rights Law for Use by Facebook, by Michael Lwin in the Yale Journal on Regulation examines the proposals that suggest social media companies endorse international human rights law (IHRL) as a tool to navigate content moderation. Lwin, however, argues “IHRL was written and ratified for use by states, not private companies” and advocates for reinterpretation and readaptation of the law to the new context. The article takes the first step in that direction and presents “a framework for the use of IHRL by social media companies.”

This newsletter is reproduced with the permission of Global Freedom of Expression. For an archive of previous newsletters, see here.