Facebook’s Oversight Board (OB) was instituted in response to growing concerns over Facebook’s inadequate content moderation standards. The company has been alleged to have played an important role in proliferating human rights violations, as well as hate and misinformation campaigns relating to elections and COVID-19, among other issues. The introduction of the OB – the ‘Facebook Supreme Court’, as it has been dubbed – was met with considerable scepticism, with many arguing that it was an attempt to deflect actual accountability. The Board was established as an independent body with a maximum of 40 members, separate from Facebook’s content review process, with the power to review decisions made by the company and to suggest changes and recommendations. Notably, the OB reviews cases that are of grave concern and have the potential to guide future decisions and policies. Appeals can be brought by the original poster, by the person who previously submitted the content for review, or by Facebook itself referring matters to the Board.
So far, the Board has delivered seven decisions and has accepted other cases, with those decisions to be released soon. The cases referred to the Board have included alleged hate speech in a post addressing the treatment of Uyghurs in China, the removal of a post made by an Indian user in a public forum that was perceived to be a veiled threat, and the removal of a post incorrectly attributed to the Nazis, among others. The Board upheld Facebook’s decision in only one case; in the others, it ordered the reinstatement of the posts, subject to the consent of the users involved.
The Board recently announced that it had accepted a case submitted to it by Facebook regarding the permanent suspension of former US President Donald Trump from Facebook and Instagram. This will arguably be one of the biggest and most significant decisions that it will render. As with any institution of power, defects and faults have been raised with respect to the functioning of the Board. Firstly, the Board can only hear appeals once the appeal procedures within Facebook have been exhausted; further, the Board can only deliberate on content that has been removed, and not on disputed content that is left up on the site. In the case of President Trump, it was Facebook that referred the case, which raises the question of how this would affect users in a similar position who may not have Facebook’s backing when it comes to bringing an appeal before the OB.
Another commonly raised issue concerns the rules that govern the functioning of the Board. Notably, these rules make only passing references to human rights and other aspects of international law. The charter of the OB opens by describing freedom of speech and expression as a fundamental human right, but then reduces it to a mere guiding value. Facebook’s principle of freedom of expression and providing a ‘voice’ has been reiterated as essential in many decisions; however, the fate of other human rights concerns remains murky.
Interestingly, this criticism has seemingly been addressed by the Board: human rights and Facebook’s obligations under the UN Guiding Principles on Business and Human Rights (UNGPs) are cited as significant contributors to the Board’s decision-making in its rulebook for Case Review and Policy Guidance. In the decisions rendered so far, the Board, in setting out the applicable standards, first assessed the case against Facebook’s community standards and then turned to human rights (expressed as values) and obligations under the UNGPs, implying that human rights play a substantial role in the decision-making process.
Thirdly, the OB faces the mammoth task of reviewing the thousands of appeals pending before it, with over 20,000 cases having been referred to the Board. While attempts have been made to make this process as transparent and open as possible, the basis for determining which cases are selected is loosely defined. The selection criteria revolve around choosing cases ‘that raise important issues pertaining to respect for freedom of expression and other human rights and/or the implementation of Facebook’s Community Standards and Values.’ The selection of a case therefore implies that it is critical to public discourse and raises substantial questions regarding Facebook’s policies.
Given the OB’s limited period of operation, it would only be fair to give it the benefit of the doubt. The few decisions delivered so far contain some interesting observations.
In a case concerning the adult nudity and sexual activity standard, the OB, in overturning the removal of a post related to breast cancer awareness, directly challenged the AI system responsible for removing the post. It noted that the post was allowed under a policy exception for breast cancer awareness and signalled that Facebook’s automated moderation raised serious human rights concerns. According to the Board, “automated content moderation without necessary safeguards is not a proportionate way for Facebook to address violating forms of adult nudity”. It further noted that “as Facebook’s rules treat male and female nipples differently, using inaccurate automation to enforce these rules disproportionately affects women’s freedom of expression. Enforcement which relies solely on automation without adequate human oversight also interferes with freedom of expression.”
The OB is also expanding on and clarifying the extent of its ‘jurisdiction’. In the same case, Facebook admitted that it had made an error and argued that the Board should decline to hear the case since no disagreement existed between the user and Facebook. The OB rejected this argument, observing that the existence of a disagreement is assessed at the point the user exhausts Facebook’s internal appeal process. Since the user and Facebook disagreed at that stage, and the appellate process had been exhausted, the OB had the requisite grounds to hear the case. The OB further noted that restoration of content was not the only remedy it could offer: the Board has the power to demand an explanation of why posts were taken down and to provide users with that explanation. The OB also observed that the standards applicable at Facebook – particularly the private internal rules on content moderation – were vague, and that where posts are removed by AI moderators, users must be informed of this and be able to appeal the decision to a human reviewer.
As noted before, the Board’s mandate and the size of its task merit patience from the stakeholders involved. The OB is independent, and Facebook must implement its binding decisions unless they contravene the law. These decisions also have precedential value, with the potential to shape the standards applied by both human and AI moderators. This is perhaps the closest attempt yet to actually regulate and supervise the lax and inconsistent standards applicable at Facebook.
What remains to be seen is whether Facebook will demonstrate genuine compliance with the Board’s decisions. While the decisions of the OB are binding, its recommendations are not. Choosing not to comply with these recommendations would prove antithetical to one of the main motivations behind establishing the Board, which is centred on regaining public trust and confidence. The OB has issued a host of recommendations in the course of delivering its decisions, and Facebook recently responded to some of them. Facebook has made, and will be making, changes in response to 11 of the recommendations, including updating Instagram’s community guidelines on adult nudity, establishing a Transparency Center, and consolidating its health misinformation policies. In relation to the last recommendation, Facebook’s stance seemingly contradicts the Board’s observations: Facebook stated that it would not be relaxing or softening its rules on COVID-19 misinformation, whereas the OB observed that where posts contradict the advice of health authorities and the potential for physical harm is identified but not imminent, Facebook should adopt ‘less intrusive measures.’
The institution of the OB has also pushed other big tech companies to look for better ways to handle their content moderation issues, with reports indicating that they might be inclined either to join the Board or to establish a structure along similar lines. The OB is currently in an experimental stage and is forced to work under constrained circumstances. The call for greater public participation and transparency heralded by the OB does have the potential to ensure that Facebook can, to some extent, be held accountable for its public and private policy choices and decisions. For the OB to function in accordance with the purpose of its establishment, its decisions and recommendations must be respected by Facebook in toto, ensuring that the ‘precedential’ value of its decisions is appreciated and interpreted appropriately.
This article is written by Lian from r-TLP as part of an ongoing collaboration between Tech Law Forum blog, NALSAR and r-TLP.
The Tech Law Forum is a student-run initiative at NALSAR University of Law. It aims to provide an open platform for opinions, comments, and responses from students – a platform for discussions and debates on all issues relating to Technology Law, with a specific but not exclusive focus on India.