Facebook Reveals Details About Content Oversight Board
Amid persistent criticism of its content moderation practices, Facebook (NASDAQ:FB) has now published a charter detailing the constitution of an Oversight Board that will act as the ultimate authority over the company’s content removal decisions. The company has also pledged to make this body operational by November 2020. Nick Clegg, Facebook’s VP of Global Affairs and Communications, released the following statement: “The content policies we write and the decisions we make every day matter to people. That’s why we always have to strive to keep getting better. The Oversight Board will make Facebook more accountable and improve our decision-making. This charter is a critical step towards what we hope will become a model for our industry.”
The charter specifies that the board will initially have 11 members, with that number eventually rising to 40. Each member will serve a 3-year term on a part-time basis. According to Facebook, board members will be independent, impartial, skilled in thoughtful deliberation and efficient decision-making, and well-versed in digital content and governance. Moreover, the company plans to establish a trust that will determine and disburse member compensation and play an important collaborative role in the initial appointment and subsequent term renewal of board members.
The charter also lays out the operational details of the Oversight Board. Verified Facebook users will be able to submit cases for consideration through a dedicated portal. Facebook itself may also refer cases to the board, provided they are deemed ‘significant’ and ‘difficult’ – in other words, cases with real-life implications that are sufficiently ambiguous to lack a clear-cut determination. A selection committee – comprising 5 members drawn from the board itself – will then pick relevant cases from the submitted pool for review. Each selected case will be assigned to a panel of 5 board members, with at least one member hailing from the region where the case originated. This panel will have the power to request any pertinent information Facebook holds regarding a particular case and to interview the aggrieved party as well as any experts whose technical assistance may be required. The panel’s decision is then circulated to the entire board, which can convene another panel should a majority dispute the verdict. The final verdict must be published within 2 weeks of the case’s selection and will be binding on Facebook. The verdict may also include non-binding policy recommendations.
Facebook Oversight Board - What Does it Mean?
Although this charter constitutes a step in the right direction, it still allows Facebook substantial wiggle room when it comes to implementing decisions. Crucially, the company is only obligated to implement a decision vis-à-vis the particular case referred to the Oversight Board and may decline to formulate a blanket policy based on that decision if doing so is operationally infeasible. This, in turn, limits the broader implications and financial costs associated with any determination by the board.
In recent years, Facebook has faced growing criticism over the opacity of the ‘community standards’ that guide its content moderation activities and have fueled several controversies. The most recent occurred in August 2019, when Lila Rose, founder of an anti-abortion group, posted a video on her Facebook page asserting that abortion is never a medical necessity. Facebook referred this assertion to Health Feedback, a fact-checking website, which disputed the claim’s accuracy. Consequently, five Republican Senators accused Facebook of employing biased fact-checking tactics. In response, the tech giant removed Health Feedback’s assessment while allowing the original video to remain on Lila Rose’s page. The incident only served to highlight the discretionary nature of Facebook’s content moderation policies.
In an era where free speech is no longer a black-and-white matter, the Oversight Board could shield Facebook against criticism stemming from allegations of censorship or political bias. It might also strengthen the company’s position vis-à-vis the FTC and other regulatory agencies around the globe by demonstrating a practical commitment to safeguarding the interests of its users over purely mercantile concerns. Whether this is a shrewd gambit or a genuine manifestation of Facebook’s desire for greater transparency can only be determined once the Oversight Board becomes functional and reveals the degree to which it can actually influence the company’s content moderation activities.
From a big tech regulation perspective, moves like this signal that Facebook is serious about getting a grip on its moderation policies – both to enforce fair standards and to bat away suggestions that it should be broken up.