The Oversight Board, an independent body set up by Facebook, on Wednesday announced the names of its 20 members and asserted that the organisation was committed to operating with transparency and fairness in its decisions related to content on Facebook and Instagram.
Facebook, which has faced flak in many parts of the world over issues including data breaches, announced plans in 2018 to create an independent oversight board to handle content moderation transparently.
The Oversight Board, which includes former judges, journalists and human rights activists, will review appeals from users on material that has been taken down from Facebook and Instagram, and make binding content decisions for the social networking platforms.
Members will make decisions based on the Board's principles and the impact on users and society, and Facebook will implement the Board's decisions unless doing so could violate the law.
The members come from diverse backgrounds with expertise in areas like digital rights, religious freedom, conflicts between rights, content moderation, digital copyright, internet censorship, platform transparency and civil rights.
Sudhir Krishnaswamy, vice-chancellor of the National Law School of India University, who co-founded an advocacy organisation that works to advance constitutional values for everyone, including LGBTQ+ and transgender persons in India, is among the chosen members.
"Over the last year, Facebook has helped set the foundations for the Board to operate as its own independent entity by drafting a charter and a set of bylaws supporting the administration and establishing key infrastructure and setting up and providing irrevocable funding for an independent trust," Facebook Director of Governance and Strategic Initiatives Brent Harris said in a conference call.
He added that the company will continue to work with the Board until up to 40 members have been selected, after which the Board will take sole responsibility for selecting members in the future.
"We know our work is not done, and we share the Board's ambition for its role to grow over time...I want to re-iterate the foundational commitment we are making to the board, Facebook will implement the board's decisions, unless doing so violates the law, and will respond constructively, quickly and transparently to policy guidance, put forth by the Board," he emphasised.
The Board will begin considering cases later this year, including appeals from Facebook and Instagram users and cases referred to it by Facebook for review.
When the Board begins hearing cases later this year, users will be able to appeal to it in cases where Facebook has removed their content; over the following months, the Board will also begin reviewing appeals from users who want Facebook to remove content.
Users who do not agree with the result of a content appeal to Facebook can refer their case to the Board by following guidelines that will accompany Facebook's response. The Board will then inform the user whether their case will be reviewed.
The Board can also review content referred to it by Facebook, spanning many significant types of decisions, including content on Facebook or Instagram, advertising, and Groups. The Board will also be able to make policy recommendations to Facebook based on case decisions. Its membership will eventually be expanded to about 40 people.
Helle Thorning-Schmidt, a co-chair of the Board and former Prime Minister of Denmark, said the Board will take final and binding decisions on what content stays up, and what content is removed on Facebook and Instagram.
"We are basically building a new model for platform governance...I think we all know that one of the fundamental challenges that face us in the digital age is defending the freedom of expression and human rights, and I look forward to embracing that challenge and serve the online community with transparency and with fairness," she added.
Co-Chair Jamal Greene, a Columbia Law professor, noted that one of the major challenges for any moderation team is developing trust and a sense of legitimacy.
"If decisions are not transparent and if people don't trust that decisions are not being motivated by financial interest or political interest or reputational interest...We are independent of Facebook, we are independent of other social media companies, we contract directly with the Oversight Board, and we can't be removed by Facebook.
"In fact, some of us have been publicly critical of how the company has handled content issues in the past. So all told, we're in a position to make decisions, free from influence and interference," he asserted.
Michael McConnell, co-chair and a former US federal circuit judge, said the Oversight Board will bring a "higher degree of safety and political neutrality" to the Facebook platform, on which many people globally depend as a principal means of communication.
He said the Board will have to select the cases it works on from hundreds of millions of postings that may be involved.
"What we intend to do is to focus first on cases that affect large numbers of users...on cases that have or look like they may have a major effect on public discourse. And then finally, on cases that raised significant policy questions across the platform so that we're not just going to be deciding individual cases, but looking at cases that will guide Facebook, and its decision on a wide range of matters," he explained.
Catalina Botero-Marino, co-chair and Dean of the Universidad de los Andes Faculty of Law, said all decisions of the Board will be made public, and Facebook will have to respond to them.
"We will publish our decisions on our website while protecting the identity and privacy of those involved. We will also issue a public annual report to evaluate the progress of our work as a board, and whether we believe, Facebook is living up to its commitments," she said.