Facebook ‘Oversight’ Board To Make Content Rulings
Board will make final decisions over objectionable content, but will not be able to change company policy
Facebook has announced an “independent oversight board” to make final decisions about content on the social platform.
The firm has been in the spotlight in recent years after the platform was used to post questionable content from certain groups. Governments around the world continue to pressure social networking firms to crack down on hate speech and other extremist content.
The Facebook Oversight Board is designed to take content decision-making out of Facebook’s remit. The board is essentially intended as a type of appeals body, through which users can challenge company decisions on controversial content.
Oversight Board
Facebook announced the new board and its governance structure, and said it will start off with 11 members – expected to be announced by the end of the year.
The board should start hearing cases in early 2020 and will eventually be made up of 40 members, who will serve three-year terms.
“The content policies we write and the decisions we make every day matter to people,” said Nick Clegg, VP Global Affairs and Communications at Facebook and former deputy prime minister of the UK. “That’s why we always have to strive to keep getting better. The Oversight Board will make Facebook more accountable and improve our decision-making. This charter is a critical step towards what we hope will become a model for our industry.”
And while the oversight board will make final decisions about content, it will not be able to change policy at Facebook.
“The board will make decisions on content and, if needed, provide policy guidance,” said Facebook.
However, Facebook will be required to respond publicly to any recommendations the board makes.
Both Facebook and its users will be able to submit cases. Decisions will be made by panels of five members and then forwarded to the rest of the board.
Community standards
“Facebook is built to give people a voice,” wrote CEO Mark Zuckerberg. “We also recognize that there are times when people use their voice to endanger others. That’s why we have Community Standards to articulate what is and isn’t allowed on our platforms.”
“We are responsible for enforcing our policies every day and we make millions of content decisions every week,” said Zuckerberg. “But ultimately I don’t believe private companies like ours should be making so many important decisions about speech on our own. That’s why I’ve called for governments to set clearer standards around harmful content. It’s also why we’re now giving people a way to appeal our content decisions by establishing the independent Oversight Board.”
“If someone disagrees with a decision we’ve made, they can appeal to us first, and soon they will be able to further appeal to this independent board,” he added. “The board’s decision will be binding, even if I or anyone at Facebook disagrees with it.”
Earlier this month Facebook announced it was opening up its facial recognition technology to all users, with an option to opt out for privacy reasons.