Facebook is one of the most popular online platforms in the world, keeping about a third of the world's population connected and in touch with one another.
An assortment of problems comes attached to this: hate speech and harassment, privacy issues, misinformation campaigns, and the indirect consequences of all those issues on the people involved.
Of course, Facebook aims to steer itself forward by presenting a special board that will oversee and deliberate on various issues pertaining to the company's moderation and actions moving forward. This board is supposed to be independent of Facebook, and its members cannot be removed by the company.
To its credit, the oversight board is filled with some brilliant people, who are likely very good at what they do at the scale at which they do it. (READ: Who’s who on Facebook panel for content decisions)
Assuming they really want to help Facebook and the people using the social media platform, I believe they will be humbled by the sheer difficulty of the task they have taken upon themselves.
In an op-ed piece in The New York Times, the oversight board’s lead members said they “will focus on the most challenging content issues for Facebook, including in areas such as hate speech, harassment, and protecting people’s safety and privacy. It will make final and binding decisions on whether specific content should be allowed or removed from Facebook and Instagram (which Facebook owns).”
By itself, the phrasing is pretty decent. They want to work to fix Facebook’s deficiencies when it comes to the “most challenging content issues for Facebook.”
But the most challenging content issues for Facebook are not the most challenging content issues for everyone, and that has to be made clear.
In an op-ed piece in Wired, media scholar Siva Vaidhyanathan said it better than I ever could: “Only in the narrowest and most trivial of ways does this board have any such power. The new Facebook review board will have no influence over anything that really matters in the world.”
What really matters in the world?
For one thing, it doesn’t appear as if the oversight board will be making a code of conduct for Facebook to follow, but will instead be reviewing individual cases. A small group of people handling a small group of cases related to something used by a third of the world? That sounds problematic.
I do not expect them to get very far unless they have some way of pressuring Facebook to follow a code of conduct or even just some kind of contingency planning for “really bad things” happening.
As such, the board will – slowly, as they are a board that might need to deliberate – go through the least pressing, yet most annoying problems of Facebook.
What is a more pressing, yet not-as-annoying problem of Facebook? People dying because of actions caused by disinformation and hate campaigns in countries where Facebook operates but has no board member who can do anything about it because they have no knowledge of the situation there.
It’s not as annoying because the person who could contest the misinformation, such as in the 2018 Sri Lanka riots – which Facebook has apologized for – or the situation against the Rohingya in Myanmar, may already be dead or otherwise not using Facebook because they’re running for their life. (READ: Unliked: How Facebook is playing a part in the Rohingya genocide)
Guilt over inaction?
Another pressing problem is the erosion of democracy around the world due to coordinated campaigns against countries, journalists, human rights activists, and the like.
Unless the board has a mandate – and likely a vision that will enable them to see how their actions will ripple through multiple societies – I don’t think they’ll have the gumption to do anything, much less feel guilt over inaction.
In a webinar held on May 18 and detailed by Broadband Breakfast, Michael McConnell, professor at Stanford Law School and a member of the oversight board, said, “We are not frontline internet cops.”
He explained the purpose of the oversight board was to give “a deliberative second look at the process” of removing allegedly dangerous content from the most popular social media platform in the world.
More to the point for the average user of Facebook, will this oversight board keep the social network from delving into my data to make a buck? I don’t think so.
Facebook’s new board, acting as advisers, will also comment on a wide range of issues, to which Facebook is required to respond publicly. But because they’re paid by Facebook for this advisory role, it seems they won’t step on quite as many toes as is warranted.
“We have nothing to say about Facebook’s business,” said John Samples, vice president of the Cato Institute think tank, and a member of the Facebook oversight board as well, during that same webinar.
I honestly want the oversight board to succeed in steering Facebook properly towards stopping riots and the deaths of people, at the very least… but I don’t think it’ll work the way I would want it to.
I wonder if they’ll even allow themselves to feel the gravity of the task, or take responsibility for Facebook’s failures.
That’s partly a sad commentary on the notion of an oversight board with less leverage than I’d like – one that may mostly end up feeling like bad public relations for Facebook and Mark Zuckerberg.
Facebook’s problem, I think, will not be solved by a Facebook solution. – Rappler.com