No one has set any clear standard about how badly a politician can break Facebook’s rules before getting kicked off the platform, and yesterday the company’s wannabe court missed a chance to fill the void. In a decision anticipated with the fervor that might attend a high-profile Supreme Court ruling, the Facebook oversight board told the platform that, while it might have been right to ban then-President Donald Trump on January 7 for his role in stoking the Capitol riot and because of the risk of continuing violence, the ongoing “indefinite” nature of the ban is not justified. The board gave Facebook six months to go back to the drawing board and work out what to do with Trump’s account now.
But this is the exact question Facebook had asked the board to settle. The board respectfully declined. In fact, the board’s decision resolved essentially nothing—except that Facebook wasn’t exactly wrong on January 7—and leaves open the possibility that this whole charade will happen again before the year is out.
The oversight board is a weird creature. It has no mandate beyond that which Facebook deigns to give it. Its decisions arguably affect free speech, but not in the legal sense, because they implicate only a single private social-media platform. The board is like a moot court in a state without real courts, or a model United Nations in a world with no United Nations. And yet the oversight board was the only public-facing deliberative body contemplating what Ben Smith of The New York Times described as “one of the most important questions in the world.”
Writing speech rules is hard. Offline justice systems have been struggling with it for centuries. At best, the board can come up with better policies than Facebook CEO Mark Zuckerberg can. At worst, it can still provide a check on Facebook’s power over what you or I—or even the president of the United States—can say in a very important corner of the online world. Letting Zuckerberg alone decide is a terrible way to determine what should be allowed on a communication medium used by 2.8 billion people.
This is where the board comes in. Even if Congress could bridge the cavernous divide between Democrats’ and Republicans’ views on content moderation, the First Amendment would stand in the way of lawmakers coming anywhere near these kinds of decisions. Other countries are not constrained by the U.S. Constitution, but almost all democracies have significant qualms about governments making granular decisions about what people are allowed to say. And even if governments tried to write such rules, the sheer scale and speed at which content moderation happens would make enforcing them practically impossible. And so most of Facebook’s speech rules will always be settled by private, not public, power. In this world, even a moot court in which an essentially random group of people play judges is better than no court at all.
The people whom Facebook tapped for its board are precisely the kind you’d imagine on a supreme council of elders: retired judges, a Nobel Prize winner, a former prime minister of Denmark, a former editor in chief of The Guardian, human-rights advocates, and lots of lawyers. They reportedly get paid six-figure salaries, and their press road show this week suggests that they are enjoying the spotlight.
Members are still trying to make sense of their role. Facebook is typically responsive to pressure from only a small slice of Western media. The board, by contrast, is a global body, and in some decisions it has leapt at the opportunity to criticize Facebook’s failures overseas. Just last week, the board issued a decision about a post in India that was critical of Prime Minister Narendra Modi and his party. It drew attention to Facebook’s mistaken removal of the post and urged Facebook to make changes to stop such mistakes from happening again. Given the current political and public-health crisis in India, protecting political speech criticizing the government is an urgent responsibility. In that case, the board stepped up.
But it was far too modest in the Trump case. Writing the rules is Facebook’s job, the board’s members concluded, and the board will merely call balls and strikes on whether those rules are consistent with Facebook’s own values and the board’s interpretation of international human-rights law. The board made a strategic, perhaps even Solomonic, judgment to uphold Facebook’s initial decision but also not give its stamp of approval to kicking Trump off the platform forever. It dinged Facebook for making a “vague, standardless” decision—but then sent the controversy back to Facebook with a pretty vague and standardless set of instructions.
In doing so, the board passed up an opportunity to lay out specific principles for making decisions that currently are unconstrained. This should be one of the board’s superpowers: Precisely because it is not a government body, it can offer guidance that governments cannot.
The oversight board’s deliberations were convincing enough, apparently, that the conservative commentator Charlie Kirk suggested the U.S. Supreme Court should take up Trump’s case and overturn the board. But to many critics, the Trump decision confirms that the board is just an exercise in kayfabe—the professional-wrestling conceit of people agreeing to take seriously something that is obviously fake. And the media outlets that earnestly cover the charade, the prominent people who wrote submissions to the board about the case, the Trump representatives who submitted a user statement to the board, and the commentators, including me, who waste time analyzing its decisions as if they are more than just blog posts are helping legitimize Facebook’s sham justice system.
Of course Facebook set up the board not out of altruism, but because Zuckerberg thinks it benefits the company. Of course he wants to attract attention to the board. And of course we shouldn’t forget the many ways in which Facebook still needs to expand the board’s remit to make its oversight meaningful. Many of the most important problems with Facebook, such as what kinds of content it chooses to amplify, its ad-targeting practices, and its data collection, are all beyond what the board can review.
But none of this changes the fact that the board exists and is making decisions with real consequences. And while conservative outrage over the board’s decision this week is comical, the idea that Facebook should have clear standards isn’t. No one is suggesting that the board makes other regulatory measures unnecessary. No lawmaker has found or will find such an argument persuasive. By all means, break Facebook up! Fine it! Yell at its executives in the halls of Congress! Still, Facebook will persist, and the board can help constrain it. This is not an either-or proposition.
The oversight board is indeed a PR exercise. And its benefit is precisely that it happens in public, and not inside the black box of Facebook. At this point, democratic institutions aren’t solving the problems that Facebook creates. So the board is the worst option except for all the rest. If only the board’s members really grasped the opportunity they’ve been given.
"do it" - Google News
May 07, 2021 at 01:55AM
https://ift.tt/3emAeDv
Facebook's Made-Up Court Is Better Than No Court at All - The Atlantic
"do it" - Google News
https://ift.tt/2zLpFrJ
https://ift.tt/3feNbO7
Bagikan Berita Ini
0 Response to "Facebook's Made-Up Court Is Better Than No Court at All - The Atlantic"
Post a Comment