Scholars deliberate the value of Facebook’s ‘Supreme Court’

By Michael R. Malone

University of Miami experts in law and communications weigh the potential for Facebook’s oversight body to satisfy critics and ease tensions in the digital war playing out on its platform.

In late January, Facebook’s Oversight Board, the independent body created to rule on emblematic hate speech, misinformation, and violent content that circulates on its platform, issued its first set of decisions—reversing four of five cases in which the company had removed content that it alleged violated its policies. 

The same 20 judges are soon set to decide the high-stakes case of whether to uphold Facebook’s ban on former President Donald Trump’s account.

Sam Terilli, chair of the Department of Journalism and Media Management in the University of Miami School of Communication, who practiced law for more than 30 years, and John Newman, associate professor in the School of Law, both expressed tepid support at best for the new initiative. They doubted the board’s authority to address the core issues generating the tension and its ability to manage the avalanche of controversial content produced continuously by Facebook’s billions of users worldwide—80 percent of them outside the United States.

“Clearly Facebook has gone to a great deal of trouble—creating an independent endowment for funding [the board], selecting very interesting people from a wide cross section, and even giving the board clear authority to make decisions on its takedowns,” Terilli remarked. 

“This provides one avenue for people who are upset when their posts are taken down, but that’s just half the problem and, at best, the board will be able to examine only a tiny fraction of these,” Terilli added. “My bigger concern is that it does nothing for the content—the hate speech, defamation, horrible invasions of privacy, and incitement of violence—that remains up.” 

More than two years in development, the initiative was suggested to Facebook by Harvard Law professor Noah Feldman, who continues as a chief adviser, according to media reports. Facebook consulted more than 2,000 experts from 88 countries and held six input sessions around the globe. Heather Moore, Facebook’s tech governance leader, led the effort to write the charter, and Mark Zuckerberg, CEO, has already expressed his willingness to abide by the court’s decisions. 

Newman, like Terilli, noted that while news councils and ombudsmen at newspapers have served as independent arbiters of content controversies—with varying degrees of success—there is no real precedent in the media and communications field for an undertaking of this kind.

“There’s not anything parallel where a company like Facebook is trying to exert this degree of content moderation while also trying to retain its privileged status under Section 230,” he said, referencing the provision of the Communications Decency Act that grants service providers immunity for content posted on their platforms.

Newman identified four concerns that could undermine the board’s effectiveness: judge selection, case selection, judicial bias, and the court’s subject matter jurisdiction. 

“For the judges’ selection process to be fair and balanced, you’d almost need a double-blind system, and even for Facebook to cede control to some other entity to make the selection,” he remarked, regarding the effort to assemble an ideologically diverse court.

Meeting the intent to hear emblematic cases also posed a problem, he said. “Just like our U.S. Supreme Court, they’re not going to hear anywhere near all the disputes that are out there, so there’s going to be some editorial decision-making that happens for the caseload to get cut down and funneled to get a cross-section of cases—I’m not sure what a good process even looks like,” Newman said.   

In terms of the risk of judicial bias, he pointed out that while the judges—eventually 40 in all—are expected to be high-profile experts with separate sources of income, Facebook will ultimately be providing their compensation.

“There’s always a risk then that you develop warm, fuzzy feelings about the entity that’s putting food on your table,” he said. “How do you guard against that? You would need for this to be externally funded, and who’s going to pay for that?” 

Yet most concerning for Newman, whose core expertise is in antitrust regulation and competition—or the absence thereof—is Facebook’s business model, over which the court clearly has no jurisdiction.

He described the company’s business model in terms of a normal commodity market—one trading wheat, steel, or some other good—except that here the commodity is attention.

“Facebook’s role is to harvest attention and distribute it to the consumers who really want it, like any other distributor,” he explained. “In this case, the consumers of attention are advertisers; they’re the ones who want our eyeballs.

“What that leads to as an incentive structure for Facebook is: ‘I want to harvest as much attention as possible, I want to pay my producers the lowest price possible for their attention, and then I want to sell as much as I can to advertisers.’ ” 

On a platform designed for people to trade as much of their attention as possible, Newman said the big issue is one of “attention over-extraction.”

“This is the big one,” he said. “The [business] incentive is not just to design a great product—that’s there, too—but to design it to addict people. 

“The problem has to do with overconsumption and over-extraction of eyeballs,” Newman added, recognizing that even increased competition might not fix the problem. Still, he pointed to the Federal Trade Commission (FTC), the government body charged with policing unfair competition among its other responsibilities, as a possible arbiter.

“If you’re out there designing an addictive platform, that could be viewed as unfair,” Newman said. “And while the FTC hasn’t done a lot with its policing power over the past five decades,” he continued, “as a real, external power it could potentially solve that problem.”

He noted that the genesis for the venture was to guard against removing too much content. 

“If Facebook’s incentive is to err on the side of taking stuff down—and you could see that happening because of political pressure—then the Oversight Board’s sole function will be to get stuff back up, and that would be in keeping with their original purpose,” Newman pointed out.

For the initiative to gain traction, Terilli urged Facebook to be more transparent and to better define and communicate the board’s authority. 

“Facebook needs to step forward and start adopting clearer, more direct policies on both sides, regarding the content that’s left up and what’s taken down,” he said. “You have to develop this body of precedent so that people know how to behave, and that would apply to Facebook as well as to users,” added Terilli, who has served as media counsel for a host of clients during his professional career. 

“I’m just not convinced that Facebook has gone far enough yet,” he said, while suggesting that the manner in which the courts operate, including the establishment of precedents based on decisions, as well as the resolution of some disputes through arbitration and mediation, offers a model for the Oversight Board to follow. 

“Providing an alternative mechanism, such as alternative dispute resolution—resolving disputes without litigation, in a private process as opposed to a public, official one—has been around forever and can be quite effective,” Terilli noted. “Of course, not every case goes to court, civil or criminal. Instead, you rely on precedent for guidance to establish the acceptable forms of behavior, whether you’re a government enforcement agency or a private person.”

“Judges and appellate courts write reasoned opinions—citing precedent and dealing with the evidence and the rules of law—that then legitimize the decision,” he indicated.

“The party who has lost often walks away disgruntled. But society at large can at least maintain a high degree of confidence in the impartiality and consistency of the judiciary, even though judges are human beings, too, and are affected by the same things as the rest of us,” Terilli pointed out.

Does the initiative satisfy anyone at this point?

“Probably free speech aficionados will be slightly mollified by this,” Newman said, adding “but I suspect that it won’t make the majority of the people who are currently upset happy, and there is a potential for people to get upset at the board—which from Facebook’s point of view might be a good way to deflect criticism away from the company.” 

“It’s a first step, but if it’s the only step, then I think it’s a huge mistake,” Terilli said. “Facebook needs to establish a transparent mechanism that really functions. The alternative is something that the tech world doesn’t want, which is to repeal or drastically cut back the immunity under Section 230,” he added. “And guess what? If people are posting content on Facebook that really is obviously false and defamatory and Facebook has liability for it, then they’ll figure out a clear process to get it taken down.”