Online content moderation: 'An imperfect science' | WPDE

Seen on the screen of a device in Sausalito, Calif., Facebook CEO Mark Zuckerberg announces their new name, Meta, during a virtual event on Thursday, Oct. 28, 2021. (AP Photo/Eric Risberg, File)

HUNT VALLEY, Md. (TND) — Whether it’s misinformation, shadow banning or removing posts, members of both major political parties have their eyes on social media platforms and how they moderate their content.

Members of Congress have expressed a desire to crack down on Big Tech for a slew of reasons, with primary efforts focused on antitrust law and transparency about some sites’ harmful effects on teens. But a deadlocked Congress doesn’t allow much legislation to move through the pipeline. So many states are taking the initiative to pass laws regulating Big Tech themselves.

But one recent effort, regulating online content moderation, is stuck between a rock and a hard place: Section 230 of the Communications Decency Act and the First Amendment.

Republican lawmakers in states like Florida and Texas have been criticizing sites like Twitter and Facebook for removing certain posts and banning users. The dialogue ramped up after companies cracked down on what they deemed “misinformation” about the 2020 election, and after Twitter banned former President Donald Trump following the Jan. 6 attack on the U.S. Capitol.

Florida and Texas both passed bills last year dealing with content moderation, which were soon blocked by federal courts. Florida’s bill levied fines and penalized social media platforms for blocking or inhibiting content from political candidates and media outlets. Texas’ bill stated, “a social media platform may not censor a user, a user’s expression or a user’s ability to receive the expression of another person based on the viewpoint of the user or another person.”

“What we’ve been seeing across the U.S. is an effort to silence, intimidate and wipe out dissenting voices by the leftist media and big corporations. Florida is taking back the virtual public square as a place where information and ideas can flow freely,” Florida Lt. Gov. Jeanette Nunez said of Senate Bill 7072.

Texas Gov. Greg Abbott said of House Bill 20:

“We will always defend the freedom of speech in Texas. Social media websites have become our modern-day public square. They are a place for healthy public debate where information should be able to flow freely — but there is a dangerous movement by social media companies to silence conservative viewpoints and ideas.”

Free speech advocacy groups NetChoice and the Computer and Communications Industry Association sued both states over the bills. The groups’ lawsuits argued that Florida’s law violates Section 230 and that Texas’ law violates the First Amendment. Judges blocked both bills, and both states are now appealing the decisions.

Bills have also been introduced in Ohio, Alabama and Tennessee that would prohibit companies from removing users’ legal speech, while other states want to prohibit algorithmic curation or impose transparency requirements.

David Greene is the civil liberties director and a senior staff attorney at the Electronic Frontier Foundation, which also filed amicus briefs in both Florida’s and Texas’ cases arguing the laws are unconstitutional.

“What we’ve seen so far is courts, really, across the country with judges of all political stripes, have rejected a legal requirement that an online service has to publish somebody’s speech,” Greene said. “Actually, this issue has been litigated in courts all over the country in a bunch of different contexts. They’ve all lost. Every court, without exception, that has considered these issues has said that these companies have a First Amendment right. They can make whatever decision they want. They can’t be compelled to publish anybody’s speech or to present it in a [certain] way or to prioritize it or anything like that.”

The law here is fairly simple. Social media companies, like any other businesses, have their own First Amendment rights. They can say what they want to say, carry the viewpoints they want to carry, kick people off their platforms and make editorial decisions without explaining why. Section 230 shields these companies from liability when they do or do not remove content, protecting them from legal challenges except where they violate federal criminal law, copyright law, or federal or state sex-trafficking law.

Lawmakers on both sides say sites like Twitter and Facebook are “public squares” because they use the platforms to communicate with their constituents. But advocates say the law requires Americans to look at what a “public square” is a little differently.

“Regardless of what people say, that Twitter or Facebook or YouTube is a public square, they really are not,” said Kevin Goldberg, the First Amendment specialist at the Freedom Forum Institute. “They are part of a public square. The internet as a whole is better analogized to the public square. [Social media companies] are businesses within the public square. We want to keep the government out of that marketplace to the greatest degree possible.”

Another approach is common carriage regulation, a means of forcing companies to host more speech by treating all legal content equally, without discrimination. Common carriers hold themselves out to the public to carry goods or people for a fee, like railroads, telecommunications providers and airlines.

Justice Clarence Thomas wrote about this approach in Biden v. Knight First Amendment Institute at Columbia University (2021), saying, “In many ways, digital platforms that hold themselves out to the public resemble traditional common carriers. Though digital instead of physical, they are at bottom communications networks, and they ‘carry’ information from one user to another. A traditional telephone company laid physical wires to create a network connecting people. Digital platforms lay information infrastructure that can be controlled in much the same way.”

But common carriers hold themselves out as neutral conduits, while social media companies tell users from the start that their content moderation rules do not treat all content equally. They promote content users like and remove content users don’t, both to compete with one another and to attract digital advertisers. As Matthew Feeney, director of the Cato Institute’s Project on Emerging Technologies, points out, treating social media companies as common carriers would likely mean users see violent or racist images and spam handled just like any other content, because it would have to be treated that way.

“I don’t think it’s an apt comparison,” Goldberg said. “There’s a vast difference between a social media company that deals in speech and viewpoints and a common carrier like a phone company that simply carries all speech and viewpoints that comes across its wire.”

A former Facebook policy executive published a guide Thursday discussing the possibility of states regulating online content. Matt Perault, former head of global policy development at Facebook’s parent company Meta and now the director of the University of North Carolina’s Center on Technology Policy, writes in his guide: “On the right, legislators have introduced dozens of bills addressing what they see as problematic online censorship. On the left, legislators have introduced a series of bills addressing what they see as harmful online content. Yet, state legislation from both Democrats and Republicans faces significant legal and practical challenges, limiting the efficacy of state government reform efforts to date. Rather than focus on the problems with existing approaches, this brief offers an affirmative agenda. While the left and the right disagree about the specific problems that need to be addressed, both want to improve the health of our communication systems.”

The guide outlines the challenges to regulating online content and provides a “set of potential interventions state governments can adopt that are politically and legally feasible, that are unlikely to have a significant negative impact on product quality and that will make at least incremental improvements to online content and content moderation.”

These interventions include studying the impacts of content moderation; facilitating data sharing between platforms and researchers; prosecuting platforms for egregious and systematic violations of state consumer protection laws; or expanding criminal laws related to false election speech. The guide also recommends increasing funding for other communications departments and local newsrooms in an effort to “support informational health.” In other words: instead of moderating the content of private businesses that see floods of misinformation every day, support the outlets that try to produce high-quality information.

The EFF, where Greene works, endorsed the Santa Clara Principles, which describe “how best to obtain meaningful transparency and accountability around internet platforms’ increasingly aggressive moderation of user-generated content.”

The principles say companies should respect human rights and due process, publish clear and precise rules and policies, and understand the context of the posts being moderated, among other things. Greene said they also outline requirements to notify users when their content has been moderated, along with avenues for appeal.

“I think users who have their stuff removed or down-ranked certainly have a right to be angry about that, upset or bewildered or question it. I completely understand them wanting to do that,” Greene said. “It’s unlikely to be a legal violation, but that doesn’t mean it’s not something that should be taken seriously. It’s the best practice to have very clear editorial guidelines and then to apply them consistently. It’s impossible to do it perfectly, and probably nearly impossible to do it well.”

Goldberg pointed out that mandated transparency could prove overly burdensome for the companies, and he doubts that ensuring even-handedness in a marketplace of ideas is a compelling government interest.

“I think that we have the Truth Socials and the Parlers and the Rumbles of the world alongside the YouTubes and the Twitters and the Facebooks of the world, because as a matter of policy, I want to see more options. I realize that it becomes difficult because then you have people talking in an echo chamber on Twitter and people talking in an echo chamber on Truth Social, and therefore we don’t get the interaction we’d like between viewpoints, but it’s better than not having the potential for interaction at all,” Goldberg said.

In the end, the internet has such a massive volume of content that in practice, content moderation is difficult to pull off on such a grand scale. Goldberg described it as something that’s “always going to be an imperfect science.”

“It either takes an army of people reviewing every decision and trying to ensure that those decisions are applied even-handedly, regardless of politics, regardless of viewpoints, with strict adherence to the letter of your rules or the law. OK, that’s hard. That’s expensive,” he said. “Or, you go the other way. You automate the process and you lose something perhaps even more important, which is context. You have an army of bots that don’t understand the human language and nuance that probably end up in a worse place.”

Neither First Amendment advocate sees the appeals succeeding for Texas or Florida, even though both Courts of Appeals handling the cases have conservative majorities. If the two courts issue opposing rulings, they say, the question would likely head to the Supreme Court. But Greene and Goldberg said the inevitable outcome is a ruling that government involvement in content moderation is unconstitutional.

“Once we give the government the role in setting editorial policy, then we’re only getting what the government wants and not what a variety of users want. And I think what we’ve seen, especially with the federal government over the past six years, is that can swing wildly. If the government had the power to tell social media companies what they must and could not publish and exercise that power, we’d have a much different internet between 2016 and 2020 than we would have between 2021 and the present,” Greene said. “That’s exactly why the First Amendment limits the role of government. It’s really that it’s not an appropriate role for government to control these channels of communications in this way.”

Goldberg said increasing polarization has led many Americans to forget that all viewpoints are protected by the First Amendment, and that people are protected when they use their speech to try to change outcomes.


“I just hope that people kind of stop for a second and think about what they might be losing by pressing so hard to restrict somebody else from speaking, and particularly, what we all would lose if they really want the government to step into this discussion.”
