It’s Finally Happening. Big Tech Takes on the First Amendment

The U.S. Supreme Court is set to decide how the First Amendment applies to social media platforms.

  • The cases, Moody v. NetChoice and NetChoice v. Paxton, present pivotal legal battles, pitting states against tech giants over the extent of content moderation powers.
  • Should social media platforms be treated as publishers with editorial discretion or neutral conduits obligated to transmit all content impartially?

The U.S. Supreme Court is set to deliberate on how the First Amendment intersects with social media platforms in two cases arising from Texas and Florida laws.

The First Amendment to the United States Constitution is arguably one of the most important parts of the Bill of Rights. It protects five fundamental freedoms:

  1. Freedom of Religion
  2. Freedom of Speech
  3. Freedom of the Press
  4. Freedom of Assembly
  5. Right to Petition the Government

This Amendment explicitly prohibits the government from interfering with these freedoms without justifiable cause. It’s the reason why, in 1971, the New York Times managed to publish the Pentagon Papers, classified documents exposing the U.S. government’s historical involvement in the Vietnam War, without prosecution or persecution. However, while the First Amendment is a powerful safeguard for freedom of expression, it is not a get-out-of-jail-free card.

Today, we stand at another monumental point in history. The Supreme Court is considering two important cases on how the First Amendment can best be applied to social media platforms. Its decision could reshape the landscape of online speech and the role that Big Tech plays in moderating content on its platforms. It could redefine freedom of speech on social media.

The cases, Moody v. NetChoice and NetChoice v. Paxton, are pitting the states of Texas and Florida against tech giants like Meta and Google.

The first case dates back to 2021, when Florida passed SB 7072, a law limiting the power of social media platforms to moderate content. The bill specifically targets the deplatforming of political candidates, a clear jab at the platforms that removed former President Donald Trump after the January 6 U.S. Capitol attack for the part he played in it. NetChoice, a trade association representing tech companies, then sued Florida officials, claiming that the law violated the companies’ own First Amendment rights by restricting their editorial discretion over content moderation. To make matters more interesting, NetChoice also invoked Section 230 of the Communications Decency Act, the federal law that shields platforms from liability for user-generated content.

In the case of NetChoice v. Paxton, that same year, the state of Texas passed House Bill 20 to regulate large social media platforms like Facebook, X (formerly known as Twitter), and YouTube. NetChoice challenged two of its provisions:

  1. Section 7, which prohibits platforms from censoring user posts based on viewpoint, except for unlawful and violence-inciting content.
  2. Section 2, which forces platforms to disclose their content moderation and content promotion practices, to publish an “acceptable use policy,” and to maintain a complaint-and-appeal system for users who disagree with content moderation decisions.

Again, NetChoice claims the law restricts its members’ editorial discretion over content moderation, forcing them to host content they disagree with.

In both cases, injunctions and appeals went back and forth in the lower courts, with the Eleventh Circuit largely blocking Florida’s law while the Fifth Circuit upheld Texas’s, until the Supreme Court agreed to hear both cases jointly.

The situation boils down to the role of social media platforms in facilitating public discourse and the extent to which they should be allowed to moderate user-generated content.

The million-dollar question here becomes: Should the law treat social media platforms as publishers with editorial discretion over content? Or as neutral conduits obligated to transmit all content without interference?

On the one hand, companies should have the right to moderate the content on their platforms. Take Elon Musk’s X, for example. Not too long ago, the text-based platform got into trouble with advertisers over anti-Semitic content, as advertisers refused to have their ads appear next to morally questionable tweets. Whether Elon admits it or not, the company’s revenue took a hit. The free speech that these states advocate for allows such distasteful (I’m trying to be nice here) content, but advertisers won’t put up with it. Damned if you do, damned if you don’t.

On the other hand, social media platforms should not interfere with public discourse to the point of skewing events. Even if you are not a fan of Donald Trump, you have to admit that deplatforming him silenced an individual, which defeats the point of free speech. Interference drives users toward niche apps like Trump’s Truth Social, which only creates an echo chamber. And THAT definitely does nothing good for the public.

The Justices will find themselves stuck between a rock and a hard place. That’s for sure.

The alarming part is the timing.

The 2024 U.S. presidential election is fast approaching (November 5, 2024), and the Supreme Court is expected to rule before then (sometime around June). Its decision could not only transform the internet but also swing the election.

The content on social media platforms, including Facebook, X, and YouTube, holds significant influence over public discourse, especially during election cycles. If these state laws are struck down, the platforms may take a more proactive stance in filtering out harmful or misleading content. If the laws are upheld, however, the opposite may happen, with misleading posts running rampant. While the average person may not have much sway, a political candidate’s posts could shift votes.

As I’ve pointed out, not reining in the amount of moderation a company can conduct on its platform may drive certain demographics away, leading to echo chambers. Such chambers are notorious for accepting only the views and facts that fit their narratives. No one has ever grown intellectually from having their beliefs validated without challenge.

Another point to consider is political advertising and campaigning. If the Supreme Court views these platforms as newspapers of sorts, they will have a greater say in which political ads to run. That, too, could skew the playing field.

This ruling, whether it ends up being in favor of Big Tech or not, will set a precedent for future cases. It will guide how social networking platforms navigate content during election seasons.

This turning point in history will define how we interact with one another online and what these tech companies are supposed to do about it.

Inside Telecom provides you with an extensive list of content covering all aspects of the tech industry. Keep an eye on our Impact section to stay informed and up-to-date with our daily articles.