Two Supreme Court Cases This Week Could Alter The Whole Internet

The Supreme Court is set to hear back-to-back oral arguments next week in two cases that could dramatically reshape online speech and content moderation.

The arguments, scheduled for Tuesday and Wednesday, will help determine whether tech companies and social media platforms can be held legally liable for recommending content to their users, or for supporting acts of terrorism by hosting terrorist content. They also mark the Court's first review of a controversial federal law that shields websites from lawsuits over user-generated content.

The closely watched cases, Gonzalez v. Google and Twitter v. Taamneh, carry significant stakes for the entire internet. Expanding the legal liability of websites and apps for hosting or promoting content could lead to major changes at sites such as Facebook, Wikipedia, and YouTube, among others.

The cases have fueled fierce debate in recent years, with tech companies warning of the potential impact on the future of the internet. US lawmakers, civil society organizations, and more than a dozen state governments have weighed in by filing briefs with the Court.


At the center of the legal fight is Section 230 of the Communications Decency Act, a decades-old federal law that courts have repeatedly ruled provides broad protections for technology platforms, but which is now under scrutiny amid rising concerns about Big Tech’s content moderation decisions.

The law has drawn criticism from both sides of the political spectrum. Many Republican officials claim that Section 230 gives social media platforms a license to suppress conservative viewpoints. Prominent Democrats, including President Joe Biden, argue that Section 230 shields tech giants from accountability for the spread of misinformation and hate speech.


In recent years, some members of Congress have pushed for changes to Section 230 that would expose tech platforms to greater liability, along with proposals to update US antitrust rules and other bills aimed at reining in the largest technology companies. Those efforts have largely stalled, however, leaving the Supreme Court as the likeliest source of any near-term change to how the United States regulates digital services.

Gonzalez v. Google

The case involving Google centers on whether the company can be sued over its YouTube subsidiary’s algorithmic promotion of terrorist videos on its platform.

The plaintiffs in the case – the family of Nohemi Gonzalez, who was killed in the 2015 ISIS attacks in Paris – allege that YouTube’s targeted recommendations violated US anti-terrorism law by helping to radicalize viewers and promote ISIS’s worldview.

The claims seek to carve out content recommendations so that they do not receive protection under Section 230, which could expose tech companies to greater liability for how they operate their services.

Google and other tech companies have argued that this reading of Section 230 would increase the legal risks of ranking, sorting, and curating online content, a fundamental feature of the modern internet. Google has said that in such a scenario, websites would try to play it safe by either removing far more content than necessary or abandoning content moderation altogether, allowing more harmful material onto their sites.


Friend-of-the-court filings from Craigslist, Microsoft, Yelp, and others have suggested the risks are not limited to algorithms and could extend to virtually any online activity that could be construed as making a recommendation. Even ordinary web users who volunteer as moderators for various sites could face legal liability, according to a brief filed by Reddit and several of its volunteer moderators. Oregon Democratic Sen. Ron Wyden and former California Republican Rep. Chris Cox, the primary authors of Section 230, argued to the Court that Congress intended the law to give websites the latitude to moderate content as they see fit.

The Biden administration has also weighed in. In a brief filed in December, the administration argued that Section 230 does protect Google and YouTube from lawsuits “for not removing third-party content, which includes the content that it has suggested.” But according to the government, those protections do not extend to Google’s algorithms, because they represent the company’s own speech rather than that of others.

Twitter v. Taamneh

The second case, Twitter v. Taamneh, will decide whether social media companies can be sued for aiding and abetting a specific act of terrorism when they have hosted user content expressing general support for the terrorist group behind the violence, without any reference to the specific attack at issue.

The plaintiffs in the lawsuit – the family of Nawras Alassaf, who was killed in the 2017 ISIS attack in Istanbul – claim that social media companies, including Twitter, knowingly assisted ISIS in violation of US anti-terrorism law by allowing some of the group’s content to remain on their platforms despite policies intended to restrict that kind of content.


Twitter has argued that the mere fact that ISIS used its platform to promote its cause does not amount to the company’s “knowing” support for the terrorist organization, and that in any event it cannot be held liable under the anti-terrorism law because the content at issue in the case was not specifically tied to the attack that killed Alassaf. The Biden administration, in its own brief, has backed that position.

Twitter has also previously argued that it is shielded from the lawsuit by Section 230.

Other tech platforms, including Meta and Google, have argued in this case that if the Court finds the companies are not liable under US anti-terrorism law in these circumstances, it can avoid addressing Section 230 altogether in both cases, since the underlying claims would be dismissed.

In recent years, however, several Supreme Court justices have shown an interest in Section 230 and have signaled that they would like to hear cases involving the statute. In 2022, Justices Samuel Alito, Clarence Thomas, and Neil Gorsuch wrote that new state laws such as Texas’s, which would require social media platforms to carry content they would prefer to remove, raise questions of “great significance” concerning “the ability of powerful social media companies to influence public debate about the pressing questions of the moment.”

Several petitions are pending that ask the Court to review the Texas law and a similar law passed in Florida. The Court this month put off deciding whether to take up those cases, instead asking the Biden administration to weigh in with its views.

Sunil Kumar writes about smartphones and laptops for Gadgets360TechNews out of Delhi. He is the Deputy Editor (Reviews) at Gadgets360TechNews, frequently covers the smartphone and PC industry, and has an interest in photography.
