On Wednesday, the Supreme Court will continue its examination – begun today – of the legal rules governing the operation of some of the biggest social media platforms. The Justices will next hear a case involving Twitter, reviewing the scope of anti-terrorism laws when used to challenge those platforms’ hosting of terrorist propaganda.
The Court will broadcast “live” the audio (no video) of the hearing on its homepage, supremecourt.gov. To listen, click on “Live Audio” and follow the prompt when the courtroom scene appears lower on the page. The audio also will be available, under the title of the case, on C-SPAN at this link: cspan.org/supremecourt
Wednesday’s case: Twitter v. Taamneh. The hearing will begin at 10 a.m.; it is scheduled for 70 minutes.
Background: In the social media world, the phrase that describes its greatest challenge is “content moderation.” Because these platforms are open to users numbering in the billions, the operator’s task is to allow or block posts or videos that would degrade the quality of the site in ways ranging from the inappropriate or improper to the illegal or dangerous. It is, basically, the same task as a single editor sitting at a desk and monitoring what comes in, item by item, but it is far more technically sophisticated on social media platforms.
No government, at any level, has yet figured out how to create a system of official oversight that will both assure that the platforms are free of noxious or harmful content and that they foster a great deal of free speech. So social media are largely left to self-regulation, developing their own processes of selection and rejection.
This week, the Supreme Court is delving into the legal side of content moderation – in the Google case today, focusing on legal protection for that process, and in the Twitter case tomorrow, examining legal responsibility for failure to block terrorist propaganda.
Specifically at issue tomorrow will be two laws: the Anti-Terrorism Act, passed in 1990, and its 2016 amendment, the Justice Against Sponsors of Terrorism Act. Americans who claim harm from “an act of international terrorism” committed by an organization that has been officially designated as terrorist may sue any person or entity that aids the performance of that act or conspires with the organization involved. Winning such a case can result in an award of triple the amount of actual damages.
Facts of this case: On January 1, 2017, an ISIS gunman entered the Reina nightclub in Istanbul, Turkey, and began shooting. Killed at the scene were 39 people; 69 were injured. Among the dead was Nawras Alassaf, a Jordanian citizen, who was celebrating the New Year with his wife. The day after the shooting, ISIS claimed responsibility for the massacre.
Alassaf’s relatives in the United States sued in federal court in California, accusing Twitter, Google, YouTube and Facebook of allowing ISIS the free use of their platforms to send out terrorist propaganda and recruiting information. The lawsuit asserted that ISIS could not have grown into a major terrorist organization without that access. The core of their complaint was that the platforms had “aided and abetted” ISIS in violation of the 1990 anti-terrorism law and its 2016 amendment.
A federal trial judge dismissed their lawsuit against all of the platforms, finding a lack of evidence that they had any knowledge or awareness of the Istanbul attack and the role of ISIS in it. That was the same outcome another judge in California had reached in the Gonzalez family’s case against the platforms, which arose from the 2015 terrorist attacks in Paris, and that a third federal judge there had reached in a lawsuit by relatives of three people killed in a terrorist mass shooting in December 2015 at an office holiday party in San Bernardino, Calif. All of the families filed separate appeals, and all three cases wound up before the same panel of judges in a federal appeals court. (The lawsuit growing out of the San Bernardino shooting was dropped after the appeals court ruled against it.)
The appeals court ruled against the family in the Paris shooting case but in favor of the family in the Istanbul case. After the family in the Paris case filed an appeal in the Supreme Court against Google and YouTube, Twitter took its own case, the Istanbul case, to the Justices. (Facebook did not file its own appeal, but told the Court it supported the positions of Google and Twitter.) The Justices’ hearing in the Google case was held today; the Twitter case comes up tomorrow.
Although Google had won legal immunity from the appeals court based on Section 230 of the Communications Decency Act, that court did not consider that provision at all in Twitter’s case.
The questions before the Court in the Twitter case: Can a social media platform be held to blame for aiding and abetting a terrorist organization’s mass shooting incident on a theory that it did not do enough to limit that organization’s access to the platform? Does aiding and abetting occur under the 1990 and 2016 laws only if the platform’s services were actually used as part of the incident?
The Biden Administration’s Justice Department has entered the case in support of Twitter. Its brief argued that, to satisfy the federal ban on aiding and abetting terrorism, it is not enough that a social media platform knew a terrorist group, along with potentially billions of other users, was using the platform’s generic services. There must be some knowledge that its services have some link to the terrorist organization or its actions, the Department said.
Significance: It is important to keep in mind that this case involves a legal duty for social media to engage in “content moderation” only in the specific context of the federal ban on assisting terrorist organizations in violent action. And, even in that context, the case is only about what it means to “aid and abet” such violence – a discrete issue under a pair of quite specific federal laws.
At the same time, however, the Twitter case is one of more than a dozen lawsuits filed across the country seeking to hold such platforms liable for incidents of violence around the globe, so there is wide interest in how the Court will rule in that context. The family in the Istanbul case seems to face a considerable challenge in making its case before the Justices, because the Twitter case is the only one of those lawsuits so far that has not been dismissed by lower courts. As Twitter’s appeal contends, this case is “the sole outlier.”
The case does provide a vivid indication of how difficult it may be to impose legal remedies on social media platforms when someone suffers harm from a video or a message posted on their sites. The largest of these platforms have billions of users, and it may well be true that many of them are misusing or abusing their access to the site. And yet, how difficult is it to trace someone’s specific injury (or even death) to such a post? How does one link a harmful effect to an electronic cause?
Anything that the Court may do in answering those questions, or even beginning to hint at answers, could provide a beginning on a much larger cultural problem – how to keep the Internet in check.
The Twitter argument will conclude this week’s hearings. Next Monday, the Court will hear a case seeking to clarify when the use of someone’s name while committing another crime is a form of identity theft, resulting in an added fine or more prison time.