
Elon Musk’s Lawsuit Against a Group That Found Hate Speech on X Isn’t Going Well

X alleges that the Center for Countering Digital Hate cost it millions by showing that hate speech was spreading on the platform. In a hearing Thursday, a federal judge sounded skeptical of those claims.

The X headquarters in San Francisco in July 2023. Photo-illustration: WIRED Staff; Josh Edelson/Getty Images

Soon after Elon Musk took control of Twitter, now called X, the platform faced a massive problem: Advertisers were fleeing. But that, the company alleges, was someone else’s fault. On Thursday that argument went before a federal judge, who seemed skeptical of the company's allegations that a nonprofit’s research tracking hate speech on X had compromised user security, and that the group was responsible for the platform’s loss of advertisers.

The dispute began in July when X filed suit against the Center for Countering Digital Hate, a nonprofit that tracks hate speech on social platforms and had warned that the platform was seeing an increase in hateful content. Musk’s company alleged that CCDH’s reports cost it millions in advertising dollars by driving away business. It also claimed that the nonprofit’s research had violated the platform’s terms of service and endangered users’ security by scraping posts using the login of another nonprofit, the European Climate Foundation.

In response, CCDH filed a motion to dismiss the case, alleging that it was an attempt to silence a critic of X with burdensome litigation using what’s known as a “strategic lawsuit against public participation,” or SLAPP.

On Thursday, lawyers for CCDH and X appeared before Judge Charles Breyer of the US District Court for the Northern District of California for a hearing to decide whether X’s case against the nonprofit will be allowed to proceed. The outcome could set a precedent for how far billionaires and tech companies can go to silence their critics. “This is really a SLAPP suit disguised as a contractual suit,” says Alejandra Caraballo, a clinical instructor at Harvard Law School's Cyberlaw Clinic.

Unforeseen Harms

X alleges that CCDH used the European Climate Foundation’s login to a social listening tool called Brandwatch, which has a license to access X data through the company’s API. In the hearing Thursday, X’s attorneys argued that CCDH’s use of the tool forced the company to spend time and money investigating the scraping, costs for which it says it should be compensated in addition to the advertising revenue it claims it lost after the nonprofit’s report spooked advertisers.

Judge Breyer pressed X’s attorney, Jonathan Hawk, on that claim, questioning how scraping publicly available posts could endanger users’ safety or the security of their data. “If [CCDH] had scraped and discarded the information, or scraped that number and never issued a report, or scraped and never told anybody about it, what would be your damages?” Breyer asked X’s legal team.

Breyer also pointed out that anyone agreeing to Twitter's terms of service in 2019, as the European Climate Foundation did when it signed up for Brandwatch, years before Musk bought the platform, could not have anticipated how drastically its policies would later change. He suggested it would be difficult to hold CCDH responsible for harms it could not have foreseen.

“Twitter had a policy of removing tweets and individuals who engaged in neo-Nazi, white supremacists, misogynists, and spreaders of dangerous conspiracy theories. That was the policy of Twitter when the defendant entered into its terms of service,” Breyer said. “You're telling me at the time they were excluded from the website, it was foreseeable that Twitter would change its policies and allow these people on? And I am trying to figure out in my mind how that's possibly true, because I don't think it is."

Speaking after the hearing, Imran Ahmed, CEO of CCDH, was optimistic about the direction of the judge’s inquiry. “We were particularly surprised by the implication in X Corp.’s argument today that it thinks that CCDH should somehow be on the hook for paying for X Corp. to help neo-Nazis, white supremacists, and misogynists escape scrutiny of their reprehensible posts,” he says. “We can't help but note that X Corp. really had no response to our assertion that Musk changed X's policies to reinstate white supremacists, neo-Nazis, misogynists, and other propagators of hateful and toxic content.”

Breyer did not indicate Thursday when he would rule on whether the case could move forward.

Broken Trust

After taking over Twitter in late 2022, Musk fired much of the company's trust and safety team, which had kept hateful and dangerous content, as well as disinformation, off the platform. He also offered amnesty to users who had been banned for violating the platform’s policies. CCDH is among a number of organizations and academics that have published evidence showing that X has become a haven for harmful and misleading content under Musk’s watch.

The suit against CCDH was just one of many ways in which platforms have sought to limit transparency in recent years. X now charges $42,000 a month for access to its API, making analysis of the platform’s data financially inaccessible to many researchers and members of civil society. For its part, Meta has wound down CrowdTangle, a tool that allowed researchers and journalists to track the spread of posts, and cut off researchers at New York University who were studying political ads and Covid-19 disinformation.

Both Meta and X filed suit against Bright Data, a third-party data collection service, for scraping their platforms. In January, Meta’s case against Bright Data was dismissed. “The Facebook and Instagram Terms do not bar logged-off scraping of public data; perforce it does not prohibit the sale of such public data,” wrote US federal judge Edward Chen in his ruling. “The Terms cannot bar Bright Data’s logged-off scraping activities.”

Bright Data spokesperson Jennifer Burns calls the platforms’ suits against the company “an effort to build a wall around publicly available data.”

Caraballo, of Harvard Law School, says Elon Musk appears to have decided lawsuits are a good strategy for silencing critics of his social platform. In November, X filed a lawsuit against the watchdog group Media Matters for America, accusing the group of trying to drive advertisers away from the platform by reporting how ads appeared next to neo-Nazi content.

That suit was filed in Texas, where state anti-SLAPP laws, which can be used to quash frivolous lawsuits, do not apply in federal court, making the case more difficult to get dismissed, says Caraballo. “I think it's incredibly concerning that this is part of that broader pattern, because these are the mechanisms that hold powerful companies accountable,” she says.

She guesses that while X might be able to move forward with a narrow version of its claim that CCDH breached its terms of service, “most of the claims will get tossed out.”

X did not respond to a request for comment by the time of publication.

Vittoria Elliott is a reporter for WIRED, covering platforms and power. She was previously a reporter at Rest of World, where she covered disinformation and labor in markets outside the US and Western Europe. She has worked with The New Humanitarian, Al Jazeera, and ProPublica.
