X Corp. sues New York over content moderation disclosure law, citing First Amendment violations
X Corp. has filed a federal lawsuit challenging New York’s Senate Bill S895B, which mandates that social media platforms publicly disclose how they define and moderate content like “hate speech,” “misinformation” and “extremism.”
The company argues the law violates the First Amendment and New York’s Constitution by compelling speech and pressuring platforms to adopt the state’s ideological framing of controversial issues.
The law’s “Content Category Report Provisions” threaten violators with fines of up to $15,000 per day and potential legal action from New York Attorney General Letitia James.
X claims the law was enacted with a viewpoint-based motive, citing correspondence where legislators dismissed proposed changes due to Elon Musk’s personal speech on the platform.
The lawsuit follows X’s recent legal victory in California, where the Ninth Circuit struck down key parts of a similar law (AB 587), ruling they unconstitutionally compelled speech and failed strict scrutiny.
In a complaint filed this June in the U.S. District Court for the Southern District of New York, the company alleges that the law, Senate Bill S895B, amounts to unconstitutional government coercion and an overreach into the editorial judgment of private platforms.
At the center of the lawsuit are the so-called “Content Category Report Provisions,” which mandate that social media platforms publish detailed accounts of whether and how they moderate controversial content categories. Noncompliance could subject companies to daily fines of up to $15,000 and legal action by New York Attorney General Letitia James.
X Corp. argues that the provisions pressure platforms to adopt the state’s framing of politically charged issues.
The complaint states that the law violates both the First Amendment of the U.S. Constitution and Article I, Section 8 of the New York Constitution by compelling speech and imposing viewpoint-based burdens on online platforms. The company further alleges that New York lawmakers demonstrated clear bias in the bill’s formation and response to opposition.
In correspondence included as part of the complaint, the bill’s sponsors refused to engage with X Corp.’s suggested changes, citing owner Elon Musk’s use of the platform to “promote content that threatens the foundations of our democracy.” X argues that this response reveals a constitutionally impermissible, viewpoint-based motive.
“The government cannot do indirectly what it is barred from doing directly,” the complaint argues, citing U.S. Supreme Court precedent.
X is asking the court to declare the law facially and as-applied unconstitutional, block its enforcement and award the company legal fees.
Court sides with X, strikes down key parts of California social media law
Introduced by a coalition of 11 lawmakers (10 Democrats and one Republican), AB 587 was championed by California Gov. Gavin Newsom and Attorney General Rob Bonta as a transparency measure to hold platforms accountable for their content moderation decisions. The law was passed in 2022 and required social media companies with users in California to submit detailed semiannual reports to the state.
However, the most controversial aspects of the law, now removed as part of the settlement, would have forced companies to disclose whether and how they define and moderate content in categories such as “disinformation,” “extremism,” “harassment” and “foreign political interference.” Companies were also expected to report the number of enforcement actions taken within each category and provide granular detail about how flagged content was handled.
Opponents, led by X Corp., argued that these provisions amounted to government overreach and would compel speech, effectively requiring platforms to adopt the state’s framing of complex and politically charged topics. The court agreed, ruling that the state cannot coerce private platforms into adopting or reporting on ideological definitions.
The U.S. Court of Appeals for the Ninth Circuit concluded that mandating companies to disclose content moderation practices under government-defined categories “likely compel[s] non-commercial speech” and therefore must withstand strict constitutional scrutiny – a bar the law did not meet.
What remains of AB 587 is a narrower requirement: Social media platforms must still submit their terms of service to the state every six months and describe any changes and how those terms are enforced. But platforms will no longer be required to report on how they moderate specific types of controversial speech or provide detailed enforcement metrics.
A "rapid" national investigation into NHS maternity services has been launched by the government.The announcement comes after Health Secretary Wes Streeting met families who have lost babies and amid the ongoing investigations at some NHS trusts into maternity care failings. The investigation in England is intended to provide truth to families suffering harm, as well as driving urgent improvements to care and safety, as part of efforts to ensure "no parent or […]
Post comments (0)