Technology - August 3, 2025

Elon Musk’s X Faces Negligence Claim in Court over Alleged Child Abuse Material on the Platform

The Ninth US Circuit Court of Appeals in San Francisco has reinstated a lawsuit against a prominent social media platform, X, alleging negligence in handling child sexual abuse material (CSAM). The suit contends that X delayed its response to reports of such content.

The litigation, initially filed in 2021 prior to Elon Musk’s 2022 acquisition of the platform, centers on a claim that X failed to promptly report a video containing explicit images of two minors to the National Center for Missing and Exploited Children (NCMEC).

One case detailed in the lawsuit involves a 13-year-old boy who was coerced into sharing explicit images of himself via Snapchat, which were subsequently posted on X. After the boy and his mother reported the images, following X’s reporting procedures, the platform allegedly took nine days to remove the offending content. By that time, the material had amassed over 167,000 views and 2,000 retweets and had circulated within the boy’s high school.

The plaintiffs argue that X neglected opportunities to develop more effective tools to curtail the dissemination of such content, despite the inadequacy of its existing infrastructure. Furthermore, they assert that due to X’s business model, it derives substantial advertising revenue from hosting sought-after or popular posts, including those containing pornographic material featuring minors.

The plaintiffs also highlighted numerous limitations in X’s processes for reporting child abuse material: there was no way to report child pornography sent via private messaging, reporters had to supply an email address, and a reporter had to possess, and be logged into, a Twitter account.

Judge Danielle Forrest ruled that Section 230 of the federal Communications Decency Act, which shields online platforms from liability for user content, does not grant X immunity from the negligence claim once it became aware of the offending material. However, X was found immune from allegations of complicity in sex trafficking.

At the time of the case, adult content constituted a significant proportion of the material on X. A 2022 Reuters report, citing internal documents, revealed that 13% of all content on the platform was adult material. Issues with CSAM have also persisted on X. According to X’s January–June 2024 Transparency Report, 2.78 million accounts were deactivated for child sexual exploitation violations; that figure fell to 132,155 for the October 2024 to March 2025 period.

X has yet to issue a statement regarding the court’s decision.