Ever since Elon Musk took over Twitter, now X, the platform has repeatedly courted controversy over lax content moderation and the spread of hateful and political speech. Now it is in legal hot water again, this time over its alleged negligence in handling a sensitive case involving child sexual abuse material (CSAM). A recent ruling by the U.S. Court of Appeals for the Ninth Circuit has revived a negligence claim against X for mishandling CSAM.
The court ruled that the 2021 negligence claim against X should proceed, meaning the company must now explain why it handled the situation so carelessly. The case began when two underage boys sued the platform after explicit content that traffickers had coerced them into producing circulated on the site. Despite repeated reports and follow-ups, the platform allegedly delayed responding and allowed the video to remain live for days before notifying the National Center for Missing and Exploited Children (NCMEC). Judge Danielle Forrest ruled that X must defend itself in court and show it was not negligent, a pivotal moment in defining tech giants' responsibility to protect vulnerable users. The plaintiffs also argued that the platform's reporting system was poor, with no clear way to escalate life-altering issues.
Although Section 230 of the Communications Decency Act generally shields platforms from liability for user-generated content, this ruling leaves most of those protections intact while holding X accountable for its own failures in handling the report. The decision is significant because it signals that courts will scrutinize platforms' internal processes and how they respond once they are alerted.
X must now prove that it was not negligent and that it acted responsibly in the circumstances. The ruling also matters because it raises uncomfortable questions about tech giants' duty of care and whether they are meeting it. As AI adoption grows and users rely ever more heavily on social media, platforms like X carry a greater moral and technical responsibility toward exploitation victims. The decision reflects a growing sentiment that companies should do more than the bare minimum and be actively good.