X's Late Update on Kids Online Safety Act Fails to Protect

Dec 12, 2024 at 7:40 PM
Last week, the Senate presented yet another version of the Kids Online Safety Act. This draft, reportedly written with input from X CEO Linda Yaccarino, purports to address the bill's critical free speech problems. At its core, however, it remains an unconstitutional censorship measure that endangers the online speech and privacy rights of all internet users.

Tell Congress: Vote No on KOSA

Key Points of the Bill

The most significant change, its authors claim, is supposed to minimize the bill's impact on free speech. But, as we've previously stated, the "duty of care" section remains the bill's biggest problem: it forces online services to change their policies based on the content of online speech. Although the bill's authors wrongly insist it regulates only platform design and not speech, the harms it lists, such as eating disorders, substance use disorders, and suicidal behaviors, are not caused by platform design.

The authors fail to grasp the difference between protecting individual expression and shielding a platform from the liability KOSA imposes. KOSA is likely to increase risks for children, because it will restrict their access to online resources about addiction, eating disorders, and bullying. It will push services to impose age verification and content restrictions, preventing minors from finding supportive online communities.

The "Duty of Care" Requirement

The updated bill adds only one sentence to the "duty of care" requirement, stating that nothing in the section should be construed to allow a government entity to bring enforcement actions based on users' viewpoints. But the duty of care never targeted users' viewpoints in the first place. It is platforms, not users, that must mitigate the listed harms, and it is platforms' ability to host users' views that is at risk, not users' ability to express them. Adding this sentence changes nothing about how the bill will be interpreted or enforced: the FTC could still hold a platform liable for the speech it contains.

Consider, for example, a covered platform like Reddit hosting a forum for discussing recovery from eating disorders. Even if the speech is legal and helpful, the FTC could hold Reddit liable for violating the duty of care by allowing young people to view it. The same could apply to a Facebook group about LGBTQ issues, or a post about drug use that X's algorithm surfaces.

Compulsive Usage and the Bill's Scope

Another issue with KOSA is its vague list of harms. The latest update requires that the harms of "depressive disorders and anxiety disorders" have objectively verifiable and clinically diagnosable symptoms related to compulsive usage. But the definition of compulsive usage is itself vague, describing it as persistent and repetitive use that significantly impacts one or more major life activities.

There is no clinical definition of "compulsive usage" of online services. The updated definition combines elements that sound medical or legal but carry no specific legal meaning, leaving it dangerously vague. Persistent use of social media can affect socializing and communicating, but that does not make it "compulsive" or harmful. Nonetheless, the FTC could still hold platforms liable for showing content that causes anxiety or depression.

Dangerous Censorship in Must-Pass Legislation

The latest KOSA draft arrives as the incoming FTC Chair nominee, Andrew Ferguson, has vowed to protect free speech by "fighting back against the trans agenda." KOSA would hand the FTC broad authority to decide what content platforms must prevent young people from seeing. The mere passage of the bill could push platforms to take down protected speech and implement age verification requirements.

No member of Congress should include this controversial and unconstitutional bill in a continuing resolution. A law that forces platforms to censor truthful online content has no place in a last-minute funding bill.