Controversy continues over the federal Kids Online Safety Act (KOSA), which seeks to hold platforms accountable for feeding harmful content to minors. KOSA is lawmakers’ response to whistleblower Frances Haugen’s shocking revelations to Congress. In 2021, Haugen leaked documents and testified that Facebook knew its platform was addictive and harmful to teenagers but, blinded by the pursuit of profit, chose to ignore the damage.
Sen. Richard Blumenthal (D-Conn.), who sponsored KOSA, was among the lawmakers stunned by Haugen’s testimony. He said in 2021 that Haugen had shown that “Facebook exploits teenagers with powerful algorithms that amplify their insecurities.” Haugen’s testimony, Blumenthal claimed, provided “powerful proof that Facebook knew its products were harmful to teenagers.”
But when Blumenthal introduced KOSA last year, the bill immediately faced massive blowback from more than 90 organizations, including tech groups, digital rights advocates, legal experts, child safety organizations, and civil rights groups. These critics warned lawmakers of several flaws in KOSA, chief among them a vague “duty of care” that, they argued, “is effectively a guideline for the use of broad content filtering to limit minors’ access to certain online content.” The fear is that the duty of care provision will push platforms toward over-moderation and inaccurate filtering of content deemed controversial, such as information on LGBTQ+ issues, drug addiction, eating disorders, mental health issues, or escaping from abusive situations.
So lawmakers took a red pen to KOSA, which was reintroduced in May 2023 and revised again this July, removing some sections and adding new provisions. Supporters of KOSA claim the changes adequately address critics’ feedback. These supporters, including groups that helped revise the bill, told Ars they are pushing to pass the revised bill this year.
And they may well get their way. Some past critics now seem satisfied with the latest KOSA changes. LGBTQ+ groups such as GLAAD and the Human Rights Campaign have withdrawn their opposition, Vice reported. And in the Senate, the bill has gained more bipartisan support, attracting 43 co-sponsors from both sides of the aisle. Surveying the legislative landscape, it appears increasingly likely that the bill could pass soon.
But should it?
Not all critics agree that recent changes to the bill go far enough to fix its biggest flaws. In fact, the bill’s staunchest critics told Ars that KOSA is irreparably flawed because of the virtually unchanged duty of care provision, and that it still risks doing children more harm than good.
These critics also warn that all Internet users stand to be harmed, as platforms will likely begin censoring a wide range of protected speech and limiting user privacy by age-gating the Internet.
“Duty of care” fatal flaw?
To address some of the criticisms, the bill now clarifies that nothing in the duty of care provision should block children from accessing “resources for preventing or mitigating” any of the harms it describes. That in theory means that if KOSA passes, children should still be able to access online resources to deal with:
- Mental health disorders, including anxiety, depression, eating disorders, substance use disorders, and suicidal behaviors.
- Patterns of use that exhibit or encourage addictive-like behaviors.
- Physical violence, online bullying, and harassment of minors.
- Sexual exploitation and abuse.
- Promotion and sale of narcotic drugs, tobacco products, gambling, or alcohol.
- Predatory, unfair, or deceptive marketing practices, or other financial harm.
Previously, this section describing harmful content was vaguer, covering only minors’ access to resources that could help them prevent and mitigate “suicidal behaviors, substance use, and other harms.”
Irene Ly, who serves as technology policy counsel for Common Sense Media, a nonprofit that provides age ratings for technology products, helped advise KOSA’s authors on the bill’s changes. Ly told Ars that these changes substantially narrowed the duty of care and reduced the “potential for unintended consequences.” As a result, Ly said, support for KOSA from LGBTQ+ groups and policymakers, including openly gay members of Congress, has “significantly increased.”
But the duty of care section hasn’t really changed much, KOSA critics told Ars. And Vice reported that many KOSA supporters are far-right, anti-LGBTQ organizations that seemingly hope to use the law to censor a wide range of LGBTQ+ content online.
The other notable change to KOSA tightens the knowledge standard so that platforms can only be held liable for violating KOSA if they fail to make “reasonable efforts” to mitigate harms once they know that kids are using their platforms. Previously, platforms could be found liable if a court ruled that they “reasonably knew” there were minors on the platform.
Joe Mullin, a policy analyst for the Electronic Frontier Foundation (EFF), a leading digital rights nonprofit that opposes KOSA, told Ars that the change to the knowledge standard is “really positive” because it was “tightened a little.” However, the revised KOSA “doesn’t really solve the problem” that many critics have with the bill.
“I think the duty of care section is fatally flawed,” Mullin told Ars. “They never changed it.”
In a letter to lawmakers last month, legal experts with think tank TechFreedom similarly described the duty of care section as “badly flawed” and warned that it appears to violate the First Amendment. The letter urges lawmakers to reconsider KOSA’s approach:
The unconstitutionality of KOSA’s duty of care is highlighted by its vague and imprecise nature. Platforms cannot ‘prevent and mitigate’ complex psychological issues that arise from circumstances throughout a person’s life, which may be reflected in their online activity. Given those circumstances, material that is harmful to one minor may help, or even save the life of, another, especially when it relates to eating disorders, self-harm, drug use, and bullying. Minors are individuals with different needs, emotions, and predispositions. Yet KOSA requires platforms to take an impossible one-size-fits-all approach to deeply personal issues, ultimately serving the best interests of no minors.
ACLU senior policy advisor Cody Venzke agreed with Mullin and TechFreedom that the duty of care section is still the biggest flaw in the bill. Like TechFreedom, Venzke remains unconvinced that the changes will be significant enough to ensure that children are not cut off from some online resources if KOSA passes in its current form.
“Unless you take away the broad, vague duty of care provisions, platforms will end up removing the good along with the bad, because content moderation is blunt,” Venzke told Ars. “And it does young people no good to be told they can still look for helpful resources if those supposedly protected resources have been swept away by extensive content removal.”