Debate continues to rage over the federal Kids Online Safety Act (KOSA), which seeks to hold platforms liable for feeding harmful content to minors. KOSA is lawmakers' answer to whistleblower Frances Haugen's shocking revelations to Congress. In 2021, Haugen leaked documents and provided testimony alleging that Facebook knew that its platform was addictive and was harming teens—but blinded by its pursuit of profits, it chose to ignore the harms.
Sen. Richard Blumenthal (D-Conn.), who sponsored KOSA, was among the lawmakers stunned by Haugen's testimony. He said in 2021 that Haugen had shown that "Facebook exploited teens using powerful algorithms that amplified their insecurities." Haugen's testimony, Blumenthal claimed, provided "powerful proof that Facebook knew its products were harming teenagers."
But when Blumenthal introduced KOSA last year, the bill faced immediate and massive blowback from more than 90 organizations—including tech groups, digital rights advocates, legal experts, child safety organizations, and civil rights groups. These critics warned lawmakers of KOSA's many flaws, but they were most concerned that the bill imposed a vague "duty of care" on platforms that was "effectively an instruction to employ broad content filtering to limit minors’ access to certain online content." The fear was that the duty of care provision would likely lead platforms to over-moderate and imprecisely filter content deemed controversial—things like information on LGBTQ+ issues, drug addiction, eating disorders, mental health issues, or escape from abusive situations.
So, lawmakers took a red pen to KOSA, which was reintroduced in May 2023 and amended this July, striking out certain sections and adding new provisions. KOSA supporters claim that the changes adequately address critics' feedback. These supporters, including groups that helped draft the bill, told Ars that they're pushing for the amended bill to pass this year.
And they might just get their way. Some former critics seem satisfied with the most recent KOSA amendments. LGBTQ+ groups like GLAAD and the Human Rights Campaign withdrew their opposition, Vice reported. And in the Senate, the bill gained more bipartisan support, attracting a whopping 43 co-sponsors from both sides of the aisle. Given that momentum, it appears increasingly likely that the bill could pass soon.
But should it?
Not all critics agree that the recent changes to the bill go far enough to fix its biggest flaws. In fact, the bill's staunchest critics told Ars that the legislation is incurably flawed—due to the barely changed duty of care provision—and that it still risks creating more harm than good for kids.
These critics also warn that all Internet users could be harmed, as platforms would likely start to censor a wide range of protected speech and limit user privacy by age-gating the Internet.
“Duty of care” fatally flawed?
To address some of the criticisms, the bill was changed so that platforms wouldn't start blocking kids from accessing "resources for the prevention or mitigation of" any harms described under the duty of care provision. That theoretically means that if KOSA passes, kids should still be able to access online resources to deal with:

- Mental health disorders, including anxiety, depression, eating disorders, substance use disorders, and suicidal behaviors.
- Patterns of use that indicate or encourage addiction-like behaviors.
- Physical violence, online bullying, and harassment of the minor.
- Sexual exploitation and abuse.
- Promotion and marketing of narcotic drugs, tobacco products, gambling, or alcohol.
- Predatory, unfair, or deceptive marketing practices, or other financial harms.

Previously, this section describing harmful content was slightly more vague, covering only minors' access to resources that would help them prevent and mitigate "suicidal behaviors, substance use, and other harms."

Irene Ly—who serves as counsel on tech policy for Common Sense Media, a nonprofit that provides age ratings for tech products—helped advise KOSA's authors on the bill's amendments. Ly told Ars that these changes have narrowed the duty of care enough and reduced "the potential for unintended consequences." Because of this, "support for KOSA from LGBTQ+ groups and policymakers, including openly gay members of Congress" was "significantly boosted," Ly said.

But the duty of care section didn't really change that much, KOSA critics told Ars, and Vice reported that many of KOSA's supporters are far-right, anti-LGBTQ organizations seemingly hoping for a chance to censor a broad swath of LGBTQ+ content online.

The only other KOSA change was to strengthen the "knowledge standard" so that platforms can only be held liable for violating KOSA if they don't make "reasonable efforts" to mitigate harms when they know that kids are using their platforms. Previously, platforms could be found liable if a court ruled that they "reasonably should know" that there are minors on the platform.

Joe Mullin, a policy analyst for the Electronic Frontier Foundation (EFF)—a leading digital rights nonprofit that opposes KOSA—told Ars that the knowledge standard change is "actually positive" because the language has been "tightened up a bit." Still, the revised KOSA "doesn't really solve the problem" that many critics have with the legislation.

"I think the duty of care section is fatally flawed," Mullin told Ars. "They really didn't change it at all."

In a letter to lawmakers last month, legal experts with the think tank TechFreedom similarly described the duty of care section as "incurably flawed" and cautioned that it seemingly violates the First Amendment. The letter urged lawmakers to reconsider KOSA's approach:

The unconstitutionality of KOSA's duty of care is highlighted by its vague and unmeetable nature. Platforms cannot 'prevent and mitigate' the complex psychological issues that arise from circumstances across an individual's entire life, which may manifest in their online activity. These circumstances mean that material harmful to one minor may be helpful or even lifesaving to another, particularly when it concerns eating disorders, self-harm, drug use, and bullying. Minors are individuals, with differing needs, emotions, and predispositions. Yet KOSA would require platforms to undertake an unworkable one-size-fits-all approach to deeply personal issues, thus ultimately serving the best interests of no minors.

The ACLU's senior policy counsel, Cody Venzke, agrees with Mullin and TechFreedom that the duty of care section is still the bill's biggest flaw. Like TechFreedom, Venzke remains unconvinced that the changes are significant enough to ensure that kids won't be cut off from some online resources if KOSA passes in its current form.

"If you take the broad, vague provisions of the duty of care, platforms are going to end up taking down the bad as well as the good, because content moderation is biased," Venzke told Ars. "And it doesn't do youths any good to know that they can still search for helpful resources, but those helpful resources have been swept up in the broad takedowns of content."
Will KOSA require age verification?
KOSA supporters told Ars that the bill specifically does not require platforms to verify users' ages. Josh Golin, executive director of Fairplay—a nonprofit child advocacy organization that helped draft KOSA—told Ars that "contrary to KOSA's critics' claims, the legislation does not require content takedowns, age verification, or government IDs to use the Internet."

"In response to good-faith criticism of the 2022 version of KOSA, the bill has been considerably improved and, as a result, opposition from civil society has decreased significantly," Golin told Ars. "Perhaps that's why the remaining opponents are so desperate to scare people with doomsday scenarios, rather than engaging with the actual text of the bill."

But critics who feel the text is still too broad said that platforms could be more at risk of lawsuits if they don't verify users' ages. Mullin told Ars that without age verification, it's still unclear what standard platforms could use to avoid liability. Further, Mullin is concerned that platforms are not size-gated under KOSA, which means a platform of any size could be liable and thus feel pressured to verify users' ages. Until these points are clarified, Mullin's not sure how KOSA wouldn't lead to a future where the whole Internet is age-gated.

"Is it good enough to say, 'Hey, if you're under 18, don't come here' or 'Click to confirm you're 18 or over?'" Mullin asked. "Will that be good enough?"

In TechFreedom's letter to lawmakers, the think tank predicted that KOSA would "force platforms to age-verify users" because that's the most risk-averse path to compliance. "While doubtless well-intentioned, these changes merely trade a clear, explicit mandate for a vague, implicit one; the unconstitutional effect on anonymous expression will be the same," TechFreedom's letter said.

As the KOSA debate remains heated, Golin and Ly questioned the motives of some of KOSA's loudest critics, like TechFreedom. Golin told Ars that because the think tank receives substantial financial contributions from Google and Meta, he thinks TechFreedom is motivated "to defend the status quo" and defeat KOSA.

TechFreedom's president, Berin Szóka, and free speech counsel Ari Cohn authored the letter to lawmakers. "It's an old rule of Washington that those who have no substantive response to critiques of legislative text fall back on impugning motives," Szóka told Ars. "That's especially true when bills like KOSA haven't been vetted properly in hearings."

Cohn told Ars that he welcomes "anyone who disagrees with our analyses to engage on the substance of those disagreements." He thinks "that would be a productive conversation well worth having."

"The principles that make KOSA obviously unconstitutional are well-settled law," Cohn said. "Effectively ending anonymous speech online and imposing a duty on platforms to protect us from ideas is not bad because it threatens the 'status quo' of major platforms. It is bad because it threatens the First Amendment rights of every American who speaks or receives information online—and that is a sacrifice that Congress is not authorized to make."

Cohn also balked at criticism that TechFreedom is only upholding the status quo in this debate. He said that TechFreedom's "fidelity is to the law," not to platforms. "Our principles, legal analysis, and positions are not for sale," Cohn said.

Critics: KOSA promotes censorship, not privacy
Evan Greer, deputy director of the nonprofit digital rights advocacy group Fight for the Future, told Ars that the organization "strongly supports strict regulation of Big Tech companies," but as far as Fight for the Future can tell, KOSA is not the privacy bill that supporters claim it is.

"If KOSA were actually a privacy bill as its supporters claim, we would be all about it," Greer told Ars. "We support cracking down on tech companies' harvesting of data, we support an end to manipulative business practices like autoplay, infinite scroll, intrusive notifications, and algorithmic recommendations powered by commercial surveillance. What we don't support is a bill that gives state attorneys general the power to dictate what content younger people can see on social media. That's where KOSA goes off the rails and becomes a censorship bill, rather than a privacy bill."

Because KOSA enforcement falls to state attorneys general—many of whom are elected officials—it's easier for the government to target and censor specific viewpoints that clash with their party politics, the ACLU's Venzke told Ars. Mullin agreed, saying that part of the reason why KOSA has so much bipartisan support is that both Democrats and Republicans are in favor of censoring opposing viewpoints and are "assuming the censorship will go their way."

"It's just totally crazy," Mullin told Ars. "People have very different views on how you can mitigate" harms like "eating disorders, addiction, bullying, sexual exploitation, drug use, alcohol use, gambling, tobacco use, and all predatory or deceptive marketing practices" by "controlling online speech."

"People don't agree about what's harmful on any of these issues," Mullin said. "These are challenging things to deal with, and families do it differently. And I don't think it's gonna be better when the government starts creating rules about it."

Venzke told Ars that "outside of very narrow exceptions," it's "not Congress's role to decide what is good speech and what is bad speech." And he thinks that question probably shouldn't be up to platforms to answer, either.

Venzke warned that KOSA would take decisions about kids' welfare out of the family's hands. Instead, Venzke said that platforms would be required to "look at content that causes depression or anxiety"—"which is, of course, anything that's out there in the world"—and make decisions that could result in platforms dictating what speech flies online.

Beyond cutting kids off from information, that could lead to wide-ranging censorship of the entire Internet, and TechFreedom's Cohn told Ars that's why he's found the bill's widespread support "baffling."

"By requiring platforms to protect us from 'harmful' ideas, it effectively deputizes major platforms to determine what speech is safe for us," Cohn told Ars. "If the government wanted to do that, the uproar would be rightfully furious. It is baffling that people who don't trust the platforms to begin with think it's any less terrible to outsource that work to them."

In addition, many young people are engaged with "politically challenging topics like LGBT rights, gun violence, and climate change," Venzke told Ars. If state attorneys general are "simply telling platforms to question these topics" as "the things that cause anxiety for young people," that is likely to chill young people's "opportunity to participate" in meaningful debates where they have recently been vocal advocates online.
Unsurprisingly, Techdirt reported last month that TikTok influencers had caught wind of how KOSA might be used to censor speech and had begun speaking out against the bill online. The EFF has also been conducting outreach and urging young people to oppose the law. Mullin estimated that the EFF has helped tens of thousands of concerned citizens send letters to Congress challenging KOSA. And when the EFF's efforts are "combined with allies and petitions from other sources," Mullin estimated that "well over a quarter of a million messages" opposing KOSA have reached lawmakers across the country.

Concerns about "openly hostile" AGs
While its staunchest critics claim that KOSA can't be fixed, others have suggested radical changes that lawmakers seem hesitant to make. For example, one of the biggest improvements critics have proposed would be to remove the section giving enforcement authority to state attorneys general.

Greer told Ars that "numerous states have AGs that are openly hostile to basic human rights. These are the same people who are trying to ban books, ban drag performances, ban gender-affirming care, ban abortions, ban people from talking about abortions online, etc., etc. KOSA would give them power to silence speech they don't like." That's why Greer said that some human rights groups still oppose KOSA and consider its enforcement section among its core flaws.

Techdirt's Mike Masnick also criticized Democrats for supporting the bill. He wrote that some KOSA supporters, like the Heritage Foundation, "flat out admitted that they planned to use KOSA to censor LGBTQ+ content" and that "Republicans admit up front how they plan to abuse it."

Jake Denton, a research associate in The Heritage Foundation's Tech Policy Center, told Ars that his organization supports KOSA partly because it would keep "depraved and harmful content" out of families' homes, which he said will make the Internet "a safer environment for American families."

Mullin told Ars that rather than empowering families, KOSA leaves it to 50 state attorneys general and the Federal Trade Commission to decide "what websites and apps aren't doing a good enough job" of protecting kids. "That's a pretty bad place to put family decisions," Mullin said. And "it's going to result in censorship. I think that's the goal."

Can KOSA be fixed?
KOSA strives to answer one of lawmakers' biggest questions after Haugen's whistleblowing testimony to Congress: Can platforms be legally required to care about causing harm to kids?

Venzke told Ars that the ACLU would be willing to reconsider its opposition to the bill if the duty of care provision were radically amended or struck entirely.

"We've been very clear with the bill sponsors and supporters that our concerns stem primarily, but not exclusively, from the duty of care provision and its sort of vague, broad vision of speech," Venzke told Ars. "If that provision were to be changed or eliminated, then we would look at the bill's value, how it might affect us online, especially marginalized communities."

It seems unlikely that lawmakers would choose to cut out the bill's heart, though. That leaves the possibility that lawmakers could revise the duty of care provision in a way that satisfies KOSA's staunchest critics. Venzke said that wasn't impossible.

"Any time that you are seeking to regulate in a space where the primary product, so to speak, is speech or the dissemination of speech, you have to be very careful about intruding on speech rights, including on controversial or difficult topics," Venzke said. "There are ways to approach mitigating online harms, but it requires that you be exact about your approach."

Venzke suggested that lawmakers could address what he sees as the duty of care's main flaws by being "very exact" about which problems KOSA is addressing and "very respectful of the fact that a lot of the rough and tumble of speech is just that it is controversial."

As it's currently written, KOSA's duty of care provision remains too vague for some critics to embrace, and both Mullin and Greer told Ars that the only solution would be to remove it entirely.

"The fundamental flaw with this bill is that the duty of care section is a censorship section," Mullin told Ars. "And that couldn't be fixed without removing the duty of care section." Lawmakers likely wouldn't go that route with KOSA, he said, because "that would be a pretty different bill."

Greer has floated at least one idea to replace the duty of care provision, though.

"We have urged KOSA's sponsors to change the bill by removing the overly broad 'duty of care,' which is an inherently flawed model that gives the government too much power to control speech, and replace it with strict regulations on how companies collect and use data," Greer told Ars. "The First Amendment prevents the government from dictating what speech platforms can recommend to younger users, but we can absolutely ban companies from harvesting our kids' data and using it to recommend content to them."

What would be even better than a law protecting kids' data, though, would be a law cracking down on the "abusive business practices that harm everyone," Greer said.

"A comprehensive federal data privacy law would do way more to protect kids than any of these 'kids bills' that have gained steam in Congress because they're politically expedient, not because they are substantively the right policy to address harm," Greer told Ars. Going this route, Congress could also avoid "thorny issues around age verification," Greer said.

"Lawmakers and organizations who genuinely want to take on Big Tech rather than just talking about it need to read the writing on the wall," Greer added. "Censorship bills are not going to pass on our watch. So if we want to take on Big Tech and win, we have to focus on the areas where we have real consensus: privacy, antitrust, algorithmic transparency, and cracking down on content-agnostic manipulative design."

Platforms already avoiding taking responsibility
Another, much less-discussed way that KOSA strives to restrict platforms from harming kids is by requiring more transparency. If KOSA passed, platforms would have to provide a "readily accessible" way for parents and kids to directly report harm, meaning parents and kids would finally have a portal to demand answers from platforms. Platforms with more than 10 million monthly active users would have one week, maximum, to respond to these reports, while platforms below that threshold would have to respond within 21 days.

Additionally, platforms would be required to issue annual public reports "identifying the reasonably foreseeable risk of material harms to minors and describing the prevention and mitigation measures taken to address such risk based on an independent, third-party audit conducted through reasonable inspection of the covered platform." These annual reports would have to include:

- An assessment of the extent to which the platform is likely to be accessed by minors.
- A description of the commercial interests of the covered platform in use by minors.
- An accounting of the number of individuals using the covered platform reasonably believed to be minors.
- An assessment of the reasonably foreseeable risk of harms to minors posed by the covered platform.
- An assessment of how recommendation systems and targeted advertising systems can contribute to harms to minors.
- A description of whether and how the covered platform uses system design features that increase, sustain, or extend use of a product or service by a minor.
- A description of whether, how, and for what purpose the platform collects or processes categories of personal data that may cause reasonably foreseeable risk of harms to minors.
- An evaluation of the efficacy of safeguards for minors.
- An evaluation of any other relevant matters of public concern over risk of harms to minors.

In this way, KOSA could help society learn much more about how kids are actually harmed by platforms, which could theoretically help lawmakers respond to feedback and eventually use more exact descriptions of harms in KOSA's text. The bill also creates a Kids Online Safety Council to weigh in on how the law is interpreted. Over time, it's possible that KOSA could be refined.

But KOSA has also been amended to vaguely protect platforms' trade secrets, which makes it unlikely that platforms would have to be transparent about "how recommendation systems and targeted advertising systems" work to "contribute to harms to minors."

Ars reached out to Sen. Blumenthal, who introduced KOSA, and he said that platforms are already resisting research findings showing that kids can become addicted to platforms that feed them harmful content.

"I am frustrated and infuriated that the tech industry often refuses to even acknowledge that there is a problem—companies like TikTok and Meta wave away compelling and alarming research on the harm their products do to kids while doubling down on efforts to recruit younger users," Blumenthal told Ars. "The further away we move from Frances Haugen's groundbreaking disclosures, the easier it is for Big Tech to slip back into old talking points denying and distracting from their role in the youth mental health crisis."

Blumenthal suggested that this is one reason why KOSA must urgently pass, with the duty of care provision remaining at its core.

"Transparency and independent child safety research is crucially important but not enough on its own," Blumenthal told Ars. "We've seen Big Tech's indifference to public shaming and how it prioritizes its financial bottom line, which is why transparency measures must also be paired with serious safeguards and an obligation to act to protect children."