KOSA doesn’t keep kids safe (and won’t hold up in court). Let’s do something better.
The Kids Online Safety Act’s First Amendment Problems
Fight for the Future has opposed the Kids Online Safety Act’s (“KOSA”) Duty of Care because we believe it makes kids less safe and would harm some of the most vulnerable, given its likely effect of restricting speech and access to information for politically targeted and marginalized communities. This same flaw in the Duty of Care also makes KOSA vulnerable to a First Amendment challenge, meaning that the emphasis on a Duty of Care model is not only misguided but also likely in vain. Recently, state laws that KOSA is modeled after have been challenged under the First Amendment and struck down. This shows that relying on a flawed Duty of Care model to hold Big Tech accountable is as ineffective as it is dangerous.
KOSA’s Impacts on Speech
KOSA’s most obvious predecessor is SESTA/FOSTA, a Trump-era bill that its supporters claimed would clamp down on online sex trafficking. Instead, the law did almost nothing to accomplish that goal and has actively harmed LGBTQ+ people and sex workers, whose harm-reduction resources were decimated by the subsequent crackdown on speech. KOSA has the same problem as SESTA/FOSTA: social media companies will be incentivized to overreact and over-police politically targeted communities, content, and resources. Big Tech will respond to KOSA’s Duty of Care model in the cheapest way possible, and that will likely mean removing or shadow-banning any content that they think enforcers would find objectionable, regardless of context, purpose, or value to certain communities.
NetChoice v. Bonta (9th Circuit)
The Ninth Circuit held that the provisions in California’s Age Appropriate Design Code (“CAADC”) that resemble KOSA’s duty of care most likely facially violate the First Amendment. Like KOSA, the CAADC requires companies to create reports on whether their products, designs, and algorithms could harm kids and to mitigate or eliminate the risks associated with those harms. The Ninth Circuit found not only that mandating the creation of these reports likely violates the First Amendment, but also that the requirement to police content “deputizes covered businesses into serving as censors for the State” because it requires private actors to “determin[e] whether material is suitable for kids.” The court rejected the State’s argument that the law regulates design rather than content, finding that the design regulations “require consideration of content or proxies for content.”
Importantly, the court states:
[A] business cannot assess the likelihood that a child will be exposed to harmful or potentially harmful materials on its platform without first determining what constitutes harmful or potentially harmful material. To take the State’s own example, data profiling may cause a student who conducts research for a school project about eating disorders to see additional content about eating disorders. Unless the business assesses whether that additional content is “harmful or potentially harmful” to children (and thus opines on what sort of eating disorder content is harmful), it cannot determine whether that additional content poses a “risk of material detriment to children” under the CAADCA. Nor can a business take steps to “mitigate” the risk that children will view harmful or potentially harmful content if it has not identified what content should be blocked.
Texas HB 18 (Western District of Texas)
The U.S. District Court for the Western District of Texas ruled that HB 18, another bill similar to KOSA, violates the First Amendment. HB 18 covers a similar list of harms but contains a more explicit monitoring-and-filtering requirement. KOSA’s duty of care carries an implicit monitoring-and-filtering requirement, since companies must examine content in order to prevent and mitigate the listed harms. That difference is unlikely to save KOSA.
The court found persuasive many of the arguments that are likely to be leveled at KOSA in a First Amendment challenge. For example, the court found that “[t]erms like ‘promoting,’ […] ‘substance abuse,’ ‘harassment,’ and ‘grooming’ are undefined, despite their potential wide breadth and politically charged nature.” The court also found the law underinclusive because it restricts certain content on some platforms while teenagers retain access to the same content elsewhere. As the court put it, “[a] teenager can read Peter Singer advocate for physician-assisted suicide in Practical Ethics on Google Books but cannot watch his lectures on YouTube or potentially even review the same book on Goodreads.” The court concluded that the net result of the bill is to prohibit “minors from participating in the democratic exchange of views online,” and noted that “[t]he Supreme Court has repeatedly emphasized that ‘[s]peech that is neither obscene as to youths nor subject to some other legitimate proscription cannot be suppressed solely to protect the young from ideas or images that a legislative body thinks unsuitable for them.’”
Section 230 will not protect against harms
Some defenders of KOSA have argued that these content-based harms will not come to pass because of the protections offered by Section 230. However, the Third Circuit recently ruled that content served via a recommendation algorithm, which describes almost all content viewed on the Internet today, does not receive protection under Section 230. That ruling appears to follow the Ninth Circuit’s trend of narrowing Section 230.
We are wasting energy on a bill that will censor important, life-saving content that won’t even pass legal muster.
Congress has spent months pushing KOSA when we could have been focused on passing comprehensive privacy and antitrust laws that have consensus among the communities that want to fight back against Big Tech. Not only will KOSA do damage to LGBTQ youth and kids at large, but the courts have shown that the duty of care model will not hold up. We need to stop wasting time and get serious about this issue. KOSA is not the answer.
The following statement can be attributed to Sarah Philips (she/they), campaigner at Fight for the Future:
“The dozens of reproductive justice, LGBTQ, and digital rights groups that have steadfastly opposed KOSA maintain that this bill will censor LGBTQ and abortion content, further endangering marginalized youth instead of keeping them safe. Despite hundreds of thousands of youth emailing and writing to Congress, these concerns have been largely ignored. Now, the courts are saying the same: this type of legislation will lead to censorship. We could be using this time to fight for digital privacy, anti-trust legislation, and common-sense fixes that we have consensus over, but instead we are wasting time fighting over a piece of legislation that won’t survive legal challenge. If this is really about protecting kids, we need to do better by them, particularly marginalized kids.”