For immediate release: February 15, 2024


Today, the sponsors of the Kids Online Safety Act (KOSA) released a new version of the bill and announced a number of new cosponsors. Digital rights group Fight for the Future, which has helped mobilize hundreds of thousands of young people to oppose dangerous provisions in the bill, issued the following statement, which can be attributed to the group’s director, Evan Greer (she/her):

“Fight for the Future and the coalition of dozens of human rights, LGBTQ, racial justice, and civil liberties groups that have been raising concerns about KOSA for months are still assessing the latest version of the bill. Unfortunately, the bill’s sponsors did not include us in discussions about changes and would not share the text with us despite repeated requests.

Based on our initial read, we are glad to see the attorney general enforcement provisions narrowed. We agree that this will somewhat reduce the immediate likelihood of KOSA being weaponized by politically motivated AGs to target content that they don’t like. We also appreciate the apparent attempt to limit the Duty of Care to design features. However, by not clarifying that the Duty of Care applies only in a content-neutral manner, as we have asked, the bill still invites the harms that we’ve warned about.

As we have said for months, the fundamental problem with KOSA is that its duty of care covers content-specific aspects of content recommendation systems, and the new changes fail to address that. In fact, personalized recommendation systems are explicitly listed under the definition of a design feature covered by the duty of care. This means that a future Federal Trade Commission (FTC) could still use KOSA to pressure platforms into automated filtering of important but controversial topics like LGBTQ issues and abortion, by claiming that algorithmically recommending that content “causes” mental health outcomes covered by the duty of care, like anxiety and depression.

It’s important to remember that algorithmic recommendation includes, for example, showing a user a post from a friend they follow, since most platforms do not show all users all posts, but curate them in some way. As long as KOSA’s duty of care covers content recommendation systems without a limitation ensuring it can be applied only to content-neutral design choices, platforms will likely react the same way they did to the broad liability imposed by SESTA/FOSTA: by engaging in aggressive filtering and suppression of important, and in some cases lifesaving, content.

Because the duty of care still covers content recommendation in this way, we unfortunately must remain opposed to KOSA at this time: we still believe it would do more harm than good unless it is amended further. We refuse to accept that trans youth and human rights must be collateral damage in the fight to keep kids safe online. We will continue to urge Senate leaders to make additional changes to KOSA to clarify that the duty of care applies only to content-agnostic design features, like autoplay, infinite scroll, and notifications, rather than to how content is ordered or displayed. We will also redouble our efforts in the House to make sure that if this bill moves forward, it is amended to ensure it will protect all kids, rather than endangering some of the most vulnerable.”