Fight for the Future releases draft of proposed social media legislation that protects ALL kids from Big Tech
Massachusetts-based digital rights group Fight for the Future has released a working draft of a model bill that it is offering as an alternative to the well-intentioned legislation put forth by Governor Maura Healey and the Massachusetts House to address the harms of social media for young people.
“We agree with Massachusetts lawmakers that we need to rein in Big Tech and stop them from hurting our kids and our democracy. We want to help them do that without throwing LGBTQ+ kids and undocumented families under the bus,” said Evan Greer (she/her), Boston-based director of Fight for the Future. “Massachusetts has a chance to lead the nation by passing the strongest protections against Big Tech abuses anywhere in the country. We’re grateful that the Governor, the Attorney General’s office, and Massachusetts legislators have agreed to meet with us. We don’t just want to tell them why their current bills are dangerous and unworkable; we want to give them concrete alternatives that accomplish the same goals. We all want the same thing: to crack down on large social media companies.”
A chorus of experts and a growing coalition of LGBTQ+, human rights, civil liberties and racial justice organizations have raised the alarm that, as written, the existing proposals would do more harm than good.
Fight for the Future’s model bill combines the most workable parts of the Governor’s proposal and the House bill H. 5366, while replacing unconstitutional and dangerous age verification and parental consent provisions with strong protections for all users: turning off, by default, surveillance-driven algorithms, autoplay, infinite scroll, and location sharing.
Toplines
- Ensures protections for ALL children, including LGBTQ+ and other vulnerable youth
- Reins in Big Tech by requiring them to make their platforms safer for all users, not just minors
- Fixes a drafting error that defined “social media” so broadly it included small projects like Bluesky, GitHub, the Trevor Project and Wikipedia
- Adds a pilot program, as suggested by the Student Mental Health coalition, to educate young people on safe and healthy internet practices
- Forces big tech companies to disclose what information they are collecting on all users, not just minors, and how they are using it to change their algorithms
- Adds a private right of action to aid in enforcement of the newly created user rights
- Removes unworkable and unconstitutional age verification language, protecting the legislation from First Amendment challenges
- Replaces unworkable parental consent provisions with easy-to-use tools for parents and kids that won’t require parents to upload sensitive documents
Notes
- Fight for the Future will not oppose the inclusion of the cellphone ban as written in the Senate bill
- Fight for the Future strongly supports the inclusion of the Massachusetts Data Privacy Act
Proposed Massachusetts Social Media Safety Act
The General Laws are hereby further amended by inserting after chapter 93L the following chapter:-
Chapter 93M: ONLINE PROTECTION
Section 1.
As used in this chapter, the following words shall, unless the context clearly requires otherwise, have the following meanings:-
“Account”, a unique profile for a user of a social media platform.
“Addictive social media feed”, the presentation of content to users where the social media platform employs algorithms that analyze user data or information on users to select content for users and includes any of the following addictive features:
(i) infinite scrolling, which shall include: (A) continuously loading content, or content that loads as the user scrolls down the page without the need to open a separate page; and (B) seamless content, or the use of pages with no visible or apparent end or page breaks;
(ii) push notifications or alerts sent by the social media platform to inform a user about specific activities or events related to the user’s account;
(iii) displays of personal interactive metrics that indicate the number of times other users have clicked a button to indicate their reaction to a user’s content or have shared or reposted the user’s content;
(iv) content generated by an algorithm based on a user’s activity on the social media platform; or
(v) auto-play video or video that begins to play without the user first clicking on the video or on a play button for that video.
“Algorithmic Process”, a computational process, including one derived from machine learning or other artificial intelligence techniques, that processes personal information or other data for the purpose of determining the order or manner that a set of information is provided, recommended to, or withheld from a user of a social media platform, including the provision of commercial content, the display of social media posts, or any other method of automated decision making, content selection, or content amplification.
“Algorithmic ranking system”, an automated computational process, including a process derived from algorithmic decision making, machine learning, statistical analysis or other data processing or artificial intelligence techniques, used to determine the selection, order, relative prioritization or relative prominence of content to be recommended or displayed to a user based, in whole or in part, on information associated with the user, the user’s device or the user’s previous interactions with content shared by other users.
“Autoplay”, a feature of a social media feed or landing page where content is automatically and continuously played in a social media feed without any manual input from a user.
“Content”, text, image, audio or video created, shared or accessed through a social media platform.
“Connected account”, an account directly connected to another account by an affirmative request by 1 user and an affirmative confirmation by another user.
“Educational technology platform”, a software application or web-based technology, including but not limited to Learning Management Systems (LMS), designed to provide school-home communication, educational information, experiences, training or instruction to build knowledge, skills or a craft, provided that, for purposes of this chapter: (i) the software application or web-based technology is approved by the school district for the purpose of communicating with parents or for conveying educational content to students; (ii) the school district complies with the Family Educational Rights and Privacy Act of 1974 (FERPA), 20 U.S.C. 1232g, and its implementing regulations, 34 C.F.R. Part 99, in its use of any software application or web-based technology; and (iii) the school district has an executed student data privacy agreement governing the use of any software application or web-based technology that collects student data that includes a requirement that the software application or web-based technology complies with FERPA, 20 U.S.C. 1232g and 34 C.F.R. Part 99.
“Infinite scroll”, a feature of a social media feed or landing page that provides an automatically and continuously loading social media feed or landing page where additional content displays at the bottom of such feed or landing page without any manual input from a user.
“Minor”, a user or prospective user who is under 18 years of age.
“Parent”, a parent or legal guardian.
“Precise geolocation data”, information derived from technology, including, but not limited to, latitude and longitude coordinates from global positioning system mechanisms or other similar positional data, that reveals the past or present physical location of an individual or device that identifies or is linked or reasonably linkable to 1 or more individuals with precision and accuracy within a radius of 1,750 feet.
“Push notification”, an automatic electronic message displayed on a user’s personal electronic device, as defined in section 40 of chapter 69, when the social media platform is not actively open or visible on the personal electronic device that prompts the user to check and engage with the social media platform.
“Social media feed”, the presentation of content to users of a social media platform.
“Social media platform”, a public or semi-public website, online service, online application or mobile application that primarily serves as a medium for displaying content generated by users through a social media feed, and that allows users to create an account or profile to post, share, view and interact with user-generated content; provided, however, that the following shall not be included: (i) email, SMS, MMS, RCS or similar text messaging telecommunications; (ii) cloud storage services or document viewing, sharing or collaboration services; (iii) an educational technology platform; (iv) platforms organized as a not-for-profit; (v) open source software-developing and -sharing platforms; and (vi) platforms with fewer than 100,000,000 monthly global active users or less than $1,000,000,000 in gross revenue per year.
“User”, a person who accesses or uses a social media platform by establishing an account or profile, or seeks to establish such an account or profile, and who the social media platform has actual knowledge is a Massachusetts resident.
“User-directed feed”, a social media feed in which the content presented has been recommended, selected or prioritized for display based solely on the user’s expressly selected preferences, including user-directed algorithms, content from connected accounts, content the user has subscribed to or content presented in response to a specific search inquiry by the user.
Section 2.
(a) A social media platform shall set default settings for all users to ensure a heightened level of privacy and to limit the use of features that prolong a user’s engagement with the social media platform. The default settings shall include, but not be limited to:
(i) restricting the visibility of the user’s account to only connected accounts;
(ii) disabling the visibility or sharing of the user’s precise geolocation data with other users;
(iii) limiting the user to only sharing content with connected accounts;
(iv) limiting the user to only direct messaging with connected accounts;
(v) presenting or displaying only a user-directed feed;
(vi) disabling all of the features of an addictive social media feed as defined in section 1;
(vii) disabling notifications to the user concerning a social media feed between the hours of 10:00 p.m. and 7:00 a.m. and, upon actual knowledge that an account belongs to a minor user, additionally during hours when school is typically in session as reported by the governor’s office;
(viii) upon actual knowledge that an account belongs to a minor user, restricting a minor user from accessing the social media platform between the hours of 10:00 p.m. and 7:00 a.m. and during hours when school is typically in session as reported by the governor’s office;
(ix) providing a clear and conspicuous reminder to the user after accessing the social media platform for more than 1 hour of use, and every 30 minutes thereafter, provided that the social media platform shall require the user to acknowledge the reminder before proceeding to use the social media platform; and
(x) upon actual knowledge that an account belongs to a minor user, prohibiting the minor user from accessing constitutionally unprotected explicit content, such as pornography or obscenity, on a social media feed to the extent that the unprotected content is known to the social media platform.
(b) The default settings provided in subsection (a) for all users of a social media platform may be changed pursuant to section 6.
(c) The default settings provided in clause (ii) of subsection (a) for a user of a social media platform may be adjustable in a manner that allows the sharing of the user’s precise geolocation data with only user-selected individual connected accounts.
(d) A social media platform shall restrict from public visibility a user’s account within 1 hour of receiving a request for a restriction by the user and shall delete a user’s account within 3 days of receiving a request for a deletion by the user. Any restriction or deletion pursuant to this subsection shall include all information and material made publicly available by the user on the social media platform. Upon deletion of a user’s account, the social media platform shall permanently delete all personal information held by the social media platform related to the terminated user. Nothing in this subsection shall require a social media platform to contravene any federal or state law or regulation or require a social media platform to delete information subject to a law enforcement investigation.
(e) A social media platform shall provide a conspicuous tool with each item of content to allow a user the ability to flag or otherwise indicate that the user found the content to be unwanted or harmful.
(f) A social media platform shall provide a conspicuous tool that enables a user to reset the algorithmic ranking system applied to the user’s social media feed such that it clears the learned recommendation profile based on the user’s previous interactions with content.
(g) A social media platform shall present clear and conspicuous warnings on the negative effects of social media use on social, emotional and physical health in at least the following circumstances: upon the activation of a user’s account, provided that the social media platform shall require the user to acknowledge the warning before proceeding to use the social media platform; and, upon actual knowledge that an account belongs to a minor user, whenever a minor seeks to adjust the default settings of a social media platform pursuant to subsection (b), provided that the social media platform shall require the minor to acknowledge the warning before proceeding to adjust the default settings.
The attorney general may, in consultation with the department of public health, the department of mental health and the department of elementary and secondary education, promulgate regulations setting forth the text and manner of presenting such warnings taking into consideration medical and sociological research, including from government publications and peer-reviewed scholarly articles.
(h) No social media platform shall withhold, degrade, lower the quality or increase the price of any product, service or feature to a user due to the social media platform not being permitted to provide an addictive social media feed to the user.
(i) Nothing in this chapter shall be construed as preventing any action taken in good faith to restrict access to or availability of content that a social media platform considers to be obscene, lewd, lascivious, excessively violent, harassing or otherwise objectionable or harmful content, whether or not such content is constitutionally protected. Nothing in this chapter shall be construed as preventing a social media platform from engaging in content moderation, including the removal of spam, scams, phishing attempts, and illegal materials.
Section 3.
(a) A social media platform shall clearly and conspicuously post de-identified aggregate data on minors’ use of the social media platform on its website on at least a quarterly basis. Such data shall include but not be limited to: (i) the number of minors who use the platform, broken down by age or age range; (ii) the amount of time minor users spend on the platform, broken down by age or age range; and (iii) the frequency and type of modification of default settings for social media accounts used by minors. The attorney general may promulgate regulations requiring the reporting of additional de-identified aggregate data about minors’ use of social media platforms. Nothing in this subsection shall be interpreted to require companies to collect data they are not already collecting in their ordinary course of business.
(b) Every 30 days, the social media platform shall survey users whom it has actual knowledge to be minors to determine whether, and to what extent, each minor user has experienced unwanted or harmful activity on the social media platform. The social media platform shall make available de-identified aggregate data on the results of these surveys and on the flagging of unwanted or harmful content pursuant to subsection (e) of section 2 on its website on at least a quarterly basis.
(c) The attorney general shall, in consultation with security researchers and experts, ensure that such data releases are not vulnerable to re-identification or other security vulnerabilities, in order to protect the covered minors.
Section 4.
The Department of Elementary and Secondary Education shall establish and operate a [three] year pilot program starting on [date] to support Massachusetts public school districts in implementing evidence-informed harm-reduction educational programming focused on the mental health impacts and responsible use of technology. Programming shall include, but not be limited to, instruction and supports addressing technology addiction, short-form and algorithm-driven media, social media use, video games, artificial intelligence, large language models, chatbots, and emerging digital tools, and shall emphasize student well-being, safe use, critical consumption, and responsible decision-making. The Department shall deliver a report on [date] outlining the plans for such program, including any budgetary outlays necessary to accomplish its objectives.
Section 5.
(a) With respect to each type of algorithmic process utilized by a social media platform, such social media platform shall disclose the following information to users of the social media platform in conspicuous, accessible, and plain language that is not misleading:
(i) The categories of personal information the social media platform collects or creates for the purposes of the type of algorithmic process.
(ii) The manner in which the social media platform collects or creates such personal information.
(iii) How the social media platform uses such personal information in the type of algorithmic process.
(iv) The method by which the type of algorithmic process prioritizes, assigns weight to, or ranks different categories of personal information to withhold, amplify, recommend, or promote content (including a group) to a user.
(b) Such social media platform shall make available the notice described in subsection (a) in each language in which the social media platform provides service.
Section 6.
(a) A social media platform shall establish a mechanism by which a user may adjust the default settings pursuant to subsection (b) of section 2, and shall provide the option to password-lock those settings. To the extent possible, a social media platform shall offer granular settings so that a user can personalize their experience and privacy. The attorney general may promulgate guidelines necessary to balance user choice and usability in establishing these settings.
(b) Nothing in this chapter shall be construed as requiring a social media platform to provide a parent any additional or special access to or control over the data or accounts of their minor user child.
Section 7.
(a) A violation by a social media platform of this chapter shall be deemed an unfair or deceptive act or practice in trade or commerce under chapter 93A.
(b) A social media platform found to be in violation of section 2 shall be punished by a civil fine of not more than $5,000 per violation; provided, that a social media platform shall be in violation of section 2 for each user account not in compliance with section 2.
(c) A social media platform found to be in violation of section 3 shall be punished by a civil fine of not more than $1,000,000; provided, that each day that a violation of section 3 persists shall be considered a separate violation under this section.
Section 8.
(a) A person may bring a civil action against a social media platform for a violation of subsections (a) through (h) of section 2, or a regulation promulgated thereunder, in an appropriate Massachusetts district court.
(b) In a civil action brought under subsection (a) in which the plaintiff prevails, the court may award the plaintiff—
(i) an amount equal to the sum of any actual damages;
(ii) injunctive relief;
(iii) declaratory relief; and
(iv) reasonable attorney fees and litigation costs.
Section 9.
The attorney general may promulgate regulations necessary to effectuate the purposes of this chapter.