Tracking state legislation affecting social media platforms
102 Total
24 Signed Into Law
65 In Progress
13 Failed / Vetoed
Alaska — HB318 (2026)
Introduced
Social Media/minors
REFERRED TO JUDICIARY — Feb 18, 2026
Age Estimation (non-ID) · Default Restrictive Privacy for Minors · Enforcement through Attorney General · Mandatory Disclosure of Policies · Notification Time Restrictions · Prohibit "Addictive Design Features" · Prohibit "Dark Patterns" · Restrict Targeted Advertising to Minors · School-Hours Restrictions
Notes: This Alaska bill requires social media platforms to disable "addictive design features" (like infinite scroll and autoplay video) by default for known minors, block notifications during nighttime and school hours, ban targeted advertising to minors, and limit data collection.
Alaska — HB47 (2025)
Passed One Chamber
Chld Sex Abuse Mat; Gen Images; Soc Media
YUNDT, CRONK, KAUFMAN — Apr 13, 2026
Account Termination on Parental Request · Age Verification (Gov't ID) · Easy Account Deletion for Minors · Enforcement through Attorney General · Enforcement through private right of action · Notification Time Restrictions · Parental Access to Account/Activity · Parental Consent for Account Creation · Parental Control over Settings · Prohibit "Addictive Design Features" · Restrict Algorithmic/Personalized Feeds · Restrict Targeted Advertising to Minors
Notes: As introduced, this bill made a number of sensible changes to the state's laws on child sex abuse material (CSAM), making it easier to treat computer-generated material identically to actual CSAM. A later amendment, however, added several provisions of the model AADC in abbreviated form, going well beyond updating the state's CSAM possession and distribution laws and imposing additional requirements for age verification and for parental consent and control over the accounts of any user under 18.
Alaska — SB 262 (2026)
Introduced
Social Media/minors
REFERRED TO LABOR & COMMERCE — Feb 23, 2026
Minimum Age Ban
Notes: Requires covered sites to completely prevent anyone under 16 from creating an account and to terminate the accounts of all existing under-16 users.
Alabama — HB171 (2026)
Failed
Social media; certain media feeds lacking age verification, Attorney General authorized to enforce
Pending House Children and Senior Advocacy — Jan 13, 2026
Age Estimation (non-ID) · Enforcement through Attorney General · Enforcement through private right of action · Mandatory Disclosure of Policies · Notification Time Restrictions · Parental Consent for Account Creation · Prohibit "Addictive Design Features" · Restrict Algorithmic/Personalized Feeds
Notes: This Alabama bill targeted social media platforms that use personalized algorithmic feeds (termed "addictive feeds") by requiring age verification or parental consent before minors can access such feeds, banning notifications to minors between midnight and 6 a.m., and mandating transparency about how algorithms prioritize content. The legislative session ended March 27, 2026 without the bill advancing.
Arkansas — SB611 (2025)
Signed Into Law · Partially Enjoined
To Amend The Social Media Safety Act.
Notification that SB611 is now Act 900 — Apr 22, 2025
Age Verification (Gov't ID) · Default Restrictive Privacy for Minors · Enforcement through Attorney General · Enforcement through private right of action · Mandatory Third-Party Audits · Minimum Age Ban · Notification Time Restrictions · Parental Access to Account/Activity · Parental Consent for Account Creation · Parental Control over Settings · Prohibit "Addictive Design Features" · Restrict Algorithmic/Personalized Feeds · Restrict Targeted Advertising to Minors
Notes: Enacted in 2025 as Acts 900 and 901. Lawsuit filed by NetChoice: NetChoice v. Griffin, 5:25-cv-05140 (W.D. Ark.). The district court issued a temporary injunction against Act 901 on 15 December 2025; a request for a temporary injunction against Act 900 is still pending.
Action Alert · Arizona — HB2991 (2026)
Passed One Chamber
Social media; online content; minors
Senate minority caucus: Do pass — Apr 7, 2026
Account Termination on Parental Request · Age Verification (Gov't ID) · Content-Specific Restrictions · Easy Account Deletion for Minors · Minimum Age Ban · Parental Consent for Account Creation · Prohibit "Addictive Design Features"
Notes: Requires covered sites to age-verify all users, terminate accounts of users under 14, require parental consent for 14-15 year olds, and prevent access to material "harmful to minors" for under-18 accounts. The "harmful to minors" definition incorporates by reference criminal statute 13-3501, which explicitly calls out "homosexuality" as inherently "sexual conduct".
Action Alert · Arizona — SB1747 (2026)
Passed One Chamber
Social media; online content; minors.
House COM Committee action: do pass amended/strike-everything, voting: (8-0-2-2-0-0) — Mar 24, 2026
Account Termination on Parental Request · Age Verification (Gov't ID) · Content-Specific Restrictions · Easy Account Deletion for Minors · Minimum Age Ban · Parental Consent for Account Creation · Prohibit "Addictive Design Features" · Restrict Algorithmic/Personalized Feeds
Notes: Companion bill to HB 2991. Requires covered sites to age-verify all users, terminate accounts of users under 14, require parental consent for 14-15 year olds, and prevent access to material "harmful to minors" for under-18 accounts. The "harmful to minors" definition incorporates by reference criminal statute 13-3501, which explicitly calls out "homosexuality" as inherently "sexual conduct".
California — AB1700 (2026)
Introduced
e-Safety Commission: youth online protection.
Re-referred to Com. on P. & C.P. — Mar 23, 2026
Age Estimation (non-ID) · Age Verification (Gov't ID) · Minimum Age Ban
Notes: This bill creates a new California e-Safety Commission responsible for developing age compliance guidelines, reviewing age verification technologies, and investigating noncompliance by online services subject to minimum age laws. It sets the minimum age at 16. The bill defines "covered entity" very broadly as "a person or organization providing online services subject to a law that pertains to the minimum age".
California — AB1709 (2026)
Introduced
Covered platforms: age restriction: e-Safety Advisory Commission.
From committee: Do pass and re-refer to Com. on JUD. (Ayes 13. Noes 1.) (April 16). Re-referred to Com. on JUD. — Apr 16, 2026
Enforcement through Attorney General · Minimum Age Ban
Notes: This began as a placeholder (intent) bill that simply declared the California Legislature's intention to enact a minimum age requirement for opening or maintaining a social media account. The intent bill has now been amended to add the intended age threshold (16).
California — AB1946 (2026)
Introduced
Reporting mechanism: child sexual abuse material.
From committee: Do pass and re-refer to Com. on JUD. (Ayes 14. Noes 1.) (April 16). Re-referred to Com. on JUD. — Apr 16, 2026
Enforcement through private right of action · Mandatory Third-Party Audits
Notes: This bill increases the statutory damages that social media platforms must pay when they knowingly facilitate, aid, or abet the commercial sexual exploitation of minors, raising the floor from $1 million to $1.5 million and the cap from $4 million to $4.5 million per act. It applies to platforms as defined in existing California law (Business and Professions Code Section 22675).
California — AB2 (2024)
Passed One Chamber
Injuries to children: civil penalties.
In committee: Hearing postponed by committee. — Jul 14, 2025
Duty of Care Provisions · Enforcement through private right of action
Notes: This bill would allow children (or their families) to sue social media platforms for statutory damages, up to $1 million per child or triple actual damages, when a platform fails to exercise ordinary care toward a child.
California — AB2273 (2022)
Signed Into Law · Partially Enjoined
The California Age-Appropriate Design Code Act.
Chaptered by Secretary of State - Chapter 320, Statutes of 2022. — Sep 15, 2022
Age Estimation (non-ID) · Data Protection Impact Assessments · Default Restrictive Privacy for Minors · Prohibit "Dark Patterns"
NetChoice, LLC v. Bonta, 5:22-cv-08861, (N.D. Cal.)
Notes: Enacted in 2022. Lawsuit filed by NetChoice: NetChoice, LLC v. Bonta, 5:22-cv-08861 (N.D. Cal.). Most provisions are currently under a temporary injunction, although several appellate rulings have allowed some provisions to be enforced.
California — AB2426 (2026)
Introduced
Online platforms: educational children’s content.
In committee: Set, first hearing. Hearing canceled at the request of author. — Apr 9, 2026
Content-Specific Restrictions · Mandatory Disclosure of Policies · Mandatory Third-Party Audits · Restrict Targeted Advertising to Minors
Notes: This California bill would require large online video platforms (those with over $100 million in annual revenue) to create a dedicated "walled garden" section featuring at least four hours of educational children's content, free from targeted advertising and accessible without an account.
California — SB976 (2024)
Signed Into Law · Partially Enjoined
Protecting Our Kids from Social Media Addiction Act.
Chaptered by Secretary of State. Chapter 321, Statutes of 2024. — Sep 20, 2024
Age Estimation (non-ID) · Default Restrictive Privacy for Minors · Notification Time Restrictions · Parental Consent for Account Creation · Parental Control over Settings · Prohibit "Addictive Design Features" · Restrict Algorithmic/Personalized Feeds · School-Hours Restrictions · Time Tracking / Usage Limits
NetChoice v. Bonta, 5:24-cv-07885, (N.D. Cal.)
Notes: This California law (SB 976) prohibits social media and similar platforms from serving algorithmically personalized content feeds ("addictive feeds") to minors without verified parental consent, restricts notifications during nighttime and school hours, and requires default-on parental controls including a one-hour daily time limit and private account settings. The law applies broadly to any internet-based service or app that provides a personalized feed as a significant part of its service (not just traditional social media), which means smaller or independent platforms with algorithmic recommendations could also be covered. Enforcement is exclusively through the California Attorney General.
Colorado — HB1136 (2024)
Signed Into Law · Enjoined
Healthier Social Media Use by Youth
Governor Signed — Jun 6, 2024
Notification Time Restrictions · State-Mandated Warning Displays · Time Tracking / Usage Limits
NetChoice v. Weiser, 1:25-cv-02538, (D. Colo.)
Notes: This Colorado law requires social media platforms to show health-related warnings and usage notifications to users under 18, either through a custom engagement-information tool or through pop-up notifications triggered after one hour of daily use or during nighttime hours (10 PM–6 AM). It also directs the state Department of Education to create a free resource bank on social media's mental health impacts for schools. Enacted in 2024. Lawsuit filed by NetChoice: NetChoice v. Weiser, 1:25-cv-02538 (D. Colo.). The district court issued a temporary injunction on 6 Nov 2025, and Colorado appealed to the Tenth Circuit: NetChoice v. Weiser, 25-1456 (10th Cir.).
Colorado — HB1148 (2026)
Failed
Protections for Youth on Social Media
House Committee on Judiciary Postpone Indefinitely — Apr 7, 2026
Age Estimation (non-ID) · Block Adult-Minor Contact · Content-Specific Restrictions · Default Restrictive Privacy for Minors · Duty of Care Provisions · Easy Account Deletion for Minors · Enforcement through Attorney General · Mandatory Disclosure of Policies · Notification Time Restrictions · Prohibit "Addictive Design Features" · Prohibit "Dark Patterns" · Restrict Algorithmic/Personalized Feeds
Notes: This bill combines restrictions on minors' use of online gaming sites that allow microtransactions with restrictions on social media in general, using the definition of "social media" from the now-enjoined 2024 HB 1136.
Action Alert · Connecticut — HB05037 (2026)
Introduced
An Act Promoting The Safety Of Minors On Social Media Platforms.
File Number 179 — Mar 26, 2026
Age Estimation (non-ID) · Block Adult-Minor Contact · Content-Specific Restrictions · Default Restrictive Privacy for Minors · Notification Time Restrictions · Parental Consent for Account Creation · Parental Control over Settings · Restrict Algorithmic/Personalized Feeds · State-Mandated Warning Displays · Time Tracking / Usage Limits
Notes: Requires "commercially reasonable" age determination for users under 18 and parental consent for many features. Mandates a non-dismissable warning screen covering 75% of the screen for 30 seconds (repeated every 3 hours at 25% of the screen for 10 seconds), a default one-hour daily time limit for under-18 users, and no notifications between 9 PM and 8 AM without parental approval.
Florida — HB 3 (2024)
Signed Into Law · In Effect
Online Protections for Minors
Chapter No. 2024-42 — Mar 25, 2024
Account Termination on Parental Request · Age Estimation (non-ID) · Age Verification (Gov't ID) · Easy Account Deletion for Minors · Minimum Age Ban · Parental Consent for Account Creation · Prohibit "Addictive Design Features"
COMPUTER & COMMUNICATIONS INDUSTRY ASSOCIATION v. UTHMEIER, 4:24-cv-00438, (N.D. Fla.)
Notes: This Florida law bans children under 14 from having social media accounts entirely and requires parental consent for 14-15 year olds. It also requires websites with substantial "material harmful to minors", defined as more than 1/3 of the content on the site, to verify that users are 18 or older. "Material harmful to minors" is defined with a modified obscenity test derived from Miller v. California, 413 U.S. 15 (1973). The district court issued a preliminary injunction in the lawsuit, CCIA & NetChoice v. Uthmeier, 4:24-cv-00438 (N.D. Fla.), on 3 June 2025, but Florida appealed to the 11th Circuit. On 25 November 2025, the 11th Circuit held in a 2-1 opinion that the district court erred in applying strict scrutiny and should have applied the less stringent intermediate scrutiny standard. It stayed the injunction, allowing the state to begin enforcing the law.
Florida — S 7072 (2021)
Signed Into Law · Enjoined
Social Media Platforms
Chapter No. 2021-32, companion bill(s) passed, see SB 7074 (Ch. 2021-33) — May 25, 2021
Content-Specific Restrictions
NETCHOICE LLC v. UTHMEIER, 4:21-cv-00220, (N.D. Fla.)
Notes: Prohibits social media platforms from "deplatforming" candidates for office, defined as to "permanently delete or ban a user or to temporarily delete or ban a user from the social media platform for more than 14 days". The preliminary injunction went to the Supreme Court as Moody v. NetChoice, LLC and NetChoice, LLC v. Paxton, 603 U.S. 707 (2024). The Supreme Court upheld the preliminary injunction and remanded for further development of the record on the First Amendment issues involved.
Georgia — SB343 (2025)
Failed
Age Verification of Account Holders; providers of social media platforms from permitting a minor aged 14 years or younger to be an account holder; prohibit
Senate Read and Referred — Mar 20, 2025
Age Verification (Gov't ID) · Minimum Age Ban · Parental Consent for Account Creation
Notes: Mandates age verification, bans social media use by users under 14, and requires parental consent for users between 14 and 18.
Georgia — SB351 (2024)
Signed Into Law · Enjoined
"Protecting Georgia's Children on Social Media Act of 2024"; enact
Act 463 — Apr 23, 2024
Age Estimation (non-ID) · Age Verification (Gov't ID) · Parental Access to Account/Activity · Parental Consent for Account Creation · Restrict Targeted Advertising to Minors
NetChoice v. Carr, 1:25-cv-02422, (N.D. Ga.)
Notes: Enacted in 2024. Lawsuit filed by NetChoice: NetChoice v. Carr, 1:25-cv-02422 (N.D. Ga.). The district court issued a temporary injunction on 25 June 2025, and Georgia appealed to the Eleventh Circuit: NetChoice v. Attorney General, State of Georgia, 25-12436 (11th Cir.).
Georgia — SB495 (2026)
Failed
"Age-Appropriate Design Code Act"; enact
Senate Read and Referred — Feb 12, 2026
Age Estimation (non-ID) · Block Adult-Minor Contact · Default Restrictive Privacy for Minors · Easy Account Deletion for Minors · Mandatory Disclosure of Policies · Mandatory Third-Party Audits · Notification Time Restrictions · Prohibit "Addictive Design Features" · Prohibit "Dark Patterns" · Restrict Algorithmic/Personalized Feeds
Notes: Adapted from the model AADC. Little time remains in the session, and the bill is not under active consideration.
Hawaii — SB2761 (2026)
Passed One Chamber
Relating To Social Media.
The committee(s) on JHA recommend(s) that the measure be deferred. — Apr 1, 2026
Age Estimation (non-ID) · Minimum Age Ban
Notes: Prohibits all users under 16 from having accounts on any "social media" service, with an extremely broad definition covering any website accepting user-generated content displayed publicly. Does not mandate age verification; self-attested age is sufficient. The committee has recommended the bill be deferred.
Idaho — H0542 (2026)
Signed Into Law · Not Yet in Force
Adds to existing law to establish the Stop Harms from Addictive Social Media Act.
Reported Signed by Governor — Apr 2, 2026
Account Termination on Parental Request · Age Estimation (non-ID) · Default Restrictive Privacy for Minors · Easy Account Deletion for Minors · Parental Consent for Account Creation · Prohibit "Addictive Design Features" · Restrict Algorithmic/Personalized Feeds · Restrict Targeted Advertising to Minors · Time Tracking / Usage Limits
Notes: Requires "covered social media platforms" (those owned by a company that made at least $1 billion in advertising revenue in at least one of the past three years) to perform age estimation (not verification), collect date of birth on registration with parental consent for under-16 users, apply the most restrictive privacy defaults for under-16 users, give parents control over time limits, and terminate accounts on parental request. Effective date is July 1, 2026.
Illinois — HB4750 (2026)
Introduced
SOCIAL MEDIA-MENTAL HEALTH
Rule 19(a) / Re-referred to Rules Committee — Mar 27, 2026
Enforcement through Attorney General · State-Mandated Warning Displays · Time Tracking / Usage Limits
Notes: This Illinois bill would require all social media platforms to display a mental health warning label every time any user (not just minors) logs in, and to show a pop-up usage timer at least every 30 minutes of active use.
Illinois — HB5066 (2026)
Introduced
SOCIAL MEDIA AGE RESTRICTION
Rule 19(a) / Re-referred to Rules Committee — Mar 27, 2026
Age Estimation (non-ID) · Enforcement through Attorney General · Minimum Age Ban
Notes: This Illinois bill would ban children aged 16 and under from having their own accounts on social media platforms, requiring platforms to implement age assurance systems to enforce this restriction. The definition of "social media platform" is fairly broad, covering any public or semi-public internet service where a substantial function is social interaction.
Illinois — HB5511 (2026)
Passed One Chamber
DIGITAL AGE ASSURANCE
Added Co-Sponsor Rep. Robyn Gabel — Apr 16, 2026
Age Estimation (non-ID) · Default Restrictive Privacy for Minors · Enforcement through Attorney General · Notification Time Restrictions · Parental Access to Account/Activity · Parental Consent for Account Creation · Parental Control over Settings · Restrict Algorithmic/Personalized Feeds
Notes: This Illinois bill would require mobile operating system providers (like Apple and Google) to collect users' ages at device setup and share age-bracket signals with app developers. Covered platforms, defined broadly as sites or apps with profiles, user-generated content, and private messaging used by minors, must use those signals to apply default privacy protections for minors, block personalized algorithmic feeds unless parents consent, and restrict late-night notifications.
Illinois — HB5561 (2026)
Introduced
PUBLIC SAFETY-SOCIAL MEDIA
Rule 19(a) / Re-referred to Rules Committee — Mar 27, 2026
Content-Specific Restrictions · Enforcement through private right of action
Notes: This bill creates the Youth Public Safety and Social Media Accountability Act, which makes it a Class A misdemeanor to coordinate or promote gatherings of 100+ minors without a permit that are expected to cause violence or property damage, including via social media. It requires social media platforms to remove such content and to carry at least $5 million in liability insurance. The definition of "social media platform" is broad, covering any Internet-based service with Illinois users whose substantial function is social interaction.
Illinois — SB3264 (2026)
Introduced
ONLINE SAFETY ACT
Rule 2-10 Committee Deadline Established As April 24, 2026 — Mar 27, 2026
Block Adult-Minor Contact · Enforcement through Attorney General · Mandatory Disclosure of Policies · Prohibit "Addictive Design Features" · Prohibit "Dark Patterns"
Notes: This Illinois bill would require social media platforms to create an online safety center with mental health resources and cyberbullying reporting information, and to establish a cyberbullying policy. It also restricts how any online service targeting minors can operate: banning manipulative consent designs (dark patterns), requiring default blocking of unsolicited adult-to-minor messages, and prohibiting design features meant to keep minors using the platform longer.
Illinois — SB3977 (2026)
Introduced
DIGITAL AGE ASSURANCE
Senate Committee Amendment No. 1 Assignments Refers to Executive — Apr 14, 2026
Age Estimation (non-ID) · Default Restrictive Privacy for Minors · Enforcement through Attorney General · Notification Time Restrictions · Parental Access to Account/Activity · Parental Consent for Account Creation · Parental Control over Settings · Restrict Algorithmic/Personalized Feeds
Notes: This Illinois bill would require operating system providers (like Apple and Google) to collect users' ages at device setup and share age-bracket signals with app developers, so social media sites can identify minors. Sites must then apply default privacy protections for minors (restricting location sharing and financial transactions), ban personalized algorithmic feeds for minors without parental consent, and block late-night notifications.
Indiana — HB1408 (2026)
Signed Into Law · Not Yet in Force
Education matters.
Public Law 100 — Mar 4, 2026
Age Estimation (non-ID) · Block Adult-Minor Contact · Default Restrictive Privacy for Minors · Enforcement through Attorney General · Parental Access to Account/Activity · Parental Consent for Account Creation · Parental Control over Settings · Prohibit "Addictive Design Features" · Restrict Algorithmic/Personalized Feeds · Restrict Targeted Advertising to Minors
Notes: Requires sites that meet the law's definitions to identify Indiana residents under 16, track their time spent on the site, obtain parental consent for account creation, and allow parents full access to the user's account.
Kansas — HB2657 (2026)
Failed
Prohibiting social media platforms from allowing children under 16 years of age to create, maintain or access an account unless the platform has obtained verified parental consent.
House Referred to Committee on Legislative Modernization — Feb 3, 2026
Account Termination on Parental Request · Easy Account Deletion for Minors · Enforcement through Attorney General · Minimum Age Ban · Parental Consent for Account Creation
Notes: This Kansas bill would ban children under 16 from having social media accounts unless the platform obtains verified parental consent. Parents can revoke consent and request account deletion at any time. The definition of "social media" is so broad that it would generally cover any site that allows user-generated content.
Kansas — SB499 (2026)
Failed
Age-Appropriate Design Code Act
Senate Referred to Committee on Federal and State Affairs — Feb 10, 2026
Age Estimation (non-ID) · Block Adult-Minor Contact · Data Protection Impact Assessments · Default Restrictive Privacy for Minors · Easy Account Deletion for Minors · Mandatory Disclosure of Policies · Mandatory Third-Party Audits · Notification Time Restrictions · Prohibit "Addictive Design Features" · Prohibit "Dark Patterns" · Restrict Algorithmic/Personalized Feeds · Time Tracking / Usage Limits
Notes: AADC-based provisions. The legislature adjourned for non-exempt bills on 27 March, but this bill's committee is exempt. Even so, the bill never received a committee hearing.
Action Alert · Kentucky — HB227 (2026)
Passed One Chamber
AN ACT relating to addictive online platforms.
returned to Judiciary (S) — Apr 1, 2026
Account Termination on Parental Request · Age Estimation (non-ID) · Default Restrictive Privacy for Minors · Easy Account Deletion for Minors · Parental Consent for Account Creation · Parental Control over Settings · Prohibit "Addictive Design Features" · Restrict Algorithmic/Personalized Feeds · Restrict Targeted Advertising to Minors
Notes: Badly written, with contradictory requirements (sometimes under-13, sometimes under-16, sometimes under-18). On the best reading: age estimation (not verification), COPPA-style parental consent for under-13 users, parental account termination for under-18 users, and possibly time limits for ages 13-18.
Kentucky — HB232 (2026)
Introduced
Youth Online Safety Act
to Small Business & Information Technology (H) — Jan 15, 2026
Age Estimation (non-ID) · Content-Specific Restrictions · Enforcement through Attorney General · Enforcement through private right of action · Notification Time Restrictions · Parental Consent for Account Creation · Restrict Algorithmic/Personalized Feeds
Notes: This Kentucky bill (the "Youth Online Safety Act") would require large social media platforms to default minors under 18 to chronological, non-algorithmic feeds unless parents consent to personalized feeds, block push notifications to minors overnight, require parental consent for in-app purchases, and mandate filtering of harmful content like self-harm and substance abuse material.
Kentucky — HB633 (2026)
Introduced
AN ACT relating to data privacy.
to Small Business & Information Technology (H) — Feb 20, 2026
Account Termination on Parental Request · Block Adult-Minor Contact · Content-Specific Restrictions · Default Restrictive Privacy for Minors · Easy Account Deletion for Minors · Notification Time Restrictions · Parental Access to Account/Activity · Parental Control over Settings · Prohibit "Addictive Design Features" · Prohibit "Dark Patterns" · Restrict Algorithmic/Personalized Feeds · Restrict Targeted Advertising to Minors · School-Hours Restrictions · Time Tracking / Usage Limits
Notes: Typical AADC-style provisions. Less active than companion bill HB 227.
Louisiana — SB162 (2023)
Signed Into Law · Ruled Unconstitutional
Creates the Secure Online Child Interaction and Age Limitation Act. (8/1/23) (EN INCREASE GF EX See Note)
Effective date 7/1/2024. — Jun 28, 2023
Age Estimation (non-ID) · Block Adult-Minor Contact · Enforcement through Attorney General · Parental Access to Account/Activity · Parental Consent for Account Creation · Parental Control over Settings · Restrict Targeted Advertising to Minors · Time Tracking / Usage Limits
NetChoice v. Murrill, 3:25-cv-00231, (M.D. La.)
Notes: Enacted in 2023. Lawsuit filed by NetChoice: NetChoice v. Murrill, 3:25-cv-00231 (M.D. La.). The district court ruled the law unconstitutional and permanently enjoined it on 15 December 2025.
Massachusetts — H4229 (2025)
Introduced
Protecting children from addictive social media feeds
Accompanied a study order, see H5071 — Feb 12, 2026
Age Estimation (non-ID) · Notification Time Restrictions · Prohibit "Addictive Design Features" · Restrict Algorithmic/Personalized Feeds
Notes: Companion bill to S30, with the same provisions: age verification of all users, and a ban on algorithmic feeds and midnight-6 AM notifications for users under 18.
Massachusetts — H5295 (2026)
Introduced
Relative to regulation of minors on social media
Senate concurred — Mar 23, 2026
Age Verification (Gov't ID) · Enforcement through Attorney General · Minimum Age Ban
Notes: This Massachusetts bill would ban children under 16 from creating or maintaining social media accounts and require all new users to verify their age using a birth certificate or government-issued ID.
Massachusetts — S2581 (2025)
Passed One Chamber
To promote student learning and mental health
Passed to be engrossed - 129 YEAS to 25 NAYS (See YEA and NAY No. 156 ) — Apr 8, 2026
School-Hours Restrictions
Notes: tk
Massachusetts — S30 (2025)
Introduced
Protecting children from addictive social media feeds
Bill reported favorably by committee and referred to the committee on Senate Ways and Means — Jul 24, 2025
Age Estimation (non-ID) · Notification Time Restrictions · Prohibit "Addictive Design Features" · Restrict Algorithmic/Personalized Feeds
Notes: Abridged version of the model AADC requiring age verification of all users and banning algorithmic feeds and notifications between midnight and 6 AM for users under 18. Companion to H4229.
Maryland — HB 603 (2024)
Signed Into Law · In Effect
Consumer Protection - Online Products and Services - Data of Children (Maryland Kids Code)
Approved by the Governor - Chapter 461 — May 9, 2024
Age Estimation (non-ID) · Data Protection Impact Assessments · Default Restrictive Privacy for Minors · Prohibit "Dark Patterns"
NetChoice v. Brown, 1:25-cv-00322, (D. Maryland)
Notes: Enacted in 2024. Lawsuit filed by NetChoice: NetChoice v. Brown, 1:25-cv-00322 (D. Maryland).
Michigan — HB4388 (2025)
Introduced
Trade: business practices; regulation of social media use by minors; provide for. Creates new act.
Referred To Committee On Communications And Technology — Mar 19, 2026
Enforcement through Attorney General · Enforcement through private right of action · Notification Time Restrictions · Parental Access to Account/Activity · Parental Consent for Account Creation · Parental Control over Settings · Restrict Algorithmic/Personalized Feeds · Restrict Targeted Advertising to Minors
Notes: This Michigan bill would require social media platforms with at least 5 million users worldwide to verify the age of Michigan residents and obtain parental consent before minors can create or maintain accounts. Minor accounts would face default nighttime access restrictions (10:30 PM–6:30 AM) and bans on algorithmic recommendations and data collection, and parents would be granted full access to view their child's posts and messages.
Action Alert · Michigan — SB 758
Introduced
Michigan comprehensive AADC-plus provisions
Session ends Dec 31, 2026
Block Adult-Minor Contact · Default Restrictive Privacy for Minors · Mandatory Third-Party Audits · Notification Time Restrictions · Parental Control over Settings · Prohibit "Dark Patterns" · Restrict Algorithmic/Personalized Feeds · School-Hours Restrictions · Time Tracking / Usage Limits
Notes: The model AADC taken further: privacy tools and defaults for minors, a prohibition on adult-minor contact unless the users are connected, hiding minors' connection lists from adults, no notifications from 10 PM to 6 AM year-round or 8 AM to 4 PM on school weekdays, no "dark patterns" (vaguely defined), no personalized feeds for minors, time tracking, parental control over settings, and mandatory annual third-party audits with public reports. Companion bill SB 759 makes violations an unfair trade practice.
Action Alert · Michigan — SB0757 (2025)
Introduced
Stop Addictive Feeds Exploitation for Kids Act
Placed On Immediate Passage — Mar 25, 2026
Notification Time Restrictions · Parental Consent for Account Creation · Prohibit "Addictive Design Features" · Restrict Algorithmic/Personalized Feeds · School-Hours Restrictions
Notes: Regulates "addictive feeds" for users under 18: sites may only offer an "addictive feed" if they have actual knowledge that the user is over 18 or have verified parental consent. No notifications about addictive feeds between 10 PM and 6 AM year-round, or between 8 AM and 4 PM on weekdays during the school year.
Minnesota — HF 4138 (2026)
Introduced
Stop Harms from Addictive Social Media Act
Author added Dippel, Zeleznikar, and Bakeberg — Apr 7, 2026
Account Termination on Parental Request · Age Estimation (non-ID) · Default Restrictive Privacy for Minors · Easy Account Deletion for Minors · Enforcement through Attorney General · Enforcement through private right of action · Parental Consent for Account Creation · Parental Control over Settings · Prohibit "Addictive Design Features" · Restrict Algorithmic/Personalized Feeds · Restrict Targeted Advertising to Minors · Time Tracking / Usage Limits
Notes: This Minnesota bill (the "Stop Harms from Addictive Social Media" act) targets very large social media sites earning over $1 billion in annual ad revenue, requiring parental consent for accounts held by children 15 and under, banning "addictive design features" and targeted ads on those accounts, and setting privacy defaults to the most restrictive levels. It requires sites to give parents tools to monitor and limit their children's usage and allows both parents and children to sue platforms directly.
Minnesota — SB 4097 (2024)
Signed Into Law Enjoined
Minnesota Prohibiting Social Media Manipulation Act
Secretary of State, Filed — May 21, 2024
Mandatory Disclosure of Policies
NetChoice v. Ellison, 0:25-cv-02741, (D. Minn.)
NotesThe law doesn't specify any changes that a social media site must make to its product or features, but requires the site to prominently disclose a number of things, including moderation policy, algorithmic ranking, usage statistics, statistics about how many notifications it sends, and all "product experiments" conducted on 1000 or more users.
Missouri — HB2392 (2025)
Introduced
Establishes the "Missouri Social Media Safety for Minors Act"
HCS Reported Do Pass (H) — Apr 2, 2026
Account Termination on Parental RequestAge Estimation (non-ID)Block Adult-Minor ContactEnforcement through Attorney GeneralEnforcement through private right of actionMinimum Age BanParental Access to Account/ActivityParental Consent for Account CreationProhibit "Addictive Design Features"Restrict Targeted Advertising to Minors
NotesThis Missouri bill bans children under 14 from social media entirely, requires parental consent for 14- and 15-year-olds, and requires sites to give parents tools to monitor and delete their minor children's accounts. It also prohibits "addictive design features", adult-to-minor direct messaging (unless verified), and targeted advertising to minors, with enforcement by the attorney general and a private right of action for parents.
Action Alert Missouri — HB3393 (2026)
Introduced
Missouri Social Media Safety for Minors Act
HCS Reported Do Pass (H) - AYES: 12 NOES: 1 PRESENT: 0 — Apr 2, 2026
Account Termination on Parental RequestAge Estimation (non-ID)Block Adult-Minor ContactEasy Account Deletion for MinorsMinimum Age BanParental Access to Account/ActivityParental Consent for Account CreationProhibit "Addictive Design Features"Restrict Targeted Advertising to Minors
NotesRequires age verification of all users including logged-out users, prohibits registration for under-16, parental consent for 16-18, full parental access/control over 16-18 accounts, prohibits "addictive and manipulative design features" (undefined) targeting minors.
Mississippi — HB1126 (2024)
Signed Into Law In Effect
"Walker Montgomery Protecting Children Online Act"; establish to protect minors from harmful content.
Approved by Governor — Apr 30, 2024
Age Estimation (non-ID)Content-Specific RestrictionsDuty of Care ProvisionsParental Consent for Account CreationRestrict Targeted Advertising to Minors
Netchoice, LLC v. Fitch, 1:24-cv-00170, (S.D. Miss.)
NotesDistrict court issued a temporary injunction on 18 June 2025. Mississippi appealed to the Fifth Circuit, which overturned the injunction and allowed the law to go into effect. The Supreme Court declined to reverse the Fifth Circuit's decision. The law is in effect while the litigation continues.
Mississippi — HB1224 (2026)
Signed Into Law
MS Keeping Kids Safe Online Act
Approved by Governor — Apr 8, 2026
Enforcement through Attorney GeneralEnforcement through private right of actionMandatory Disclosure of PoliciesProhibit "Addictive Design Features"
NotesThis law prohibits covered services from making false claims about safety or addictiveness, requires disclosure of harms to minors, and declares services with "addictive design features" offered to minors as "defective products."
Nebraska — LB1119 (2026)
Introduced
Age-Appropriate Online Design Code Act
Banking, Commerce and Insurance AM2237 filed — Mar 11, 2026
Content-Specific RestrictionsDefault Restrictive Privacy for MinorsEasy Account Deletion for MinorsNotification Time RestrictionsProhibit "Addictive Design Features"Prohibit "Dark Patterns"Restrict Targeted Advertising to MinorsSchool-Hours Restrictions
NotesBased on model AADC. Requires easy account deletion for under-18, no targeted advertising to under-18, no notifications 10pm-6am and 8am-4pm school weekdays. Explicitly does not mandate age verification. Only applies to services with actual knowledge that 2%+ of users are minors.
Nebraska — LB838 (2026)
Signed Into Law Not Yet in Force
Age-Appropriate Online Design Code Act
President/Speaker signed — Apr 10, 2026
Easy Account Deletion for MinorsNotification Time RestrictionsParental Control over SettingsRestrict Targeted Advertising to Minors
NotesSimilar provisions to LB 1119 were added to this unrelated bill via amendment on 6 March 2026, plus privacy tools for minors and parental control requirements. Effective date for the age verification requirements appears to be July 17, 2026.
New Hampshire — HB1650 (2025)
Failed
Relative to an age-appropriate design code.
Inexpedient to Legislate: Motion Adopted Voice Vote 03/11/2026 House Journal 7 — Mar 11, 2026
Age Estimation (non-ID)Block Adult-Minor ContactDefault Restrictive Privacy for MinorsDuty of Care ProvisionsEasy Account Deletion for MinorsMandatory Disclosure of PoliciesNotification Time RestrictionsProhibit "Addictive Design Features"Prohibit "Dark Patterns"Restrict Algorithmic/Personalized Feeds
NotesAdapted from model AADC. Privacy tools with maximum privacy defaults for under-18, including not showing minors' posts to adults or allowing adult interaction with minors' posts. Duty of care to prevent "emotional distress, compulsive use, or discrimination" (defined by AG regulations). No notifications midnight-6am for under-18. The House adopted the committee's "inexpedient to legislate" recommendation, killing the bill.
New Jersey — A2739 (2026)
Introduced
Prohibits social media platforms from promoting certain practices or features of eating disorders to child users.
Reported out of Assembly Comm. with Amendments, 2nd Reading — Feb 12, 2026
Content-Specific RestrictionsDuty of Care ProvisionsMandatory Third-Party Audits
NotesPrevents covered sites from using designs/algorithms that could cause child users to develop eating disorders including promoting diet products. Requires quarterly internal and annual external audits. Nearly invalidates itself by precluding liability for content the site didn't post. No data anonymization provisions for third-party audits. No scientific consensus exists on what causes eating disorders in children.
Action Alert New Jersey — A4013 (2026)
Introduced
Requires certain social media platforms to take certain actions concerning user mental health.
Reported and Referred to Assembly Appropriations Committee — Feb 19, 2026
State-Mandated Warning DisplaysTime Tracking / Usage Limits
NotesRequires time-tracking and proactive monitoring of ALL users for "problematic behaviors" (3+ hours/day, accessing within 10 minutes of waking, 10+ posts/day). State-mandated warnings: 25% screen on first login (10 sec), non-dismissable 75% screen after 3 hours (90 sec, repeating hourly). Must inform users of "problematic behavior" with state-approved resources. State-mandated disclaimer under every ad. Companion to SB 3412.
Action Alert New Jersey — A4015 (2026)
Introduced
New Jersey Kids Code Act
Reported and Referred to Assembly Appropriations Committee — Feb 19, 2026
Block Adult-Minor ContactDefault Restrictive Privacy for MinorsEasy Account Deletion for MinorsMandatory Disclosure of PoliciesMandatory Third-Party AuditsNotification Time RestrictionsProhibit "Addictive Design Features"Prohibit "Dark Patterns"Restrict Algorithmic/Personalized FeedsRestrict Targeted Advertising to MinorsSchool-Hours Restrictions
NotesBased on model AADC. Privacy tools/defaults for minors, prohibit adult-minor contact unless connected, hide minor connections from adults, no notifications 10pm-6am and 8am-4pm school weekdays, no "dark patterns" (vaguely defined), time tracking for minors, mandatory annual third-party audits with public reports. Companion to SB 3413.
New Jersey — A1358 (2026)
Introduced
Concerns social media privacy and data management for children and establishes New Jersey Children's Data Protection Commission.
Introduced, Referred to Assembly Science, Innovation and Technology Committee — Jan 13, 2026
Age Estimation (non-ID)Data Protection Impact AssessmentsDefault Restrictive Privacy for MinorsEnforcement through Attorney GeneralMandatory Disclosure of PoliciesProhibit "Dark Patterns"
NotesThis New Jersey bill requires social media platforms to complete Data Protection Impact Assessments before launching features likely to be accessed by children (under 18), set default privacy settings to the highest level for minors, and prohibit deceptive design practices targeting children. It also creates a Children's Data Protection Commission to advise the Legislature on best practices.
Action Alert New Jersey — S3412 (2026)
Introduced
Requires certain social media platforms to take certain actions concerning user mental health.
Introduced in the Senate, Referred to Senate Health, Human Services and Senior Citizens Committee — Feb 9, 2026
State-Mandated Warning DisplaysTime Tracking / Usage Limits
NotesRequires time-tracking and proactive monitoring of ALL users for "problematic behaviors" (3+ hours/day, accessing within 10 minutes of waking, 10+ posts/day). State-mandated warnings: 25% screen on first login (10 sec), non-dismissable 75% screen after 3 hours (90 sec, repeating hourly). Must inform users of "problematic behavior" with state-approved resources. State-mandated disclaimer under every ad. Companion bill to AB 4013.
Action Alert New Jersey — S3413 (2026)
Introduced
New Jersey Kids Code Act
Introduced in the Senate, Referred to Senate Law and Public Safety Committee — Feb 9, 2026
Block Adult-Minor ContactDefault Restrictive Privacy for MinorsEasy Account Deletion for MinorsMandatory Disclosure of PoliciesMandatory Third-Party AuditsNotification Time RestrictionsProhibit "Addictive Design Features"Prohibit "Dark Patterns"Restrict Algorithmic/Personalized FeedsRestrict Targeted Advertising to MinorsSchool-Hours Restrictions
NotesBased on model AADC. Privacy tools/defaults for minors, prohibit adult-minor contact unless connected, hide minor connections from adults, no notifications 10pm-6am and 8am-4pm school weekdays, no "dark patterns" (vaguely defined), time tracking for minors, mandatory annual third-party audits with public reports. Companion bill to AB 4015.
New Jersey — S4106 (2026)
Introduced
Concerns social media privacy and data management for children and establishes New Jersey Children's Data Protection Commission.
Introduced in the Senate, Referred to Senate Health, Human Services and Senior Citizens Committee — Mar 19, 2026
Age Estimation (non-ID)Data Protection Impact AssessmentsDefault Restrictive Privacy for MinorsEnforcement through Attorney GeneralMandatory Disclosure of PoliciesProhibit "Dark Patterns"
NotesThis New Jersey bill requires social media platforms to complete Data Protection Impact Assessments, set children's (under 18) privacy settings to the highest level by default, and prohibit deceptive design practices and unnecessary data collection involving minors. It also creates a Children's Data Protection Commission to advise the Legislature.
New York — A09415 (2025)
Introduced
Protects minors online from social media and harmful content; establishes penalties for failing to restrict certain minors from certain content.
enacting clause stricken — Mar 4, 2026
Account Termination on Parental RequestAge Verification (Gov't ID)Content-Specific RestrictionsEasy Account Deletion for MinorsEnforcement through Attorney GeneralEnforcement through private right of actionMinimum Age BanParental Consent for Account CreationProhibit "Addictive Design Features"
NotesThis New York bill bans children under 14 from holding social media accounts, requires parental consent for 14-15 year olds, and mandates age verification on websites distributing content harmful to minors.
New York — S 8827 (2026)
Signed Into Law Not Yet in Force
Requires warning labels on addictive feature platforms which provide features such as addictive feeds, autoplay, infinite scroll, like counts, and/or push notifications; relates to the effectiveness of certain provisions of law relating thereto.
SIGNED CHAP.85 — Feb 13, 2026
Enforcement through Attorney GeneralProhibit "Addictive Design Features"State-Mandated Warning Displays
NotesThis New York law requires social media platforms that use addictive design features (specifically addictive algorithmic feeds, autoplay, and infinite scroll) to display a Surgeon General-style warning label to users. The warning must appear when a user first accesses the platform each day (covering at least 25% of the screen for 10 seconds) and again after 3 hours of cumulative use (covering 75% of the screen for 30 seconds). The law cross-references New York's existing definition of "social media platform" from Article 45 (the SAFE for Kids Act). Effective date: January 1, 2027.
New York — S00927 (2025)
Introduced
New York Social Media Regulation Act
REFERRED TO INTERNET AND TECHNOLOGY — Jan 7, 2026
Block Adult-Minor ContactEnforcement through Attorney GeneralEnforcement through private right of actionNotification Time RestrictionsParental Access to Account/ActivityParental Consent for Account CreationParental Control over SettingsRestrict Algorithmic/Personalized FeedsRestrict Targeted Advertising to MinorsTime Tracking / Usage Limits
NotesThis New York bill would require social media platforms with at least 5 million users worldwide to obtain parental consent before allowing minors to create accounts, verify the age of all New York account holders, block minors' access between 10:30 p.m. and 6:30 a.m., ban all advertising on minors' accounts, prohibit data collection from minors' usage, and give parents full access to view their children's accounts.
New York — S07694 (2023)
Signed Into Law Not Yet in Force
Stop Addictive Feeds Exploitation (SAFE) For Kids Act
SIGNED CHAP.120 — Jun 20, 2024
Age Estimation (non-ID)Enforcement through Attorney GeneralNotification Time RestrictionsParental Consent for Account CreationProhibit "Addictive Design Features"Restrict Algorithmic/Personalized Feeds
NotesThis law prohibits social media sites from showing algorithmically personalized feeds to users under 18 unless the site verifies the user is not a minor or obtains parental consent. It also blocks overnight notifications (12 AM–6 AM) to minors without parental consent. The definition of "addictive social media platform" only covers sites where the algorithmic feed is a "significant part" of the service. The law has passed, but it requires the Attorney General to release regulations defining "commercially reasonable and technically feasible methods for covered operators to determine if a covered user is a covered minor," and enforcement cannot begin until 180 days after those regulations are finalized. The law is therefore technically in effect, but cannot yet be enforced or challenged.
Ohio — HB33 (2023)
Signed Into Law Ruled Unconstitutional
Parental Notification by Social Media Operators Act
Effective Operating appropriations effective July 4, 2023. Other provisions generally effective October 3, 2023. Some provisions subject to special effective dates. — Jul 4, 2023
NetChoice, LLC v. Yost, 2:24-cv-00047, (S.D. Ohio)
NotesEnacted in 2023. Lawsuit filed by Netchoice: NetChoice, LLC v. Yost, 2:24-cv-00047, (S.D. Ohio). District court ruled the law unconstitutional and permanently enjoined it on 14 May 2025. Ohio has appealed to the Sixth Circuit: NetChoice, LLC v. David Yost, 25-3371, (6th Cir.).
Oklahoma — HB 4358 (2026)
Introduced
Social media; social networks; minors; verification; effective date.
Second Reading referred to Rules — Feb 3, 2026
Age Estimation (non-ID)Age Verification (Gov't ID)Enforcement through Attorney GeneralEnforcement through private right of actionMinimum Age BanProhibit "Dark Patterns"
NotesThis Oklahoma bill bans children under 16 from holding accounts on social media platforms and requires platforms to verify users' ages using government IDs or other commercially reasonable methods. It also restricts how platforms can process minors' personal data, prohibits dark patterns targeting minors, and allows both the Attorney General and individuals to bring enforcement actions.
Oklahoma — SB 1727 (2026)
Introduced
Social media; authorizing certain cause of action against social media companies; establishing criteria to recover certain damages; authorizing certain rebuttable presumption. Effective date.
Coauthored by Representative Newton (principal House author) — Feb 12, 2026
Enforcement through private right of actionNotification Time RestrictionsParental Consent for Account CreationProhibit "Addictive Design Features"Time Tracking / Usage Limits
NotesThis Oklahoma bill creates a private right of action allowing minors (under 18) or their parents to sue social media companies for mental health harms caused by excessive use of algorithmically curated platforms. Plaintiffs get a favorable presumption of causation unless the platform implements protective measures: a 3-hour daily time limit, nighttime access restrictions, parental consent requirements, and disabling addictive design features like autoplay and infinite scroll.
Oklahoma — SB1871 (2026)
Introduced
Social media; requiring certain age verification; requiring certain parental consent. Emergency.
Coauthored by Representative Newton (principal House author) — Feb 12, 2026
Age Verification (Gov't ID)Default Restrictive Privacy for MinorsEasy Account Deletion for MinorsParental Access to Account/ActivityParental Consent for Account CreationParental Control over SettingsProhibit "Addictive Design Features"Time Tracking / Usage Limits
NotesRequires disabling logged-out viewing, freezing all Oklahoma accounts, mandatory age verification via state ID, usual privacy settings/defaults for under-18, disable "engagement prolonging" features for under-18, disable accounts at parental request, parental access to settings, minors cannot change privacy settings without parental consent.
Pennsylvania — HB1430 (2025)
Introduced
Providing for protection of minors on social media; and imposing penalties.
Referred to Communications & Technology — May 8, 2025
Account Termination on Parental RequestAge Estimation (non-ID)Block Adult-Minor ContactEasy Account Deletion for MinorsEnforcement through Attorney GeneralMandatory Disclosure of PoliciesParental Access to Account/ActivityParental Consent for Account CreationParental Control over SettingsProhibit "Dark Patterns"Restrict Algorithmic/Personalized FeedsRestrict Targeted Advertising to Minors
NotesThis Pennsylvania bill would require parental consent before children under 16 can open social media accounts, ban data mining and targeted advertising for minors under 18, block unknown adults from contacting minors by default, disable personalized recommendation algorithms unless minors opt in, and prohibit "dark patterns". The bill has languished in committee long enough that it has very little chance of passage, though it is technically still active.
Rhode Island — H7632 (2026)
Introduced
Requires that any covered entity that develops/provides online services, products, or features that children are reasonably likely to access shall consider the best interest of children when designing/developing such online service, product, or feature.
Committee recommended measure be held for further study — Apr 8, 2026
Age Estimation (non-ID)Data Protection Impact AssessmentsDefault Restrictive Privacy for MinorsDuty of Care ProvisionsEnforcement through Attorney GeneralMandatory Disclosure of PoliciesProhibit "Dark Patterns"
Notestk
Rhode Island — H7746 (2026)
Introduced
Provides protections to children using online platforms by requiring platforms to turn off open chats by default for young users, and requires parent to approve children's financial transactions on gaming and social media sites.
Committee recommended measure be held for further study — Apr 8, 2026
Age Estimation (non-ID)Block Adult-Minor ContactDefault Restrictive Privacy for MinorsEnforcement through Attorney GeneralParental Access to Account/ActivityParental Control over SettingsProhibit "Dark Patterns"
Notestk
Rhode Island — H7953 (2026)
Introduced
Rhode Island Social Media Regulation Act
Committee recommended measure be held for further study — Apr 8, 2026
Age Verification (Gov't ID)Enforcement through Attorney GeneralEnforcement through private right of actionMinimum Age Ban
NotesThis bill would ban Rhode Island residents under 18 from holding accounts on social media platforms, effective January 1, 2027, and require platforms to verify users' ages using government-issued ID.
Rhode Island — H7954 (2026)
Introduced
Regulates how certain large social media platforms utilize algorithms.
Committee recommended measure be held for further study — Mar 5, 2026
Enforcement through Attorney GeneralEnforcement through private right of actionMandatory Disclosure of Policies
NotesThis bill would require large digital platforms (those with over 1 million monthly active users nationally that allow user-generated content) to publicly disclose how their algorithms rank, recommend, and amplify content, including whether engagement metrics, paid promotion, or behavioral profiling influence what users see. It also requires platforms to detect and label AI-generated synthetic media and prohibits deceptive practices around algorithmic transparency. The committee has recommended that it be "held for further study", which means it is highly unlikely to advance.
Rhode Island — S2406 (2026)
Introduced
Age-Appropriate Design Code
Committee recommended measure be held for further study — Mar 3, 2026
Age Estimation (non-ID)Data Protection Impact AssessmentsDefault Restrictive Privacy for MinorsDuty of Care ProvisionsMandatory Disclosure of PoliciesProhibit "Dark Patterns"
NotesRequires data protection impact assessments provided to AG on demand, privacy tools with maximum privacy defaults for under-18. Creates duty of care for preventing certain harms to under-18. Explicitly does not mandate age verification but includes self-attested and imputed age data. The committee has recommended that it be "held for further study", which means it is highly unlikely to advance.
South Carolina — H3431 (2024)
Signed Into Law
Age-Appropriate Design Code Act
Act No. 96 — Feb 6, 2026
Age Estimation (non-ID)Content-Specific RestrictionsData Protection Impact AssessmentsDefault Restrictive Privacy for MinorsDuty of Care ProvisionsMandatory Third-Party AuditsNotification Time RestrictionsParental Access to Account/ActivityParental Control over SettingsProhibit "Addictive Design Features"Prohibit "Dark Patterns"Restrict Algorithmic/Personalized FeedsRestrict Targeted Advertising to MinorsSchool-Hours RestrictionsTime Tracking / Usage Limits
Netchoice v. Wilson, 3:26-cv-00543, (D.S.C.)
NotesLawsuit filed by Netchoice: Netchoice v. Wilson, 3:26-cv-00543, (D.S.C.)
South Carolina — H4591 (2025)
Passed One Chamber
Stop Harm from Addictive Social Media
Referred to Committee on Labor, Commerce and Industry — Apr 1, 2026
Account Termination on Parental RequestAge Estimation (non-ID)Default Restrictive Privacy for MinorsEasy Account Deletion for MinorsParental Consent for Account CreationProhibit "Addictive Design Features"Restrict Algorithmic/Personalized FeedsRestrict Targeted Advertising to MinorsTime Tracking / Usage Limits
NotesDuplicates much of the already-enacted H3431 with some different provisions. The legislature is still amending it; it may be an attempt to pass something new and repeal the law they're being sued over.
South Carolina — H5209 (2026)
Introduced
South Carolina Social Media Regulation Act
Referred to Committee on Judiciary — Feb 18, 2026
Age Estimation (non-ID)Block Adult-Minor ContactContent-Specific RestrictionsParental Access to Account/ActivityParental Consent for Account CreationParental Control over SettingsRestrict Targeted Advertising to MinorsTime Tracking / Usage Limits
NotesDuplicates much of the already-enacted H3431 with some different provisions. Probably dead.
South Dakota — HB1053 (2025)
Signed Into Law In Effect
Require age verification by websites containing material that is harmful to minors, and to provide a penalty therefor.
Signed by the Governor on February 27, 2025 H.J. 418 — Feb 27, 2025
Age Verification (Gov't ID)Content-Specific RestrictionsEnforcement through Attorney General
Notestk
Tennessee — HB1891 (2024)
Signed Into Law In Effect
AN ACT to amend Tennessee Code Annotated, Title 47, Chapter 18, relative to protecting minors from social media.
Effective date(s) 01/01/2025 — May 13, 2024
Account Termination on Parental RequestAge Verification (Gov't ID)Parental Access to Account/ActivityParental Consent for Account CreationParental Control over SettingsTime Tracking / Usage Limits
NetChoice v. Skrmetti, 3:24-cv-01191, (M.D. Tenn.)
NotesEnacted in 2024. Lawsuit filed by Netchoice: NetChoice v. Skrmetti, 3:24-cv-01191, (M.D. Tenn.). District court declined to issue a preliminary injunction on 18 June 2025 and allowed the law to go into effect.
Texas — HB 18 (2023)
Signed Into Law Enjoined
SCOPE Act
See remarks for effective date — Jun 13, 2023
Account Termination on Parental RequestContent-Specific RestrictionsDefault Restrictive Privacy for MinorsMandatory Disclosure of PoliciesParental Access to Account/ActivityParental Consent for Account CreationParental Control over SettingsRestrict Targeted Advertising to MinorsTime Tracking / Usage Limits
Computer & Communications Industry Association v. Paxton, 1:24-cv-00849, (W.D. Tex.)
NotesTexas HB 18 (the SCOPE Act) requires social media-like digital services to register users' ages, set strict default privacy protections for known minors (no targeted ads, no data sharing, no geolocation), give verified parents tools to control their child's account settings and screen time, and filter harmful content. Lawsuit filed as Computer & Communications Industry Association v. Paxton, 1:24-cv-00849, (W.D. Tex.) and the district court granted an injunction on 30 August 2024. Texas has appealed to the Fifth Circuit.
Federal — HB3149 (2025)
Introduced
App Store Accountability Act
Forwarded by Subcommittee to Full Committee (Amended) by Voice Vote. — Dec 11, 2025
Age Estimation (non-ID)Enforcement through Attorney GeneralMandatory Disclosure of PoliciesParental Consent for Account Creation
NotesThis federal bill requires large app stores (5+ million US users) to verify the age category of all users at account creation and obtain verifiable parental consent before minors can download apps or make purchases. App developers must use the app store's age signal and notify parents of significant changes.
Federal — HB3921 (2025)
Introduced
STOP CSAM Act of 2025 (Strengthening Transparency and Obligations to Protect Children Suffering from Abuse and Mistreatment Act of 2025)
Referred to the House Committee on the Judiciary. — Jun 11, 2025
Content-Specific RestrictionsEnforcement through private right of actionMandatory Disclosure of Policies
NotesThe STOP CSAM Act of 2025 is a federal bill that strengthens protections for child victims of sexual exploitation and imposes new obligations on sites to identify, remove, and report child sexual abuse material (CSAM). While the goal is laudable, a number of the bill's provisions would effectively require sites to weaken or remove end-to-end encryption (which protects everyone's safety online), which is counterproductive. The bill also imposes a lesser knowledge standard than the current "actual knowledge" standard, which would incentivize sites to remove and report anything that could possibly be CSAM. NCMEC, the designated recipient of those reports, already receives far more reports per year than it can investigate, which seriously hampers its ability to handle those reports and forward them to law enforcement in a timely fashion; this bill would make that problem worse.
Federal — HB6291 (2025)
Introduced
Children and Teens’ Online Privacy Protection Act
Forwarded by Subcommittee to Full Committee by the Yeas and Nays: 14 - 10. — Dec 11, 2025
Account Termination on Parental RequestEasy Account Deletion for MinorsEnforcement through Attorney GeneralMandatory Disclosure of PoliciesParental Access to Account/ActivityParental Consent for Account CreationRestrict Targeted Advertising to Minors
NotesThis federal bill updates and expands COPPA (Children's Online Privacy Protection Act) to cover teens ages 13-16 in addition to children under 13. It bans targeted advertising to minors, limits data collection to what is necessary, requires consent before collecting personal information, gives parents and teens rights to access and delete data, and preempts state laws on the same subject.
Federal — HB7757 (2026)
Introduced
KIDS Act et al
Referred to the Committee on Energy and Commerce, and in addition to the Committee on the Judiciary, for a period to be subsequently determined by the Speaker, in each case for consideration of such provisions as fall within the jurisdiction of the committee concerned. — Mar 3, 2026
Age Estimation (non-ID)Block Adult-Minor ContactContent-Specific RestrictionsDefault Restrictive Privacy for MinorsEnforcement through Attorney GeneralMandatory Disclosure of PoliciesMandatory Third-Party AuditsParental Consent for Account CreationParental Control over SettingsProhibit "Addictive Design Features"Restrict Algorithmic/Personalized FeedsRestrict Targeted Advertising to MinorsTime Tracking / Usage Limits
NotesThis is a sweeping federal bill that bundles multiple child safety measures: it requires adult content sites to verify users aren't minors, mandates that social media sites provide default-on privacy protections and parental controls for known minors under 17, bans disappearing messages and direct messaging for children under 13, restricts algorithmic recommendations, requires annual third-party audits, regulates AI chatbots interacting with minors, and prohibits advertising drugs/alcohol/gambling to minors.
Federal — HB8250 (2026)
Introduced
Parents Decide Act
Referred to the House Committee on Energy and Commerce. — Apr 13, 2026
Notestk
Federal — SB1748 (2025)
Introduced
Kids Online Safety Act
Read twice and referred to the Committee on Commerce, Science, and Transportation. (Sponsor introductory remarks on measure: CR S2929-2930) — May 14, 2025
Default Restrictive Privacy for MinorsDuty of Care ProvisionsEnforcement through Attorney GeneralMandatory Disclosure of PoliciesMandatory Third-Party AuditsParental Control over SettingsProhibit "Addictive Design Features"Prohibit "Dark Patterns"Restrict Algorithmic/Personalized FeedsRestrict Targeted Advertising to MinorsTime Tracking / Usage Limits
NotesThe Kids Online Safety Act (KOSA) is a federal bill that would require websites, video games, messaging apps, and video streaming services to exercise a "duty of care" to protect minors (under 17) from harms like eating disorders, substance abuse, bullying, and compulsive usage. Sites must provide minors with safeguards set to maximum privacy by default, give parents tools to manage children's accounts, limit addictive design features, restrict certain advertising, ban "dark patterns", and undergo annual independent audits (for large platforms).
Federal — SB1829 (2025)
Introduced
STOP CSAM Act of 2025 Strengthening Transparency and Obligations to Protect Children Suffering from Abuse and Mistreatment Act of 2025
Placed on Senate Legislative Calendar under General Orders. Calendar No. 106. — Jun 26, 2025 · Session ends 0.0
Content-Specific RestrictionsEnforcement through private right of actionMandatory Disclosure of Policies
NotesThe STOP CSAM Act of 2025 is a federal bill that strengthens protections for child victims of sexual exploitation and imposes new obligations on sites to identify child sexual abuse material (CSAM) and remove and report it. While the goal is laudable, a number of the bill's provisions would require sites to weaken or remove end-to-end encryption (which protects everyone's safety online), which is counterproductive. By imposing a lesser knowledge standard than the current "actual knowledge" standard, the bill would also incentivize sites to remove and report anything that could possibly be CSAM. NCMEC, the designated recipient of those reports, already receives far more reports each year than it can investigate, which seriously hampers its ability to process reports and forward them to law enforcement in a timely fashion; this bill would make that problem worse.
Federal — SB1885 (2025)
Introduced
Stop the Scroll Act
Committee on Commerce, Science, and Transportation. Ordered to be reported with an amendment in the nature of a substitute favorably. — Apr 14, 2026 · Session ends 0.0
Enforcement through Attorney GeneralState-Mandated Warning Displays
NotesThis bill would require social media platforms to display a mental health warning label every time a user logs on and again after each hour of continuous use. The warning must alert users to mental health risks and link to crisis resources like the 988 Lifeline.
Federal — SB278 (2025)
Introduced
Kids Off Social Media Act
Placed on Senate Legislative Calendar under General Orders. Calendar No. 108. — Jun 30, 2025 · Session ends 0.0
Enforcement through Attorney GeneralMinimum Age BanRestrict Algorithmic/Personalized FeedsSchool-Hours Restrictions
NotesThis bill bans children under 13 from having social media accounts, prohibits personalized algorithmic feeds for users under 17, and requires schools receiving federal broadband subsidies to block social media on school networks and devices.
Federal — SB737 (2025)
Introduced
SCREEN Act Shielding Children's Retinas from Egregious Exposure on the Net Act
Read twice and referred to the Committee on Commerce, Science, and Transportation. — Feb 26, 2025 · Session ends 0.0
Age Estimation (non-ID)Content-Specific RestrictionsMandatory Disclosure of PoliciesMandatory Third-Party Audits
NotesThis federal bill requires sites to implement age verification technology to prevent minors from accessing content that is sexually explicit or pornographic. The definitions are broad enough that it would force most sites that host user-generated content to either engage in strict age verification or remove anything their users post that could possibly qualify as sexually explicit or pornographic -- something that, historically, results in a significant chilling effect on scientific discussion, age-appropriate sex education, and LGBTQ+ people whose very existence is presumed to be "sexually explicit".
Federal — SB836 (2025)
Passed One Chamber
Children and Teens’ Online Privacy Protection Act
Held at the desk. — Mar 16, 2026 · Session ends 0.0
Account Termination on Parental RequestEasy Account Deletion for MinorsEnforcement through Attorney GeneralMandatory Disclosure of PoliciesParental Access to Account/ActivityParental Consent for Account CreationRestrict Targeted Advertising to Minors
NotesThis bill updates the Children's Online Privacy Protection Act of 1998 (COPPA) to extend privacy protections to teens ages 13-16, in addition to children under 13. It bans targeted advertising to minors under 17; requires verifiable consent before collecting their personal data; gives parents and teens rights to access, delete, and correct personal information; and limits data retention. The bill applies broadly to any commercial 'operator' of a website, online service, online application, or mobile application that collects personal information from minors, no matter the size.
Utah — HB0311 (2023)
Signed Into Law Enjoined
Social Media Usage Amendments
Governor Signed in Lieutenant Governor's office for filing — Mar 23, 2023
Duty of Care ProvisionsMandatory Third-Party AuditsProhibit "Addictive Design Features"
NetChoice LLC v. Reyes, 2:23-cv-00911, (D. Utah)
NotesEnacted in 2023, repealed and replaced in 2024. Lawsuit filed by NetChoice: NetChoice LLC v. Reyes, 2:23-cv-00911, (D. Utah Dec 18, 2023). District court issued temporary injunction on 10 September 2024. Utah appealed to the 10th Circuit: NetChoice v. Brown, 24-4100, (10th Cir.)
Virginia — SB854 (2025)
Signed Into Law Enjoined
Consumer Data Protection Act; social media platforms, responsibilities and prohibitions to minors.
Acts of Assembly Chapter text (CHAP0703) — May 2, 2025
Age Estimation (non-ID)Parental Consent for Account CreationTime Tracking / Usage Limits
NetChoice v. Jason S. Miyares, 1:25-cv-02067, (E.D. Va.)
NotesEnacted in 2025. Lawsuit filed by NetChoice: NetChoice v. Jason S. Miyares, 1:25-cv-02067, (E.D. Va.). District court issued temporary injunction on 27 February 2026. Virginia appealed to the Fourth Circuit: NetChoice v. Jay Jones, 26-1252, (4th Cir.)
Vermont — H0823 (2026)
Introduced
An act relating to social media warning labels
Read first time and referred to the Committee on Commerce and Economic Development — Jan 29, 2026 · Session ends 0.0
Enforcement through Attorney GeneralState-Mandated Warning DisplaysTime Tracking / Usage Limits
NotesThis Vermont bill would require all social media platforms to display mental health warning labels every time a user logs in, similar to cigarette warning labels, and to show pop-up time-tracking notifications at least every 30 minutes. These requirements apply to all users, not just minors.
Vermont — H0897 (2026)
Introduced
An act relating to prohibiting the use of social media by children
Read first time and referred to the Committee on Commerce and Economic Development — Feb 11, 2026 · Session ends 0.0
Minimum Age Ban
NotesThis is a placeholder (intent) bill that would require social media sites to prevent anyone under 18 in Vermont from creating an account.
Wisconsin — AB1161 (2026)
Failed
Online services accessed by minors, minors’ personal data, and granting rule-making authority.
Failed to pass pursuant to Senate Joint Resolution 1 — Mar 23, 2026
Age Estimation (non-ID)Block Adult-Minor ContactDefault Restrictive Privacy for MinorsDuty of Care ProvisionsEasy Account Deletion for MinorsEnforcement through Attorney GeneralMandatory Disclosure of PoliciesNotification Time RestrictionsRestrict Algorithmic/Personalized Feeds
NotesThis Wisconsin bill would have imposed strict data collection limits, default privacy protections, algorithmic feed restrictions, a duty of care, and transparency requirements on online businesses whose services are reasonably likely to be accessed by minors under 18. The legislative session ended 19 March 2026, so we consider the bill dead, although a special session is scheduled for 14 April.
Wisconsin — AB960 (2026)
Failed
Requiring social media platforms to provide mental health warnings and providing a penalty.
Failed to pass pursuant to Senate Joint Resolution 1 — Mar 23, 2026
Enforcement through Attorney GeneralEnforcement through private right of actionState-Mandated Warning Displays
NotesThis Wisconsin bill would have required all social media platforms to display a mental health warning label every time a user logs in, alerting users to potential negative mental health effects and providing crisis resources like the 988 Suicide & Crisis Lifeline. The legislative session ended 19 March 2026, so we consider the bill dead, although a special session is scheduled for 14 April.
Wisconsin — AB 963 (2026)
Failed
Social media accounts for minors and providing a penalty.
Failed to concur in pursuant to Senate Joint Resolution 1 — Mar 23, 2026
Account Termination on Parental RequestAge Estimation (non-ID)Default Restrictive Privacy for MinorsEasy Account Deletion for MinorsEnforcement through Attorney GeneralEnforcement through private right of actionParental Consent for Account CreationParental Control over SettingsProhibit "Addictive Design Features"Restrict Algorithmic/Personalized FeedsRestrict Targeted Advertising to Minors
NotesThis Wisconsin bill would have required large social media sites (those with at least $1 billion in annual revenue, including parent/affiliate companies) to obtain verifiable parental consent before allowing minors under 18 to have accounts, set default privacy to the most restrictive levels, ban "addictive design features" like infinite scrolling and autoplay video for minors, and block targeted advertising to minors. The legislative session ended 19 March 2026, so we consider the bill dead, although a special session is scheduled for 14 April.
Wisconsin — SB 758 (2025)
Failed
Social media platforms’ treatment of minors and providing a penalty. (FE)
Failed to pass pursuant to Senate Joint Resolution 1 — Mar 23, 2026
Age Estimation (non-ID)Enforcement through Attorney GeneralEnforcement through private right of actionRestrict Algorithmic/Personalized FeedsRestrict Targeted Advertising to Minors
NotesThis Wisconsin bill would have required social media sites to stop collecting or using data about minors under 18, ban algorithmic content recommendations for minors, and block targeted advertising to minors, using a DOJ-approved method to determine whether users are minors. The legislative session ended 19 March 2026, so we consider the bill dead, although a special session is scheduled for 14 April.
Wisconsin — SB933 (2026)
Failed
Requiring social media platforms to provide mental health warnings and providing a penalty.
Failed to pass pursuant to Senate Joint Resolution 1 — Mar 23, 2026
Enforcement through Attorney GeneralEnforcement through private right of actionState-Mandated Warning Displays
NotesThis Wisconsin bill would have required all social media platforms to display a mental health warning label every time a user logs in, alerting users to potential negative mental health effects and providing crisis resources like the 988 Suicide & Crisis Lifeline. The legislative session ended 19 March 2026, so we consider the bill dead, although a special session is scheduled for 14 April.
Wisconsin — SB978 (2026)
Failed
Online services accessed by minors, minors’ personal data, and granting rule-making authority.
Failed to pass pursuant to Senate Joint Resolution 1 — Mar 23, 2026
Age Estimation (non-ID)Block Adult-Minor ContactDefault Restrictive Privacy for MinorsDuty of Care ProvisionsEasy Account Deletion for MinorsEnforcement through Attorney GeneralMandatory Disclosure of PoliciesNotification Time RestrictionsRestrict Algorithmic/Personalized Feeds
NotesThis Wisconsin bill would have imposed strict data collection limits, default privacy protections, algorithmic feed restrictions, a duty of care, and transparency requirements on online businesses whose services are reasonably likely to be accessed by minors under 18. The legislative session ended 19 March 2026, so we consider the bill dead, although a special session is scheduled for 14 April.