Law Reform: Lawmakers Want to Take the Algorithm Out of Your Child’s Feed
Lawmakers are taking aim at addictive product design, opaque algorithms, and weak safeguards in a push to protect kids online, thanks to SB 1748, which is now gaining momentum.
The U.S. Senate is back at it again with the Kids Online Safety Act, which has just been reintroduced. From making platforms curb addictive design features to giving parents real control and demanding transparency on how algorithms feed content to minors, this bill could have a serious impact. What’s surprising? This is not the first attempt: a nearly identical bill failed in the House just months ago. Lawmakers are clearly not giving up, and neither should parents or users who care about online safety.
🇺🇸 Why Kids' Online Safety Is Back on the Agenda
The Kids Online Safety Act has returned to the spotlight in the United States Senate. The bill, known as SB 1748, was recently reintroduced after a previous version in the House of Representatives, HR 7891, failed to move forward before the 118th Congress adjourned. The reintroduction reflects a renewed sense of urgency to establish protections for children and teenagers who use digital platforms.
The key motivation is concern about the growing influence of social media and online platforms on young users. Lawmakers are increasingly alarmed by the design of apps that can lead to compulsive use, expose minors to harmful content, and allow interaction with strangers without proper safeguards.
The failure of HR 7891 did not slow momentum. Instead, it highlighted the political will to keep the issue alive. The new Senate bill builds on the earlier proposal by keeping many of the same themes while refining its approach.
According to the text of HR 7891, the legislation defined minors as individuals under the age of 17 and introduced a duty of care for “high impact online companies.” This duty required platforms to take reasonable measures to prevent harms such as anxiety, eating disorders, substance abuse, compulsive use, and exposure to inappropriate content. The bill also pushed for design changes, including limits on autoplay, reward systems, and algorithmic recommendations.
The bill required companies to offer tools that allow parents and minors to manage settings, control recommendations, and set time limits. It also proposed that platforms issue regular public reports on risks and mitigation efforts.
This framework showed a clear attempt to assign responsibility to companies, especially those with large user bases and significant revenue.
Although HR 7891 did not become law, many of its elements are expected to reappear in SB 1748. With bipartisan support and mounting public pressure, the reintroduction of the Kids Online Safety Act indicates that lawmakers are not prepared to let the issue go.
What’s in SB 1748? Key Provisions at a Glance
The reintroduction of the Kids Online Safety Act (SB 1748) in the U.S. Senate marks a significant step toward improving online protections for minors. This bipartisan effort aims to address the growing concerns about the digital environment's impact on children and teenagers.
Core Objectives of SB 1748
SB 1748 seeks to establish a framework that compels online platforms to prioritise the safety and well-being of young users. The bill outlines several key provisions:
Duty of Care: Platforms are required to exercise reasonable care in designing and operating their services to prevent and mitigate harms to minors. These harms include mental health issues such as anxiety and depression, exposure to sexual exploitation, substance abuse, and other online risks.
Safeguards for Minors: The bill mandates that platforms implement default settings that restrict access to minors' personal data, limit communication with unknown users, and control features that may encourage compulsive usage. These measures are designed to create a safer online environment for children and teenagers (a rough sketch of what such defaults could look like follows this list).
Parental Tools: SB 1748 emphasises empowering parents by requiring platforms to provide tools that allow guardians to supervise and manage their children's online activities. This includes access to privacy settings and the ability to control account features.
Transparency and Accountability: The legislation calls for independent audits and research into how platforms affect the well-being of young users. This provision aims to ensure that companies are held accountable for their impact on minors and that they take meaningful steps to address potential risks.
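To make the "safe by default" idea concrete, here is a minimal sketch of how a platform might initialise settings for an account known or reasonably believed to belong to a minor. The field names and default values are illustrative assumptions on my part; SB 1748 sets outcomes, not an implementation.

```typescript
// Illustrative sketch only: the bill describes outcomes, not code.
// All field names and default values below are assumptions.

interface MinorAccountSettings {
  profileVisibility: "private" | "public";
  directMessagesFrom: "no_one" | "approved_contacts" | "everyone";
  personalisedRecommendations: boolean;
  autoplay: boolean;
  dailyTimeLimitMinutes: number | null; // null = no limit set
  parentalSupervisionEnabled: boolean;
}

// Defaults to the most protective configuration for minor accounts;
// parents or guardians can adjust these settings later.
function defaultSettingsForMinor(): MinorAccountSettings {
  return {
    profileVisibility: "private",
    directMessagesFrom: "approved_contacts",
    personalisedRecommendations: false,
    autoplay: false,
    dailyTimeLimitMinutes: 60,
    parentalSupervisionEnabled: true,
  };
}
```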
What Could Be Different This Time? Comparing SB 1748 with HR 7891
The reintroduction of Senate Bill 1748 indicates a renewed effort to establish comprehensive protections for minors online. While the full text of SB 1748 has yet to be published, insights can be drawn from its predecessor, House Bill 7891, introduced in April 2024.
Key Provisions in HR 7891
HR 7891 aimed to impose a "duty of care" on online platforms to prevent and mitigate harms to minors, including mental health disorders, compulsive usage, cyberbullying, and exposure to harmful content.
The bill defined minors as individuals under 17 and covered platforms as those likely to be accessed by minors.
The legislation required platforms to provide:
Tools for minors and parents to manage privacy settings, limit interactions, and control algorithmic recommendations.
Transparency reports detailing the prevalence of harmful content and the platform's efforts to address it.
Research on the impact of social media on minors and guidance for educational institutions.
Anticipated Enhancements in SB 1748
Given the bipartisan support and the urgency to protect minors online, SB 1748 is expected to retain the core objectives of HR 7891 while addressing previous concerns:
Clarification of "Duty of Care": To alleviate fears of censorship, the new bill may provide clearer definitions of harmful content and the responsibilities of platforms.
Enhanced Enforcement Mechanisms: SB 1748 might delineate the roles of federal and state authorities in enforcing the provisions, ensuring consistent application across jurisdictions.
Refined Age Verification Processes: The bill could propose more robust age verification methods to prevent minors from accessing inappropriate content.
Why the Bill Matters: Real-World Impact on Platforms and Parents
Strengthening Parental Controls and Default Safety Settings
The Kids Online Safety Act (KOSA) reinforces the importance of empowering parents with effective tools to manage their children's online experiences. Under this legislation, platforms are required to provide parents with accessible controls to oversee their child's account settings, including privacy configurations, screen time limitations, and purchase restrictions.
A vital aspect of KOSA is the mandate for platforms to default to the highest privacy settings for users identified as minors. This approach ensures that, unless adjusted, children's accounts are set to the most protective configurations, reducing the risk of unintended exposure to harmful content or interactions.
Furthermore, KOSA requires platforms to implement clear and straightforward reporting mechanisms. These systems enable both minors and their parents to report harmful content or interactions easily, ensuring timely responses from the platform. Together, these measures aim to create a safer digital environment for young users.
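As a sketch of what a "clear and straightforward" reporting flow might look like behind the scenes, the snippet below models a harmful-content report and a simple acknowledgement. The categories, fields, and review window are hypothetical illustrations, not language from the bill.

```typescript
// Illustrative sketch only: the report categories, fields, and handler
// below are assumptions, not requirements spelled out in SB 1748.

type ReportCategory =
  | "harassment"
  | "self_harm"
  | "sexual_exploitation"
  | "substance_abuse"
  | "other";

interface HarmReport {
  reporterId: string;   // the minor or parent filing the report
  contentId: string;    // the post, message, or account being reported
  category: ReportCategory;
  details?: string;
  submittedAt: Date;
}

// Accept a report and return an acknowledgement with a review deadline,
// reflecting the bill's emphasis on timely responses from the platform.
function submitHarmReport(report: HarmReport): { caseId: string; reviewBy: Date } {
  const caseId = `case-${report.contentId}-${report.submittedAt.getTime()}`;
  const reviewBy = new Date(report.submittedAt.getTime() + 48 * 60 * 60 * 1000); // e.g. 48 hours
  return { caseId, reviewBy };
}
```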
The Act may stipulate that platforms must provide annual public reports detailing the risks to minors and the measures taken to mitigate these risks. This transparency holds platforms accountable and encourages continuous improvement in safeguarding children's online experiences.
Enhancing Algorithm Transparency and User Autonomy
KOSA would address concerns about the influence of algorithmic recommendation systems on minors by requiring platforms to disclose how these algorithms operate. Specifically, platforms would have to provide clear explanations of how content is curated and presented to young users, including the use of personal data in these processes.
Importantly, the legislation would grant minors and their parents the ability to opt out of personalised algorithmic recommendations. This provision empowers users to take greater control over the content they encounter, reducing the likelihood of exposure to potentially harmful or addictive material.
In doing so, KOSA seeks to mitigate the risks associated with algorithm-driven content, promoting a healthier online environment for children and teenagers.
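To make the opt-out provision concrete, here is a minimal sketch of how a platform might honour such a preference, falling back to a simple reverse-chronological feed when personalised recommendations are switched off. The types, field names, and placeholder ranking function are assumptions for illustration, not anything prescribed by the bill.

```typescript
// Illustrative sketch only: SB 1748 does not prescribe an implementation.
// Types, field names, and the ranking function are assumptions.

interface Post {
  id: string;
  authorId: string;
  createdAt: Date; // used for the chronological fallback
}

interface FeedPreferences {
  personalisedRecommendations: boolean; // opt-out flag a minor or parent could control
}

// Hypothetical engagement-based ranker standing in for a platform's recommender.
function rankByPredictedEngagement(posts: Post[], userId: string): Post[] {
  // ...platform-specific model would go here...
  return posts;
}

// Build the feed: use the recommender only when the user has not opted out;
// otherwise return posts in reverse-chronological order.
function buildFeed(posts: Post[], userId: string, prefs: FeedPreferences): Post[] {
  if (!prefs.personalisedRecommendations) {
    return [...posts].sort(
      (a, b) => b.createdAt.getTime() - a.createdAt.getTime()
    );
  }
  return rankByPredictedEngagement(posts, userId);
}
```

The design choice here is the point of the provision: when the opt-out flag is set, the recommender is never consulted, so no engagement prediction (and none of the personal data behind it) shapes what the minor sees.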
Clarifying Age Verification and Data Collection Policies
A notable aspect of KOSA is its stance on age verification. The legislation explicitly states that it does not require platforms to implement age gating, age verification, or the collection of additional user data for these purposes.
Instead, KOSA focuses on ensuring that platforms provide appropriate safeguards for users who are known or reasonably believed to be minors. This approach aims to protect children's privacy while still enforcing necessary safety measures.
The act seeks to balance the need for child protection with concerns about user privacy and data security.
Establishing a Duty of Care and Accountability Measures
KOSA introduces a "duty of care" for online platforms, obligating them to act in the best interests of minors using their services. This duty encompasses the prevention and mitigation of risks such as exposure to harmful content, including material related to self-harm, eating disorders, and substance abuse.
To enforce this duty, the legislation requires platforms to undergo independent, third-party audits assessing their compliance with safety standards. These audits evaluate the effectiveness of implemented safeguards and identify areas for improvement.
Additionally, KOSA mandates that platforms provide detailed reports on their risk assessments and mitigation strategies, ensuring transparency and accountability. By holding platforms responsible for the safety of their younger users, the act aims to foster a more secure online environment for children and teenagers.
Companies would need to publish regular public reports about how their platforms affect kids. That means real data about screen time, the types of content being shown, and how many minors are using their services. If a platform is exposing kids to harmful material, it would have to admit it. This kind of openness could finally bring some accountability to how apps operate behind the scenes.
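For a sense of what such a public report could contain as structured data, here is a minimal sketch. The metric names and shape are assumptions for illustration; the bill describes categories of disclosure, not a format.

```typescript
// Illustrative sketch only: SB 1748 does not define a reporting format.
// The metric names and structure below are assumptions.

interface TransparencyReport {
  platform: string;
  reportingPeriod: { start: Date; end: Date };
  estimatedMinorUsers: number;                             // how many minors use the service
  averageDailyMinutesForMinors: number;                    // screen-time data
  contentCategoriesShownToMinors: Record<string, number>;  // e.g. impression counts by category
  harmfulContentReportsReceived: number;
  harmfulContentRemoved: number;
  mitigationsDescription: string;                          // plain-language summary of measures taken
}
```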
Tech platforms might soon have to explain how their recommendation systems work (not Substack, duh 😉). If a platform is promoting content using a secret algorithm, users (especially kids and their parents) will have the option to opt out and switch to a simple, chronological feed. Transparency is the goal, thanks to the newly reintroduced SB 1748, and it could change how social apps keep young people glued to their screens.