What Is the Kids Online Safety Act?
The Kids Online Safety Act (KOSA) is a bipartisan bill introduced in the U.S. Senate on May 14, 2025. Its goal is clear: hold online platforms accountable for protecting minors from digital harms.
KOSA applies to platforms that children under 18 are likely to use. These include social media sites, gaming apps, and connected devices. Broadband services and educational tools used in schools are exempt.
Instead of banning features like infinite scroll or autoplay, the Act requires platforms to take proactive steps. They must reduce risks and give minors and parents tools to manage online use.
The Duty of Care Under the Kids Online Safety Act
Section 3 of the Act creates a duty of care for online platforms. They must exercise reasonable care in design and operation to prevent specific harms.
These harms include:
- Mental health risks like anxiety, depression, eating disorders, and substance use.
- Physical threats such as violence or sexual exploitation.
- Social harms, including cyberbullying, predatory marketing, or promotion of self-harm.
The law also targets addiction-like design. Platforms must address features such as endless feeds, autoplay, and push notifications. These features are not banned, but platforms must offer tools to set limits, opt out, or disable them.
Safeguards for Kids and Families
Section 4 of the Kids Online Safety Act introduces practical tools for families. Platforms must:
- Enable privacy by default.
- Allow minors to block unwanted communications, especially from adults.
- Give parents supervisory controls, including:
  - Daily time limits or break requirements.
  - Tracking time spent on platforms.
  - Monitoring interactions in a way that respects privacy.

These safeguards give families control, making online use more intentional and less harmful.
Transparency and Accountability
Sections 5 and 6 focus on transparency. Platforms must disclose how algorithms deliver content, their policies, and related risks.
They must also issue annual public reports with details on:
- Risk assessments.
- Mitigation strategies.
- Prevalence of harmful content.
- Number of minor users.
This level of reporting builds trust and allows regulators, parents, and educators to hold platforms accountable.
Research and Best Practices
The Act is not static. Sections 7 through 10 direct federal agencies, including HHS and NIH, to study the effects of online platforms on children.
Independent researchers will gain access to platform data. The law also calls for reports on age verification and best practices for online safety. By grounding rules in science, the Kids Online Safety Act aims to evolve with technology.
Enforcement of the Kids Online Safety Act
Section 11 gives enforcement power to the Federal Trade Commission (FTC) and state attorneys general. Violations count as unfair or deceptive practices. Platforms may face civil penalties, and in some cases, individuals can bring private legal action.
To guide implementation, Section 12 creates the Kids Online Safety Council, made up of experts, advocates, and industry leaders.
If passed, most provisions will take effect 18 months after enactment, giving platforms time to adjust.
Why the Kids Online Safety Act Matters
The Kids Online Safety Act represents a major shift in online accountability. It emphasizes mitigation over prohibition, focusing on reducing risks while preserving innovation.
However, it is not a complete solution. Critics worry about free speech, burdens on platforms, and gaps in coverage. Supporters argue it's a long-overdue update to protect children in the digital age.
As the Act moves through Congress, parents, educators, and tech professionals should stay informed. In the meantime, tools like screen time limits and open conversations remain essential for online safety.
https://www.congress.gov/bill/119th-congress/senate-bill/1748/text