The Kids Online Safety Act is bipartisan legislation to protect children online and hold social media companies accountable. Senator Richard Blumenthal, a Democrat from Connecticut, and Senator Marsha Blackburn, a Republican from Tennessee, introduced the act and have lauded it as one of the very few pieces of legislation in the current political climate that both sides of the ideological aisle can reasonably support.
The Internet is a difficult thing to control. Children today have access to a range of inappropriate material that those over 40 could scarcely have imagined in their own youth. We are also living through a teen mental health crisis in which bullying, suicidal ideation, and eating disorders are at record levels.
The Kids Online Safety Act will provide parents and young people alike the safeguards and tools they need to protect themselves. Two of the most critical pieces of the legislation are Sections 3 and 4.
Under Section 3 – Duty of Care – the act covers the following critical areas:
- Prevention of Harm to Minors
- Covered platforms must take measures to prevent and mitigate a core set of harms, including the promotion of suicide, sexual exploitation, substance abuse, eating disorders, and products unlawful for minors (alcohol, gambling, tobacco products, and narcotics).
- Limitation
- The duty of care does not, however, require a platform to restrict minors' access to potentially essential support services, such as suicide prevention resources.
Under Section 4 – Safeguards for Minors – four areas are highlighted:
1. Reporting Mechanism
- Platforms must provide a mechanism for reporting potential harms to minors, and they must respond in a timely manner appropriate to the urgency of the report.
2. Safeguards
- Platforms are required to provide features such as controls over personalized recommendation systems, limits on screen time, and restrictions on design features that encourage compulsive use. These safeguards are aimed at protecting minors against addiction, exploitation, and stalking; they must be enabled for minors by default and presented in language an adolescent audience can understand.
3. Parental Tools
- Platforms must offer tools that help parents track a minor’s time spent on the platform, view purchases, and control safety settings. These tools will be offered on an opt-in basis for teens but enabled by default for children.
4. Advertising of Illegal Products
- Advertising of the previously mentioned unlawful products (narcotics, alcohol, etc.) to minors will be prohibited.
Finally, there is an important section on transparency. A platform must issue a public report each year, informed by a third-party audit, that identifies the platform’s risks to minors and covers the mitigation and prevention measures the platform has taken.
The act was first introduced in February 2022 and passed the Senate Commerce Committee by a unanimous vote on July 27, 2022.