The US Senate Committee on Commerce, Science, and Transportation on Wednesday approved two bills that would expand the safety and privacy protections accorded to children and teenagers on commercial electronic platforms. The move is seen as a response to concerns over algorithms used by social media platforms such as TikTok, Snapchat and YouTube.
The first bill, the Kids Online Safety Act (S. 3663), was introduced on February 16. It defines a “minor” as an individual aged 16 years or younger and imposes a duty upon platforms (applications and services) to act in a minor user’s “best interests.” Specifically, platforms will have a duty to prevent risks of harm to minors arising from content that promotes self-harm, suicide, eating disorders, substance abuse and addiction, as well as from predatory marketing practices.
The bill requires platforms to provide safeguards protecting minors and their personal data: making it harder for strangers to contact a minor, giving minors the option to opt out of algorithms that use personal data, and limiting the time minors spend on the platform, including by curbing features such as notifications and “auto-play.” Settings must default to the strongest safeguards whenever the platform knows that a particular user is a minor. The bill also mandates parental controls and a “readily accessible and easy-to-use” harm reporting mechanism. Platforms with over 10 million active users must publish reports at least annually on the foreseeable risks their services pose to minors.
Meanwhile, the second bill, the Children and Teens’ Online Privacy Protection Act (S. 1638), introduced Wednesday by Senators Ed Markey and Bill Cassidy, amends the Children’s Online Privacy Protection Act (COPPA, 15 U.S.C. §§ 6502–6506) to extend the protections currently accorded to children under 13 to all minors between 12 and 16 years old. For instance, one year after the bill’s enactment, it will be unlawful for platforms to use or sell minors’ personal information for targeted marketing.
Once enacted, a violation of these rules will constitute an “unfair or deceptive act or practice” within the meaning of the Federal Trade Commission (FTC) Act (15 U.S.C. § 57a(a)(1)(B)), enabling the FTC to bring enforcement actions against errant platforms.
Action against the harmful impact of social media on children’s privacy and mental health has intensified over the past year. Markey, who also sponsored COPPA in 1998, stated in 2021 that all Big Tech social media platforms operated on “the same computer code of misconduct,” thus “endangering kids and teens.”
The bills, adopted by voice vote, must pass in the full Senate before they can go to the House of Representatives.