The New York Times (NYT) and The Wall Street Journal (WSJ) on Saturday revealed previously redacted details of a US multi-state lawsuit against Facebook parent company Meta, which alleges that Meta deliberately designed its social media platforms to be attractive to children and collected children's personal data despite company guidelines barring young children from joining its platforms.
The original lawsuit, filed in October, was heavily redacted. It alleged that Meta engineered its social media platforms to maximize the number of young users and the amount of time they spent on its platforms in order to collect and sell their information for profit. The lawsuit also alleged that Meta went out of its way to hide studies and reports on the short- and long-term consequences of extended social media use on young users. The lawsuit claimed this conduct violates the Children's Online Privacy Protection Rule, promulgated by the Federal Trade Commission (FTC), which forbids companies from collecting data from children under 13 without obtaining consent from the child's parents or guardians, posting clear notice of data collection practices, allowing parents or guardians to easily review what data has been collected from their child, and maintaining strong security protocols to protect user privacy. The rule also prohibits conditioning a child's participation in a game, or receipt of a prize or other incentive, on the child sharing personal information.
The NYT and WSJ received a newly unsealed, unredacted version of the October suit, including some of the evidence cited by the plaintiff states. The unredacted complaint alleges that Meta received millions of complaints that accounts were owned or operated by users under 13, in violation of Meta's own policies and the FTC rule, yet deactivated only a small portion of the accounts in question. The complaint also cites internal Meta documents in which company leaders acknowledge engineering choices made to exploit young users psychologically, including by taking advantage of children's impulsivity, susceptibility to peer pressure, and difficulty in properly assessing risk. Communications from one unnamed executive warned that possible US government reform of regulations on children's online safety posed a potential business risk to Meta. The same executive later complained that the company freely identified and used children's data to sell for profit, but did not use that capability to identify and deactivate accounts known to be owned or operated by children in violation of company policies.
Meta responded to the WSJ and NYT news reports in a statement to CNN, writing:
We want teens to have safe, age-appropriate experiences online, and we have over 30 tools to support them and their parents. We’ve spent a decade working on these issues and hiring people who have dedicated their careers to keeping young people safe and supported online. The complaint mischaracterizes our work using selective quotes and cherry-picked documents.
This is not the first time Meta has been mired in controversy over its use of private user data and its platform policy and moderation practices. The European Data Protection Board (EDPB), the EU's data protection body, banned Meta in early November from using online behavioral data to target advertising. Meta Ireland was also fined by the Irish Data Protection Commission (DPC) in January for using personal user data to target potential customers with advertising. Meta has also been accused of serious moderation failures that allegedly contributed to human rights abuses in Myanmar and Ethiopia.