Facebook parent Meta has been accused of violating US federal children’s privacy law by knowingly allowing underage users on Instagram. According to a recently unsealed legal complaint filed by the attorneys general of 33 states, Meta has received more than 1.1 million reports of users under the age of 13 on Instagram since 2019.
Despite these reports, the company allegedly disabled only a fraction of those accounts and has continued to collect children’s personal data, including their locations and email addresses, without their parents’ consent.
The complaint, filed last month in federal district court in California, alleges that Meta has repeatedly violated the Children’s Online Privacy Protection Act (COPPA), a federal law that prohibits the collection of personal data from children under the age of 13 without parental consent. The states are seeking civil penalties against Meta that could amount to hundreds of millions of dollars or more.
Beyond the privacy violations, the lawsuit accuses Meta of hooking young people on Instagram and Facebook and of contributing to a mental health crisis among them through the addictive design of these platforms.
The complaint alleges that Meta carried out internal studies showing the negative impact of its platforms on young users, yet neither revealed the findings to the public nor took any substantial steps to protect them.
Meta has denied the allegations and reiterated that it is committed to protecting children on its platforms. The company also says it is cooperating with the ongoing investigations.
The lawsuit is the latest in a series of legal challenges against Meta over its handling of user data. In 2019, the US Federal Trade Commission (FTC) fined the company $5 billion over privacy violations. The FTC also imposed new restrictions and a modified corporate structure to make Meta more transparent and accountable in its handling of user privacy.
Instagram currently has over 1 billion monthly active users, of which an estimated 8% are aged 13 to 17, according to Statista. Though the platform requires US users to be at least 13 years old, the complaint alleges, citing internal documents, that the company allowed millions of children under 13 onto the platform without seeking their parents’ permission.
The unsealed documents also reveal that Meta employees expressed concerns in internal communications about Instagram's harmful content damaging teenagers' mental well-being, with ranking algorithms trapping them in negative feedback loops.
Meta has also been rolling out age verification tools for teenagers since 2022 and expanded them to more countries in 2023. To update their age to 18 or over, users must provide an ID, record a video selfie, or have mutual friends vouch for their age.
Under India's Digital Personal Data Protection Act, 2023, online platforms must obtain verifiable parental consent for users under the age of 18. Failure to do so could result in penalties of up to ₹200 crore (approximately $24 million).