New Mexico is the latest state to accuse Meta of failing to protect underage users. According to an Engadget report, the state attorney general's office filed a lawsuit against the company this week after investigators created fake Instagram and Facebook accounts posing as teenagers.
The investigators used AI-generated profile pictures for the accounts, and the attorney general's office alleges that the accounts were soon flooded with explicit messages, images, and sexual propositions from other users.
The lawsuit further alleges that Meta allowed Facebook and Instagram to become a marketplace for predators targeting children, as reported by The Wall Street Journal.
Additionally, the lawsuit claims that Meta failed to implement measures to prevent children under the age of 13 from using its platforms and that CEO Mark Zuckerberg personally made product choices that increased risks to children.
To get past Meta's age restrictions, investigators supplied adult birthdates when creating the accounts for the four fictional children, mirroring how real children often claim to be older to access online services they aren't supposed to use.
The accounts nonetheless made clear that they were used by children, with one posting about losing a baby tooth and starting seventh grade. According to the lawsuit, investigators also set up one account to suggest that a fictional child's mother might be trafficking her.
Among other things, the lawsuit alleges that users sent the accounts sexually explicit images and offered to pay the children for sexual activities.
Within two days of investigators creating an account for a fake 13-year-old girl, Meta's algorithms suggested that she follow a Facebook account with more than 119,000 followers that posted adult content.
Investigators flagged inappropriate content (including some that appeared to depict naked, underage girls) through Meta's reporting systems, but according to the lawsuit, Meta's review systems often determined that the images were permitted on its platforms.
In a statement to the Journal, Meta said it prioritizes child safety and invests heavily in safety teams.
Earlier this year, Meta formed a working group to address child safety issues after reports indicated that Instagram's algorithms helped accounts that bought and posted sexual material involving underage users find one another.
The New Mexico lawsuit follows suits filed in October by a group of 41 states and the District of Columbia, which allege, among other things, that the company knew its platforms' "addictive" features were harmful to young users and that it misled people about the safety of its platforms.