Meta Faces Another Lawsuit Over Child Safety

Meta is facing accusations from New Mexico for allegedly neglecting the protection of younger users. The state’s attorney general initiated legal action this week, citing instances where investigators, posing as preteens or teenagers on Instagram and Facebook through test accounts with AI-generated profile photos, encountered explicit messages, images, and sexual propositions from other users.
The attorney general’s office further contends that Meta’s algorithms actively recommended sexual content to these test accounts. The suit claims that “Meta has allowed Facebook and Instagram to become a marketplace for predators in search of children upon whom to prey,” according to The Wall Street Journal. Furthermore, the accusation asserts that Meta did not implement measures to prevent individuals under the age of 13 from accessing its platforms.
It also claims that CEO Mark Zuckerberg bears personal liability for decisions related to products that heightened risks for children. To circumvent Meta’s age restrictions, investigators submitted adult dates of birth when creating fictitious accounts for four children. This tactic is commonly employed by kids attempting to access online services.
Despite using adult dates of birth, the investigators implied that the accounts belonged to children, with one account posting about losing a baby tooth and starting seventh grade. The lawsuit alleges that the investigators also set up one account to suggest that the fictional child’s mother might be involved in trafficking.
According to the lawsuit, the accounts received explicit content, including child sexual images and propositions for paid sex. Two days after setting up an account for a fictitious 13-year-old girl, Meta’s algorithms recommended following a Facebook account with over 119,000 followers that shared adult content.
Despite investigators flagging inappropriate material, including potentially underage nude images, through Meta’s reporting systems, the lawsuit alleges that Meta’s systems frequently deemed such content permissible on its platforms. In a statement to the Journal, Meta claimed it prioritizes child safety and invests heavily in safety teams.
“We use sophisticated technology, hire child safety experts, report content to the National Center for Missing and Exploited Children, and share information and tools with other companies and law enforcement, including state attorneys general, to help root out predators,” the company said. Meta asserts that it actively works to prevent harmful adults from contacting children on its platforms.
Earlier this year, Meta established a task force dedicated to addressing child safety concerns, prompted by reports suggesting that Instagram’s algorithms facilitated connections between accounts involved in the commissioning and purchasing of underage sexual material. Just last week, the Journal highlighted the reported prevalence of child exploitation material on Instagram and Facebook.
According to the Canadian Centre for Child Protection, a “network of Instagram accounts with as many as 10 million followers each has continued to livestream videos of child sex abuse months after it was reported to the company.” Meta says it has taken action on such issues. The New Mexico legal action follows a joint lawsuit filed by 41 states and the District of Columbia in October.
Among various claims, they asserted that the company was aware that its “addictive” features were detrimental to young users and accused it of misleading people about safety on its platforms. As the legal proceedings unfold, the broader conversation about social media’s responsibility to safeguard its users, especially minors, continues to evolve.