Leading Adviser Quits Over Instagram’s Failure to Remove Self-Harm Content – A prominent psychologist who advises Meta on suicide prevention and self-harm has resigned from her position, alleging that the tech giant is neglecting harmful content on Instagram and disregarding expert guidance in favor of financial gains.
Lotte Rubæk, who served on Meta’s global expert panel for more than three years, warned that the company’s failure to remove self-harm images promptly is putting vulnerable young women and girls at greater risk and contributing to rising suicide rates.
Feeling disheartened by Meta’s reluctance to address these issues, the Danish psychologist has stepped down from her role, asserting that the company prioritizes profit over the well-being and safety of its users. She believes that Meta is leveraging harmful content to keep susceptible youth engaged with their platforms, to the detriment of their mental health.
In her resignation letter, she wrote: “I can no longer be part of Meta’s SSI expert panel, as I no longer believe that our voice has a real positive impact on the safety of children and young people on your platforms.” In an interview with the Observer, Rubæk said: “On the surface it seems like they care, they have these expert groups and so on, but behind the scenes there’s another agenda that is a higher priority for them.”
That agenda, she said, was “how to keep their users’ interaction and earn their money by keeping them in this tight grip on the screen, collecting data from them, selling the data and so on.” A Meta spokesperson said: “Suicide and self-harm are complex issues and we take them incredibly seriously.”
“We’ve consulted with safety experts, including those in our suicide and self-harm advisory group, for many years and their feedback has helped us continue to make significant progress in this space.” “Most recently we announced we’ll hide content that discusses suicide and self-harm from teens, even if shared by someone they follow, one of many updates we’ve made after thoughtful discussion with our advisers.”
Rubæk’s warning coincides with Ofcom research, published last week, which found that violent online content is practically unavoidable for children in the UK, with many first encountering it at primary school age. Instagram was among the main apps named by respondents in the study.
Rubæk, who heads the self-injury team in child and adolescent psychiatry in the Capital Region of Denmark, was invited to join the group of 24 publicly listed experts in December 2020.
The invitation came after she had publicly criticised Meta, then known as Facebook, over an Instagram network linked to the suicides of young women in Norway and Denmark, the subject of a documentary by Danish broadcaster DR. She accepted in the hope of making the platform safer for young people.
However, after witnessing her suggestions repeatedly disregarded over the course of a couple of years—with the original network she criticized still in existence—she concluded that the panel served merely as a façade. Now she believes the invitation could have been an attempt to silence her. “Maybe they wanted me to be a part of them so I wouldn’t be so critical of them in the future.”
In correspondence reviewed by the Observer, Rubæk highlighted the difficulties users faced when trying to flag distressing images to Meta in October 2021. In an exchange with Martin Ruby, Meta’s head of public policy in the Nordics, she described attempting to report an image of an underweight woman.
Instagram responded that it did not have enough moderators to review the image, which remained on the platform. Ruby replied in November 2021: “Our people are looking at it, but it is not that simple.” In the same email, he mentioned the secret Instagram network that Rubæk had originally criticised, saying that Meta was “taking a closer look”.
Despite its extensively documented links to suicides, Rubæk says the network still operates today. Her patients tell her they have tried to flag self-harm images on Instagram, but the content frequently remains up. One client reported an image only to see it resurface through a friend’s account, suggesting it had merely been hidden from her own view.
Rubæk said Meta finds ways to avoid removing such content. “The AI is so clever, finding even the smallest nipple in a photo.” But when it comes to graphic pictures of self-harm that are proven to inspire others to harm themselves, she added, it appears to be a different story.