When whistleblower Frances Haugen revealed how Facebook, now Meta, knew about the dangers Instagram posed to teenagers’ mental health, she accused the company of continuing to “prioritise profit over safety”. The social media giant claimed otherwise, insisting the harms found in the picture-sharing app were unintentional.
But several new court cases filed by US state attorneys general, and more explosive whistleblower testimony to the US Congress, argue that Facebook “intentionally” designed “manipulative features that make children addicted to their platforms while lowering their self-esteem”.
The latest revelations stem from the court case brought to hold Meta accountable for knowingly harming the mental health of teenagers in pursuit of profit.
“Meta has profited from children’s pain by intentionally designing its platforms with manipulative features that make children addicted to their platforms while lowering their self-esteem,” said New York Attorney General Letitia James, one of 33 attorneys general who filed the lawsuit in California in October.
“Social media companies, including Meta, have contributed to a national youth mental health crisis and they must be held accountable,” she added.
On the same day, eight other attorneys general filed similar lawsuits in their state courts.
Facebook – as the company was known until it rebranded around its abortive $36 billion attempt to create a virtual reality (VR) world called the metaverse – broke numerous laws designed to protect minors, the lawsuits allege.
“Meta’s design choices and practices take advantage of and contribute to young users’ susceptibility to addiction,” according to the lawsuit. “They exploit psychological vulnerabilities of young users through the false promise that meaningful social connection lies in the next story, image, or video and that ignoring the next piece of social content could lead to social isolation.”
But more damning details emerged this month when parts of the lawsuit were unredacted, the Wall Street Journal reported. “Teens are insatiable when it comes to ‘feel good’ dopamine effects,” according to a now-unredacted Meta presentation. “And every time one of our teen users finds something unexpected their brains deliver them a dopamine hit.”
More damaging are the newly unredacted comments made by Instagram executives. “It’s not ‘regulators’ or ‘critics’ who think Instagram is unhealthy for young teens – it’s everyone from researchers and academic experts to parents,” Instagram head of policy Karina Newton wrote in a May 2021 email. “The blueprint of the app is inherently not designed for an age group that don’t have the same cognitive and emotional skills that older teens do.”
Contrary to a Facebook executive’s claim at a Congressional hearing that the company wasn’t thinking about the profitability of teen-focused apps, a 2018 email included in the lawsuit pointed to product decision-making based on the calculation that “the lifetime value of a 13 y/o teen is roughly $270”.
Talk about a smoking gun.
Meta denied this. “The complaint mischaracterises our work using selective quotes and cherry-picked documents,” said spokeswoman Stephanie Otway.
The same Otway is implicated in a damning email to Instagram head Adam Mosseri after the Wall Street Journal, which broke Haugen’s whistleblower revelations in 2021, asked for comment. “Our own research confirmed what everyone has long suspected,” Otway wrote to Mosseri in August 2021.
This new batch of unredacted material also includes allegations that Meta CEO Mark Zuckerberg “instructed his subordinates to give priority to boosting its platforms’ usage above the well-being of users,” the Wall Street Journal reported.
It cites an email thread from late 2017 into early 2018 in which Chris Cox, then Facebook’s chief product officer, and Alex Schultz, now chief marketing officer, discussed reducing how many notifications users received.
“Fundamentally I believe that we have abused the notifications channel as a company,” wrote Schultz in the unredacted email thread, concurring with Cox, who said the company shouldn’t back off doing what was “better for people” because usage metrics were down.
The WSJ reported, “Zuckerberg overrode them, according to the unredacted portions of the complaint, with executive Naomi Gleit, now head of product at Meta, saying that daily usage ‘is a bigger concern for Mark right now than user experience’.”
Another smoking gun.
Earlier in November, another highly placed Facebook whistleblower made damaging claims about how much the company knew about the mental health dangers.
Former engineer Arturo Bejar, who worked at Facebook from 2009 to 2015, testified before a Senate Judiciary subcommittee about Instagram and Facebook’s algorithms. He returned to Instagram in 2019 to work on its well-being team after his teenage daughter was sexually harassed on the picture-sharing app.
“She and her friends began having awful experiences, including repeated unwanted sexual advances, harassment,” Bejar told lawmakers. “She reported these incidents to the company and it did nothing.”
After spending a year collecting data, Bejar discovered that 51% of Instagram users reported a “bad or harmful experience” during the previous week, while 21% experienced bullying and 24% received unwanted sexual advances, according to National Public Radio. Only 2% of posts reported as harmful content were ever removed.
“It is unacceptable that a 13-year-old girl gets propositioned on social media,” Bejar testified. “We don’t tolerate unwanted sexual advances against children in any other public context, and they can similarly be prevented on Facebook, Instagram and other social media products.”
Appalled at his findings, Bejar emailed a two-page letter in 2021 to Zuckerberg, then-chief operating officer Sheryl Sandberg, Cox (then chief product officer) and Mosseri.
“I wanted to bring to your attention what I believe is a critical gap in how we as a company approach harm, and how the people we serve experience it,” he wrote. “There is no feature that helps people know that kind of behaviour is not ok.”
Not surprisingly, he never heard back from Zuckerberg, while the executives who did respond failed to address the problems, he said.
This month Bejar testified that “when I left Facebook in 2021, I thought the company would take my concerns and recommendations seriously. Yet, years have gone by and millions of teens are having their mental health compromised and are still being traumatised”.
- This column first appeared in Financial Mail