
Mark Zuckerberg vetoed attempts to address teen mental health on Meta platforms, new lawsuit alleges

2023-11-09 16:24

With Meta still embroiled in lawsuits over its slow response to its platforms' effects on young users, CEO Mark Zuckerberg is now under fire for reportedly blocking attempts to address the company's role in a worsening teen mental health crisis.

According to newly unsealed court documents in a Massachusetts case against Meta, Zuckerberg was made aware of ongoing concerns about user mental wellbeing in the years before the Wall Street Journal investigation and the subsequent Congressional hearing. The CEO repeatedly ignored or shut down actions proposed by Meta's top executives, including Instagram head Adam Mosseri and Meta's president of global affairs, Nick Clegg.

SEE ALSO: Meta's moderation failures incite hate and human rights abuses, according to Amnesty International

Specifically, Zuckerberg passed on a 2019 proposal to remove popular beauty filters from Instagram, which many experts connect to worsening self-image, unattainable beauty standards, and perpetuating discrimination against people of color. Despite support for the proposal among other senior Instagram executives, the 102-page court document alleges, Zuckerberg vetoed the suggestion in 2020, saying he saw high demand for the filters and "no data" that they were harmful to users. A briefing with mental health experts on the proposal was allegedly cancelled the day before it was scheduled to take place.

The documents also include a 2021 exchange between Clegg and Zuckerberg, in which Clegg forwarded a request from Instagram's wellbeing team asking for an investment of staff and resources for teen wellbeing, including a team to address areas of "problematic use, bullying+harassment, connections, [and Suicide and Self-Injury (SSI)]," Insider reports.

While Clegg reportedly told Zuckerberg that the request was "increasingly urgent," Zuckerberg ignored his message.

The Massachusetts case is yet another legal hit for Meta, which has been lambasted by state governments, parent coalitions, mental health experts, and federal officials for ignoring internal research and remaining complicit in social media's negative effects on young users.

SEE ALSO: 'Profound risk of harm': Surgeon General issues warning about youth social media use

On Oct. 25, a group of 41 states and the District of Columbia sued Meta for intentionally targeting young people with features like "infinite scroll" and algorithmic recommendations that push them toward harmful content on platforms including Instagram, WhatsApp, Facebook, and Messenger.

In 2022, Meta faced eight simultaneous lawsuits across various states accusing the company of "exploiting young people for profit" and purposefully making its platforms psychologically addictive while failing to protect its users.

Meta's not the only tech or social media giant facing potential legal repercussions for its role in catalyzing harmful digital behavior. Utah's Division of Consumer Protection (UDCP) filed a lawsuit against TikTok in October, claiming the app's "manipulative design features" negatively affect young people's mental health, physical development, and personal lives. Following a similar case from a Seattle public school district, a Maryland school district filed a lawsuit against nearly all popular social platforms in June, accusing the apps' addictive design of "triggering crises that lead young people to skip school, abuse alcohol or drugs, and overall act out" in ways that are harmful to their education and wellbeing.

Since the 2021 congressional hearing that put Meta's youth mental health problems on public display, the company has launched a series of parental control and teen safety measures, including oversight tools on Messenger and Instagram intended to protect young users from unwanted interactions and reduce their screen time.