Pre-adolescent girl texting on a smartphone while lying in bed at home (image licensed via Adobe Stock)

Facebook’s…Er, Meta’s Instagram Problem

Despite employee-recommended solutions, Facebook has largely turned a blind eye to the harms its own algorithms cause teenage girls who use Instagram

The Wall Street Journal’s series of articles based on leaked internal documents from Facebook, Inc. (now Meta Platforms, Inc.) gave us a peek inside the company’s business model, including what the company knew about Instagram’s harmful mental health effects on teen girls.* 

Former Facebook employee Frances Haugen provided thousands of pages of internal documents and Slack conversations to the Wall Street Journal’s Jeff Horwitz and has since made the documents available to other media outlets. Gizmodo is working with a group of experts to make the documents public; they can be viewed here.

On October 5th, Haugen testified before the Senate Committee on Commerce, Science, and Transportation on Facebook’s lack of transparency regarding the harms of its platforms. The full testimony can be seen in the video below. Haugen begins speaking at 24:00.

Among the many revelations from her testimony and her documents are Facebook’s knowledge of how harmful Instagram is for teen girls and its failure to implement the solutions its own employees recommended to mitigate those harms. As Haugen put it, “Facebook will not make necessary changes because they will put their astronomical profits before people.”

Facebook’s co-founder and CEO Mark Zuckerberg told Congress last year that the company didn’t have data on the harmful effects of Instagram on teen mental health. In reality, the company had research showing that Instagram was causing vulnerable teen girls to spiral into extreme eating disorders, suicidal ideation, and self-harm. Facebook, Inc. was unwilling to change how its algorithms recommend content because doing so would reduce user engagement. Haugen’s testimony reveals that engagement is the reigning metric at Facebook: the more users engage with the platform, the more the company profits.

Instagram Is Uniquely Harmful to Teens

In 2017, the UK’s Royal Society for Public Health and the Young Health Movement published their #StatusOfMind report on well-being and social media platforms. Their survey of 1,500 people ages 14 to 24 found that all but one of the social media platforms received negative scores for well-being. Instagram scored the lowest, followed by Snapchat; both are “image-focused” platforms. The average age at which a child creates an Instagram account is 10, even though the company says users must be 13 to open one.

As of 2019, Instagram had 22 million daily teen users, even as it competes with TikTok and Snapchat for the teen market. Unlike its competitors, though, Instagram has been consistently linked to increases in anxiety and depression across all user groups. Furthermore, Facebook’s own internal research showed that Instagram is particularly problematic for young women and that these problems go beyond issues with social media in general.

Unlike TikTok, which focuses on performance, and Snapchat, which focuses on the face and funny filters, Instagram focuses on the whole body and lifestyle. As the WSJ article points out, the features at Instagram’s core are a perfect storm of influences particularly harmful to teens. For example, users are incentivized to share only their best moments, offering a curated version of their lives. They are pressured to look perfect or risk not getting likes (at best) or being bullied (at worst).

Haugen said that Facebook knows engagement-based ranking, the way the algorithm picks recommended content for users, amplifies preferences:


[T]hey have done something called a pro-active incident response where they take things that they’ve heard, for example, like ‘Can you be led by the algorithms to anorexia content?’ They have literally recreated that experiment themselves and confirmed Yes this happens to people. So Facebook knows that they are leading young users to anorexia content. 

Beginning at 51:20, WSJ’s video of Frances Haugen’s Testimony before the Senate
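To make that dynamic concrete, here is a minimal sketch in Python of an engagement-based ranker. Everything in it (the posts, the topic affinities, the scoring weights) is invented for illustration; this is not Facebook’s code, only the general shape of the technique Haugen describes.

    # Illustrative sketch only: invented data and weights, not Facebook's code.
    # An engagement-based ranker scores each candidate post by predicted
    # engagement, so whatever a user already dwells on rises to the top.
    from dataclasses import dataclass

    @dataclass
    class Post:
        topic: str
        predicted_likes: float       # model's guess at like probability
        predicted_dwell_secs: float  # model's guess at viewing time

    def engagement_score(post, topic_affinity):
        # Affinity is learned from past behavior: the more a user has
        # engaged with a topic, the higher every new post on it scores.
        affinity = topic_affinity.get(post.topic, 0.1)
        return affinity * (post.predicted_likes + 0.05 * post.predicted_dwell_secs)

    def rank_feed(candidates, topic_affinity):
        # Pure engagement ranking: no term for well-being or content safety.
        return sorted(candidates, key=lambda p: engagement_score(p, topic_affinity), reverse=True)

    # A user who has lingered on dieting content sees more of it, which
    # raises the affinity further on the next pass -- a feedback loop.
    affinity = {"extreme-dieting": 0.9, "travel": 0.2}
    feed = rank_feed([Post("travel", 0.6, 12), Post("extreme-dieting", 0.4, 30)], affinity)
    print([p.topic for p in feed])  # ['extreme-dieting', 'travel']

Nothing in the score penalizes harmful content, so each engagement with extreme-dieting posts raises that topic’s affinity, which ranks the next such post even higher: the rabbit hole Haugen says Facebook reproduced in its own experiments.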

Eating disorders, including anorexia, have the highest mortality rate among mental illnesses, disproportionately affecting young women. 

The Wall Street Journal interviewed Anastasia Vlasova for The Journal podcast. As a teen, Vlasova struggled with an eating disorder that she says was stoked by Instagram’s recommendations after she started following fitness influencers:

I realized that Instagram was really triggering a lot of my eating disorder… I realized that I had heightened anxiety and also experienced more eating disorder triggers when I was on social media, but I also acknowledge[d] the fact that I probably wouldn’t be deleting social media, so I just accepted it that I was just going to live as this anxious human being with an eating disorder because of social media.

Vlasova eventually went to counseling because she started having suicidal thoughts and finally came to a point where she was able to delete her Instagram account.

Many people, particularly those who did not grow up with social media, may wonder why Vlasova didn’t delete her Instagram account sooner. Haugen said in her testimony that Facebook knows Instagram dramatically changes the experiences of today’s high school students in negative ways: bullying and ridicule no longer stay at school but follow a child everywhere they go. Facebook also knows parents are ill-equipped to help teens caught up in the social media world.

According to Haugen,

Facebook knows that parents today, because they didn’t experience these things, they never experienced this addictive experience with a piece of technology, they give their children bad advice. They say things like ‘Why don’t you stop using it?’ And so that’s Facebook’s own research, that children express feelings of loneliness and are struggling with these things because they can’t even get support from their own parents. 

Beginning at 1:07:35, WSJ’s video of Frances Haugen’s Testimony before the Senate

I’ve written elsewhere on how technology addictions have elements of process addictions, like gambling or shopping, and of anxiety-reducing compulsions, like obsessive-compulsive disorder (here and here).

Hooking Kids When They Are Young

Haugen’s documents showed that Facebook was working on “Instagram Kids,” a platform geared toward pre-teens that would be an answer to Snapchat’s and TikTok’s silly filters and fun features. The company has put those plans on hold amidst the backlash from the WSJ’s Facebook Files. When asked whether Facebook is likely to discontinue Instagram Kids, Haugen told the Senate committee:

I would be sincerely surprised if they do not continue working on Instagram Kids. And I would be amazed if in a year from now we don’t have this conversation again.

Senator Schatz: Why?

Haugen: Facebook understands that if they want to continue to grow, they have to find new users. They have to make sure that the next generation is just as engaged with Instagram as the current one. The way that they do that is by making sure children establish habits before they have good self-regulation.

Senator Schatz: By hooking kids?

Haugen: By hooking kids. I would like to emphasize one of the documents that we sent in on problematic use examined the rates of problematic use by age, and that peaked with 14-year-olds. It’s just like cigarettes. Teenagers don’t have good self-regulation. They say explicitly, ‘I feel bad when I use Instagram and yet I can’t stop.’ We need to protect the kids.

Beginning at 1:03:35, WSJ’s video of Frances Haugen’s Testimony before the Senate

Essentially, Facebook, Inc. wants kids to form a dependence on its platforms when they are young so that they will keep using those platforms as they move into the lucrative teen and early-college years.

When Profits Are More Important Than People: Design Features that Promote Compulsive Use

Ethicists like to debate whether some technologies are inherently good or evil. One side says technology itself is neither good nor evil; only the people who use it are. The other side points out that the designers of a technology are moral agents who can design it for good or bad uses.

Facebook and many Silicon Valley companies espouse the first view, at least in their public statements, putting the burden of responsibility on the user. In a blog post, for example, Facebook, Inc. acknowledged that some people feel worse after passively consuming content on its platforms but that others, who actively engage with fellow users, feel better:

In sum, our research and other academic literature suggests that it’s about how you use social media that matters when it comes to your well-being.

“Hard Questions: Is Spending Time on Social Media Bad for Us?” at Meta

This wording absolves the company, and its algorithmic recommendations, of responsibility for the content that users consume.


However, the founders of the Center for Humane Technology, Frances Haugen, and several other former Facebook employees take the second view. Justin Rosenstein, one of the co-creators of the “like” button, and Roger McNamee, an early Facebook investor, have criticized Instagram and Facebook’s addictive design features.

To be clear, Haugen and others say that Facebook did not set out to build an addictive platform. But as the company tweaked its features to promote user engagement, the psychological impact of interacting with its social networks changed. Along with algorithms that feed users extreme content to keep them engaged, the platform uses intermittent variable rewards, “likes” and alerts that prompt users to compulsively check the app and upload more content, and a bottomless newsfeed that encourages spending more time passively consuming content, something Facebook admits leads to negative mental health effects.

The gambling industry, which profits largely from compulsive players, has been using these same design features for years. (See the Center for Humane Technology for more on this topic.)
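The reinforcement schedule itself takes very little machinery. Here is a toy sketch in Python of intermittent variable rewards; the LikeBatcher class, the 40 percent payout chance, and the batching logic are all invented for illustration, not drawn from Facebook’s systems. Likes are held back and released unpredictably, the same schedule a slot machine uses:

    # Illustrative sketch only: a made-up notification batcher.
    # Instead of surfacing each like as it arrives, hold likes back and
    # release them at unpredictable moments -- a variable-ratio reward
    # schedule, the kind that conditions compulsive checking.
    import random

    class LikeBatcher:
        def __init__(self):
            self.pending = 0  # likes received but not yet shown

        def record_like(self):
            self.pending += 1

        def poll(self):
            # Called each time the user opens the app. Sometimes it pays
            # out a burst of likes; usually it doesn't, so the user
            # checks again soon.
            if self.pending and random.random() < 0.4:
                payout, self.pending = self.pending, 0
                return payout
            return 0

    batcher = LikeBatcher()
    for _ in range(5):
        batcher.record_like()
    # Five app-opens; the whole batch lands on an unpredictable one.
    print([batcher.poll() for _ in range(5)])

Whether any given check pays out is random, and that unpredictability is exactly what makes the checking compulsive; a fixed, predictable schedule would be far less habit-forming.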

Facebook’s Business Model Is Flawed

Instagram and its impact on teen mental health is only part of a larger problem with Facebook’s business model, and those problems are not solved by rebranding as Meta. According to Haugen, Facebook repeatedly encountered conflicts between profits and safety, and it repeatedly resolved them in favor of profits because its decisions are metrics-driven.

Additionally, Haugen, who has worked at four different social networks, says that Zuckerberg holds a unique role in the tech industry, one that has left him largely unaccountable for the company’s harms:

[Mark Zuckerberg] holds over 55% of all the voting shares of Facebook. There are no similarly powerful companies that are as unilaterally controlled…I received an MBA from Harvard and they emphasized to us that we are responsible for the organizations that we build. Mark has built an organization that is very metrics driven. It is intended to be flat. There is no unilateral responsibility. The metrics make the decision. Unfortunately, that itself is a decision, and in the end, if he is the CEO and the chairman of Facebook, he is responsible for those decisions.

Beginning at 37:05, WSJ’s video of Frances Haugen’s Testimony before the Senate

At the end of the day, Haugen says the “buck stops with Mark.”

Next time, we’ll take a closer look at why Facebook’s overconfidence in its own algorithms has caused global problems.


* Meta Platforms, Inc. owns Facebook, Instagram, Messenger, and WhatsApp, as well as Oculus, which makes virtual reality headsets.


Heather Zeiger

Heather Zeiger is a freelance science writer in Dallas, TX. She has advanced degrees in chemistry and bioethics and writes on the intersection of science, technology, and society. She also serves as a research analyst with The Center for Bioethics & Human Dignity. Heather writes for bioethics.com and Salvo Magazine, and her work has appeared in Relevant, MercatorNet, Quartz, and The New Atlantis.
