4 essential pieces of research that explain Meta-Facebook’s problems

Meta, née Facebook, had a rough year in 2021, in public opinion if not financially. Revelations from whistleblower Frances Haugen, first detailed in a Wall Street Journal investigative series and then presented in congressional testimony, show that the company was aware of the harm it was causing. Growing concerns about misinformation, emotional manipulation and psychological harm came to a head when Haugen released internal documents showing that Meta’s own research confirmed the societal and individual harm its Facebook, Instagram and WhatsApp platforms cause. The Conversation gathered four articles from our archives that delve into research explaining Meta’s problematic behavior.

1. Addicted to engagement

At the root of Meta’s harmfulness is its set of algorithms, the rules the company uses to choose what content you see. The algorithms are designed to boost the company’s profits, but they also allow misinformation to thrive.

The algorithms work by increasing engagement – in other words, by provoking a response from the company’s users. Indiana University’s Filippo Menczer, who studies the spread of information and misinformation in social networks, explains that engagement plays into people’s tendency to favor posts that seem popular. “When social media tells people an item is going viral, their cognitive biases kick in and translate into the irresistible urge to pay attention to it and share it,” he wrote.

One result is that low-quality information that gets an initial boost can garner more attention than it otherwise deserves. Worse, this dynamic can be gamed by people aiming to spread misinformation. “People aiming to manipulate the information market have created fake accounts, like trolls and social bots, and organized fake networks,” Menczer wrote. “They have flooded the network to create the appearance that a conspiracy theory or a political candidate is popular, tricking both platform algorithms and people’s cognitive biases at once.”
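To make the dynamic Menczer describes concrete, here is a minimal toy simulation, with invented posts and probabilities. It is not Meta’s actual ranking code, only a sketch of how engagement-based ranking plus a small fake boost can let a low-quality post crowd out a better one.

```python
import random

random.seed(0)

# Two hypothetical posts: one genuinely good, one low quality but given a
# small initial boost by fake accounts (all numbers are made up).
items = [
    {"name": "high-quality post", "quality": 0.9, "shares": 0},
    {"name": "low-quality post", "quality": 0.2, "shares": 5},
]

for _ in range(10_000):  # simulated user sessions
    # Engagement-based ranking: the more shares a post already has,
    # the more likely the feed is to surface it.
    weights = [1 + item["shares"] for item in items]
    seen = random.choices(items, weights=weights, k=1)[0]

    # Popularity bias: the decision to share depends partly on quality
    # and partly on how popular the post already looks.
    total_shares = sum(item["shares"] for item in items)
    popularity_cue = seen["shares"] / (1 + total_shares)
    if random.random() < 0.5 * seen["quality"] + 0.5 * popularity_cue:
        seen["shares"] += 1

for item in items:
    print(item["name"], item["shares"])
```

Run it and the low-quality post ends up with the overwhelming majority of shares, simply because its initial boost keeps it at the top of the simulated feed.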
2. Kneecapping teen girls’ self-esteem

Some of the most disturbing revelations concern the harm Meta’s Instagram social media platform causes adolescents, particularly teen girls. University of Kentucky psychologist Christia Spears Brown explains that Instagram can lead teens to objectify themselves by focusing on how their bodies appear to others. It also can lead them to make unrealistic comparisons of themselves with celebrities and with filtered and retouched images of their peers.

Even when teens know the comparisons are unrealistic, they end up feeling worse about themselves. “Even in studies in which participants knew the photos they were shown on Instagram were retouched and reshaped, adolescent girls still felt worse about their bodies after viewing them,” she wrote.

The problem is widespread because Instagram is where teens tend to hang out online. “Teens are more likely to log on to Instagram than any other social media site. It is a ubiquitous part of adolescent life,” Brown wrote. “Yet studies consistently show that the more often teens use Instagram, the worse their overall well-being, self-esteem, life satisfaction, mood and body image.”
3. Fudging the numbers on harm

Meta has, not surprisingly, pushed back against claims of harm despite the revelations in the leaked internal documents. The company has presented research showing that its platforms do not cause harm in the way many researchers describe, and it claims that the overall picture from all research on harm is unclear.

University of Washington computational social scientist Joseph Bak-Coleman explains that Meta’s research can be both accurate and misleading. The explanation lies in averages. Meta’s studies look at effects on the average user. Given that Meta’s social media platforms have billions of users, harm to many thousands of people can be lost when all of the users’ experiences are averaged together. “The inability of this type of research to capture the smaller but still significant numbers of people at risk—the tail of the distribution—is made worse by the need to measure a range of human experiences in discrete increments,” he wrote.
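A back-of-the-envelope calculation, using invented numbers rather than anything from Meta’s studies, shows how a platform-scale average can look benign even when millions of people in the tail are badly affected:

```python
# Illustrative numbers only, not Meta's data.
users = 1_000_000_000            # rough order of magnitude for a global platform
harmed_fraction = 0.01           # suppose 1% of users experience a strong negative effect
effect_on_harmed = -5.0          # arbitrary well-being change for that group
effect_on_everyone_else = 0.05   # slight positive effect for the remaining 99%

average_effect = (harmed_fraction * effect_on_harmed
                  + (1 - harmed_fraction) * effect_on_everyone_else)

print(f"Average effect per user: {average_effect:+.4f}")              # roughly zero
print(f"Users in the harmed tail: {int(users * harmed_fraction):,}")  # 10,000,000
```

The average works out to about -0.0005 per user, which sounds like no effect at all, even though 10 million hypothetical users were seriously harmed.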
4. Hiding the numbers on misinformation

Just as evidence of emotional and psychological harm can be lost in averages, evidence of the spread of misinformation can be lost without the context of another type of math: fractions. Despite substantial efforts to track misinformation on social media, it’s impossible to know the scope of the problem without knowing the number of overall posts social media users see each day. And that’s information Meta doesn’t make available to researchers.

The overall number of posts is the denominator to the misinformation numerator in the fraction that tells you how bad the misinformation problem is, explains UMass Amherst’s Ethan Zuckerman, who studies social and civic media. The denominator problem is compounded by the distribution problem, which is the need to figure out where misinformation is concentrated. “Simply counting instances of misinformation found on a social media platform leaves two key questions unanswered: How likely are users to encounter misinformation, and are certain users especially likely to be affected by misinformation?” he wrote.

This lack of information isn’t unique to Meta. “No social media platform makes it possible for researchers to accurately calculate how prominent a particular piece of content is across its platform,” Zuckerman wrote.
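A tiny worked example, again with hypothetical figures, shows why a raw misinformation count means little without that denominator:

```python
# Hypothetical counts, chosen only to show why the denominator matters.
misinfo_views = 2_000_000  # misinformation exposures researchers managed to count

def prevalence(misinfo_views: int, total_views: int) -> float:
    """Fraction of all viewed posts that were misinformation."""
    return misinfo_views / total_views

# The same numerator tells two very different stories depending on the
# denominator the platform doesn't disclose:
print(f"{prevalence(misinfo_views, 100_000_000):.4%}")      # 2.0000% of all views
print(f"{prevalence(misinfo_views, 20_000_000_000):.4%}")   # 0.0100% of all views
```

Two million misinformation views is alarming if users saw 100 million posts in total, and marginal if they saw 20 billion; without the denominator, researchers cannot tell which story is true.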

Editor’s note: This story is a roundup of articles from The Conversation’s archives.

Eric Smalley, Science + Technology Editor, The Conversation

This article is republished from The Conversation under a Creative Commons license. Read the original article.

https://www.fastcompany.com/90707931/4-essential-pieces-of-research-that-explain-meta-facebooks-problems
