Haugen worked at Facebook for two years and claims the company chooses profits over safety
Haugen has accused Facebook of putting profits before the well-being of its users — from failing to protect children and their mental health, to fuelling misinformation and inciting political violence. She’s also called for stricter government oversight to address these problems.
Haugen started working at Facebook in 2019, hoping to help solve problems around misinformation on the platform, but said she quickly noticed an "impasse."

"The problem is the people whose job is to find these problems, and the people whose job is to authorize fixing these problems, are different people," she said. If fixing a problem doesn't align with the incentives of those authorized to fix it — such as company growth — it didn't get fixed, she said.

Part of Haugen's work was on the civic integrity team, which she described as being tasked with making Facebook a "positive force in politics." But that team was dissolved a month after the 2020 U.S. presidential election, its staff moved to a broader safeguarding team that did not have a specific mandate on politics. That's when Haugen decided to go public with her concerns.

"It showed a level of lack of commitment and, like, a blindness, that I was like, 'This is just not acceptable.' You can't have a force that's this dangerous that thinks itself as safe," she said.

Haugen took pictures of internal documents before she left, which became the basis of a series of Wall Street Journal exposés. Among her allegations were that the company was aware that its Instagram platform could have a negative impact on the body image and mental health of its users, but it failed to take action — something that was of particular concern to U.S. lawmakers when Haugen testified before a Senate committee last October. At the time, Facebook responded that "the story focuses on a limited set of findings and casts them in a negative light," but it stood by the research.
Haugen also alleged that an algorithm change in 2018 prioritized showing users content with more comments or shares, but that much of that engagement was negative, such as people arguing within comment threads. Though the new algorithm brought more eyes to divisive content, she said, it also increased the amount of time users spent on the platform, which in turn increased revenue from digital ad sales.

Haugen told The Current she didn't think the company set out to "incentivize rage," but it happened amid an overall drive to increase interaction on the site. "They didn't spend enough on safety systems or on people watching for these problems. That was the real issue," she said.

In an email statement to The Current, a spokesperson for Meta, the parent company of Facebook, said the premise at the centre of Haugen's claims is "false." "Yes, we're a business and we make profit, but the idea that we do so at the expense of people's safety or well-being misunderstands where our own commercial interests lie." The statement further said Facebook has "over 40,000 people to do one job: keep people safe on our services."

Haugen said advocates have long raised concerns about Facebook's operations and impact, but transparency has been a key problem. When concerns are raised, she said, the company will often downplay external evidence as "anecdotal," without revealing its own investigations into problems or outlining what corrective action is taken.

She described a hypothetical scenario in which there are concerns about children being exposed to posts about self-harm. Through legislation, she said, Facebook could be compelled to track and report how many children are seeing that content and how often. "Imagine a world where that number is reported. Would Facebook get better about self-harm content? Almost certainly. So we have to change that dynamic," she said.
After Haugen’s testimony last October, CEO Mark Zuckerberg said the allegations mischaracterized Facebook’s work and priorities. Later that month, Facebook Inc. rebranded to Meta, with Zuckerberg laying out a vision of a digital world where people can use avatars to play games together or attend virtual concerts. To Haugen, this pivot to “video games and the metaverse” shows Zuckerberg is still primarily interested in growing the company — something he has been richly rewarded for over the years — rather than addressing its problems.
What is whistle-blowing? Under what circumstances can whistle-blowing be an act of disloyalty and betrayal?
Haugen took pictures of internal documents before she left, which became the basis of a series of Wall Street Journal exposés. Do you agree with her action? Why or why not?
What three questions would you ask Haugen in order to understand the case better?
If you were Mark Zuckerberg, how would you address Haugen’s allegations (assuming that Haugen’s allegations are legitimate)?
After Haugen went public, social media experts told CBC News that social platforms were unlikely to fix the problems on their own and needed government involvement. What are the disadvantages of government intervention, compared with relying on the platforms to act ethically and bear the social responsibility themselves?