Technology and social media have been at a crossroads over the past two decades. Is Facebook a mirror? An amplifier? An inducer?
“Facebook is not a neutral platform,” whistleblower Frances Haugen said.
Haugen joined Common Sense Media founder Jim Steyer for a conversation in the Annenberg Forum at Wallis Annenberg Hall on Tuesday.
As a data scientist and former product manager on Facebook’s civic misinformation team, Haugen is known for releasing tens of thousands of Facebook’s internal documents, exposing how the company handles issues such as hate speech and misinformation about climate change and vaccines. The documents also revealed the negative impact Facebook and Instagram have on the mental health of children, teenagers and young adults, reported in a series of articles published by the Wall Street Journal in September 2021 under the title “The Facebook Files.”
Haugen said that prior to 2018, the company focused primarily on maximizing the amount of time users spent on the platform. When content production declined in 2018, Facebook shifted the focus of its algorithm to prioritize content that elicited the most interactions from users, such as likes, comments and shares.
The way to gain the most interactions quickly became apparent to Facebook. “The fastest path to a click is anger,” Haugen said.
According to Haugen, under the new algorithm implemented in 2018, extreme and divisive content drew the most interaction on Facebook and was therefore prioritized in users’ news feeds. Content that circulated widely after the change included hate speech, conspiracy theories about election fraud and, later, COVID-19 misinformation.
Haugen noted that the new algorithm also amplified a “comparative culture on social media,” which Facebook’s own research on body image showed affected the social and emotional development of younger users. The algorithms aim solely to draw users in, without malicious intent, she said, but their unintended effects on body image contributed to increases in eating disorders, self-harm and suicide.
In an interview with Annenberg Media, Haugen said she felt the need to have this conversation about the negative effects of Facebook and other social media platforms with college students due to what she called an “uneven distribution of consequences.”
“We see some concentration of harm around things like misinformation in older adults…but when it comes to harms, like eating disorders, body image, self-harm, those harms are overwhelmingly concentrated in young adults,” Haugen said.
She added that Facebook has shifted the conversation about this harmful content so that it is treated as an issue of censorship rather than one of public health and safety. She said the artificial intelligence algorithm used by Facebook cannot reliably detect misinformation, hate speech and other objectionable content in languages other than English, particularly in linguistically diverse regions such as Southeast Asia and Africa.
According to Haugen, 87% of Facebook’s operational budget for tackling misinformation, which includes measures like third-party fact-checking, goes to content in English, even though English-speaking users make up only 9% of the platform.
One consequence of Facebook’s AI failing to properly monitor content in other languages came when the platform flagged and removed pro-Palestine posts while taking no action against posts supporting Israel. Many users concluded that Facebook was taking a pro-Israel stance, when in reality its AI was not developed enough in Arabic, causing many Palestinian posts to be wrongfully censored and flagged for containing “hate speech or symbols.”
Similarly, military officials in Myanmar used the platform to promote genocide against Rohingya Muslims. The company’s AI did not flag this content, and Facebook did little to oversee what was posted in Myanmar and other developing countries where it was beginning to expand.
Haugen said that, despite this, Facebook employees are not inherently malicious. The company, however, promotes an ideology that it is responsible only for connecting users, which discourages it from considering the consequences of content shared on the platform. That, along with the protection of Section 230 of the Communications Decency Act, gives Facebook little incentive to police content. This mentality, combined with a culture that prioritizes profits over public safety, was a deciding factor in Haugen’s decision to leave the company in May 2021.
“The moment where I knew that a line had been crossed and there was no way the system could self-heal was almost immediately after the 2020 United States election, when Facebook disbanded the civic integrity team,” Haugen said. “As soon as the 2020 election passed, they basically were like, ‘Oh, good. There wasn’t blood on the streets. Mission accomplished. We can move on now.’”
Haugen added that the disbandment of the civic integrity team, which combatted misinformation such as election interference and fraud, played a role in the Capitol insurrection in January 2021.
However, Haugen noted that Facebook can make a positive contribution to society if the right changes and reforms are made.
“It’s this extraordinary system of platforms, but it obviously needs a tremendous amount of work and you all have the power to change it,” Steyer added.
When asked what she would say to Mark Zuckerberg if given the opportunity, Haugen said, “Can you imagine how hard it is to be Mark Zuckerberg? Every single day that you have to sit there and rationalize that you are a misunderstood hero is a day that you have to expend a lot of psychological energy to not see things. And the thing that I just really want to extend that olive branch to him and [tell him] that every single day he has a chance to have a different life.”
Additional reporting by Rory Burke.