Frances Haugen, a former Facebook product manager, has spoken out about what she called “systemic” problems with the platform’s ranking algorithm that led to the amplification of “angry content” and divisiveness.
In an interview with CBS’s 60 Minutes correspondent Scott Pelley, Haugen said that some of Facebook’s own research found that “angry content” is more likely to receive engagement, something both content producers and political parties are aware of.
“One of the most shocking pieces of information that I brought out of Facebook that I think is essential to this disclosure is political parties have been quoted, in Facebook’s own research, saying, we know you changed how you pick out the content that goes in the home feed, and now if we don’t publish angry, hateful, polarizing, divisive content, crickets. We don’t get anything. And we don’t like this. We know our constituents don’t like this. But if we don’t do these stories, we don’t get distributed. And so it used to be that we did very little of it, and now we have to do a lot of it, because we have jobs to do. And if we don’t get traffic and engagement, we’ll lose our jobs.”
After quitting her job in May, Haugen leaked documents to The Wall Street Journal and to lawmakers. The documents amount to tens of thousands of pages of internal company research, which she says show that the company has been negligent in removing violence, misinformation, and other harmful content from its services, and that it has misled investors about these efforts.
In August, Facebook touted its efforts to curb COVID-19 misinformation and hate speech. In a public report, the company said it had removed 3,000 accounts, pages, and groups for violating its rules against spreading COVID-19 misinformation. Facebook also said it had removed 20 million pieces of COVID-19 misinformation from the platform, and that hate speech removals had increased 15-fold since the company began reporting them.
However, Haugen believes Facebook isn’t telling the full story in its transparency reports.
“We have no independent transparency mechanisms that allow us to see what Facebook is doing internally, and we have seen from things like the community enforcement report that when Facebook is allowed to create its own homework, it picks metrics that are in its own benefit. And the consequence is they can say we get 94% of hate speech and then their internal documents say we get 3% to 5% of hate speech. We can’t govern that.”
Before quitting Facebook, Haugen worked as a product manager in the platform’s Civic Integrity unit, combating election interference and misinformation. She said she was tasked with making sure the company was “a good force in society.”
She says she lost trust in Facebook’s commitment to protecting users when it disbanded the Civic Integrity team following the 2020 presidential election. Facebook said it distributed the work to other teams, but Haugen says the company stopped paying close attention, leading to the January 6 Capitol attack.
“The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook, and Facebook over and over again chose to optimize for its own interests, like making more money. Facebook has realized that if they change the algorithm to be safer, people will spend less time on the site, they’ll click on less ads, they’ll make less money.”
Haugen believes the social media giant should declare “moral bankruptcy” and level with the public on its past failures.
“The reason I came forward is Facebook has been struggling,” Frances Haugen told 60 Minutes. “They’ve been hiding information…And we need to not solve problems alone, we need to solve them together. And that’s why I came forward.”
In response, Facebook’s director of policy communications, Lena Pietsch, released a statement.
“Our teams have to balance protecting the ability of billions of people to express themselves openly with the need to keep our platform a safe and positive place,” said Pietsch. “We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true.”