The name of the Facebook whistleblower who disclosed tens of thousands of pages of internal research and data, causing a firestorm for the social media firm in recent weeks, was identified as Frances Haugen on “60 Minutes” Sunday night.
The documents, according to the 37-year-old former Facebook product manager, who worked on civic integrity issues at the company, show that Facebook knows its platforms are being used to spread hate, violence, and misinformation, and that the company has tried to conceal that evidence.
“The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook, and Facebook over and over again chose to optimize for its own interests, like making more money,” Haugen told “60 Minutes.”
“60 Minutes” correspondent Scott Pelley read from one internal Facebook (FB) document: “We have evidence from a variety of sources that hate speech, divisive political speech and misinformation on Facebook and the family of apps are affecting societies around the world.”
Haugen filed at least eight complaints with the Securities and Exchange Commission approximately a month ago, alleging that the company is hiding information about its shortcomings from investors and the public. She also shared the documents with the Wall Street Journal, which ran a multi-part investigation revealing that Facebook was aware of problems with its apps, including the harmful effects of misinformation and the harm caused by Instagram, particularly to young girls.
Haugen, who joined Facebook in 2019 after previously working for Google (GOOGL) and Pinterest (PINS), is scheduled to speak before the Senate Subcommittee on Consumer Protection, Product Safety, and Data Security on Tuesday.
“I’ve seen a bunch of social networks, and it was substantially worse at Facebook than anything I’ve seen before,” Haugen said. “At some point in 2021, I realized I’m going to have to do this in a systemic way, that I’m going to have to get out enough [documents] that no one can question that this is real.”
Facebook has pushed back strongly on the reporting, labeling many of the claims “misleading” and arguing that its apps do more good than harm.
“Every day our teams have to balance protecting the ability of billions of people to express themselves openly with the need to keep our platform a safe and positive place,” Facebook spokesperson Lena Pietsch said immediately following the “60 Minutes” interview. “We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true.”
Pietsch issued a response of more than 700 words a few hours after the interview aired, laying out what it called “missing facts” from the segment and claiming that the interview “used select company materials to tell a misleading story about the research we do to improve our products.”
Facebook Vice President of Global Affairs Nick Clegg told CNN’s Brian Stelter on Sunday morning, prior to the “60 Minutes” interview, that “there is no perfection on social media as much as in any other walk of life.”
“We do a huge amount of research, we share it with external researchers as much as we can, but do remember there is … a world of difference between doing a peer-reviewed exercise in cooperation with other academics and preparing papers internally to provoke and inform internal discussion,” Clegg said.
Haugen said she knows that Facebook Founder and CEO Mark Zuckerberg “never set out to make a hateful platform, but he has allowed choices to be made where the side effects of those choices are that hateful and polarizing content gets more distribution and more reach.”
Haugen said she was recruited by Facebook in 2019 and accepted the position to work on disinformation. Her view of the company began to shift after it chose to disband its civic integrity team shortly after the 2020 presidential election.
She speculated that this decision, along with the company’s choice to disable other election safety measures such as misinformation detection tools, allowed the platform to be used to help organize the January 6 riot at the US Capitol.
“They basically said, ‘Oh good, we made it through the election, there weren’t riots, we can get rid of civic integrity now,'” she said. “Fast forward a couple of months, and we had the Insurrection. When they got rid of civic integrity, it was the moment where I was like, ‘I don’t trust that they’re willing to actually invest what needs to be invested to keep Facebook from being dangerous.'”
When the civic integrity team was disbanded, its work was assigned to other units, according to Facebook. On Twitter late Sunday night, Facebook Vice President of Integrity Guy Rosen said the group had been merged into other teams so that the “work pioneered for elections could be applied even further.”
According to Haugen, the social media company’s algorithm, which is meant to offer users information that they are most likely to engage with, is to blame for many of its difficulties.
We did not disband Civic Integrity. We integrated it into a larger Central Integrity team so that the incredible work pioneered for elections could be applied even further, for example across health related issues. Their work continues to this day.
— Guy Rosen (@guyro) October 4, 2021
“One of the consequences of how Facebook is picking out that content today is that it is optimizing for content that gets engagement, a reaction, but its own research is showing that content that is hateful, that is divisive, that is polarizing, it’s easier to inspire people to anger than it is to other emotions,” she said. She added that the company recognizes that “if they change the algorithm to be safer, people will spend less time on the site, they’ll click on less ads, they’ll make less money.”
In a statement sent late Sunday night, Facebook’s Pietsch stated that the site relies on “being used in ways that bring people closer together” to attract advertisers, adding that “protecting our community is more important than maximizing our profits.”
In an internal memo published by the New York Times earlier Sunday, Clegg denied accusations that Facebook bore responsibility for the January 6 unrest.
“Social media has had a big impact on society in recent years, and Facebook is often a place where much of this debate plays out,” Clegg said in the memo. “So it’s natural for people to ask whether it is part of the problem. But the idea that Facebook is the chief cause of polarization isn’t supported by the facts.”
Haugen stated that while “no one at Facebook is malevolent … the incentives are misaligned.”
“Facebook makes more money when you consume more content. People enjoy engaging with things that elicit an emotional reaction,” she said. “And the more anger that they get exposed to, the more they interact and the more they consume.”