Former Facebook employee Frances Haugen listens to opening statements during a Senate hearing entitled ‘Protecting Kids Online: Testimony from a Facebook Whistleblower’ on Capitol Hill October 5, 2021 in Washington, DC. Photo: Drew Angerer (Getty Images)

Facebook whistleblower Frances Haugen stepped out of the shadows Sunday after months of working secretly with lawyers, journalists, and lawmakers to build a case against the company she’d once hoped to change from within—but now views as fundamentally threatening to the whole of humanity. Haugen was once again thrust into the spotlight Tuesday as she appeared before a Senate subcommittee to testify about Facebook policies that placed profits before the mental wellbeing of children on its platforms.

Echoing her interview Sunday on 60 Minutes, Haugen said she joined Facebook in 2019 after someone close to her was “radicalized” online. She pursued a job at the company, she said, in an effort to improve internal policies long criticized for amplifying the most politically divisive content in order to generate engagement among its users. Facebook’s acute fixation on driving engagement—which translates into ad dollars, the company’s singular source of income—resulted in a system that only serves to amplify “division, extremism, and polarization,” she said, “undermining societies around the world.”

“This is not simply a matter of some social media users being angry or unstable,” said Haugen. “Facebook became a $1 trillion company by paying for its profits with our safety, including the safety of our children. And that is unacceptable.”

Haugen, who holds an MBA from Harvard and previously worked on algorithms at Google, Pinterest, and Yelp, was recruited by Facebook in 2019 as a lead product manager for “civic misinformation,” later working on “counter-espionage” as a member of Facebook’s threat intelligence team. At Facebook, she witnessed the company consistently placing profits above all else—decisions which generated “self-harm” and “self-hate,” she said, among the platform’s youngest users.

At the top of the hearing, Sen. Richard Blumenthal, chairman of the Senate’s subcommittee on consumer protection, raised the question of whether Facebook has known all along that children were becoming addicted to Instagram, the photo-sharing platform Facebook purchased in 2012. “Many of Facebook’s internal research reports indicate that Facebook has a serious negative harm on a significant portion of teenagers and younger children,” he said.

“Facebook knows that its amplification algorithms, things like engagement-based rankings on Instagram, can lead children from very innocuous topics... all the way from something innocent like healthy recipes to anorexia-promoting content, over a very short period of time,” Haugen said, adding that Facebook’s internal definition of “addiction” requires that users self-identify as having a problem.

“In the end,” she said, CEO Mark Zuckerberg bears the ultimate responsibility. “There’s no one currently holding Mark accountable.”

Blumenthal said last week that his office had written to Zuckerberg in August, asking whether Facebook had ever heard of its platforms having negative effects, such as suicidal thoughts, on children’s and teens’ mental health. The company effectively ducked the question, saying only that it knew of no consensus among experts as to how much “screen time” was unhealthy for kids.

Internal documents amassed by Haugen before departing Facebook in May laid bare the effects of Instagram’s engagement algorithms on teens—young girls, in particular. Leaked to the Wall Street Journal, the documents noted Instagram was responsible for worsening anxiety, depression, and even suicidal thoughts linked to body-image issues among young girls.

Separate materials shared with the Journal revealed that Facebook views children 10 years old and younger as a “valuable” and “untapped” resource crucial to the company’s “growth.”

As of yet, Facebook has not indicated whether it plans to take legal action against Haugen for leaking company documents to the press, but has said it won’t pursue her for sharing with Senate lawmakers, whom she initially approached this summer.

Facebook, in response, attacked its own research, calling it “exploratory,” and saying its researchers did not rely on any “clinical criterion.” The company, meanwhile, has refused to release the raw data underlying its findings, preferring to annotate documents referenced in the press in an effort to downplay their significance.

“I came forward because I recognized a frightening truth: almost no one outside of Facebook knows what happens inside Facebook,” Haugen said in opening remarks. “The company’s leadership keeps vital information from the public, the U.S. government, its shareholders, and governments around the world. The documents I have provided prove that Facebook has repeatedly misled us about what its own research reveals about the safety of children, its role in spreading hateful and polarizing messages, and so much more.”

Haugen went on to say it was typical at Facebook for teams tackling problems to be understaffed. The threat intelligence team, for example, “could only handle a third of the cases—that we knew about.” The lack of adequate staffing disincentivized the team from improving the systems designed to detect issues, since better detection would only create more work the team was not equipped to handle.

This is a developing story and will be updated.
