Ex-Facebook employee says products hurt kids, fuel division in Congress testimony

Former Facebook employee and whistleblower Frances Haugen arrives to testify during a Senate Committee on Commerce, Science, and Transportation hearing on Capitol Hill on Tuesday, Oct. 5, 2021, in Washington. (Jabin Botsford/The Washington Post via AP, Pool)


A former Facebook data scientist told Congress on Tuesday that the social network giant’s products harm children and fuel polarization in the U.S. while its executives refuse to make changes because they elevate profits over safety.

Frances Haugen testified before the Senate Commerce Subcommittee on Consumer Protection, accusing the company of knowing about apparent harm to some teens from Instagram and of being dishonest in its public fight against hate and misinformation.

Haugen has come forward with a wide-ranging condemnation of Facebook, buttressed with tens of thousands of pages of internal research documents she secretly copied before leaving her job in the company’s civic integrity unit. She also has filed complaints with federal authorities alleging that Facebook’s own research shows that it amplifies hate, misinformation and political unrest, but the company hides what it knows.

Haugen says she is speaking out because of her belief that “Facebook’s products harm children, stoke division and weaken our democracy.”


“The company’s leadership knows how to make Facebook and Instagram safer but won’t make the necessary changes because they have put their astronomical profits before people,” she says in her written testimony prepared for the hearing. “Congressional action is needed. They won’t solve this crisis without your help.”

After recent reports in The Wall Street Journal based on documents she leaked to the newspaper raised a public outcry, Haugen revealed her identity in a CBS “60 Minutes” interview aired Sunday night. She insisted that “Facebook, over and over again, has shown it chooses profit over safety.”

The ex-employee challenging the social network giant with 2.8 billion users worldwide and nearly $1 trillion in market value is a 37-year-old data expert from Iowa with a degree in computer engineering and a master’s degree in business from Harvard. Prior to being recruited by Facebook in 2019, she worked for 15 years at tech companies including Google, Pinterest and Yelp.

The panel is examining Facebook’s use of information from its own researchers on Instagram that could indicate potential harm for some of its young users, especially girls, while it publicly downplayed the negative impacts. For some of the teens devoted to Facebook’s popular photo-sharing platform, the peer pressure generated by the visually focused Instagram led to mental health and body-image problems, and in some cases, eating disorders and suicidal thoughts, the research leaked by Haugen showed.

One internal study cited 13.5% of teen girls saying Instagram makes thoughts of suicide worse and 17% of teen girls saying it makes eating disorders worse.

“The company intentionally hides vital information from the public, from the U.S. government and from governments around the world,” Haugen says in her written testimony. “The documents I have provided to Congress prove that Facebook has repeatedly misled the public about what its own research reveals about the safety of children, the efficacy of its artificial intelligence systems and its role in spreading divisive and extreme messages.”

Former Facebook employee and whistleblower Frances Haugen listens to opening statements during a Senate Committee on Commerce, Science, and Transportation hearing on Capitol Hill on Tuesday, Oct. 5, 2021, in Washington. (Drew Angerer/Pool via AP)

As the public relations debacle over the Instagram research grew last week, Facebook put on hold its work on a kids’ version of Instagram, which the company says is meant mainly for tweens aged 10 to 12.

At issue are algorithms that govern what shows up on users’ news feeds, and how they favor hateful content. Haugen, who focused on algorithm products in her work at Facebook, said a 2018 change to the content flow contributed to more divisiveness and ill will in a network ostensibly created to bring people closer together. Despite the enmity that the new algorithms were feeding, Facebook found that they helped keep people coming back — a pattern that helped the social media giant sell more of the digital ads that generate most of its revenue.

Haugen’s criticisms range beyond the Instagram situation. She says Facebook prematurely turned off safeguards designed to thwart misinformation and incitement to violence after Joe Biden defeated Donald Trump last year, alleging that the move contributed to the deadly Jan. 6 assault on the U.S. Capitol.

After the November election, Facebook dissolved the civic integrity unit where Haugen had been working. That, she says, was the moment she realized “I don’t trust that they’re willing to actually invest what needs to be invested to keep Facebook from being dangerous.”

Haugen says she told Facebook executives when they recruited her that she wanted to work in an area of the company that fights misinformation, because she had lost a friend to online conspiracy theories.

Facebook maintains that Haugen’s allegations are misleading and insists there is no evidence to support the premise that it is the primary cause of social polarization.

Sen. Marsha Blackburn, R-Tenn., left, and Sen. Richard Blumenthal, D-Conn., right, speak to former Facebook data scientist Frances Haugen, center, during a hearing of the Senate Commerce, Science, and Transportation Subcommittee on Consumer Protection, Product Safety, and Data Security, on Capitol Hill, Tuesday, Oct. 5, 2021, in Washington. (AP Photo/Alex Brandon)

“Even with the most sophisticated technology, which I believe we deploy, even with the tens of thousands of people that we employ to try and maintain safety and integrity on our platform, we’re never going to be absolutely on top of this 100% of the time,” Nick Clegg, Facebook’s vice president of policy and public affairs, said Sunday on CNN’s “Reliable Sources.”

That’s because of the “instantaneous and spontaneous form of communication” on Facebook, Clegg said, adding, “I think we do more than any reasonable person can expect to.”

By coming forward, Haugen says, she hopes to spur the government to put regulations in place for Facebook’s activities. Like fellow tech giants Google, Amazon and Apple, Facebook has for years enjoyed minimal regulation in Washington.

Separately, a massive global outage on Monday plunged Facebook, Instagram and the company’s WhatsApp messaging platform into chaos, and service returned only gradually by late Monday Eastern time. For some users, WhatsApp worked for a time, then didn’t. For others, Instagram worked but Facebook didn’t, and so on.

Facebook didn’t say what might have caused the outage, which began around 11:40 a.m. EDT and was still not fixed more than six hours later.
