The Senate Subcommittee on Consumer Protection, Product Safety, and Data Security heard testimony Tuesday from Facebook whistleblower Frances Haugen, a former employee of the tech giant. In a rare show of bipartisanship, a bloc of Republicans and Democrats joined together to demand action against the company.
Haugen, who used to work as a lead product manager for Facebook’s civic misinformation team, came before the Senate to discuss the company’s internal practices, with special emphasis placed on the ways that these practices disproportionately affect children.
Haugen explained, “I’m here today because I believe Facebook’s products harm children, stoke division, and weaken our democracy.”
She added, “I believe in the potential of Facebook, we can have social media we enjoy, that connects us without tearing apart our democracy, putting our children in danger, and sowing ethnic violence around the world. We can do better.”
Sen. Richard Blumenthal (D-Conn.) observed during his opening remarks that predatory targeting and marketing practices by Facebook have “put profits ahead of people.”
Facebook Targets Children, Teens With Eating Disorders
A Wall Street Journal exposé, which found that Facebook knew about and hid information on the addictive nature of Facebook and its subsidiary social media platform Instagram, prompted the hearing. However, the conversation between Haugen and the subcommittee delved deeper into the company’s practices.
The whistleblower said that during her time at the company, she often saw the company faced with a choice between “its own profits and our safety.” When these conflicts arose, Haugen said, the company “consistently resolve[d] these conflicts in favor of its own profits.”
One such conflict, pointed out by Blumenthal, was the company’s intentional targeting of children.
Blumenthal said that as an experiment, he and his team created an Instagram account posing as a teenage girl with an eating disorder. He said that Instagram’s algorithm quickly picked up on this and began showing the account content glorifying and encouraging eating disorders like anorexia and bulimia.
Haugen expanded on the point, saying that despite its claims not to target children, Facebook sees children as a great marketing opportunity. Younger teens and even preteens, Haugen said, have not yet formed ingrained habits and can be molded by advertisers into developing new ones.
For older teens, this targeting can take the more malicious form of advertisements for vaping products, which have surged in popularity even among teenagers under 18.
Haugen also revealed that the algorithm maximized interaction by showing users content likely to elicit an emotional response, leading them to comment, like, or share the post. The purpose of this, she said, was not only to keep users engaged and staying on the platform for longer, but also to encourage posters to post more content. Both outcomes, she emphasized, increased the company’s ad revenue.
‘This is Facebook’s Big Tobacco Moment’: Blumenthal
Both Republicans and Democrats at the hearing voiced support for abandoning or significantly reforming Section 230 protections for the social media platform. Under Section 230 of the Communications Decency Act of 1996, tech companies are not responsible for the content shared by users on their platforms.
By contrast, traditional and online media companies do not enjoy the same protections and can be sued for the content they publish.
The notion of abandoning the protections was put forward by President Donald Trump in 2020 but was not followed by legislative action. With the information provided by Haugen, which Blumenthal called a “bombshell,” senators on both sides of the political aisle seem to be strongly considering moving ahead with stripping these protections from Facebook and Instagram.
Haugen encouraged lawmakers to move ahead with the action but emphasized that the platforms’ problems could not be solved by Section 230 reform alone.
“A company with such frightening influence over so many people needs real oversight,” she said.
Haugen concluded, “Congress can change the rules that Facebook plays by and stop the many harms it is now causing. We now know the truth about Facebook’s destructive impact … we must act now. I’m asking you, our elected representatives, to act.”
Blumenthal indicated that such action may be on the horizon.
“This is Facebook’s Big Tobacco moment,” said Blumenthal, comparing Facebook’s track record of hiding information on safety to the same practices by tobacco producers when it became clear that cigarettes and other tobacco products could cause cancer.
The subcommittee went on recess but will return to hear further testimony from Haugen.