A bipartisan group of House lawmakers introduced legislation on Nov. 9 that would require Big Tech providers such as Facebook and Google to let users opt out of content selected by algorithms, giving users more transparency into how content is chosen.
The measure, dubbed the Filter Bubble Transparency Act in the House (pdf), would require platforms with more than 1 million users and $50 million in annual revenue to notify users that algorithms are being used and allow users to adjust their settings.
Reps. David Cicilline (D-R.I.) and Ken Buck (R-Colo.) introduced the legislation in the House. Sens. John Thune (R-S.D.), Richard Blumenthal (D-Conn.), Jerry Moran (R-Kan.), Marsha Blackburn (R-Tenn.), and Mark Warner (D-Va.) introduced the bill in the Senate.
“The Filter Bubble Act will bring more transparency and accountability, while giving consumers more control of their online experience on Big Tech platforms,” Buck said in a statement, according to The Washington Examiner.
“When individuals log onto a website, they are not expecting the platform to have chosen for them what information is most important,” Blackburn said in a statement.
“Algorithms directly influence what content users see first, in turn shaping their worldview. This legislation would give consumers the choice to decide whether they want to use the algorithm or view content in the order it was posted.”
Frances Haugen, a former product manager on Facebook’s civic misinformation team, revealed on Oct. 3 that she was the individual who provided the internal documents for a Sept. 14 exposé by The Wall Street Journal claiming that Instagram has a “toxic” impact on the self-esteem of young girls. She has accused Facebook, which owns the platform, of repeatedly putting profit before doing “what was good for the public,” including clamping down on hate speech.
In a Senate subcommittee hearing on Oct. 5, she raised a number of concerns, including charging that the social media platform has had a “destructive impact” on society. She cited ethnic violence in Myanmar and Ethiopia, suggesting that there’s a link between Facebook activity and violence in the regions. The social media platform’s algorithms facilitate hate, Haugen said, and therefore put profit before user safety.
Haugen also alleged that Facebook is aware it presents harmful, eating disorder-related content to its young users.
“In light of the serious claims made about Facebook by Ms. Haugen, we have extended an invitation for her to speak to the Board over the coming weeks, which she has accepted,” Facebook’s Oversight Board said in a statement.
“Board members appreciate the chance to discuss Ms. Haugen’s experiences and gather information that can help push for greater transparency and accountability from Facebook through our case decisions and recommendations.”
Isabel van Brugen contributed to this report.