Ex-Facebook employee and whistleblower Frances Haugen implored lawmakers Wednesday to avert the usual congressional stalemates as they weigh proposals to curb abuses on social media platforms by limiting the companies’ free-speech protections against legal liability.
Still, Haugen urged caution in making changes to the 1996 law that provides legal protection both for content the platforms carry and for companies removing postings they deem offensive. She cited unintended consequences from a previous congressional revision.
“I encourage you to move forward with eyes open to the consequences of reform,” Haugen testified at a hearing by a House Energy and Commerce subcommittee. “I encourage you to talk to human rights advocates who can help provide context on how the last reform … had dramatic impacts on the safety of some of the most vulnerable people in our society.”
Lawmakers brought forward proposals after Haugen presented a case in October that Facebook’s systems amplify online hate and extremism and fail to protect young users from harmful content.
Her previous disclosures have energized legislative and regulatory efforts around the world aimed at cracking down on Big Tech, and she made a series of appearances recently before European lawmakers and officials who are drawing up rules for social media companies.
Haugen, a data scientist who worked as a product manager in Facebook’s civic integrity unit, buttressed her assertions with a massive trove of internal company documents she secretly copied and provided to federal securities regulators and Congress.
“Facebook wants you to get caught up in a long, drawn out debate over the minutiae of different legislative approaches. Please don’t fall into that trap,” she told the lawmakers Wednesday. “Time is of the essence. There is a lot at stake here. You have a once-in-a-generation opportunity to create new rules for our online world.”
When she made her first public appearance this fall, laying out a far-reaching condemnation of the social network giant before a Senate Commerce subcommittee, Haugen offered prescriptions for action by Congress. She rejected the idea of breaking up the tech giant as many lawmakers are calling for, favoring instead targeted legislative remedies.
Most notably, they include new curbs on the long-standing legal protections for speech posted on social media platforms. Both Republican and Democratic lawmakers have called for stripping away some of the protections granted by a provision in a 25-year-old law — generally known as Section 230 — that shields internet companies from liability for what users post. Enacted when many of the most powerful social media companies didn’t even exist, the law allowed companies like Facebook, Twitter and Google to grow into the behemoths they are today.
“Let’s work together on bipartisan legislation because we can’t continue to wait,” said Rep. Mike Doyle, D-Pa., the chairman of the communications and technology subcommittee.
Facebook and other social media companies use computer algorithms to rank and recommend content, governing what shows up in users’ news feeds. Haugen’s proposal is to remove the legal protections in cases where algorithm-driven promotion of content favors massive user engagement over public safety.
That’s the thought behind the Justice Against Malicious Algorithms Act, which was introduced by senior House Democrats about a week after Haugen testified to the Senate panel in October. The bill would hold social media companies responsible by removing their protection under Section 230 for tailored recommendations to users that are deemed to cause harm. A platform would lose the immunity in cases where it “knowingly or recklessly” promoted harmful content.
Haugen, who has previously focused most of her public remarks on what she views as Facebook’s failings, leveled sharp criticism at Wednesday’s hearing against TikTok, the video platform that is wildly popular with teens and younger children.
“TikTok is designed to be censored,” she said, maintaining that its ownership by the Chinese company ByteDance makes it ripe for mandated content. Haugen asserted TikTok’s algorithms make it even more addictive to users than Facebook’s photo-sharing service Instagram — which she has criticized in detail, citing internal research showing apparent harm to some young users, especially girls. The research suggested that peer pressure generated by Instagram has led to mental health and body-image problems and, in some cases, eating disorders and suicidal thoughts.
Spokespeople for TikTok didn’t immediately respond to a message seeking comment.
Rep. Frank Pallone, D-N.J., who heads the full Energy and Commerce committee, said a proposal from its senior Republican, Rep. Cathy McMorris Rodgers of Washington, isn’t identical to the Democrats’ bill but represents a good start for potential compromise.
All of the legislative proposals face a heavy lift in Congress before final enactment.
Some experts who support stricter regulation of social media say the Democrats’ legislation as written could have unintended consequences. They suggest it doesn’t spell out clearly enough which algorithmic behaviors would cost a platform its liability protection, making it hard to predict how the law would work in practice and leaving wide disagreement over what it would actually do.
Meta Platforms, the new name of Facebook’s parent company, has declined to comment on specific legislative proposals. The company has advocated for updated regulations.
“We are a platform for free expression and every day have to make difficult decisions on the balance between giving people voice and limiting harmful content,” Meta said in a statement Wednesday. “It is no surprise Republicans and Democrats often disagree with our decisions – but they also disagree with each other. What we need is a set of updated rules for the internet set by Congress that companies should follow, which is why we’ve been asking for this for nearly three years.”
Meta CEO Mark Zuckerberg has suggested changes that would only give internet platforms legal protection if they can prove that their systems for identifying illegal content are up to snuff. That requirement, however, might be more difficult for smaller tech companies and startups to meet, leading critics to charge that it would ultimately favor Facebook.
Other social media companies have urged caution in any legislative changes to Section 230.