Law360 (June 24, 2020, 9:46 PM EDT) -- House lawmakers probed the role Section 230 immunity plays in fostering civic discourse on web platforms during a Wednesday hearing, as the law's original authors separately warned against using their legislation to undercut free speech online.
The House Energy and Commerce subcommittee hearing, titled "How Disinformation Online Is Dividing the Nation," explored the role of social media in disseminating often-incorrect information amid a global pandemic and intense calls for social change in the wake of the death of George Floyd, an unarmed Black man suffocated by Minneapolis police.
Spencer Overton, a law professor at George Washington University, testified that Section 230 of the Communications Decency Act clearly gives online platforms like Twitter the authority to take down misinformation, even when its source is the president of the United States. The law gives online platforms immunity from lawsuits over user-posted content as long as the platforms behave as moderators, not content creators.
The question, according to Overton, is whether platforms should be compelled to more closely and consistently monitor their sites, especially in instances when disinformation will disproportionately affect minority communities.
"There are serious questions about whether social media platforms should be required to engage in reasonable content moderation to prevent disinformation that results in online civil rights and other legal violations," he said in written remarks.
The hearing came on the heels of President Donald Trump's executive order suggesting that the federal government reinterpret Section 230 as a way to crack down on social media platforms that inconsistently moderate their sites. The move appeared to be fueled by Twitter's decision to label a pair of Trump's tweets about mail-in ballots as inaccurate.
Shortly thereafter, the U.S. Department of Justice and Republican senators unveiled two separate legislative plans to hold web platforms more accountable for their content moderation practices, signaling broader backing for the White House's efforts.
However, this activity could send web platforms the message that moderation is risky, and could deepen societal divisions driven by unchecked misinformation, according to Overton.
"Retaliatory threats to discourage moderation by social media platforms only make the problem worse by effectively promoting disinformation, polarization and voter suppression," he told the committee.
To mitigate the tension between content moderation and free speech, Neil Fried, an advocate with Digital Frontiers, suggested that Congress could condition Section 230 protection on websites demonstrating that they take "reasonable, good-faith steps to curb illicit activity."
A Senate bill introduced in March, the EARN IT Act, would revoke tech companies' long-standing liability shield unless the companies comply with a new commission's "best practices" for combating the spread of online sexual exploitation, or show that they've taken other "reasonable" steps to address the issue.
But during a Wednesday afternoon webinar hosted by tech trade group NetChoice, Section 230's original authors, Sen. Ron Wyden, D-Ore., and former Rep. Christopher Cox, warned that the structure of the EARN IT Act's advisory commission would thwart the law's original intent.
The 19-member commission tasked with creating the best practices would be led by U.S. Attorney General William Barr, a consistent critic of encryption who has pushed companies like Facebook and Apple to scale back the technology to help law enforcement.
The commission would not be subject to normal transparency requirements and is "designed to freeze out the public," Cox said.
If lawmakers are serious about rooting out online evils like child exploitation while preserving free speech through Section 230, the most effective solution would be to bulk up child welfare and prosecution task forces, Wyden said.
--Additional reporting by Ben Kochman. Editing by Steven Edelstone.