3 Highlights From Senate Hearing On Social Media Child Safety

(January 31, 2024, 10:34 PM EST) -- Halfway into a contentious U.S. Senate hearing on Wednesday, Meta CEO Mark Zuckerberg turned to face scores of attending families whose children had been gravely harmed by social media — some to the point of suicide — to apologize for their suffering.


Meta CEO Mark Zuckerberg turns to address the audience during a Senate Judiciary Committee hearing on Capitol Hill in Washington, Wednesday, Jan. 31, 2024, to discuss child safety. X CEO Linda Yaccarino watches at left. (AP Photo/Jose Luis Magana)

The moment came during an emotional, four-hour hearing of the full Senate Judiciary Committee, during which the heads of Meta; X, formerly known as Twitter; TikTok; Snap; and Discord — some appearing under subpoena — were grilled by lawmakers from both parties about what their companies are doing to prevent minors from being exploited on their platforms.

The hearing began with a series of victim impact statements: "I was sexually exploited on Facebook." "I was sexually exploited on Instagram." "My son Riley died of suicide after being sexually exploited on Facebook."

For years, social media companies, like other online platforms, have successfully invoked Section 230 of the Communications Decency Act, which broadly shields online platforms from liability for harm caused by third-party content, to fend off suits brought by injured users and their families. But there is now a growing bipartisan appetite to take on the 1996 law, once seen as crucial to protecting the internet's growth. So far, the Judiciary Committee has sent five bills aimed at protecting kids on social media to the Senate floor.

Along with the introduction of bills like the EARN IT Act — which would make platforms criminally and civilly liable for child sexual abuse material posted on their services — the companies are facing pressure on another legal front: Consolidated lawsuits in California state and federal courts alleging that Google, Meta, TikTok and Snap Inc. failed to warn users about the known risks of social media addiction have survived initial motions to dismiss and are now advancing. Meanwhile, 42 state attorneys general have filed suit challenging the networks' practices.

Here, Law360 recaps three highlights from the lawmakers' grilling of the heads of Meta, TikTok, X, Snap and Discord — the latter three of whom were subpoenaed.

Zuckerberg's Apology

Sen. Josh Hawley, R-Mo., started off by attacking Zuckerberg's opening assertion that "there's little link between mental health and social media use" by pointing to the company's own internal study that found that using Instagram increased anxiety and depression for one in three teenage girls.

While Zuckerberg denied that was true, Hawley pressed on.

"There's families of victims here today. Have you apologized to the victims? Would you like to do so now?" Hawley asked. "Well, they're here, you're on national television."

Zuckerberg rose and turned from the lawmakers to face the families sitting in the back, many holding pictures of their children aloft.

"I'm sorry for everything you have all been through," Zuckerberg said. "No one should go through the things that your families have suffered, and this is why we invest so much and we are going to continue doing industrywide efforts to make sure no one has to go through the things your families have had to suffer."

Later in the hearing, under pressure from Sen. Laphonza Butler, D-Calif., who cited instances of children gaining access to dangerous drugs on Snapchat and then overdosing, Snap CEO Evan Spiegel also apologized.

"I'm so sorry that we have not been able to prevent these tragedies," Spiegel said. "We work very hard to block all search terms related to drugs from our platform."

Social Media as a "Dangerous Product"

During his opening statement, Sen. Lindsey Graham, R-S.C., hit on the issue at the heart of the wave of suits facing social media companies: whether their algorithms are products for which they can be held liable.

"What do you do with dangerous products? Either allow lawsuits, have statutory protections to protect consumers, or you have a commission to regulate the industry in question to take your license away," Graham said. "None of that exists here."

The judges handling the lawsuits in California state and federal courts issued their initial rulings on that question this past fall, allowing the suits to proceed but reaching different conclusions.

Los Angeles Superior Court Judge Carolyn B. Kuhl said in October that the social media platforms are not tangible products and thus the product liability framework isn't applicable to claims that content driven by the platforms' algorithms causes mental health harm in teenagers. However, she found that the First Amendment and Section 230 don't bar the parents' negligence claims, since the parents are seeking to hold the platforms liable for manipulating how their children engage with them.

U.S. District Judge Yvonne Gonzalez Rogers took a different approach in her ruling a few weeks later. Where Judge Kuhl ruled broadly that the platforms weren't products, Judge Gonzalez Rogers examined the specific platform functions targeted by the suits and found that some of the alleged defects were comparable to defects in "tangible personal property" and were not shielded by Section 230.

YouTube, which is owned by Google, is named in those suits, but was not present at Wednesday's hearing. Discord is not named in the suits.

Later in the hearing, an upset Sen. Amy Klobuchar, D-Minn., raised the same issue, recounting incidents of teenagers who died after taking fentanyl they found on social media and comparing regulators' inaction to the swift response to the recent midflight blowout of a door plug on a Boeing 737 Max 9.

"Nobody questioned the decision to ground a fleet of over 700 planes," Klobuchar said. "So why aren't we taking the same type of decisive action on the danger of these platforms when we know these kids are dying?"

Some Openness to Increased Oversight

The social media executives signaled to varying degrees that they would be amenable to more regulation for their no-longer-so-young industry.

Linda Yaccarino, the new CEO of X, was the only executive to fully endorse the STOP CSAM Act, which would strengthen platforms' accountability for hosting child sexual abuse material. The other executives said they would be open to supporting that bill and similar legislation, but stopped short of specific commitments.

For example, Shou Zi Chew, the Singaporean CEO of TikTok, said the ByteDance-owned company would support "national privacy legislation" in the U.S.

"That sounds like a good idea," Chew said. "We just have to know what it means."

Discord CEO Jason Citron also said his platform would be open to conversations with lawmakers and would welcome legislation and regulation.

However, "there's been so much talk at these hearings and popcorn-throwing and the like, and I just want to get this stuff done. I'm so tired of this," a frustrated Klobuchar said.

"It's been 28 years since the internet, we haven't passed any of these bills ... it's time to actually pass them, and the reason they haven't passed is because of the power of your companies," she added.

--Editing by Alanna Weissman and Emily Kokoll.

For a reprint of this article, please contact reprints@law360.com.
