Lawmakers on Wednesday strongly criticized the CEOs of Meta, TikTok, X, Snap, and Discord, accusing them of exacerbating “a crisis in America” by negligently ignoring harmful content targeting children on their platforms, amidst growing concerns about technology’s impact on youth.
In a heated 3.5-hour hearing, members of the influential Senate Judiciary Committee raised their voices and repeatedly rebuked the five tech leaders. They criticized the prioritization of profits over the well-being of young people, with some suggesting that the companies bore responsibility for harmful outcomes. At one juncture, lawmakers drew parallels between the tech firms and the tobacco industry.
“Every parent in America is deeply concerned about the inappropriate content our children are exposed to,” remarked Sen. Ted Cruz, R-Texas.
The tech executives, some of whom were compelled to attend via subpoena, asserted that they had allocated substantial resources to enhance safety measures on their platforms. Some expressed backing for legislation aimed at bolstering privacy and parental controls for children, while others highlighted the shortcomings of their competitors. Notably, all the executives emphasized their roles as parents.
In a particularly intense exchange with Sen. Josh Hawley, R-Mo., Mark Zuckerberg, CEO of Meta, rose from his seat to directly address a group of parents whose children had fallen victim to online sexual exploitation.
Zuckerberg apologized for the suffering the families had experienced, saying no one should have to go through such pain. He stopped short, however, of saying whether Meta’s platforms had contributed to that suffering, noting instead that the company is investing in measures to prevent such harms.
The bipartisan hearing highlighted the growing concern surrounding the impact of technology on the mental health of children and teenagers. Last year, U.S. Surgeon General Dr. Vivek Murthy identified social media as a significant factor in a crisis affecting youth mental health. Alarmingly, in 2023, over 105 million instances of online images, videos, and materials related to child sexual abuse were reported to the National Center for Missing and Exploited Children, underscoring the severity of the issue. Parents have attributed cyberbullying and child suicides to the platforms, further intensifying the scrutiny on their role in exacerbating these problems.
The bipartisan concern over Silicon Valley’s treatment of its youngest users has spurred lawmakers into action, with calls for stricter regulations and accountability measures. In response to parental outrage, legislators are proposing bills aimed at curbing the spread of child sexual abuse material and ensuring platforms take responsibility for safeguarding young people.
Tech giants are under increasing scrutiny both domestically and globally for their impact on children. Some states have passed laws mandating social media platforms to verify user ages or implement other protective measures, although these laws have faced legal challenges. Similar online safety regulations have been enacted in the European Union and Britain.
The White House joined the discussion, with Press Secretary Karine Jean-Pierre stating, “There is now undeniable evidence” linking social media to the mental health crisis among youth.
Despite the intense questioning of tech leaders, the outcome may not lead to significant change, as evidenced by past hearings. Meta executives have testified 33 times since 2017 on issues like foreign election interference, antitrust concerns, and social media’s role in the January 6, 2021 Capitol riot. However, no federal legislation has been passed to hold tech companies accountable, with numerous bills failing due to partisan disputes and lobbying efforts by the tech industry.
David Vladeck, a professor at Georgetown University’s law school and former head of consumer protection at the Federal Trade Commission, drew parallels between congressional inaction on tech legislation and the recurring disappointment of the “Peanuts” cartoon, where Lucy always pulls the football away as Charlie Brown attempts to kick it.
The federal government has also failed to adequately enforce existing laws against online child abuse: although Congress has authorized increased funding, law enforcement budgets have not kept pace with the rising number of online abuse reports.
Mark Zuckerberg testified before Congress for the eighth time on Wednesday, joined by Shou Chew, TikTok’s CEO, who returned less than a year after his previous appearance. Additionally, Evan Spiegel of Snap, Linda Yaccarino of X, and Jason Citron of Discord testified for the first time following subpoenas from lawmakers.
Since 2021, lawmakers have been increasingly focused on addressing social media’s harmful impact on children, prompted by whistleblower Frances Haugen’s revelations about Meta’s knowledge of Instagram’s negative effects on teenage body image. The Senate Judiciary Committee has convened multiple hearings with tech executives and child exploitation experts to address the dangers children face online.
Prior to Wednesday’s hearing, lawmakers disclosed internal emails among Meta’s top executives, including Zuckerberg, revealing the company’s refusal to allocate additional resources to address child safety concerns.
Convened in the Dirksen Senate Office Building, the hearing opened with a video featuring victims of child sexual exploitation, who lamented the failures of tech companies to protect them. In an unusual display of bipartisanship, Republican and Democratic members of the Senate Judiciary Committee took turns accusing the tech leaders of knowing about the harm children face on their platforms.
Senator Dick Durbin, D-Illinois, who chairs the committee, criticized the companies for prioritizing engagement and profit over basic safety, stating that this approach puts children and grandchildren at risk.
At one point, Senator Hawley confronted Zuckerberg directly, stating, “Your product is causing harm to people.”
Zuckerberg and Chew drew the most criticism from lawmakers for their lack of support for child safety legislation. Spiegel, meanwhile, faced scrutiny over drug sales on Snapchat, and he apologized to parents whose children died of fentanyl overdoses after buying drugs through the platform.
Expressing regret, Spiegel acknowledged Snap’s shortcomings in preventing such tragedies and highlighted measures like blocking drug-related search terms and collaboration with law enforcement.
Lawmakers also deliberated proposals to hold platforms accountable by repealing Section 230 of the Communications Decency Act, a 1996 statute that shields internet companies from content liability.
Senator Amy Klobuchar, D-Minnesota, emphasized the necessity of legal recourse, asserting, “Without opening up the courtroom doors, meaningful change is unlikely. The influence of money often speaks louder than our discussions here.”
Lawmakers, at times, delved into topics beyond children’s safety, with Chew fielding inquiries about ByteDance, TikTok’s parent company based in Beijing, and its handling of U.S. user data. Additionally, concerns were raised regarding reports of TikTok allegedly discriminating against Israelis, leading to the resignation of a TikTok lobbyist in Israel.
Notably absent from the hearing was YouTube, the most popular app among teenagers: about seven in ten teens use it daily, according to the Pew Research Center, followed by TikTok at 58%, Snap at 51% and Instagram at 47%.
In 2022, YouTube reported more than 631,000 pieces of content to the National Center for Missing and Exploited Children, according to a report by Google.
Apple’s absence was also notable, particularly due to its backtrack on a 2021 pledge to scan iPhones for abusive material involving children, a move that drew criticism from child safety organizations.
Asked about the omission of YouTube and Apple, a Judiciary Committee spokesperson said the five executives who testified represented a diverse array of companies.
In the weeks leading up to Wednesday’s hearing, several tech companies announced adjustments to their services concerning children. Meta implemented stricter controls on direct messaging for teenagers and enhanced parental controls, while Snap expressed support for the Kids Online Safety Act, proposed legislation aimed at curbing data collection on children and strengthening parental controls on social media platforms.
Outside the Capitol building on Wednesday, a nonprofit organization critical of Big Tech showcased cardboard cutouts depicting Zuckerberg and Chew atop a pile of cash, raising champagne glasses. Meanwhile, inside the hearing room, parents held up images of victims of online child sexual exploitation.
Mary Rodee, a parent present at the hearing, shared the tragic story of losing her 15-year-old son, Riley, in 2021 due to sexual exploitation on Facebook Messenger. Since then, she has been advocating for legislation to safeguard children online.
“The companies are falling short,” she stated. “Enough with the talk.”
This article was originally published in The New York Times.