With new dynamics emerging between Silicon Valley and Washington, some parents and advocates for stronger social media regulations are concerned their cause will be left in the dust despite past momentum.
Friday marked one year since the leaders of five major social media companies — Meta, TikTok, Snap, Discord, and X — were grilled by the Senate Judiciary Committee over their platforms’ impact on young users.
The contentious hearing threw the concerns over social media harms into the spotlight and amped up pressure on businesses and policymakers to do more to protect children and teens.
At one point, two of the leaders — Meta CEO Mark Zuckerberg and Snapchat CEO Evan Spiegel — apologized to the families of children who died or were seriously harmed because of social media.
Some advocates and families appreciated the move, hoping changes might unfold in the coming months. But by the end of last year, much of the legislation on kids’ online safety failed to cross the finish line amid pushback, largely from the House.
Meanwhile, tech executives now seem closer than ever to the White House orbit, stoking advocates’ concerns that they could have President Trump’s ear regarding their business interests.
“You have Mark Zuckerberg going from apologizing under oath to the families who have lost children because of his products, to reversing that policy in defiance, with the cynical goal, I believe, of buying his way out of the ongoing lawsuits,” Tech Oversight Project Executive Director Sacha Haworth said.
Zuckerberg faced the brunt of criticism from senators on both sides of the aisle over how the company’s platforms, Facebook and Instagram, pose risks to children online.
Amid pressure from Sen. Josh Hawley (R-Mo.), the Meta CEO faced the hearing audience and apologized to the families, including Deb Schmill, the mother of Becca, who died at 18 after taking fentanyl-poisoned drugs allegedly purchased via social media.
Schmill told The Hill she “wanted to believe him,” but knew the apology “obviously doesn’t equate to, ‘I’m willing to sacrifice something in my life, money to make sure that doesn’t happen to anyone else.’”
While Zuckerberg pledged to continue what he called an “industry-wide effort” to prevent future harm, his subsequent decisions at Meta contradicted what parents had hoped for.
“A year ago, he stood in front of us and he made an apology, said he was sorry for what happened to our children,” Schmill said. “And then immediately after that, went back to what he always does, choosing profits over the safety of our children.”
Earlier this month, Zuckerberg announced Meta would eliminate its fact-checking program and replace it with a community notes system.
Some Democrats and tech safety groups slammed the move, arguing it was a capitulation to Trump, who has long fought against what he believes is censorship of conservative ideas.
“This retreat by Meta, which emulates the chaotic descent experienced on X, portends a much less safe and civil online experience, especially for kids,” Stephen Balkam, the founder and CEO of the Family Online Safety Institute (FOSI), wrote.
While Zuckerberg appears to be embracing a new stance on content moderation, Hawley suggested Friday that the fight against Big Tech is not over.
“Big Tech giants like Meta’s Mark Zuckerberg are now distancing themselves from the Democrat censorship cartel because they can read the results of an election,” Hawley told The Hill. “This doesn’t mean Congress should forget the conservative speech their social media companies silenced or the users they exploited. We should break up their monopolies and give power to everyday Americans.”
Concerns over censorship were also what drove House Republican leadership to oppose the Kids Online Safety Act (KOSA), a polarizing bipartisan bill intended to create more protection for minors online.
The bill, introduced by Sens. Marsha Blackburn (R-Tenn.) and Richard Blumenthal (D-Conn.), overwhelmingly passed the Senate in a 91-3 vote months after the hearing but stalled in the House, where Speaker Mike Johnson (R-La.) declined to bring it to the floor due to free speech concerns.
Johnson has said he plans to work on the principles behind KOSA this year, and spokespersons for Blackburn and Blumenthal confirmed they are working to reintroduce the bill.
Despite the unsuccessful push last session, advocates are heading into the 119th Congress with mixed hopes that KOSA or other online safety bills will see the light of day.
“My hope is that kids’ online safety continues to be that bipartisan push. My hope is that this administration looks at it as an issue that they can win on to protect young people online,” Zaman Qureshi, a campaigner for advocacy group Accountable Tech, told The Hill.
Blumenthal appeared optimistic about KOSA’s future this week, pointing to the endorsement from X owner and Trump ally Elon Musk, who backed the bill last month after negotiating changes to address free speech concerns.
Snapchat, which was also represented at the hearing, was notably an early supporter of KOSA.
Still, the changing relationship between leading tech companies and the president could present a new obstacle if Trump sides with their lobbying efforts.
“I’m concerned that five or six big tech CEOs sat beside him [Trump] at the inauguration and that is the kind of litmus test for the kind of regulation that we’re going to see or lack thereof in this administration,” Qureshi said.
Qureshi was likely referring to Zuckerberg, along with other tech titans like Musk, Google CEO Sundar Pichai and Amazon founder Jeff Bezos, who attended Trump’s inauguration last week despite past tensions with the president.
The event showcased a months-long push by much of the industry to reconcile with the incoming president, and followed inaugural donations and trips to Trump’s Mar-a-Lago resort in Palm Beach.
Among those to make the trip was TikTok CEO Shou Zi Chew, as his video-sharing platform faced a looming government ban in the United States.
Chew took heat from lawmakers at last year’s hearing over the company’s ties to China; he maintained that TikTok has never shared any user data with the Chinese government.
Months later, a divest-or-ban law for TikTok passed Congress with widespread bipartisan support and was signed by former President Biden. It gave the app’s Chinese-based parent company ByteDance until Jan. 19 to either divest the app or face a ban in the U.S.
While Chew faced scrutiny from Congress a year ago, Trump, who once supported a ban, is giving him a much warmer reception as he works with the company to hammer out a deal.
The app went dark for nearly 12 hours earlier this month but was brought back after Trump announced his plans to delay the ban.
The move has put Trump at odds with some China hawks in Congress, like Sen. Tom Cotton (R-Ark.), who said this month that ByteDance must agree to a divestiture before TikTok can return to the U.S.
Cotton made headlines during the hearing last year when he repeatedly pressed Chew about his potential ties to the Chinese Communist Party. Chew pushed back, repeatedly stating he is Singaporean and has not been associated with the CCP at any point.
It remains to be seen if senators will be able to push forward with their scrutiny of Big Tech with Trump back in office.
Earlier this week, Punchbowl News reported the White House is looking to stop the Senate Commerce Committee from adopting a rules package to give Chair Ted Cruz (R-Texas) unilateral subpoena authority to investigate Big Tech companies.
Cruz later told Punchbowl, “Big Tech censorship poses the single greatest threat to free speech in this country.”
Tech advocates also have their eye on other legislation in the new year, including Cruz’s TAKE IT DOWN Act, which would criminalize the publication of nonconsensual intimate imagery, including content made with artificial intelligence. Snapchat supported the bill last session.
Andrew Zack, FOSI’s policy manager, suggested legislation might need to be more narrowly tailored to pass Congress, calling the TAKE IT DOWN Act a “good bill.”
“I am not that hopeful that something like a big, sweeping policy like KOSA or age-appropriate design code will pass and then stand up to legal challenge,” he said, adding, “So the next step to me would be these narrower, targeted, one issue at a time bills instead of a big, sweeping one.”
The Hill reached out to the companies involved with last year’s hearing and the White House for comment.
Meta pointed to various actions it has taken for kids’ safety, including the launch of Instagram “teen accounts” and the introduction of new safety features to prevent sextortion scams.
While Discord was present at the hearing, the platform is not necessarily considered “Big Tech” and thus was not a major focus of senators’ questioning.
“Offering a platform that is safe and that fosters meaningful connection, especially for young people, is at the center of everything we do,” Kate Sheerin, Discord’s head of U.S. public policy, told The Hill. “Discord will continue working with industry, parents, our partners in law enforcement, safety experts, non-profits, and with policymakers around the world on this shared priority.”