This week, the Senate is again summoning the CEOs of the biggest social media companies to Congress — some by force of subpoena. Once again, they’ll be asked to explain why their platforms are so unsafe. Why they allow our kids to be exploited, harassed and even bullied to the point of suicide. Why their algorithms are designed to reward divisive, false and defamatory content. And once again, they’ll give us the same non-answer: We take user safety seriously… but we’re not responsible for the content. We’re just neutral “platforms.”
Wrong. They’re user-content-oriented global media companies that have sucked the blood out of traditional media companies and left democracy reeling and our kids depressed. Facebook’s motto of “move fast and break things” has become the ethos of many in Silicon Valley. Now it’s time to slow down and fix things.
But we know from the last 15 years of social media dodginess that they won’t do it voluntarily. Just like other industries, they’ll have to be forced by laws and regulators to be more responsible. Fortunately, there’s a ballooning bipartisan appetite on Capitol Hill to do just that.
Traditional media companies constantly worry about libel law. Social media companies rarely do. Because of Section 230 of the Communications Decency Act of 1996, they can’t be held liable for content posted by their “users.” That’s why Fox reached a $787.5 million defamation settlement with Dominion Voting Systems while X, where much of the defamation about Dominion germinated and spread, hides behind the veil of Section 230 innocence.
As a result, some of the standards I’ve been held to across my career as a journalist, editor, and founder and publisher of media sites simply don’t cross the minds of the editors (um, “curators”) at Facebook or X.
That’s not to say they have no boundaries. Right now, social media companies can be sued for certain intellectual-property violations, for some of the content they themselves produce and for failing to remove content they’ve agreed to take down. That’s a good start.
But we shouldn’t stop there. What if, for instance, they were held accountable for content created by those they’ve engaged in profit-sharing deals? What if algorithmically boosting a post beyond its organic reach were treated as an act of publishing for which a social media company could be held liable? Perhaps numerical thresholds could open the door to libel suits: any post receiving more than, say, 100,000 views would exceed the circulation of most daily newspapers in America, all of which must verify the “user generated” letters to the editor and op-eds they publish.
What if we took a cue from copyright law? Those who believe their copyright has been infringed on a platform can lean on the Digital Millennium Copyright Act’s notice-and-takedown process to force platforms to remove the infringing content. A similar process could let people flag defamatory content, forcing platforms to either take it down or eventually be held liable for it.
Beyond libel law, what if social media companies were fined for every unverified user on their sites? We know that bots and anonymous accounts are among the biggest purveyors of hate, death threats and false information, much of it generated by foreign adversaries who want to destabilize our democracy. Instead of tolerating the atmosphere of exploitation and cyberbullying that has helped drive teen depression and suicide to historic highs, what if social media companies were forced to install new safeguards and parental controls to protect our kids? And what if there were a digital regulatory agency, like the one envisioned by Sens. Elizabeth Warren (D-Mass.) and Lindsey Graham (R-S.C.), to help enforce these new laws?
The 20th century is often called the “American Century” in part because it witnessed a historic expansion of democracy, fueled by an information environment grounded in journalistic standards, the rule of law and a full embrace of the First Amendment.
So far, the 21st century has been one of democratic retreat, largely due to the social-media-driven information environment. If we want to improve the mental health of our children and the social health of our democracy, we have to change that environment. There is no silver bullet. It’ll require dozens of tweaks.
Of course, the CEOs testifying this week would say that treating them more like traditional publishers is impossible. After all, they have millions of users. But they also have vast resources: Meta alone made roughly $100 billion in profit in its last operating year. If it spent just one-tenth of that on content moderation, it could hire 100,000 moderators at $100,000 a year apiece.
Smaller platforms will claim they don’t have the resources. But that’s true in any industry: bigger, more established players are always better equipped to operate within the regulatory environment. America’s newspapers didn’t complain that they couldn’t grow because they’d have to hire more fact-checkers and editors. They assumed that editorial integrity was one of the many things they would have to expand as they grew.
Treating social media companies more like media companies won’t rid us of polarization, disinformation and violence. And sadly, it won’t protect every child. But forcing them to put more skin in the game could quickly lead to dramatic improvements in online safety and a reduction in hateful, divisive and dubious content. So let’s start treating the “platforms” like the grown-up media companies they’ve become, with the rights and responsibilities that entails.
Nick Penniman is CEO of Issue One.