
The Hidden Network That Powered Cancel Culture

This article is part of Upstream, The Daily Wire’s new home for culture and lifestyle. Real human insight and human stories — from our featured writers to you.

***

Earlier this month, Bluesky CEO Jay Graber announced she was stepping down to make way for a “seasoned operator focused on scaling and execution.” For years, parts of the press and the online intelligentsia treated Bluesky as the future of public discourse. Outside those circles, most people barely noticed it existed. That gap between seeming central and being socially marginal is the real legacy of Bluesky.

More than a story about one platform’s limits, it is a window into a larger dynamic of online life, where tightly connected minorities can make their conversations look like the voice of the public.

Bluesky has around 42 million total users, a fraction of the hundreds of millions claimed by X and Threads. But more importantly, as Jesse Singal pointed out to Blocked and Reported cohost Katie Herzog, “There’s a little more than a million people who like a post at least once a day.” Herzog responded, “That’s so small,” to which Singal added, “It punches so far above its weight because the people who are on there are disproportionately in media or academia, just people with strong sort of professional opinions.” Bluesky feels like a major public square to parts of the commentariat because the people most likely to shape media narratives are overrepresented among its active users.

That observation has a name. Academics call it the majority illusion: because highly connected users are overrepresented in everyone’s view of a network, a behavior confined to a small but densely linked group can look far more widespread than it really is. A controversy can look national even though it doesn’t extend beyond a narrow slice of highly online participants. A view can feel ubiquitous without being common. Understanding the majority illusion is key to making sense of Bluesky’s outsized reputation. It might also explain why the waves of cancellation have rolled back since Musk took over X.
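The mechanism is easiest to see in a toy network. The sketch below is a minimal, invented example, not a model of any real platform: three well-connected "hub" accounts hold a trait, and because every ordinary user follows all three hubs, a trait held by roughly 9% of the population looks like a majority from nearly every individual vantage point.

```python
# Toy illustration of the majority illusion: a trait held by a few
# highly connected "hub" nodes looks like a majority to most observers.
# The 33-user network below is invented for illustration.

hubs = {0, 1, 2}                        # the 3 users who hold the trait
n_users = 33
neighbors = {u: set() for u in range(n_users)}

for u in range(3, n_users):
    for h in hubs:                      # every ordinary user follows all 3 hubs
        neighbors[u].add(h)
        neighbors[h].add(u)
    nxt = (u - 3 + 1) % 30 + 3          # plus one peer link, forming a ring
    neighbors[u].add(nxt)
    neighbors[nxt].add(u)

# How many ordinary users see the trait in at least half their connections?
fooled = sum(
    1
    for u in range(3, n_users)
    if sum(v in hubs for v in neighbors[u]) / len(neighbors[u]) >= 0.5
)
print(f"{len(hubs)} of {n_users} users hold the trait ({len(hubs)/n_users:.0%}), "
      f"but {fooled} of 30 ordinary users see it in a majority of their network")
```

Each ordinary user here has five connections, three of which are hubs, so from every one of those thirty vantage points the trait appears to be a 60% majority even though it is held by fewer than one user in ten.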

There are two parts to the majority illusion: a production phenomenon and a perception bias.

The production side is straightforward. Online content is not created evenly. A very small number of highly active users produce a wildly disproportionate share of what everyone else sees. On Reddit, for example, the top 1% of users write 40% of the comments. Wikipedia, too, is largely the product of a small minority, as 77% of entries are written by 1% of users. Most people lurk, scroll, and occasionally react. A small minority posts constantly, argues constantly, and in doing so sets the apparent tone of the space.
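The arithmetic behind that concentration is simple. A minimal simulation, using invented posting rates (a hyperactive 1% posting fifty times a day, everyone else posting about once every other day), shows how a tiny minority ends up authoring roughly half of everything in the feed:

```python
# Sketch of content concentration: invented rates, for illustration only.
import random

random.seed(0)

N = 10_000
n_active = N // 100        # the hyperactive 1%
POSTS_PER_ACTIVE = 50      # assumed daily output of a hyperactive user
P_LURKER_POSTS = 0.5       # chance an ordinary user posts once that day

posts = []                 # author id of every post made today
for user in range(N):
    if user < n_active:
        posts.extend([user] * POSTS_PER_ACTIVE)
    elif random.random() < P_LURKER_POSTS:
        posts.append(user)

minority_share = sum(author < n_active for author in posts) / len(posts)
print(f"1% of users wrote {minority_share:.0%} of today's posts")
```

Under these assumed rates the hyperactive hundred users produce about as many posts as the other 9,900 combined, so a reader sampling the feed at random sees the 1% behind roughly every other post.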

The perception side is where the distortion takes hold. Observers infer from this uneven stream of content that they are seeing society as it is. Naturally, they then overestimate the popularity, intensity, and even moral legitimacy of certain beliefs simply because they are expressed more often. Critically, however, visibility is not the same as representativeness.

A recent paper confirmed the scale of the distortion. The American public believes that “43% of all Reddit users have posted severely toxic comments and that 47% of all Facebook users have shared false news online.” Platform-level estimates put the true figures between 3% and 7%. Importantly, the authors found the majority illusion can contribute to a feeling of moral decline. If people repeatedly encounter extreme or antisocial behavior online, they can come to believe it is normal, widespread, and reflective of the culture at large, even when it is produced by a small and unrepresentative minority.

These distortions extend well beyond platform behavior. Americans routinely overestimate the size of minority groups, the prevalence of fringe beliefs, and the popularity of radical social attitudes. Online life trains people to mistake frequency of exposure for frequency in the population.

The majority illusion also helps explain recurring episodes of overstated virality. A few years back, several outlets reported that young TikTok users were sharing Osama bin Laden’s “Letter to America” as though it were sweeping the platform. But when John Herrman looked into the supposed trend, he found only “a few dozen results” tied to the manifesto. The same dynamic drove the NyQuil chicken panic the year before. In each case, a tiny pocket of attention was converted into a national morality tale.

I suspect this framework also helps explain why the cancellation machine has weakened since Elon Musk acquired Twitter. Cancellation depended on a particular network structure. Old Twitter contained a highly concentrated cluster of journalists, academics, activists, and professional scolds who were interconnected and visible to institutional decision-makers. Because these users often moved in lockstep, outrage could appear much larger and more representative than it really was. When they converged on a target, the resulting pile-on looked to outside observers like a broad moral consensus. Institutions, uncertain of their footing and often terrified of reputational risk, treated that signal as authoritative.

Musk’s takeover disrupted this dynamic. First, it changed the composition of the network itself. Right-leaning users, anti-cancellation voices, and previously suppressed participants became more visible. There was no longer a relatively uniform moral culture, and outrage could no longer pass as consensus.

Second, the migration of many activists, journalists, and academics to Bluesky effectively narrowed the reach of the cluster most committed to cancellation logic. On Bluesky, users still try to cancel people, and they are able to generate a powerful sense of unanimity among themselves. But the agreement is within a smaller, more socially insulated network that no longer includes employers, publishers, universities, nonprofits, and media organizations. By moving to Bluesky, these users lost the institutional pathways that once turned online outrage into real-world punishment.

The result is that the cancellation mechanism weakened at both ends: the outrage cluster became less dominant on X, and less connected to institutions on Bluesky. That does not mean cancellation disappeared, or that outrage no longer matters. Rather, it shows that the old machine relied on very specific conditions. It needed a platform where a relatively small but highly connected elite could agree with itself, amplify itself, and then be seen by institutions as speaking for everyone else.

Bluesky’s real lesson is not that one app failed to become the new town square. It is that the modern internet is remarkably good at manufacturing false majorities. A small, hyperactive, tightly networked class can make its preferences feel universal, its outrage feel democratic, and its obsessions feel unavoidable. But volume is not consensus, visibility is not legitimacy, and a feed is not the public. The sooner we all embrace that distinction, the less power we will give to niche online worlds to define social reality for everyone else.

***

Will Rinehart is a senior fellow at the American Enterprise Institute, where he focuses on the political economy of technology and innovation.
