It’s widely known that social media can quickly turn into a toxic cesspool of hate speech and ragebait, particularly during times of political turmoil.
Across social media platforms, amplified by the algorithm, hate often breeds hate. But what exactly makes toxicity so contagious? It turns out, the problem may be coming from within.
A study published this month in the Journal of Computer-Mediated Communication, co-authored by Alon Zoizner and Avraham Levy, looked at how social media users react when they’re exposed to toxic posts from people on their own political side, defined as the “ingroup,” compared with those from the opposing side, the “outgroup.”
Highlighting the motivations behind toxic posts, the researchers suggested that exposure to toxicity from your own side tends to encourage similar behavior, as a way to show loyalty and signal belonging. On the other hand, seeing toxic posts from the opposing side can trigger defensive reactions, prompting users to hit back.
Analyzing over 7 million tweets from 700,000 X accounts in Israel during 2023, a period of intense political division and conflict, the researchers found that toxicity mainly spreads online through ingroups.
In other words, the more people see toxic behavior from those on their own side, the more likely they are to mirror it. Reactions to toxicity from the opposing side were far less common. People are motivated less by outrage at the “other side,” and more by a desire to fit in with their own.
The design of social media makes this even worse. By highlighting political identities, it encourages social media users to see themselves as representatives of their group rather than as individuals—and to act accordingly.
While much has been said about the polarizing effect of social media echo chambers, the researchers found that, contrary to their expectations, users in homogeneous networks are less affected by both ingroup and outgroup toxicity than those in more heterogeneous networks.
They suggest two possible reasons for this. People in more mixed social networks may see a wider range of opinions, which can spark more conflict and toxic exchanges. Those in very like-minded networks, however, might already hold strong views, making them less influenced by others and less likely to feel the need to prove their loyalty through toxic behavior.
Either way, toxicity is an unwelcome fact of the internet. Social media companies have engineered our feeds to keep us hooked on outrage and keep us online. Likes and shares then reinforce this cycle, rewarding those who perform their group identity most loudly, even when that means being the most toxic.