manoel horta ribeiro

What does research really tell us about deplatforming?

Merriam-Webster defines deplatforming as an “attempt to boycott a group or individual through removing the platforms (such as speaking venues or websites) used to share information or ideas” [Merr18]. Prominent cases of deplatforming on social media include the banning of: 1) toxic subreddits (standalone communities hosted within the Reddit social network) like r/FatPeopleHate, r/ChapoTrapHouse, and r/The_Donald; 2) accounts associated with far-right and conspiratorial communities like QAnon and the Alt-right; and 3) influencers espousing far-right or conspiratorial beliefs like Alex Jones, Laura Loomer, and Milo Yiannopoulos.

Previous research has tried to characterize the effects of deplatforming broadly. For instance, [CPSG17] studied how Reddit evolved circa 2015 after banning r/FatPeopleHate and r/CoonTown, two large toxic communities on the platform. They found that, within Reddit, a large percentage of users previously active in these communities stopped posting on the website, and that those who continued drastically reduced their hate speech usage (as measured through a dictionary). Also considering within-platform effects, [JBYB21] studied the banning of three high-profile influencers on Twitter. Among users who had previously engaged with these accounts, they again found a decrease in activity and toxicity (as measured through Google’s Perspective API).

These within-platform studies are limited in that they analyze only the specific social networks where deplatforming occurred. Yet deplatformed influencers and communities usually migrate to alternative social networks or standalone websites. For instance, banned influencers like Alex Jones continued to be active on Gab [Ally21] (an alt-tech social media platform similar to Twitter) and used Telegram to broadcast content [Roge20]; communities like Fat People Hate migrated to Voat [MePa22] (a now-defunct Reddit clone); and r/The_Donald even went as far as creating its own standalone website [HJZB21].

Other papers address this limitation by studying pairs of social media platforms. [UrKa22] found that far-right Telegram groups experienced explosive growth coinciding with the banning of far-right actors on mainstream social media platforms. [ASAB21] developed a method for identifying the same user across pairs of social networks: Twitter and Gab, and Reddit and Gab. They found that the activity and toxicity of these users grew on Gab after they were banned on Twitter or Reddit (although their reach declined). Subsequent work found largely the same effects [Mitt21]. [HJZB21] studied the banning of two prominent Reddit communities that went on to create standalone websites: r/The_Donald and r/Incels. They found that activity decreased for both communities following the platform migration, but that toxicity (as measured by the Perspective API) increased only for r/The_Donald (perhaps due to the cryptic language used in Incel communities). [RaKa21] studied the migration of deplatformed YouTube channels to BitChute, finding that they received drastically fewer views on the latter.

These across-platform studies cast doubt on the effectiveness of deplatforming. Overall activity and reach decrease both on the newer platforms [HJZB21] [ASAB21] and on the platforms that carried out the deplatforming, as pointed out by within-platform studies. However, there seems to be substantial migration to alternative platforms where hate speech runs rampant [FrMaKr20], and users who migrate become more toxic [HJZB21] [ASAB21] (since these analyses consider matched users, self-selection does not explain the effect). These results suggest a trade-off: deplatforming may lead to more toxic communities that are also smaller and have less reach.

Nonetheless, this work must also be taken with a grain of salt. First, most of these studies only consider active engagement traces: likes, comments, posts, etc. It may very well be that the number of lurkers (i.e., users who consume content but do not actively engage in the discussions) increases on fringe platforms, given that they often allow full access without registration. In that context, previous work may be underestimating the reach and (passive) user activity of alt-tech social media platforms. Further, these studies consider unilateral migratory movements: they track how users banned on a specific mainstream platform behave on another specific fringe platform. However, there is no good reason to believe that users would not spread the time they previously spent on Reddit or Twitter across a variety of other platforms, including less public-facing ones like Telegram or Discord [Roge20] [UrKa22] [CoScMa19].

Last, and more subtly, the studies discussed here are also limited in that they do not consider or model network effects. It is well known that the utility of a social media platform depends on its number of users [Luca15]; thus, it is naïve to conclude that the effects of banning controversial voices would remain constant. As Gab, Parler, and others receive more migrants, their utility will likely increase. It is therefore tricky to generalize from the effects of past bans on specific influencers and communities.


[Ally21] Allyn, Bobby: Social Media Site Gab Is Surging, Even As Critics Blame It For Capitol Violence. NPR.

[ASAB21] Ali, Shiza ; Saeed, Mohammad Hammas ; Aldreabi, Esraa ; Blackburn, Jeremy ; De Cristofaro, Emiliano ; Zannettou, Savvas ; Stringhini, Gianluca: Understanding the effect of deplatforming on social networks. In: Proceedings of the ACM Web Science Conference, 2021

[CoScMa19] Conway, Maura ; Scrivens, Ryan ; Macnair, Logan: Right-Wing Extremists’ Persistent Online Presence: History and Contemporary Trends. International Centre for Counter-Terrorism, 2019

[CPSG17] Chandrasekharan, Eshwar ; Pavalanathan, Umashanthi ; Srinivasan, Anirudh ; Glynn, Adam ; Eisenstein, Jacob ; Gilbert, Eric: You Can’t Stay Here: The Efficacy of Reddit’s 2015 Ban Examined Through Hate Speech. In: Proceedings of the ACM on Human-Computer Interaction (2017)

[FrMaKr20] Freelon, Deen ; Marwick, Alice ; Kreiss, Daniel: False equivalencies: Online activism from left to right. In: Science (2020), No. 6508, pp. 1197–1201

[HJZB21] Horta Ribeiro, Manoel ; Jhaver, Shagun ; Zannettou, Savvas ; Blackburn, Jeremy ; Stringhini, Gianluca ; De Cristofaro, Emiliano ; West, Robert: Do Platform Migrations Compromise Content Moderation? Evidence from r/The_Donald and r/Incels. In: Proceedings of the ACM on Human-Computer Interaction (2021)

[JBYB21] Jhaver, Shagun ; Boylston, Christian ; Yang, Diyi ; Bruckman, Amy: Evaluating the effectiveness of deplatforming as a moderation strategy on Twitter. In: Proceedings of the ACM on Human-Computer Interaction (2021)

[Luca15] Luca, Michael: User-generated content and social media. In: Handbook of Media Economics, Vol. 1, Elsevier, 2015, pp. 563–592

[MePa22] Mekacher, Amin ; Papasavva, Antonis: “I Can’t Keep It Up”: A Dataset from the Defunct Voat.co News Aggregator. In: Proceedings of the International AAAI Conference on Web and Social Media, 2022

[Merr18] Merriam-Webster: The Good, The Bad, & The Semantically Imprecise.

[Mitt21] Mitts, Tamar: Banned: How Deplatforming Extremists Mobilizes Hate in the Dark Corners of the Internet (2021)

[RaKa21] Rauchfleisch, Adrian ; Kaiser, Jonas: Deplatforming the far-right: An analysis of YouTube and BitChute. In: SSRN (2021)

[Roge20] Rogers, Richard: Deplatforming: Following extreme Internet celebrities to Telegram and alternative social media. In: European Journal of Communication, Vol. 35 (2020), No. 3, pp. 213–229

[UrKa22] Urman, Aleksandra ; Katz, Stefan: What they do in the shadows: examining the far-right networks on Telegram. In: Information, Communication & Society (2022), No. 7, pp. 904–923

Written on June 20, 2022