Deconstructing Online Hate

The internet has become a prime means of communication worldwide and this unprecedented global reach – combined with the difficulty in tracking communications – makes it an ideal tool for extremists to repackage old hatred, raise funds and recruit members. As the internet has grown and changed, hate groups and movements have adapted, creating websites, forums and social network profiles, becoming active in spaces such as online games and even creating parallel versions of services such as Twitter, YouTube and GoFundMe.[1]

Cultures of hatred

Youth don’t have to encounter overt hate speech to be exposed to hate online. Because of the networked nature of the internet, it’s possible for them to encounter it entirely by accident. Many online spaces contain “cultures of hatred,” communities in which racism, misogyny and other prejudices are normalized, while sexist, racist and homophobic slurs are common in online games.[2] MediaSmarts’ research has found that youth witness prejudiced speech on all of the platforms they use routinely.[3] Although young people may not participate in this prejudiced speech, they often cannot help but find their own views and opinions influenced by the values of the communities that they’re a part of. As well, because digital media are networked, these communities are linked to one another, which means that small, mostly closed communities (in which radicalization occurs more rapidly) can have a significant influence on larger, more open ones.[4]

Ideologies of hate

While there are many different hate movements active online, there are shared ideologies of hate that most have in common. Learning to recognize these is essential to knowing hate when we see it.

The most fundamental element is the perception that all members of the hated group have a negative, immutable “essence” that makes them a threat and justifies any action taken against them. This is communicated through six different ideas: othering and dehumanization, appeals to the in-group’s glorious past, claims to victimhood by the in-group, fear of secret enemies, the idea of inevitable conflict with an out-group and justifying the use of violence against them.

The term ideology, though, shouldn’t be taken to mean that these groups or movements have any kind of consistent philosophy or world view. In fact, researchers have identified a trend towards what’s called “salad bar” extremism, in which individuals pick and choose elements that appeal to them from different movements without worrying about consistency.[5]

Othering and dehumanization

The most fundamental ideology of hate is othering and dehumanization: defining a group as fundamentally alien, as “them” rather than “us.” The “Other” is not, however, the actual group as it exists in reality, but a fiction created to solidify the identity of the hate group and justify its existence and actions.[6] To achieve this, the Other must be portrayed as being both inferior, to establish the hate group’s superiority, and threatening, to establish the need to take action against them. When people transfer their fears and hatred to the Other, the targeted group becomes, in their eyes, less than human. Denying the humanity of victims makes it much easier to justify acts of violence and degradation: for instance, there is a widely held belief among White supremacists that Black people actually feel less pain than others.[7] Even more powerful, however, is metadehumanization – the sense that someone else sees your group as inhuman.[8]

Dehumanizing propaganda has been shown to have little direct effect on people’s own views (though that may be different for people who are repeatedly or heavily exposed), but it does change their perception of what other people think. Precisely because it is such an extreme and abhorrent view, convincing people that it is widely held can be a powerful tool for mainstreaming and radicalization.[9]

Victimhood

Despite portraying the Other as being inherently inferior, claiming to be victims themselves – and rejecting the idea of the Other as a victim – is central to ideologies of hate.[10] In the case of hate groups, these are typically false or disproportionate claims of victimhood – such as claims that men, White people or Christians are more discriminated against than historically marginalized groups. People who are used to being at the top are more likely to see other groups’ gains as coming at their expense.[11] People will identify more closely with a group if that group appears to be isolated or under external threat, so group members who are imprisoned, lose their jobs or are killed in conflict with police are treated as martyrs.

The glorious past

Another essential element of hate ideology is the notion that the group has fallen from its once-glorious past.[12] Generally, this fall is portrayed as being the fault of either the Other or of members of the group who were fooled or subverted by the Other. As a result, it is only by defeating and destroying the Other that this glorious past can be regained. Sometimes this and victimhood will be joined together to form a “chosen trauma” – a past tragedy, either real or imaginary, used to show the need for vigilance, explain an inter-group conflict or justify a dominant group’s claim to victimhood.[13]

Secret enemies

One of the clearest signs that an argument is based on hate is the portrayal of one or more groups as secretly plotting against the in-group. Sometimes the group being targeted is shown as being manipulated by another, even more evil group (to explain how it can be both dangerous and inferior). In some cases, the secret enemy may be a group that doesn’t even exist, such as a conspiracy, secret society or even aliens.[14]

Inevitable conflict

Another sign of an argument coming from hate is the idea that groups cannot co-exist: conflict is inevitable.[15] This may be the most harmful message because if you believe you’re in an existential struggle – that your existence is at stake – then any tactic, any action can be justified;[16] “when the other side isn’t fighting fair, and their victory entails not just a temporary setback but cataclysm, the ends can justify nearly any means.”[17] That threat doesn’t have to be physical, though. In fact, research has found that fear of symbolic threats, which endanger your identity, is a better predictor of hate than fear of physical ones.[18]

Justifying violence

The final ideology of hate, therefore, is the justification of violence.[19] Sometimes hate material will openly state that violence is needed and justified; more often, it will imply this through one of the other ideas, such as victimhood or inevitable conflict.[20]

Forms of online hate

The places where young people encounter hate content online can be divided into four types: mainstream platforms, fringe platforms, niche platforms and cloaked hate content.[21]

Just as teenagers often have a public Instagram account and a “finsta” that’s just for their friends, hate groups and movements often post different content in these different spaces. This allows them to target people at different levels of the radicalization pyramid.

Mainstream platforms are the ones that most of us know and use most often: Instagram, YouTube, Wikipedia and so on. There is little hate content on these platforms overall, and they are generally active in removing it. However, the huge amount of user content posted to them means that not all hate material can be removed quickly. As well, though hate makes up a tiny fraction of the content, it can still reach more people, since even the smallest mainstream platform is many times bigger than any fringe platform.

Most of the hate content found on mainstream platforms isn’t explicitly hateful, both to avoid violating the terms of service and because its aim is to broaden the base of sympathizers and begin radicalizing sympathizers into members. This also allows hate groups to “whitewash” their content: to the casual viewer, a link to a hate site shared on Facebook or Twitter looks the same as a link to a legitimate source of news or information. For the same reason, hate material aimed at mainstream audiences often adopts the aesthetic of popular platforms like Instagram[22] or TikTok.[23]

Fringe platforms are not just smaller than mainstream ones. Some were specifically created to welcome people or content that aren’t welcome on mainstream sites; others started out trying to compete with mainstream sites and found a market catering to extreme views and content. While even the largest of them are a fraction the size of mainstream sites, some have been significant successes: an analysis done for the Southern Poverty Law Center found that five of the 12 fringe platforms it monitors ranked among the top 10 percent of internet domains in the US, with one ranked in the top five percent.[24]

These sites generally have very few limits on what content is allowed. As a result, hate material makes up a much larger proportion of the total content than it does on mainstream platforms, but its potential audience is much smaller. Nearly everyone who comes to one of these platforms is already sympathetic to hate messages, so the content tends to be more extreme than what is found on mainstream sites, aimed either at turning sympathizers into members or members into activists.

Finally, niche platforms are those that were created specifically by hate groups as a means of sharing resources, making connections and planning operations. This is where the most extreme content appears: the mechanism of radicalization in like-minded groups means that the activists who frequent these platforms feel a constant need to out-do one another. They can also serve as an incubator for content that is then spread to fringe and mainstream sites.  

Some of these sites have evolved to mimic popular commercial websites in a variety of ways, with many offering audiovisual material and discussion forums, and some featuring professional-looking layouts, with graphics and writing that mimic popular sites such as Buzzfeed and reach audiences as large as four million page views per month.[25] Some niche sites, though, are deliberately lo-fi. While these have relatively small audiences, their lack of polish is a plus for those who already identify with hate movements. It is here that most of the hateful memes and conspiracy theories that spread across the internet originate, created and constantly remixed by users.[26]

Many hate groups also maintain disguised, or cloaked, hate sites.[27] Overt hate sites actively promote hatred towards other groups while showcasing racist propaganda or offering hate-based communities online. In contrast, cloaked hate websites intentionally perpetuate hatred through more implicit and deceptive messages, masquerading as legitimate sources of opinion or information with neutral-sounding names that often include words like “centre” or “institute.”[28] The purpose of these sites is to reach young people before they have the knowledge or critical awareness to recognize them for what they are: one study found that over 60 percent of middle and high school students have drawn on them as research sources for school assignments.[29]

Many attempt to conceal a racist agenda behind a more moderate message of “education” or “debate.” This is true not only of sites which masquerade as mainstream sources of information, but of more overt hate sites as well, which position themselves as providing visitors with “the real story.” Overt hate sites also attack the mainstream educational system and call on supporters to educate their friends, families and communities about the “real” truth. A related phenomenon is what has been called ‘reasonable racism.’[30] These are hate sites which present their content as political provocation or debate, relying on pseudo-science and twisted logic rather than outright expressions of hate.

The networked nature of digital technologies, however, means that all of these different platforms are not really separate: rather, they are all parts of what some researchers have described as a “hate multiverse,” with mainstream and niche sites separated by no more than a single link or click. As one analysis describes it, “this multiverse acts like a global funnel that can suck individuals from a mainstream cluster on a platform that invests significant resources in moderation, into less moderated platforms like 4Chan or Telegram, simply by following the links offered to them.”[31]  This also strengthens connections between different parts of the “multiverse,” providing multiple entry points that may seem less extreme but lead to a tight-knit cluster of extremist sources, and builds links between different forms of hate such as anti-2SLGBTQ+ hatred and antisemitism.[32]

Connections in the multiverse can also lead to reciprocal radicalization between different communities – including ones that weren’t necessarily originally hate groups. Over the last decade, for example, anti-vaccination beliefs have become much more prevalent in far-right and White supremacist groups,[33] while at the same time anti-vaccination communities – even those that don’t see themselves as political or ideological[34] – have adopted more and more far-right attitudes. Innocuous topics such as health or fitness may lead users to these communities, which, along with being dangerous in themselves, are now vectors for hate content.

Videos

As well as being the most popular online platform for young Canadians, YouTube is also where young people are most likely to turn when they want to learn something.[35] Perhaps not surprisingly, it’s also one of the sites where youth are likely to encounter hate content: in one study, one in six self-identified fascists said their radicalization began as a result of videos they’d seen on YouTube.[36]

As with cloaked hate sites, many of these videos take an educational approach, flattering viewers by promising to share facts that have been “suppressed” by the authorities. Unlike cloaked sites, however, which rely on being used as sources for school assignments, these videos are “quick to watch, easy to share and spread.”[37]

In addition to spreading their message, hate groups use YouTube to raise funds, using features such as “Super Chat” that encourage viewers to donate money to support their favourite online personalities.[38]

Social networks

Online social networks encourage group interaction and strengthen connections between group members. While this has made it possible for a wider range of voices to be heard, sites such as Facebook and Twitter are also used to recruit and organize members through fan pages, group feeds and hate-focused groups. As youth move to newer social networks, such as Instagram, Snapchat and TikTok, both prejudiced speech[39] and outright hate content[40] follow them there as well.

While youth most often report encountering prejudice online on large public social networks such as Facebook and Twitter,[41] niche sites such as 4chan play an important role in exposing youth to more severe hate content and have been credited by many hate group members as the site of their own radicalization.[42] As well, hate groups have responded to more aggressive efforts by mainstream social networks to remove them by establishing their own services. Though none of these has been especially successful, they provide a space where the most extreme phases of radicalization often take place.[43]

Social media’s ability to help youth interested in hate find friends and mentors is the key to developing the sense of group identity that’s so important in the radicalization process. For hate groups, the greatest advantage of social media isn’t that it lets them reach youth, but that it allows youth to disseminate hate material themselves by sharing and liking content, recommending videos to friends and expressing prejudiced views in their online communities.

Memes

Internet memes – a meme can be defined as “an activity, concept, catchphrase, or piece of media that spreads, often ... for humorous purposes, from person to person via the Internet”[44] – appear in nearly all online spaces and are particularly popular in social networks that are based on sharing images, such as Snapchat and Instagram.

Memes have been eagerly adopted by hate movements for a number of reasons. Gianluca Stringhini, a professor at Boston University who studies memes, describes them as “tools for information warfare”:[45] because they are a form of “inside joke,” they can be used to send different messages to viewers at different stages of the radicalization process.

As well as allowing hate groups to cloak their message in humour and irony, memes have been used to revive and spread stereotyped images that serve both to confirm viewers’ prejudices and to give them (and the person sharing the meme) the thrill of being “edgy.”[46]

Finally, memes by their nature spread rapidly. While most memes are fairly innocent, far-right groups have had considerable success in bringing hateful ones to mainstream platforms.[47] In fact, studies of memes have found that nearly all originate in two online forums dominated by hate content.[48]

Search and recommendation algorithms

Youth may also encounter hate material as a result of the search and recommendation algorithms that underpin most of the internet. Hate groups take advantage of “data voids,” situations where little or no content exists online about a particular search term. For example, when a major news event happens, they will rush out conspiracy videos about it before legitimate outlets can cover it, or coin terms and phrases that, when searched, lead only to their content.[49] This effect can be amplified when legitimate news organizations, covering hate incidents, publicize terms that will lead “any disillusioned young man [to] start googling his way down a rabbit hole just by using that search term.”[50]
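
To illustrate the mechanism, here is a minimal, hypothetical sketch of a data void: a toy keyword search over a tiny made-up index in which a coined phrase appears only in the pages its creators published. The document titles, text and the phrase “examplephrase” are all invented for illustration; real search engines are far more complex, but the underlying gap works the same way – if nothing else mentions a term, whoever coined it owns the results.

```python
# A toy keyword search over a tiny, made-up index. Everything here is
# hypothetical and for illustration only.

TOY_INDEX = [
    {"title": "City council budget report", "text": "local budget and services"},
    {"title": "Recipe blog post", "text": "easy weeknight dinner ideas"},
    # The only pages that mention the coined phrase are the ones its
    # creators published, so there is nothing else for a search to return.
    {"title": "Conspiracy video transcript", "text": "the truth about examplephrase"},
    {"title": "Extremist forum thread", "text": "discussing examplephrase revelations"},
]

def search(query: str) -> list[str]:
    """Return the titles of documents whose text contains the query term."""
    return [doc["title"] for doc in TOY_INDEX if query.lower() in doc["text"].lower()]

# A common term returns ordinary results...
print(search("budget"))         # ['City council budget report']
# ...but the coined phrase leads only to the group's own content.
print(search("examplephrase"))  # ['Conspiracy video transcript', 'Extremist forum thread']
```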

While hate groups work hard to manipulate these algorithms, there is also evidence that the algorithms themselves favour more and more extreme content “in a feedback loop that, link by link, guides new audiences to toxic ideas.”[51] Recommendation algorithms, such as those used by social networks and video sites to suggest (or in some cases decide) what content you should view, are designed to promote content that will keep users engaged; intentionally or unintentionally, these algorithms have often concluded that “outrage equals attention.”[52]
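
The following is a minimal sketch – not any platform’s actual system – of why ranking purely by past engagement can drift toward ever more provocative content. The item names, the “outrage” attribute and the assumption that more provocative items earn engagement more often are all invented for illustration.

```python
import random

# Hypothetical catalogue: each item starts with the same engagement score.
items = [
    {"name": "calm explainer", "outrage": 0.1, "engagement": 1.0},
    {"name": "heated rant", "outrage": 0.5, "engagement": 1.0},
    {"name": "extreme conspiracy clip", "outrage": 0.9, "engagement": 1.0},
]

def recommend(catalogue):
    """Show an item with probability proportional to its past engagement."""
    weights = [item["engagement"] for item in catalogue]
    return random.choices(catalogue, weights=weights, k=1)[0]

for _ in range(10_000):
    shown = recommend(items)
    # Assumption of the sketch: the more provocative an item, the more
    # likely a viewer is to click, share or argue about it.
    if random.random() < shown["outrage"]:
        shown["engagement"] += 1

for item in items:
    print(item["name"], round(item["engagement"]))
# The most outrage-inducing item accumulates the most engagement and is
# therefore recommended most often: "outrage equals attention."
```

Real recommendation systems use many more signals, but the feedback loop – more engagement leads to more exposure, which leads to more engagement – is the dynamic described above.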

These algorithms are hugely influential – according to YouTube, seventy percent of views come from its recommendations,[53] while nearly ninety-five percent of views on TikTok come from the algorithmically curated “For You” page[54] – and they can play an important role in exposing youth to progressively more radical content. One former extremist said that “I think YouTube certainly played a role in my shift to the right because through the recommendations I got…It led me to discover other content that was very much right of center, and this only got progressively worse over time, leading me to discover more sinister content.”[55]

Dogwhistling

Because of the networked nature of digital media, hate groups can also simultaneously put out extreme content that is designed to energize those who have already been radicalized and share content designed to seem reasonable to more general audiences. This is called dogwhistling because it’s like blowing a whistle that only dogs can hear. Someone who’s already higher up the pyramid feels like they have special knowledge others don’t, like they’re part of an “inside joke,” while those at lower levels aren’t turned off by overt hate messages.

Dogwhistles aren’t used just to conceal hate content, however. They are themselves radicalization tools, as searching for, consuming or engaging with content that uses them will – through the design of hate groups and through the algorithmic logic of the internet, which prioritizes content that grabs your attention and makes you feel strongly about it – draw you to higher levels of the pyramid.

For the same reason, arguments are often made in terms of pseudo-science, misinterpretations of fields such as neurology, genetics and psychology that can make prejudice seem reasonable. For example, one factor often mentioned by those who’ve moved into racist movements is so-called “race realism,” which draws on outdated and distorted scientific views to provide a rationale for racism,[56] while others describe being led from the online atheism community, which has a strong anti-feminist component, to radical misogynist groups.[57]

Ironically, hate groups appeal to the mainstream by casting their arguments as “skepticism,” with the suggestion that they are giving the audience the “real” truth that has been suppressed by the mainstream media.[58] Similarly, some groups present themselves as neutral proponents of “debate” on issues that are beyond debating, such as whether the Holocaust happened.

Irony

The roots of modern hate movements in “troll” culture are a big part of why they mask their message as humour or irony.[59] This doesn’t diminish the impact of the message – indeed, prejudiced jokes have been found to have a significant effect on what attitudes are considered to be acceptable in a community[60] – but it does make it easier to move into the mainstream.

As well, as Alice Marwick writes, “irony has a strategic function. It allows people to disclaim a real commitment to far-right ideas while still espousing them.”[61] The style guide for one niche hate site states this clearly: “The unindoctrinated should not be able to tell if we are joking or not.”[62]

Irony can also allow youth in the early stages of radicalization to deny to themselves their commitment to this community. Many of those radicalized on 4chan describe initially being drawn to its more extreme forums by the thrill of “ironically” consuming racist memes, especially once those began to proliferate after the 2012 re-election of Barack Obama.[63]

Finally, irony serves as one more way of matching the message to the target’s readiness to receive it: “Masking ideology in humor also allows plausible deniability for when the recruiter pushes the target too hard too fast, it is easy to lighten up and back off as ‘just a joke’ and then later once the target’s guard lowers to resume the indoctrination.”[64] Or, as one site’s style guide itself states, “The goal is to continually repeat the same points, over and over and over and over again. The reader is at first drawn in by curiosity or the naughty humor, and is slowly awakened to reality by repeatedly reading the same points.”[65]

Keegan Hankes, an intelligence analyst at the Southern Poverty Law Center, identifies irony and troll culture as a key factor in the success of more recent hate movements: “The thing that the alt-right does better than white nationalism has done, in the years that I’ve been tracking it, is it gets young people involved.”[66] As one user of a White supremacist forum put it, “If you can get them to laugh, you can get them on our side.”[67]

[1] Gais, H., & Squire, M. (2023) Alt-tech sites continue to provide safe haven for hard right extremists. Southern Poverty Law Center.

[2] Kuznekoff, J. H., & Rose, L. M. (2012). Communication in multiplayer gaming: Examining player responses to gender cues. New Media & Society, 15(4), 541-556. doi:10.1177/1461444812458271

[3] Brisson-Boivin, K (2019). “Pushing Back Against Hate Online.” MediaSmarts. Ottawa.

[4] Johnson, N. F., Leahy, R., Restrepo, N. J., Velasquez, N., Zheng, M., Manrique, P., ... & Wuchty, S. (2019). Hidden resilience and adaptive dynamics of the global online hate ecology. Nature, 573(7773), 261-265.

[5] St-Amant, M., Jones, D., King, M., & McCoy J. (2022) Hate, Extremism and Terrorism In Alberta, Canada, and Beyond: The Shift from 2019 to 2022. Organization for the Prevention of Violence

[6] McCauley, Clark and Sophia Moskalenko. “Mechanisms of Political Radicalization: Pathways Toward Terrorism”, Terrorism and Political Violence, 20:3 (2008), 415-433.

[7] Kteily, N. S., & Landry, A. P. (2022). Dehumanization: trends, insights, and challenges. Trends in cognitive sciences.

[8] Landry, A. P., Ihm, E., & Schooler, J. W. (2022). Hated but still human: Metadehumanization leads to greater hostility than metaprejudice. Group Processes & Intergroup Relations, 25(2), 315-334.

[9] Luft, A. (2019). Dehumanization and the normalization of violence: It’s not what you think. Social Science Research Council: Items, 21.

[10] Ma, C. (2021) YouTube Celebrities and the Rhetoric of White Supremacy. Items. https://items.ssrc.org/extremism-online/mainstreaming-resentment-youtube-celebrities-and-the-rhetoric-of-white-supremacy/

[11] Payne, K. (2019) The Truth About Anti-White Discrimination. Scientific American.

[12] Futrelle, D. (2019) Hate lies and video games: Six ways Gamergate poisoned gaming and ruined the online world for the rest of us. We Hunted the Mammoth. https://www.wehuntedthemammoth.com/2019/08/24/hate-lies-and-video-games-six-ways-gamergate-poisoned-gaming-and-ruined-the-online-world-for-the-rest-of-us/

[13] Volkan, V. D. (2008). Massive Traumas at the Hand of ‘Others’: Large Group Identity Issues, Transgenerational Transmissions, ‘Chosen Traumas’ and their Consequences. Retrieved May 5, 2013.

[14] Brack, C. E. (2015). The Conspiracist Style in American Extremism: An Education-Based Approach for Combating Conspiracism (Master’s thesis, National Defense University). https://doi.org/10.5281/zenodo.1211408

[15] Volkan, V. D. (2005). Large-group identity, large-group regression and massive violence. International Newsletter of the Group-Analytic Society, 30, 8-32.

[16] Rushkoff, D. (2021) Tolerating the intolerant. Medium. https://medium.com/team-human/tolerating-the-intolerant-1721f40b7dbc

[17] Jigsaw. (2022) The Conspiracy Theories Issue. The Current. https://jigsaw.google.com/the-current/conspiracy-theories/

[18] Martínez, C. A., van Prooijen, J. W., & Van Lange, P. A. (2022). A threat-based hate model: How symbolic and realistic threats underlie hate and aggression. Journal of Experimental Social Psychology, 103, 104393.

[19] Brack, C. E. (2015). The Conspiracist Style in American Extremism: An Education-Based Approach for Combating Conspiracism (Master’s thesis, National Defense University). https://doi.org/10.5281/zenodo.1211408

[20] Scrivens, R. (2022). Examining Online Indicators of Extremism among Violent and Non-Violent Right-Wing Extremists. Terrorism and Political Violence, 1-21.

[21] Williams, et al. (2021) The Online Extremist Ecosystem. Retrieved from https://apps.dtic.mil/sti/pdfs/AD1154178.pdf

[22] Tiffany, K. (2020) The Women Making Conspiracy Theories Beautiful. The Atlantic.

[23] Woodrow, H. (2022) “He’s Talking About The People In Power”: Ye West And Other Pop Culture Conspiracy Theories. Antihate.ca.

[24] Gais, H., & Squire, M. (2023) Alt-tech sites continue to provide safe haven for hard-right extremists. Southern Poverty Law Center.

[25] Beirich, H. (2019, February 20). The Year in Hate: Rage Against Change. Retrieved April 25, 2019, from https://www.splcenter.org/fighting-hate/intelligence-report/2019/year-hate-rage-against-change

McNamee, L., Pena, J., & Peterson, B. (2010). “A Call to Educate, Participate, Invoke and Indict: Understanding the Communication of Online Hate Groups.” Communication Monographs, 77(2), 257-280.

[26] Zannettou, S., Caulfield, T., Blackburn, J., De Cristofaro, E., Sirivianos, M., Stringhini, G., & Suarez-Tangil, G. (2018, October). On the origins of memes by means of fringe web communities. In Proceedings of the internet measurement conference 2018 (pp. 188-202).

[27] Daniels, J. (2008). Race, Civil Rights, and Hate Speech in the Digital Era. Learning Race and Ethnicity: Youth and Digital Media (pp. 129-154). Cambridge, MA: MIT Press; McNamee, Pena & Peterson (2010).

[28] The Groups. (2010, January 29). Retrieved from https://www.splcenter.org/fighting-hate/intelligence-report/2015/groups

[29] Rich, J. (2019, January 23). Schools must equip students to navigate alt-right websites that push fake news. Retrieved from https://theconversation.com/schools-must-equip-students-to-navigate-alt-right-websites-that-push-fake-news-97166

[30] Meddaugh, P. M. (2009). Hate Speech or “Reasonable Racism?” The Other in Stormfront. Journal of Mass Media Ethics, 24(4), 251-268.

[31] Velásquez, N., Leahy, R., Restrepo, N. J., Lupu, Y., Sear, R., Gabriel, N., ... & Johnson, N. F. (2020). Hate multiverse spreads malicious COVID-19 content online beyond individual platform control. arXiv preprint arXiv:2004.00673.

[32] Logan, N. (2023) How antisemitic tropes are being used to target the LGBTQ community. CBC News.

[33] Walter, D., Ophir, Y., Lokmanoglu, A. D., & Pruden, M. L. (2022). Vaccine discourse in white nationalist online communication: A mixed-methods computational approach. Social Science & Medicine, 298, 114859.

[34] Smith, R et al. (2020). Under the Surface: Covid-19 vaccine narratives, misinformation and data deficits on social media. First Draft.

[35] Beyond Millennials: The Next Generation of Learners (Rep.). (2018). Pearson.

[36] From Memes to Infowars: How 75 Fascist Activists Were "Red-Pilled". (2018, October 11). Retrieved April 25, 2019, from https://www.bellingcat.com/news/americas/2018/10/11/memes-infowars-75-fascist-activists-red-pilled/

[37] Deo. (2017, November 23). How White Nationalism Courts Internet Nerd Culture. Retrieved from https://medium.com/@DeoTasDevil/how-white-nationalism-courts-internet-nerd-culture-b4ebad07863d

[38] Warzel, C. (2019, March 19). We're Asking the Wrong Questions of YouTube and Facebook After New Zealand. The New York Times. Retrieved April 25, 2019, from https://www.nytimes.com/2019/03/19/opinion/facebook-youtube-new-zealand.html

[39] Brisson-Boivin, K (2019). “Pushing Back Against Hate Online.” MediaSmarts. Ottawa.

[40] Cox, J. (2018, December 18). TikTok Has a Nazi Problem. Retrieved April 25, 2019, from https://motherboard.vice.com/en_us/article/yw74gy/tiktok-neo-nazis-white-supremacy

[41] Brisson-Boivin, K (2019). “Pushing Back Against Hate Online.” MediaSmarts. Ottawa.

[42] From Memes to Infowars: How 75 Fascist Activists Were "Red-Pilled". (2018, October 11). Retrieved April 25, 2019, from https://www.bellingcat.com/news/americas/2018/10/11/memes-infowars-75-fascist-activists-red-pilled/

[43] Beirich, H. (2019, February 20). The Year in Hate: Rage Against Change. Retrieved April 25, 2019, from https://www.splcenter.org/fighting-hate/intelligence-report/2019/year-hate-rage-against-change

[44] Internet meme. (2019, May 13). Retrieved from https://en.wikipedia.org/wiki/Internet_meme

[45] Mencimer, S. (2019, April 2). “The Left Can’t Meme”: How Right-Wing Groups Are Training the Next Generation of Social Media Warriors. Mother Jones.

[46] Marsi, F. (2019, March 26). How the far right is weaponising irony to spread anti-Muslim hatred. Retrieved April 25, 2019, from https://www.thenational.ae/world/europe/how-the-far-right-is-weaponising-irony-to-spread-anti-muslim-hatred-1.841430

[47] Mencimer, S. (2019, April 2). “The Left Can’t Meme”: How Right-Wing Groups Are Training the Next Generation of Social Media Warriors. Mother Jones.

[48] Emerging Technology from the arXiv. (2018, June 11). This is where internet memes come from. MIT Technology Review. Retrieved from https://www.technologyreview.com/s/611332/this-is-where-internet-memes-come-from/

[49] Golebiewski, M., & Boyd, D. (2018). Data Voids: Where Missing Data Can Easily Be Exploited (Rep.). Data & Society.

[50] Breland, A. (2019, March 15). How the Suspected New Zealand Gunman Weaponized the Internet. Mother Jones.

[51] Editorial. (2018, November 24). The New Radicalization of the Internet. The New York Times. Retrieved April 25, 2019.

[52] Bergen, M. (2019, April 2). YouTube Executives Ignored Warnings, Letting Toxic Videos Run Rampant. Retrieved April 25, 2019, from https://www.bloomberg.com/news/features/2019-04-02/youtube-executives-ignored-warnings-letting-toxic-videos-run-rampant

[53] Solsman, J. E. (2018, January 10). YouTube's AI is the puppet master over most of what you watch. Retrieved April 25, 2019, from https://www.cnet.com/news/youtube-ces-2018-neal-mohan/

[54] Nishijima, I. (2021). TikTok’s secret money maker is actually its algorithms. Medium. Retrieved from: https://medium.com/headlineasia/tiktoks-biggest-money-maker-is-actually-an-algorithm-879c5518db53

[55] Weill, K. (2018, December 17). How YouTube Built a Radicalization Machine for the Far-Right. Retrieved from https://www.thedailybeast.com/how-youtube-pulled-these-men-down-a-vortex-of-far-right-hate?ref=home

[56] McInnes, Molyneux, and 4chan: Investigating pathways to the alt-right. (2018, April 19). Retrieved April 26, 2019, from https://www.splcenter.org/20180419/mcinnes-molyneux-and-4chan-investigating-pathways-alt-right

[57] Weill, K. (2018, December 17). How YouTube Built a Radicalization Machine for the Far-Right. Retrieved from https://www.thedailybeast.com/how-youtube-pulled-these-men-down-a-vortex-of-far-right-hate?ref=home

[58] McInnes, Molyneux, and 4chan: Investigating pathways to the alt-right. (2018, April 19). Retrieved April 26, 2019, from https://www.splcenter.org/20180419/mcinnes-molyneux-and-4chan-investigating-pathways-alt-right

[59] Marwick, A., & Lewis, R. (2017) Media Manipulation and Disinformation Online. Data & Society.

[60] Ford, T. E., & Ferguson, M. A. (2004). Social consequences of disparagement humor: A prejudiced norm theory. Personality and Social Psychology Review, 8, 79-94.

[61] Wilson, J. (2017, May 23). Hiding in plain sight: How the 'alt-right' is weaponizing irony to spread fascism. The Guardian. Retrieved from https://www.theguardian.com/technology/2017/may/23/alt-right-online-humor-as-a-weapon-facism

[62] Feinberg, A. (2017, December 14). This Is The Daily Stormer's Playbook. HuffPost. Retrieved from https://www.huffingtonpost.ca/entry/daily-stormer-nazi-style-guide_n_5a2ece19e4b0ce3b344492f2

[63] McInnes, Molyneux, and 4chan: Investigating pathways to the alt-right. (2018, April 19). Retrieved April 26, 2019, from https://www.splcenter.org/20180419/mcinnes-molyneux-and-4chan-investigating-pathways-alt-right

[64] Deo. (2017, November 23). How White Nationalism Courts Internet Nerd Culture. Retrieved from https://medium.com/@DeoTasDevil/how-white-nationalism-courts-internet-nerd-culture-b4ebad07863d

[65] Pink, A. (2017, December 13). Leaked 'Playbook' Shows How The Daily Stormer Makes Anti-Semitism Cool. Retrieved from https://forward.com/fast-forward/390000/leaked-playbook-shows-how-the-daily-stormer-makes-anti-semitism-cool/

[66] Collins, C. “What is the ‘Alt-Right’?” Teaching Tolerance 1.57 (2017)

[67] McInnes, Molyneux, and 4chan: Investigating pathways to the alt-right. (2018, April 19). Retrieved April 26, 2019, from https://www.splcenter.org/20180419/mcinnes-molyneux-and-4chan-investigating-pathways-alt-right