Deconstructing Online Hate

The internet has become a prime means of communication worldwide, and this unprecedented global reach – combined with the difficulty of tracking communications – makes it an ideal tool for extremists to repackage old hatred, raise funds, and recruit members. As the internet has grown and changed, hate groups and movements have adapted, creating websites, forums and social network profiles, becoming active in spaces such as online games, and even creating parallel versions of services such as Twitter and Wikipedia.

Cultures of Hatred

Youth don’t have to encounter overt hate speech to be exposed to hate online. Because of the networked nature of the internet, it’s possible for them to come across hate content entirely by accident: many online spaces contain “cultures of hatred,” communities in which racism, misogyny and other prejudices are normalized. Sites such as 4chan and some parts of Reddit make being offensive a virtue, while sexist, racist and homophobic slurs are common in online games.[1] MediaSmarts’ research has found that youth witness prejudiced speech on all of the platforms they use routinely.[2] Although young people may not participate in this prejudiced speech, they often cannot help but find their own views and opinions influenced by the values of the communities they are a part of. As well, because digital media are interconnected, small, mostly closed communities such as 4chan (in which radicalization occurs more rapidly) can have a significant influence on larger, more open communities, either by actively “trolling” them[3] or simply by participating in them.

Forms of Online Hate

Websites

The most public face of online hate consists of websites maintained by hate groups: in fact, hate groups were some of the earliest adopters of the internet, moving their existing materials wholesale first onto discussion groups and then to Web pages.[4] While Web publishing is much cheaper and easier than print publishing, the costs and technical demands of creating a website meant that in the early days of the internet only the larger hate groups had an online presence. Today, with the easy availability of blogging platforms and simple Web publishing software, hate sites have proliferated. These sites have evolved to mimic popular commercial websites in a variety of ways: many offer audiovisual material and discussion forums, and some feature professional-looking layouts, with graphics and writing modelled on popular sites such as Buzzfeed,[5] reaching audiences as large as four million page views per month.[6]

Many hate groups also maintain disguised, or cloaked, hate sites.[7] Overt hate sites actively promote hatred towards other groups while showcasing racist propaganda or offering hate-based communities online. In contrast, cloaked hate websites intentionally perpetuate hatred through more implicit and deceptive messages, masquerading as legitimate sources of opinion or information with neutral-sounding names such as “National Policy Institute” (a supposed “think tank” identified as a White supremacist organization by the Southern Poverty Law Center).[8] The purpose of these sites is to reach young people before they have the knowledge or critical awareness to recognize them for what they are – one study found that over 60 percent of middle and high school students have drawn on them as research sources for school assignments[9] – and before they have the ability to critically evaluate cloaked sites.[10]

At first glance, cloaked websites appear to be authoritative and professional, claiming to be based upon core values of ‘truth and free speech.’ Many attempt to conceal a racist agenda behind a more moderate message of “education” or “debate.” This is true not only of sites which masquerade as mainstream sources of information, but of more overt hate sites as well, which position themselves as providing visitors with “the real story.” Many overt hate sites also attack the mainstream educational system and call on supporters to educate their friends, families and communities about the “real” truth.

A related phenomenon is what has been called ‘reasonable racism.’[11] These are hate sites which present their content as political provocation or debate, relying on pseudo-science and twisted logic rather than outright expressions of hate. Studies suggest that while stereotypical images and messages on hate sites have a more immediate influence on viewers, their effects do not last, except among individuals who are already receptive to extremist or hate messages. More implicit hate messages, however, such as those found on these types of websites, have greater long-term persuasive power.[12]

Music

Many teenagers turn to musical genres and subcultures to help define their identity, and hate music producers take advantage of this. While hate rock is no longer the cash cow it once was for the larger hate groups,[13] and iTunes and Spotify have removed most of it from their services, it remains widespread not just on hate sites, but on file-sharing sites and mainstream services such as YouTube.[14]

Videos

As well as being the most popular online platform for young Canadians, YouTube is also where young people are most likely to turn when they want to learn something.[15] Perhaps not surprisingly, it’s also one of the sites where youth are likely to encounter hate content: in one study, one in six self-identified fascists said that their radicalization began as a result of videos they’d seen on YouTube.[16]

As with cloaked hate sites, many of these videos take an educational approach, flattering viewers by promising to share facts that have been “suppressed” by the authorities. However, rather than serving as sources for school assignments, these videos are “quick to watch, easy to share and spread.”[17]

Hate groups and individuals not only use YouTube to spread their message, but also to raise funds, using features such as “Super Chat” that encourage viewers to donate money to support their favourite online personalities.[18]

Games

Hate groups also make an effort to appeal to youth through video games, though the games tend to be more rudimentary and less effective than music: it’s easier to create garage-band quality punk or heavy metal music than to make a quality video game. Most are simple games with little replay value; it’s hard to imagine youth spending much time playing any of these, but their appeal may be their sheer outrageousness.

Gaming culture, however, has proven to be fertile ground for hate movements. Through websites where video games are discussed, such as some Reddit subforums, as well as video sites such as YouTube and Twitch, where gamers stream to a worldwide audience, hate movements foster and amplify the racism, homophobia and misogyny that are already a part of the experience in many gaming communities.[19]

Social Networks

Online social networks encourage group interaction and strengthen connections between group members. While this has made it possible for a wider range of voices to be heard, hate groups use sites such as Facebook and Twitter to recruit and connect members through fan pages, group feeds and hate-focused groups. As youth move to newer social networks, such as Instagram, Snapchat, and TikTok, both prejudiced speech[20] and outright hate content[21] follow them there as well. Additionally, social networks allow hate groups to “whitewash” their content: to the casual viewer, a link to a hate site shared on Facebook or Twitter looks the same as a link to a legitimate source of news or information.

While large public social networks such as Facebook and Twitter are where youth most often report encountering prejudice online,[22] niche sites such as 4chan play an important role in exposing youth to more severe hate content, and have been credited by many hate group members as the site of their own radicalization.[23] As well, hate groups have responded to more aggressive efforts by mainstream social networks to remove them by establishing their own services; though none of these has been especially successful, they provide a space where the most extreme phases of radicalization often take place.[24]

The ability of social media to help youth interested in hate find friends and mentors is the key to developing the sense of group identity that’s so important in the radicalization process. For hate groups, the greatest advantage of social media is not that it allows them to reach youth, but that it allows youth to disseminate hate material themselves through sharing and liking content, recommending videos to friends, and expressing prejudiced views in their online communities.

Memes

Internet memes – a meme can be defined as “an activity, concept, catchphrase, or piece of media that spreads, often ... for humorous purposes, from person to person via the Internet”[25] – appear in nearly all online spaces, and are particularly popular in social networks that are based on sharing images, such as Snapchat and Instagram.

Memes have been eagerly adopted by hate movements for a number of reasons. Gianluca Stringhini, a professor at Boston University who studies memes, describes them as “tools for information warfare”:[26] because they are a form of “inside joke,” they can be used to send different messages to viewers at different stages of the radicalization process.

As well as allowing hate groups to cloak their message in humour and irony, memes have been used to revive and spread stereotyped images that serve both to confirm viewers’ prejudices and to give them (and the person sharing the meme) the thrill of being “edgy.”[27]

Finally, memes by their nature spread rapidly. While most are fairly innocent, far-right groups have had considerable success in pushing their own memes onto mainstream platforms.[28] In fact, studies of memes have found that a disproportionate share of those that spread most widely originate in two online forums dominated by hate content.[29]

Search and Recommendation Algorithms

Youth may also encounter hate material as a result of the search and recommendation algorithms that underpin most of the internet. Hate groups take great pains to manipulate these algorithms – for example, for many years an anti-Semitic site was the top result when the word “Jew” was searched on Google, because the word was heavily used both on the site and on other sites that linked to that page. Hate groups also take advantage of “data voids,” situations where little or no content exists online about a particular search term: when a major news event occurs, for example, they will rush out conspiracy videos about the story before legitimate outlets can cover it, or coin terms or phrases that, when searched, will lead to their content.[30] This effect can be amplified when legitimate news organizations, covering hate incidents, publicize terms that will lead “any disillusioned young man [to] start googling his way down a rabbit hole just by using that search term.”[31]
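
The mechanics of a data void can be illustrated with a toy example. The sketch below is a deliberately simplified, hypothetical search ranking (the site names, page text and scoring are all invented for illustration, and bear no relation to Google’s actual system): because a newly coined phrase appears only on the pages that promote it, even a straightforward relevance ranking has nothing else to return for that query.

```python
# A minimal sketch of a "data void": if a coined phrase appears almost nowhere
# except on extremist pages, a simple term-frequency ranking can only surface
# those pages. All names and text below are made up for illustration.
pages = {
    "mainstream-news.example/event":  "initial report on the event with verified details",
    "encyclopedia.example/event":     "background article explaining the event and its context",
    "conspiracy.example/coined-term": "coined-term coined-term the real story they are hiding coined-term",
    "forum.example/coined-term":      "discussion of coined-term linking to more coined-term videos",
}

def score(page_text: str, query: str) -> int:
    # "Relevance" here is just how often the query words appear on the page.
    words = page_text.lower().split()
    return sum(words.count(term) for term in query.lower().split())

query = "coined-term"  # the phrase the hate group invented and promoted
ranked = sorted(pages, key=lambda url: score(pages[url], query), reverse=True)
print(ranked[:2])  # only the pages that actually use the coined phrase can rank for it
```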

While hate groups work hard to manipulate these algorithms, there is also evidence that the algorithms themselves favour more and more extreme content “in a feedback loop that, link by link, guides new audiences to toxic ideas.”[32] Recommendation algorithms, such as those used by social networks and video sites to suggest (or in some cases decide) what content you should view, are designed to promote content that will keep users engaged; intentionally or unintentionally, these algorithms have often concluded that “outrage equals attention.”[33]
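
The “feedback loop” described above can be seen in a deliberately simplified simulation. The sketch below is not any platform’s real recommender: it assumes a small hypothetical catalogue in which more provocative items draw more clicks, and shows how a system that greedily recommends whatever it currently estimates to be most engaging, and learns only from the clicks it observes, tends to drift toward the most extreme content.

```python
# A toy simulation (not any real platform's code) of an engagement-driven
# recommender: show the item currently estimated to be most engaging, then
# update that estimate from observed clicks. The click rates are invented and
# assume, for the sake of the sketch, that "outrage equals attention."
import random
from collections import Counter

random.seed(1)

click_rate = {  # hypothetical probability that a viewer clicks each item
    "neutral news": 0.20,
    "hot take": 0.40,
    "outrage bait": 0.60,
    "extreme video": 0.80,
}
# Start with optimistic estimates so every item gets tried at least once.
estimate = {item: 1.0 for item in click_rate}
times_shown = Counter()

for _ in range(500):
    item = max(estimate, key=estimate.get)        # recommend the "most engaging" item
    times_shown[item] += 1
    clicked = random.random() < click_rate[item]  # simulated viewer reaction
    # Feedback loop: nudge the estimate toward what was actually observed.
    estimate[item] += 0.1 * ((1.0 if clicked else 0.0) - estimate[item])

# Counts typically end up concentrated on the most provocative items.
print(times_shown.most_common())
```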

These algorithms are hugely influential – according to YouTube, seventy percent of views come from its recommendations[34] – and they can play an important role in exposing youth to progressively more radical content. One former extremist said, “I think YouTube certainly played a role in my shift to the right because through the recommendations I got…It led me to discover other content that was very much right of center, and this only got progressively worse over time, leading me to discover more sinister content.”[35]

Emerging Platforms

Hate movements are constantly moving to new platforms, both because existing platforms may take action to limit their presence and because they follow youth as they adopt new apps and social networks. For instance, Islamic extremist groups, which have been aggressively removed from major platforms such as Facebook and YouTube, have moved to chat apps such as RocketChat and Discord,[36] a gaming-focused messaging app also popular with far-right groups.[37] These groups have also adopted cross-platform tools such as Disqus, which provides the architecture for the comments sections of more than half a million websites.[38] The reach of this commenting network, which ranges from tiny blogs to partisan sites such as Breitbart to large mainstream sources like The Atlantic, makes it easy for hate movements to organize in small communities and then spread their message in larger ones. As E.J. Gibney, a researcher who tracks hate content on Disqus, has put it, “When the worst parts of the internet want to try and be a little more mainstream, they go to Disqus.”[39]

Moving from Fringe to Mainstream

Taking their attitudes and ideas from the fringe to the mainstream is one of the main goals of any hate movement. The networked nature of digital technology makes this much easier: first, because anyone can now publish content and distribute it potentially worldwide, and second, because this network of connections makes it much easier for ideas, attitudes and content to move between platforms and communities. (For example, one meme developed on a Reddit forum was repeated on Fox News four days later.)[40]

Mainstream media, which still reach the largest audiences (and, in particular, the largest number of voters in most countries) are an essential part of this process as well, particularly when they take the fact that something is being discussed on social media as evidence that it’s newsworthy. As technology columnist Farhad Manjoo put it, “extreme points of views ... that couldn't have been introduced into national discussion in the past are being introduced now by this sort of entry mechanism... people put it on blogs, and then it gets picked up by cable news, and then it becomes a national discussion.”[41]

Similarly, traditional media can play a role in exposing audiences to terms that, when searched, will lead viewers to extremist content. Sometimes this is with the best of intentions: for example, CNN anchor Anderson Cooper asked David Hogg, a survivor of the Parkland school shooting who had become a gun control advocate, about rumours that he was a “crisis actor” paid to take part in a fake event. Not surprisingly, most searches for the term “crisis actor” led to content supporting the conspiracy theory.[42] Other times, though, hate messages spread between media outlets that overlap one another on the ideological spectrum, such as when ideas move from White supremacist forums, to alt-right Twitter and YouTube personalities, to conservative TV commentators, and finally to mainstream TV news.[43]

Because of the networked nature of digital media, hate groups can also simultaneously put out extreme content that is designed to energize those who have already been radicalized, and share content designed to seem reasonable to more general audiences. Even the Daily Stormer website, whose audience is already expected to be open to racist and sexist humour, takes pains to put on the trappings of legitimacy:

In explaining why a writer should heavily block-quote mainstream news articles, the guide notes that it allows writers to borrow some of mainstream media’s air of scrupulousness and good hygiene. 

They also take pains to try and not turn off any readers who might be just starting to flirt with white supremacy. Again, this manifests itself in a concerted effort to mix the vicious and the mundane.[44]

For the same reason, arguments are often made in terms of pseudo-science: misinterpretations of fields such as neurology, genetics and psychology that can make prejudice seem reasonable. For example, one factor often mentioned by those who’ve moved into racist movements is so-called “race realism,” which draws on outdated and distorted scientific views to provide a rationale for racism,[45] while others describe being led from the online atheism community, which has a strong anti-feminist component, to radical misogynist groups.[46]

Ironically, hate groups appeal to the mainstream by casting their arguments as “skepticism,” with the suggestion that they are giving the audience the “real” truth that has been suppressed by the mainstream media.[47] Similarly, some groups present themselves as neutral proponents of “debate” on issues that are beyond debating, such as whether the Holocaust happened.

For this reason, some experts have argued for a public health approach to hate content, making it more difficult to spread between networks and targeting “superspreaders,” because “a small number of people have a massive impact.”[48] However, hate groups are often skilful at avoiding filters, replacing well-known slurs with acronyms or in-joke equivalents – using, for example, words such as “Google” and “Skype,” which platforms will be unwilling to block, as stand-ins for racist and anti-Semitic slurs.[49] This approach – or, similarly, campaigns to appropriate things like drinking milk or making the OK sign as hate symbols – not only allows groups to get around algorithms designed to block them, it also provides youth with the thrill of “trolling” others (provoking people in order to enjoy seeing them react).[50]
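
Why this kind of word substitution defeats simple moderation tools can be shown with a small sketch. The blocklist filter below is a deliberately naive, hypothetical example (not any platform’s real system): it catches the original terms, but has no way to flag everyday words being used as coded stand-ins.

```python
# A toy keyword filter (not any platform's real moderation system). The
# placeholder tokens "slur1" and "slur2" stand in for actual slurs.
BLOCKLIST = {"slur1", "slur2"}

def naive_filter(post: str) -> bool:
    """Return True if the post would be blocked by a plain keyword match."""
    words = {word.strip(".,!?").lower() for word in post.split()}
    return bool(words & BLOCKLIST)

print(naive_filter("post attacking a group using slur1"))             # True: blocked
print(naive_filter("the same attack written with google and skype"))  # False: the coded
# stand-ins sail through, because a platform cannot block everyday words.
```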

Irony

The roots of modern hate movements in “troll” culture are a big part of why they mask their message as humour or irony.[51] This does not diminish the impact of the message – indeed, prejudiced jokes have been found to have a significant effect on which attitudes are considered acceptable in a community[52] – but it does make it easier to move it into the mainstream.

As well, as Alice Marwick writes, “irony has a strategic function. It allows people to disclaim a real commitment to far-right ideas while still espousing them.”[53] The style guide for the Daily Stormer states this clearly: “The unindoctrinated should not be able to tell if we are joking or not.”[54]

Irony can also allow youth in the early stages of radicalization to deny to themselves their commitment to this community: many of those radicalized on 4chan describe initially being drawn to its more extreme forums by the thrill of “ironically” consuming racist memes, especially once those began to proliferate after the 2012 re-election of Barack Obama.[55]

Finally, it serves as one more way of matching the message to the target’s readiness to receive it: “Masking ideology in humor also allows plausible deniability for when the recruiter pushes the target too hard too fast, it is easy to lighten up and back off as ‘just a joke’ and then later once the target’s guard lowers to resume the indoctrination.”[56] Or, as the Daily Stormer’s style guide itself states, “The goal is to continually repeat the same points, over and over and over and over again. The reader is at first drawn in by curiosity or the naughty humor, and is slowly awakened to reality by repeatedly reading the same points.”[57]

Keegan Hankes, an intelligence analyst at the Southern Poverty Law Center, identifies irony and troll culture as a key factor in the success of more recent hate movements: “The thing that the alt-right does better than white nationalism has done, in the years that I’ve been tracking it, is it gets young people involved.”[58] As one user of a White supremacist forum put it, “If you can get them to laugh, you can get them on our side.”[59]

 

[1] Kuznekoff, J. H., & Rose, L. M. (2012). Communication in multiplayer gaming: Examining player responses to gender cues. New Media & Society, 15(4), 541-556. doi:10.1177/1461444812458271
[2] Brisson-Boivin, K (2019). “Pushing Back Against Hate Online.” MediaSmarts. Ottawa.
[3] Tynes et al. Online Racial Discrimination and Psychological Adjustment Among Adolescents. Journal of Adolescent Health, 2008; 43 (6): 565
[4] Farber, B. M. (1997). From marches to modems: A report on organized hate in Metropolitan Toronto. Toronto: METRO, Access and Equity Centre.
[5] Hankes, K. (2017, February 9). Eye of the Stormer. Retrieved April 25, 2019, from https://www.splcenter.org/fighting-hate/intelligence-report/2017/eye-stormer
[6] Beirich, H. (2019, February 20). The Year in Hate: Rage Against Change. Retrieved April 25, 2019, from https://www.splcenter.org/fighting-hate/intelligence-report/2019/year-hate-rage-against-change
[7] Daniels, J. (2008). Race, Civil Rights, and Hate Speech in the Digital Era. Learning Race and Ethnicity: Youth and Digital Media (pp. 129-154). Cambridge, MA: MIT Press.; McNamee, Pena & Peterson (2010)
[8] The Groups. (2010, January 29). Retrieved from https://www.splcenter.org/fighting-hate/intelligence-report/2015/groups
[9] Lewis, B. (2018). Alternative Influence: Broadcasting the Reactionary Right on YouTube (Rep.). Data & Society.
[10] Daniels, J. (2008). Race, Civil Rights, and Hate Speech in the Digital Era. Learning Race and Ethnicity: Youth and Digital Media (pp. 129-154). Cambridge, MA: MIT Press.; McNamee, Pena & Peterson (2010)
[11] Meddaugh, P. M. (2009). Hate Speech or “Reasonable Racism?” The Other in Stormfront. Journal of Mass Media Ethics, 24(4), 251-268.
[12] Amon, K. (2010). Grooming for Terror: the Internet and Young People. Psychiatry, Psychology & Law, 17(3), 424-437.; Chhabra, C. (2010). Mining You-Tube to Discover Extremist Videos, Users and Hidden Communities. Information Retrieval Technology, 6458, 13-24 .; Lee, E., & Leets, L. (2002). Persuasive Storytelling by Hate Groups Online - Examining Its Effects on Adolescents. American Behavioural Scientist, 45(6), 927-957.
[13] Kim, T.K. (2006). White Noise. Intelligence Report, 121.
[14] Hogan, M. (2017, August 17). Is White Power Music Finally Getting Booted from the Internet? Retrieved from https://pitchfork.com/thepitch/is-white-power-music-finally-getting-booted-from-the-internet/
[15] Beyond Millennials: The Next Generation of Learners (Rep.). (2018). Pearson.
[16] From Memes to Infowars: How 75 Fascist Activists Were "Red-Pilled". (2018, October 11). Retrieved April 25, 2019, from https://www.bellingcat.com/news/americas/2018/10/11/memes-infowars-75-fascist-activists-red-pilled/
[17] Deo. (2017, November 23). How White Nationalism Courts Internet Nerd Culture. Retrieved from https://medium.com/@DeoTasDevil/how-white-nationalism-courts-internet-nerd-culture-b4ebad07863d
[18] Warzel, C. (2019, March 19). We're Asking the Wrong Questions of YouTube and Facebook After New Zealand. The New York Times. Retrieved April 25, 2019, from https://www.nytimes.com/2019/03/19/opinion/facebook-youtube-new-zealand.html
[19] Weill, K. (2018, December 17). How YouTube Built a Radicalization Machine for the Far-Right. Retrieved from https://www.thedailybeast.com/how-youtube-pulled-these-men-down-a-vortex-of-far-right-hate?ref=home
[20] Brisson-Boivin, K (2019). “Pushing Back Against Hate Online.” MediaSmarts. Ottawa.
[21] Cox, J. (2018, December 18). TikTok Has a Nazi Problem. Retrieved April 25, 2019, from https://motherboard.vice.com/en_us/article/yw74gy/tiktok-neo-nazis-white-supremacy
[22] Brisson-Boivin, K (2019). “Pushing Back Against Hate Online.” MediaSmarts. Ottawa.
[23] From Memes to Infowars: How 75 Fascist Activists Were "Red-Pilled". (2018, October 11). Retrieved April 25, 2019, from https://www.bellingcat.com/news/americas/2018/10/11/memes-infowars-75-fascist-activists-red-pilled/
[24] Beirich, H. (2019, February 20). The Year in Hate: Rage Against Change. Retrieved April 25, 2019, from https://www.splcenter.org/fighting-hate/intelligence-report/2019/year-hate-rage-against-change
[25] Internet meme. (2019, May 13). Retrieved from https://en.wikipedia.org/wiki/Internet_meme
[26] Mencimer, S. (2019, April 2). “The Left Can’t Meme”: How Right-Wing Groups Are Training the Next Generation of Social Media Warriors. Mother Jones.
[27] Marsi, F. (2019, March 26). How the far right is weaponising irony to spread anti-Muslim hatred. Retrieved April 25, 2019, from https://www.thenational.ae/world/europe/how-the-far-right-is-weaponising-irony-to-spread-anti-muslim-hatred-1.841430
[28] Mencimer, S. (2019, April 2). “The Left Can’t Meme”: How Right-Wing Groups Are Training the Next Generation of Social Media Warriors. Mother Jones.
[29] The Arxiv (2018, June 11). This is where internet memes come from. Retrieved from https://www.technologyreview.com/s/611332/this-is-where-internet-memes-come-from/
[30] Golebiewski, M., & Boyd, D. (2018). Data Voids: Where Missing Data Can Easily Be Exploited (Rep.). Data & Society.
[31] Breland, A. (2019, March 15). How the Suspected New Zealand Gunman Weaponized the Internet. Mother Jones.
[32] Editorial. (2018, November 24). The New Radicalization of the Internet. The New York Times. Retrieved April 25, 2019.
[33] Bergen, M. (2019, April 2). YouTube Executives Ignored Warnings, Letting Toxic Videos Run Rampant. Retrieved April 25, 2019, from https://www.bloomberg.com/news/features/2019-04-02/youtube-executives-ignored-warnings-letting-toxic-videos-run-rampant
[34] Solsman, J. E. (2018, January 10). YouTube's AI is the puppet master over most of what you watch. Retrieved April 25, 2019, from https://www.cnet.com/news/youtube-ces-2018-neal-mohan/
[35] Weill, K. (2018, December 17). How YouTube Built a Radicalization Machine for the Far-Right. Retrieved from https://www.thedailybeast.com/how-youtube-pulled-these-men-down-a-vortex-of-far-right-hate?ref=home
[36] Katz, R. (2019, February 06). A Growing Frontier for Terrorist Groups: Unsuspecting Chat Apps. Retrieved April 26, 2019, from https://www.wired.com/story/terrorist-groups-prey-on-unsuspecting-chat-apps/
[37] Garfield, B. (2019, March 22). The Secret Chats of the Alt-Right | On the Media. Retrieved April 26, 2019, from https://www.wnycstudios.org/story/secret-chats-alt-right-unicorn-riot
[38] Disqus Market Share and Competitor Report. (2019, April 26). Retrieved from https://www.datanyze.com/market-share/comment-systems/disqus
[39] Warzel, C. (2018, June 05). How The Alt-Right Manipulates The Internet's Biggest Commenting Platform. Retrieved from https://www.buzzfeednews.com/article/charliewarzel/how-the-alt-right-manipulates-disqus-comment-threads
[40] Collins, K., & Roose, K. (2018, November 4). Tracing a Meme From the Internet’s Fringe to a Republican Slogan. The New York Times. Retrieved April 26, 2019, from https://www.nytimes.com/interactive/2018/11/04/technology/jobs-not-mobs.html
[41] Does the Internet Help or Hurt Democracy [Television series episode]. (2010, June 1). In News Hour. PBS.
[42] Boyd, D. (2019, April 24). The Fragmentation of Truth. Retrieved April 26, 2019, from https://points.datasociety.net/the-fragmentation-of-truth-3c766ebb74cf
[43] Daniels, J. (2019, April 9). UN Keynote Speech: Racism in Modern Information and Communication Technologies. Presented to the 10th session of the Ad Hoc Committee on the Elaboration of Complementary Standards to the International Convention on the Elimination of All Forms of Racial Discrimination. Geneva.
[44] Feinberg, A. (2017, December 14). This Is The Daily Stormer's Playbook. Retrieved from https://www.huffingtonpost.ca/entry/daily-stormer-nazi-style-guide_n_5a2ece19e4b0ce3b344492f2
[45] McInnes, Molyneux, and 4chan: Investigating pathways to the alt-right. (2018, April 19). Retrieved April 26, 2019, from https://www.splcenter.org/20180419/mcinnes-molyneux-and-4chan-investigating-pathways-alt-right
[46] Weill, K. (2018, December 17). How YouTube Built a Radicalization Machine for the Far-Right. Retrieved from https://www.thedailybeast.com/how-youtube-pulled-these-men-down-a-vortex-of-far-right-hate?ref=home
[47] McInnes, Molyneux, and 4chan: Investigating pathways to the alt-right. (2018, April 19). Retrieved April 26, 2019, from https://www.splcenter.org/20180419/mcinnes-molyneux-and-4chan-investigating-pathways-alt-right
[48] Amend, A. (2018, February 10). Silicon Valley's Year in Hate. Retrieved April 26, 2019, from https://www.splcenter.org/fighting-hate/intelligence-report/2018/silicon-valleys-year-hate
[49] Meaker, M. (2019, February 28). 'Hate Is Way More Interesting Than That': Why Algorithms Can't Stop Toxic Speech Online. Retrieved from https://psmag.com/ideas/neo-nazi-hate-speech-foiling-algorithms
[50] New Hate and Old: The Changing Face of American White Supremacy (Rep.). (2018). Anti-Defamation League Center on Extremism. Retrieved from https://www.adl.org/new-hate-and-old
[51] Marwick, A., & Lewis, R. (2017) Media Manipulation and Disinformation Online. Data & Society.
[52] Ford, T. E., & Ferguson, M. A. (2004). Social consequences of disparagement humor: A prejudiced norm theory. Personality and Social Psychology Review, 8, 79-94.
[53] Wilson, J. (2017, March 23). Hiding in plain sight: How the 'alt-right' is weaponizing irony to spread fascism. The Guardian. Retrieved from https://www.theguardian.com/technology/2017/may/23/alt-right-online-humor-as-a-weapon-facism
[54] Feinberg, A. (2017, December 14). This Is The Daily Stormer's Playbook. Retrieved from https://www.huffingtonpost.ca/entry/daily-stormer-nazi-style-guide_n_5a2ece19e4b0ce3b344492f2
[55] McInnes, Molyneux, and 4chan: Investigating pathways to the alt-right. (2018, April 19). Retrieved April 26, 2019, from https://www.splcenter.org/20180419/mcinnes-molyneux-and-4chan-investigating-pathways-alt-right
[56] Deo. (2017, November 23). How White Nationalism Courts Internet Nerd Culture. Retrieved from https://medium.com/@DeoTasDevil/how-white-nationalism-courts-internet-nerd-culture-b4ebad07863d
[57] Pink, A. (2017, December 13). Leaked 'Playbook' Shows How The Daily Stormer Makes Anti-Semitism Cool. Retrieved from https://forward.com/fast-forward/390000/leaked-playbook-shows-how-the-daily-stormer-makes-anti-semitism-cool/
[58] Collins, C. “What is the ‘Alt-Right’?” Teaching Tolerance 1.57 (2017)
[59] McInnes, Molyneux, and 4chan: Investigating pathways to the alt-right. (2018, April 19). Retrieved April 26, 2019, from https://www.splcenter.org/20180419/mcinnes-molyneux-and-4chan-investigating-pathways-alt-right