Impact of Online Hate

Online hate can have an impact in three interconnected ways:

  • the harm done to its targets, either from personal harassment or from online spaces being experienced as hostile;
  • the risk that those who encounter it may be radicalized by it, becoming more sympathetic and possibly even active; and
  • the effect that it has on the values and culture of the online spaces in which it happens.

Harm

The first of these is the clearest, though – as with cyberbullying – the harm done may not be visible to perpetrators. Indeed, a significant amount of cyberbullying is motivated by hate: for example, lesbian, gay, bisexual and transgender (LGBT) youth are almost twice as likely to report having been bullied online as those who are straight,[1] while young women are twice as likely to have been sexually harassed online as young men.[2] Young people who experience online hate are more likely to experience anxiety and depression,[3] and targets of online hate may suffer harassment and violence offline as well.[4] A frequent form of targeted hate is “doxxing,” the act of publishing a target’s home address or other personal information as a way of encouraging others to harass them.[5] As a result, members of vulnerable groups may be more reluctant to speak freely online[6] or may withdraw from online spaces entirely,[7] which has an impact not just on them but also on the online communities they’re a part of.

Radicalization

The second possible impact of online hate is radicalization. This term refers to the process by which people come to believe that violence against others and even oneself is justified in defense of their own group. Not everyone who is involved in a group is necessarily radicalized to the same degree; in fact, even within a hate group, only a small number of people may be radicalized to the point where they are ready to advocate and commit violent acts.

One way of looking at the process is to think of any group or movement as a pyramid.[8] (While there has been some recent criticism of this model of radicalization, leading the authors of the original paper to propose a two-pyramid model that separates radicalization of opinion from radicalization of action,[9] it remains a valuable way to model radicalized groups.)

The base of the pyramid is made up of Sympathizers who support the group and share its ideals but who are not actively involved in what it’s doing. They are typically the largest part of the group but also the least committed.

The next level we might call the Members. These are people who identify themselves strongly with the group and participate in its everyday activities.

At the final level are Activists. These are the members who identify most strongly with the group and are likely to push it towards more radical positions and more extreme actions. The most extreme of these are those who commit violent and other criminal acts. While it’s not always clear how each person becomes radicalized to violence, in Canada alone there have been at least three hate-motivated mass murders whose perpetrators were at least partially radicalized online.[10]

The process of radicalization has traditionally been seen as the way in which people move up the pyramid to identify more deeply with their group and become more willing to support or engage in extreme acts. The networked nature of digital media, however, allows hate groups and movements to simultaneously target all of the different levels of the pyramid, making it “remarkably easy for viewers to be exposed to incrementally more extremist content”;[11] one scholar has described the internet as a “conveyor belt” to radicalization.[12] On extremist forums, “how to red-pill [i.e. radicalize] others is a constant topic of conversation,” with a great deal of thought put into matching the message with the target’s readiness.[13]

How Radicalization Happens

In their article Mechanisms of Political Radicalization,[14] Clark McCauley and Sophia Moskalenko identify 12 ways in which a person or group may become more radicalized. “Most of the mechanisms identified are associated with strong emotional experiences, including anger, shame, guilt, humiliation, fear, love and hate,”[15] and in most cases of radicalization, more than one mechanism is at work.

Of those, five are of particular relevance to studying online hate:

  1. The Slippery Slope – It’s rare for anyone to become radicalized by a single act or event; radicalization is more often the culmination of many small steps. Research has shown that people have a tremendous ability to justify their actions, even actions they would normally consider wrong. (For instance, a person who forgets to leave a tip at a restaurant may retroactively find flaws with the service to justify the omission.) This can have the effect of shifting our morality: once we have established that something we previously considered wrong was actually right, more extreme actions may become permissible. This effect is particularly powerful online, where consequences are less apparent: the slope that leads from reading hateful content to creating it can be very slippery.
      
  2. The Power of Love – The social and emotional effects of being in a group can be just as powerful as whatever cause or ideology the group is committed to. Research has shown that members of hate groups such as skinheads will often act as mentors or ‘big brothers’ to vulnerable youth, providing a sympathetic ear, an explanation for their problems, and a way of taking action. Online spaces and communities used by hate groups have many of the same social features – forums, “likes” and “upvotes,” and opportunities to accumulate special privileges and social capital – as mainstream social spaces,[16] but hate movements can fulfil deeper emotional needs for members who “feel shunned in their lives, in their personal lives or in wider society”[17]; as the moderator of one of the largest misogynist forums on Reddit put it, “The manosphere fundamentally became a surrogate father for the life lessons I never got.”[18]

To create this support network, many of the techniques used by hate groups are intended to build group solidarity. Calls to protect the group, and in particular the most vulnerable within the group, are useful both for building support and for radicalizing supporters. Another way of meeting emotional needs is to bolster members’ self-esteem by building up the most extreme members of the group (frequently, those who have committed violence or even murder in the name of the cause)[19] as heroes, giving an opportunity to less-committed members to imagine themselves as heroes in defense of their group.

  3. Radicalization in Like-Minded Groups – All groups are subject to a phenomenon in which the average group member’s opinion will become more extreme over time. This may be because the more different your opinion is from that of the majority, the more pressure you feel to conform – so those who disagree with the majority are likely to change their opinion, while those who agree either maintain the same opinion or become more extreme in their views. An example of this is the Weather Underground, an American anti-war group, which in the 1970s moved from political protest to terrorism as a result of competition within the group over who was the ‘most radical’. This effect is further amplified by what’s been called the “majority illusion” in online spaces:

Socially connected individuals tend to be similar… giving rise to the “selective exposure” effect that leads individuals to overestimate the prevalence of their features in a population… creating an illusion that the attribute is far more common than it actually is. In a social network, this illusion may cause people to reach wrong conclusions about how common a behavior is, leading them to accept as a norm a behavior that is globally rare.[20]
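The structural cause of the “majority illusion” quoted above can be made concrete with a toy example: when the few members who hold a rare attribute are also the most connected, everyone else’s local view of the network overstates how common that attribute is. The sketch below is a deliberately extreme star-network illustration constructed for this article, not an example drawn from the Lerman et al. paper:

```python
# Toy illustration of the "majority illusion": in a star network where
# only the central hub holds an attribute, every other member sees the
# attribute in 100% of their neighbours, even though only 1 of 11
# members actually has it.
from statistics import mean

n_leaves = 10

# Adjacency list: the hub (node 0) connects to every leaf;
# each leaf connects only to the hub.
neighbours = {0: list(range(1, n_leaves + 1))}
for leaf in range(1, n_leaves + 1):
    neighbours[leaf] = [0]

# Only the highly connected hub holds the attribute.
has_attribute = {node: (node == 0) for node in neighbours}

# Global prevalence: 1 holder out of 11 members.
global_prevalence = mean(has_attribute.values())

# Each leaf's local view: the fraction of its neighbours holding
# the attribute (here, always 1.0, since its only neighbour is the hub).
local_views = [
    mean(has_attribute[nb] for nb in neighbours[leaf])
    for leaf in range(1, n_leaves + 1)
]

print(f"Global prevalence:  {global_prevalence:.2f}")  # 0.09
print(f"Average local view: {mean(local_views):.2f}")  # 1.00
```

The gap between the two numbers is the illusion: a behaviour held by fewer than a tenth of the group looks, from inside the network, like a unanimous norm.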

One common method of closing ranks is through symbolism. Groups may encourage members to distinguish themselves through using traditional hate symbols (either in earnest on closed forums, or “ironically” in more public spaces), using “dog whistles” (words or phrases whose hateful meaning is clear to members but not outsiders) or by co-opting mainstream symbols such as the Celtic cross, pagan runes and even the “OK” hand sign, re-signifying them as emblems of white supremacy. The Anti-Defamation League argues that hate symbols are more than mere signs: “These symbols are meant to inspire a sense of fear and insecurity. [They] give haters a sense of power and belonging, and a quick way of identifying with others who share their ideology”.[21]

  4. Radicalization Under Isolation or Threat – People will identify more closely with a group if the group appears to be isolated or under external threat. As one filmmaker who interviewed dozens of extremists of different stripes put it, “these movements are deeply rooted in a sense of victimhood, real or imagined.”[22] For example, while White males are certainly the most advantaged group in society,

the far right plays on a much broader dislike of “political correctness” among many young men who feel alienated from mainstream culture… They may see what economic and social capital they do have slipping away. These disillusioned men are perfect targets for radicalization, and it’s a surprisingly short leap from rejecting political correctness to blaming women, immigrants, or Muslims for their problems.[23]

This worldview, in which group members are pitted against an enemy with no possibility of compromise, “is especially enticing to people used to the black and white moral world of many video games, clear evil enemies on one side with the player on the side of good,”[24] but it also serves to justify hatred and violence as a form of self-defence. As Adam Klein, a professor at Pace University who has studied extremism in blogs and social networks, says, “rather than overt bigotry, most online hate looks a lot like fear.” These appeals to fear “create an illusion of imminent threat that radicals thrive on, and to which the violence-inclined among them have responded.”[25]

  5. Othering and Dehumanization

Perhaps the most potent technique for fostering radicalization is to portray opposing groups as being inhuman. This explicitly draws the line between the in- and out-groups and makes it easier to justify any action against them. For example, in World War II the Japanese were portrayed in a heavily caricatured style in American propaganda – always stereotyped, often threatening, and sometimes monstrous – with the result that roughly half of American soldiers were in favour of exterminating the Japanese nation after the war was over. In fact, servicemen who had not seen combat were actually more likely to advocate extermination – suggesting that it was exposure to propaganda, and not actual contact with the enemy, that had produced this attitude.[26] As psychologist Nick Haslam puts it, though, “dehumanization doesn’t only occur in wartime. It’s happening right here, right now. And every day, good people who don’t see themselves as being prejudiced bigots are nevertheless falling prey to it.”[27]

The “other” is not, however, the actual group as it exists in reality – some of the groups cited as enemies, such as “SJWs” and “antifa,” do not exist as concrete groups at all – but a fiction created to solidify the identity of the hate group and to justify its existence and its actions. The editor of one White supremacist website stated that “there should be a conscious agenda to dehumanize the enemy, to the point where people are ready to laugh at their deaths.”[28]

To achieve this, the Other must be portrayed as being both inferior, to establish the hate group’s superiority, and threatening, to establish the need to take action against them.[29] For this reason hate groups portray the Other in ways that emphasize difference – making them seem strange, even inhuman. This is often done through caricature or stereotype, name-calling, or ideology: in some cases hate groups will claim that others are literally not human. Dehumanization is one of the basic mechanisms of radicalization and is a necessary one for hate groups to successfully promote their ultimate message: that annihilation of a particular group is justified.

Radicalization and Youth

Young people are especially vulnerable to the mechanisms described above because many are looking for groups or causes that will give them a sense of identity. Identity seeking is a natural part of adolescence but, taken to its extreme, it can provide a toe-hold for hatemongers. “Anomie” is the term for the state of mind in which family, social or cultural values appear worthless. Youth suffering from anomie will seek a group or cause that gives them values, an identity and a surrogate family.[30] A common cause of anomie is when changing social conditions make it seem as though one’s identity is under attack. An example of this is the Gamergate phenomenon, in which young male video game players described their identity as gamers as being under threat from the increasing number of women playing games (whom they characterized as “not real gamers”), the diversification of game genres, and critiques of sexism both within the games industry and within games themselves; as Raph Koster, a longtime game developer, puts it, “That sense of being marginalized by the rest of society, and that sense of triumph when you’re recognized – gamers have had that for quite a while.”[31]

This also explains why economic problems by themselves do not necessarily make a young person more prone to radicalization: one study found that “it was not poor socioeconomic status itself that pointed toward susceptibility, but rather a sense of relative deprivation, coupled with feelings of political and/or social exclusion.”[32]

Hostile Environments

Beyond radicalizing individuals, the connected, networked nature of online communities also enables hate movements to broaden the base of the pyramid, making hate speech – both in jest and in earnest – seem more acceptable. The purpose is not only to create a greater pool of potentially radicalized recruits, but also to create an online environment that is progressively more hostile to anyone targeted by these movements. While the top two tiers are the most visible and draw the most concern,

the bottom strata is just as responsible for the rancor, negativity, and mis-, dis- and mal- information that clog online spaces, causing a great deal of cumulative harm. This bottom strata includes posting snarky jokes about an unfolding news story, tragedy, or controversy; retweeting hoaxes and other misleading narratives ironically, to condemn them, make fun of the people involved, or otherwise assert superiority over those who take the narratives seriously; making ambivalent inside jokes because your friends will know what you mean (and for white people in particular, that your friends will know you’re not a real racist); @mentioning the butts of jokes, critiques, or collective mocking, thus looping the target of the conversation into the discussion; and easiest of all, jumping into conversations mid-thread without knowing what the issues are.

Just as it does in nature, this omnipresent lower stratum in turn supports all the strata above, including the largest, most dangerous animals at the top of the food chain. Directly and indirectly, insects feed the lions.[33]

 


[1] Hinduja, S. and Patchin, J.W. (2011). Cyberbullying Research Summary: Bullying, Cyberbullying and Sexual Orientation. Cyberbullying Research Centre.
[2] Duggan, M. (2017). Online Harassment 2017 (Rep.). Pew Research Center.
[3] Tynes, B. M., Giang, M. T., Williams, D. R., & Thompson, G. N. (2008). Online Racial Discrimination and Psychological Adjustment Among Adolescents. Journal of Adolescent Health, 43(6), 565-569. doi:10.1016/j.jadohealth.2008.08.021
[4] Carson, E. (2017, November 27). This lawsuit could shut down neo-Nazi site The Daily Stormer. Retrieved from https://www.cnet.com/news/taking-trolls-to-court-lawsuit-targets-daily-stormer-internet-nazis/
[5] Online Harassment: Extremists Ramp Up Trolling, Doxxing Efforts. (n.d.). Retrieved from https://www.adl.org/blog/online-harassment-extremists-ramp-up-trolling-doxxing-efforts
[6] Lenhart, A., Ybarra, M., Zickhur, K., & Price-Feeney, M. (2016). Online Harassment, Digital Abuse, and Cyberstalking in America (Rep.). New York, NY: Data & Society. doi:https://www.datasociety.net/pubs/oh/Online_Harassment_2016.pdf
[7] Resnick, B. (2017, March 07). The dark psychology of dehumanization, explained. Retrieved from https://www.vox.com/science-and-health/2017/3/7/14456154/dehumanization-psychology-explained
[8] McCauley, C., & Moskalenko, S. (2008). Mechanisms of Political Radicalization: Pathways Toward Terrorism. Terrorism and Political Violence, 20(3), 415-433. doi:10.1080/09546550802073367
[9] McCauley, C., & Moskalenko, S. (2017). Understanding political radicalization: The two-pyramids model. American Psychologist, 72(3), 205-216. doi:10.1037/amp0000062
[10] Carranco, S., & Milton, J. (2019, April 27). Canada’s new far right: A trove of private chat room messages reveals an extremist subculture. The Globe and Mail. Retrieved from https://www.theglobeandmail.com/canada/article-canadas-new-far-right-a-trove-of-private-chat-room-messages-reveals/
[11] Lewis, B. (2018). Alternative Influence: Broadcasting the Reactionary Right on YouTube (Rep.). Data & Society.
[12] Bergin, A. (2009). Countering Online Radicalisation in Australia. Australian Strategic Policy Institute Forum.
[13] From Memes to Infowars: How 75 Fascist Activists Were “Red-Pilled”. (2018, October 11). Retrieved April 25, 2019, from https://www.bellingcat.com/news/americas/2018/10/11/memes-infowars-75-fascist-activists-red-pilled/
[14] McCauley, C., & Moskalenko, S. (2008). Mechanisms of Political Radicalization: Pathways Toward Terrorism. Terrorism and Political Violence, 20(3), 415-433. doi:10.1080/09546550802073367
[15] McCauley, C., & Moskalenko, S. (2017). Understanding political radicalization: The two-pyramids model. American Psychologist, 72(3), 205-216. doi:10.1037/amp0000062
[16] Glaser, A. (2017, August 30). Nazis and White Supremacists Are No Longer Welcome on the Internet. So They’re Building Their Own. Retrieved from https://slate.com/technology/2017/08/the-alt-right-wants-to-build-its-own-internet.html
[17] Illing, S. (2019, March 17). This filmmaker spent months interviewing neo-Nazis and jihadists. Here’s what she learned. Retrieved from https://www.vox.com/world/2019/1/14/18151799/extremism-white-supremacy-jihadism-deeyah-khan
[18] Marche, S. (2016, April 14). Swallowing the Red Pill: A journey to the heart of modern misogyny. The Guardian. Retrieved from https://www.theguardian.com/technology/2016/apr/14/the-red-pill-reddit-modern-misogyny-manosphere-men
[19] After New Zealand Shooting, Far-right, Racists Claim Victimhood, Hail Killer as Hero. (2019, March 15). Retrieved April 25, 2019, from https://www.splcenter.org/hatewatch/2019/03/15/after-new-zealand-shooting-far-right-racists-claim-victimhood-hail-killer-hero
[20] Lerman, K., Yan, X., & Wu, X. (2016). The “Majority Illusion” in Social Networks. Plos One, 11(2). doi:10.1371/journal.pone.0147617
[21] Anti-Defamation League. (2001). Poisoning the Web - Internet as a Hate Tool. ADL: Fighting Anti-Semitism, Bigotry and Extremism. Retrieved July 20, 2011, from http://www.adl.org/poisoning_web/net_hate_tool.asp
[22] Illing, S. (2019, March 17). This filmmaker spent months interviewing neo-Nazis and jihadists. Here’s what she learned. Retrieved from https://www.vox.com/world/2019/1/14/18151799/extremism-white-supremacy-jihadism-deeyah-khan
[23] Marwick, A., & Lewis, B. (2017, May 18). The Online Radicalization We’re Not Talking About. Retrieved from http://nymag.com/intelligencer/2017/05/the-online-radicalization-were-not-talking-about.html
[24] Deo. (2017, November 23). How White Nationalism Courts Internet Nerd Culture. Retrieved from https://medium.com/@DeoTasDevil/how-white-nationalism-courts-internet-nerd-culture-b4ebad07863d
[25] Klein, A. G. (2018, November 20). Fear, more than hate, feeds online bigotry and real-world violence. Retrieved April 25, 2019, from https://theconversation.com/fear-more-than-hate-feeds-online-bigotry-and-real-world-violence-106988
[26] McCauley, C., & Moskalenko, S. (2008). Mechanisms of Political Radicalization: Pathways Toward Terrorism. Terrorism and Political Violence, 20(3), 415-433.
[27] Resnick, B. (2017, March 07). The dark psychology of dehumanization, explained. Retrieved from https://www.vox.com/science-and-health/2017/3/7/14456154/dehumanization-psychology-explained
[28] Feinberg, A. (2017, December 14). This Is The Daily Stormer’s Playbook. Retrieved from https://www.huffingtonpost.ca/entry/daily-stormer-nazi-style-guide_n_5a2ece19e4b0ce3b344492f2
[29] Meddaugh, P.M. (2009). Hate Speech or “Reasonable Racism?” The Other in Stormfront. Journal of Mass Media Ethics, 24(4), 251-268.
[30] Amon, K. (2010). Grooming for Terror: The Internet and Young People. Psychiatry, Psychology & Law, 17(3), 424-437.
[31] Wingfield, N. (2014, October 15). Feminist critics of video games facing threats in ‘Gamer Gate’ campaign. The New York Times.
[32] Norman, J. M., & Mikhael, D. (2017, August 28). Youth radicalization is on the rise. Here’s what we know about why. Washington Post. Retrieved September 1, 2017, from https://www.washingtonpost.com/news/monkey-cage/wp/2017/08/25/youth-radicalization-is-on-the-rise-heres-what-we-knowabout-why/?utm_term=.39a485789d43
[33] Milner, R. M., & Phillips, W. (2018, November 20). The Internet Doesn’t Need Civility, It Needs Ethics. Retrieved from https://motherboard.vice.com/en_us/article/pa5gxn/the-internet-doesnt-need-civility-it-needs-ethics
