Pyramid Power: How Hate Spreads Online

Matthew Johnson

The last few weeks have shed an unprecedented light on the use of digital media to spread and inspire hatred. Michael Zehaf-Bibeau, the perpetrator of the attacks on Canada’s National War Memorial and Parliament buildings, appears to have been motivated in part by exposure to online postings by a self-described member of the Islamic State,[1] and the federal government has already stated that it intends to create tools to remove online content that promotes the “proliferation of terrorism.”[2]

At almost exactly the same time, the Pew Research Center released a study revealing that young women (ages 18-24) were the group most at risk of being stalked, sexually harassed, or targeted for sustained harassment online, and that online games were the digital environment most hostile towards women. None of these findings were likely surprising to anyone who had been following the so-called “gamergate” phenomenon, a campaign which began as a Twitter hashtag used by a game developer’s ex-boyfriend to make false allegations against her and expanded into harassment of women in, and connected to, the video game industry under the guise of promoting “ethics in game journalism.”[3] At the same time, the conflict has served “to reveal what many in the gaming industry say is a bias against women in gaming, evidenced not only by death and other malicious threats made against female game developers and female game players, but also by the male-heavy themes in many of today’s commercial games.”[4]

Real-world violence may seem to be in a completely different realm from behaviour that occurs only online. It’s important to understand, though, that radicalization can take place at several different levels, and that hate groups and movements are often simultaneously working on multiple levels of a pyramid of radicalization: first, drawing a large number of sympathizers from the general public with material that is carefully crafted to seem reasonable and not alienate those who aren’t already sympathetic to the cause; then, using a variety of mechanisms of radicalization to promote a group identity among those sympathizers to the point where they identify themselves as members of the movement, even if there is no organized group; and, finally, inspiring those members to become activists who are willing to take more extreme action – both online and off. Because materials aimed at the top of the pyramid are the most obvious and extreme, the attention that’s paid to them may make us less able to see what’s reaching the lower levels, especially since much of that has been created to fly under the radar. Since much of the “gamergate” discussion has taken place on or through Twitter, though, it’s shed light on the more subtle radicalization tools used by hate movements – in this case, those promoting misogyny online.

In the years since MediaSmarts released its Facing Online Hate suite of resources, online misogyny has become one of the highest-profile forms of online hate[5] – and has moved disturbingly far into the mainstream of online discourse. While most online misogyny is not connected to traditional hate groups, it relies on the same “ideologies of hate,” such as dehumanizing the target group and casting the hate group as a victim,[6] and appeals in a similar way to youth – particularly boys and young men – who feel alienated from society.[7] Radical misogynists are also taking their cues from more established hate movements, with appeals not just to the top of the pyramid but to the general public at the bottom.

One example is the use of “cloaked” hate sites, which masquerade as sources of legitimate information or debate.[8] At first glance, cloaked websites appear to be authoritative and professional while concealing a racist agenda behind a more moderate message. Young people are particularly vulnerable to this type of content, as research shows many lack the ability to critically evaluate cloaked hate sites:[9] even undergraduates who were shown the martinlutherking.org site – actually run by the White supremacist group Stormfront – were unable to recognize it as a biased source.[10] The misogynist equivalent is a cloaked White Ribbon site which, like the Martin Luther King site, uses a .org address to borrow credibility – in this case from the genuine White Ribbon campaign and its partners in the UK, Australia and New Zealand. Unlike the real White Ribbon campaign, though, which aims to help men and boys escape the toxic forms of masculinity that lead to violence against women (here’s an example of the great work White Ribbon is doing in Canada), the cloaked site promotes the idea that men and women suffer equally from domestic violence. The cloaked site is relatively crude – though it’s slicker than the primitive Martin Luther King site, it’s a far cry from professional-looking cloaked sites such as those of the National Policy Institute and the Committee for Open Debate on the Holocaust – and it takes only a little investigation to uncover its lack of credibility: a TinEye image search, for instance, reveals that its photos of battered men are actually stock photos, and the “Contribute Today” section – which asks for donations not to support any programs but for a legal fund – identifies it as being owned by A Voice for Men, a well-known “men’s rights” misogynist site.

But how is a student who comes across the site going to know that they need to question its credibility? A student who finds the site while doing research for school will probably at least try to do this – almost nine out of ten students say they make an effort to verify information they’re seeking for school[11] – but they are substantially less likely to do so if they are seeking information on relationships out of their own personal interest, as one in four Grade 11 students do.[12] If they are led to the site by a link sent via social networks such as Twitter or Facebook, they are even less likely to try to verify the source: just over half of students try to authenticate information received this way.[13] What students need is not just the tools to question a site like this, or the misogynist arguments they may encounter on social networks, but the ability to recognize that such sources need to be questioned at all. MediaSmarts’ lesson Hate or Debate and our Responding to Online Hate guide help youth to recognize the signs that an argument is coming from a position of hate, such as “Othering,” where a group is presented as being completely different from the author’s group, sometimes even portrayed as inhuman;[14] appeals to victimhood, in which hate groups portray themselves, and the group they claim to represent, as victims of the Other (for example, White supremacist websites will claim that Whites lose jobs or school placements, and even their own heritage, as a result of “reverse racism”[15]); and the idea that the group has fallen from its once-glorious past. This fall is also blamed on the Other, and it is only by defeating and destroying the Other that the glorious past can be regained. Additionally, youth can be taught to recognize radicalization techniques such as scare tactics that make it seem as though their identity is under threat.

Looking at the fake White Ribbon site, for instance, we see frequent assertions that men are the real victims not just of domestic violence but of a legal system supposedly biased against them (the site claims the U.S. Violence Against Women Act deprives men of “their civil rights, due process rights, and their right to cross-examine those who falsely accuse them”). Women are othered by being divided into two categories: feminists and “good” women who support the movement. This othering is also frequently found in the Gamergate discourse and other forms of online misogyny, where all feminists are radicals (“radfems”) and women are described in gamespeak as “NPCs” – “non-player characters.” Participants describe their identity as gamers as being under threat from the increasing number of women playing games (who are characterized as “not real gamers”), the diversification of game genres, and the critiques of sexism both within the games industry and the games themselves. As Raph Koster, a longtime game developer, puts it, “That sense of being marginalized by the rest of society, and that sense of triumph when you’re recognized – gamers have had that for quite a while.”[16] This feeling of being marginalized may make people who identify themselves as gamers particularly vulnerable to radicalization efforts by hate movements, which have traditionally targeted vulnerable young people seeking groups or causes that will give them a sense of identity[17] and then tailored their messaging to appeal to, or actively recruit, them.[18] Because of the networked nature of the Internet, there’s no longer any need to have radicalized youth become members of a hate group, or even for organized groups at all: instead, an increasing number are working to create “lone wolves”[19] like Michael Zehaf-Bibeau who can’t be directly tied to the people who radicalized them.

“Doxxing,” the public release of a person’s home address and other contact information, is a radicalization tool aimed at the very top of the pyramid. It is intended to encourage those “lone wolves” who are already committed to the movement – or looking for a cause that will give them a purpose and direction for their anger – to take action offline against people who have been identified as enemies. By posting a person’s offline contact information publicly, doxxing serves as a call to subject targets to threats, harassment, “swatting” (calling police to the address with a false report) and even violence. Doxxing has been used by “gamergate” supporters not only to target specific women but to silence others for fear of the consequences of speaking out: the actress Felicia Day was doxxed less than an hour after publishing a blog post in which she said she had not previously spoken out about gamergate due to her fear of being doxxed.

MediaSmarts’ own research has shown that girls already see the Internet as a hostile place: girls are substantially less likely than boys to see the Internet as a safe space (though they are just as likely to feel they can keep themselves safe),[20] and one reason for this is surely the frequent and often public attacks on women online. Over a third of Canadian students in grades 7 to 11 encounter sexist or racist content online at least once a week, and girls are much more likely than boys to feel hurt when a racist or sexist joke is made at their expense (57% of girls compared to 34% of boys) and to want to learn about dealing with online racism and sexism in school.[21] Many online spaces – particularly multiplayer games – are characterized by highly aggressive and frequently racist, misogynist and homophobic discourse:[22] one study found that playing Halo 3 with a female voice and a female-identifying name led to three times more negative comments than playing with a male voice and male-identifying name or no voice and a gender-neutral name.[23] Boys, meanwhile, are much more likely than girls to say they and their friends “don’t mean anything by it” when they say racist or sexist things online and to not speak up against such content because “most of the time, people are just joking around.”[24]

If we want to ensure that the Internet is a safe space for all youth, each of us has a role to play. Parents and teachers need to communicate values of tolerance and respect – MediaSmarts’ research found that students with rules in the home about treating people with respect were substantially less likely to be mean or cruel to others online[25] – and so do the operators of the sites and platforms that young people use: online spaces that have taken steps to create cultures where racism and sexism aren’t tolerated have been successful in making harassment less common.[26] But for this to happen, their users – especially the young people who make up the most valuable demographic – have to demand it. It’s important that we prepare young people to recognize and confront hate material online and to help them understand that as members of a community – online or offline – they have a responsibility to make that community a safe place for everyone. They also need to know that they can make a difference by speaking out against hateful speech – even (especially) by their friends – by debunking hate content when it’s spread, by reporting hate content to ISPs and social network platforms and, in some cases, by notifying police (our Responding to Online Hate guide has more information on how to respond to hate content online, including a list of police services in Canada that have dedicated anti-hate units). Research has shown that members of a group are much less likely to conform to the group’s attitudes if even one person expresses a different opinion,[27] and as sociologist Randy Blazak puts it, even in the case of people radicalized to the top of the pyramid, “the illogic of racism and bigotry can be illuminated.”[28] Digital technology can give people an unprecedented power to do harm: a single poster’s “doxxing” of a target can expose that person to harassment or violence by “lone wolves.” But it also gives all of us, including young people, an unprecedented power to do good.

[1] Quan, Douglas. “RCMP says it has video evidence Ottawa shooter had political, ideological motives.” Postmedia News, October 26 2014.
[2] Chase, Steven and Josh Wingrove. “Terror fight turns to Internet, sparking new free-speech debate.” The Globe and Mail, October 30 2014.
[3] Wingfield, Nick. “Feminist critics of video games facing threats in ‘Gamer Gate’ Campaign.” The New York Times, October 15 2014.
[4] Stansbury, Meris. “#Gamergate – and what it means for gaming in education.” eSchoolNews, November 20 2014.
[5] “Misogyny: The Sites.” Southern Poverty Law Center Intelligence Report Spring 2012.
[6] Rowland, Robert C. “The Symbolic DNA of Terrorism,” Communication Monographs, 75:1 (2008), 52 – 85
[7] Blazak, Randy. “From White Boys to Terrorist Men: Target Recruitment of Nazi Skinheads,” American Behavioral Scientist 44:6 (2001), 982-1000.
[8] Daniels, J. (2008). Race, Civil Rights, and Hate Speech in the Digital Era. Learning Race and Ethnicity: Youth and Digital Media (pp. 129-154). Cambridge, MA: MIT Press; McNamee, Pena & Peterson (2010).
[9] Daniels, J. (2008)
[10] Gerstenfeld, Phyllis B., Diana R. Grant and Chau-Pu Chiang. “Hate Online: A Content Analysis of Extremist Internet Sites,” Analyses of Social Issues and Public Policy, 3:1 (2003), 29-44.
[11] Steeves, Valerie. Young Canadians in a Wired World, Phase III: Experts or Amateurs? Gauging Young Canadians’ Digital Literacy Skills. MediaSmarts, 2014.
[12] Steeves, Valerie. Young Canadians in a Wired World, Phase III: Life Online. MediaSmarts, 2014.
[13] Steeves, Valerie. Young Canadians in a Wired World, Phase III: Experts or Amateurs? Gauging Young Canadians’ Digital Literacy Skills. MediaSmarts, 2014.
[14] Meddaugh, Priscilla Marie and Jack Kay. “Hate Speech or ‘Reasonable Racism?’ The Other in Stormfront.” Journal of Mass Media Ethics 24:4 (2009), 251-268.
[15] Daniels, J. (2008)
[16] Wingfield, Nick. “Feminist critics of video games facing threats in ‘Gamer Gate’ Campaign.” The New York Times, October 15 2014.
[17] Amon, K. “Grooming for Terror: the Internet and Young People”. Psychiatry, Psychology & Law 17:3 (2010), 424-437.
[18] Blazak, Randy. “From White Boys to Terrorist Men: Target Recruitment of Nazi Skinheads.” American Behavioral Scientist 44:6 (2001), 982-1000.
[19] Simon Wiesenthal Center. Digital Terrorism and Hate 2011.
[20] Steeves, Valerie. Young Canadians in a Wired World, Phase III: Cyberbullying: Dealing With Online Meanness, Cruelty and Threats. MediaSmarts, 2014.
[21] Steeves, Valerie. Young Canadians in a Wired World, Phase III: Encountering Racist and Sexist Content Online. MediaSmarts, 2014.
[22] Gray, K. L. Deviant Bodies, Stigmatized Identities, and Racist Acts: Examining the Experiences of African-American Gamers in Xbox Live. New Review of Hypermedia and Multimedia, 2012.
[23] Rose, L. M. “Communication in multiplayer gaming: Examining player responses to gender cues.” New Media & Society: 541-556.
[24] Steeves, Valerie. Young Canadians in a Wired World, Phase III: Encountering Racist and Sexist Content Online. MediaSmarts, 2014.
[25] Steeves, Valerie. Young Canadians in a Wired World, Phase III: Cyberbullying: Dealing With Online Meanness, Cruelty and Threats. MediaSmarts, 2014.
[26] Hudson, Laura. “Curbing Online Abuse Isn’t Impossible. Here’s Where We Start.” Wired, May 15 2014.
[27] Dean, Jeremy. Conformity: Ten Timeless Influencers. PsyBlog, February 25 2010.
[28] Blazak, Randy. “From White Boys to Terrorist Men: Target Recruitment of Nazi Skinheads,” American Behavioral Scientist 44:6 (2001), 982-1000.