Bigotry, in its various forms, has been with us for a long time – at least since the Greeks coined the word “barbarian” to mean “anyone who isn’t us,” and likely longer – so it’s not surprising that racism, sexism and other prejudices have found a home on the Internet. MediaSmarts’ new report Young Canadians in a Wired World: Encountering Racist and Sexist Content Online looks at how often Canadian youth are exposed to prejudice, how it makes them feel and how they respond to it.
One of the biggest changes between this study and our last one, in 2005, is that social networks and other forms of peer-created content now make up many of the most popular sites among young people (YouTube, the #1 site overall, features a mix of professional and user-generated content).[1] For that reason, rather than asking whether students had visited sites containing hate content, as we did in 2005, we asked whether (and how often) they had encountered sexist or racist content online. (This question was only asked of students in grades 7-11.) The results can be viewed with equal parts optimism and concern: one fifth of students say they have never encountered racist or sexist content online, but four in ten say they see it once a month or once a week – and one in six encounter it daily. Older students are more likely to have seen racist or sexist content online: just one in five Grade 7 students see it once a week or more, compared to more than half of Grade 11 students.[2]
Boys and girls encounter racist and sexist content at roughly the same rates (boys are slightly more likely to see it once a day or once a week, girls somewhat more likely to see it once a month or once a year, and both are equally likely never to have seen it). However, while boys encounter this content at somewhat higher frequencies, girls seem to be more affected by it: substantially more girls (57% compared to 34% of boys) say that racist or sexist jokes hurt their feelings, and many fewer girls (36% compared to 52% of boys) report that they and their friends say racist or sexist things to each other for fun. Boys, in keeping with their attitudes towards cyberbullying,[3] are much more likely to say they and their friends “don’t mean anything by it” when they say racist or sexist things online, and not to speak up against such content because “most of the time, people are just joking around.” Boys are also much more likely than girls to engage in online bullying based on race, religion, ethnicity or sexual orientation, or to harass someone sexually online.[4]
These differences may partly explain our earlier finding that girls are less likely to see the Internet as a safe space than boys,[5] and may also be related to the frequent and often public attacks on women online. Some cases of this may be high profile, such as the attacks on critic Anita Sarkeesian after she launched an online campaign to fund a series of videos looking at sexism in video games,[6] but women who aren’t public figures attract online hostility as well.
Many of the online spaces frequented by youth – particularly multiplayer games – are characterized by highly aggressive and frequently racist, misogynist and homophobic discourse:[7] one study found that playing Halo 3 with a female voice and a female-identifying name led to three times more negative comments than playing with a male voice and a male-identifying name or no voice and a gender-neutral name.[8] There has also been a rise in online hate material specifically targeting women,[9] and, like other forms of hate, its rhetoric can influence the culture of more mainstream spaces.[10] While most online misogyny is not connected to traditional hate groups, it relies on the same “ideologies” of hate, such as othering and dehumanizing the target group and casting the hate group as a victim,[11] and it appeals in a similar way to youth – particularly boys and young men – who feel alienated from society.[12]
When youth do encounter content generated by hate groups, it’s often in the form of “cloaked sites” that disguise themselves as sources of legitimate information or debate. To achieve this, cloaked sites adopt as many of the trappings of legitimacy as possible – using a dot-org Web address, for example, or having an official-sounding name. A 2003 study reports that when students in a first-year university class were asked to critically evaluate the site martinlutherking.org, a cloaked site created by the hate group Stormfront, almost none were able to recognize that it was biased or to identify the point of view of its author;[13] one of the teachers in our study reported a similar experience with Canadian high school students.[14]
Students seem to hold a number of contradictory beliefs about how harmful racist and sexist content is. The largest number (69%) agree that “people say racist and sexist things to pick on other people,” but a significant number (52%) feel that “people say racist and sexist things because they are insensitive, but not because they mean to hurt anyone.” (It is, of course, possible for both of these to be true in different cases.) When asked how they felt people should react to racism or sexism online, 78 percent agree that “it is important to say something so people know it’s wrong” but 45 percent feel that “it is wrong, but it’s not my place to say anything.” This seeming contradiction can be partly explained by age: younger students are somewhat more likely to agree with the first statement (82% of Grade 7 students compared to 77% of Grade 11 students) and much less likely to agree with the second (37% in Grade 7 compared to 50% in Grade 11). This is itself surprising, since one would expect teenagers to feel more entitled to speak out than younger students; one possible explanation is that while younger students spend their time in online environments primarily populated by their peers, older students spend more time in environments which they share with older youth and adults, such as multiplayer games. There was very little gender difference in whether or not students agreed with those statements, but there are significant differences for the statements “I don’t say anything because, most of the time, people are just joking around” (62% of boys compared to 52% of girls) and “When you see something racist or sexist online, it’s important to tell an adult” (42% of boys compared to 52% of girls).[15]
A significant majority (76%) of students have learned how to deal with hateful, racist or sexist content online from some source. Parents and teachers are tied as the top sources (39% each), with friends behind at 21 percent and 17 percent of students learning from online sources. As with our other findings about digital literacy skills, however, there are noteworthy gender differences. Girls are significantly more likely to have learned about it from both parents (43% compared to 34% of boys) and teachers (44% compared to 34% of boys), while boys are more likely to have learned about it from online sources (22%, compared to 13% of girls). Despite these differences, there is no significant gender difference in whether students have learned about this at all.[16] Girls are also more likely than boys to have a rule in the home about sites they are not supposed to visit (51% compared to 44%).[17] This rule is associated with a lower rate of encountering racist or sexist content at the highest frequency (11% of students with this rule encounter it once a day or more, compared to 19% without the rule) but has little or no relationship with the risk of encountering it at lower frequencies. This suggests that there may be sites or services, popular among at least some youth, that contain racist or sexist content and are considered inappropriate by parents (one likely candidate might be M-rated games).[18]
As with our findings on sexting, the high frequency with which youth encounter sexist content online suggests that more must be done to fight gender stereotyping and promote healthy views of gender and sexuality. Young people need to be educated about the harms of all kinds of hate content, whether based on race, religion, gender, sexual orientation or any other grounds; they also must be equipped with the digital and media literacy skills needed to recognize hate content when they encounter it – such as an understanding of the markers of an argument based on hate – and to recognize and decode the persuasive techniques hate groups use to build group solidarity and recruit new believers, such as misinformation,[19] denialism and revisionism,[20] and pseudo-science.[21] Finally, youth need to be empowered to speak out against hate in all its forms, especially when they encounter it in mainstream spaces such as online games or social networks.
Click here to read the full report: mediasmarts.ca/ycww
Young Canadians in a Wired World – Phase III: Encountering Racist and Sexist Content Online was made possible by financial contributions from the Canadian Internet Registration Authority, the Office of the Privacy Commissioner of Canada and The Alberta Teachers’ Association.
[1] Steeves, Valerie. Young Canadians in a Wired World, Phase III: Life Online. MediaSmarts, 2014.
[2] Steeves, Valerie. Young Canadians in a Wired World, Phase III: Encountering Racist and Sexist Content Online. MediaSmarts, 2014.
[3] Steeves, Valerie. Young Canadians in a Wired World, Phase III: Cyberbullying: Dealing With Online Meanness, Cruelty and Threats. MediaSmarts, 2014.
[4] Steeves, Valerie. Young Canadians in a Wired World, Phase III: Encountering Racist and Sexist Content Online. MediaSmarts, 2014.
[5] Steeves, Valerie. Young Canadians in a Wired World, Phase III: Life Online. MediaSmarts, 2014.
[6] Zerbisias, Antonia. "Internet trolls an online nightmare for young women." The Toronto Star, January 18, 2013, sec. Insight.
[7] Gray, K. L. “Deviant Bodies, Stigmatized Identities, and Racist Acts: Examining the Experiences of African-American Gamers in Xbox Live,” New Review of Hypermedia and Multimedia, 2012.
[8] Rose, L. M. “Communication in multiplayer gaming: Examining player responses to gender cues,” New Media & Society, 15:4 (2013), 541-556.
[9] "Misogyny: The Sites." Southern Poverty Law Center Intelligence Report Spring 2012. http://www.splcenter.org/get-informed/intelligence-report/browse-all-issues/2012/spring/misogyny-the-sites
[10] Gerstenfeld, Phyllis B., Diana R. Grant and Chau-Pu Chiang. “Hate Online: A Content Analysis of Extremist Internet Sites,” Analyses of Social Issues and Public Policy, 3:1 (2003), 29-44.
[11] Rowland, Robert C. “The Symbolic DNA of Terrorism,” Communication Monographs, 75:1 (2008), 52-85.
[12] Blazak, Randy. “From White Boys to Terrorist Men: Target Recruitment of Nazi Skinheads,” American Behavioral Scientist 44:6 (2001), 982-1000.
[13] Gerstenfeld, Phyllis B., Diana R. Grant and Chau-Pu Chiang. “Hate Online: A Content Analysis of Extremist Internet Sites,” Analyses of Social Issues and Public Policy, 3:1 (2003), 29-44.
[14] Steeves, Valerie. Young Canadians in a Wired World, Phase III: Teachers’ Perspectives. MediaSmarts, 2012.
[15] Steeves, Valerie. Young Canadians in a Wired World, Phase III: Encountering Racist and Sexist Content Online. MediaSmarts, 2014.
[16] Steeves, Valerie. Young Canadians in a Wired World, Phase III: Encountering Racist and Sexist Content Online. MediaSmarts, 2014.
[17] Steeves, Valerie. Young Canadians in a Wired World, Phase III: Life Online. MediaSmarts, 2014.
[18] Steeves, Valerie. Young Canadians in a Wired World, Phase III: Encountering Racist and Sexist Content Online. MediaSmarts, 2014.
[19] Meddaugh, P. M. “Hate Speech or ‘Reasonable Racism?’ The Other in Stormfront,” Journal of Mass Media Ethics, 24:4 (2009), 251-268.
[20] McNamee, L., Pena, J., & Peterson, B. “A Call to Educate, Participate, Invoke and Indict: Understanding the Communication of Online Hate Groups,” Communication Monographs, 77:2 (2010), 257-280.
[21] Meddaugh, P. M. “Hate Speech or ‘Reasonable Racism?’ The Other in Stormfront,” Journal of Mass Media Ethics, 24:4 (2009), 251-268.