In its early days, the internet was often spoken of as a free marketplace of ideas, where everyone’s views and thoughts could be shared and compete on an equal footing. Today it’s an essential tool for accessing information and services, but its value as a vehicle of civic engagement and debate has in many ways declined.
In today’s digital media, fully networked users aren’t just consumers of information but are in the middle of an endless web of connections, allowing content to be shared with any number of people on a variety of platforms. Most of that sharing takes place on a handful of large platforms: social networking sites such as Facebook or Twitter, video-sharing sites such as YouTube, or multiplayer online games such as Minecraft or Fortnite.
In any community, online or offline, social norms – what’s seen as acceptable and unacceptable – are largely set by the most committed 10 percent of members. This loudest 10 percent often holds the strongest opinions and beliefs, which can range from prejudice to outright hate. As a result, if community members do not actively push back, hatred and prejudice can come to be seen as the norm for that community, creating a vicious cycle that makes it harder and harder to speak out against them.
Many online environments – especially those popular with adolescent boys – are already known to have fairly high levels of racism, sexism and homophobia. These spaces can become unfriendly, and sometimes unsafe, for members of targeted groups. Hatemongers also often ‘troll’ mainstream sites, posting hateful comments to provoke a reaction from some users or to elicit sympathetic responses from others.
Exposure to online prejudice and hate can have much more serious effects than simply making people feel uncomfortable or unwelcome: studies have shown that experiencing discrimination online can cause stress, anxiety and depression.
MediaSmarts’ Young Canadians in a Wired World (YCWW) research shows that while more than three-quarters of Canadian youth feel it is important to speak up when they encounter hateful content online, nearly half choose not to because they feel “it’s not my place to say anything.”
MediaSmarts’ latest report, Young Canadians Pushing Back Against Hate Online, builds on our past research by asking more than 1,000 12- to 16-year-olds where they encounter hate, how they decide whether to respond to it, and what might make them more likely to respond.
In this research we chose to focus on what we’ve termed casual prejudice, which we’ve defined as when people say or do things that are negative towards a particular group but are not aimed at a specific person (for instance, the joking use of a homophobic slur when someone makes a mistake in a video game, or sharing a meme that makes fun of a person with a disability). While it may seem less overtly harmful than targeted hate, the sheer frequency of casual prejudice sets a powerful norm, both enabling more extreme actions and making it harder to speak out against them.
This study confirmed that most young people have witnessed casual prejudice, and for most of them it hurts their feelings – even if they’re not a part of the group the prejudice is aimed at. Eight in ten youth feel it’s important to do or say something about casual prejudice, but many of them face barriers in doing so: half don’t feel they know what to do or are worried about making things worse, and just under half say they are reluctant to do anything because they’ve seen people they know join in or encourage it.
Though most young people agree that casual prejudice is harmful, the fact that the harm doesn’t always seem personal can make them less likely to take action: almost three-quarters said that they would be more likely to do something if someone they knew said that what had happened hurt their feelings.
This research has clear implications for how online platforms themselves can address hate and prejudice: two-thirds of young people said that they would be more likely to do something if the website or app they were using had clear rules about what wasn’t acceptable. This shows that the vicious cycle described above, in which normalized hate and prejudice make it harder for youth to push back, can be turned into a virtuous one: two-thirds of young people, for example, said they would be more likely to do something if they knew the website or app had punished users for unacceptable behaviour. Ease of reporting matters as well: seven in ten youth said they’d be more likely to do something if the technical tools for reporting were easier to use. (See our List of Recommendations for Platforms for more detail on how online platforms can empower youth to push back against hate.)
While platforms have an important role to play in setting social norms, young people themselves – and the adults in their lives – can have just as much impact. Two-thirds of youth said they would be more likely to do something about casual prejudice if they thought most of their friends agreed with them; surprisingly, though, almost as many said they would be influenced by whether they thought most of the platform’s users – not just their friends – felt the same way they did. This shows that it makes a difference every time anybody pushes back against hate and prejudice online, because doing so changes other users’ sense of what the community as a whole believes.
As this research shows, whether or not we choose to do something about hate and prejudice online depends a great deal on what we believe the community thinks of it. Not surprisingly, then, the response has to be a community one.
Parents need to be ready to talk to their kids about this issue (our tip sheet Talking to Kids About Hate Online can help you know how to give support when they need it); online platforms such as games and social networks can be more clear about their guidelines and provide users with easier ways to take action when they witness hate; teachers can use resources such as MediaSmarts’ Facing Online Hate materials to prepare young people to recognize and respond to hate content online; and youth themselves can feel reassured that even if it may not always seem like it, the majority of their peers agree with them when they stand up to hate.
Young Canadians Pushing Back Against Hate Online was funded by Public Safety Canada’s Community Resilience Fund.
Xie, J., Sreenivasan, S., Korniss, G., Zhang, W., Lim, C., & Szymanski, B. K. (2011). “Social consensus through the influence of committed minorities.” Physical Review E, 84(1).
Lerman, K., Yan, X., & Wu, X. (2016). “The ‘Majority Illusion’ in Social Networks.” PLoS ONE, 11(2). doi:10.1371/journal.pone.0147617
Dickter, C. L., & Newton, V. A. (2013). “To Confront or Not to Confront: Non-Targets’ Evaluations of and Responses to Racist Comments.” Journal of Applied Social Psychology, 43: 262-275.
Tynes, B., Giang, M., Williams, D., & Thompson, G. (2008). “Online Racial Discrimination and Psychological Adjustment Among Adolescents.” Journal of Adolescent Health, 43(6): 565.
Steeves, V. (2014). “Young Canadians in a Wired World: Encountering Racist and Sexist Content Online.” MediaSmarts, Ottawa. 1-25.