As was noted in the introduction, the Simon Wiesenthal Center's Digital Hate and Terrorism project has identified over 14,000 problematic websites, forums, blogs, and social media postings. 
Social networking platforms are particularly problematic. Some hate groups have developed their own versions of these sites, such as New Saxon, "a Social Networking site for people of European descent" produced by the American Neo-Nazi group National Socialist Movement. Extremist groups represented on Facebook include Stormfront, National Socialist Life, Libertarian National Social Movement, Aryan Guard, FARC, Al Shabab Mujahideen, Hamas, Hezbollah, Faloja Forum, Support Taliban and scores of anti-Israel sites. The Simon Wiesenthal Center actively works with Facebook to remove hate pages, but overall – and despite best efforts – the spread of online hate has so far outpaced efforts to remove it.
Cultures of Hatred
Youth don't have to encounter overt hate speech to be exposed to hate online. Much more common are what could be called "cultures of hatred": communities in which racism, misogyny and other prejudices are normalized. Through the mechanism of radicalization in like-minded groups, those who join these communities are likely to find their own views and opinions influenced by the values of the community.
A good example of this is the culture of online gaming. In 2010, GAMBIT, the MIT game lab, conducted an experiment in which players adopted screen names that identified them as being members of minority groups, such as "Proud_2B_Muslim" and "GayPride90," while playing the game Halo: Reach on Xbox Live. In each case, although the players behaved no differently than they otherwise would, they received significant amounts of abusive language and, in some cases, were even killed within the game by their own team members.
Similarly, many online environments – especially those popular with teenage boys – have fairly high "baseline" levels of racism, sexism and homophobia. Sites which host user-created content, such as eBaum's World, Newgrounds and 4chan, make a virtue out of being offensive. This free-floating hate can sometimes coalesce into more targeted and organized forms. For example, in the multiplayer online game World of Warcraft, anger among players about "gold farmers" – professional players, largely based in Asia, who play the game only to be able to sell the characters or items they acquire to other players – developed into a widespread animosity towards Asians in general and Chinese in particular. The result was the production of films within the game that mocked Chinese players by drawing on classic Asian stereotypes such as the "China doll" character and the idea that Chinese people kidnap pet cats and dogs to eat (though there are no pets within the game), as well as organized attacks within the game on characters perceived to be "Chinese."
This demonstrates one of the ways in which online hate can do harm: by turning the Internet into an unfriendly environment for members of targeted groups. Hatemongers will often "troll" mainstream sites, making racist or otherwise hateful comments to get a rise from some people and/or elicit sympathetic comments from others.  In fact, a 2004 study found that one in five participants in moderated chat rooms were exposed to negative remarks about a racial or ethnic group, while nearly two-thirds of those who participated in unmoderated chat rooms were.  Today, of course, nearly all online communication is unmoderated.
Exposure to online bigotry and hate can have much more serious effects than simply making people feel uncomfortable or unwelcome. Studies have shown that experiencing discrimination online can cause stress, anxiety and depression. 
Forms of Online Hate
Websites: Like all Web content, online hate comes in a variety of forms. The most public face consists of websites and blogs maintained by hate groups: in fact, hate groups were some of the earliest adopters of the Internet, moving their existing materials wholesale first onto discussion groups and then to Web pages. While Web publishing is much cheaper and easier than print publishing, the costs and technical demands of creating a website meant that in the early days of the Internet only the larger hate groups had an online presence. Today, with the easy availability of blogging platforms and simple Web publishing software, hate sites have proliferated. While most of these sites are simple screeds, the more sophisticated ones mimic popular commercial websites in a variety of ways, with many offering audiovisual material and discussion forums and some featuring professional-looking layout and graphics. A small number of websites are designed specifically to appeal to youth and children, such as the Al-Fateh ("The Conqueror") site sponsored by Hamas with content that promotes suicide terrorism.
Music and Videos: Many teenagers turn to musical genres and subcultures to help define their identity, and hate music producers take advantage of this. While hate rock is no longer the cash cow it once was for the larger hate groups, which have fallen victim to the same woes as the mainstream music industry and their own internal troubles, it remains widespread not just on hate sites but on file-sharing sites and mainstream services such as YouTube and iTunes. Hate groups also use the Web's video-sharing services to connect youth seeking guidance, support and validation to hate leaders such as David Duke and Al Qaeda's Anwar Al-Awlaki.
Games: Hate groups make a similar effort to appeal to youth through video games, though the games tend to be more rudimentary and less effective than music: it's easier to create garage-band quality punk or heavy metal music than to make a quality video game. Most are simple Flash games with little replay value, such as the anti-Muslim game Minaret Attack, though a few, like the Hezbollah-made Special Force, aspire to the production values of commercial games. It's hard to imagine youth spending much time playing any of these games, but their sheer outrageousness may provide a guilty pleasure.
Perhaps more significant is the way in which hate groups have drawn on games to encourage users to participate in their sites and forums. This, too, is reflective of a broader trend: 'gamification' has become one of the most common business buzzwords online. Like players of World of Warcraft or other online games, participants on sites such as Stormfront or Salafi Media gain increasing "levels" as they participate: posting, commenting and taking on responsibilities such as moderating others' comments can earn users higher status and special titles or privileges.
Social Networking: Online social networks encourage group interaction and strengthen connections between group members. Sites such as Facebook and Twitter are used to recruit new members through fan pages, group feeds and hate-oriented groups. Some hate groups have created their own social networking sites, but these are almost irrelevant: it's much more effective to reach youth through the commercial sites they're already using. Although Facebook has improved how it handles complaints about material that advocates violence, less obvious hate groups continue to use it as a platform.
The greatest advantage of social media is not that it allows hate groups to reach youth, but that it allows youth to disseminate hate material themselves. The ability of social media to help youth interested in hate find friends and mentors is the key to developing the sense of group identity that's so important in the radicalization process.
Accidental Contact: Because of the networked nature of the Internet, it's also possible for youth to encounter hate material entirely by accident. In fact, in MediaSmarts' 2001 study Young Canadians in a Wired World, seven per cent of students surveyed reported having encountered hate material by accident, compared to five per cent who said they had found it on purpose.
Accidental contact may happen through search engine results – cloaked sites use a variety of means to make sure they turn up in searches on innocuous topics – or through hate content being housed in general-interest sites. The game Border Patrol, in which players shoot Mexicans attempting to cross the U.S. border and which appears to have originated on the White Aryan Resistance website, can also be found on eBaum's World, a site that hosts Flash games and videos and is one of the most popular sites among teenage boys. Some groups have also been successful in hacking other sites to make them spread their message – as when an anonymous group hacked the Jerusalem Online site to display anti-Semitic comments.
An assessment of online messages put out by hate groups suggests that a variety of techniques work together to reinforce hate group identity, reduce external threats and recruit new members.  It's important to remember that not all of these techniques are meant to recruit people, but rather to move them along the different stages of radicalization: some are intended to build general support, some to turn casual supporters into active members and some to push members towards taking concrete action.
Education – One technique many hate sites use to build general support is to present their website content as being educational. This is true not only of sites which masquerade as mainstream sources of information but of more overt hate sites as well, which are positioned as providing visitors with "the real story." Many overt hate sites also attack the mainstream educational system and call on supporters to educate their friends, families and communities about the "real" truth.
Mainstreaming – Despite their contempt for the education system and mainstream media, hate sites adopt many of their markers of credibility – quoting from old editions of the Encyclopedia Britannica, for example, or selectively citing articles from reputable sources such as the Wall Street Journal. In one study, one-third of hate sites denied being racist or a hate group – though often at the same time as they used more overt hate language. For example, the home page for Radio Islam says "No hate. No violence. Races? Only one human race" at the top left of the page, while simultaneously saying "Know Your Enemy! No time to waste. Act now!" at the top right.
Denialism – The best-known tactic for misinformation is denialism, most often in the form of claims that the Holocaust either did not happen or has been exaggerated. These sites typically rely on elaborate technical arguments about gas absorption or railcar fumigation: if readers lack the technical background to challenge the writer's points, they are less likely to challenge the broader conclusions drawn from them.
Pseudo-Science – Using a flood of supposed "facts" and statistics is the hallmark of another hate group technique: misinformation through pseudo-science. Many sites draw on outdated scientific theories such as eugenics and phrenology to substantiate their arguments.
Building Group Solidarity – Many of the techniques used by hate groups are intended to build group solidarity. Calls to protect the group, and in particular the most vulnerable within the group, are useful both for building support and for radicalizing supporters.
Hero Narrative – As well as building group solidarity, hate groups try to bolster members' self-esteem by giving them opportunities to think of themselves as heroes in defense of their group. As well as encouraging supporters to become active members, this hero-villain narrative provides a positive identity and a structure that group members can use to give their lives meaning.
Nationalism – Similarly, hate groups take advantage of existing group loyalties by portraying themselves as defenders of their nation, and by positioning groups they consider to be Other as the enemy. Following the September 11 attacks, for instance, many far-right groups not only turned their attention to Muslims but also claimed that Israel had somehow been involved.
Religion – An appeal to religion can be an even stronger tool for building solidarity. Nearly all hate groups do this in some way, whether to provide authority to back up their claims, or to deny legitimacy of the Other.
Scare Tactics – Creating a sense of urgency around a threat is essential for hate groups to radicalize members. They are also very effective at using current or controversial issues to transform fear and worry into hate: for example, hate groups in the United States have used the election of Barack Obama – and the false "controversy" over his place of birth – to build support, while anxieties about immigration in the UK have led to the creation of an entirely new hate group, the English Defence League.
Hate Symbols – The hate movement continues to use well-known symbols such as the Nazi swastika and the KKK's burning cross to 'brand' its message. However, hate groups also co-opt mainstream symbols such as the Celtic cross and pagan runes, re-signifying them as emblems of white supremacy. The Anti-Defamation League argues that hate symbols are more than mere signs: "These symbols are meant to inspire a sense of fear and insecurity. [They] give haters a sense of power and belonging, and a quick way of identifying with others who share their ideology". 
Othering – Finally, hate groups portray the Other in ways that emphasize difference – making them seem strange, even inhuman. This is often done through caricature or stereotype, name-calling, or ideology: in some cases hate groups will claim that others are literally not human, such as the use of the term "mud people" to describe Blacks and the claim that Jews are descendants of Satan. Dehumanization is one of the basic mechanisms of radicalization and is a necessary one for hate groups to successfully promote their ultimate message: that annihilation of a particular group is justified.
Cloaked Websites and Implicit Messages
Various writers distinguish between overt hate websites and disguised, or cloaked, hate sites.  Overt hate sites actively promote hatred towards other groups while showcasing racist propaganda or offering hate-based communities online. In contrast, cloaked hate websites intentionally perpetuate hatred through more implicit and deceptive messages, masquerading as legitimate sources of opinion or information. Young people are vulnerable to this type of content as research shows many lack the ability to critically evaluate cloaked hate sites. 
At first glance, cloaked websites appear to be authoritative and professional – based upon core values of 'truth and free speech.' Many attempt to conceal a racist agenda behind a more moderate message. For example, Melissa Guille of the Canadian Heritage Alliance (CHA) denies CHA is a hate site, arguing instead that the site is concerned about "keeping Canada for Canadians" and "removing the anti-white sentiment in society".
A related phenomenon is what has been called 'reasonable racism'. These are hate sites which present their content as political provocation or debate, relying on pseudo-science and twisted logic rather than outright expressions of hate. The classic example of this is the Stormfront website which, in comparison to other white supremacist hate sites, uses discourse that is less overtly injurious and hateful in order to appear more innocuous to naïve readers. Studies suggest that while stereotypical images and messages on hate sites – for example, burning swastikas – produce a more immediate influence upon viewers, the effects of these images are not lasting, except among those who are already receptive to extremist or hate messages. Hate messages that are more implicit, however, have more long-term persuasive power.
The phenomena of cloaked hate sites and 'reasonable racism' mean that it may often be difficult for youth to recognize hate sites when they encounter them. What distinguishes genuine hate groups from sites engaged in legitimate political debate is the presence of a particular worldview and beliefs that might be termed an ideology of hate: a founding myth or story that the group perpetuates to define its identity, to establish a dehumanized Other who threatens that identity, and to justify and encourage violence in defense of that identity.
Ideologies of hate share a number of characteristics:
 Simon Wiesenthal Center. (2011). Digital Hate and Terrorism Project.
 Tynes et al. (2008). Online Racial Discrimination and Psychological Adjustment Among Adolescents. Journal of Adolescent Health, 43(6), 565.
 Tynes et al. (2004). Adolescence, race, and ethnicity on the Internet: A comparison of discourse in monitored vs. unmonitored chat rooms. Applied Developmental Psychology, 25, 667-684.
 Tynes et al. (2008)
 Kim, T.K. (2006). White Noise. Intelligence Report, 121.
 McNamee, L., Pena, J., & Peterson, B. (2010). A Call to Educate, Participate, Invoke and Indict: Understanding the Communication of Online Hate Groups. Communication Monographs, 77(2), 257-280.
 Chiang, C., Gerstenfeld, P., & Grant, D. (2003). Hate Online: A Content Analysis of Extremist Internet Sites. Analyses of Social Issues and Public Policy, 3(1), 29-44.
 Anti-Defamation League. (2001). Poisoning the Web: Internet as a Hate Tool. ADL: Fighting Anti-Semitism, Bigotry and Extremism. Retrieved July 20, 2011, from http://www.adl.org/poisoning_web/net_hate_tool.asp
 Daniels, J. (2008). Race, Civil Rights, and Hate Speech in the Digital Era. In Learning Race and Ethnicity: Youth and Digital Media (pp. 129-154). Cambridge, MA: MIT Press; McNamee, Pena & Peterson (2010)
 Daniels (2008)
 Guille, M. (2006). Canadian Heritage Alliance Video. Online Video Guide. Retrieved July 15, 2011.
 Meddaugh, P. M. (2009). Hate Speech or "Reasonable Racism?" The Other in Stormfront. Journal of Mass Media Ethics, 24(4), 251-268.
 Amon, K. (2010). Grooming for Terror: The Internet and Young People. Psychiatry, Psychology & Law, 17(3), 424-437; Chhabra, C. (2010). Mining YouTube to Discover Extremist Videos, Users and Hidden Communities. Information Retrieval Technology, 6458, 13-24; Lee, E., & Leets, L. (2002). Persuasive Storytelling by Hate Groups Online: Examining Its Effects on Adolescents. American Behavioral Scientist, 45(6), 927-957.
 Meddaugh (2009)
 Glaser, J. (2002). Studying Hate Crime With the Internet: What Makes Racists Advocate Racial Violence? Journal of Social Issues, 58(1), 177-193.