
What is the real-world impact of online hate speech on young people?

As hate speech and trolling become more common online, our experts provide advice on how parents can play a role in supporting young people on this issue.


Young people are more connected than ever before. While this can be a huge benefit in linking them with friends, communities, loved ones and knowledge, it also exposes them to an almost constant stream of information which they may not have the critical skills to filter and navigate.

Many young people have a clear digital identity which very often reflects the core of who they are. For example, they may not be ‘out’ as LGBTQ+ offline but are in their online life. If this is attacked, it hits very hard at a unique part of themselves that they should rightfully be proud of. To be exposed to any form of hate speech that attacks their community or identity is painful and can sadly lead to some not wanting to ‘reveal’ that part of themselves. If they witness a wider group of people they relate to being attacked, is it any wonder they can begin to feel negatively towards that characteristic?

This directly affects their self-esteem and self-worth and has very real impacts on their mental wellbeing, with many reporting high rates of anxiety and depression. At Ditch the Label, we work hard to empower young people to celebrate who they are and to understand that the issue lies solely with the person directing hate at them – they should never feel that they need to change who they are. If they are being targeted offline as well as online, there can often be no escape from the abuse, which amplifies the impact.

How should parents approach a conversation about online hate with young people?

We always recommend having open and honest conversations with young people about their online lives and experiences, just as you would about their lives at school or college. Ask them what platforms they are using and whether they know how to report anything that happens. Have these conversations regularly rather than waiting for a problem to occur, and make sure they know they can come to you with any problems and that you will be there to support them.

These initial conversations don’t have to be serious and heavy – they can just as easily happen at the dinner table or while watching TV. In fact, this approach very often removes the pressure, making it easier for them to share any concerns they might have.

If they do tell you that they are worried about something, give them the time to explain and really listen. Try not to be angry if you feel they shouldn’t have gone onto a particular platform or website, or shouldn’t have shared or posted something; they are then far more likely to open up to you and share what is going on. Make yourself aware of the platforms they are using and, if you are able to, spend time with them on a particular game or platform.

Any hate directed at them should be taken very seriously just as it would if it were occurring offline. Remind them that they are not at fault. It may need to be reported, not just to the platform it occurred on, but to the police in some cases as a hate crime.

What expectations should parents and young people have when it comes to reporting online hate speech they’ve been exposed to?

Most social media platforms have reporting processes, and each has its own guidelines around hate speech and incitement to violence or abuse, whether in the form of written words or imagery on public platforms. These procedures also take account of what may be considered freedom of speech. If in doubt, always report: the platform’s moderators will investigate, and reports also help refine reporting systems so they can adapt to emerging trends.

There may be times when it is difficult to navigate the reporting procedures, or you may not be happy with the outcome. If you are unable to get harassing, abusive or hateful content removed, you can contact us here. Ditch the Label are Trusted Flaggers across all the major platforms, which means we can often get content removed quickly, even if it has already been reported to no avail.

Our trained mentors can provide support on the issues that affect young people here. They can help not only with the removal of content, but also with moving forward afterwards.

Sajda Mughal OBE

What should children do if they come across online hate speech?

With the growth of social media, many children will, unfortunately, come across hate speech online, and we must ensure that young people know what to do when they encounter it. If children see hate speech online, it is paramount that they tell someone they trust, be that a parent or a teacher. This is important for a number of reasons. Firstly, the hate speech must be reported to the appropriate body. Secondly, talking to an adult helps the child process what they have seen – even if the hate speech does not directly affect the child, it is still important for them to understand that what has been said is hateful and wrong, going against the values of the majority of society. If the hate speech has directly affected the child, it is important they talk through how it has made them feel and what can be done to help them deal with those feelings.

Dr Elizabeth Milovidov, JD

What should parents do to support their child if they are a victim of online hate speech?

In today’s internet society, online hate speech (hateful, racist or sexist comments) is something that our children and young people are increasingly viewing and sometimes experiencing. Governments around the world are introducing legislation to combat online hate speech, and social media companies are strengthening protections on their platforms.

While these are commendable actions, parents may still be called upon to support their children until online hate speech is eradicated. As with all challenges related to parenting a child in the digital age, parents can use this topic as a conversation starter. Parents can then use the discussion as a way to express their family and cultural values, provide children with strategies to deal with online hate speech, and emphasise the importance of empathy and kindness.

Suggested conversation starters:

  • What is online hate speech?
  • Have you ever encountered online hate speech?  How did you feel?
  • What should you do if you see online hate speech?
  • Do you know how to report/block online hate speech on different platforms?
  • How do you think the person felt who wrote the online comment? How do you think people feel when they read the online comment?
  • How can we spread kindness and compassion online?

Parents should also check out the Hacking Hate online toolkit produced by European Schoolnet, UK Safer Internet Centre, The Diana Award and other partners. The toolkit looks at how young people can combat online hate and effect change in their communities.

Other resources

  • Internet Matters Online hate and Trolling advice guide for parents

More to explore

See more articles and resources to help children stay safe online.

  • Advice for 11-13 years
  • Advice for 14+ year olds
  • Advice for 6-10 years
  • Cyberbullying resources
  • Social media safety
  • Support wellbeing with tech

Support on site

  • Online hate and trolling – a parent’s story
  • Tackling online hate and trolling
  • Things to talk to them about – conversation starters for vulnerable children
  • Internet Matters response to the Online Harms White Paper

Related web links

Common Sense Media, “Where Kids Find Hate Online.”

How to deal with online hate speech

EU Code of Conduct and Online Hate Speech



Impact of Online Hate

Online hate can have an impact in three interconnected ways:

  • the harm done to its targets, either from personal harassment or from online spaces being experienced as hostile;
  • the risk that those who encounter it may be radicalized by it, becoming more sympathetic and possibly even active; and
  • the effect that it has on the values and culture of the online spaces in which it happens.

The first of these is the clearest, though – as with cyberbullying – the harm done may not be visible to perpetrators. Indeed, a significant amount of cyberbullying is motivated by hate: for example, lesbian, gay, bisexual and transgender (LGBT) youth are almost twice as likely to report having been bullied online as those who are straight, [1] while young women are twice as likely to have been sexually harassed online as young men. [2] Young people who experience online hate are more likely to experience anxiety and depression, [3] and targets of online hate may suffer harassment and violence offline as well. [4] A frequent form of targeted hate is “doxxing,” the act of publishing a target’s home address or other personal information as a way of encouraging others to harass them. [5] As a result, members of vulnerable groups may be more reluctant to speak freely online [6] or may withdraw from online spaces entirely, [7] which has an impact not just on them, but also on the online communities they’re a part of.

Radicalization

The second possible impact of online hate is radicalization. This term refers to the process by which people come to believe that violence against others and even oneself is justified in defense of their own group. Not everyone who is involved in a group is necessarily radicalized to the same degree; in fact, even within a hate group, only a small number of people may be radicalized to the point where they are ready to advocate and commit violent acts.

One way of looking at the process is to think of any group or movement as a pyramid. [8] (While there has been some recent criticism of this model of radicalization, leading the authors of the original paper to propose a two-pyramid model that separates radicalization of opinion from radicalization of action, [9] it remains a valuable way to model radicalized groups.)


The base of the pyramid is made up of Sympathizers who support the group and share its ideals but who are not actively involved in what it’s doing. They are typically the largest part of the group but also the least committed.

The next level we might call the Members. These are people who identify themselves strongly with the group and participate in its everyday activities.

At the final level are Activists. These are the members who identify most strongly with the group and are likely to push it towards more radical positions and more extreme actions. The most extreme of these are those who commit violent and other criminal acts. While it’s not always clear how each person becomes radicalized to violence, in Canada alone there have been at least three hate-motivated mass murders whose perpetrators were at least partially radicalized online. [10]

The process of radicalization has traditionally been seen as the way in which people move up the pyramid to identify more deeply with their group and become more willing to support or engage in extreme acts. The networked nature of digital media, however, allows hate groups and movements to simultaneously target all of the different levels of the pyramid, making it “remarkably easy for viewers to be exposed to incrementally more extremist content”; [11] one scholar has described the internet as a “conveyor belt” to radicalization. [12] On extremist forums, “how to red-pill [i.e. radicalize] others is a constant topic of conversation,” with a great deal of thought put into matching the message with the target’s readiness. [13]

How Radicalization Happens

In their article Mechanisms of Political Radicalization, [14] Clark McCauley and Sophia Moskalenko identify 12 ways in which a person or group may become more radicalized. “Most of the mechanisms identified are associated with strong emotional experiences, including anger, shame, guilt, humiliation, fear, love and hate,” [15] and in most cases of radicalization, more than one mechanism is at work.

Of those, five are of particular relevance to studying online hate:

  • The Slippery Slope – It’s rare for anyone to become radicalized by a single act or event; it is more often the culmination of many small steps. Research has shown that people have a tremendous ability to justify their actions, even actions they would normally consider to be wrong. (For instance, a person who forgets to leave a tip at a restaurant may retroactively find flaws with the service to justify not leaving a tip.) This can have the effect of shifting our morality: once we have established that something we previously considered wrong was actually right, more extreme actions may become permissible. This effect is particularly powerful online, where consequences are less apparent: the slope that leads from reading hateful content to creating it can be very slippery.  
  • The Power of Love – The social and emotional effects of being in a group can be just as powerful as whatever cause or ideology the group is committed to. Research has shown that members of hate groups such as skinheads will often act as mentors or ‘big brothers’ to vulnerable youth, providing a sympathetic ear, an explanation for their problems, and a way of taking action. Online spaces and communities used by hate groups have many of the same social features – forums, “likes” and “upvotes,” and opportunities to accumulate special privileges and social capital – as mainstream social spaces, [16] but hate movements can fulfil deeper emotional needs for members who “feel shunned in their lives, in their personal lives or in wider society”; [17] as the moderator of one of the largest misogynist forums on Reddit put it, “The manosphere fundamentally became a surrogate father for the life lessons I never got.” [18]

To create this support network, many of the techniques used by hate groups are intended to build group solidarity. Calls to protect the group, and in particular the most vulnerable within the group, are useful both for building support and for radicalizing supporters. Another way of meeting emotional needs is to bolster members’ self-esteem by building up the most extreme members of the group (frequently, those who have committed violence or even murder in the name of the cause) [19] as heroes, giving an opportunity to less-committed members to imagine themselves as heroes in defense of their group.

  • Radicalization in Like-Minded Groups – All groups are subject to a phenomenon in which the average group member’s opinion will become more extreme over time. This may be because the more different your opinion is from that of the majority, the more pressure you feel to conform – so those who disagree with the majority are likely to change their opinion, while those who agree either maintain the same opinion or become more extreme in their views. An example of this is the Weather Underground, an American anti-war group, which in the 1970s moved from political protest to terrorism as a result of competition within the group over who was the ‘most radical’. This effect is further amplified by what’s been called the “majority illusion” in online spaces:

Socially connected individuals tend to be similar... giving rise to the “selective exposure” effect that leads individuals to overestimate the prevalence of their features in a population... creating an illusion that the attribute is far more common than it actually is. In a social network, this illusion may cause people to reach wrong conclusions about how common a behavior is, leading them to accept as a norm a behavior that is globally rare. [20]

One common method of closing ranks is through symbolism. Groups may encourage members to distinguish themselves through using traditional hate symbols (either in earnest on closed forums, or “ironically” in more public spaces), using “dog whistles” (words or phrases whose hateful meaning is clear to members but not outsiders) or by co-opting mainstream symbols such as the Celtic cross, pagan runes and even the “OK” hand sign, re-signifying them as emblems of white supremacy. The Anti-Defamation League argues that hate symbols are more than mere signs: “These symbols are meant to inspire a sense of fear and insecurity. [They] give haters a sense of power and belonging, and a quick way of identifying with others who share their ideology”. [21]

  • Radicalization Under Isolation or Threat – People will identify more closely with a group if the group appears to be isolated or under external threat. As one filmmaker who interviewed dozens of extremists of different stripes put it, “these movements are deeply rooted in a sense of victimhood, real or imagined.” [22] For example, while White males are certainly the most advantaged group in society,

the far right plays on a much broader dislike of “political correctness” among many young men who feel alienated from mainstream culture... They may see what economic and social capital they do have slipping away. These disillusioned men are perfect targets for radicalization, and it’s a surprisingly short leap from rejecting political correctness to blaming women, immigrants, or Muslims for their problems. [23]

This worldview, in which group members are pitted against an enemy with no possibility of compromise, “is especially enticing to people used to the black and white moral world of many video games, clear evil enemies on one side with the player on the side of good,” [24] but it also serves to justify hatred and violence as a form of self-defence. As Adam Klein, a professor at Pace University who has studied extremism in blogs and social networks, says, “rather than overt bigotry, most online hate looks a lot like fear.” These appeals to fear “create an illusion of imminent threat that radicals thrive on, and to which the violence-inclined among them have responded.” [25]

  • Othering and Dehumanization  

Perhaps the most potent technique for fostering radicalization is to portray opposing groups as being inhuman. This explicitly draws the line between the in- and out-groups and makes it easier to justify any action against them. For example, in World War II the Japanese were portrayed in a heavily caricatured style in American propaganda – always stereotyped, often threatening, and sometimes monstrous – with the result that roughly half of American soldiers were in favour of exterminating the Japanese nation after the war was over. In fact, servicemen who had not seen combat were actually more likely to advocate extermination – suggesting that it was exposure to propaganda, and not actual contact with the enemy, that had produced this attitude. [26] As psychologist Nick Haslam puts it, though, “dehumanization doesn’t only occur in wartime. It’s happening right here, right now. And every day, good people who don’t see themselves as being prejudiced bigots are nevertheless falling prey to it.” [27]

The “other” is not, however, the actual group as it exists in reality – some of the groups cited as enemies, such as “SJWs” and “antifa,” do not really exist at all as concrete groups – but are a fiction created to solidify the identity of the hate group and justify its existence and its actions. The editor of one White supremacist website stated that “there should be a conscious agenda to dehumanize the enemy, to the point where people are ready to laugh at their deaths.” [28]

To achieve this, the Other must be portrayed as being both inferior, to establish the hate group’s superiority, and threatening, to establish the need to take action against them. [29] For this reason hate groups portray the Other in ways that emphasize difference – making them seem strange, even inhuman. This is often done through caricature or stereotype, name-calling, or ideology: in some cases hate groups will claim that others are literally not human. Dehumanization is one of the basic mechanisms of radicalization and is a necessary one for hate groups to successfully promote their ultimate message: that annihilation of a particular group is justified.

Radicalization and Youth

Young people are especially vulnerable to the mechanisms described above because many are looking for groups or causes that will give them a sense of identity. Identity seeking is a natural part of adolescence but, taken to its extreme, it can provide a toe-hold for hate mongers. “Anomie” is the term for the state of mind in which family, social or cultural values appear worthless. Youth suffering from anomie will seek a group or cause that gives them values, an identity and a surrogate family. [30] A common cause of anomie is when changing social conditions make it seem as though one’s identity is under attack. An example of this is the Gamergate phenomenon, in which young male video game players described their identity as gamers as being under threat from the increasing number of women playing games (characterized as “not real gamers”), the diversification of game genres, and critiques of sexism both within the games industry and in games themselves. As Raph Koster, a longtime game developer, puts it, “That sense of being marginalized by the rest of society, and that sense of triumph when you’re recognized – gamers have had that for quite a while.” [31]

This also explains why economic problems by themselves do not necessarily make a young person more prone to radicalization: one study found that “it was not poor socioeconomic status itself that pointed toward susceptibility, but rather a sense of relative deprivation, coupled with feelings of political and/or social exclusion.” [32]

Hostile Environments

Beyond radicalizing individuals, the connected, networked nature of online communities also enables hate movements to broaden the base of the pyramid, making hate speech – both in jest and in earnest – seem more acceptable. The purpose is not only to create a greater pool of potentially radicalized recruits, but also to create an online environment that is progressively more hostile to anyone targeted by these movements. While the top two tiers are the most visible and draw the most concern,

“the bottom strata is just as responsible for the rancor, negativity, and mis-, dis- and mal-information that clog online spaces, causing a great deal of cumulative harm. This bottom strata includes posting snarky jokes about an unfolding news story, tragedy, or controversy; retweeting hoaxes and other misleading narratives ironically, to condemn them, make fun of the people involved, or otherwise assert superiority over those who take the narratives seriously; making ambivalent inside jokes because your friends will know what you mean (and for white people in particular, that your friends will know you’re not a real racist); @mentioning the butts of jokes, critiques, or collective mocking, thus looping the target of the conversation into the discussion; and easiest of all, jumping into conversations mid-thread without knowing what the issues are. Just as it does in nature, this omnipresent lower stratum in turn supports all the strata above, including the largest, most dangerous animals at the top of the food chain. Directly and indirectly, insects feed the lions.” [33]

[1] Hinduja, S., & Patchin, J. W. (2011). Cyberbullying Research Summary: Bullying, Cyberbullying and Sexual Orientation. Cyberbullying Research Centre.
[2] Duggan, M. (2017). Online Harassment 2017. Pew Research Center.
[3] Tynes, B. M., Giang, M. T., Williams, D. R., & Thompson, G. N. (2008). Online Racial Discrimination and Psychological Adjustment Among Adolescents. Journal of Adolescent Health, 43(6), 565-569. doi:10.1016/j.jadohealth.2008.08.021
[4] Carson, E. (2017, November 27). This lawsuit could shut down neo-Nazi site The Daily Stormer. CNET. Retrieved from https://www.cnet.com/news/taking-trolls-to-court-lawsuit-targets-daily-stormer-internet-nazis/
[5] Anti-Defamation League. (n.d.). Online Harassment: Extremists Ramp Up Trolling, Doxxing Efforts. Retrieved from https://www.adl.org/blog/online-harassment-extremists-ramp-up-trolling-doxxing-efforts
[6] Lenhart, A., Ybarra, M., Zickuhr, K., & Price-Feeney, M. (2016). Online Harassment, Digital Abuse, and Cyberstalking in America. New York, NY: Data & Society. https://www.datasociety.net/pubs/oh/Online_Harassment_2016.pdf
[7] Resnick, B. (2017, March 7). The dark psychology of dehumanization, explained. Vox. Retrieved from https://www.vox.com/science-and-health/2017/3/7/14456154/dehumanization-psychology-explained
[8] McCauley, C., & Moskalenko, S. (2008). Mechanisms of Political Radicalization: Pathways Toward Terrorism. Terrorism and Political Violence, 20(3), 415-433. doi:10.1080/09546550802073367
[9] McCauley, C., & Moskalenko, S. (2017). Understanding political radicalization: The two-pyramids model. American Psychologist, 72(3), 205-216. doi:10.1037/amp0000062
[10] Carranco, S., & Milton, J. (2019, April 27). Canada’s new far right: A trove of private chat room messages reveals an extremist subculture. The Globe and Mail. Retrieved from https://www.theglobeandmail.com/canada/article-canadas-new-far-right-a-trove-of-private-chat-room-messages-reveals/
[11] Lewis, B. (2018). Alternative Influence: Broadcasting the Reactionary Right on YouTube. Data & Society.
[12] Bergin, A. (2009). Countering Online Radicalisation in Australia. Australian Strategic Policy Institute Forum.
[13] From Memes to Infowars: How 75 Fascist Activists Were “Red-Pilled”. (2018, October 11). Bellingcat. Retrieved April 25, 2019, from https://www.bellingcat.com/news/americas/2018/10/11/memes-infowars-75-fascist-activists-red-pilled/
[14] McCauley, C., & Moskalenko, S. (2008). Mechanisms of Political Radicalization: Pathways Toward Terrorism. Terrorism and Political Violence, 20(3), 415-433. doi:10.1080/09546550802073367
[15] McCauley, C., & Moskalenko, S. (2017). Understanding political radicalization: The two-pyramids model. American Psychologist, 72(3), 205-216. doi:10.1037/amp0000062
[16] Glaser, A. (2017, August 30). Nazis and White Supremacists Are No Longer Welcome on the Internet. So They’re Building Their Own. Slate. Retrieved from https://slate.com/technology/2017/08/the-alt-right-wants-to-build-its-own-internet.html
[17] Illing, S. (2019, March 17). This filmmaker spent months interviewing neo-Nazis and jihadists. Here’s what she learned. Vox. Retrieved from https://www.vox.com/world/2019/1/14/18151799/extremism-white-supremacy-jihadism-deeyah-khan
[18] Marche, S. (2016, April 14). Swallowing the Red Pill: A journey to the heart of modern misogyny. The Guardian. Retrieved from https://www.theguardian.com/technology/2016/apr/14/the-red-pill-reddit-modern-misogyny-manosphere-men
[19] After New Zealand Shooting, Far-right, Racists Claim Victimhood, Hail Killer as Hero. (2019, March 15). Southern Poverty Law Center. Retrieved April 25, 2019, from https://www.splcenter.org/hatewatch/2019/03/15/after-new-zealand-shooting-far-right-racists-claim-victimhood-hail-killer-hero
[20] Lerman, K., Yan, X., & Wu, X. (2016). The “Majority Illusion” in Social Networks. PLoS One, 11(2). doi:10.1371/journal.pone.0147617
[21] Anti-Defamation League. (2001). Poisoning the Web – Internet as a Hate Tool. Retrieved July 20, 2011, from http://www.adl.org/poisoning_web/net_hate_tool.asp
[22] Illing, S. (2019, March 17). This filmmaker spent months interviewing neo-Nazis and jihadists. Here’s what she learned. Vox. Retrieved from https://www.vox.com/world/2019/1/14/18151799/extremism-white-supremacy-jihadism-deeyah-khan
[23] Marwick, A., & Lewis, B. (2017, May 18). The Online Radicalization We’re Not Talking About. New York Magazine. Retrieved from http://nymag.com/intelligencer/2017/05/the-online-radicalization-were-not-talking-about.html
[24] Deo. (2017, November 23). How White Nationalism Courts Internet Nerd Culture. Medium. Retrieved from https://medium.com/@DeoTasDevil/how-white-nationalism-courts-internet-nerd-culture-b4ebad07863d
[25] Klein, A. G. (2018, November 20). Fear, more than hate, feeds online bigotry and real-world violence. The Conversation. Retrieved April 25, 2019, from https://theconversation.com/fear-more-than-hate-feeds-online-bigotry-and-real-world-violence-106988
[26] McCauley, C., & Moskalenko, S. (2008). Mechanisms of Political Radicalization: Pathways Toward Terrorism. Terrorism and Political Violence, 20(3), 415-433.
[27] Resnick, B. (2017, March 7). The dark psychology of dehumanization, explained. Vox. Retrieved from https://www.vox.com/science-and-health/2017/3/7/14456154/dehumanization-psychology-explained
[28] Feinberg, A. (2017, December 14). This Is The Daily Stormer’s Playbook. HuffPost. Retrieved from https://www.huffingtonpost.ca/entry/daily-stormer-nazi-style-guide_n_5a2ece19e4b0ce3b344492f2
[29] Meddaugh, P. M. (2009). Hate Speech or “Reasonable Racism?” The Other in Stormfront. Journal of Mass Media Ethics, 24(4), 251-268.
[30] Amon, K. (2010). Grooming for Terror: The Internet and Young People. Psychiatry, Psychology & Law, 17(3), 424-437.
[31] Wingfield, N. (2014, October 15). Feminist critics of video games facing threats in ‘GamerGate’ campaign. The New York Times.
[32] Norman, J. M., & Mikhael, D. (2017, August 28). Youth radicalization is on the rise. Here’s what we know about why. The Washington Post. Retrieved September 1, 2017, from https://www.washingtonpost.com/news/monkey-cage/wp/2017/08/25/youth-radicalization-is-on-the-rise-heres-what-we-knowabout-why/?utm_term=.39a485789d43
[33] Milner, R. M., & Phillips, W. (2018, November 20). The Internet Doesn’t Need Civility, It Needs Ethics. Motherboard. Retrieved from https://motherboard.vice.com/en_us/article/pa5gxn/the-internet-doesnt-need-civility-it-needs-ethics
