Why has it become 'the norm', and what impact does it have?
There are currently an estimated one billion online gamers worldwide, and the gaming community continues to grow with no signs of slowing down.
Gaming reaches across all communities, with a diverse mix of players globally. Anyone who has played online games or regularly watches game streamers will know that "trash talk" is often viewed as 'the norm', and that racist, homophobic, transphobic, and sexist comments are often rationalised as part of video game culture.
But why has this become 'the norm', and why is it expected to be the case?
Such content and comments affect gamers, streamers, and ultimately the platform, brand, or gaming studio itself.
The impact can be seen far and wide, particularly in the form of "hate raids". This is when human users and/or bots target a particular streamer, 'raiding' their stream and overwhelming the chat with hateful comments and threats, most commonly directed at minority streamers.
Such toxic content has an impact on the mental wellbeing of streamers and their communities. In an interview titled 'Confessions of a Twitch Streamer who received "Hate Raids"', the anonymised streamer describes the mental health impact:
“When I’m streaming, it’s my point to make sure all my viewers are having a good time and enjoying the stream, and I can’t do that if I’m worried about some hate raids.”
Similarly in a recent interview with GoBubble, Emma ‘Emzii’ Rose told us about the levels of toxicity online, the role of in-chat moderators, and how toxicity can not only put you off streaming but content creation as a whole.
Research by the Anti-Defamation League shows that marginalised groups suffer the most harassment in online gaming, with 37% of respondents saying they try to hide their identity as a result.
Viewers in the community can also be affected by streams and online games filled with hate and toxic content. A recent study by Preply found that 90% of gamers surveyed have experienced or witnessed emotional abuse or bullying while playing video games, and nearly seven in ten have considered quitting because of what they have witnessed.
The level of hate, and its widespread impact on the gaming community, needs to be addressed. An organisational shift is under way within the industry, with platforms and major companies starting to do more to protect their communities.
Content moderation is paramount to creating safer, healthier, and kinder digital communities. Usernames, profile images, and in-game chat can all be moderated to improve the experience and help support the mental wellbeing of players and community members.
At GoBubble, we focus on user-generated behaviour and sentiment as key analysis signals for our cutting-edge Emotion AI, providing an in-depth understanding of both the content posted and the behaviour of its author.
We can support you in silencing online abuse and building a safer, kinder, and more supportive online community, all while protecting your brand reputation.