The Power of Customisable Content Moderation

How can customisable content moderation empower digital communities globally?

The accuracy of the automated moderation tools used by major platforms is being called into question by independent bodies, including Meta’s Oversight Board, which recently urged the company to establish a more responsive system to reduce the margin for error and to minimise the negative impact when incorrect decisions by individual human reviewers are ‘amplified’.


In a recent article, the Board expressed concern that Meta does not measure the accuracy of its automated moderation tools against specific content policies, and does not publish details of error rates or of the consequences of harmful and dangerous content being published.


The fear is that a lack of transparency and clear guidance around content moderation can lead to overly aggressive take-downs, with content that holds ‘news value’ or raises awareness of an important issue being mistakenly censored. On the other hand, ambiguity around moderation policy could also allow harmful content from dangerous organisations to be published.


At GoBubble, our Emotion AI is used across many sectors, including social media, business, healthcare, entertainment, gaming, dating, education and sport. For example, we work with football clubs worldwide, from the Premier League through to League Two, to reduce the online hate that appears across their social channels.


Football has its own language, and much of the ‘banter’ we see on and off the pitch is part of the culture of the sport. But abuse and hate directed at players and their families, as well as at staff and fans, should never be tolerated as part of the game. Our Global Football AI product gives teams and players a personalised dashboard where they can manage category controls and define the language and content deemed acceptable for their online community.
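

To make the idea of category controls concrete, here is a minimal sketch of what a per-community moderation policy could look like in code. The category names, thresholds and allowed-terms override are illustrative assumptions, not GoBubble’s actual dashboard or API.

```python
from dataclasses import dataclass, field
from typing import Dict, Set

# Hypothetical per-community category controls; names and values are
# illustrative only, not GoBubble's actual configuration.

@dataclass
class CategoryControl:
    enabled: bool = True          # does this community filter the category at all?
    threshold: float = 0.8        # hide content scored at or above this confidence
    allow_override: bool = False  # can community-approved terms override a match?

@dataclass
class ModerationPolicy:
    categories: Dict[str, CategoryControl] = field(default_factory=lambda: {
        "racism":        CategoryControl(threshold=0.6),
        "misogyny":      CategoryControl(threshold=0.6),
        "homophobia":    CategoryControl(threshold=0.6),
        "general_abuse": CategoryControl(threshold=0.85, allow_override=True),
        "banter":        CategoryControl(enabled=False, allow_override=True),
    })
    allowed_terms: Set[str] = field(default_factory=set)  # community-approved slang

    def should_hide(self, scores: Dict[str, float], text: str) -> bool:
        """Hide a post if any enabled category meets its threshold.
        Approved terms can only override categories flagged allow_override."""
        has_allowed_term = any(t in text.lower() for t in self.allowed_terms)
        for name, control in self.categories.items():
            if not control.enabled:
                continue
            if scores.get(name, 0.0) < control.threshold:
                continue
            if control.allow_override and has_allowed_term:
                continue  # the community has deemed this language acceptable
            return True
        return False

# Example: a club keeps hate-speech filters strict but tolerates football banter.
policy = ModerationPolicy(allowed_terms={"bottled it", "park the bus"})
print(policy.should_hide({"general_abuse": 0.9}, "They bottled it again!"))  # False
print(policy.should_hide({"racism": 0.95}, "..."))                           # True
```

In this sketch the hate-speech categories are always enforced, while a community can relax the banter and general-abuse filters by approving its own slang.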


Catherine Azam, Director and Head of Data Science at GoBubble, said:


“AI and machine-led content moderation is not foolproof, and that’s universal. But at GoBubble the current level of accuracy for our Emotion AI tool is 98%, which is higher than our competitors’ (the market average is between 70% and 90%).


"What’s unique about the product is that having created our own AI, we can empower our clients to customise their moderation filters and define the level of ‘banter’ or terminology they’re willing to allow, and align with the standards of their own digital community, using a personalised dashboard. GoBubble further allows users to select from pre-defined categories that scan specifically for hate-speech covering domains such as racism, misogyny and homophobia.


"We provide a responsive AI-led system as well as an optimized dashboard for human review, as a final stage of our quality assurance process to further reduce the margin of error for the AI."


Danielle Platten, CEO at GoBubble and award-winning female tech entrepreneur, added:


“Content moderation is a vital tool in the fight against digital harm. It requires clear regulatory processes and guidance for providers and platforms, so that users are shielded from dangerous and offensive content while robust discussion and debate are encouraged and our democratic right to freedom of speech online is upheld.


“Our Emotion AI enables us to scan content in multiple languages and across all media, including audio, video, images, emoji and memes, for harmful material, preventing issues in real time.


“This is not about scanning for keywords or adding more moderators to the review queue; it’s about using advanced technology to empower organisations and individuals to decide what content appears on their own platforms and social feeds, content that can affect their reputation, mental health and wellbeing, all underpinned by established regulatory guidance and protocols.


“Together we’re helping to enhance the collective wellbeing of digital communities and the people who run them, with cutting-edge pure AI doing all the heavy lifting with incredible accuracy.”