Online video games have grappled with the persistent issue of toxic voice chat, with women and minority players often being subjected to harassment.
The “Call of Duty” video game series, published by Activision, has recently integrated AI technology to detect and respond to hate speech in real time during online matches.
This moderation tool employs machine learning to identify and address discriminatory language and harassment.
Machine learning enables AI systems to learn and adapt on their own, using algorithms and the data they have been exposed to in order to recognize patterns.
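To illustrate the idea of learning patterns from data, here is a toy sketch in Python. It is not ToxMod's actual model (the article does not describe its internals); it simply shows how a classifier's behavior comes entirely from labeled example data rather than hand-written rules.

```python
from collections import Counter

# Toy illustration only: a word-frequency classifier that "learns"
# which vocabulary tends to appear under each label.
def train(examples):
    """Build per-label word counts from (text, label) pairs."""
    counts = {}
    for text, label in examples:
        counts.setdefault(label, Counter()).update(text.lower().split())
    return counts

def classify(counts, text):
    """Pick the label whose training vocabulary best matches the text."""
    words = text.lower().split()
    return max(counts, key=lambda label: sum(counts[label][w] for w in words))

# Invented toy training data, standing in for real labeled voice transcripts.
model = train([
    ("you played great well done", "ok"),
    ("nice shot good game", "ok"),
    ("you are trash uninstall idiot", "toxic"),
    ("shut up idiot nobody wants you here", "toxic"),
])
```

Feeding the model new text such as `classify(model, "good game well done")` returns a label based purely on the patterns seen during training, which is the core mechanism the paragraph above describes, at vastly smaller scale.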
In Call of Duty, the newly introduced tool, ToxMod, developed by Modulate, aims to improve the gaming experience. Activision’s Chief Technology Officer, Michael Vance, said it will help create an enjoyable, fair, and inclusive environment for all players.
The problem is even more pronounced in popular multiplayer games because of their vast player bases: approximately 90 million people play Call of Duty each month.
Activision has already taken steps to address the problem with its existing tools, including player reporting and automated monitoring of text chat and offensive usernames. These efforts have resulted in communication restrictions on one million accounts.
Within Call of Duty’s code of conduct, explicit prohibitions exist against bullying and harassment, encompassing insults based on factors such as race, sexual orientation, gender identity, age, culture, faith, and country of origin.
ToxMod, Vance said, allows the company to scale its moderation efforts significantly. It categorizes toxic behavior by severity before a human moderator steps in to determine the appropriate action.
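The severity-first workflow described above can be sketched as a simple triage step. This is a hypothetical illustration, not ToxMod's real pipeline: the `Flag` structure, the 0.0–1.0 severity score, and the threshold are all assumptions made for the example.

```python
from dataclasses import dataclass

# Hypothetical data shape for a flagged voice clip; the article does
# not specify how ToxMod represents or scores detections.
@dataclass
class Flag:
    clip_id: str
    severity: float  # assumed scale: 0.0 (mild) to 1.0 (severe)

def triage(flags, review_threshold=0.7):
    """Route flags by severity: the worst go to human moderators first,
    milder ones are deferred for batch review."""
    urgent = sorted((f for f in flags if f.severity >= review_threshold),
                    key=lambda f: f.severity, reverse=True)
    deferred = [f for f in flags if f.severity < review_threshold]
    return urgent, deferred

urgent, deferred = triage([
    Flag("clip-1", 0.35),
    Flag("clip-2", 0.92),
    Flag("clip-3", 0.81),
])
```

The design point this captures is the one Vance describes: the AI does not hand out punishments itself; it sorts and prioritizes so that scarce human moderator time goes to the most severe cases first.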
Players will not have the option to opt out of AI-based monitoring unless they completely disable in-game voice chat.
ToxMod has initially been rolled out in Call of Duty’s Modern Warfare II and Warzone, though only in the United States for now. A full worldwide deployment is scheduled to coincide with the release of the next installment, Modern Warfare III, on November 10th, bringing the system to a much wider audience.