Toxic players are one of the biggest problems in Call of Duty. Although there are mechanisms to report or mute them, these measures are not always effective. That is why Activision has begun testing a tool that uses artificial intelligence to combat abusive behavior.
In a post published on the Call of Duty blog, Activision announced that it will implement a new large-scale, real-time voice chat moderation system. The company will use ToxMod, a tool developed by Modulate that uses artificial intelligence to identify discriminatory language, hate speech, harassment, and other toxic behavior.
ToxMod works in three stages: first, it classifies and processes voice chat data to determine whether a conversation requires attention. If certain parameters are met, the system then analyzes tone, context, and perceived emotion to determine the type of behavior. This is accomplished with machine learning models that, according to Modulate, understand nuanced and emotional cues well enough to differentiate between a prank and genuine misbehavior.
“These classification models are not capable of understanding everything about a conversation at first glance, but they can look for telltale signs of anger, distress, aggression, and even more subtle sinister intent,” Modulate mentions in one of its documents. Finally, the tool can escalate the most toxic voice chats so that moderators can take appropriate action.
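To make that flow concrete, here is a minimal sketch of a triage-analysis-escalation pipeline of the kind described above. Every class, signal, and threshold here is invented for illustration; none of it comes from Modulate's actual ToxMod API.

```python
# Hypothetical sketch of a three-stage voice-chat moderation pipeline:
# triage -> analysis -> escalation. Names and thresholds are illustrative only.
from dataclasses import dataclass
from enum import Enum


class Severity(Enum):
    IGNORE = 0      # casual banter, no action needed
    REVIEW = 1      # borderline, keep observing
    ESCALATE = 2    # likely violation, hand off to human moderators


@dataclass
class VoiceClip:
    transcript: str
    anger_score: float     # placeholder signals a real system would derive
    distress_score: float  # from audio (tone, emotion), not just text


def triage(clip: VoiceClip) -> bool:
    """Stage 1: cheap filter deciding whether the clip deserves deeper analysis."""
    return clip.anger_score > 0.4 or clip.distress_score > 0.4


def analyze(clip: VoiceClip) -> Severity:
    """Stage 2: model tone, context, and emotion (stubbed with simple thresholds)."""
    if clip.anger_score > 0.8 and clip.distress_score > 0.6:
        return Severity.ESCALATE
    return Severity.REVIEW


def moderate(clip: VoiceClip) -> Severity:
    """Stage 3: only clips that survive triage and analysis reach human moderators."""
    if not triage(clip):
        return Severity.IGNORE
    return analyze(clip)


print(moderate(VoiceClip("example line", anger_score=0.9, distress_score=0.7)))
```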
Call of Duty will fight toxicity in all languages
ToxMod understands 18 languages, including Spanish, and can grasp the full context of a conversation even when it mixes two or more of them. This is important, as it is common to hear players hurling racist slurs in languages other than English in Call of Duty.
Modulate combines language models, artificial intelligence, and human experts who speak each language natively and can identify specific harmful behavior. “Tackling toxic behavior is more nuanced and requires fluency in a language and the culture of the language’s country of origin, as well as the subculture and psychology of gaming and online behavior in general,” the company says.
In a previous post, Modulate notes that ToxMod detects specific risk categories, such as violent radicalization or child bullying. This is possible thanks to detection algorithms that identify repeated behavior patterns. For example, if a player uses extremist language on one occasion, it is classified as an offense; but if they repeat or escalate it over several weeks, it is added to a risk category for moderators to assess.
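A rough sketch of how repeated offenses could be promoted into a risk category over time might look like the following. The look-back window, threshold, and function names are assumptions made for illustration, not Modulate's actual logic.

```python
# Hypothetical sketch: escalate a player into a risk category when offenses
# repeat within a time window. Window and threshold values are invented.
from collections import defaultdict
from datetime import datetime, timedelta

OFFENSE_WINDOW = timedelta(weeks=4)   # look-back period (assumed)
RISK_THRESHOLD = 3                    # offenses before flagging (assumed)

offenses: dict[str, list[datetime]] = defaultdict(list)


def record_offense(player_id: str, when: datetime) -> str:
    """Log an offense and decide whether the player enters a risk category."""
    offenses[player_id].append(when)
    recent = [t for t in offenses[player_id] if when - t <= OFFENSE_WINDOW]
    if len(recent) >= RISK_THRESHOLD:
        return "risk_category"    # repeated/escalating pattern for moderator review
    return "single_offense"       # isolated incident, classified but not escalated


now = datetime.now()
for day in range(3):
    status = record_offense("player_42", now + timedelta(days=day))
print(status)  # -> "risk_category" once offenses repeat within the window
```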
Artificial intelligence is not infallible
It is important to note that ToxMod is not the ultimate solution against abusive behavior in Call of Duty. The tool will be integrated into a broader in-game moderation system that, among other things, includes text-based filtering in 14 languages as well as a mechanism for reporting players who break the rules. The final decision on whether to penalize an offender will rest with the anti-toxicity team.
The first beta begins today with the integration into Call of Duty: Modern Warfare II and Call of Duty: Warzone in North America. It will then be extended to all players with the launch of Call of Duty: Modern Warfare III on November 10. Activision confirmed that moderation will begin in English, with Spanish and other languages added at a later date.