Call of Duty Now Features an AI-Powered ‘Anti-Toxicity’ Moderator
Beware, all those who like to talk trash during a Call of Duty game – an all-new AI-powered moderator called ‘ToxMod’ has been integrated into Modern Warfare II and Warzone, targeting players who abuse others via voice chat. It has been confirmed that the tool will also make its way to Modern Warfare III when the game launches on November 10th.
This new feature has arrived courtesy of a partnership between Activision and Modulate, the creator of the ‘voice chat moderation technology.’ In real time, ToxMod will detect players speaking abusively in-game and act accordingly.
If this was introduced in MW2’s Search and Destroy lobbies back in 2009, millions of players would have been banned overnight.
Be Careful What You Say
It’s a fairly comprehensive technology, admittedly.
It has been explained that ToxMod is on the lookout for hate speech, discriminatory language, harassment, and other forms of abuse. It’s an effort to strengthen Call of Duty’s anti-toxicity measures, which now also include text-based filtering that monitors the in-game chat and attempts to spot malicious behaviour across 14 different languages.
It was reported in a Call of Duty blog post that, since the launch of Modern Warfare II, more than one million accounts have been detected violating the Code of Conduct and have had action taken against them.
In recent weeks, Activision has also introduced a fresh Code of Conduct clause, dubbed ‘Malicious Reporting’, designed to deter players from filing false reports against players who haven’t done anything wrong.
On August 30th, the ToxMod technology was rolled out across North America, and it’s scheduled for a worldwide release when Modern Warfare III drops in November.