Call of Duty’s anti-toxicity focus has made surviving lobbies easier than ever—and that’s good

CoD's AI-powered voice chat moderation has also paid dividends in reducing toxic behavior, according to Activision.

Call of Duty BO6 Zombies exfil feature
Image via Activision

Call of Duty’s efforts to combat in-game toxicity over both voice and text chat have yielded some eye-popping results, according to a new report published by Activision.


Starting in November 2023 with Modern Warfare 3, Activision said it began collaborating with Community Sift on text-based moderation, and the partnership has since blocked more than 45 million text messages “in violation of the Code of Conduct.” Sounds like it’s working well, and that’s a good thing.

Black Ops 6 lime green beta skin and blueprint
You won’t need a hazmat suit to play online, Activision says. Image via Activision

Activision says the same systems will be in place for Black Ops 6 on day one when it launches later this month, as the company continues “fostering a positive and welcoming community for all players.” Isn’t that what everybody should strive for? In BO6, the moderation will be available in 20 languages.

“The Disruptive Behavior team knows that hype and passion is part of Call of Duty’s DNA,” Activision said. “Voice and text-based moderation tools in Call of Duty don’t target our competitive spirit – it enforces against behavior identified in the Call of Duty franchise Code of Conduct, targeting harassment and derogatory language. Similar to MW3, the CoD Code of Conduct will be visible during the initial in-game flow when players first launch core multiplayer modes in Black Ops 6, asking players to acknowledge the Code of Conduct pillars.”

The controversial AI-powered voice moderation will also be back in BO6. Since improving the system in June, after originally deploying it last summer, Activision says it’s seen a 67 percent reduction in repeat voice chat offenders, so the strict enforcement is proving to help toxic players change their ways, or at least switch over to Discord. The system has also contributed to a 43 percent overall drop in “exposure to disruptive voice chat” since January, according to Activision.

In its ongoing effort to battle toxicity, Activision says CoD has partnered with the California Institute of Technology and the University of Chicago Booth School of Business to research how well the franchise’s efforts have been working, and that research appears to be paying off.

“The knowledge gained from these research initiatives are valuable input to be fed back into the game as part of Call of Duty’s overall strategy to combat disruptive behavior,” Activision said.

In the end, CoD is just a video game, and no one should feel like they’re under attack for their identity, or for anything other than their K/D ratio after a close match with some friendly banter.

After all, times have changed, so those who claim that CoD has “gone soft” or “woke” should use any temporary bans as an opportunity to take a look in the mirror and ask where they stand on persistent issues like racism, sexism, and homophobia in 2024.

Author
Scott Duwe
Senior Staff Writer & Call of Duty lead. Professional writer for over 10 years. Lover of all things Marvel, Destiny 2, Metal Gear, Final Fantasy, Resident Evil, and more. Previous bylines include PC Gamer, Red Bull Esports, Fanbyte, and Esports Nation. DogDad to corgis Yogi and Mickey, sports fan (NY Yankees, NY Jets, NY Rangers, NY Knicks), Paramore fanatic, cardio enthusiast.