Skilled Human Oversight: Activision Blizzard

Investing in safety with trained moderators.

All Activision Blizzard teams use a combination of human moderation and automated tools to address reports of misconduct in a timely manner, or to prevent misconduct altogether.

Given the tremendous scale of Activision’s Call of Duty franchise, its anti-toxicity team employs a variety of tools to extend its reach. Among them: text-based filtering across 14 languages for in-game text (chat and usernames); a robust in-game player reporting system; and ToxMod, a global real-time, AI-powered voice chat moderation system from Modulate that addresses toxic speech. The team also leverages an AI-powered content moderation and filtering platform that shields players from online harms, including cyberbullying, abuse, hate speech, threats, and child exploitation.
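The layers described above (text filtering across multiple languages, in-game player reporting, and human moderator review) follow a common trust-and-safety pattern. The sketch below is purely illustrative: the class and function names (BlockedTermFilter, PlayerReport, route_report) and the term lists are hypothetical and do not reflect Activision Blizzard's or Modulate's actual systems; it only shows the general shape of a per-language text filter feeding a human-review queue.

```python
# Illustrative sketch only; all names and data here are hypothetical and
# are not part of any Activision Blizzard or Modulate API.
import re
from dataclasses import dataclass


@dataclass
class PlayerReport:
    """An in-game report filed by one player against another."""
    reporter_id: str
    target_id: str
    category: str          # e.g. "hate_speech", "harassment"
    evidence_text: str = ""


class BlockedTermFilter:
    """Flags chat text containing blocked terms, with one term list per language.

    Real systems add normalization, context, and ML classifiers; this only
    demonstrates the basic filtering step.
    """

    def __init__(self, blocked_terms: dict[str, list[str]]):
        # Compile one case-insensitive pattern per language.
        self._patterns = {
            lang: re.compile("|".join(re.escape(t) for t in terms), re.IGNORECASE)
            for lang, terms in blocked_terms.items()
        }

    def is_allowed(self, text: str, lang: str = "en") -> bool:
        pattern = self._patterns.get(lang)
        return pattern is None or pattern.search(text) is None


def route_report(report: PlayerReport, review_queue: list[PlayerReport]) -> None:
    """Queue a player report for review by trained human moderators."""
    review_queue.append(report)


if __name__ == "__main__":
    chat_filter = BlockedTermFilter({"en": ["badword", "slur"]})
    print(chat_filter.is_allowed("gg well played"))      # True
    print(chat_filter.is_allowed("you are a badword"))   # False

    queue: list[PlayerReport] = []
    route_report(PlayerReport("player123", "player456", "hate_speech"), queue)
    print(len(queue))  # 1
```

In this shape, automation handles the high-volume first pass (filtering and triage) while trained moderators make the final judgment on queued reports, mirroring the combination of tools and human oversight described above.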