
Xbox Releases Second Transparency Report Demonstrating the Integral Role of Proactive Content Moderation

With a growing community of more than 3 billion players around the world, continuing to invest in trust and safety is critical to fostering a safe and inclusive online environment. Shielding players from harm is an integral part of the Xbox Safety team's work. Players don't often see, or know about, all of the content moderation measures working in the background that help make their experience safer and more welcoming. Today, we are releasing our second Xbox Transparency Report, which details our continued efforts to better protect our players and illustrates our safety measures in action.

Our multifaceted safety suite includes our proactive and reactive moderation efforts, Community Standards, parenting and family controls such as the Xbox Family Settings App, and our continued work with industry partners and regulators. Our critical investments in content moderation combine AI and human-powered technologies to catch and filter out content before it reaches and impacts players. We use a range of measures that give us the scale, speed, and breadth to keep up with the growing interactions and activities of our players. As noted in the Transparency Report, 80% (8.08M) of total enforcements this period were the result of our proactive moderation efforts. The data illustrates the impact of this approach.
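To make the distinction between proactive and reactive moderation concrete, the minimal sketch below shows the general shape of a proactive check: content is scored by an automated classifier and blocked or escalated to a human reviewer before it is ever delivered to other players, rather than after someone reports it. This is an illustrative example only; the classifier, thresholds, and names are hypothetical and do not describe Xbox's actual systems.

# Illustrative sketch only -- not Xbox's moderation stack. The classifier,
# thresholds, and names below are hypothetical.
from dataclasses import dataclass

@dataclass
class ModerationResult:
    allowed: bool              # deliver the content to other players?
    needs_human_review: bool   # borderline content escalated to a human moderator

def score_content(text: str) -> float:
    """Hypothetical stand-in for an automated classifier (text, image, or video model)."""
    blocked_terms = {"example_slur", "example_scam_link"}
    return 1.0 if any(term in text.lower() for term in blocked_terms) else 0.1

def proactive_check(text: str, block_at: float = 0.9, review_at: float = 0.5) -> ModerationResult:
    """Runs before delivery: clear violations are blocked automatically,
    ambiguous content is held for human review, everything else goes through."""
    score = score_content(text)
    if score >= block_at:
        return ModerationResult(allowed=False, needs_human_review=False)
    if score >= review_at:
        return ModerationResult(allowed=False, needs_human_review=True)
    return ModerationResult(allowed=True, needs_human_review=False)

print(proactive_check("gg, nice match!"))                   # allowed
print(proactive_check("free coins at example_scam_link"))   # blocked before it reaches players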

As the needs of players continue to evolve, so do our tools. The safety of our players is a top priority – and to advance safe online experiences, we will continue to invest in innovation, work in close collaboration with industry partners and regulators, and collect feedback from the community. We look forward to sharing more.

Transparency Report Infographic

Among the key takeaways in the report:

  • Proactive measures are a key driver for safer player experiences. In this period, 80% of our total enforcements issued were the result of our proactive moderation efforts. Our proactive moderation approach includes both automated and human measures that filter out content before it reaches players. Automated tools such as Community Sift work across text, video, and images, catching offensive content within milliseconds. In the last year alone, Community Sift assessed 20 billion human interactions on Xbox. Proactive measures also detected and enforced against 100% of account tampering, piracy, phishing, and cheating/inauthentic accounts.
  • Increased focus on inappropriate content. We understand that the needs of our players are constantly evolving, and we continue to listen to player feedback about what is or is not acceptable on the platform, in line with our Community Standards. During this last period, we expanded our definition of vulgar content to include offensive gestures, sexualized content, and crude humor. This type of content is generally viewed as distasteful and inappropriate, detracting from the core gaming experience for many of our players. This policy change, in conjunction with improvements to our image classifiers, has resulted in a 450% increase in enforcements of vulgar content, 90.2% of which were proactively moderated. These enforcements often involve only removing the inappropriate content, which is reflected in the 390% increase in “content-only” enforcements in this time period.
  • Continued emphasis on inauthentic accounts. Our proactive moderation, up 16.5x from the same period last year, allows us to catch negative content and conduct before it reaches players. The Xbox Safety team issued more than 7.51M proactive enforcements against inauthentic accounts, representing 74% of the total enforcements in the reporting period (up from 57% last reporting period). Inauthentic accounts are typically automated or bot-created accounts that create an unlevel playing field and can detract from positive player experiences. We continue to invest in and improve our tech so players can have safe, positive, and inviting experiences.

Around the world, our team continues to work closely with key industry partners to collaborate on our safety approach, including increasing education and improving our safety measures to exceed standards:

  • We enhanced our console family settings in close cooperation with the German age-rating board, USK, to achieve official recognition of our parental controls. With this recognition, Xbox One and Xbox Series X|S are the first consoles with an internet browser to meet the high standards of youth protection programs under §11 JMStV, Germany’s Interstate Treaty on the protection of minors. While the recognition is specific to Germany, the improvements support families and players everywhere, further demonstrating our commitment to safe gaming experiences for all players around the world and our ongoing partnership with regulatory bodies.

Together, we are creating a community where everyone can have fun. Every person, whether a first-time player or a seasoned pro, plays a role in building a more positive and inviting community for all. Player feedback and reporting help us improve our safety features. If you see something inappropriate, please report it – we could not do this without you!

Editor’s Note (May 25): We’ve included additional information about our console family settings accreditation in collaboration with the German age-rating board, USK.