
Xbox Voice Chat Filter and its Implementation

DISCLAIMER: TO BE CLEAR, THIS ARTICLE IS WRITTEN PURELY FROM AN XBOX TECHNOLOGICAL STANDPOINT AND IS NOT MEANT TO GENERATE DISCUSSION ON WHAT IS RIGHT AND WRONG! THIS IS PURELY ABOUT HOW A FILTER SYSTEM WOULD WORK AND ALLOW USERS TO PICK AND CHOOSE WHAT THEY SEE AND HEAR WITHOUT MESSING UP THE ENTIRE SERVICE!

Xbox Chat Filter

Having chat filters in video games, particularly when it comes to explicit/mature language, is nothing new. MMORPGs in particular (World of Warcraft, Star Wars: The Old Republic, etc.) have a setting that will automatically replace words that are considered “foul” with asterisks, unless manually deactivated by the player.
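As a rough illustration, a text filter of this sort boils down to matching words against a blocked list and masking them with asterisks. The sketch below is a minimal, hypothetical version of that idea; the word list and function names are invented for illustration and not taken from any actual game.

```python
# A minimal sketch of the word masking MMO text filters use.
# The word list and names here are hypothetical, not any game's actual code.
import re

FOUL_WORDS = {"darn", "heck"}  # placeholder entries; real lists are far larger

def mask_profanity(message: str) -> str:
    """Replace each flagged word with asterisks of the same length."""
    def mask(match: re.Match) -> str:
        word = match.group(0)
        return "*" * len(word) if word.lower() in FOUL_WORDS else word
    return re.sub(r"[A-Za-z']+", mask, message)

print(mask_profanity("Well heck, that darn boss again"))
# -> "Well ****, that **** boss again"
```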

https://www.youtube.com/watch?v=dsgRrHqNH8U

However, mature language filters have traditionally only affected text chat and have had no impact on voice communication. The standard approach to “filtering” voice chat is either to stop communicating with gamers whose language crosses one’s comfort level or to report them in the hopes of getting them suspended.

Well, Microsoft is seeking to advance the chat filter system, starting with a filter for Xbox Live’s text chat services. That is to say, players will have the option to filter mature language when viewing direct messages on Xbox Live. Moreover, Xbox’s chat filter will evidently have thresholds (child, teen, adult, etc.), with gamers able to choose which level of filtration they prefer to use (the option to allow everything will also be available).
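If the thresholds work the way that description suggests, each message would carry (or be assigned) a severity rating, and each recipient would see it masked or unmasked depending on their own chosen level. Here is a minimal sketch of that model; the level names follow the article, but everything else (the rating, the masking rule, the function names) is assumed for illustration.

```python
# A sketch of how per-user filter thresholds might be modeled. The level names
# follow the article (child, teen, adult, unfiltered); the rest is assumed.
from enum import IntEnum

class FilterLevel(IntEnum):
    CHILD = 0       # strictest: mask anything remotely mature
    TEEN = 1
    ADULT = 2
    UNFILTERED = 3  # the "allow everything" option

def deliver_message(text: str, severity: FilterLevel, recipient_level: FilterLevel) -> str:
    """Mask the message only if its rated severity exceeds what the recipient allows."""
    if severity > recipient_level:
        return "*" * len(text)  # or selectively mask just the offending words
    return text

# The same message can look different to two recipients in the same conversation.
print(deliver_message("mature rant", FilterLevel.ADULT, FilterLevel.TEEN))   # masked
print(deliver_message("mature rant", FilterLevel.ADULT, FilterLevel.ADULT))  # shown
```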

On top of the text filter system, Microsoft will eventually seek a way to filter voice communications, somehow implementing a system that adjusts what one player hears from another based on the listener’s language settings. Exactly how the voice filter system would work is something even Microsoft itself doesn’t know, as this type of technology has never been used in video games before.

Xbox – The Idea

So far, Microsoft has only touched on a couple of ideas regarding how it thinks the system could work. One such idea revolves around tone and emotion, with the filter system picking up on certain “behaviors” that indicate anger or aggression and filtering that kind of speech out of what another gamer hears (again, based on that player’s settings).
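In practice, that idea would presumably mean scoring short chunks of audio for perceived aggression and muting or attenuating them for listeners whose settings exclude that tone. The sketch below is purely speculative and not Microsoft’s actual approach: the “classifier” is a crude loudness proxy standing in for a real speech-emotion model, which conveniently also illustrates how easily such a system could confuse a loud-but-friendly player with an angry one.

```python
# A speculative sketch of tone-based voice filtering. All names and thresholds
# are hypothetical; the aggression score is a crude loudness stand-in.
import math
from dataclasses import dataclass

@dataclass
class ListenerSettings:
    # 0.0 = mute anything remotely aggressive, 1.0 = hear everything (hypothetical scale)
    aggression_threshold: float

def score_aggression(samples: list[int]) -> float:
    """Crude stand-in: treat loudness (RMS of 16-bit samples) as a proxy for aggression.
    A real system would need a trained speech-emotion model; a proxy like this is
    exactly the shortcut that mistakes a loud-but-friendly player for an angry one."""
    if not samples:
        return 0.0
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return min(rms / 32768.0, 1.0)

def filter_frame(samples: list[int], listener: ListenerSettings) -> list[int]:
    """Mute this frame for this one listener; the speaker and other listeners are unaffected."""
    if score_aggression(samples) > listener.aggression_threshold:
        return [0] * len(samples)
    return samples

# A loud frame is silenced for a strict listener but passed through for a lenient one.
loud_frame = [25000, -25000] * 80                            # ~10 ms of very loud audio
print(filter_frame(loud_frame, ListenerSettings(0.5))[:4])   # [0, 0, 0, 0]
print(filter_frame(loud_frame, ListenerSettings(1.0))[:4])   # [25000, -25000, 25000, -25000]
```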

Something like this would require the filter system to detect certain vocal tones on a level so subtle that it’s beyond the average person’s powers of perception, especially when those utterances are made without much, if any, animus.

Implementation Issues

Implementing a filter system for Xbox Live’s text chat won’t be much of a challenge, seeing as this kind of feature is already in wide use. Filtering voice messages or live voice chat, however, is a taller order. It’s not so much about preventing someone from saying whatever they want, but rather about ensuring that someone else simply won’t hear what is being said, based on their own filter settings.
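Architecturally, that points to filtering on the receiving end: the speaker’s audio is left untouched at the source, and each listener’s client decides what to play back according to that listener’s own settings. Below is a small sketch of that model; the class, the per-user flag, and the mute-or-pass rule are all assumptions made for illustration.

```python
# A sketch of receive-side filtering: the speaker is never silenced at the source;
# each listener's client decides what to play back. Names and rules are assumed.
from dataclasses import dataclass

@dataclass
class Listener:
    gamertag: str
    allow_mature_audio: bool  # hypothetical per-user setting

def playback_for(frame: bytes, frame_is_mature: bool, listener: Listener) -> bytes:
    """Return what this particular listener hears for one audio frame."""
    if frame_is_mature and not listener.allow_mature_audio:
        return bytes(len(frame))  # silence, for this listener only
    return frame

# Two listeners in the same party can hear different things from the same speaker.
party = [Listener("StrictPlayer", False), Listener("AnythingGoes", True)]
frame = bytes([1] * 320)  # placeholder 20 ms audio frame
heard = {p.gamertag: playback_for(frame, frame_is_mature=True, listener=p) for p in party}
print(heard["StrictPlayer"] == bytes(320))  # True: filtered copy is silence
print(heard["AnythingGoes"] == frame)       # True: unfiltered copy is untouched
```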

Many bugs could crop up in a system like voice chat filtering, such as a massive, accidental muting of everyone due to coding errors or other flaws in the system. The system mistaking one tone or emotion for another simply because someone is loud could also undermine such an update.

In addition, what happens when the filtration algorithm, thrown off by someone’s accent, decides they used words or conveyed emotions they never actually did? Humans have a hard enough time understanding each other, especially when communicating with someone who isn’t fluent in a given tongue. If people have difficulty deciphering each other, how can a machine hope to do it properly?

Moreover, what about individuals with speech impediments or other conditions that may affect how they communicate? For example, someone with a lisp or a stutter, or someone on the autism spectrum, may communicate in ways the system doesn’t expect. Will Xbox’s voice chat be able to account for such things and not mistake them for filterable behavior?

Inaccurate Preferences

Some of the questions Microsoft will have to answer when updating its chat filter systems, especially for voice, involve filter preferences not functioning correctly. For example, what happens when someone sets their preferences to filter out only “mature” messages but, thanks to a faulty algorithm, ends up missing teen-level communication as well?

Moreover, how does Microsoft define the thresholds for what messages are inappropriate at each filter level? Does mature content encompass only explicit language, or does it also cover high levels of perceived anger? These things are generally easier to adjust in text messages, but voice chat is an entirely different beast, especially when a party chat consists almost entirely of a group of friends who share similar personalities.

Time Frame

Keep in mind that this kind of technology takes time to develop and implement. Amazon’s Alexa took years to develop before its initial release in November 2014, and even then the languages in which users could communicate with it were severely limited. The list of supported languages has grown since the device’s inception, but it took a few years to get there.

A voice chat filter system for Xbox could take just as long, or longer, to develop and implement. The number of bugs could be astronomical, especially as Microsoft tries to deal with the hurdles mentioned above.

If Microsoft wants what it considers an effective system, it hopefully won’t rush the project and leave some of these potential, irritating shortcomings unaddressed. Current communication filters still miss many things they’re meant to catch, which is why there’s a failsafe: players report the issues the automated systems miss.

The Future

Whether this voice chat filter system comes to fruition on the current generation of Xbox or makes its debut on Project Scarlett, Microsoft has quite the task in front of it in creating an automated voice chat monitor that works effectively. If the goal is to drive the system by user settings and affect only what people hear, not what they say, then conceptually it won’t be much different from modern text filters.

However, with humans being fallible and code being error-prone, a highly buggy system is a likely result. With any luck, Microsoft will take its time researching and developing the project and won’t release it until it has been thoroughly tested against enough user feedback, if it releases it at all.
