Valorant devs to keep a closer eye on what goes on in the chat.
Ask Valorant is truly the best insight into the minds of Valorant's developers, and the November 5 edition delivered, with some hard-hitting questions followed up by well-thought-out answers.
The Valorant dev team knows it has a lot of work to do when it comes to patch stability and fostering a safe and friendly in-game environment. Unfortunately, online gaming in 2020 can be extremely toxic.
Valorant players have often found themselves reporting the most toxic of players, but to no avail. Now, Valorant seems to be cracking down much harder on those who abuse in-game comms and text chat.
What is Valorant planning to do to counter this issue?
I know you now have Forced Name Change to punish players who violate the Code of Conduct with an offensive name, but what about dealing with the awful stuff in the chat?
We knew that chat would be a problem before launch and our new systems weren't going to be ready, so while we worked on finishing that, we put an interim system in place that has been running since launch.
Today we issued chat and voice comms restrictions to the worst of the worst, focusing on language and behavior that’s so obviously bad that it’s easy to detect. We’ve recently deployed a new chat evaluation system that is running in English, which means you might run into more players who’ve received text or voice restrictions. Over the upcoming months, we’re expanding our supported languages and improving our detection so it does more than catch only the most egregious offenders.
It’s important to remember that your words have a meaningful impact on your teammates, and what you say in the confines of a match can have lasting effects on the people you’ve said it to. Hate speech, slurs, and any type of threat or bullying is not welcome in our games. Players that receive chat restrictions are reminded to use in-game comms constructively—we win and lose together!
For the rest of you: Keep reporting players who aren’t following the Code of Conduct. Each report is sent into our behavior systems to review, and having the reports helps us continue to improve our machine learning models and get better & better at dealing with disruptive behavior.
It can be hard for any game, especially one as big as Valorant, to monitor voice comms. It is just a sad truth that players can typically get away with slurs, hate speech, and generally nasty remarks toward fellow players. Text chat is much easier to keep an eye on, provided there is a way to log it. Logging voice comms, however, is nearly impossible without some players feeling their privacy is being invaded.
Thankfully, Riot Games is taking the reporting system in Valorant seriously. The point of being able to speak in-game is to communicate with teammates in order to coordinate a victory. It can, though, devolve into a shouting match full of negativity.
It seems the Valorant dev team knew this would happen before they even launched the game. It's a good sign that they are continuing their work on making Valorant a safe game for all to enjoy, and have started removing the obvious offenders.

Published 06 Nov 2020, 02:13 IST