06.06.2024

Managing toxicity in online environments, today and in the years ahead

At Reboot Develop Blue 2024, Unity's Micaela Hays took to the stage to discuss the growing issue of online toxicity, emphasizing its ethical and business implications.

She highlighted that between 2021 and 2023, the percentage of individuals encountering toxic behavior rose from 68% to 76%. Additionally, 49% of gamers have steered clear of certain titles due to toxicity. During a subsequent interview, she shared insights into why toxic behavior has been escalating.

"The rise in toxicity can be attributed to increased gaming during the COVID-19 pandemic," Hays remarked. "As people resumed their normal routines, the gaming community's growth leveled off, but the impact remained."

The lockdowns significantly affected mental health and altered people's empathy levels. Hays noted that the lack of face-to-face interaction, especially among young people who missed crucial developmental years, contributed to more self-centered behavior.

"Isolation during the pandemic stripped away some of the humanity in interactions," she explained. "This anonymity behind a screen has emboldened people."

"Moderation sees one of the highest turnovers in the industry because they face obscene language and terrible threats on a daily basis. The human psyche can only take so much before asking if it's worth it"

Research also indicates that toxic behavior isn't just a minority issue. "Toxicity is widespread," Hays said.

Hays suggested that the roots of the problem might extend back to the early 2000s when anti-toxicity efforts were minimal.

"When I began gaming, toxicity was just accepted as part of the experience," she recalled. "This entrenched behavior perpetuates the cycle because new players accept it as normal."

Changing these entrenched behaviors is challenging. "For a long time, people believed toxicity was just part of gaming," Hays said. "Fortunately, more companies are taking responsibility for creating healthier environments."

"More companies understand the onus is on them to do something about the environments they create for players"

Hays offered a glimmer of hope: data shows that 96% of gamers want to combat toxicity, with many willing to pay more for a positive gaming environment.

Addressing this issue involves compromises. Hays discussed various moderation methods, such as reporting systems and speech-to-text transcriptions, which have their own challenges. Additionally, the context of behavior is crucial; what's acceptable among friends in one game might be inappropriate in another.

Her background as a high school math teacher influenced her approach. She moved from community management to business development at Hi-Rez Studios, eventually specializing in community safety at Unity, focusing on in-game communication through the Vivox service.

Educating players, rather than just punishing them, is key. Some games encourage non-toxic behavior by gating certain rewards so that only players who avoid bans can earn them.

One challenge is the high turnover in moderation roles, which involve daily exposure to offensive language. Hays explained that these roles are often underpaid and under-appreciated, leading to burnout.

The solution presented in her talk was Unity's Safe Voice, a tool that combines machine learning with transcription and tonal analysis to manage toxicity. It tracks player behavior, such as muting patterns and reactions to specific actions.
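
The article doesn't detail how Safe Voice weighs these signals, but the general approach it describes — blending a transcript, a tonal-analysis score, and behavioral reactions such as muting into a single review priority — can be sketched in a few lines. The class, weights, and flagged-term placeholders below are hypothetical illustrations, not Unity's implementation.

```python
from dataclasses import dataclass

@dataclass
class VoiceClipSignals:
    """Signals a hypothetical pipeline might extract from one voice clip."""
    transcript: str        # output of speech-to-text
    hostility_tone: float  # 0.0-1.0 score from a tonal-analysis model
    listeners_muted: int   # how many listeners muted the speaker afterwards
    listeners_total: int   # how many listeners heard the clip

# Terms a studio might flag for review; placeholders, not a real blocklist.
FLAGGED_TERMS = {"flagged_term_a", "flagged_term_b"}

def incident_score(clip: VoiceClipSignals) -> float:
    """Blend text, tone, and listener reactions into one review-priority score."""
    text_hit = any(term in clip.transcript.lower() for term in FLAGGED_TERMS)
    mute_rate = clip.listeners_muted / max(clip.listeners_total, 1)
    # Tone and listener reaction count even when no flagged term appears,
    # which is what separates this from plain transcription.
    return 0.4 * clip.hostility_tone + 0.4 * mute_rate + 0.2 * float(text_hit)

clip = VoiceClipSignals("you are terrible at this", hostility_tone=0.8,
                        listeners_muted=3, listeners_total=4)
print(round(incident_score(clip), 2))  # 0.62 -> likely queued for a human moderator
```

The point of blending signals is that a hostile tone or a wave of mutes can surface a clip for review even when nothing in the transcript matches a word list.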

The moderation needs of a child-centric title like Roblox will differ from those of a game with a more mature audience, such as Call of Duty

Safe Voice entered closed beta last July and has now progressed to open beta. The system is designed to supply the detailed context that is often missing and to present it in an easily digestible format.

"This is machine learning at its finest," Hays remarked. "It serves to protect not only the players but also the moderators."

When asked about the potential for misinterpreting context, nuances, or jokes through automation, Hays emphasized that Safe Voice excels in this domain. The system is highly adaptable, allowing studios to configure it according to the specific behaviors deemed acceptable within their communities.

"For example, games come with ratings," Hays explained. "For Mature or 17+ games, certain obscenities are more commonplace and don't necessarily equate to toxicity." In these contexts, she added, "words like 'damn' or 'shit' might be used neutrally as part of everyday language."

"[Fully automated moderation] could happen in future, but some studios want to some sort of human check and I'm all for that as well"

Hays does not believe that permitting specific language in certain communities will deter potential new players, as these players "are adults and should grasp the contextual use of such language, especially in contrast to games with young user bases like Roblox, where most players are aged 7 to 14."

Merely focusing on language and tone is insufficient; the system must also interpret subtle nuances and the varying meanings of the same word across different regions. When queried about this, Hays agreed, noting that tonal analysis is critical. Languages like Portuguese, Spanish, and English have distinct variations that make certain words inappropriate in one country but acceptable in another.

"It’s not just about what you say and how you say it, although those are crucial elements. It’s also important to consider the listener’s perception. Understanding whether what I said offended you based on your tone is taken into account."

This aspect sets Safe Voice apart from basic transcription services, which can miss the context that makes a statement offensive or inoffensive. And it works in both directions.
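
A minimal sketch of that "both directions" idea follows: the same base score can be raised or lowered depending on how the listener responded. The reaction signals and adjustments here are hypothetical stand-ins, not how Safe Voice actually models listener perception.

```python
# The same phrase can be banter or harassment depending on how the other
# party reacts; this toy function adjusts a clip's priority accordingly.
def adjust_for_listener(base_score: float, listener_laughed: bool,
                        listener_muted: bool, listener_reported: bool) -> float:
    """Raise or lower a clip's review priority using the listener's response."""
    score = base_score
    if listener_laughed:
        score *= 0.5   # likely banter between friends: de-escalate
    if listener_muted:
        score += 0.2   # the target opted out of hearing more: escalate
    if listener_reported:
        score += 0.4   # an explicit complaint outweighs tone alone
    return min(score, 1.0)

print(adjust_for_listener(0.6, listener_laughed=True,  listener_muted=False, listener_reported=False))  # 0.3
print(adjust_for_listener(0.6, listener_laughed=False, listener_muted=True,  listener_reported=True))   # 1.0
```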

Even though AI handles much of the preliminary analysis, human oversight is required to review selected data, assess it, and take action. Could full automation of this process be achieved in the future?

"It's a possibility down the line," Hays stated. "Achieving full automation would be fantastic, but I believe studios still want some level of human verification, and I support that."

gamesindustry.biz