Even cozy games can get toxic

It's not just big-name shooters that are turning to GGWP for AI-based moderation.

Former professional esports player Dennis Fong founded GGWP in 2022, more than a year before companies like Microsoft and Google debuted their natural-language search engines and the AI revolution officially gripped the globe. GGWP is an AI-powered moderation system that identifies and takes action against in-game harassment and hate speech, and after two years on the scene, it’s now integrated into titles at more than 25 studios.

Fong may be a veteran of the Doom and Quake esports scenes, but he’s interested in protecting players from abuse in every genre, especially as social features become easier for studios of all sizes to implement. GGWP is live in thatgamecompany’s social adventure title Sky: Children of the Light, the meditation app TRIPP VR, the kids-focused MMO Toontown Rewritten, the first-person MOBA Predecessor, Fatshark’s action shooter Warhammer 40,000: Darktide and the metaverse platform The Sandbox, and it powers Unity’s anti-abuse toolset.

These aren’t all gritty military sims or hardcore competitive franchises like Counter-Strike or League of Legends, where you might expect emotional outbursts and increased toxicity. One-third of the games that utilize GGWP are co-op and PvE experiences, rather than competitive PvP settings, according to Fong. Turns out, cozy games need moderation too.

“Cozy games tend to see a lot more chat activity when compared to competitive games, so naturally there tend to be far more incidents that are chat-related as compared to gameplay,” Fong said. “That said, users are clever and are always discovering new ways to turn something intended to be positive, like a ‘thank you’ emote, into something negative by using it after a player makes a mistake. We help companies understand what’s happening and then implement tools to help curb that behavior.”

GGWP’s Unity partnership is particularly notable, if only because of its potential scale. GGWP powers Unity’s Safe Text and Safe Voice products (the latter covering Unity’s Vivox voice chat system), and it’s integrated into the uDash dashboard. Unity developers can activate GGWP in their games with a click and have billing handled through their existing Unity partnerships.

Outside of Unity, it takes just a few lines of code to activate GGWP in a game. There’s a free tier that allows studios to try out the system, and a self-service portal for the truly independent developer. Custom contracts for larger titles aside, GGWP charges based on the volume of API calls a game generates.
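The article doesn’t show GGWP’s actual SDK, so as a purely illustrative sketch, an integration along these lines might gate each outgoing chat message on a moderation verdict before it reaches the lobby. Every name below (ModerationClient, check_chat, Verdict) is a hypothetical stand-in, and the blocklist check stands in for what would really be a per-message API call — the kind of call volume GGWP bills on.

```python
# Illustrative only: these class and method names are hypothetical,
# not GGWP's real SDK. A real client would make an HTTP request here.
from dataclasses import dataclass


@dataclass
class Verdict:
    allowed: bool
    reason: str = ""


class ModerationClient:
    """Stand-in moderation client. A real integration would send the
    message to a moderation API and be billed per call."""

    # Toy placeholder phrases standing in for a trained toxicity model.
    BLOCKLIST = {"uninstall", "noob trash"}

    def check_chat(self, player_id: str, text: str) -> Verdict:
        lowered = text.lower()
        if any(phrase in lowered for phrase in self.BLOCKLIST):
            return Verdict(False, "abusive language")
        return Verdict(True)


def send_chat(client: ModerationClient, player_id: str, text: str) -> bool:
    """Gate every outgoing message on the moderation verdict."""
    verdict = client.check_chat(player_id, text)
    if not verdict.allowed:
        # Drop the message; a real system might also log the incident
        # or feed it into the player's behavior history.
        return False
    # ...broadcast to the lobby here...
    return True
```

The point of the pattern, not the specific names, is what matters: moderation sits inline on the chat path, so abusive messages can be blocked before other players ever see them.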

“There are companies that do a subset of what we do, but we’re the only comprehensive platform for positive play,” Fong said.

In-game moderation is a massive problem for any game with a social feature, and the bigger the audience, the more harassment there is to sift through. One studio executive told Fong in 2022 that their game received more than 200 million player-submitted reports in just one year, and this volume was common among popular online titles. During his research phase, Fong found that most AAA studios addressed just 0.1 percent of all reports they received annually, and some had anti-toxicity teams of fewer than 10 people.

GGWP exists because most game companies, even the largest ones, are awful at moderating their spaces. Clicking the “report” button in many games feels like sending a strongly worded letter to a trash incinerator inside a black hole. Here’s how Fong described it to Engadget in 2022:

“I'm not gonna name names, but some of the biggest games in the world were like, you know, honestly it does go nowhere. It goes to an inbox that no one looks at. You feel that as a gamer, right? You feel despondent because you’re like, I’ve reported the same guy 15 times and nothing’s happened.”

GGWP has blocked hundreds of millions of abusive messages and now protects billions of user interactions every month. Games that use the system have seen a 65 percent reduction in toxic behavior and a 15 percent improvement in player retention; in other words, GGWP prevents harassment before it happens, and that helps players feel comfortable enough to keep coming back.