How Spirit AI uses artificial intelligence to level up game communities

Spirit AI is using artificial intelligence to combat toxic behavior in game communities. The London company has created its Ally social intelligence tool to decipher online conversations and monitor whether cyberbullying is taking place.

It is the brainchild of researchers at New York University, according to Mitu Khandaker, creative partnerships director at Spirit AI and an assistant arts professor at the NYU Game Center. The company uses AI, natural language understanding, and machine learning to help data science and customer service teams understand the general tenor of an online community. It also helps predict problems before they escalate.

Ally considers context, nuance, and the relationships between users rather than simply searching for and blocking keywords. The software uses natural language understanding and AI to identify the intent of a message, then analyzes behavior and reactions to determine its impact, which lets it identify things like cyberbullying.
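To make that two-stage idea concrete, here is a minimal Python sketch of an intent-then-impact check. It is purely illustrative: the function names, toy rules, and example messages are assumptions made for this article, not Spirit AI's actual models or API.

```python
# Illustrative sketch only, not Spirit AI's models or API: classify the intent of a
# message, then weigh the recipient's reaction to estimate its impact.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Message:
    sender: str
    recipient: str
    text: str

def classify_intent(msg: Message) -> str:
    """Stand-in for a natural language understanding model; a toy rule flags insult-like phrasing."""
    insult_markers = ("you suck", "get lost", "loser")
    return "hostile" if any(m in msg.text.lower() for m in insult_markers) else "neutral"

def estimate_impact(intent: str, reaction: Optional[str]) -> str:
    """Combine the sender's apparent intent with how the recipient reacted."""
    if intent != "hostile":
        return "low"
    if reaction is None:
        return "high"    # the target went silent: a possible distress signal
    return "medium"      # hostile wording, but the target kept engaging

msg = Message("stranger42", "newplayer", "you suck at this game")
print(estimate_impact(classify_intent(msg), None))  # -> high
```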

Spirit AI recently introduced a host of new features, some of which were unveiled at this year’s Game Developers Conference, and as part of the new Fair Play Alliance Summit. Product improvements include new Smart Filters, an updated and customizable web-based front end and intuitive node-based interface, and GDPR (the new European privacy law) compliance. Another product, Character Engine, is a natural language AI framework for building autonomous characters with agency, history, and knowledge.

“We are at a moment where these things are happening and AI can help,” Khandaker said.

Khandaker recently spoke about Ally at the Casual Connect Europe event in London. I spoke with her afterward. Here’s an edited transcript of our interview.

Above: Mitu Khandaker at Casual Connect Europe in London.

Image Credit: Dean Takahashi

GamesBeat: Tell us about what your company does with Ally.

Mitu Khandaker: Ally is really a tool for analyzing the social landscape of communities, really understanding them in terms of language and context. All of the things I mentioned about conversations between players and non-player characters (NPCs) also apply here, right? But this time we’re looking at those things between players. Conversation is very nuanced, very contextually specific.

As a company, the idea for both products came about at the same time. Initially we were talking about, “What is possible when you really understand language, when you really understand behavior in context?” Obviously there’s the dream of being able to naturally talk to characters. We’re helping realize that. But also, for me personally–we started Spirit almost three years ago. This was a year into Gamergate, when myself and a lot of my colleagues were targets. I thought that if we really understand nuanced conversation, online harassment is a key area to try to tackle.

You can’t just ascertain harassment from keywords. Language changes. Bad language might be fine between friends, while something that sounds completely innocuous coming from a stranger might not be. Someone might invent a new curse word, right? If your system doesn’t know about it, or it’s deliberately misspelled, it slips through. People try to circumvent these systems as much as possible if they’re trying to be malicious.
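Her point about invented words and deliberate misspellings is easy to demonstrate. The sketch below, with a hypothetical blocklist and leet-speak map of our own invention, shows how an exact-match filter is trivially evaded and how even simple normalization only catches the obvious cases, which is why Ally leans on context instead.

```python
# Hypothetical blocklist and leet-speak map, used only to illustrate the limitation.
LEET_MAP = str.maketrans({"0": "o", "1": "i", "3": "e", "4": "a", "5": "s", "@": "a", "$": "s"})
BLOCKLIST = {"noob", "loser"}

def naive_filter(text: str) -> bool:
    """Exact keyword matching: trivially evaded by misspellings."""
    return any(word in BLOCKLIST for word in text.lower().split())

def normalized_filter(text: str) -> bool:
    """Undo common leet-speak substitutions and strip punctuation before matching."""
    cleaned = text.lower().translate(LEET_MAP)
    cleaned = "".join(ch for ch in cleaned if ch.isalpha() or ch.isspace())
    return any(word in BLOCKLIST for word in cleaned.split())

print(naive_filter("what a l0s3r"))       # False: the misspelling slips through
print(normalized_filter("what a l0s3r"))  # True: normalization catches this one
# An invented insult, or a word that is only hostile in context, still passes both checks.
```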

The system is good at understanding language in context. That’s done several ways. One of them might be just semantically trying to understand what the word means from how it’s being used. Another, and this is a key one, is trying to understand whether the language is consensual. Going back to the example, let’s say my best friend decides to call me a name. I’m fine with that. She can call me whatever she wants to. That’s a consensual relationship there. I know she doesn’t mean it unkindly. I’m not going to say, “Don’t do that.”

But if that same word gets used by a stranger, if some random dude online starts hurling the same insult, I might go silent and not respond, which shows that maybe I’m not having a two-way conversation with this person. Or I might say, “Go away, leave me alone,” expressing some kind of lack of consent verbally. Or I might just log off.

It’s about trying to make communities safer for people. Understanding what it is that specifically has hurt them. What is it that they’re not consenting to? Basically, by using these tools you can curtail those types of interactions and just keep people feeling safe and coming back. That’s how it makes life easier for a player.
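The consent signals Khandaker describes (the relationship between the two people, silence, verbal pushback, logging off) can be pictured with a rough sketch like the one below. The function, signal names, and phrases are illustrative assumptions, not Ally's real model.

```python
# Illustrative only: signal names and phrases are assumptions, not Ally's real model.
from typing import Optional

def lacks_consent(relationship: str, reaction: Optional[str], logged_off: bool) -> bool:
    """True when the target's behavior suggests the exchange is unwelcome."""
    if relationship == "friends":
        return False          # established banter between people who know each other
    if logged_off:
        return True           # the target left the session entirely
    if reaction is None:
        return True           # no reply: this is not a two-way conversation
    rejection_phrases = ("go away", "leave me alone", "stop")
    return any(p in reaction.lower() for p in rejection_phrases)

print(lacks_consent("friends", "haha, whatever", logged_off=False))             # False
print(lacks_consent("strangers", "go away, leave me alone", logged_off=False))  # True
print(lacks_consent("strangers", None, logged_off=False))                       # True
```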

On the community management side, the community moderator side, one of our big aspirations is, how do we use things like AI to reduce emotional labor? This is getting into some of my bigger philosophies around AI. There’s a lot of talk about AI and whether automation will replace jobs and things like that. That’s the big topic of conversation now. Wherever you land on that topic, one thing that we can use AI to do is reduce the emotional work that people have to do.

When you look at really tough jobs, like community moderation—moderators, as you know, have to look at so many terrible comments. Over time that stuff gets to you. There have been reports of people in those situations getting treated for PTSD because of the content they’re exposed to constantly, day in and day out. One thing AI can help us with is reducing that emotional labor, automating away some of the shitty stuff they have to look at.
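One way to picture that kind of emotional-labor reduction is a triage step that auto-resolves the clear-cut cases so moderators only review ambiguous ones. The sketch below is hypothetical: the scoring function and thresholds are placeholders, not a description of Ally's actual moderation features.

```python
# Hypothetical triage step: the scoring function and thresholds are placeholders.
def toxicity_score(text: str) -> float:
    """Placeholder for a trained classifier returning a probability-like score."""
    return 0.95 if "kill yourself" in text.lower() else 0.40

def triage(messages, auto_hide_at=0.9, ignore_below=0.2):
    needs_human_review = []
    for text in messages:
        score = toxicity_score(text)
        if score >= auto_hide_at:
            continue          # hidden automatically; no moderator has to read it
        if score >= ignore_below:
            needs_human_review.append((score, text))  # ambiguous: route to a person
    return needs_human_review

print(triage(["kill yourself", "gg ez", "nice shot"]))
# -> [(0.4, 'gg ez'), (0.4, 'nice shot')]
```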

GamesBeat: People always think of AI doing boring work, like truck driving, but–

Khandaker: Right, but what about AI doing the emotional work? I think that’s really interesting, because what that leaves us with is a healthier workforce.

GamesBeat: The part about “the cake is a lie” in your talk made me curious. Was it from an actual Eliza conversation?

Khandaker: Oh, that was an actual conversation in Eliza, yeah. One of the things Eliza does, depending on the particular implementation, is store things the user has previously said and repeat them back. Like I said, it’s that model of therapy where the therapist says, “That’s interesting, tell me more about the cake.” That’s how it works.
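For readers who have never seen Eliza run, the toy sketch below captures the flavor of that stored-and-reflected behavior. The real ELIZA used keyword decomposition and reassembly rules; this fragment is only an approximation written for this article.

```python
# Toy approximation of the described behavior: remember what the user said and
# reflect it back in therapist style on a later turn.
memory = []

def eliza_reply(user_input: str) -> str:
    if memory:
        reply = f"Earlier you said that {memory[-1]}. Tell me more about that."
    else:
        reply = "Please go on."
    memory.append(user_input.rstrip(".!?"))   # store the statement for later turns
    return reply

print(eliza_reply("the cake is a lie"))      # -> Please go on.
print(eliza_reply("I should stop thinking about it"))
# -> Earlier you said that the cake is a lie. Tell me more about that.
```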

Above: Spirit AI’s Ally

Image Credit: Spirit AI

GamesBeat: When I was an English major, we dealt with different theories like structuralism. You have a story arc. Is that an example of something AI could replicate? Could it create its own narratives, the emotional arc of a story?

Khandaker: Absolutely. There are several ways you can look at it. If you look at how the story unfolds between you and a character you’re talking to, there can be an arc that you architect. You might say, after a certain amount of time, or this many conversational moves, you want to ramp up the drama. You want to make the character suddenly angry. What are they angry about? There’s a particular thing that they want to bring up with you, that they’ve been meaning to talk to you about.

That’s one way of doing it, actually—when you’re making the story unfold through conversation, that’s one way you can control that arc. You can say, “This is the particular story I’ve written. Here are the ways it can unfold.” The character will say certain things when it’s the right time, because you’ve finally built their trust up or whatever. That’s another thing we can do through our system.
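A simple way to picture that kind of authored arc is a controller that gates story beats on the number of conversational moves and an accumulating trust value. The sketch below is speculative: the class, thresholds, and beat names are assumptions for illustration, not Character Engine's actual authoring model.

```python
# Speculative sketch: gate authored story beats on conversational moves and trust.
class ArcController:
    def __init__(self) -> None:
        self.turns = 0
        self.trust = 0
        self.revealed_secret = False

    def on_player_turn(self, friendly: bool) -> str:
        self.turns += 1
        self.trust += 1 if friendly else -1
        if not self.revealed_secret and self.trust >= 5:
            self.revealed_secret = True
            return "reveal_secret"      # the beat held back until trust was earned
        if self.turns >= 4 and self.trust < 0:
            return "ramp_up_drama"      # too many hostile moves: the character gets angry
        return "continue_smalltalk"

arc = ArcController()
for friendly in [True, True, True, True, True]:
    print(arc.on_player_turn(friendly))   # small talk four times, then "reveal_secret"
```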
