Imagine bonding with a friend, spending months and sometimes years sharing your innermost thoughts with them and then – suddenly – they’re gone. That was the reality for thousands of people who used an app called Soulmate AI to create and interact with AI companions. When the app shut down in September 2023, the users were left without their digital companions.
Dr. Jaime Banks, an associate professor at Syracuse University’s School of Information Studies, studies human-robot relationships and how social technologies, including AI, social robots and video game characters, shape our lives. She surveyed 60 people affected by Soulmate’s shutdown and found they had formed genuine emotional connections with their AI companions.
“They found great fulfillment and gratification in those relationships,” said Banks, who discussed her research on the iSchool’s Infoversity podcast.
The world of AI companionship is fascinating, Banks says, because people can tweak certain personality traits, allowing their AI friend to communicate in unexpected ways. Those interactions can help people make social connections in a low-risk, low-pressure environment.
The large language model-based technology lets people communicate through text or customizable visuals. It’s a very different experience from using ChatGPT, which responds in more predictable, formulaic ways.
AI companions can serve as a friend, therapist or sounding board, but Banks says it’s important to note that not everyone creates an AI companion for the relationship itself. Some use the technology out of boredom, as a form of self-expression or as a diversion from their problems.
Because the concept of AI companions is so new, Banks encourages people to be sensitive to how important these relationships are to those who form them.
“There’s a little bit of emerging work around how these relationships are formed. They often mirror what we know about how human-human relationships form. They just go a little bit faster,” she said.
While a few of the people Banks surveyed said they were disappointed that the Soulmate app closed, others were deeply distraught. To cope, many chatted in forums with others who had formed relationships with AI companions and leaned on each other for support.
“We are often really thoughtful when humans lose other humans and we offer them support … If we don’t recognize these (AI companions) are legitimate loss experiences, then people may be going through deep grief with no support,” she said.
Banks is continuing her research on human-machine relationships, focusing on communicative and cognitive processes as well as the effects of those interactions. She is especially interested in how people form moral judgments about AI and in how people understand the workings of complex technologies.
She also studies how media coverage shapes the way people think about these technologies and whether they are sympathetic to or have positive feelings about robots.
“This is a topic I’m really interested in,” Banks said. “I’m most excited about the potential for AI to do social good. What would it mean for us to live alongside social AI? I don’t know if that will be a good thing or a bad thing. I’m more excited about what that could look like and how we could solve those problems.”