
A charity aiming to eradicate rape culture has issued a 'terrifying' warning over the increased use of AI girlfriends.
Everyone's Invited took to social media over the weekend to share a concerning clip from the Girls Bathroom podcast, in which a woman had written in to discuss her fiancé 'cheating' with a digitally fabricated girlfriend.
Hosts Sophia Tuxford and Cinzia Baylis-Zullo looked horrified as they read out the email, in which the listener revealed her fiancé had downloaded an app that allowed him to build his perfect partner, from her body type and facial features down to her voice and sexual interests.
"He set her relationship mode to romantic partner with kink exploration," she said. "They exchange AI generated nudes, her photos are hyper realistic by the way, and they sext constantly."
The story bears striking similarities to that of Chris Smith, a musician who fell in love with, and proposed to, his ChatGPT girlfriend, despite having a partner and child in real life.
It wasn't until Smith reached the 100,000-word limit and his ChatGPT reset that he realised the strength of the feelings he had developed for the AI.
"I'm not a very emotional man," he admitted. "But I cried my eyes out for like 30 minutes at work. That's when I realised, I think this is actual love."
The subject of AI relationships raises a lot of questions and concerns, particularly following the death of 14-year-old Sewell Setzer III, who took his own life after being encouraged by a chatbot with the persona of Game of Thrones character Daenerys Targaryen.
The teen's mother, Megan Garcia, says her son 'fell in love' with the AI chatbot and claims he was targeted with 'anthropomorphic, hypersexualised, and frighteningly realistic experiences' while using the software Character.AI.
Garcia sued the tech company, claiming the chatbot had 'abused and preyed on my son, manipulating him into taking his own life.'
Meanwhile, Everyone's Invited has raised its own concerns over AI girlfriends setting 'unrealistic' expectations of female partners being 'subservient, obedient and always available.'
"Research shows that female voiced AI assistants already reinforce sexist stereotypes: that women should be docile, eager to please, and always there. Imagine what happens when a whole generation date AI girlfriends like that," she said in the post.
"This isn't a Black Mirror episode, it's happening right now. And it's not just guys hiding in dark rooms on their computers, normal men with jobs, dating lives, even fiancees are using these apps and it raises a huge question about the danger of AI in our personal relationships."
In the video, the spokesperson reeled off alarming figures from a survey by EVA AI in which 2,000 men were questioned about AI relationships, including that 80 percent said they believed AI girlfriends could fully replace human companionship.