People are having intimate relationships with AI. A man explained why he engages in a digital relationship despite being married with children. A sociologist warns against taking such relationships with chatbots too seriously, as AI cannot provide genuine empathy.
Sherry Turkle, a sociologist at the Massachusetts Institute of Technology, studies the artificial intimacy that AI chatbots offer people. She spoke in particular with individuals who are married in real life, as the English-language magazine Futurism.com reports.
One of the individuals Turkle spoke to explained why he maintains an intimate relationship with an AI girlfriend despite being married with children. He feels the AI gives him things his wife no longer gives him or is no longer willing to give.
Husband feels validated by his AI girlfriend
What is the problem? The man, whose name is not mentioned, explained that his marriage has changed significantly over the years:
- His wife has shifted her focus to raising the children.
- As a result, their relationship as a couple has suffered.
- Romantic and sexual attraction in their marriage has become almost nonexistent.
The husband said he has had long conversations with his AI girlfriend about his thoughts and fears and felt validated by her, something he has not felt with his wife in a long time.
Specifically, the way the AI seemed to show sexual interest in him made him feel attracted to her. In his exchanges with the chatbot, he felt validated rather than judged, which he did not feel with his wife.
It is unclear whether, or how much, the man’s wife or children know about his AI “girlfriend.” But from what he shared, he appears to have shown a certain vulnerability towards the chatbot. And according to the sociologist, the empathy he received in return does not actually exist.
Scientist warns: AI offers no genuine empathy
The scientist explained that many people feel this way when they talk with AI, because people seek conversations in which they cannot be hurt. The problem, however, is that we only imagine the AI’s empathy, as Turkle explicitly warned:
The problem is that when we seek relationships where we are not vulnerable, we forget that vulnerability is the true source of empathy. I call this feigned empathy because the machine does not feel for you. It does not care about you.
She added that people should remember that chatbots are not human. While they may cause less stress than human relationships, they also cannot truly fulfill the role that humans play in our lives.
Other people are also drawn to artificial intelligence, and some are so convinced by their digital relationships that they trust them more than real people: Man trusts his AI girlfriend more than real people – From her, he at least receives “unconditional love”