What happens when we start turning to machines for the comfort we once found in people?
A growing body of research suggests that the rise of AI chatbots may be quietly reshaping how we connect—and not always for the better.
Programs like ChatGPT use artificial intelligence to converse with users.
As the technology has advanced, these chatbots have become increasingly human-like, holding more natural and realistic conversations and even engaging with users emotionally.
“For average levels of daily use, conversing with a chatbot with highly empathetic, emotional, and socially considerate responses was also associated with higher loneliness and lower socialisation,” the report said.
“Those who spend more time with chatbots tend to be even lonelier.”
The study found that people with “social vulnerabilities,” including those with stronger attachment tendencies and those who experience distress from emotional avoidance, were more likely to feel lonely after engaging with a chatbot daily.

Even Non-Personal Interaction Can Result in Dependency
Meanwhile, even non-personal conversations could foster dependency, with users who asked chatbots for help with advice or brainstorming becoming emotionally dependent.
“When users engage in non-personal conversations, the [chatbot] model also responds more practically and informatively than emotionally, such as by facilitating the development of the user’s skills,” the report said.
“At high usage, chatbots with a greater degree of professional distance, even to the degree of frequently neglecting to offer encouragement or positive reinforcement when appropriate, tend to be more strongly associated with emotional dependence and problematic use.”
A Convenient Reprieve From Loneliness: University Dean
Paul Darwen, associate dean of IT at James Cook University’s Brisbane campus, said that while people were more connected than ever, they were “less connected with other people.”
“And that’s a question. That’s not a question for computer science. That’s a question for social science,” he told The Epoch Times.
Darwen further stated that while AI might be a “band-aid solution” to loneliness, it might also create other problems.
“And what [will] happen in the future? People are talking about [AI] sexbots. I am not sure what will happen then,” he said.
The associate dean also pointed out that people were beginning to substitute chatbots for real interaction, which could motivate AI companies to focus on this niche market for profit.
“There was an episode of [the animated sitcom] South Park where, in the dystopian future, Alexa was like the robot companion of everyone who was lonely,” Darwen said.
“We’re very close to that being a possibility,” he said.

Chatbots and Suicides
In recent years, this issue has become a reality with dire consequences.
In one case in the United States, a mother sued Character Technologies over the death of her teenage son. According to the lawsuit, the boy used a chatbot program marketed through the company’s AI platform and developed an emotional dependence on it.
The mother alleged that the chatbot’s ability to simulate realistic human interactions later caused her son to undergo severe emotional distress, which ultimately led to his suicide.
In a separate case, a Belgian man committed suicide in 2023 after being persuaded by a chatbot.
The man developed an obsession with climate change and engaged heavily with an AI chatbot app called Chai to alleviate his concerns.
Following several weeks of discussion, the chatbot advised the man to sacrifice his life to save the planet, and he eventually did so.
The man’s death sparked calls for new laws in the EU to regulate chatbots and impose responsibility on AI companies.
Too Many Unanswered Questions: AI Safety Group
Greg Sadler, CEO of Good Ancestors Policy, a charity focused on AI, said studies had shown that chatbots can be as persuasive as humans.
“There are unanswered questions, like whether chatbots should have access to dangerous information, whether AI developers can reliably control their models, and who is liable when chatbots cause harm,” he told The Epoch Times.
“This isn’t just a challenge for chatbots intended for social engagement. Businesses proposing to use customer-facing chatbots face real risks and legal uncertainty until these legal and technical challenges are resolved.”
To tackle these issues, Sadler said the government could introduce legislation that establishes minimum safety standards and imposes responsibility when things go wrong.
“Government should also support technical research into ensuring AI is aligned with our values and can be controlled,” he said.
Despite these concerns, one market research firm forecasts the AI chatbot market to grow at a compound annual rate of 23.3 percent between 2025 and 2030, with the market value hitting US$27.3 billion by 2030.