How Chatbots Are Replacing Human Connection—And Leaving Us Lonelier

‘Those who spend more time with chatbots tend to be even lonelier,’ MIT Media Lab research found.
A virtual friend is seen on the screen of an iPhone in Arlington, Virginia, on April 30, 2020. Olivier Douliery/AFP via Getty Images
By Alfred Bui
May 9, 2025

What happens when we start turning to machines for the comfort we once found in people?

A growing body of research suggests that the rise of AI chatbots may be quietly reshaping how we connect—and not always for the better.

Programs like ChatGPT use artificial intelligence to engage in conversation with users.

As the technology has advanced, these chatbots have become increasingly human-like, capable of more natural, realistic conversations and even emotional engagement.

MIT Media Lab released a study (pdf) in March exploring the interaction between people and machines, finding that, overall, users initially experienced a drop in loneliness.

“For average levels of daily use, conversing with a chatbot with highly empathetic, emotional, and socially considerate responses was also associated with higher loneliness and lower socialisation,” the report said.


“Those who spend more time with chatbots tend to be even lonelier.”

The study found that people with “social vulnerabilities,” including those with strong attachment tendencies and those who experience distress from emotional avoidance, were more likely to feel lonely after engaging with a chatbot daily.

A man looks at his smartphone in Newcastle, Australia, on Dec. 1, 2024. Roni Bintang/Getty Images

Even Non-Personal Interaction Can Result in Dependency

Meanwhile, even non-personal conversations carried risks, with users who asked chatbots for advice or help with brainstorming becoming emotionally dependent.

“When users engage in non-personal conversations, the [chatbot] model also responds more practically and informatively than emotionally, such as by facilitating the development of the user’s skills,” the report said.

“At high usage, chatbots with a greater degree of professional distance, even to the degree of frequently neglecting to offer encouragement or positive reinforcement when appropriate, tend to be more strongly associated with emotional dependence and problematic use.”

Yet researchers could not explain why this happened.

A Convenient Reprieve From Loneliness: University Dean

Paul Darwen, associate dean of IT at James Cook University’s Brisbane campus, said that while people were more connected than ever, they were “less connected with other people.”

“And that’s a question. That’s not a question for computer science. That’s a question for social science,” he told The Epoch Times.

Darwen further stated that while AI might be a “band-aid solution” to loneliness, it might also create other problems.

“And what [will] happen in the future? People are talking about [AI] sexbots. I am not sure what will happen then,” he said.

The associate dean also pointed out that people were beginning to substitute chatbots for real interaction, which could motivate AI companies to target this niche market for profit.

“There was an episode of [the animated sitcom] South Park where, in the dystopian future, Alexa was like the robot companion of everyone who was lonely,” Darwen said.

“We’re very close to that being a possibility,” he said, noting that development in this field was moving quickly.

A person has a conversation with a humanoid robot in Las Vegas, Nevada, on Jan. 10, 2024. Frederic J. Brown/AFP via Getty Images

Chatbots and Suicides

In recent years, this issue has become a reality with dire consequences.
In October 2024, a Florida mother filed a lawsuit against AI startup Character Technologies, Inc., and its co-founders, alleging that they were responsible for the death of her 14-year-old son.

According to the lawsuit, the boy used a chatbot program marketed through Character Technologies’ AI platform and developed an emotional dependence on it.

The mother alleged that the chatbot’s ability to simulate realistic human interactions later caused her son to undergo severe emotional distress, which ultimately led to his suicide.

In a separate case in 2023, a Belgian man died by suicide after being persuaded by a chatbot.

The man developed an obsession with climate change and engaged heavily with an AI chatbot app called Chai to alleviate his concerns.

After a discussion spanning several weeks, the chatbot advised the man to sacrifice his life to save the planet, which he ultimately did.

The man’s death sparked calls for new laws in the EU to regulate chatbots and impose responsibility on AI companies.

In the same year, an eating disorder association in the United States shut down its AI chatbot service after reports that the program was giving harmful advice to users.
According to one user, the chatbot advised her to try to lose weight and to measure herself weekly, despite being told that she had an eating disorder.

Too Many Unanswered Questions: AI Safety Group

Greg Sadler, CEO of Good Ancestors Policy, a charity focused on AI, said studies had shown that chatbots can be as persuasive as humans.

“There are unanswered questions, like whether chatbots should have access to dangerous information, whether AI developers can reliably control their models, and who is liable when chatbots cause harm,” he told The Epoch Times.

“This isn’t just a challenge for chatbots intended for social engagement. Businesses proposing to use customer-facing chatbots face real risks and legal uncertainty until these legal and technical challenges are resolved.”

To tackle these issues, Sadler said the government could introduce legislation that helps establish minimum safety standards and impose responsibility when things go wrong.

“Government should also support technical research into ensuring AI is aligned with our values and can be controlled,” he said.

According to data from the U.S.-based market research company Grand View Research, the value of the global AI chatbot market was around US$7.76 billion (A$12.1 billion) in 2024.

The company forecasts the market will grow at a compound annual rate of 23.3 percent between 2025 and 2030, reaching US$27.3 billion by 2030.

Alfred Bui is an Australian reporter based in Melbourne who focuses on local and business news. He is a former small business owner and has two master’s degrees, in business and business law. Contact him at [email protected].