The Epoch Times
New Artificial Intelligence App ‘Reframes’ Negative Thoughts to Treat Depression

WEF says technology could help combat ‘global mental health crisis’ undermining UN sustainability goals

Undated file photo showing a woman sitting on a bench. PA
By Kevin Stocklin
1/31/2023 | Updated: 3/22/2023

People can be forgiven for feeling depressed these days. The future touted by global leaders is dire.

The 2021–2022 U.N. Human Development Report states that living standards have declined in 9 out of 10 countries around the world since 2020 and that multiple crises “are hitting us at the same time and interacting to create new threats and make us feel insecure, unsafe, and untrusting of each other.”

The solution, according to the U.N., is to “recognize and treat the global mental health crisis that undermines human development and recognize the polarization that is more deeply dividing us, making collective responses harder.”

Echoing the U.N.’s narrative, World Economic Forum (WEF) founder and Chairman Klaus Schwab said the world is facing “unprecedented multiple crises” today. To combat the “global mental health crisis,” the WEF’s Uplink program—a platform to help companies that support the U.N.’s Sustainable Development Goals (SDG)—presented its remedy for depression and dissent: artificial intelligence (AI).

Speaking at the recent WEF summit in Davos, Switzerland, a company called Wysa demonstrated its phone app that uses AI to provide psychological counseling.

“This is a person coming into the app and starting to talk about things that are not necessarily about depression, just about their feelings,” said Jo Aggarwal, Wysa’s founder and CEO, displaying an example of an AI text therapy session. “AI is helping this person reframe what they’re thinking, it’s helping them open up.

“People open up to AI three times faster than they do to a human therapist.”

Aggarwal said the Wysa app currently has about 5 million users in more than 30 countries.

She said that “232 people have written us to say that they’re only alive today because they found this app.”

According to Wysa, many companies, including Accenture and SwissRe, are choosing to use its app, as are schools.

“Teenagers have been our first cohort,” Aggarwal said. “About 30 percent of our users are young people under the age of 25. We do have a cutoff: above 13.”

Numerous trials were used to test and refine the program.

“We built this for three years iteratively,” adjusting the program when users had concerns about it, she said. Some concerns were about the “power differential” created by the app, particularly from younger users, who said, “I don’t want to reframe a negative thought because that’s the only control I have in this situation.”

“Then, we changed how the clinicians told us what else we could say to them,” Aggarwal said.

Adjusting Children’s Minds

This program coincides with another U.N. effort to adjust children’s minds in favor of the U.N.’s SDG goals, called social and emotional learning (SEL). SEL is embedded into the curriculum at most public and private schools throughout the United States and other countries.
In a report titled “SEL for SDGs: Why Social and Emotional Learning is Necessary to Achieve the Sustainable Development Goals,” the U.N. argued that children are suffering from “cognitive dissonance” that arises when what they see around them conflicts with the progressive ideology that’s presented by their teachers or when concepts such as systemic racism and intersectionality prove to be self-contradictory.

The U.N. stated that, for children, “dissonance is unpleasant—the aversive arousal state is because inconsistent cognitions impede effective and unconflicted actions.” In other words, cognitive dissonance allows for the questioning of U.N.-approved concepts and may result in children having second thoughts about taking action in support of the SDGs.

“The dual potential of dissonance to undermine development goals by enabling compromise and inactions necessitates appropriate dissonance management for the attainment of development goals,” the report reads. “We posit two specific avenues, emotional resilience and prosocial behavior, for managing dissonance and attainment of the SDGs.”

What the U.N. considers psychological problems aren’t just giving children headaches; the WEF says they’re also harming the productivity of “human capital.”

According to the World Health Organization, 12 billion workdays are lost each year from depression and anxiety, costing the global economy about $1 trillion. The report notes that 15 percent of the world’s workforce has a mental disorder and that among the causes are “bullying and psychological violence (also known as ‘mobbing’).”

According to Wysa, global mental health is deteriorating at an alarming rate: 1 in 8 people suffer from a mental health disorder today; there has been a 25 percent increase in “major depressive disorders”; 42 percent of employees polled by the company said their mental health had declined recently; and one-third of employees polled were “suffering from feelings of sadness and depression.”

Pandemic lockdowns, which the WEF supported, appear to be the No. 1 culprit, although declining living standards from fuel and food shortages in the wake of the WEF’s net-zero carbon emissions campaign are also a key factor.

Risks Around Brain Data

Regarding the pros and cons of AI therapy, a report in Psychology Today states that the upside is that patients can get therapy whenever they want and pay less. In addition, “machine learning could lead to the development of new kinds of psychotherapy.”

The downside is that patients may worry that “data from their encounters will be used for marketing, including targeted ads, spying or other nefarious purposes.”

“There might even be concerns that the data might be hacked and even exploited for ransom,” the report reads.

A WEF report titled “4 ways artificial intelligence is improving mental health therapy” states that one of the ways AI is “helping” is by monitoring patient progress through tracking of what it calls “change-talk active” statements uttered by patients, “such as ‘I don’t want to live like this anymore’ and also ‘change-talk exploration’ where the client is reflecting on ways to move forward and make a change.”

“Not hearing such statements during a course of treatment would be a warning sign that the therapy was not working,” the WEF wrote. “AI transcripts can also open opportunities to investigate the language used by successful therapists who get their clients to say such statements, to train other therapists in this area.”
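As a rough illustration of the monitoring logic the WEF report describes, not Wysa's or any vendor's actual system, such "change-talk" tracking could be sketched as phrase matching over session transcripts. The phrase lists, function names, and threshold below are hypothetical:

```python
# Hypothetical sketch: flagging "change-talk" statements in therapy
# transcripts, mirroring the monitoring idea in the WEF report.
# Phrase lists and threshold are illustrative assumptions only.

CHANGE_TALK_ACTIVE = [
    "i don't want to live like this anymore",
    "i want things to change",
]
CHANGE_TALK_EXPLORATION = [
    "maybe i could",
    "what if i tried",
]

def count_change_talk(transcript_lines):
    """Count client statements matching either change-talk category."""
    active = exploration = 0
    for line in transcript_lines:
        text = line.lower()
        if any(phrase in text for phrase in CHANGE_TALK_ACTIVE):
            active += 1
        if any(phrase in text for phrase in CHANGE_TALK_EXPLORATION):
            exploration += 1
    return active, exploration

def therapy_warning(transcript_lines, min_statements=1):
    """Per the report, an absence of change-talk during a course of
    treatment would be a warning sign that therapy is not working."""
    active, exploration = count_change_talk(transcript_lines)
    return (active + exploration) < min_statements
```

A production system would presumably use trained language classifiers rather than literal phrase matching; this sketch only captures the warning-sign logic the report outlines.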

Questions from WEF attendees at Wysa’s presentation included whether AI therapy apps could be programmed to include suggesting certain “values such as service and community” and whether it uses “AI emotion recognition algorithms to see the condition of the voice” and assess how distressed a patient might be.

“When we analyze their voice, people began to feel less safe,” Aggarwal responded.

“If we use their data to say, looks like you didn’t sleep very well last night based on their phone, they will start feeling less safe; they would say, ‘Oh, somebody’s tracking me.’ So we took all that cool AI out and gave them what they needed to be able to feel this was private, this was safe.”

Voice recognition may be added in the future, however, once it can be done in what the app owners consider a “clinically safe way.”

Wysa has worked to create an app that’s “truly equitable, so that a person in Sub-Saharan Africa could access it as much as someone working at Goldman Sachs,” according to Aggarwal. For some languages, such as French and German, there’s a “for-profit track” to use the app; for others, such as Hindi, there’s a “non-profit track.”

She explained that she herself had suffered from depression, and that was her inspiration to create an app that could help others.

“I wanted something that would guide me through how to restructure the negative thoughts, all the evidence-based techniques that I could feel supported,” Aggarwal said. “So when you think about AI, don’t think about it as another entity, think about it as your own resource to work through things in your own head.”

Kevin Stocklin
Reporter
Kevin Stocklin is a contributor to The Epoch Times who covers the ESG industry, global governance, and the intersection of politics and business.