Software Gauges ‘State of Mind’ From Selfie Video
2/2/2015
Updated: 2/3/2015

A new computer program could soon analyze your “selfie” videos for clues to mental health.

Health-monitoring apps can already track the spread of the flu, for example, or offer guidance on nutrition and on managing mental health issues.

Jiebo Luo, professor of computer science at the University of Rochester, explains that his team’s approach is to “quietly observe your behavior” while you use the computer or phone as usual.

He adds that their program is “unobtrusive.” Users won’t need to wear special gear, describe their feelings, or add any extra information, he says.

From Tweets to Forehead Color

For example, the team was able to measure a user’s heart rate simply by monitoring subtle changes in the color of the user’s forehead. The system does not collect other data that might be available through the phone, such as the user’s location.
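The article does not detail the video pipeline, but estimating pulse from skin color is commonly done with remote photoplethysmography: average the green channel over a forehead region in each frame, then take the dominant frequency in the human heart-rate band. The sketch below illustrates that general idea in Python with OpenCV and NumPy; the face detector, region choice, and frequency band are illustrative assumptions, not details from the paper.

```python
import cv2
import numpy as np

def estimate_heart_rate(frames, fps):
    """Estimate beats per minute from a sequence of BGR video frames.

    Averages the green channel over a forehead region per frame, then
    finds the dominant frequency in the typical heart-rate band.
    """
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    signal = []
    for frame in frames:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = detector.detectMultiScale(gray, 1.3, 5)
        if len(faces) == 0:
            continue  # skip frames where no face is found
        x, y, w, h = faces[0]
        # Take the upper quarter of the face box as the forehead region.
        forehead = frame[y:y + h // 4, x + w // 4:x + 3 * w // 4]
        signal.append(forehead[:, :, 1].mean())  # mean green intensity

    signal = np.asarray(signal) - np.mean(signal)
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    # Restrict to a plausible heart-rate band: 0.75-4 Hz (45-240 bpm).
    band = (freqs >= 0.75) & (freqs <= 4.0)
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0  # convert Hz to beats per minute
```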

The researchers were able to analyze the video data to extract a number of “clues,” such as heart rate, blinking rate, eye pupil radius, and head movement rate. At the same time, the program also analyzed what the users posted on Twitter, what they read, how fast they scrolled, their keystroke rate, and their mouse click rate.

Not every bit of information is treated equally, however: what a user tweets, for example, is given more weight than what the user reads because it is a more direct expression of what that user is thinking and feeling.
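The article lists the signals but not how they are combined or weighted. One plausible arrangement is a single numeric feature vector per session, with sentiment from posted tweets weighted more heavily than sentiment from tweets the user merely read. The field names and weights in this sketch are hypothetical, not taken from the paper.

```python
import numpy as np

def build_feature_vector(video_cues, activity, posted_sentiment, read_sentiment):
    """Combine video, interaction, and text cues into one feature vector.

    video_cues:  dict with heart_rate, blink_rate, pupil_radius, head_movement
    activity:    dict with scroll_speed, keystroke_rate, click_rate
    posted_sentiment / read_sentiment: mean sentiment scores in [-1, 1]
    """
    W_POSTED, W_READ = 1.0, 0.4  # what the user writes counts for more
    return np.array([
        video_cues["heart_rate"],
        video_cues["blink_rate"],
        video_cues["pupil_radius"],
        video_cues["head_movement"],
        activity["scroll_speed"],
        activity["keystroke_rate"],
        activity["click_rate"],
        W_POSTED * posted_sentiment,
        W_READ * read_sentiment,
    ])
```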

Detecting State of Mind

To calibrate the system and generate reactions they could measure, Luo explains, he and his colleagues enrolled 27 participants in a test group and “sent them messages, real tweets, with sentiment to induce their emotion.” This allowed them to gauge how subjects reacted after seeing or reading material considered positive or negative.

They compared the outcome of all this combined monitoring with the users’ self-reports about their feelings to find out how well the program actually performs, and whether it can indeed tell how the user feels.

The combination of the data gathered by the program and the users’ self-reported state of mind (called the ground truth) allows the researchers to train the system. The program then learns to infer, from the gathered data alone, whether the user is feeling positive, neutral, or negative.
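The article does not name the learning algorithm; any standard supervised classifier fits this description. Here is a sketch with scikit-learn, using random placeholder data in place of real sessions and a random forest standing in for whatever model the researchers actually used.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Placeholder data standing in for real sessions: one 9-dimensional
# feature vector per session (see the earlier sketch) paired with the
# user's self-reported label for that session.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 9))
y = rng.choice(["positive", "neutral", "negative"], size=200)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Evaluate predictions against held-out self-reports (the "ground truth").
print("held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
```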

The program currently classifies emotions only as positive, neutral, or negative. Luo says he hopes to make it more sensitive by teaching it to further distinguish a negative emotion as, for example, sadness or anger.

Right now the system exists only as a demonstration program; no app exists yet. But the team plans to create one that would let users be more aware of their emotional fluctuations and make adjustments themselves.

Luo understands that this program and others that aim to monitor an individual’s mental health or well-being raise ethical concerns that need to be considered.

He acknowledges that using the system means “effectively giving this app permission to observe you constantly,” but adds that the program is designed for the user’s own use and does not share data with anyone else unless the user designates otherwise.

Luo and his coauthors presented a paper on the research at the 29th AAAI Conference on Artificial Intelligence, held January 25-30 in Austin, Texas.

Source: University of Rochester. Republished from Futurity.org under Creative Commons License 3.0.
