Phone app uses AI to detect depression from facial cues
Dartmouth researchers say they have built the first smartphone app that can detect depression before the user even knows something is wrong. The app combines facial-image-processing software with artificial intelligence.
The MoodCapture app uses a phone’s front camera to capture the user’s facial expressions and surroundings during regular phone use, then analyzes the images for clinical cues of depression. In a study of 177 people diagnosed with major depressive disorder, the app identified early symptoms of depression with 75% accuracy.
Computer scientists and clinicians at Dartmouth and its Geisel School of Medicine say that with further development, the technology could be publicly available within five years.
The team posted its results to the arXiv preprint database on February 27 and presented them in May at CHI 2024, the Association for Computing Machinery’s conference on human-computer interaction. CHI papers are peer-reviewed before acceptance, and only accepted papers are published in the conference proceedings.
“This is the first time that natural ‘in-the-wild’ images have been used to predict depression,” says Andrew Campbell, the Albert Bradley 1915 Third Century Professor of Computer Science and the study’s corresponding author. Researchers in digital mental health have been working toward a tool that can unobtrusively gauge the mood of a person diagnosed with major depression.
“People use facial recognition software to unlock their phones hundreds of times a day,” says Campbell, who used his own face to unlock his phone more than 800 times in the past week.
“MoodCapture uses a similar technology pipeline of deep learning and AI hardware for facial recognition,” he adds. “There is a huge opportunity to scale up this technology without any extra effort from the user. A person just unlocks their phone, and MoodCapture learns the dynamics of their depression and can suggest that they seek help.”
Over 90 days, the app captured 125,000 images of study participants. Everyone in the study consented to having their photos taken with their phone’s front camera, but they did not know when it would happen.
A first group of volunteers was used to train MoodCapture to recognize depression. Their phones’ front cameras captured random photographs as they responded to the statement “I have felt down, depressed, or hopeless,” an item from the Patient Health Questionnaire-8 (PHQ-8) that clinicians use to screen for and monitor depression.
The researchers then used AI to analyze these photos so that MoodCapture’s predictive model could learn to associate self-reports of feeling depressed with specific facial expressions and environmental factors, such as dominant colors, lighting, and the number of people in the image.
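The paper’s actual pipeline is not reproduced here, but the supervised setup it describes can be sketched in a few lines of Python. Everything below is an illustrative assumption: the photos are presumed to have already been reduced to numeric feature vectors, and a simple logistic-regression classifier stands in for the study’s deep-learning model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic stand-in data: one row per photo. The columns represent extracted
# facial features (expression, gaze) and environment features (lighting,
# dominant colors, number of people) -- hypothetical placeholders here.
n_photos, n_features = 5000, 32
X = rng.normal(size=(n_photos, n_features))

# Binary label from the PHQ-8 item answered at capture time
# (1 = participant endorsed "felt down, depressed, or hopeless").
y = rng.integers(0, 2, size=n_photos)

# Fit a simple classifier linking image features to the self-reports.
model = LogisticRegression(max_iter=1000).fit(X, y)
```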
With MoodCapture, a user’s phone analyzes a sequence of images in real time each time the phone is unlocked. The AI model draws connections between facial expressions and background details that matter for predicting the severity of depression, and over time it learns which image features are specific to each user. If someone consistently appears with a flat expression in a dimly lit room, for example, the model might infer that their depression is beginning to worsen.
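Continuing the sketch above, a per-unlock scoring step might look like the following. The burst size, the averaging, and the alert threshold are all invented for illustration; the study does not specify them.

```python
def score_unlock(burst_features: np.ndarray) -> float:
    """Average predicted probability of a depressed self-report across
    the photos captured at one unlock."""
    return float(model.predict_proba(burst_features)[:, 1].mean())

burst = rng.normal(size=(5, n_features))  # features from 5 frames at one unlock
if score_unlock(burst) > 0.8:             # hypothetical alert threshold
    print("Suggest a supportive action, e.g. taking a walk outside")
```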
To test the predictive model, a separate group of participants answered the same PHQ-8 item while MoodCapture photographed them. Drawing on what it had learned from the first group, the software screened these photos for signs of depression, and the MoodCapture AI correctly determined whether people in this second group were depressed 75% of the time.
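In the terms of the sketch above, this is a held-out evaluation: the model trained on the first group scores photos from people it has never seen. The data below is synthetic, so the printed accuracy will hover near chance rather than the study’s 75%.

```python
from sklearn.metrics import accuracy_score

# Photos and PHQ-8 responses from a second, held-out group (synthetic here).
X_heldout = rng.normal(size=(1000, n_features))
y_heldout = rng.integers(0, 2, size=1000)

# Score the unseen group with the model trained on the first group.
y_pred = model.predict(X_heldout)
print("held-out accuracy:", accuracy_score(y_heldout, y_pred))
```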
Campbell says a sensor like this would need to be at least 90% accurate to be useful. “This demonstrates a path toward a powerful tool for passively assessing a person’s mood and using the data as a basis for therapeutic intervention,” he says. “We’ve shown this is possible, and I think this kind of technology could be available to the public within five years.”
MoodCapture is suited to people with major depression precisely because their moods fluctuate so frequently, says study co-author Nicholas Jacobson, an assistant professor of biomedical data science and psychiatry in the Center for Technology and Behavioral Health.
While many depression treatments follow long-term plans, people with depression experience constant ups and downs in their condition. “Traditional assessments miss most of what depression is,” said Jacobson, who directs the AIM HIGH Laboratory (AI and Mental Health: Innovation in Technology Guided Healthcare).
“Our goal is to capture the changes in symptoms that people with depression go through in their daily lives,” he says. “If we can use this to predict and understand the rapid changes in depression symptoms, we can intervene and treat them before they get worse. The more in-the-moment we can be, the less profound depression’s impact will be.”
Jacobson believes apps like MoodCapture could help close the wide gap between when people with depression need help and when they can actually reach mental health services. He notes that, on average, people have access to a clinician such as a therapist for less than 1% of their time. “The goal of these technologies is to provide more real-time support without adding further stress on the care system,” he says.
Rather than telling someone directly that they may be becoming depressed, Jacobson says, AI apps like MoodCapture should instead suggest constructive steps, such as going outside or checking in with a friend, that can head off a downturn.
“Telling someone that something bad is happening to them could actually make things worse,” he adds. “We believe MoodCapture opens the door to assessment tools that could help detect depression symptoms before they get worse. Applications like these should be paired with treatments that try to interrupt depression before it expands.” This kind of work, he notes, would not have been possible a little more than ten years ago.
The project was funded by a grant from the National Institute of Mental Health that Jacobson leads, which supports the use of deep learning and passive data collection to detect depression symptoms in real time. The new study builds on a 2012 study led by Campbell’s lab that tracked students’ mental health using passive and automatic data from their phones.
Smartphone cameras have improved considerably since then, Campbell says, which made it easy for the researchers to capture the kinds of photos people routinely take with their phones. Campbell directs emerging technologies and data analytics at the Center for Technology and Behavioral Health, where he leads the group developing mobile sensors that use passive data to track measures such as mood and job performance.
Campbell says the new study shows how critical passive photos are to the success of mobile therapeutic tools: they capture mood more accurately and more frequently than user-generated photos, and they do so without prompting users to act in ways that might put them off. “These neutral photos are very much like seeing someone in the moment, when they’re not putting on a veneer, which enhanced the performance of our facial-expression predictive model,” says Campbell.
The study was co-written by Subigya Nepal and Arvind Pillai, both PhD candidates in Campbell’s research group at the Guarini School of Graduate and Advanced Studies. Nepal says the next steps for MoodCapture are to train the AI on a wider group of participants, improve its ability to flag mood disorders, and strengthen its privacy protections.
Nepal says the researchers envision a version of MoodCapture in which photos never leave the phone. Images would be processed on the user’s device to extract depression-related facial features and convert them into code the AI model can use. “Even if the data were leaked, there would be no way to reconstruct an image that could identify the user,” he adds.
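As a rough illustration of that on-device design, here is a minimal sketch. The random projection below is a hypothetical stand-in for a trained face encoder; the point is only that a many-to-one mapping from pixels to a short feature vector cannot be inverted back into a recognizable photo.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for a trained on-device encoder; the real app would ship a deep
# model, but any many-to-one mapping illustrates the privacy argument.
DIM = 64
PROJECTION = rng.normal(size=(DIM, 224 * 224 * 3)).astype(np.float32)

def embed_on_device(photo: np.ndarray) -> np.ndarray:
    """Reduce a photo to a short feature vector; the mapping is many-to-one,
    so the vector cannot be inverted into a recognizable image."""
    return PROJECTION @ photo.astype(np.float32).ravel()

frame = rng.integers(0, 256, size=(224, 224, 3))  # captured frame stays local
code = embed_on_device(frame)                      # only this code is stored
print(code.shape)                                  # (64,)
```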
Nepal also says the app could become more accurate for an individual user if the AI were designed to keep learning from that person’s own facial expressions.
“You wouldn’t need to start from scratch. We know the general model is 75% accurate, so a specific person’s data could be used to enhance it. Devices within the next few years should easily be able to handle this,” he says. “We know that facial expressions are indicative of emotional state. Our study shows they are one of the most important signals we can get from technology to gauge mental health.”
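That fine-tuning idea, starting from the shared 75%-accurate model and nudging it with one user’s own data, can also be sketched. The incremental learner below is an assumption chosen because it supports partial updates; nothing here reflects the authors’ actual implementation.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(2)
n_features = 32

# A general model trained on the study population (synthetic stand-in data),
# playing the role of the shared baseline.
X_all = rng.normal(size=(5000, n_features))
y_all = rng.integers(0, 2, size=5000)
general = SGDClassifier(loss="log_loss", random_state=0).fit(X_all, y_all)

# Later, on one user's device: a small batch of that user's own labeled
# photos nudges the shared weights toward their personal facial patterns,
# without retraining from scratch.
X_user = rng.normal(size=(50, n_features))
y_user = rng.integers(0, 2, size=50)
general.partial_fit(X_user, y_user)  # incremental on-device update
```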