Mobile app predicts depression by reading your expression
A smartphone app that reads facial expressions can predict when somebody is about to slump into depression before they realise it themselves, a trial has shown.
The app, called MoodCapture, uses a phone’s front-facing camera to capture images of a person and their surroundings during their everyday use of the device. It evaluates the pictures for features that it has learnt are often linked to negative moods.
For instance, if someone consistently appears with a flat expression in a dimly lit room with no companions around them for an extended period, it may infer that they are at risk of spiralling downwards.
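As a rough illustration only — the study does not publish its decision logic, and every feature name and threshold below is hypothetical — the kind of inference described above could be sketched as a simple rule over per-photo features:

```python
from dataclasses import dataclass

@dataclass
class PhotoFeatures:
    """Hypothetical per-photo features of the kind the article describes."""
    expression_flatness: float  # 0 (animated) to 1 (flat affect)
    brightness: float           # 0 (dark room) to 1 (bright room)
    people_count: int           # faces detected besides the user

def at_risk(photos: list[PhotoFeatures],
            min_photos: int = 20,
            flat_cutoff: float = 0.7,
            dim_cutoff: float = 0.3) -> bool:
    """Flag risk if the user consistently looks flat, in dim light, alone.

    'Consistently' is modelled here as most recent photos matching the
    pattern. All cutoffs are illustrative, not values from the study.
    """
    if len(photos) < min_photos:
        return False  # not enough recent data to judge a trend
    matches = sum(
        p.expression_flatness > flat_cutoff
        and p.brightness < dim_cutoff
        and p.people_count == 0
        for p in photos
    )
    return matches / len(photos) > 0.8
```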
The idea is that a user could then be reminded to spend time with friends or to take part in other activities that might help.
In a study that involved 177 people diagnosed with major depressive disorder, the app correctly identified early symptoms of depression 75 per cent of the time. The technology has already improved since that trial and could be publicly available within the next five years, according to the research team.
“This is the first time that natural ‘in-the-wild’ images have been used to predict depression,” said Andrew Campbell, a professor of computer science at Dartmouth College in America and a co-author of the study. “There’s been a movement for digital mental health technology to ultimately come up with a tool that can predict mood in people diagnosed with major depression in a reliable and non-intrusive way.”
People often use facial recognition software to unlock their phones hundreds of times a day, he said, adding that his own phone recently showed he had done so more than 800 times in one week. “A person just unlocks their phone and MoodCapture knows their depression dynamics and can suggest they seek help,” he said.
A first group of participants was used to train an algorithm to recognise signs of early-stage depression. They were photographed in random bursts as they answered a question about whether they had "felt down, depressed or hopeless".
An AI feature built into the app then learnt to match negative moods with specific facial cues, such as gaze, eye movement, head position and muscle rigidity, as well as environmental features such as the background colours, the lighting, the location and the number of people in the image.
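The article does not describe the researchers' actual model, but as a hedged sketch of how such matching could work, a standard classifier could be fitted over a per-image feature vector; the feature columns, the synthetic data and the choice of scikit-learn's logistic regression here are all assumptions for illustration:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Each row stands for one passive photo, reduced to numeric features of the
# kinds the article lists (gaze, head pose, muscle rigidity, lighting,
# people count). The columns and labels below are synthetic, for illustration.
rng = np.random.default_rng(0)
X = rng.random((1000, 5))  # [gaze, head_pose, rigidity, lighting, people]
y = (X[:, 2] + (1 - X[:, 3]) > 1.0).astype(int)  # toy "felt down" label

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
print(f"toy accuracy: {accuracy_score(y_test, model.predict(X_test)):.2f}")
```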
The researchers said that analysing “passive photos”, taken while the phone was being used for other things, had been key, as they captured mood more accurately and frequently than consciously taken selfies.
Campbell said: “These neutral photos are very much like seeing someone ‘in the moment’, when they’re not putting on a veneer, which enhanced the performance of our facial expression predictive model.” An accuracy of 90 per cent would make the tool useful in real life, he added.
“My feeling is that technology such as this could be available to the public within five years. We’ve shown that this is doable,” he said.
Nicholas Jacobson, an assistant professor of computer science at Dartmouth College and a co-author of the study, said: "Our goal is to capture the changes in symptoms that people with depression experience in their daily lives.
“If we can use this to predict and understand the rapid changes in depression symptoms, we can ultimately head them off and treat them. The more in the moment we can be, the less profound the impact of depression will be.”
He added: “We think that MoodCapture opens the door to assessment tools that would help detect depression in the moments before it gets worse. These applications should be paired with interventions that actively try to disrupt depression before it expands and evolves. A little over a decade ago, this type of work would have been unimaginable.”
The findings are due to be presented at a conference of the Association for Computing Machinery.