Smartphone App with AI Detects Depression Onset from Facial Expressions

MoodCapture, an app that uses AI to detect depression onset from facial cues, accurately identified early symptoms of depression in 75% of participants in a study.

Dartmouth investigators developed the first smartphone application using artificial intelligence (AI) to detect the onset of depression—based solely on facial expressions.1

“If we can use this to predict and understand the rapid changes in depression symptoms, we can ultimately head them off and treat them,” said investigator Nicholas Jacobson, assistant professor of biomedical data science and psychiatry at Dartmouth's Center for Technology and Behavioral Health, in a press release.2 “The more in the moment we can be, the less profound the impact of depression will be.”

The promising results suggest the smartphone app could be released to the public within 5 years, according to investigator Andrew Campbell, Dartmouth's Albert Bradley 1915 Third Century Professor of Computer Science. Once that happens, the same front-facing camera people already use to unlock their phones with facial recognition could also detect the onset of depression.

The app, MoodCapture, accurately identified early symptoms of depression in 75% of participants in the clinical trial.

“This is the first time that natural ‘in-the-wild’ images have been used to predict depression,” Campbell said in a press release. “There’s been a movement for digital mental-health technology to ultimately come up with a tool that can predict mood in people diagnosed with major depression in a reliable and non-intrusive way."

This year, a study published in the American Journal of Psychiatry found that a brain scan analyzed with an AI algorithm can predict within a week whether the antidepressant sertraline will work, rather than requiring the usual 6 to 8 weeks of waiting.3 AI is thus making waves across the major depressive disorder space, in both detection and treatment.

The study, led by Subigya Nepal and Arvind Pillai, both PhD candidates from Dartmouth College, included 177 people already diagnosed with major depressive disorder, recruited from online advertisements on Google and Facebook.1 Participants were excluded if they had bipolar disorder, active suicidality, or psychosis. Most participants were female (86.4%) and White (83.6%).

Before the study, participants consented to having their photos taken via their phone's front camera, though they did not know when the photos would be captured. Over 90 days, the app captured 125,000 pictures of participants at random moments.

MoodCapture uses facial-image processing software with AI to detect the onset of depression before the user even knows something is wrong. The app uses the phone's front camera to capture a person's facial cues and surroundings in candid, "in-the-wild" shots, snapping an image when someone unlocks their phone.

“MoodCapture uses a similar technology pipeline of facial recognition technology with deep learning and AI hardware, so there is terrific potential to scale up this technology without any additional input or burden on the user,” Campbell said.2 “A person just unlocks their phone and MoodCapture knows their depression dynamics and can suggest they seek help.”

Investigators used the first group of participants to train MoodCapture to recognize depression in facial cues. The front-facing camera took random bursts of photos while participants answered the prompt, “I have felt down, depressed, or hopeless,” drawn from the 8-item Patient Health Questionnaire (PHQ-8).

Afterward, the team applied image-analysis AI to the photos so MoodCapture’s predictive model could learn to associate depression with specific facial cues such as gaze direction, eye movement, head position, and muscle rigidity. Backgrounds were also analyzed in terms of colors, lighting, photo locations, and the number of people in a photo.

The AI program learns patterns from previous images, so if someone consistently shows a flat expression in a dimly lit room over a long period, the predictive model might surmise that the person is experiencing the onset of depression.
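The idea of mapping per-image features to a depression label can be illustrated with a minimal sketch. This is not MoodCapture's actual pipeline or feature set; the feature names, synthetic data, and logistic-regression classifier below are all illustrative assumptions standing in for the study's deep-learning model.

```python
# Hypothetical sketch: a simple binary classifier trained on per-image
# features like those described above (expression, lighting, etc.).
# The features, data, and model are invented for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Invented per-image features: [gaze_down, head_tilt, muscle_rigidity, room_brightness]
n_images = 200
X = rng.normal(size=(n_images, 4))

# Synthetic labels: rigid expression plus dim lighting loosely correlates
# with the "depressed" label, mimicking the patterns the article describes.
y = ((X[:, 2] - X[:, 3]) + rng.normal(scale=0.5, size=n_images) > 0).astype(int)

# Fit a logistic-regression classifier and report training accuracy.
model = LogisticRegression().fit(X, y)
accuracy = model.score(X, y)
print(f"training accuracy: {accuracy:.2f}")
```

A real system would, of course, extract features with deep learning from raw photos and validate on held-out participants (as the study's second group did), rather than scoring on its own training data.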

A second group of participants tested out the model, answering the same PHQ-8 question and having the app capture and analyze photos of them for depression-related facial cues based on data from the first group. The model correctly determined three-fourths of the participants had depression.

Despite the study’s positive results, investigators highlighted several limitations: the small sample size skewed toward White female participants, reliance on self-reported depression scores, inclusion of only clinically depressed individuals without healthy controls, and the lack of exploration of other depression-related factors such as social interactions, physical activity, and environment. They also noted that MoodCapture raises ethical and privacy concerns, making user consent, data security, and transparency about the use of personal data essential.

“We think that MoodCapture opens the door to assessment tools that would help detect depression in the moments before it gets worse,” Jacobson said.2 “These applications should be paired with interventions that actively try to disrupt depression before it expands and evolves. A little over a decade ago, this type of work would have been unimaginable.”


  1. Nepal, S, Pillai, A, Collins, A, et al. MoodCapture: Depression Detection Using In-the-Wild Smartphone Images. 2024.
  2. Smartphone App Uses AI to Detect Depression from Facial Cues. EurekAlert! February 27, 2024. Accessed February 27, 2024.
  3. Derman, C. Brain Scan with AI Can Predict Whether the Antidepressant Sertraline Will Work. HCPLive. February 15, 2024. Accessed February 27, 2025.