
HOBOKEN, N.J. — Your eyes may be the windows to your mental health. In a world where nearly 300 million people grapple with depression, early detection could be a game-changer. Now, thanks to cutting-edge research from Stevens Institute of Technology, your smartphone might soon become a powerful tool for spotting the early warning signs of depression — just by looking at your face.

Professor Sang Won Bae and doctoral candidate Rahul Islam have developed two innovative AI-powered smartphone apps that could revolutionize how we detect mental health issues. The first app, called PupilSense, works by analyzing your eyes.

“Previous research over the past three decades has repeatedly demonstrated how pupillary reflexes and responses can be correlated to depressive episodes,” Bae explains in a media release.

PupilSense takes quick snapshots of your eyes when you’re using your phone, measuring the size of your pupils compared to your irises. It does this during short 10-second bursts when you’re opening your phone or using certain apps.
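
How might an app turn a quick eye snapshot into a number? Below is a minimal Python sketch of one way to compute a pupil-to-iris ratio from a cropped grayscale eye image. Everything here, from the use of OpenCV's Hough circle detector to the radius bounds, is an illustrative assumption rather than PupilSense's actual method.

```python
import cv2
import numpy as np

def pupil_iris_ratio(eye_gray: np.ndarray) -> float | None:
    """Estimate the pupil-to-iris radius ratio from a grayscale eye crop.

    Illustrative only: PupilSense's real pipeline is not described here.
    """
    blurred = cv2.medianBlur(eye_gray, 5)

    # Detect the larger iris boundary (radius bounds are rough guesses
    # for an eye crop a few hundred pixels wide).
    iris = cv2.HoughCircles(
        blurred, cv2.HOUGH_GRADIENT, dp=1, minDist=100,
        param1=100, param2=30, minRadius=40, maxRadius=120,
    )
    # Detect the smaller, darker pupil inside it.
    pupil = cv2.HoughCircles(
        blurred, cv2.HOUGH_GRADIENT, dp=1, minDist=100,
        param1=100, param2=20, minRadius=10, maxRadius=40,
    )
    if iris is None or pupil is None:
        return None  # detection failed; skip this frame

    iris_r = float(iris[0][0][2])
    pupil_r = float(pupil[0][0][2])
    return pupil_r / iris_r  # dilated pupils push this ratio up
```

A ratio that drifts up or down across many of these short sessions is the kind of signal that could then be correlated with self-reported mood.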

In an early test with 25 volunteers over four weeks, the app analyzed about 16,000 interactions. The results? The best version of PupilSense was 76% accurate in identifying times when people reported feeling depressed. That’s even better than the current leading smartphone-based depression detection system.

Bae and Islam aren’t stopping there. They’re also working on another system called FacePsy, which examines your facial expressions to gain insight into your mood.

“A growing body of psychological studies suggests that depression is characterized by nonverbal signals such as facial muscle movements and head gestures,” Bae points out.

(Image: Researchers have developed two innovative AI-powered smartphone apps that detect mental health issues by looking at your eyes and face. Photo by Gayatri Malhotra)
FacePsy works quietly in the background, taking quick snapshots of your face when you open your phone or use certain apps. Don’t worry about privacy, though – the system deletes the actual images almost immediately after analysis.

Some of the early findings were quite surprising. For instance, smiling more seemed to correlate with potential signs of depression.

“This could be a coping mechanism; for instance, people putting on a ‘brave face’ for themselves and for others when they are actually feeling down,” Bae explains. “Or it could be an artifact of the study. More research is needed.”

Other potential signs of depression included fewer facial movements in the morning and certain specific eye- and head-movement patterns. Notably, side-to-side (yawing) head movements in the morning showed a strong link to increased depressive symptoms.

You might be wondering, “Why use smartphones for this?” The answer is simple: accessibility. With most people using smartphones daily, these apps could provide a readily available tool for early depression detection.

“And since most people in the world today use smartphones daily, this could be a useful detection tool that’s already built and ready to be used,” Bae notes.

Unlike other AI systems for detecting depression, which often require wearing special devices, these smartphone-based apps could offer a more convenient and less intrusive option.

While these technologies show immense promise, they’re still in the early stages of development. The PupilSense system is now available open-source on GitHub, allowing other researchers and developers to build upon this groundbreaking work.

As for FacePsy, Bae sees the pilot study as “a great first step toward a compact, inexpensive, easy-to-use diagnostic tool.” The team presented their findings at the ACM International Conference on Mobile Human-Computer Interaction (MobileHCI) in Australia.

Paper Summary

Methodology

The study used an innovative mobile sensing system, FacePsy, to track facial expressions and head gestures as potential indicators of depression. A total of 25 participants used their smartphones for four weeks, during which the app automatically captured data whenever they unlocked their phones or used specific apps. The system collected Action Units (AUs, which are specific movements of facial muscles), along with eye movements and head gestures. This data was processed on the device, ensuring privacy by discarding raw images after feature extraction. Because the app gathered facial behavior during natural smartphone usage, it provided a real-world setting for depression detection.
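
As a rough illustration of that capture-extract-discard flow, here is a minimal Python sketch. The FaceFeatures fields mirror the feature types named above (Action Units, eye openness, head pose), but every function name below is a hypothetical stand-in; the study's actual on-device pipeline is not reproduced here.

```python
from dataclasses import dataclass

@dataclass
class FaceFeatures:
    """Numeric features retained after the raw frame is discarded."""
    action_units: dict[str, float]          # e.g. {"AU12_lip_corner_puller": 0.7}
    eye_openness: float                     # 0.0 (closed) to 1.0 (wide open)
    head_pose: tuple[float, float, float]   # pitch, yaw, roll in degrees

# Placeholder extractors: in a real app these would wrap an on-device
# face-analysis model. The names are hypothetical, not FacePsy's actual API.
def extract_action_units(frame) -> dict[str, float]:
    return {"AU12_lip_corner_puller": 0.0}

def estimate_eye_openness(frame) -> float:
    return 1.0

def estimate_head_pose(frame) -> tuple[float, float, float]:
    return (0.0, 0.0, 0.0)

def on_trigger_event(frame) -> FaceFeatures:
    """Run once per trigger (phone unlock or a monitored app opening)."""
    features = FaceFeatures(
        action_units=extract_action_units(frame),
        eye_openness=estimate_eye_openness(frame),
        head_pose=estimate_head_pose(frame),
    )
    del frame  # drop the local reference to the raw image; only numbers persist
    return features
```

The design choice worth noting is that only the numeric feature vector ever leaves this function, which is how a system like this can analyze faces without storing face photos.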

Key Results

The study found significant patterns in the facial behavior of individuals experiencing depression. Specific Action Units (such as those related to smiling, eyebrow movements, and chin raising), head gestures, and eye openness were identified as key markers. The analysis revealed that individuals in depressive episodes exhibited reduced head movement and fewer expressions associated with happiness. The model achieved 69% accuracy in detecting depression, with a mean absolute error (MAE) of 3.08 when predicting the severity of depressive symptoms, meaning its predictions were typically within about three points of the actual clinical assessments.
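
For readers unfamiliar with the two reported metrics, the short Python sketch below shows how accuracy (for the binary depressed/not-depressed label) and mean absolute error (for severity scores) are conventionally computed, here via scikit-learn. The numbers are toy values, not the study's data.

```python
from sklearn.metrics import accuracy_score, mean_absolute_error

# Toy labels, NOT the study's data: 1 = depressive episode, 0 = none.
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]
print(accuracy_score(y_true, y_pred))  # fraction of correct labels: 0.75

# Symptom-severity scores (e.g., from a standard questionnaire) vs. predictions.
severity_true = [12.0, 4.0, 15.0, 8.0]
severity_pred = [10.0, 6.0, 11.0, 9.0]
print(mean_absolute_error(severity_true, severity_pred))  # average |error|: 2.25
```

An MAE of 3.08 thus says that, averaged over all predictions, the model's severity estimate missed the reported score by about three points.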

Study Limitations

One of the main limitations of the study was the relatively small sample size of 25 participants, which might limit the generalizability of the results. Additionally, the study only included data from participants with Android devices, which may not fully represent users of other operating systems. The facial behavior captured was limited to 10-second bursts triggered by phone use, so longer interactions or behaviors outside this window were not recorded. Finally, while the system showed promise in detecting depression, it may not capture subtler or more complex aspects of mental health that lie beyond facial expressions.

Discussion & Takeaways

This study demonstrates the potential of using mobile facial behavior sensing as a tool for detecting depression in real-world settings. The findings suggest that facial features, such as eye openness, head gestures, and specific muscle movements, are useful indicators of depression. While traditional methods rely on clinical evaluations, this approach offers a non-invasive, continuous way to monitor mental health, which could lead to timely interventions. However, the study also underscores the need for larger and more diverse datasets to improve accuracy and generalizability. For healthcare practitioners and researchers, this technology opens new avenues for real-time, affective computing-based mental health monitoring.

Funding & Disclosures

The study was conducted by researchers Rahul Islam and Sang Won Bae at the Semcer Center for Healthcare Innovation, Stevens Institute of Technology. There were no conflicts of interest disclosed. The researchers have made the source code for FacePsy available as open-source, allowing other developers and researchers to use and build upon their work for further studies on mental health and affective computing.
