AI Uses Facial Expressions to Aid Mental Health Treatment
Facial expressions serve as crucial indicators of a person's emotional state and are pivotal in psychotherapy for gauging momentary feelings. Psychologist Paul Ekman's system, developed in the 1970s, standardized the decoding of basic human emotions such as happiness, sadness, and disgust through facial cues.
Martin Steppan, a psychologist at the University of Basel in Switzerland, notes that the Ekman system has been widely adopted in psychological research.
However, decoding facial cues demands considerable time. Consequently, many specialists explore alternative methods, such as measuring electrical fluctuations in the skin, to gauge emotional shifts in patients during psychotherapy sessions and studies.
The researchers first fed the system more than 30,000 photos of facial expressions so it could learn to recognize six fundamental emotions: fear, sadness, anger, disgust, surprise, and happiness. The trained system was then applied to video recordings of psychiatric sessions with twenty-three patients, analyzed at the Center for Computer Science at the University of Basel.
Speaking to Medical Xpress, a website covering medical research, Martin Steppan explains that the AI system was trained using more than 950 hours of video recordings of psychiatric patients.
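The study's code is not described in detail, but the general approach it reports, training an image classifier on labeled facial-expression photos and then applying it to therapy-session footage, can be sketched roughly as follows. This is a hypothetical illustration only: the dataset layout, the ResNet-18 backbone, and the hyperparameters are assumptions, not details from the Basel study.

```python
# Hypothetical sketch (not the Basel team's code): fine-tune a standard
# image classifier to label face crops with six basic emotions.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

EMOTIONS = ["fear", "sadness", "anger", "disgust", "surprise", "happiness"]

# Assumed directory layout: faces/train/<emotion>/*.jpg, one folder per label.
transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])
train_set = datasets.ImageFolder("faces/train", transform=transform)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)

# Start from a pretrained backbone and replace the final layer
# with a six-way emotion classifier.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(EMOTIONS))

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

model.train()
for epoch in range(5):
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```

Once trained, such a classifier could be run frame by frame over session video to produce an emotion timeline, which is broadly the kind of output the study compares against clinicians' ratings.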
The outcomes of the study are remarkable. The AI’s analyses aligned with those of three trained psychiatrists, even detecting fleeting facial expressions—like a passing smile or a momentary expression of disgust—that clinicians might overlook.
The researcher believes that the system's reliability in capturing facial expressions as a measure of emotion makes it a valuable tool for both research and treatment.