UCLA is embarking on a three-year study to better understand the impact of "sleep, physical exercise, heart rate, and daily routine" on anxiety and depression symptoms. According to newly released details of the study, conducted by UCLA in partnership with Apple, the researchers are detecting emotions through the use of face recognition, speech patterns, and a variety of other passive behaviour-tracking techniques.
The emotion study goes one step further—it makes an inference about your emotional state based on your health data. It is one of a growing number of apps that claim to passively evaluate your emotions using what is known as emotion AI, or affective computing. The field seeks to comprehend a person's emotions through numerous data points, including facial expressions, and is frequently used for commercial purposes. Recent machine learning-based emotion recognition technologies on the Android platform are listed below.
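To make the idea of passive evaluation concrete, here is a minimal, purely illustrative sketch of how several passive signals of the kind the article mentions (sleep, heart rate, speech) might be combined into a single score. The function name, input ranges, and weights are all assumptions for illustration; real affective-computing systems use trained models rather than hand-picked rules like these.

```python
# Illustrative only: a toy score combining hypothetical passive signals
# (sleep hours, resting heart rate, speech rate) into a single 0-1
# "anxiety-like" value via hand-picked weights. Not the UCLA/Apple method.

def anxiety_score(sleep_hours: float, resting_hr: float, speech_wpm: float) -> float:
    """Return a value in [0, 1]; higher means more anxiety-like signals."""
    # Normalise each signal to [0, 1] against assumed typical ranges.
    sleep_deficit = min(max((8.0 - sleep_hours) / 8.0, 0.0), 1.0)     # short sleep
    hr_elevation = min(max((resting_hr - 60.0) / 40.0, 0.0), 1.0)     # raised resting heart rate
    fast_speech = min(max((speech_wpm - 150.0) / 100.0, 0.0), 1.0)    # rapid speech
    # Hand-picked weights; a real system would learn these from data.
    return round(0.4 * sleep_deficit + 0.4 * hr_elevation + 0.2 * fast_speech, 3)
```

For example, eight hours of sleep, a resting heart rate of 60 bpm, and speech at 150 words per minute would yield a score of 0.0, while no sleep, 100 bpm, and 250 wpm would yield 1.0. The point of the sketch is only that such inference reduces intimate behavioural data to a number—exactly the step that raises the privacy and accuracy questions surrounding emotion AI.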