EmoGram
Goal
The EmoGram helps the doctor detect the emotions of the patient during the intake and other consultations, so the doctor can respond accordingly and help improve the patient's state of mind.
*Originally the goal was to change the environment based on the detected emotions, e.g. playing calming sounds or changing the lights to calming colours. We didn't manage to build this (due to Storm Eunice).
Explanation of concept
- A camera records the facial expressions of the patient.
- The Teachable Machine model compares them against its training data and finds the matching emotion.
- A colour is then shown in the corner of the doctor's screen. This colour tells the doctor which emotion was detected, so the doctor can act accordingly (a minimal sketch of this flow follows below).
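The sketch below shows roughly how these pieces could fit together in p5.js, assuming the ml5.js 0.x wrapper for loading Teachable Machine image models. The model URL, the emotion labels and the colour mapping are placeholders, not the exact code of the linked prototype.

```js
let video;
let classifier;
let detected = 'neutral';

// Hypothetical mapping from emotion label to indicator colour.
const colours = {
  happy:   [80, 200, 120],  // green
  sad:     [70, 110, 200],  // blue
  angry:   [220, 80, 80],   // red
  neutral: [180, 180, 180], // grey
};

function preload() {
  // Placeholder Teachable Machine model URL.
  classifier = ml5.imageClassifier(
    'https://teachablemachine.withgoogle.com/models/XXXXXX/model.json'
  );
}

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.hide();
  classifyFrame();
}

function classifyFrame() {
  // Ask the model for the most likely emotion in the current frame;
  // the callback queues the next classification as soon as this one returns.
  classifier.classify(video, handleResult);
}

function handleResult(error, results) {
  if (!error && results.length > 0) {
    detected = results[0].label;
  }
  classifyFrame();
}

function draw() {
  image(video, 0, 0, width, height);
  // The colour swatch in the corner of the doctor's screen.
  const c = colours[detected] || colours.neutral;
  noStroke();
  fill(c[0], c[1], c[2]);
  rect(width - 60, 10, 50, 50);
}
```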
Used technologies
Teachable Machine, p5.js
Link to prototype
https://editor.p5js.org/xhxin9/sketches/kCEQZ-MA6
Video
Findings from testing
- Recognizing emotions reliably is really hard. The model only picks up an emotion when the facial expression is very pronounced, so at the moment it is not suitable for use during appointments, where patients mostly show subtle emotions (one possible software-side mitigation is sketched after this list).
- To make the AI more accurate, we should also train it on tone of voice, hand gestures, etc. We could also connect it to heart-rate sensors, for example.
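One partial mitigation for the subtle-expression problem would be to only trust high-confidence predictions. This is a sketch under the same ml5.js assumptions as above, reusing the names from the earlier block; the threshold value is an arbitrary assumption that would need tuning on real footage.

```js
// Hypothetical confidence cut-off; the right value would need tuning.
const CONFIDENCE_THRESHOLD = 0.85;

// Drop-in replacement for handleResult() in the sketch above: only
// accept the top label when the model is clearly confident, otherwise
// fall back to the neutral indicator instead of guessing.
function handleResult(error, results) {
  if (!error && results.length > 0) {
    const top = results[0];
    detected = top.confidence >= CONFIDENCE_THRESHOLD ? top.label : 'neutral';
  }
  classifyFrame();
}
```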