This presentation focuses on the potential of gestures as tools in design. Hand gestures and facial expressions can reveal points of view, feelings, and opinions. Recent technological advances allow signs and gestures to be recognized by computer-supported devices. By capturing gestures, it is possible to enlarge the information space and improve the user experience.
A gesture recognition model using novel technologies (e.g. Leap Motion, Intel RealSense) is proposed. Two scenarios will be discussed:
- Sign language recognition: Sign languages are natural languages used mostly by deaf and hard-of-hearing people. The complexity of automatic sign language recognition stems from its multiple dimensions: handshape, non-manual markers, movement, palm orientation, and location. Within a dialogue, a change in any of these dimensions can completely alter the meaning of a phrase.
- Gesticulation recognition: In meetings, people exchange not only verbal messages but also nonverbal messages through multimodal channels such as gaze, body posture, and facial, head, and hand gestures.
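As a rough illustration of the kind of processing involved, the sketch below classifies a handshape as open or closed from palm and fingertip positions, the style of landmark data that sensors such as Leap Motion or Intel RealSense provide. The sample coordinates and the threshold are hypothetical and not part of the proposed framework:

```python
import math

# Hypothetical hand landmarks: a palm centre and five fingertip positions
# (x, y, z), loosely in the style of data from Leap Motion or Intel RealSense.
OPEN_HAND = {
    "palm": (0.0, 0.0, 0.0),
    "tips": [(0.0, 9.0, 0.0), (2.0, 10.0, 0.0), (4.0, 9.5, 0.0),
             (6.0, 8.5, 0.0), (8.0, 7.0, 0.0)],
}
FIST = {
    "palm": (0.0, 0.0, 0.0),
    "tips": [(0.5, 2.0, 1.0), (1.0, 2.5, 1.2), (1.5, 2.2, 1.1),
             (2.0, 2.0, 1.0), (2.5, 1.8, 0.9)],
}

def classify_handshape(hand, threshold=5.0):
    """Label a hand 'open' or 'closed' by mean fingertip-to-palm distance.

    The threshold is an illustrative constant, not a calibrated value.
    """
    mean_extension = sum(
        math.dist(tip, hand["palm"]) for tip in hand["tips"]
    ) / len(hand["tips"])
    return "open" if mean_extension > threshold else "closed"

print(classify_handshape(OPEN_HAND))  # prints "open" for the sample data
print(classify_handshape(FIST))       # prints "closed" for the sample data
```

A real recognizer would of course combine such geometric features with the other dimensions mentioned above (movement, orientation, non-manual markers), typically via a trained classifier rather than a fixed threshold.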
Moreover, challenges related to gestures in design will be discussed.
Luis is a Computer Science doctoral student from Costa Rica. He is visiting Dr. Natalia Romero for a period of 16 weeks. The goal of his stay is to use the equipment and technologies available at TU Delft to implement a software prototype that combines the knowledge of TU Delft researchers with the gesture recognition framework proposed by Luis. The prototype addresses the use of gesture and facial expression recognition in group meetings. In Costa Rica, Luis researches automatic sign language recognition using natural user interfaces.
Wednesday Nov 15, 2017
16:00 – 17:00