Exhibition 1_Page Group 08
Team number | Group 08 |
---|---|
Students | Celine Katelijn van Kooten, Chia-Pei Lai, Fabienne Wijshoff |
Coach | Dieter Vandoren |
Brief | Ford |
Keywords | AI, Emotion, Ford, Interaction, experience |
Exhibition link | |
Status | Finished |
One liner! | |
Link to video | |
- Introduction
- Concept 1: Making Your Life Easier (MYLE)
- Concept 2: Friendly Ford
- Concept 3: Smartly responding to your mood
- Our conclusions
Introduction
Problem statement
From Ford: HMW create trustworthy AI-based experiences that are relevant to our passenger vehicle users and our commercial vehicle users?
Timeframe: 2030-2035
Our design direction
The experience of driving a Ford car is largely determined by the emotions the user goes through when interacting with the car. Depending on many factors, the user can be happy, satisfied, frustrated, overjoyed, or disappointed by the experience.
We believe the purpose of a Ford AI should be to evoke emotions that result in positive user experiences. We see many opportunities in the fact that a car is a controlled microenvironment. We want to capitalize on this to give all occupants a complete and pleasant experience.
Video
Concept 1: Making Your Life Easier (MYLE)
Explanation
The vehicle learns the drivers' preferences the moment their phone connects with the car. It will access the calendar, find the next appointment, and open the map to prepare to navigate the user to the next destination. The music they were playing at that moment will continue as well. Drivers can have a more efficient life with Ford.
Your day is busy enough, and driving may be a relaxing moment between all your various accomplishments of the day. Getting your journey ready, and managing it while driving, can cause unnecessary stress and take up headspace. That's why we're introducing Making Your Life Easier.
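To illustrate the intended flow, here is a minimal sketch in Python. All names here (`Appointment`, `Navigation`, `MediaPlayer`, `on_phone_connected`) are hypothetical stand-ins invented for illustration, not a real Ford interface.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Appointment:
    title: str
    location: str
    start: datetime

class Navigation:
    def set_destination(self, location):
        print(f"Route prepared to {location}")

class MediaPlayer:
    def resume(self):
        print("Resuming the playlist from your phone")

def on_phone_connected(calendar, navigation, media_player):
    """Prepare the car the moment the driver's phone pairs."""
    # Find the next upcoming appointment in the synced calendar.
    upcoming = [a for a in calendar if a.start > datetime.now()]
    if upcoming:
        next_appt = min(upcoming, key=lambda a: a.start)
        # Pre-load the route so navigation is ready before driving off.
        navigation.set_destination(next_appt.location)
    # Continue whatever was playing on the phone.
    media_player.resume()

# Example: one appointment an hour from now.
calendar = [Appointment("Design review", "Ford HQ", datetime.now() + timedelta(hours=1))]
on_phone_connected(calendar, Navigation(), MediaPlayer())
```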
Prototype and testing
We wanted to test how participants respond to a personalized message the moment they enter the car.
Two possible scenarios we anticipated, our two hypotheses, were:
- the participant finds it pleasant and feels welcome
- the participant feels uneasy because the technology around them knows so much about them
A participant testing MYLE:
Feedback
The user test indicated that users would like to be welcomed by AI in their car. They mentioned that they would even like a more personalized AI.
Video
Concept 2: Friendly Ford
Explanation
With people marrying less in the future, having fewer children, and personalizing their cars, Friendly Ford accommodates those trends. A friendly AI greets you when you enter the car and starts a conversation with you. It knows your daily schedule and communicates with you depending on your feedback. If you are in a bad mood, Friendly Ford will try to cheer you up with suitable music, or give you the space you need, depending on the course of the conversation. You can also just tell Friendly Ford what to do and it will do it for you!
Feeling lonely? Need someone to talk to while you are on a trip alone? Don't worry, Friendly Ford has got your back!
Prototype
For the prototype of this concept we used the application Voiceflow, which gave us the possibility to create an AI that reacts to different inputs from a person with variable feedback.
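Voiceflow is a visual tool, but the logic underneath resembles a branching dialog tree. A minimal text-based sketch of that logic follows; the nodes, prompts, and keywords are invented for illustration, not our actual Voiceflow project.

```python
# Each node has a prompt and maps keywords in the user's reply to a next node.
DIALOG = {
    "greet": {
        "prompt": "Good morning! How are you feeling today?",
        "paths": {"good": "cheerful", "bad": "comfort"},
    },
    "cheerful": {
        "prompt": "Great to hear! Shall I put on your favorite playlist?",
        "paths": {},
    },
    "comfort": {
        "prompt": "Sorry to hear that. Would you like some quiet time, or some music?",
        "paths": {},
    },
}

def run_dialog(get_input, start="greet"):
    node = DIALOG[start]
    while True:
        print(node["prompt"])
        if not node["paths"]:
            return
        answer = get_input().lower()
        # Follow the first path whose keyword appears in the answer.
        next_key = next((n for k, n in node["paths"].items() if k in answer), None)
        if next_key is None:
            return  # no match: the conversation simply ends (see recommendations below)
        node = DIALOG[next_key]

# run_dialog(input)  # uncomment to try it interactively
```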
Testing
In order to see how well this idea performs in context, we created a small test setup that was meant to verify our concept.
We also tested how the participant would react to different types of 'Friendly Ford' faces.
Feedback
(+) What went well? | (-) What went wrong? |
---|---|
+ Personalized greeting | - Not enough utterances → had to influence testing a bit |
+ Different choice paths were used | - Laptop would not pick up sound very well |
+ The testing environment was as accurate as possible | - No feedback when the tester's responses were not well understood |
Future recommendations
If this concept were to be developed further, we would implement more utterances to make it more accurate, or add replies in which the AI asks the participant to rephrase what they said.
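A sketch of that recommended fallback behavior, assuming a simple keyword matcher; the utterances and replies below are invented examples.

```python
UTTERANCES = {
    ("music", "song", "playlist"): "Playing your favorite playlist.",
    ("navigate", "route", "directions"): "Starting navigation to your next appointment.",
}

def respond(text, ask, max_retries=2):
    """Match text against known utterances; ask the user to rephrase on a miss."""
    for attempt in range(max_retries + 1):
        lowered = text.lower()
        for keywords, reply in UTTERANCES.items():
            if any(k in lowered for k in keywords):
                return reply
        if attempt < max_retries:
            # Instead of staying silent, the AI asks the user to rephrase.
            text = ask("Sorry, I didn't quite catch that. Could you rephrase?")
    return "I'm having trouble understanding. Let's try again later."

# Example: respond("Can you put on a song?", ask=lambda prompt: input(prompt + " "))
```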
Short movie
Concept 3: Smartly responding to your mood
Explanation
How you feel affects your driving style and performance (1). When a user experiences very negative emotions, this can cause them to respond emotionally and drive more recklessly. To help the driver and passengers have a more pleasant and safer journey, the car smartly responds to these emotions using several forms of micro-feedback in the car.
By measuring atypical metrics (clenching the steering wheel, slamming doors, screaming in the back seat), the car can make an assessment about the moods of the driver and passengers and adjust its settings accordingly.
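A rough sketch of how such an assessment could work, expressed as simple rules over a few of those signals. The thresholds, units, and setting names are assumptions chosen for illustration, not measured values.

```python
def assess_mood(grip_force_n, door_close_speed_ms, cabin_volume_db):
    """Combine atypical-metric readings into a rough stress estimate."""
    score = 0
    if grip_force_n > 50:          # clenching the steering wheel (assumed threshold)
        score += 1
    if door_close_speed_ms > 2.0:  # slamming doors
        score += 1
    if cabin_volume_db > 80:       # screaming in the back seat
        score += 1
    return "stressed" if score >= 2 else "calm"

def adjust_settings(mood):
    """Pick micro-feedback: small cabin adjustments, not abrupt changes."""
    if mood == "stressed":
        return {"music": "calming playlist", "lighting": "soft", "climate": "slightly cooler"}
    return {}

print(adjust_settings(assess_mood(grip_force_n=62, door_close_speed_ms=2.4, cabin_volume_db=70)))
```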
Prototype + Testing
To test this concept, we have created a smart steering wheel prototype. This steering wheel recognizes your emotions by the way you interact with it (e.g. squeezing hard, slamming, pulling, caressing) and reacts to these with feedback aimed at changing this emotion for the better.
We wanted to use an Arduino to measure the data from the interaction between the user and the steering wheel, and create a digital response accordingly. Because our Arduino broke, we created a 'Wizard of Oz' prototype instead. Using scenarios, we simulated different emotions. Our goal was twofold: 1. for each emotion, we wanted to observe how a user acted upon that emotion; 2. we Wizard-of-Oz'ed a smart response from the car and observed their reaction.
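For reference, the laptop side of the intended Arduino setup could have looked like the sketch below: the Arduino streams force-sensor readings over USB serial, and a script classifies the grip into a rough emotion label. The port name and thresholds are assumptions, and the `pyserial` package is required.

```python
import serial  # pyserial

def classify_grip(force):
    """Map a 0-1023 analog force reading to a rough emotion label."""
    if force > 800:   # hard squeeze / slamming
        return "tense"
    if force < 100:   # light touch / caressing
        return "relaxed"
    return "neutral"

# The Arduino is assumed to print one analog reading per line.
with serial.Serial("/dev/ttyUSB0", 9600, timeout=1) as port:
    while True:
        line = port.readline().decode(errors="ignore").strip()
        if line.isdigit():
            print(classify_grip(int(line)))
```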
Feedback
Our main insights were that users appreciated the personalized responses from the car: they thought it was kind, but not always helpful in terms of changing their mood. Furthermore, we found that the interaction with the steering wheel can be an indicator of emotions, but cannot support a judgement with certainty; measuring through the steering wheel alone provided limited data input. We observed many other body movements and gestures that might be equal or better indicators, such as facial expression and shoulder movement/tension.
Short movie
Our conclusions
What we learned
During the past four weeks, we explored AI/machine learning, made our first prototypes, and created a first concept direction that fits our vision of the future.
We discovered our interest lies in the emotional side of AI: how can AI recognize/measure human emotion, act upon it, or maybe even display emotion itself? How can we create more embodied interactions (beyond touchscreens)? We believe that an AI can create more value for users when it truly understands the user. In order to do so, an AI should not only look at the outside of the human (i.e. facial expressions, body language); this data can easily be misinterpreted and overlooks the complexity of humans.
What's next?
We were inspired by the Ford Evos concept car, which uses electronics in the seats to determine the driver's pulse rate while driving and adapt the car to their physical condition. We believe Ford can take this biofeedback a step further in the future, so we propose a speculative design direction. To better understand users' emotions or moods, we want to look further than the outside of the human. By implementing non-invasive brainwave-measuring technology (such as EEG) in the chair and combining it with big data and AI, the car could understand the user precisely and offer a better response, making the people in the car feel more satisfied and happy.
By measuring brainwaves, the car can learn what a user really needs, something the user may not even realize themselves. The car can learn to smartly respond to these measurements and provide the ultimate service.
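To make the speculation concrete, here is a sketch of one common EEG heuristic: estimating relaxation from the alpha/beta band-power ratio. A real pipeline would need artifact removal and per-user calibration; this only illustrates the principle, and the sampling rate and frequency bands are standard assumptions.

```python
import numpy as np
from scipy.signal import welch

def band_power(signal, fs, low, high):
    """Integrate the power spectral density over a frequency band."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= low) & (freqs < high)
    return np.trapz(psd[mask], freqs[mask])

def relaxation_index(eeg, fs=256):
    alpha = band_power(eeg, fs, 8, 12)    # alpha waves: associated with calm
    beta = band_power(eeg, fs, 13, 30)    # beta waves: associated with alertness/stress
    return alpha / beta                   # higher suggests a more relaxed user

# Example on synthetic data: 10 seconds of noise standing in for an EEG trace.
eeg = np.random.randn(10 * 256)
print(f"relaxation index: {relaxation_index(eeg):.2f}")
```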
- The future of cars... really understands their users.
- Looking beyond the outside of the user (movements/facial recognition) towards biofeedback and more futuristic ways of understanding users (EEG).