User Research for Next-Generation In-Vehicle Experiences
Guiding Product Vision through User Insights
What if autonomous vehicles could have their own virtual assistant with a visual presence? We discovered a startup whose technology could animate characters from audio input, displaying appropriate emotions in real time. We saw potential for a novel virtual assistant that could transform how people interact with autonomous vehicles.
The concept raised some fascinating questions. Would passengers want a visual representation of their assistant in the car? Could this companion follow them from home to the vehicle? How would people interact with it, and what would they want to use it for? We wanted to understand the human side before moving forward with this idea.
From Concept to Research
I led a comprehensive three-part research initiative to answer these questions, coordinating across design, engineering, and creative teams to bring this concept to life for user testing.
Creating a Realistic Virtual Assistant
Working closely with our design team, I guided the development of a face design that felt natural without falling into the uncanny valley. This took numerous rounds of iteration and feedback: reviewing concepts, providing direction, and ultimately facilitating a team vote to select the final design. I also coordinated with engineering to integrate the startup's animation software and ensure working video demonstrations were ready on schedule for user testing.
Developing an Abstract Alternative
To give users meaningful choices, we needed more than one design direction. We selected a creative agency to develop an alternative design approach. In a brainstorming session with the agency, we tackled a core challenge: how do you visualize sound? Together, we created storyboards and developed three design directions. The agency pitched all three concepts to the team, we voted on the favorite, and they produced full video demonstrations for both the user study and our CES demo.
Designing and Executing the UX Study
The UX study brought all the pieces together: multiple assistant models, user scenarios, and research questions. I worked with two interns to structure the study, defining the core questions and determining the assistant variations to test. Working hands-on with engineering, I coordinated the animation of multiple assistant models and their technical integration into our test mockup. I led the development of the script and interview questionnaire, recruited 10 participants, and ran internal pilot sessions to refine our approach and process. I oversaw the interviews, led the analysis to extract key insights about user preferences and interaction patterns, and presented the findings to both the team and senior leadership.
Shaping Strategic Vision
The research delivered critical insights into design preferences, the use cases users found most valuable, and how they naturally wanted to interact with a virtual assistant in the car. These findings informed the development of our CES 2017 demo, which was presented to OEM executives in a private showroom.
The demo contributed to a broader narrative, positioning the company as an innovative partner in the emerging autonomous vehicle space. The project also established a repeatable research approach for evaluating other emerging technologies: using systematic user research to let data and real user feedback guide decisions and turn promising technologies into well-defined product concepts.