Drive by Light
Interactive projection-based steering wheel
Time
May 2023
Client
MSc Thesis
Project type
Automotive UX Design
Role
Designer
Tools
Figma, Protopie, Arduino, Fusion360, Blender
Problem outline
The research activity pointed out that, despite the inevitable progression towards fully autonomous vehicles, human driving will most likely remain one of the primary activities performed in a car, whether as a discretionary choice or as an auxiliary to the vehicle's intelligence. The aim of this design challenge was therefore to redesign the steering wheel and instrument cluster unit of the vehicle dashboard as an interactive solution.
The solution
Drive by Light is a steering wheel proposal in which a light source placed above the driver's head projects information and control labels (contextualized for each accessible feature) onto the whole surface of the steering wheel. The projection is mapped precisely onto the hardware controls and dynamically follows the rotation angle of the wheel, as sketched below.
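To give a sense of the angle-following mechanism, here is a minimal sketch of its core logic; the names and coordinate convention are illustrative, not taken from the actual prototype code. Label anchors live in wheel-centered coordinates and are rotated by the steering angle measured on the column, so each projected label stays locked onto the physical control it names.

```cpp
// Minimal sketch of the angle-following projection (illustrative names):
// rotating each label anchor by the measured steering angle keeps the
// projected label aligned with its hardware control as the wheel turns.
#include <math.h>

struct Point { float x, y; };  // label anchor relative to the wheel hub

Point follow_wheel(Point anchor, float steeringAngleRad) {
    float c = cosf(steeringAngleRad), s = sinf(steeringAngleRad);
    // Standard 2D rotation about the hub; the projector draws the label
    // at the rotated position, on top of the control it refers to.
    return { anchor.x * c - anchor.y * s,
             anchor.x * s + anchor.y * c };
}
```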
Building on a standard set of hardware controls already adopted in market-ready vehicles, the steering wheel design provides eased, focused access to all infotainment features, adapting both the content shown and the corresponding control inputs to the specific feature selected.
The steering wheel interface is divided into three panes. The Information Display occupies the central portion of the wheel, providing contextualized information related to the selected feature.
On the left pane sit the controls for navigating the infotainment sections: a rotary encoder selector and mechanical buttons for going back, managing the home-page view, and activating voice commands.
On the right pane, a 3x3 matrix of capacitive touch pads hosts the detailed management and control of feature-related interactions. On this side of the steering wheel, the system projects control labels that change with the selected feature.
The process
Discover
Expert evaluations performed earlier in the research highlighted three main problems across all vehicles included in the analysis:
- Overcrowded control clusters on the steering wheel’s spokes (to a different degree in each vehicle) fail to give drivers fast interaction patterns for solving situational tasks, as they increase both the number of glances required and the average time spent per glance away from the driving horizon.
- An unclear hierarchy in the labeling and placement of hardware controls on the spokes prevents users from building scalable interaction patterns and habits when dealing with tasks of different natures.
- Unintuitive information displays on the instrument cluster, intended as feedback and feedforward for the steering wheel controls, likewise extend the time per glance and the learning time required.
All the deficiencies above belong to a deeper pattern: a lack of contextualization in the interfaces. In recent years, the constantly growing number and specificity of in-vehicle infotainment features has not been matched by a coherent adaptiveness of the control-cluster touchpoints.
Develop
The main working logic of the steering wheel interface is to provide a dedicated environment for each of the main features the driver might need while controlling the car manually. The intent is to spare users the iterative exploration of settings and screens caused by redundant shortcut controls occupying the interface with no purpose.
The approach has been to (1) systematize control gestures for functional driving tasks, (2) migrate and optimize task-related information display to reduce cognitive load while driving, and (3) locate all task interactions functional to the driving experience on the steering wheel, moving them away from the center tunnel.
At the detail level of interface functioning, each feature's environment contains all the controls related to that feature. Moreover, once inside a feature's environment, the controls projected on the right pane are mapped with the same placement scheme as the corresponding information displayed on the central pane, letting users quickly associate each control with the setting it operates, as in the sketch below.
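As a rough illustration of this one-to-one mapping (the feature names and labels here are mine, not the production content), each feature environment can be modeled as a pair of 3x3 grids sharing the same indexing, so pad (row, col) always operates the setting displayed at (row, col) on the central pane:

```cpp
// Minimal sketch of the pane mapping (illustrative labels): the right-pane
// pad grid and the central-pane info grid share the same (row, col) indexing,
// so each pad operates exactly the setting shown in its mirrored slot.
enum class Feature { Media, Navigation, Climate };

struct Environment {
    Feature feature;
    const char* padLabels[3][3];   // projected onto the 3x3 touch pads
    const char* infoSlots[3][3];   // displayed on the central pane
};

// Example environment for a media feature; pad (0,1) "Play" controls the
// "Now playing" slot shown at (0,1) on the central display.
const Environment media = {
    Feature::Media,
    { {"Prev",   "Play",    "Next"},
      {"Vol-",   "Mute",    "Vol+"},
      {"Source", "Shuffle", "Queue"} },
    { {"Previous track", "Now playing",   "Next track"},
      {"Volume down",    "Audio state",   "Volume up"},
      {"Active source",  "Playback mode", "Up next"} },
};
```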
Prototype and Test
The user validation testing aimed to assess the soundness of the concept and surface possible improvements, both local (interface details such as the affordances of digital and physical controls and the clarity of the visual interface) and global (the nature of the concept itself, as well as the information architecture, interaction patterns, and flows), through a task-based testing protocol with pre- and post-test surveys.
For this reason, I built an interactive, fully functioning prototype able to perform all the interaction flows and patterns of the vehicle's infotainment. Starting from a 3D-printed steering wheel shell, I embedded an Arduino circuit and code connected, through the Connect plugin, to a ProtoPie application of the infotainment previously created in Figma.
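As a sketch of how such a bridge can work (the message names, the `id||value` line format, the encoder wiring, and the use of an MPR121 capacitive touch breakout are my assumptions, not the exact prototype code), the Arduino side can read the encoder and the 3x3 pad matrix and stream events over serial for ProtoPie Connect to forward to the prototype:

```cpp
// Minimal sketch of an Arduino-to-ProtoPie Connect serial bridge.
// Assumptions: pads wired to an Adafruit MPR121 breakout (9 of 12 channels
// used), encoder on pins 2/3, and an "id||value" line protocol on the
// Connect side; a real build would debounce and use interrupts.
#include <Adafruit_MPR121.h>

Adafruit_MPR121 pads;
const int ENC_A = 2, ENC_B = 3;   // rotary encoder channels (assumed wiring)
int lastA = HIGH;
uint16_t lastTouched = 0;

void setup() {
    Serial.begin(9600);
    pinMode(ENC_A, INPUT_PULLUP);
    pinMode(ENC_B, INPUT_PULLUP);
    pads.begin(0x5A);             // default MPR121 I2C address
}

void loop() {
    // Encoder: emit one step message per falling edge of channel A,
    // with direction taken from channel B.
    int a = digitalRead(ENC_A);
    if (a == LOW && lastA == HIGH) {
        Serial.println(digitalRead(ENC_B) == HIGH ? "ENCODER||+1"
                                                  : "ENCODER||-1");
    }
    lastA = a;

    // Touch pads: emit the index (0..8) of any newly touched pad.
    uint16_t touched = pads.touched();
    for (int i = 0; i < 9; i++) {
        if ((touched & (1 << i)) && !(lastTouched & (1 << i))) {
            Serial.print("PAD||");
            Serial.println(i);
        }
    }
    lastTouched = touched;
}
```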
To simulate a driving-like situation as closely as possible, and given the technical complexity of adapting the prototype and projector to an actual simulator, an additional monitor playing driving footage was placed in front of the tester, behind the steering wheel. Each clip (different from tester to tester) was captured from gameplay sessions of a highway driving simulator.
To force their attention onto the road rather than the steering wheel, testers were asked to call out every event occurring in the video running in front of them (lane departures, turns, overtakes received and performed, sudden events happening on the road).
At irregular time intervals, testers were asked to perform tasks on the steering wheel interface. Throughout the test, an external camera running custom eye-tracking software recorded the number and average length of glances (movements of the eyes' focus from the driving horizon towards the steering wheel interface) away from the road during task performance. These eye-tracking values, crossed with the events missed in the road-driving videos, were sufficient to assess the effectiveness of the designed interaction patterns and to spot weaknesses to improve in the prototype.
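For reference, the glance metric reduces to a run-length computation over the per-frame gaze classification; the sketch below is my own formulation of it, not the actual eye-tracking code. A glance is a maximal run of frames in which gaze is off the driving horizon, and the output is how many glances occurred and their mean duration.

```cpp
// Minimal sketch of the glance metrics: count maximal off-road runs in the
// per-frame gaze classification and report their mean duration in seconds.
#include <vector>

struct GlanceStats { int count; double meanSeconds; };

// offRoad[i] is true when gaze left the driving horizon in frame i;
// frameSeconds is the duration of one frame (e.g. 1/30.0 at 30 fps).
GlanceStats glance_stats(const std::vector<bool>& offRoad, double frameSeconds) {
    int count = 0;
    long offFrames = 0;
    bool inGlance = false;
    for (bool off : offRoad) {
        if (off) {
            offFrames++;
            if (!inGlance) { count++; inGlance = true; }  // new glance starts
        } else {
            inGlance = false;                             // glance ended
        }
    }
    double mean = count ? (offFrames * frameSeconds) / count : 0.0;
    return { count, mean };
}
```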
Conclusions
Despite the limitations of the testing technology (all custom and DIY) and the time constraints that prevented validating the concept on the road, it was possible to measure improvements in user distraction. By spotting and measuring the length of each distraction from the driving horizon (recorded as glances), I could prioritize the improvements needed to progressively reduce them (see the data below, representing the average number and length of glances away from the road while performing tasks).
Beyond that, keeping all the infotainment interaction flows reachable without releasing hand contact from the steering wheel has been the main challenge this project set out to solve.
Potential next steps for this project include adopting a better, more accurate testing environment to assess its potential on a market vehicle and its efficiency on the road.