Human-computer interaction in cars

Devising the best approach for driver alerts and controls in next-generation vehicles

Advances in automotive technology are changing the relationship between the car and the driver. In effect, the two have become partners in controlling the vehicle and in observing and reacting to external stimuli. As with any activity done in pairs, communication is vital to understanding who needs to do what, and when.

Cars need to be programmed to know the most effective ways to interact with a human driver, and equipped with the right tools to do this. The human driver needs safe and simple ways to communicate back to the car.

Determining all of this is a focus for Professor Stephen Brewster and other computing scientists within the Multimodal Interaction Group.

"What we're looking at is how people use technology, that's the human computer interaction. The multimodal part is how do we use all the senses and control capabilities we've got as humans," says Professor Brewster.

Designing feedback for semi-autonomous driving

The shift toward semi-autonomous driving creates one of the key challenges of multimodal interaction within a vehicle.

"At some point you’ve got to hand over control from the car to the driver. You need to avoid the scenario where the car thinks the driver is driving, the driver thinks the car is driving, and no one is driving" says Professor Brewster.

The group design experiments to understand the range of human perception for particular sensory stimuli – haptic (touch-based feedback), visual or aural – or a combination of these. From this, they determine how these stimuli can be applied to convey information most effectively in various scenarios. They then design and test prototypes for these scenarios to see if, or how, they work in practice.

For something like alerting the driver of a semi-autonomous vehicle that they need to take back control, this means fairly intense stimulation of as many senses as possible.

"For the sort of feedback related to semi-autonomous vehicles, we look at haptics on the seatbelt, on the wrist (similar to the alerts of a smart watch) and on the seat (vibration). Then we combine that with visual warnings; we combine that with audio warnings. The combination of all three of modes works the best," explains Professor Brewster. “Unsurprisingly the visual ones don't work very well if you're playing on your phone while the car is being controlled autonomously and you don't see the light on the dashboard flashing," he adds.

Developing new concepts for haptic feedback

The possibilities for in-car interaction are limited only by the range of human sensory awareness. The group have funding from EPSRC and Jaguar Land Rover to explore a variety of haptic feedback innovations, often in combination with visual and audible alerts. Since it is in constant contact with the hands, the steering wheel is a natural place to experiment with haptic feedback in a car.

With Jaguar Land Rover, the group are currently exploring how temperature might be used to convey information to a driver.

"If you think of a bare-hand interaction with anything, you always get temperature feedback. But most of the time this is ignored. We thought, well, if it's there and humans can sense it, we ought to study it,” explains Professor Brewster.

"We try to understand something about the basic perception of that feedback. Then we look at how we can apply that in the design of a user interface, in this case on a steering wheel. How much hot and how much cold do you need? What are the steps you can detect? But we do this research in more practical situations than maybe a psychologist would do."

Using Peltier elements that can both heat and cool the steering wheel, the researchers have looked into applying this feedback for navigation (warming or cooling different sides of the wheel to indicate upcoming manoeuvres) and for alerting drivers to fatigue.
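As a rough illustration, thermal navigation cues could be driven by logic like the sketch below. The baseline temperature, the 6 °C step and the left/right Peltier layout are assumptions for illustration; the actual perceptible steps are exactly what the group's perception studies set out to measure.

    def thermal_cue(manoeuvre: str, baseline_c: float = 30.0) -> dict:
        """Target temperatures (deg C) for the left/right Peltier elements.

        Warming one side of the wheel hints at a turn in that direction.
        """
        DELTA_C = 6.0  # assumed noticeable step, not a measured value
        targets = {"left": baseline_c, "right": baseline_c}
        if manoeuvre == "turn_left":
            targets["left"] += DELTA_C
        elif manoeuvre == "turn_right":
            targets["right"] += DELTA_C
        return targets

    print(thermal_cue("turn_left"))  # {'left': 36.0, 'right': 30.0}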

The group are also looking into other novel ways of alerting a driver using the steering wheel.

"We've looked into pins that come out of the steering wheel. So if you imagine little pins with rounded ends that push against you. We call this cutaneous push feedback and we can apply it to create different patterns,” says Professor Brewster. “With vibration, all you get is the sensation itself. With these pins you can feel patterns, so you can create more informative messages with those.”

Adapting to touchscreens in cars

Another opportunity for haptic technologies and multimodal alerts arises from the trend in modern vehicles to reduce dashboard controls to a single, often very large, touchscreen. Drivers are losing the familiar knobs and dials that they could adjust with minimal visual attention.

Add to this the increasing number of functions a car can perform and the information-intensive nature of some of these selection tasks – navigation, song selection – and there is a clear need to give drivers back some non-visual cues (or visual cues that require less attention) so they can pay proper attention to the road.

The group are partners on a project called HAPPINESS, working with companies such as Bosch to introduce tactile sensations such as roughness, vibration and relief into a touchscreen-type interface.

"Everybody wants to remove physical controls from the cars and just replace them with touchscreens, just like we did with phones. The work we're doing with tactile feedback on touchscreens in the car was motivated by work we did on tactile feedback for touchscreens on the phone. We thought, well the next problem is touchscreens are appearing in cars and they've got exactly the same issues as phones so how can we deal with that?" says Professor Brewster.

Freehand gestures and ultrahaptics

Another approach to offering alternative controls in a vehicle is freehand gestures, where a driver can control certain functions by making mid-air hand movements. These demand less attention than a touchscreen because the driver doesn't need to focus on a specific button or target.

The issue here is that when making gestures in mid-air, you have no surface to receive feedback from. But this 'surface' can now be created using ultrasonic haptics, a technology the group helped to develop in partnership with the University of Bristol.

This technology enables the creation of focused air pressure waves by using an array of ultrasound transducers. This focused air pressure generates, in effect, a surface in mid-air which the user can feel when they put their fingers in its vicinity. The group are also developing this technology on a project called Levitate, where this ultrasonic array is used to suspend particles in mid-air and create visible 3D objects.
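The focusing itself rests on simple acoustics: if each transducer emits so that its wave arrives at the focal point in step with all the others, the pressure adds up there. The sketch below computes per-transducer phase offsets; the 40 kHz carrier and the 4x4, 1 cm-pitch array are typical values assumed for illustration, not details from the article.

    import math

    SPEED_OF_SOUND = 343.0  # m/s in air
    CARRIER_HZ = 40_000.0   # assumed ultrasound carrier frequency

    def emission_phases(transducers, focus):
        """Phase offset (radians) per transducer so waves align at `focus`."""
        phases = []
        for position in transducers:
            time_of_flight = math.dist(position, focus) / SPEED_OF_SOUND
            # Offset each emission by its travel time, wrapped to one cycle,
            # so every wavefront reaches the focal point in phase.
            phases.append((-2 * math.pi * CARRIER_HZ * time_of_flight)
                          % (2 * math.pi))
        return phases

    # A 4x4 array in the z=0 plane, 1 cm pitch, focused 15 cm above its centre.
    array = [(i * 0.01, j * 0.01, 0.0) for i in range(4) for j in range(4)]
    print(emission_phases(array, focus=(0.015, 0.015, 0.15)))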

Applied to mid-air gesturing in a car, ultrasonic haptics can be configured to generate familiar sensations that mimic the tactile response you might anticipate from actions like scrolling a page or turning a dial.

Multimodal cues are also important in this setting. One of the group's researchers has been examining the types of multimodal feedback that work well with freehand gestures. These include ambient visual displays on the dashboard using an LED strip: when you use a gesture to change the temperature, for example, a light might shift from blue to red across the top of the dashboard. This gives the driver feedback without demanding much of their visual attention.
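Such an ambient display can come down to a very small piece of logic, as the sketch below suggests. The strip length, the 16-28 °C range and the colour mapping are assumptions for illustration.

    NUM_LEDS = 60
    TEMP_MIN_C, TEMP_MAX_C = 16.0, 28.0  # assumed adjustable cabin range

    def strip_state(setpoint_c):
        """An (R, G, B) colour per LED for the current temperature setpoint."""
        frac = (setpoint_c - TEMP_MIN_C) / (TEMP_MAX_C - TEMP_MIN_C)
        frac = max(0.0, min(1.0, frac))
        lit = round(frac * (NUM_LEDS - 1))
        colour = (round(255 * frac), 0, round(255 * (1 - frac)))  # blue -> red
        # Light the strip up to the setpoint's position in the blended colour.
        return [colour if i <= lit else (0, 0, 0) for i in range(NUM_LEDS)]

    state = strip_state(22.0)
    print(state[0], state[-1])  # lit at the start, dark past the setpoint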

Research and development for accessibility

For Professor Brewster and his colleagues, their work on cars has its origins in the group's initial and continuing focus on enabling technologies for people with disabilities. A key part of their research is focused on improving human-computer interaction for people with various types of physical and mental impairment.

"Part of our work has always been on accessibility. What we've found is that our work developing computer interfaces and suitable feedback responses for visually impaired people can also be applied to scenarios where visual attention is limited, like undertaking tasks when you're driving. It is the same with our research to develop interfaces to help those with motor impairments. Some of this work can also apply to situations using an interface inside a moving vehicle," he explains. 

"We're interested in how you can apply all of the things humans can do, and sense, to all these different settings. That's what I started doing on my PhD and then we've kept doing it ever since."

Virtual reality

The group are also conducting research into virtual reality (VR) and the use of head-mounted displays. They are focusing on the perception of motion while using a head-mounted display (HMD) and how this can contribute to motion sickness.

They've also taken this research into the environment of a moving car, examining how using an HMD as a passenger can induce motion sickness: a result of the conflict between the virtual motion perceived in the headset and the real motion of the vehicle.

"We conducted the first on-road and in motion study to systematically investigate the effects of various visual presentations of the real-world motion of a car on the sickness and immersion of VR HMD wearing passengers. We established new baselines for VR in-car motion sickness and found that there is no one best presentation with respect to balancing sickness and immersion. Instead, user preferences suggest different solutions are required for differently susceptible users to provide usable VR in-car,” says Professor Brewster.