Scientists reveal new technology that will help us compute more safely on the move, and how 'body talk' could control mobiles

Published: 12 April 2005

New research by University of Glasgow scientists, enabling people to interact safely with mobile computers while walking, running or driving, could help prevent users from putting themselves in danger.


The research means that changing tracks on digital music players of the future while on the move could be done with the nod of the head.

Walking and texting is dangerous. While your eyes are glued to the tiny, hard-to-see display and your thumb is stabbing buttons, you might easily walk into an innocent bystander, a lamppost or under a bus. Those carrying mobile computers find it just as hard to operate a tiny keyboard or scribble with a stylus while walking.

The research, which has been funded by the Engineering and Physical Sciences Research Council (EPSRC), is being carried out at the University of Glasgow and is developing a solution to this problem using 3D sound and gestures.

"We hope to develop interfaces that are truly mobile, allowing users to concentrate on the real world while interacting with their mobile device as naturally as if they were talking to a friend while walking," explains Professor Stephen Brewster, from the University of Glasgow, who is leading the project.

If using our eyes is difficult and unsafe in a mobile environment, the next best thing is to use our ears, along with any other movements that do not interfere with the business of walking, running or driving.
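The release does not describe how sounds are placed around a listener, but the basic idea of an audio interface can be illustrated with a simple stereo panning rule. The sketch below uses constant-power panning, a crude stand-in for the true 3D audio rendering (e.g. head-related transfer functions) a project like this would use; the function name and thresholds are illustrative, not taken from the Glasgow work.

```python
import math

def pan_gains(azimuth_deg):
    """Constant-power stereo gains for a sound source at `azimuth_deg`.

    -90 = hard left, 0 = straight ahead, +90 = hard right.
    Only a minimal illustration of placing sounds in space around
    a user, not the project's actual 3D audio technique.
    """
    theta = math.radians((azimuth_deg + 90.0) / 2.0)  # map to 0..90 degrees
    return math.cos(theta), math.sin(theta)           # (left gain, right gain)

left, right = pan_gains(0.0)   # straight ahead: equal energy in both ears
```

Constant-power panning keeps `left**2 + right**2 == 1`, so a sound stays equally loud as it moves around the listener.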

The research team found that most previous research into audio interfaces and gesture recognition was based on a static user rather than a moving one. This led Professor Brewster - along with colleagues Dr Rod Murray-Smith, John Williamson and Georgios Marentakis - to develop 'audioclouds', a new way of interacting with computers on the move.

The project is currently halfway through its three-year research period, but the team already sees a number of additional applications, including using simple gestures, like a nod of the head, to change music tracks on your MP3 player.
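The release does not explain how a nod would be recognised, but one common approach is to watch the pitch angle reported by a head-mounted motion sensor for a forward dip followed by a recovery. The sketch below is a minimal illustration of that idea; the function name, thresholds and sample trace are all hypothetical, not details of the Glasgow project.

```python
def detect_nod(pitch_samples, down_thresh=-15.0, up_thresh=-5.0):
    """Detect a single head nod in a sequence of pitch angles (degrees).

    A nod is modelled as the pitch dipping below `down_thresh`
    (head tilts forward) and then recovering above `up_thresh`.
    Thresholds are illustrative assumptions.
    """
    dipped = False
    for pitch in pitch_samples:
        if not dipped and pitch < down_thresh:
            dipped = True          # head has tilted forward far enough
        elif dipped and pitch > up_thresh:
            return True            # head came back up: a complete nod
    return False

# Simulated pitch trace from a head-mounted sensor: level, dip, recover.
trace = [0.0, -4.0, -12.0, -20.0, -18.0, -9.0, -2.0, 0.5]
if detect_nod(trace):
    print("nod detected: next track")  # e.g. skip to the next song
```

A real system would also need to reject ordinary head movement while walking, which is exactly why research on moving rather than static users matters here.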

EPSRC spokeswoman, Lucy Brady, said: "The innovative aspect of this project is that engineering is being used to explore a new paradigm for interacting with mobile computers, based on 3D sound and gestures, to create interfaces that are powerful, usable and safer."

For more information, contact: Professor Stephen Brewster, Department of Computing Science, University of Glasgow, Tel: 0141 330 4966, E-mail: stephen@dcs.gla.ac.uk or Jenny Murray, Press Officer, University of Glasgow, Tel: 0141 330 8593

Media Relations Office (media@gla.ac.uk)


Three images are available from the EPSRC Press Office ('Audioclouds.jpg', 'Overview.jpg' and 'Expt kit.jpg'). Contact Natasha Richardson, Tel: 01793 444404, E-mail: natasha.richardson@epsrc.ac.uk.

The 3-year project, 'Audioclouds: Three-dimensional auditory and gestural interfaces for mobile and wearable computers', started in October 2002 and is receiving EPSRC funding of £238,765. To find out more about audioclouds visit www.audioclouds.org.

