Eye-tracking research is a peek into the future of mobile device interaction

Published: 3 April 2023

A new study exploring how mobile devices can be controlled solely by the movements of users’ eyes could offer a peek into the future of gaze-based interactions with smartphones, researchers say.

Human-computer interaction specialists from universities in Scotland, Germany and Portugal have taken a closer look at how eyes can be used to control mobile devices and made a series of recommendations on how to integrate gaze-interaction into future generations of tech.
 
Theirs is the first study to investigate how three forms of gaze interaction perform while users are walking or sitting, and which methods users prefer in each situation. The results are set to be presented as a paper at the ACM Conference on Human Factors in Computing Systems later this month.
 
The paper could help shape the user experience of future mobile devices, which are likely to embrace eye-tracking technology as front-facing cameras become ever-more sophisticated.
 
The paper is based on an evaluation of 24 study participants’ experiences with using different eye-based interaction methods while they were seated at a desk and then walking around a room. The participants used the methods to select specific targets from a grid of white, circular shapes on a mobile phone screen each time one of the targets turned from white to black.

https://youtu.be/YgEl5WbTDaQ
 
Over the course of the study, participants were asked to select different numbers of onscreen targets. The number of targets varied between two and 32, and the conditions were counterbalanced between participants to minimise the influence of extraneous factors, such as practice or fatigue, on the experimental results.
 
The three methods the participants were tasked with using were Dwell Time, Pursuits and Gaze Gestures. Dwell Time lets users select items by fixating their gaze on a target for 800 milliseconds. In Pursuits, users follow a small object orbiting around the target to select it. Gaze Gestures uses a multi-stage process where users look off-screen either to the left or right to narrow down the number of targets until they reach the one they want to select.
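 
As a rough illustration of how the simplest of the three techniques might work in practice, the sketch below shows minimal dwell-time selection logic in Python. It is an assumption-laden example rather than the study’s actual code: the gaze-sample format, the hit-testing of gaze onto targets, and the function name are all hypothetical; only the 800-millisecond threshold comes from the paper.

```python
# Minimal dwell-time selection sketch (illustrative only, not the study's code).
# A gaze sample is assumed to be (timestamp_in_seconds, target_id_or_None),
# where target_id is whichever on-screen target the gaze currently falls on.

DWELL_THRESHOLD_S = 0.8  # the paper's 800 ms dwell time


def dwell_select(gaze_samples, threshold=DWELL_THRESHOLD_S):
    """Return the id of the first target fixated for `threshold` seconds, or None."""
    current_target = None
    fixation_start = None

    for timestamp, target in gaze_samples:
        if target is None or target != current_target:
            # Gaze moved off the target (or onto a new one): restart the dwell timer.
            current_target = target
            fixation_start = timestamp if target is not None else None
            continue
        if fixation_start is not None and timestamp - fixation_start >= threshold:
            return current_target  # dwell threshold reached: select this target

    return None  # no target was fixated long enough


# Example: gaze lingers on target 3 from t=0.1 s to t=1.0 s, so it is selected.
samples = [(0.0, None), (0.1, 3), (0.5, 3), (1.0, 3), (1.2, None)]
print(dwell_select(samples))  # -> 3
```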
 
The researchers found that, when seated, the participants preferred to use Pursuits. That method was also faster – on average, it took just 1.36 seconds for users to select a target, compared to 2.33 seconds with Dwell and 5.17 seconds using Gaze Gestures.
 
When on the move, users preferred Dwell Time. At an average of 2.76 seconds it was slightly slower than Pursuits at 2.14 seconds but faster than Gaze Gestures at 6.68 seconds.
 
Dr Mohamed Khamis, of the University of Glasgow’s School of Computing Science, supervised the research and co-authored the paper. He said: “Eye tracking has been well-studied in recent years across a range of user environments, but most of that research was carried out in settings where either the user, the camera, or both were stationary.
 
“As smartphone camera technology has advanced, it’s become much more practical for eye-tracking to be implemented in those devices despite the challenges of both the device and the user moving at the same time.
 
“It’s a very promising method for enabling quick interaction with devices, and it could make it easier for people with mobility issues to use smartphones, as well as expanding the number of situations where devices can be used. Anyone who has tried to take an important call while wearing gloves or carrying something heavy in one hand will know how difficult it can be to free up their hands to touch their device.
 
“What we set out to do is explore how users might prefer to use eye-tracking to control their devices, and set out guidelines for future developments.”
 
Omar Namnakani, a PhD student at the School of Computing Science, is the first author of the paper. He added: “In the paper, we suggest a few guidelines for deciding how gaze-based interaction should be used in different situations.
 
“Where users are sitting and there are fewer than nine targets on screen, Pursuits seems to be the best method to use based on our research. However, Pursuits can be tiring to use when there are more targets. In those cases, Dwell Time is the best option both when sitting and moving around.
 
“However, despite users’ preferences in our study and the slower speed of the input, Gaze Gestures was the most accurate method of selection when users were both seated and moving. When accuracy is more important than speed, that should be the preferred method of selection.”
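 
Taken together, those guidelines amount to a small decision rule. The sketch below encodes them as stated in the quotes above; the function name, the boolean inputs and the treatment of nine targets as the cut-off are illustrative assumptions, not part of the paper.

```python
# Hypothetical decision rule summarising the guidelines quoted above.
# The inputs and function name are illustrative, not taken from the paper.

def recommend_method(num_targets: int, seated: bool, accuracy_critical: bool) -> str:
    """Pick a gaze-interaction method following the study's stated guidelines."""
    if accuracy_critical:
        # Gaze Gestures was the most accurate method whether users were seated or moving.
        return "Gaze Gestures"
    if seated and num_targets < 9:
        # Pursuits was preferred and fastest for seated users with few targets.
        return "Pursuits"
    # With more targets, or while walking, Dwell Time was the better option.
    return "Dwell Time"


print(recommend_method(num_targets=4, seated=True, accuracy_critical=False))    # Pursuits
print(recommend_method(num_targets=32, seated=False, accuracy_critical=False))  # Dwell Time
```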
 
The researchers plan to continue their collaboration to examine new methods of eye interaction which could improve on the three techniques explored in the paper. They are also keen to explore the implications of eye-tracking technology for users’ digital privacy.
 
Dr Jonathan Grizou of the University of Glasgow, Dr Augusto Esteves from the University of Lisbon, and Yasmeen Abdrabou of the University of the Bundeswehr and the University of Glasgow are co-authors on the paper.
 
The team’s paper, titled ‘Comparing Dwell time, Pursuits and Gaze Gestures for Gaze Interaction on Handheld Mobile Devices’, will be published in Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems.

The research was supported by funding from the Engineering and Physical Sciences Research Council (EPSRC), the Islamic University of Madinah, the Royal Society of Edinburgh, and the Fundação para a Ciência e a Tecnologia and LARSy.


