Dr Mohamed Khamis
- Lecturer (Computing Science)
Dr Mohamed Khamis is a lecturer (assistant professor) in the School of Computing Science and a member of the Glasgow Interactive SysTems (GIST) research section. He received his PhD from Ludwig Maximilian University of Munich (LMU) in Germany, where he was supervised by Florian Alt and Andreas Bulling. He also worked as a research associate at the German Research Center for Artificial Intelligence (DFKI) in Germany and at the German University in Cairo (GUC) in Egypt.
Mohamed's research interests are at the intersection of Human Computer Interaction, Ubiquitous Computing, and User Privacy and Security.
Mohamed has published 60+ peer-reviewed papers, including 6 papers at ACM CHI, the top conference in Human-Computer Interaction, three of which received honourable mention awards (top 5% of ~3000 submissions). He has also published at top conferences such as UIST, UbiComp, ETRA, ICMI and MobileHCI, and in top journals such as PACM IMWUT.
Mohamed is a program committee member of ACM CHI. He was previously a PC member of CHI LBW, PerDis, MUM, and others. He was General co-chair of ACM PerDis 2019. He has also organized several workshops, such as PETMEI, ArabHCI and Human-Drone Interaction.
He reviews for top conferences and journals, such as ACM ToCHI and ACM ToPS. He has given guest talks in Germany, Italy, Egypt and the UK.
Many of Mohamed's accomplishments resulted from collaborations with scholars in Germany, the UK, France, the USA, Japan, Finland, Austria, Italy, Slovenia, and Egypt.
He is a member of ACM and SIGCHI. He is a founding member of the Cairo ACM SIGCHI chapter, and co-organized the first ArabHCI workshop at CHI 2018.
You can learn more about my research through:
I have worked on a diverse range of topics within HCI, privacy and ubiquitous computing. In particular, I am active in the fields of eye tracking, pervasive displays, usable security, and user privacy. My contributions are at the crossroads of user privacy and ubiquitous computing:
- Usable Security and Privacy: I have contributed to (1) understanding threats to user privacy that are caused or facilitated by ubiquitous technologies, such as thermal attacks (CHI'17, Honorable Mention) and shoulder surfing (CHI'17, MUM'17, INTERACT'19, ETRA'20), and (2) inventing novel ubiquitous systems for protecting user privacy and security on mobile devices (CHI'20, CHI'19, ICMI'17, MTI), public displays (IMWUT, ETRA'19, PerDis'17), and in VR (IEEE VR'19, USEC'17).
- Designing Gaze-based Systems: I have contributed to comprehensively understanding and addressing the challenges of gaze interaction on ubiquitous devices, such as mobile devices (CHI'18, MobileHCI'18) and public displays (UIST'17, UbiComp'16, MUM'16), as well as proposing novel concepts that employ gaze to address problems on public displays (CHI'18, IMWUT, PerDis'17) and mobile devices (ICMI'17). I have published survey papers that reflect on prior work and set the agenda for future research in eye tracking on mobile devices (MobileHCI'18) and eye tracking for security applications (CHI'20).
You can learn more about my research and bio on http://www.mkhamis.com/
Abdelrahman, Y., Khamis, M., Schneegass, S. and Alt, F. (2017) Stay Cool! Understanding Thermal Attacks on Mobile-based User Authentication. In: CHI '17: CHI Conference on Human Factors in Computing Systems, Denver, CO, USA, 6-11 May 2017, pp. 3751-3763. ISBN 9781450346559 (doi:10.1145/3025453.3025461)
Khamis, M., Trotter, L., Mäkelä, V., von Zezschwitz, E., Le, J., Bulling, A. and Alt, F. (2018) CueAuth: comparing touch, mid-air gestures, and gaze for cue-based authentication on situated displays. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 2(4), 174. (doi:10.1145/3287052)
Eiband, M., Khamis, M., von Zezschwitz, E., Hussmann, H. and Alt, F. (2017) Understanding Shoulder Surfing in the Wild: Stories from Users and Observers. In: CHI '17: CHI Conference on Human Factors in Computing Systems, Denver, CO, USA, 6-11 May 2017, pp. 4254-4265. ISBN 9781450346559 (doi:10.1145/3025453.3025636)
Khamis, M., Buschek, D., Thieron, T., Alt, F. and Bulling, A. (2017) EyePACT: eye-based parallax correction on touch-enabled interactive displays. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 1(4), 146. (doi:10.1145/3161168)
Khamis, M., Becker, C., Bulling, A. and Alt, F. (2018) Which One is Me?: Identifying Oneself on Public Displays. In: 2018 CHI Conference on Human Factors in Computing Systems, Montréal, QC, Canada, 21-26 Apr 2018, p. 287. ISBN 9781450356206 (doi:10.1145/3173574.3173861)
Khamis, M., Baier, A., Henze, N., Alt, F. and Bulling, A. (2018) Understanding face and eye visibility in front-facing cameras of smartphones used in the wild. In: 2018 CHI Conference on Human Factors in Computing Systems, Montréal, QC, Canada, 21-26 Apr 2018, p. 280. ISBN 9781450356206 (doi:10.1145/3173574.3173854)
Mäkelä, V., Khamis, M., Mecke, L., James, J., Turunen, M. and Alt, F. (2018) Pocket Transfers: Interaction Techniques for Transferring Content from Situated Displays to Mobile Devices. In: 2018 CHI Conference on Human Factors in Computing Systems, Montréal, QC, Canada, 21-26 Apr 2018, p. 135. ISBN 9781450356206 (doi:10.1145/3173574.3173709)
Royal Society of Edinburgh (RSE) Sabbatical Research Grant (£65,616), March 2020
John Robertson Bequest research grant (£1,500), March 2019
Google Pilot Research Award (~£500), March 2016
Open topic 1: Designing Eye Gaze interaction for Handheld Mobile Devices
Imagine controlling a mobile device with your eye movements. Front-facing cameras of handheld mobile devices are continuously advancing; in particular, recent smartphones feature depth cameras (e.g. the iPhone X), which can significantly improve the quality of eye tracking on mobile devices compared to earlier hardware. Gaze is fast, and interacting with it is intuitive and natural, offering many benefits to the user. In addition to being a hands-free interaction modality, gaze is subtle and thus suitable for sensitive interactions (e.g. entering passwords). Gaze can also improve other forms of interaction, such as touch. Imagine looking at a button at the top of the screen and, instead of reaching for it with your finger, simply gazing at it and tapping anywhere on the screen to activate it. This would significantly increase interaction speed and thereby improve the overall user experience.
However, interaction with mobile devices using eye gaze is also challenging. Mobile devices are used in dynamic contexts (e.g. while walking), so how can we ensure accurate interaction in these contexts? This would require computer vision work and/or novel gaze interaction techniques that work even when eye tracking data is inaccurate (e.g. see work on Pursuits and gaze gestures). Another problem is that people do not always hold phones in a way that reveals their eyes to the front-facing camera. How can we guide them to hold the phone in a suitable manner? These are some of the challenges that stand in the way of enabling gaze interaction on mobile devices.
This PhD project aims to:
1) Identify opportunities for using eye gaze on mobile devices
2) Identify challenges that hinder the adoption of eye gaze interaction on mobile devices
3) Address some of the core challenges identified in step (2)
Mohamed Khamis, Florian Alt, and Andreas Bulling. 2018. The past, present, and future of gaze-enabled handheld mobile devices: survey and lessons learned. In Proceedings of the 20th International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI '18). ACM, New York, NY, USA, Article 38, 17 pages. Available at: http://www.mkhamis.com/data/papers/khamis2018mobilehci.pdf
Mélodie Vidal, Andreas Bulling, and Hans Gellersen. 2013. Pursuits: spontaneous interaction with displays based on smooth pursuit eye movement and moving targets. In Proceedings of the 2013 ACM international joint conference on Pervasive and ubiquitous computing (UbiComp '13). ACM, New York, NY, USA, 439-448. DOI: https://doi.org/10.1145/2493432.2493477
Heiko Drewes and Albrecht Schmidt. 2007. Interacting with the Computer Using Gaze Gestures. In: Human-Computer Interaction – INTERACT 2007. Lecture Notes in Computer Science, vol. 4663. Springer, Berlin, Heidelberg. https://link.springer.com/chapter/10.1007/978-3-540-74800-7_43
Current PhD Students
Professional activities & recognition
Prizes, awards & distinctions
- 2020: Top 5% of submissions to CHI 2020 (CHI Honorable Mention Award)
- 2018: Top 5% of submissions to CHI 2018 (CHI Honorable Mention Award)
- 2018: Top 5% of submissions to MobileHCI 2018 (MobileHCI Honorable Mention Award)
- 2017: Top 5% of submissions to CHI 2017 (CHI Honorable Mention Award)
- 2017: Top 5% of submissions to MUM 2017 (MUM Honorable Mention Award)
- 2020 - 2021: Royal Society of Edinburgh Sabbatical Research Grant
- 2019 - 2021: ACM CHI Program Committee
- 2018: ACM CHI LBW Program Committee
- 2017 - 2018: ACM PerDis Program Committee
- 2016 - 2018: ACM MUM Program Committee
- 2019 - 2020: Special issue on Pervasive Displays in the Springer journal Personal and Ubiquitous Computing
- 2019: Special issue on HCI in the Arab World in ACM Interactions magazine
Professional & learned societies
- 2018: Professional Membership, Association for Computing Machinery (ACM)
- 2018: Professional Membership, Special Interest Group on Computer-Human Interaction (SIGCHI)
- 2018: Founding member, Cairo ACM SIGCHI Chapter
More about me on my website: http://www.mkhamis.com/