Intelligent robot 'dog' leads way for the blind - and a smarter future

Published: 15 April 2024

RoboGuide project uses the latest in 5G technology, real-time sensor data and large language models to help guide and inform blind and partially sighted users

Researchers from the James Watt School of Engineering’s Computer Sensing and Imaging (CSI) group have been working with The Forth Valley Sensory Centre (FVSC) Trust and the Royal National Institute of Blind People (RNIB) Scotland to design and test an innovative ‘robot dog’ helper for the visually impaired, gaining praise from end users – and much media attention – in the process. The 9-month project, funded by the Engineering and Physical Sciences Research Council (EPSRC), aims to get its ambitious new assistive technology ready for market.

Female user testing the CSI group's 'robot dog' at the Hunterian Museum

Some of the hardware and methods used by project investigators Dr Wasim Ahmad, Dr Olaoluwa Popoola, Prof Muhammad Imran, and Dr Paul Lynch and their team are perhaps familiar, being based on the agile, modular Unitree quadruped robot. However, our researchers’ RoboGuide (affectionately known as ‘Robbie’) has been modified and refined with a very specific type of animal mind – the beloved ‘guide dog for the blind’, whose special partnership with human beings dates back at least as far as ancient Rome.

In other settings, such robots are programmed for autonomous missions: archaeological expeditions, factory floor inspections, bomb disposal, or even radiation testing. This one is different: the human user, who is blind or partially sighted, remains central. Rather than being separated from the robot, they employ it much as they would a real guide dog, having it navigate and guide them through indoor environments that can be spatially complex and full of potential hazards and noise – museums, shopping centres, or schools, for example. RoboGuide does this more reliably than currently common methods such as cameras or GPS, using sophisticated sensors and sensor data to determine optimal routes and maintain consistency. It is, in short, artificially intelligent.
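The article does not describe the project's actual navigation algorithms, but the idea of "determining optimal routes" through an indoor space can be illustrated with a classic shortest-path search over a map graph. The floor-plan graph, node names, and the use of Dijkstra's algorithm below are purely illustrative assumptions, not the RoboGuide team's method:

```python
import heapq

def shortest_route(graph, start, goal):
    """Dijkstra's shortest path over a weighted indoor map graph.

    graph: dict mapping node -> list of (neighbour, cost) pairs.
    Returns (total_cost, route_as_list_of_nodes), or (inf, []) if unreachable.
    """
    queue = [(0, start, [start])]  # (cost so far, current node, path taken)
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbour, step in graph.get(node, []):
            if neighbour not in visited:
                heapq.heappush(queue, (cost + step, neighbour, path + [neighbour]))
    return float("inf"), []

# Toy floor plan: entrance -> hall -> gallery, with a longer detour via the cafe.
floor_plan = {
    "entrance": [("hall", 1), ("cafe", 3)],
    "hall": [("gallery", 2)],
    "cafe": [("gallery", 4)],
}
cost, route = shortest_route(floor_plan, "entrance", "gallery")
print(cost, route)  # 3 ['entrance', 'hall', 'gallery']
```

In a real system the graph would be built and updated from live sensor data, so that routes stay consistent as obstacles appear and disappear.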

Set up to be intuitive for users to learn and operate manually, RoboGuide also incorporates large language models (LLMs) – essentially, software trained on huge datasets of natural language text that allows a machine to understand and generate human language or speech. Robbie uses this capability to speak to users – to tell them about the objects and features surrounding them, and to answer simple questions. In this respect, it may offer more than a real dog can.
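As a rough sketch of how sensed surroundings and a spoken question might be combined into something an LLM can answer, consider the following. The function names, prompt wording, and the stubbed `llm` parameter are all hypothetical illustrations, not the project's actual interface:

```python
def build_prompt(detected_objects, user_question):
    """Combine live sensor detections with the user's spoken question
    into a single prompt an LLM could answer from."""
    context = ", ".join(detected_objects)
    return (
        f"You are a guide-dog assistant. Nearby objects: {context}. "
        f"Answer the user's question briefly and clearly.\n"
        f"User: {user_question}"
    )

def answer(detected_objects, user_question, llm=None):
    """Route the prompt to an LLM if one is supplied; otherwise fall
    back to a plain description of what the sensors can see."""
    prompt = build_prompt(detected_objects, user_question)
    if llm is not None:
        return llm(prompt)  # e.g. a call into a hosted or on-device model
    return f"I can see: {', '.join(detected_objects)}."

print(answer(["display case", "doorway on the left"], "What is ahead of me?"))
# I can see: display case, doorway on the left.
```

The text reply would then be passed to a text-to-speech stage so Robbie can speak it aloud.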

Two participants at a recent event at the SEC showcasing the RoboGuide project

The project team have demonstrated their work at several recent events, including the RNIB’s global technology conference at the SEC, a public engagement event at the UofG Advanced Research Centre (ARC), and an evaluation event with volunteers from RNIB at Glasgow’s Hunterian Museum, where Robbie was able to tell his ‘human’ about the cultural artefacts on display. Dr Wasim Ahmad speaks earnestly of his hope to “create a robust commercial product which can support the visually impaired wherever they might want extra help”.

So far, the team have been interviewed by Kaye Adams for her radio show (‘The Kaye Adams Programme’) on BBC Radio Scotland; by reporters from the flagship ITN and BBC national news teams; by the popular BBC technology programme ‘Click’; and by the BBC World Service ‘Tech Show’ – and have been featured in nearly every UK newspaper and magazine from the Telegraph to the Metro, as well as Scotland’s fortnightly current affairs publication ‘Holyrood’.

As Dr Popoola has been at pains to point out, these ‘robodogs’ are not in any way intended to replace our beloved guide dogs; rather, they are meant to supplement them, and potentially to help make up for the scarcity both of dogs available to those who need them and of people willing to train them. In fact, the RoboGuide project is a fantastic example of how research into robotics and AI can be extended and applied to help solve very human problems, to the benefit and betterment of individual and shared life experiences.

CSI group research team test Robbie the RoboGuide at the SEC, with the Social Pepper Robot looking on

PhD student Abdul Ghani Zahid, Dr Ola Popoola (PI), Dr Wasim Ahmad (Co-I) and students Shengning Zhang and Jingyan Wang at a demo event at the SEC, alongside ‘Robbie’ and our Social Pepper Robot


Press Coverage

The project team and Robbie the RoboGuide at the BBC Scotland studios

https://news.stv.tv/west-central/glasgow-university-experts-develop-ai-powered-robot-guide-dogs-for-visually-impaired

https://www.aol.co.uk/ai-powered-robot-guide-dogs-105158821.html?guccounter=1

https://www.digit.fyi/chatty-robot-guide-dog-in-development-at-glasgow-uni/

https://www.dailymail.co.uk/sciencetech/article-13060637/AI-powered-robot-guide-dogs-developed-visually-impaired.html

https://www.heraldscotland.com/news/24106604.chatty-robot-guide-dogs-development-glasgow-university/

https://www.theengineer.co.uk/content/news/roboguide-walks-and-talks-to-assist-blind-and-partially-sighted-people

https://www.holyrood.com/news/view,scottish-university-develops-lifechanging-aipowered-robot-for-the-visually-impaired

https://futurescot.com/glasgow-researchers-pioneer-ai-powered-robo-guide-dogs-to-help-blind-and-partially-sighted-people/
