Researchers from Scottish universities have developed an innovative way to breathe new life into outdated robot pets and toys using augmented reality technology.
 
They have prototyped a new software system which can overlay a wide range of new virtual behaviours on commercially available robot pets and toys designed to look like animals and mimic their actions.
 
The system, called Augmenting Zoomorphic Robotics with Affect (AZRA), aims to address the shortcomings of the current generation of these ‘zoomorphic’ robots, which often have very limited options for interactivity.

https://youtu.be/clZsK4bW4uw
 
In the future, AZRA-based systems could enable older robot pets, and even previously non-interactive toys like plush dolls, to provide experiences which are much closer to those provided by real animal companions.
 
The richer interactions AZRA enables could offer a more pet-like experience to people who are unable to keep real animals for reasons of health, cost or restrictions on rental properties.
 
When users wear an augmented reality device such as Meta’s Quest headset around their robot pets and toys, AZRA projects an overlay of virtual facial expressions, light, sound and thought bubbles onto the toy’s surfaces and surroundings.
 
AZRA is underpinned by a sophisticated simulation of emotions based on studies of real animal behaviour. It can make robots seem more convincingly ‘alive’ by imbuing them with moods which fluctuate unpredictably and can be affected by the touch or voice of their owner.
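The release doesn't reproduce the underlying affect model, but the core idea, a mood that drifts on its own and is nudged by the owner's touch or voice, can be sketched in a few lines of Python. Everything below (names, scales, constants) is an illustrative assumption rather than AZRA's actual code:

```python
import random

class Mood:
    """Illustrative sketch: a single mood value in [-1, 1] that
    drifts randomly over time and is nudged by owner input."""

    def __init__(self, drift=0.02):
        self.value = 0.0   # -1 = grumpy, +1 = content (assumed scale)
        self.drift = drift

    def tick(self):
        # Unprompted fluctuation: a small random walk each time step.
        self.value += random.uniform(-self.drift, self.drift)
        self.value = max(-1.0, min(1.0, self.value))

    def on_stimulus(self, kind):
        # Owner input shifts the mood; the weights are invented here.
        nudges = {"gentle_stroke": +0.15, "soothing_voice": +0.10,
                  "rough_stroke": -0.20}
        self.value += nudges.get(kind, 0.0)
        self.value = max(-1.0, min(1.0, self.value))

mood = Mood()
for _ in range(100):       # mood wanders even with no interaction
    mood.tick()
mood.on_stimulus("gentle_stroke")
print(f"mood after stroking: {mood.value:+.2f}")
```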
 
Eye contact detection and spatial awareness features mean the system knows when it is being looked at, and touch detection enables it to respond to strokes – even protesting when it is stroked against its preferred direction. It can request attention when ignored, or relax peacefully when it senses its owner is busy with other activities.
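The release doesn't describe how those reactions are chosen; one plausible shape for the dispatch logic, with invented event names and thresholds, might be:

```python
def choose_reaction(stroke_direction=None, preferred="head_to_tail",
                    owner_looking=False, seconds_ignored=0.0,
                    owner_busy=False):
    """Hypothetical priority dispatch for the behaviours described
    above; names and thresholds are assumptions, not AZRA's logic."""
    if stroke_direction and stroke_direction != preferred:
        return "protest"              # stroked against its preferred direction
    if stroke_direction:
        return "purr"                 # enjoys a stroke the right way
    if owner_busy:
        return "relax"                # settle down while the owner works
    if seconds_ignored > 300:
        return "request_attention"    # e.g. project a thought bubble
    if owner_looking:
        return "return_gaze"          # eye contact detected
    return "idle"

print(choose_reaction(stroke_direction="tail_to_head"))    # protest
print(choose_reaction(seconds_ignored=600))                # request_attention
```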
 
The system can also adjust the enhanced pet’s behaviour to better suit its owner’s personality and preferences. If users are high-energy and playful, the robot slowly adapts to become more excitable. In quieter households, it becomes more relaxed and contemplative.
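The release doesn't say how that adaptation is computed. A common way to implement this kind of slow drift is an exponential moving average over observed owner activity; the sketch below, including the learning rate and the notion of an "owner energy" signal, is an assumption for illustration:

```python
class Temperament:
    """Sketch: the pet's baseline excitability slowly tracks the
    owner's observed energy level. alpha controls how slowly."""

    def __init__(self, alpha=0.01):
        self.excitability = 0.5   # 0 = contemplative, 1 = excitable
        self.alpha = alpha

    def observe_owner(self, owner_energy):
        # owner_energy in [0, 1], e.g. estimated from voice volume,
        # movement, or frequency of play (all assumed signals).
        self.excitability += self.alpha * (owner_energy - self.excitability)

pet = Temperament()
for _ in range(500):          # a playful, high-energy household
    pet.observe_owner(0.9)
print(f"excitability: {pet.excitability:.2f}")   # drifts towards 0.9
```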
 
The team say their research could also help cut down on electronic waste by reducing the likelihood of robot pets and toys being disposed of after their owners become tired of them.

A diagram showing the different ways the AZRA system can display interactive elements
 
The development of AZRA will be presented as a paper at the 34th IEEE International Conference on Robot and Human Interactive Communication in the Netherlands on 26th August.
 
Dr Shaun Macdonald, of the University of Glasgow’s School of Computing Science, is the paper’s lead author and led the development of AZRA. He was initially inspired to develop the system after receiving a less-than-inspiring gift.
 
He said: “I was given a little robot pet that had a very basic set of movements and potential interactions. It was fun for a few days, but I quickly ended up losing interest because I had seen everything it had to offer.
 
“I was a bit disappointed to realise that, despite all the major developments in technology over the last 25 years, zoomorphic robots haven’t developed much at all since I was a child. It’s all but impossible to build a relationship with a robot pet in the way you might with a real animal, because they have so few behaviours and they become over-familiar very quickly.
 
“As a researcher in human-computer interaction, I started to wonder whether I could build a system which could overlay much more complex behaviours and interactions on the toy using augmented reality. Being able to imbue older robots and pets with new life could also help reduce the carbon footprint of unwanted devices by keeping them from landfill for longer.”
 
Dr Macdonald used a simple off-the-shelf zoomorphic pet, the Petit Qoobo, as the basic real-world platform on which to overlay the augmented reality elements during the development of the system. 
 
Guided by previous research into the emotional needs of dogs, Dr Macdonald developed Zoomorphic Robot Affect and Agency Mind Architecture, or ZAMA. ZAMA provides the AZRA system with a kind of artificial emotional intelligence, giving it a series of simulated emotional states which can change in response to its environment.
 
Rather than relying on simple stimulus-response patterns, the system gives the augmented reality pet an ongoing temperament built from combinations of nine personality traits such as 'gloomy', 'relaxed' and 'irritable'. It has daily moods that fluctuate naturally, and a long-term personality which develops over time through interactions with its owner.
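Only three of the nine traits are named in the article, so the sketch below shows the shape of such a temperament model rather than ZAMA's actual trait list; the daily-mood resampling rule is likewise an assumption:

```python
import random
from dataclasses import dataclass, field

@dataclass
class Personality:
    """Sketch of a ZAMA-like temperament: slow-changing long-term
    traits plus a daily mood that fluctuates around them."""
    traits: dict = field(default_factory=lambda: {
        "gloomy": 0.2, "relaxed": 0.6, "irritable": 0.1,
        # ...six further traits in the paper, not named in the article
    })
    daily_mood: float = 0.0

    def new_day(self):
        # Daily moods "fluctuate naturally": resample around the
        # long-term disposition (this resampling rule is invented).
        base = self.traits["relaxed"] - self.traits["gloomy"]
        self.daily_mood = max(-1.0, min(1.0, random.gauss(base, 0.3)))

    def nudge_trait(self, name, delta):
        # Long-term personality develops slowly through interaction.
        self.traits[name] = max(0.0, min(1.0, self.traits[name] + delta))

pet = Personality()
pet.new_day()
print(f"today's mood: {pet.daily_mood:+.2f}")
pet.nudge_trait("irritable", -0.01)   # a calm interaction soothes it
```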
 
It simulates desires for touch, rest, food, and socialisation which are subtly randomised each day. When its needs aren't met, the AR robot will actively seek interaction, displaying emojis and thought bubbles to communicate what it wants.
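A needs model like the one described is often implemented as a set of levels that deplete over time and are re-randomised each day; this sketch uses invented names, decay rates and thresholds:

```python
import random

class Needs:
    """Sketch: desire levels that are re-randomised each day and
    deplete until satisfied; when any runs low, the pet asks for it."""

    KINDS = ("touch", "rest", "food", "socialisation")

    def __init__(self):
        self.levels = {}
        self.new_day()

    def new_day(self):
        # "Subtly randomised each day": start each need near full,
        # with a little per-day variation (the spread is assumed).
        self.levels = {k: random.uniform(0.8, 1.0) for k in self.KINDS}

    def tick(self, decay=0.01):
        for k in self.levels:
            self.levels[k] = max(0.0, self.levels[k] - decay)

    def satisfy(self, kind, amount=0.5):
        self.levels[kind] = min(1.0, self.levels[kind] + amount)

    def requests(self, threshold=0.3):
        # Unmet needs become requests, e.g. shown as thought bubbles.
        return [k for k, v in self.levels.items() if v < threshold]

needs = Needs()
for _ in range(60):
    needs.tick()
print("asking for:", needs.requests())   # whichever needs have run low
```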
 
The researchers are already exploring the future potential of the technology, including participatory studies in which volunteers can interact with the robot and then adjust its emotional parameters in real time to investigate what feels natural versus artificial in robot behaviour.
 
Dr Macdonald added: "AZRA turns a robot from being a device that I almost entirely choose to interact with into a device which can engage me in interaction itself. It feels more like me and another entity attempting to interact and communicate, rather than me having to make-believe almost all of that interaction myself.
 
"One of the main advantages of this system is that we don't have a fixed 'this is how this should work' approach. What we have is a really great development test bed where we can try different ideas quickly and see what works. As AR glasses become more mainstream, this could become a way to breathe new life into existing robots without having to replace them entirely."
 
Dr Salma ElSayed of Abertay University and Dr Mark McGill of the University of Glasgow are co-authors of the paper. The team’s paper, titled ‘AZRA: Extending the Affective Capabilities of Zoomorphic Robots using Augmented Reality’, will be presented at the IEEE RO-MAN 2025 conference at the Eindhoven University of Technology in the Netherlands on Tuesday 26th August.  


First published: 20 August 2025