Research Project Showcase

Here you can find research projects from researchers within the IELabs community.

DeepSea Nexus: Connecting with your oceans through Extended Reality

TEAM: Dr Laurence De Clippele (PI), Dr Iain Findlay-Walsh (Co-I) & Dr Tiffany Vlaar (Co-I)
FUNDING: University of Glasgow, Crucible Seed Funding Award, £7,398
 
This interdisciplinary project brings together researchers from marine science, machine learning (ML), and the sonic and immersive arts to develop a workflow that allows diverse audiences to experience a range of complex marine datasets simultaneously. This will help us understand when and why biodiversity changes over time, in a novel, immersive, and intuitive way, with the aim of informing decision-making around data analysis and promoting engagement with our oceans.
The workflow will be demonstrated using a long-term image and soundscape dataset from a deep-sea coral reef, integrating marine science, machine learning, and arts-based knowledge and approaches. This holistic, sensory approach to data science is expected to increase a sense of connection to, and agency over, the protection of our oceans. It is also expected to accelerate scientific insight and guide data-analytical priorities. The project aims to engage those interested in understanding and monitoring our oceans, such as researchers, coastal communities, and the wider public.
Timeline: August 2024 – July 2025

Listening to the Digital City: reappraising ambience in urban planning

TEAM: Dr Iain Findlay-Walsh (PI) & Dr Rebecca Noone (Co-I)
FUNDING: University of Glasgow, Crucible Seed Funding Award, £7,274
 
In city planning, the language of ambience refers both to the computational infrastructures of a city’s design (ambient computing) and to a set of aspirational ideals that make a city “livable” (cf. Mattern, 2021; Halegoua, 2019). Linked to digital or smart city developments, ambient computing and ambient design utilise data-driven intelligence to organise and systematise infrastructures and services such as garbage collection and traffic flows (Bwalya, 2019). In theories of ambience drawn from sound studies, ambience carries specific aesthetic and perceptual connotations of harmony, clarity, immersion, and scale (Burdon, 2023).
 
Drawing upon original field recordings, deep listening, and critical mapping of a city’s presumed ambience in Glasgow and London, UK, we stage a theoretical and experiential intervention into ambience and its political, aesthetic, and social consequences within the design of contemporary cities. While ambience may seem a benevolent measure of spatial and social good, this project attends to the other noises of ambient computing, from the city’s data centres to Wi-Fi towers to fibre-optic cables, that are sutured into urban space. Additionally, we practise deep listening and critical mapping to attend to the silences and exclusions ambience produces through its regulation of environments. Applying critical technocultural discourse analysis (Brock, 2018) to arts-based methods (Loveless, 2019), this project advances a timely critical intervention into ambience as a marker of value in contemporary spaces, examining ambience not merely as a metaphor for design but as a translation of values: connoting softness, abundance, unobtrusiveness, self-regulation, and quietude.

Array Infinitive

Team: Leslie Deere, Stuart Cupit, Chris Speed and Ross Flight
Institutions: Glasgow School of Art, Centre for Contemporary Art

Development Sponsorship with Solarflare Studio


This practice-based research examines audience experience in Virtual Reality (VR). It draws upon audio-visuals generated live and performed to a group in VR, exploring how experimental live performance in VR impacts an audience and investigating the effect of live content on viewers immersed in the virtual space. The research is supervised by Dr Marianne Greated (Head of Painting, School of Fine Art, GSA), Francis McKee (Director, Centre for Contemporary Art Glasgow), and Ronan Breslin (School of Simulation & Visualisation, GSA). The goal of this project is to evoke a heightened, altered-state experience, using VR as a vehicle to deepen immersion. The work aims to create a meditative encounter utilising sound, frequency, light, and the colour spectrum. A key element of this project is the group aspect, which generates a shared experience.

Pilot tests (Covid-dependent) to take place at the CCA, 13–18 July 2021. Exhibition version (Covid-dependent) to take place at the CCA, 22–25 July 2021.

 

Edify

Team: Macpherson (PI) & McDonnell (Co-I)
Institutions & Funders: University of Glasgow, Sublime/Edify and Innovate UK


The Edify teaching platform is an output of the Innovate UK-funded Project Mobius.

Developed with and for academics, it allows ordinary users with no 3D expertise to teach using the power of VR.

Those with access to sufficient VR headsets can use the software at scale, while those without can use a single headset to broadcast the VR experience over video-conferencing tools such as Zoom or Teams.

The University of Glasgow has partnered with Edify to validate the platform and to act as an exemplar of at-scale VR deployment in Higher Education.


Sensorily Stressed: Using Virtual Reality Technology to Examine the Relationship between Sensory Sensitivities and Anxiety

Team: Elliot Millington, Dr David Simmons and Dr Neil McDonnell
Institutions: University of Glasgow, Edify/Sublime

Funding: ESRC

This project uses Virtual Reality to test how the senses and anxiety interact with each other. We will examine how participants perform on sensory tasks while transported to different virtual environments, each designed to induce a different level of anxiety. We hope this will help us better understand the perceptual experience of autistic people in their everyday lives.

Using VR to understand the inner perceptual world of autism

Team: Sarune Savickaite, Dr David Simmons and Dr Neil McDonnell
Institutions & funders: University of Glasgow and Edify.ac


Funding: ESRC industrial partnership with Edify.ac (Sublime Ltd)

Autism, a common neurodevelopmental condition, affects at least 1% of the UK population. It is partly characterised by sensory difficulties, such as over- or under-responsiveness to certain types of lighting and everyday noises, and an almost obsessive desire for particular types of sensory stimulation, known as “sensory seeking” behaviour. To date, most research on the sensory aspects of autism has used parent/caregiver reports, combined with a smaller amount of self-report data from those able to speak for themselves and further data from lab-based experiments. Despite the fascinating insights these data provide, we have yet to fully appreciate precisely what is going on in the “inner perceptual world” of autism, although it is clear that it is qualitatively different from what typical individuals experience.

In this project we propose using Virtual Reality (VR) technology to explore this inner perceptual world. VR hardware has become much less bulky and much more affordable in recent years, and the availability of software has burgeoned. In our experiments we aim to explore perceptual worlds by asking people to illustrate their experiences using the powerful and compelling creative tools now available in VR environments, such as Tilt Brush (www.tiltbrush.com). We will combine quantitative analysis of participants’ responses to questionnaires with qualitative analysis of both their verbal descriptions (where available) and their audio-visual creations to further understand the nature of their inner perceptual worlds. Furthermore, we will use our experience in objective behavioural experimentation to embed game-like tasks into the created environments, exploring our participants’ perceptual limits more objectively.
This collaborative project will further our understanding of the inner perceptual world of autism and result in the development of a suite of versatile VR software tools together with new techniques of creative expression for those with communication difficulties.