
VARID uses dynamic, real-time image processing techniques to mimic a variety of vision loss symptoms.

By Mr George Wigmore (Senior Communications Officer)

Foster + Partners’ Applied Research + Development team has been working closely with City, University of London (City) and UCL’s PEARL Lab to develop VARID (Virtual and Augmented Reality for Inclusive Design). This is a design toolset that uses virtual and augmented reality (VR/AR) technologies to help architects, researchers and designers improve their understanding of how users with visual impairments experience the world around them.

Visual impairments such as glaucoma and age-related macular degeneration affect hundreds of millions of people worldwide, and cases are set to double in the next 30 years, as our societies age. It is imperative that architects design inclusive spaces where anyone can feel comfortable living, working, and visiting.

Demonstration of VARID (Virtual and Augmented Reality for Inclusive Design).

VARID uses dynamic, real-time image processing techniques to mimic a variety of vision loss symptoms, such as blurring, warping and peripheral vision loss. Developed as a game-engine plugin, it is compatible with a range of commercially available virtual and augmented reality headsets. VARID is data-driven and can generate personalised simulations from an individual's set of clinical test results.
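To illustrate the general idea, the sketch below applies a simple radial blur to a single image frame, approximating peripheral vision loss. This is not VARID's implementation (VARID runs in real time inside a game engine); the function name, parameters and the centre-to-edge blending approach are assumptions chosen for this example.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def simulate_peripheral_loss(frame, fov_radius=0.3, max_sigma=8.0):
    """Blend a sharp centre with a blurred copy of `frame` (an HxWx3
    float array in [0, 1]) to approximate peripheral vision loss.
    `fov_radius` is the preserved central field as a fraction of the
    shorter image half-dimension; values here are illustrative only."""
    h, w = frame.shape[:2]
    yy, xx = np.mgrid[0:h, 0:w]
    # Normalised distance of each pixel from the image centre (0 = centre).
    dist = np.hypot(yy - h / 2, xx - w / 2) / (min(h, w) / 2)
    # Blend weight: 0 inside the preserved central field, rising to 1 at the edges.
    weight = np.clip((dist - fov_radius) / (1.0 - fov_radius), 0.0, 1.0)[..., None]
    # Heavily blurred copy of the frame (no blurring across colour channels).
    blurred = gaussian_filter(frame, sigma=(max_sigma, max_sigma, 0))
    return (1 - weight) * frame + weight * blurred
```

In a data-driven setting like the one the article describes, the per-pixel weight map could plausibly be derived from an individual's visual field test results rather than a fixed radius, which is one way a personalised simulation might be parameterised.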

VARID also informs all design stages, from massing to the choice of materials and colours. Researchers at UCL and City have tested, extended and validated the toolset, improving their understanding of user behaviour and spatial experience and helping designers create more inclusive environments.

The project was funded by an Epic MegaGrant. VARID is freely available to download, use and extend on GitHub.

Dr Pete Jones, Lecturer in Optometry and Visual Science in the School of Health and Psychological Sciences at City, University of London, said:

“VARID uses dynamic, real-time image processing techniques to mimic a variety of vision loss symptoms, such as blurring, warping and peripheral vision loss. It has been developed as a game-engine plugin, and it’s compatible with a variety of commercially available VR and AR headsets.”
