Friday, 22 December 2017

SUITCEYES

No, it's not a typo; it's an acronym: Smart, User-friendly, Interactive, Tactual, Cognition-Enhancer that Yields Extended Sensosphere. Let me explain.

One of the big features of this year (and, by extension, of the next three years as well!) was a successful bid to the EU's Horizon 2020 funding scheme for a three-year project exploring the use of smart textiles to provide haptic communication and navigation aids for deafblind people. The project is worth €2.4 million and brings together a consortium of five academic partners and two industrial partners from seven countries. You can find far more detail on the partners than I could possibly fit into this post at the project's official website.

This builds upon - among other things - the work I undertook with Brian Henson and Bryan Matthews on haptic navigation aids for the visually impaired in the Department for Transport-funded WHISPER project (remember WHISPER?). Whereas WHISPER looked at barriers to navigation and the potential for haptic navigation aids, this project takes the whole approach to a much deeper level, bringing in expertise on smart textiles (the University of Borås, Sweden, who are co-ordinating the project), machine learning and object recognition (Centre for Research & Technology Hellas, Greece), psychophysics (Vrije Universiteit Amsterdam, Netherlands) and gamification (Offenburg University of Applied Sciences, Germany), alongside a major producer of adaptive technology (Harpo, Poland) and a producer of tactile books (the delightfully named Les Doigts Qui Rêvent, France). The project focuses on the broader issue of communication, beyond navigation alone, and emphasises the needs of deafblind individuals rather than just the visually impaired.

The work at Leeds focuses on two of the project's work packages: engaging with deafblind people to explore their needs and ensure the project stays focused on addressing them; and exploring the use of haptic signals to aid navigation. The latter goes well beyond the simple distance sensors and vibration motors we explored in WHISPER: we'll be bringing in inertial and GPS measurements to enrich the navigation information, and, drawing on work from the other partners, we'll be exploring more sophisticated haptic signals (and a more sophisticated interface than wristbands with vibration motors attached!), as well as object recognition from a camera feed.
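
To give a flavour of the kind of mapping involved, here's a minimal sketch of a distance-to-vibration mapping in the WHISPER spirit. To be clear, this is purely illustrative and entirely my own: the function names, the three-metre range and the linear ramp are all hypothetical assumptions, not project code, and the real system will be far more sophisticated.

    # Purely illustrative sketch: map an obstacle distance to a vibration
    # intensity, WHISPER-style. The names, the 3 m range and the linear
    # ramp are hypothetical assumptions, not project code.

    def vibration_intensity(distance_m: float, max_range_m: float = 3.0) -> float:
        """Map an obstacle distance (metres) to a motor intensity in [0, 1].

        Closer obstacles vibrate harder; anything at or beyond
        max_range_m produces no vibration at all.
        """
        if distance_m >= max_range_m:
            return 0.0
        # Linear ramp: full intensity at contact, fading to zero at range.
        return 1.0 - (distance_m / max_range_m)

    if __name__ == "__main__":
        for d in (0.2, 1.0, 2.5, 4.0):
            print(f"{d:.1f} m -> intensity {vibration_intensity(d):.2f}")

Even this toy version hints at the design questions ahead: a linear ramp is easy to implement, but perceived vibration intensity isn't linear in the drive signal, which is exactly where the consortium's psychophysics expertise comes in.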

We'll be kicking the project off with a symposium at Borås in January: From Touch to Cognition.

I can't wait to get started!
