Thursday, 5 July 2018

Month in Review: June 2018

Well, the wheels are definitely off in terms of my two posts a month target! We'll see if I can catch up and do four posts in July!

It goes without saying that June was a busy month. It wasn't quite as brutal as May (particularly since I'd finished my marking), but it's exam board season. We're much more efficient than we used to be (what was about 14 hours of meetings has been trimmed to 7 or so), but the deadlines are absolute. Things have to be ready for the external examiners come hell or high water (did I say that in my last post? Well, it's true!). All marks need to be finalised, work from students with coursework extensions marked, marks uploaded and coursework samples selected.

Of course, it was made extra busy by the British Academy Summer Showcase preparations, and prep for New Designers and the upcoming SUITCEYES consortium meeting taking place in Leeds this month. There was also our taught MSc conference, where we spend a day with every taught postgraduate presenting their dissertation. Busy and hard work, but a great way of finding out about the range of projects going on.

I have two Laidlaw Summer students, one returning to work on FATKAT (the Finger and Thumb Kinematic Assessment Tool), and another working on haptic feedback in VR and SUITCEYES. The Leeds SUITCEYES team is finally complete, with Zhenyang Ling (known as Yang), Research Fellow in Haptic Communication and Navigation starting.

There are some big changes afoot for CAP (the Cognition-Action-Planning lab I am part of), as we do our annual stocktake of where we are and where we're going. The most obvious change is our increasing focus on Immersive Cognition. It doesn't mean a lot of change in what we do, but reflects the fact that our work is increasingly oriented around Virtual, Augmented or Mixed Reality. The lab will be rebranded as ICon (for "Immersive Cognition" - ICogn seemed a bit opaque). It's exciting times with Leeds' new centre for Immersive Technologies coming online.

July promises to be another busy month! We have the aforementioned SUITCEYES consortium meeting taking place in Leeds, we are now ready to start conducting our first interviews, and with Yang getting stuck into the technology side, we're really hitting our stride. And, of course, teaching prep. I always aim to have handouts ready by the end of July, so I can print in good time and can't make last-minute changes.

Busy times - still, it keeps me out of trouble, eh?

British Academy Summer Showcase

I nearly titled this "Too busy to blog", since it's been a fiendishly heavy-duty month. That's partly due to exam boards - this is always quite a busy time of year, since marks need to be in and confirmed for the external examiners' visit come hell or high water - but this year things have been busier than usual, thanks to the upcoming SUITCEYES consortium meeting here at Leeds next month, and in particular the British Academy's first summer showcase, where Stuart Murray, Sattaporn Barnes (of Eatfish Design) and I were showing off our "Engineering the Imagination" project, and the resulting artificial hands that we developed.

It was a great time - if very busy (we spent about thirteen hours each over three days on our exhibition stand). Lots of good conversations! But let me back up a little: what is Engineering the Imagination? After all, you might have spotted a certain similarity to the title of this blog...
Engineering the Imagination is a year-long project funded by the APEX scheme, intended to bring together the sciences and the humanities. This particular project focuses on the design of artificial hands, and in particular the consideration of non-functional hands: which is hard for me, as an engineer, to get my head around. I suspect that Stuart and I have very different takes on the project. For Stuart, I think it's all about hands as metaphor, ideas of deficit and difference: what makes a hand 'disabled'? Why do we design artificial hands to be like 'normal' hands - and what makes a hand 'normal'? What do hands signify, and how does this change if the hand is artificial? Stuart would be better placed to explain his views.

For me, it's about exploring ideas about what we can do with artificial hands. Why not have a sixth finger? Lights? If we can't replicate the human hand, are there other ways an artificial hand could emote? Or function?

The designs we were showing off reflected this. There was the Empathy Hand: a powered hand that could adopt a range of poses; the three-fingered "Mudd Hand", based on the hand of our collaborator Andy Mudd (who was also there to show the original that inspired it!); and the six-digit "Lunate Hand", which had a second thumb, inspired by the work of Clifford Tabin and his comments about extra thumbs.

You can see images of all three, and the stand (for context!) below! Also, though we didn't have it ready in time for the Showcase, the Empathy Hand now has a light-up palm which, when pressed, causes the hand to light up and close in response. It was a great three days, but I'm aware that I'm already five days late with this update, so I think I'll call it a day there, and let you enjoy the pics!



The Stand as a Whole!
A three-fingered artificial hand, shown with fingers closed.
The Mudd Hand: A three-fingered hand designed to mimic that of our collaborator, Andy Mudd

A six-digit hand: it has the normal five digits, plus an additional thumb extending from the palm to oppose the middle finger.
The Lunate Hand: A six-digit hand adding an extra thumb from the palm. Named Lunate because we reckon that the thumb is attached roughly where the lunate bone is in the wrist, and it sounded swish.


An artificial hand shown in an open pose, with fingers splayed.
The Empathy Hand: An artificial hand that can open and close in response to trigger signals. It is designed to be modular so that parts can be interchanged. Adding a light-up palm for example! At the moment it just has a range of poses triggered by button presses.


An artificial hand shown in a closed pose, grasping another hand from the exhibition.
The Empathy Hand getting to grips with the competition!


The Mudd and Lunate Hands in Situ


Friday, 25 May 2018

Month in Review: May 2018

It's not the end of the month yet, but as it's half term next week, I'm off work, so this seemed like a good time to update, rather than risk drifting into June.

It has (as always) been a busy old month. In many ways, May is exam month: vivas have been the main feature. I've marked portfolios, read and examined dissertations, and examined not one but two product design exhibitions! Everything else gets rather squeezed out. Still, I've managed to fit in a presentation at the Pint of Science Festival, which was good, and we've made some significant progress on the Apex project, so I'm awash with bits of 3D printed hands at the moment! I also managed a trip to Peterborough to visit Deafblind UK for the SUITCEYES project, which was very informative. And last - but far from least - we welcomed aboard a new member of the SUITCEYES team: Adriana Atkinson, as Research Fellow in User Needs Analysis. She'll be looking after the interviews in the UK that will inform the SUITCEYES project.

In fact, after four months of largely admin, recruitment and planning, with me doing a bit of technical development on Work Package 5 (Driver Control Units - the bits I took to Amsterdam last month), things have abruptly sprung into life. This is particularly true on Work Package 2, where we suddenly have a draft protocol (thanks in large part to Sarah Woodin), an application for ethical review for the protocol (thanks in large part to Bryan Matthews) and a good chunk of literature under review (thanks to Sarah, Bryan and Adriana). I mention who's doing these things since, for the most part, I've ordered computers, booked rooms, organised meetings and run vivas - it feels almost unnerving to have so much happening without me being the one doing it! But it is also a huge relief to feel all the early work starting to pay off, and feel like we're actually getting into research and not just lots of planning and project management.

Next month is shaping up to be an even more exciting one: Jamie Mawhinney will be resuming his Laidlaw Scholarship on developing FATKAT; we have a second Laidlaw Scholar (one Erik Millar) starting who will be looking at tactile feedback and VR; we have another SUITCEYES Research Fellow starting - looking after the sensing and technical developments - and, of course, I will be down at the British Academy Summer Showcase with Stuart Murray and Eat Fish Design showing off our work on Engineering the Imagination. Also, there will be exam boards, so my teaching duties are not done yet.

Still, first, I'm off to see the Falkirk Wheel and the Kelpies at the back end of this month: I couldn't be more excited!

Talking through Touch: Pint of Science Festival

I was invited to participate in the Pint of Science festival this year - specifically at the "Harder, Better, Faster, Stronger" event on the 16th of May. As is my wont, I like to think out loud in writing a presentation, and the blog is a perfect place to do that, so here are my jottings - published retrospectively, in this case, largely because I've been so busy with examining duties that the blog has been a low, low priority!

This presentation is on "Talking through Touch", and it really relates to the work I'm doing on the Horizon 2020-funded SUITCEYES project. As always, I need to be careful because I am an Engineer - not a neuroscientist, or a psychophysicist, or even a philosopher of the senses. I know how to make things, but I can't give chapter and verse on - say - sensory illusions or the practicalities of multisensory integration or the merits of different haptic sign languages. I can parrot what I've read elsewhere and heard from others, I can give you a bit of an overview on these areas, but I'll never be anywhere near as good at them as those who specialise in them. But I can make stuff so, y'know - swings and roundabouts.

Anyway, it does imply the need for my customary "Caveat Lector" warning: you're about to read the thoughts of an engineer, and they need to be read in that context!

The Sense of Touch
Perhaps a logical place to start is with the sense of touch. And where better to start than by pointing you to people who are far more well-versed in these things than I am? A good place to start would be the recent Sadler Seminar Series "Touch: Sensing, Feeling, Knowing", convened here at the University of Leeds by Amelia De Falco, Helen Steward and Donna Lloyd. Sadly, the slides from the series aren't available - I might need to chase up on those to see if they or recordings will be made available, because they were very good talks. Particularly notable for my purposes - because they deal with crossing senses - were those from Charles Spence from the University of Oxford (noted for his work on multisensory integration - using sound and tactile stimuli to augment the sense of taste, for example) and Mark Paterson from the University of Pittsburgh, who deals with sensorimotor substitution and the problems thereof (which we will come back to later on).

A lot of my research is about prehension and grip - but hands are also used to explore the world around us (sensing hot and cold, rough and smooth, hard and soft, and so forth) and to communicate - through gestures or direct touch (India Morrison's presentation on Affective Touch at the aforementioned Sadler Seminar series was particularly illuminating in the latter regard). And of course, it is worth noting that touch is not a sense restricted to the hands, but present across the skin - albeit with different degrees of fidelity. Hence the classic "Cortical Homunculus" representations that you see:

Sensory homunculus illustrating the proportion of the somatosensory cortex linked to different parts of the body.
Cropped from image by Dr Joe Kiff taken from Wikipedia under creative commons licence CC BY-SA 3.0
This is the limit of my knowledge on the neurology of the somatic senses, so I'm going to leave it there. The key point for my talk is that we're interested in touch as a mode of communication, rather than, for example, as a way of exploring properties of the world around us. Of course, there is a link here: in order to communicate through touch, we need to be able to perceive the signals that are being sent! So let's have a think about what those signals might be.

Communicating Through Touch
Tactile communication takes many forms. The one we're probably most familiar with is the eccentric rotating mass motor that provides vibrotactile feedback on our phones - the buzzing when you set it to "vibrate". But there are lots of other examples. Braille is well known, and likewise you can often get tactile images (see this link for a nice paper on this from LDQR, who are partners in the SUITCEYES project), such that information can be presented in a tactile form. Tactile sign languages exist, and these take a variety of forms, from fingerspelling alphabets (often signed onto the hand) to more complex social haptic signals or tactile sign languages such as Protactile. This does highlight an interesting distinction - between signals (one-off, discrete messages) and language (assembling signals into complex messages - at least, to my engineering mind, language assembles signals: linguistics may take a different view!). You can see the fundamental difference between a simple buzz and - as an example - Protactile. Haptic sign languages have shape and movement, and involve proprioception. They aren't just morse code that can be converted easily into vibrations.

Luckily, haptic feedback isn't restricted to vibrotactile feedback through eccentric rotating mass motors. One of the examples that I find really interesting is the Haptic Taco, which changes its shape as you get nearer or further from a target point. And there are lots of examples of different modalities of haptic feedback - electrostatic, thermal, pressure, shape changing, etc, etc, etc - you can check out conferences such as Eurohaptics for the cutting edge in haptic feedback.

Sensory Substitution vs Haptic Signals vs Haptic Language
This brings us neatly onto the issue of what it is that we want to signal. After all, in the case of SUITCEYES, the aim is to "extend the sensosphere" by detecting information from the environment, and then presenting this to the user in a tactile form. As far as I can see, this can take two forms: direct sensory substitution (transferring information from one modality to another - measuring distance with a distance sensor and then giving a related signal, as we did back in the WHISPER project) or signalling - that is, interpreting the sensor data and sending a corresponding signal to the user.

A simple example, based on comments from the WHISPER project, might help to illustrate this. One piece of feedback we received was that the device we developed would be helpful for identifying doors, since it could be used to locate a gap in a wall. This suggests two different approaches.

The first is the sensory substitution approach: you measure the distance to the wall, and feed this back to the user through vibrations that tell them the distance to the item the distance sensor is pointing at. Close items, for example, might give a more intense vibration. The system doesn't know what these items are - just how far the signal can travel before being returned. In this scenario, the user sweeps the distance sensor along the wall until they find a sudden decrease in vibrations that tells them they have found a hole. It would then be up to them to infer whether the hole was a door - and, of course, this wouldn't work terribly well if the door was closed.
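To make that concrete, here's a minimal sketch of the substitution mapping - not the actual WHISPER code, just an illustration, and the 4 m sensor range and 0-255 intensity scale are assumptions on my part:

```python
def distance_to_intensity(distance_m, max_range_m=4.0, max_duty=255):
    """Map a distance reading onto a vibration intensity (sensory substitution).

    Closer objects give stronger vibration; anything beyond the sensor's
    useful range gives none. The 4 m range and 0-255 scale are illustrative
    assumptions, not values from any real device.
    """
    if distance_m >= max_range_m:
        return 0                          # nothing in range: motor off
    closeness = 1.0 - (distance_m / max_range_m)
    return int(round(closeness * max_duty))

# Sweeping along a wall: the sudden drop in intensity marks the gap.
for d in [0.8, 0.9, 0.85, 3.9, 0.9]:
    print(d, "m ->", distance_to_intensity(d))
```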

A second approach would be to use, for example, computer vision to interpret a camera feed and recognise doorways. Now, instead of sending a signal that is related to distance, the system would need to provide some sort of signal that indicated "Door". This might be in the form of an alert (if the system is just a door detector, it need only buzz when it sees a door!), or of a more nuanced signal (it might spell out D-O-O-R in fingerspelling, morse code or braille, or it might use a bespoke haptic signal using an array of vibrotactile motors).
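As a toy sketch of that signalling approach (not the SUITCEYES encoding, which is very much still an open question), here's how a recognised label might be turned into a vibration on/off timeline, using Morse purely because its timing rules are well defined:

```python
MORSE = {"D": "-..", "O": "---", "R": ".-."}   # just the letters needed here

def label_to_pulses(label, unit_s=0.1):
    """Convert a recognised label (e.g. "DOOR") into (state, seconds) pairs
    for a single vibration motor, using standard Morse timing:
    dot = 1 unit, dash = 3, gap within a letter = 1, between letters = 3."""
    pulses = []
    for i, letter in enumerate(label):
        if i > 0:
            pulses.append(("off", 3 * unit_s))        # inter-letter gap
        for j, symbol in enumerate(MORSE[letter]):
            if j > 0:
                pulses.append(("off", unit_s))        # intra-letter gap
            pulses.append(("on", (1 if symbol == "." else 3) * unit_s))
    return pulses

print(label_to_pulses("DOOR"))
```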

There is a third approach, which would be that of a haptic language - that is, combining multiple signals into a coherent message. "The door is on the left", for example, or "The door is on the left, but it's closed", or "The door is 3m to the left".

There is one further issue to consider (kindly highlighted to me by Graham Nolan from Deafblind UK), which is that of nuance: when we speak, we don't just state a series of words. We modify them with tone, gesticulation and body language, something that often gets lost in written text alone - see Poe's Law, or any of the many misunderstandings on the internet and in email arising from a failure to recognise sarcasm or a joke - it is, after all, one of the reasons that emojis have caught on. I imagine. The same problem applies in haptic communication: less so with our door example, which is largely functional, but let's take a different example.

If you signal distance, then you would know when something was in front of you. You might, using (let's say) our hypothetical computer vision system, give that thing a label. Is it a wall, a door, a person? Or your best friend? And what if it is your best friend giving a greeting, or your best friend waving a warning? Do they look happy or worried? Can we have empathetic communication and build relationships if our communication is purely functional?

I'm not the right person to answer that, but from a technical perspective, it does highlight the challenge. Do we need a stimulus that directly conveys a property (such as distance)? Or a signal that can be used to label something?

So, there are a few things we can look at here: modulation of one property to represent another, a set of signals to label different things, combining multiple signals to create messages, and finally the challenge of modulating those signals, or messages, to capture nuance. But what properties do we have to play with?

Things to Consider
There are several modalities of tactile stimuli:

Contact - a tap or press, bringing something into contact with the skin.
Vibration - the classic vibration setting on mobile phones.
Temperature - not one that is well used, as far as I'm aware, since it's tricky to get things to heat up and cool down quickly.

Another interesting example is raised by the Haptic Taco: a device that changes shape/size to indicate proximity to a target. So, we can add shape and size to our list. There are others, too (electrostatic displays being the most obvious).

Then, we can modulate each of these in three ways - duration, location and intensity - and play around with ordering.

So, we have a toolkit of modalities and modulations that we can apply to create signals or more complex communication. Of course, we then have questions of discrimination - the ability to differentiate these elements - in time, location and space.
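One way of picturing that toolkit is as a simple data structure: each element of a message picks a modality, a body location, an intensity and a duration, and a message is an ordered sequence of such elements. A minimal sketch, with made-up element names rather than anything specified in SUITCEYES:

```python
from dataclasses import dataclass

@dataclass
class HapticElement:
    modality: str      # "vibration", "contact", "temperature", "shape", ...
    location: str      # where on the body the actuator sits
    intensity: float   # 0.0 (off) to 1.0 (maximum), however that is realised
    duration_s: float  # how long the element lasts

# A made-up message: "door ahead, to the left" as an ordered sequence.
door_on_left = [
    HapticElement("vibration", "upper back",    0.8, 0.3),  # attention cue
    HapticElement("vibration", "left shoulder", 0.5, 0.6),  # direction
    HapticElement("contact",   "left shoulder", 1.0, 0.2),  # "door" label
]

for element in door_on_left:
    print(element)
```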

There is, finally, the question of efficiency: how quickly and reliably a message can be interpreted. After all, morse code can be delivered readily through vibrotactile feedback, but compared to direct speech, it is relatively slow.
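To put a rough number on that: the standard Morse reference word "PARIS" occupies 50 time units, so throughput depends entirely on how short a unit the skin can reliably resolve. The 100 ms unit below is purely an assumption for illustration; conversational speech, by comparison, runs at something like 150 words per minute:

```python
def morse_words_per_minute(unit_s):
    """Rough Morse throughput: the standard reference word "PARIS"
    occupies 50 time units, including the trailing word gap."""
    seconds_per_word = 50 * unit_s
    return 60.0 / seconds_per_word

# If the skin needs a ~100 ms unit to tell dots from dashes reliably,
# that's about 12 words per minute - an order of magnitude below speech.
print(round(morse_words_per_minute(0.1), 1), "wpm")
```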

And... that's pretty much where my presentation runs out. No clear conclusions... no findings, because we're still very much at the beginning of the project. This is more about laying out my thoughts on haptic communication. Let's hope that doesn't bother the audience too much.

Monday, 30 April 2018

Month in Review: April 2018

I'm going to have to admit defeat on producing a non-Month-in-Review post this month: life and work have been too busy for blogging! Of course, paradoxically, that gives me a lot to talk about, but such is life.

Of course, this month has seen the Easter holidays, so I've had Bank Holidays, and a substantial chunk of annual leave taken up looking after children (including a trip to Manchester to visit the Robots exhibition at the Museum of Science and Industry, and a family holiday), which means that I've only actually been working for half the month.

Still, it's been an exciting half month, particularly on the SUITCEYES front. We've been interviewing for User Needs Research Fellows, and I've also had a trip to visit Astrid Kappers' lab in Amsterdam to test out the first prototypes with Nils-Krister Persson and Adriana Stöhr from Borås. And Astrid. It was a fascinating and very informative time. Lots to report on - in due course!

There was also a trip to Dundee with Stuart Murray to meet with Graham Pullin (the author of the excellent Design Meets Disability, and leader of the also-excellent Hands of X project) to talk Hands, which was also great. On top of that, we are at the end of teaching, have had a very productive workshop on Immersive Technologies (today!), and we've just managed to get revisions in on one of our grip modelling papers.

Lots of exciting stuff to report - but no time in which to report them, sadly! Such is the way!

I will be doing the Pint of Science festival in May on the subject of Talking Through Touch, so hopefully I'll get a chance to put up a post on my talk before I give it... here's hoping!

Sunday, 1 April 2018

Month in Review: March 2018

Ooops. I managed to miss my "two posts a month" target in March, albeit only by a day. Anyway, it's been a busy month with not a huge amount of specifics to report. As noted in my last post, I went down to Westminster for the All-Party Parliamentary Group on Assistive Technology: other than that, there's been the usual end-of-term rush, planning for the Summer Showcase, advertising for Research Fellows, sorting out employment paperwork, and one PhD student (Awais Hafeez) has passed his transfer viva, while another has passed his final viva (Haikal Sitepu) - congratulations to both! The main thing that has kept me busy, though, is implementing the controller for SUITCEYES, which I'm pleased to say is coming along nicely and will be tested in Amsterdam next month. Sorry, *this* month, since it's now April. Of course, half of this month is School holidays, so I've a fair bit of annual leave, plus the two days in Amsterdam, plus a day's round-trip to Dundee. I'll keep you posted!

Friday, 30 March 2018

APPGAT: Assistive Technology and the Industrial Strategy

This week, I attended the All-Party Parliamentary Group on Assistive Technology's symposium on Assistive Technology and the Industrial Strategy. This was a new experience for me: policy and parliament are both rather outside my sphere of experience, but ever since Claire Brockett organised a Parliamentary outreach session on Science and Academia in UK Policy, I've been thinking about how I might engage more with Westminster, and this seemed like a good opportunity to get involved and keep my finger on the pulse. 

I went with two hats on (not literally) - representing both the Centre for Disability Studies, and the Institute of Design, Robotics and Optimisation - though I was there for both in very much a listening capacity. Just attending was an interesting experience - the format was very different from anything one experiences in academia. Each presenter got five minutes, the keynote got ten, and the timekeeping was absolutely dead on. The floor was opened to questions and comments, the questions were (more or less) answered by the panel, and that was the end of the session. By academic standards - where presentations are usually fifteen to twenty minutes and frequently overrun - this was lightning fast. Of course, the aim wasn't to describe a detailed piece of research, but to give high level comments, and make way for discussion.

The session was chaired by Lord Chris Holmes (Conservative Peer and noted Paralympian swimmer), and had contributions from Hazel Harper of Innovate UK, Bill Esterson MP (Shadow Minister for International Trade, and Shadow Minister for Small Business), Prof Nigel Harris (Director of the outstanding Designability), David Frank (Microsoft's UK Public Affairs Manager),
Dr. Catherine Holloway (Academic Director of the Global Disability Innovation Hub), Alex Burghart MP (Member of the Work and Pensions Select Committee), with the keynote coming from Sarah Newton MP, Minister for Disabled People. This was followed by questions from the floor - I won't go through a blow-by-blow account of what was said: rather, let me pull out the key themes.

Of course, there were two themes that were in some tension here - as always in assistive technology - the needs of disabled people to remove barriers and find solutions that enable them to do what they wish to do; and the needs of the designers and manufacturers of assistive technology to keep making new devices and thereby keep making money. This tension exists within academia as well, of course - the REF requires me to produce new and cutting-edge engineering (AI! Exoskeletons! Self-Driving Cars!), which isn't necessarily the same research that will most benefit disabled people. Which isn't to say that the two are mutually exclusive, of course, but it is a source of tension.

This tension exists in the Industrial Strategy itself: this strategy is all about "building a more productive economy". So, in terms of AT, does that mean improving the productivity of the AT sector? Or does it mean AT to improve the productivity of disabled people? This was never really addressed - there was a lot of reference to helping disabled people "fulfill their potential", which basically seemed to mean working. But there were also references to the size of the AT sector in the UK economy, how well we perform there, selling to the rest of the world. The two need not be mutually exclusive - indeed, they can be mutually reinforcing, as highlighted by Nigel Harris' discussion of Designability's co-design approach.

Inevitably, the poster children of cutting-edge technology (AI! Exoskeletons! Self-Driving Cars!) cropped up. Which I'm not against by any means - with my iDRO hat on, these are exciting new technologies that are going to help us to do all sorts and have huge potential for enabling things - with my CDS hat on, though, I'm more sceptical. And this is where the underlying tension rears its head again. If we want the UK to be world leaders in tech, we need to be doing R&D where the tech is "sexy" and the world at large will want to invest. But that's not necessarily the same areas that will most benefit the lives of disabled people.

This ties in with the wider issue that the size of the market for any given piece of AT is relatively small. Nigel Harris highlighted the need for products that have wider appeal, so that they can be sold to the mainstream as well as the specialist sectors. This also raises the larger question of accessibility - that is, whether we need to develop specialist AT, and to what extent we need to ensure that technology is accessible so that everyone can enjoy the benefits - the selective enabling issue that I was musing on a year and a bit ago.

Particularly noticeable was the lack of any representation of Disabled People's Organisations on the panel (noted by Catherine Holloway) - we had academics and industrialists, but nothing about the end user. Which communicates to me that the focus of the symposium was on the AT sector as a business, rather than on the needs of the recipients of AT. Perhaps that's unfair: or perhaps it's just an indication that one way of resolving the tension between them is to treat the two aspects separately, and this symposium was really about the manufacturers. After all, it's unreasonable to judge the activities of APPGAT on the basis of a single symposium. Nevertheless, the symposium promised to look at "how the AT sector can further contribute to our economy and society". There was a lot of the former, but rather less of the latter, other than the need for AT to help people "fulfill their potential".

How this will work out in the Industrial Strategy remains to be seen. Maybe there is a need to address "AT as a business" and "AT as a service" separately? Going after the cutting edge is always chasing rainbows - things that were exciting and novel and exploratory become useful and hence commonplace and mundane, so the research and the attention moves on. It's good to have that cutting edge - but attention needs to be paid to the other part as well: how we get from that cutting edge to useful products and devices that actually benefit people's lives once the immediate research attention moves on. That's something that I'd like to see addressed, though I've no idea how you'd do it. 

Still, in reflecting on all this, a particular question keeps popping up in my mind: how do we enable disabled people to get involved in the AT industry? Not just as users and testers, but as designers, makers, direction-setters? How do we enable people to make their own AT, and customise their own devices, rather than just selling them specialist kit?

Anyway, those are my thoughts - you can follow APPGAT on Twitter at @AT_APPG and follow the discussion of the symposium on #ATIndustrialStrategy.