Thursday 28 September 2017

Month in Review: September 2017

September is always slightly white-knuckle, as the start of term hurtles into view. Lectures need to be ready, web pages up to date, handouts printed, exams prepared - or you're in for a really hard time once teaching starts.

This year, the problem has been compounded by a big research grant being approved in principle and needing detailed negotiation to ensure it moves to approval in practice; a small research grant in the same sort of state; the launch of the N8 Robotics and Autonomous Systems Student Network down in Sheffield; and chairing a session for LUDI at the AAATE conference (also in Sheffield). All great things (and a nice break from teaching prep!), but all needing to be slotted into a busy time.

Still, it's done: lectures, handouts, exams, ready; modules launched; new and returning tutees welcomed; teaching underway. Which isn't to say that it's an easy ride from here, but the start of term is always a nice point - when you can draw breath, mop your brow and get down to actually teaching instead of just thinking about it. And the benefit of prepping over the summer is that research marches on, rather than coming to an abrupt halt.

Anyway, I promised highlights of the summer, so here they are:

1) Getting MagOne working. Or rather, undergraduate summer interns Jamie Mawhinney and Kieran Burley getting it working for grip and posture applications respectively. Application-specific calibration and housings still need to be developed, but we've achieved proof of concept for both grip and posture, and the hardware and software for running off an Arduino Nano are in place. Low-cost three-axis force sensing, here we come!
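Since the calibration is still to be developed, here's roughly the sort of thing I have in mind - a minimal sketch of a linear three-axis calibration in Python, using synthetic data in place of real magnetometer readings and reference forces. The numbers, the linear model and the names are all illustrative, not MagOne's actual pipeline:

    import numpy as np

    rng = np.random.default_rng(42)

    # Synthetic stand-in for a calibration run: in reality, F would come
    # from a reference load cell and B from the sensor's magnetometer.
    A_true = np.array([[2.0, 0.1, 0.0],
                       [0.0, 1.8, 0.2],
                       [0.1, 0.0, 2.5]])             # made-up N/uT mapping
    B = rng.uniform(-50, 50, size=(200, 3))          # magnetometer samples (uT)
    F = B @ A_true.T + rng.normal(0, 0.05, (200, 3)) # "measured" forces (N)

    # Least-squares fit of the calibration matrix, so that F ~ B @ calib.
    calib, *_ = np.linalg.lstsq(B, F, rcond=None)

    # Convert a fresh magnetometer reading into a three-axis force estimate.
    b_new = np.array([10.0, -25.0, 5.0])
    print("Estimated force (N):", b_new @ calib)

In practice, I'd expect the grip and posture housings to each need their own (quite possibly nonlinear) fits, but a linear map is the obvious first thing to try.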

2) The fully housed PSATs getting up and running... and getting prepared for prehension studies. Low-cost marker tracking, here we come!
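For the curious, this is the sort of thing I mean by low-cost marker tracking - though I should stress this is a generic colour-blob sketch using OpenCV and a webcam, not our PSAT pipeline, and the colour thresholds are placeholders you'd have to tune:

    import cv2
    import numpy as np

    # Track the largest green blob per frame and report its centroid.
    # HSV thresholds are placeholders: tune to your markers and lighting.
    LOWER = np.array([40, 80, 80])
    UPPER = np.array([80, 255, 255])

    cap = cv2.VideoCapture(0)  # default webcam
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, LOWER, UPPER)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4
        if contours:
            c = max(contours, key=cv2.contourArea)
            (x, y), r = cv2.minEnclosingCircle(c)
            print(f"marker at ({x:.0f}, {y:.0f}), radius {r:.0f}px")
        cv2.imshow("mask", mask)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()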

I think that'll do for now. In the meantime... back to teaching!

Wednesday 20 September 2017

Present in Absence: Projecting the Self

Last week saw me in Sheffield for the launch of the N8 Robotics and Autonomous Systems Student Network on behalf of Robotics@Leeds (it was great, thanks) and the AAATE conference on behalf of LUDI (also great, thanks!). There are probably blog posts in both, but I mixed in some of my Augmenting the Body/Self duties by catching up with Sheffield Robotics' Tony Prescott and Michael Szollosy, by picking up a MiRo for Stuart Murray to show off on behalf of the project at the Northern Network for Medical Humanities Research Congress, and by using my extended commute to spend some time going through our thoughts from the summer workshop. I'll keep those under my hat, since they'll be going into a grant, but between all these things and presentations at AAATE about NAO and ZORA, and with articles appearing about VR at Leeds, I've been thinking a lot about the limits of the body and the self.

So, I thought I would work through them here, at least to get them straight. Bear in mind this is me thinking off the top of my head, without a proper literature review or any kind of philosophical expertise on self or consciousness. It's a hot take from an engineer responding to the ideas swirling around me: feel free to correct me.

The limits of the body and Deleuzian revisitings of assemblages and rhizomes have been covered elsewhere on this blog. The skin, as I've noted before, is a pretty handy boundary for delimiting self and other, and the one that we probably instinctively default to. Questions of whether my clothes or equipment should be regarded as part of "me" might seem trivial. The answer is obviously no, since I can take these on and off in a way that I couldn't with any other part of me: even hair or nails, while easily shorn, are not rapidly replaced except by extensions or false nails. Yet, if we take ease of removal and replacement as indicative of the boundary of self, then (as Margrit Shildrick pointed out at our Augmenting the Body finale) what about the human microbiome? The countless bacteria that we lug around inside us? They aren't easily removed or replaced, though they can be - I can swallow antibiotics and probiotics, I guess. Yet, if I have an abscess, it's not easily removed, so is that part of "me"?

And by extension, what about a tattoo, or a pacemaker, artificial hip, or insulin pump? I have none of these, so the question is perhaps facetious. But I do have a dental implant: an artificial tooth screwed in to replace the one knocked out on a school playground in the 80s. Is that "me"? Or does it have to be plugged in to my nervous system to be "me"? In which case, my surviving teeth would count, but not the false tooth - and what about hair and nails? Do they count? They don't have nerves (do they? I'm getting outside my field here), even if they transmit mechanical signals back to the skin they're attached to. My tooth isn't like my clothing - I can't take it out and put it back in any more easily than my other teeth: it's screwed into my jaw. I can't swap it out for a "party tooth", and I don't need to take it out at night. I treat it exactly as I do my real teeth, and 99.99% of the time I'm not even conscious of it, despite it leaving a gap in sensation when I drink something hot or cold. Which feels really weird when I do stop to think about it, but after nearly three decades, that very rarely happens.

Of course, the answer is probably "does it matter?" and/or "it depends". The question of whether that tooth is or isn't "me" has never arisen, and whether our definition of "self" should extend to walking aids, hearing aids, or glasses will almost certainly depend on context. And for my purposes, I'm sure the extent of body and self matters in engineering for at least one reason: telepresence.

Telepresence crops up in three contexts that I can think of: teleoperation (for example, operating a robotic manipulator to clean up a nuclear reactor, or robotic surgery such as Da Vinci) - the ability to project skilled movement elsewhere; telepresence robots (to literally be present in absence - sending a teleoperated robot to attend a meeting on your behalf); and virtual reality (the sense of being somewhere remote - often somewhere not physically real).

So this got me thinking about different levels of proximity to the self that technology can exist at. Here's what I thought:

1) Integral: anything physically under the skin or attached to the skeleton, where some form of surgery is required to remove it. Pacemakers, cochlear implants, orthoses, dental implants, insulin pumps. I originally mooted "internal", but felt I needed something to differentiate this from devices - camera capsules, for example - that are swallowed but only remain inside temporarily.

2) Contact: anything attached to the exterior of the body: clothes, an Apple Watch, a Fitbit. Also prosthetics. I wonder if puppets fall under this category? Glove puppets, at least.

3) Reachable: anything unattached that I can interact with only if I can get it into contact with me. A mobile phone, maybe - though voice control such as Siri would affect that. Remote control devices likewise - though I would argue that my TV remote requires me to physically touch it: it extends the device, not "me".

4) Proximity: Siri and Alexa are interesting. Do they extend the device? Or me? I mean, if I speak to another person, I don't regard them as me - but I project my voice to reach them. I extend an auditory and visual (and olfactory!) presence around me. So a camera for gesture control, or a microphone for voice recognition, can be activated beyond arm's reach.

5) Remote: at this point, we're talking about things that are no longer in the immediate physical environment: out of sight and hearing. Perhaps another room, perhaps another city, perhaps halfway round the world - perhaps in a virtual environment. This is your fundamental telepresence or virtual reality.

In a handy image-based form: [figure: the five levels of proximity, from integral to remote]
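And for the programmers, the same five levels as a toy Python sketch - the device assignments here are mine, and very much debatable:

    from enum import IntEnum

    # Toy encoding of the five proximity levels. IntEnum so that levels
    # can be compared: lower numbers mean closer to the self.
    class Proximity(IntEnum):
        INTEGRAL = 1   # under the skin / attached to the skeleton
        CONTACT = 2    # attached to the body's exterior
        REACHABLE = 3  # must be brought into contact to use
        PROXIMITY = 4  # works within sight or earshot (voice, gesture)
        REMOTE = 5     # out of the immediate physical environment

    # My (debatable!) assignments from the list above.
    devices = {
        "pacemaker": Proximity.INTEGRAL,
        "dental implant": Proximity.INTEGRAL,
        "glasses": Proximity.CONTACT,
        "fitness tracker": Proximity.CONTACT,
        "feature phone": Proximity.REACHABLE,
        "smart speaker": Proximity.PROXIMITY,
        "telepresence robot": Proximity.REMOTE,
    }

    # Everything I could form an assemblage with right now, without an
    # intermediary, sits at PROXIMITY or closer:
    print([d for d, p in devices.items() if p <= Proximity.PROXIMITY])

The nice thing about ordering the levels is that "what can I form an assemblage with right now?" becomes a simple comparison.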

It's not very well thought out - a first iteration rather than a well-founded theory - but I find it helpful in puzzling through a few things. One of these is a simple way of noting the physical proximity that a piece of technology requires for me to form an assemblage with it. An insulin pump or pacemaker must be integral; glasses and clothing must be in contact; a smartphone must be reachable; a screen or voice control must be proximate. I can't form an assemblage with anything remote except through an intermediary - for example, I can't talk to someone on the other side of the world except by getting a phone within reach.

The other is that we can locate the range over which we project different senses (seeing, feeling, hearing something outside the range of our usual senses) and actions (pushing or gripping something we otherwise couldn't; projecting our voice further than usual), as well as the relative location of the device in question. For example, a "feature phone" (a mobile phone that isn't a smartphone) requires touch - it needs to be in physical contact with me for me to operate it - but it allows me to hear and speak to someone miles away. Provided, of course, that they also have a phone of some sort.
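Continuing the toy sketch from above, each device could carry three properties: the proximity at which I must be to operate it, and the distances to which it projects my senses and my actions. Again, the values are illustrative, not measurements:

    from dataclasses import dataclass

    # Sketch of the second idea: a device has a proximity level required
    # to operate it, plus the ranges to which it projects sense and action.
    @dataclass
    class Device:
        name: str
        operate_at: str        # proximity level needed to use it
        projects_sense_to: str
        projects_action_to: str

    feature_phone = Device(
        name="feature phone",
        operate_at="contact",         # must physically touch it
        projects_sense_to="remote",   # hear someone miles away
        projects_action_to="remote",  # speak to someone miles away
    )

    da_vinci = Device(
        name="Da Vinci console",
        operate_at="contact",
        projects_sense_to="remote",   # view inside the patient
        projects_action_to="remote",  # manipulators move elsewhere
    )

    for d in (feature_phone, da_vinci):
        print(f"{d.name}: operate at {d.operate_at}, "
              f"sense to {d.projects_sense_to}, act to {d.projects_action_to}")

Seen this way, the feature phone and the surgical console look structurally rather similar - operated by contact, projecting sense and action to somewhere remote - which is perhaps the point.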

To what extent am I projecting myself in using such a phone? Probably not much: I don't really regard myself as being in proximity with someone I call. What if I make a video call using Skype or Adobe Connect or FaceTime? Am I projecting myself then? What if I'm a surgeon using Da Vinci? Am I projecting myself into the patient? There's a whole bunch of questions about what constitutes "self" there, but I feel like the distinction between different types of distance is useful.

Anyway, I just thought I'd put it up there. It's a work in progress, and I'll have a stab at mapping out some examples to see if it's useful. As always, thoughts are welcome!