On October 11, Shahram Jalaliniya will defend his thesis “A Body-and-mind Centric Approach to Wearable Personal Assistants” at the IT University of Copenhagen.
Title: A Body-and-mind Centric Approach to Wearable Personal Assistants
Time: Tuesday 11th of October 13.00-17.00 (max)
Place: Room 4A22 at the IT University of Copenhagen. From Malmö C, take the Öresundståg to Ørestad and then the metro to DR Byen station (on the same train ticket).
PhD candidate: Shahram Jalaliniya (post-doc at IOTAP)
Supervisor: Thomas Pederson (senior lecturer at Malmö University)
Assessment committee members:
Professor Kasper Hornbæk, Department of Computer Science, University of Copenhagen, Denmark.
Professor Jeff Pelz, Carlson Center for Imaging Science, Rochester Institute of Technology, USA.
Associate Professor Erik Grönvall, IT University of Copenhagen, Denmark (Chairman).
ABSTRACT | Tight integration between humans and computers has long been a vision in wearable computing (“man-machine symbiosis”, “cyborg”), motivated by the augmented capabilities in thinking, perceiving, and acting that such integration could bring. However, even recent wearable computers (e.g. Google Glass) are far from such a tight integration with their users. Apart from the purely technological challenges, progress is also hampered by system designers' common attempts to transplant existing interaction paradigms from desktop and mobile computing (e.g. visual output, touch-based input, explicit human-computer dialogue) into what is in fact a completely new context of use, in which users interact with the device(s) on the move and in parallel with real-world tasks. This gives rise to several physical, perceptual, and cognitive challenges due to the limitations of human attentional resources.

In fact, while wearable computers in recent years have become smaller and physically closer to our bodies, I argue in this thesis that to achieve a tighter man-computer symbiosis (e.g. such that some everyday decision-making can be offloaded from human to system, as in context-aware computing) we also need to tie the computer system closer to the conscious and unconscious parts of our minds. In this thesis, I propose a conceptual model for integrating wearable systems into the human perception-cognition-action loop. I empirically investigate the utility of the proposed model for the design and evaluation of a Wearable Personal Assistant (WPA) for clinicians on the Google Glass platform. The results of my field study in a Copenhagen hospital simulation facility revealed several challenges for WPA users, such as unwanted interruptions, social and perceptual problems of parallel interaction with the WPA, and the need for more touch-less input modalities.
My further exploration of touch-less input modalities such as body gestures and gaze showed the great potential of using eye movements as an implicit input to WPAs. Since involuntary eye movements (e.g. optokinetic nystagmus) are unconscious reactions of the brain to external stimuli, analyzing such movements in WPAs opens new opportunities for unconscious interaction (and man-machine symbiosis). Our EyeGrip prototype successfully demonstrated completely unconscious user control over visual information flows (e.g. Facebook feeds), magically halting the scrolling when an entity of interest shows up. In this thesis I lay out a conceptual framework and demonstrate the potential of some new avenues for achieving tighter integration between wearable computers and human agents.
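The EyeGrip idea can be illustrated with a minimal sketch (my own illustration under stated assumptions, not the thesis implementation; the function name, parameters, and thresholds are hypothetical). While a feed scrolls past, optokinetic nystagmus produces slow pursuit phases at roughly the scroll speed, interrupted by fast resetting saccades in the opposite direction; a pursuit phase that lasts much longer than a typical OKN slow phase suggests the user has locked on to one item, at which point scrolling could halt:

```python
# Hypothetical sketch: detect "lock-on" to an item in a scrolling feed
# from vertical gaze velocity samples. During normal OKN, slow pursuit
# (gaze speed ~ scroll speed) alternates with fast resetting saccades.
# A sustained pursuit run signals interest in one item.

def detect_lock_on(gaze_velocities, scroll_velocity,
                   pursuit_tolerance=0.3, min_pursuit_samples=30):
    """Return the sample index at which a sustained pursuit phase is
    detected, or None. gaze_velocities are per-sample vertical gaze
    speeds in the same units/sign as scroll_velocity. Thresholds are
    illustrative, not empirically derived."""
    run = 0
    for i, v in enumerate(gaze_velocities):
        # A sample counts as pursuit if gaze roughly matches scroll speed.
        if abs(v - scroll_velocity) <= pursuit_tolerance * abs(scroll_velocity):
            run += 1
            if run >= min_pursuit_samples:
                return i  # sustained pursuit: halt scrolling here
        else:
            run = 0  # a saccade or unrelated movement resets the run
    return None
```

In a real system the gaze stream would come from the eye tracker's API and the thresholds would be calibrated per user; the point is only that the interest signal is derived from an involuntary reflex, so the user never issues an explicit command.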