The Handbook of Multimodal-Multisensor Interfaces, Volume 1. Sharon Oviatt

using examples that researchers have already studied.

       Guidance Targets and Constraints

      We spoke earlier of guiding as an interactive goal that can benefit from multimodal coordination. Even for this specific goal, the haptic channel can take many forms. We’ll give examples that vary on the recurrence/continuity parameter, spanning both kinesthetic (via force feedback) and tactile varieties.

      Haptics can provide virtual constraints and fields. In virtual space (3D virtual environment, driving game or 2D graphical user interface), with a force feedback device it is possible to render force fields that can assist the user in traversing the space or accomplishing a task. These “virtual fixtures” were first described as perceptual overlays: concrete physical abstractions (walls, bead-on-wire) superposed on a rendered environment [Rosenberg 1993], which can be understood as a metaphor for a real-world fixture such as using a ruler to assist in drawing a straight line. This concept is a fertile means of constructing haptic assistance [Bowyer et al. 2014], which has been used repeatedly in areas such as teleoperated surgical assistance [Lin and Taylor 2004], and efficient implementations devised, e.g., for hand-steadying [Abbott et al. 2003].
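
      The ruler metaphor above can be sketched in code. The following is a minimal, hypothetical illustration (not from the cited works) of rendering a "ruler" virtual fixture as a spring force that pulls a device tip toward a line segment; the stiffness value is illustrative only.

```python
def fixture_force(pos, a, b, k=200.0):
    """Spring force (Fx, Fy) pulling a device tip at `pos` toward the
    line segment a-b: a minimal 'ruler' virtual fixture sketch.
    Points are (x, y) tuples in meters; k is an illustrative stiffness in N/m."""
    abx, aby = b[0] - a[0], b[1] - a[1]
    # Parameter of the closest point on the segment, clamped to [0, 1].
    t = ((pos[0] - a[0]) * abx + (pos[1] - a[1]) * aby) / (abx * abx + aby * aby)
    t = max(0.0, min(1.0, t))
    cx, cy = a[0] + t * abx, a[1] + t * aby
    # Hooke's-law pull toward the closest point on the constraint.
    return k * (cx - pos[0]), k * (cy - pos[1])

# A tip hovering 0.1 m above a horizontal "ruler" is pulled straight down.
f = fixture_force((0.5, 0.1), (0.0, 0.0), (1.0, 0.0))
```

      In a real force-feedback loop this computation would run at haptic rates (on the order of 1 kHz), with the force sent to the device each cycle.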

      Haptics can predict user goals. To provide guidance without getting in the way, the designer must know something of what the user will want to do; but if the user’s goal was fully known, the motion could be automated and guidance not needed. In dynamic environments like driving, a fixture can be exploited as a means of sharing control between driver and automation system. The road ahead is a potential fixture basis, and a constraint system can draw the vehicle toward the road while leaving actual control up to the driver [Forsyth and MacLean 2006].
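
      One common way to realize such shared control is to blend the driver's input with a corrective term derived from the road geometry. The sketch below is a simplified, hypothetical blending rule (the gains, the `authority` parameter, and the function itself are assumptions for illustration, not the controller of Forsyth and MacLean [2006]).

```python
def shared_steering(driver_torque, lateral_error, heading_error,
                    k_lat=0.8, k_head=1.5, authority=0.4):
    """Blend the driver's steering torque with a corrective torque that
    draws the vehicle toward the lane center (the road-as-fixture idea).
    `authority` in [0, 1] caps how strongly the fixture intervenes,
    leaving final control with the driver. All gains are illustrative."""
    guidance = -(k_lat * lateral_error + k_head * heading_error)
    return (1.0 - authority) * driver_torque + authority * guidance
```

      With `authority = 0`, the driver feels no guidance at all; with `authority = 1`, the system steers alone. Intermediate values keep the human in the loop while still drawing the vehicle toward the road.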

      Haptics can layer guidance onto graphical user interfaces (GUIs), or alternatively be built from scratch into visuo-haptic interfaces. Researchers have often sought to add guiding haptic feedback to GUIs, essentially layering a haptic abstraction on top of one designed for visual use. This has been tricky to get right, and some argue the need to start from scratch. Smyth and Kirkpatrick [2006] developed a bimanual system whereby one hand uses a force feedback device to set parameters in a complex drawing program while the mouse hand independently draws—an example of complementary roles of the two modalities. Some guidelines emerged: design for rehearsal; use vision for controlling novel tasks and haptics for routine tasks; and use haptic constraints to compensate for the inaccuracies of proprioception.

      Haptics can provide discrete cues. That most familiar of haptic mediums, the vibrotactile buzz, has been well studied for guidance cueing: of spatial direction [Gray et al. 2013], walking speed [Karuei and MacLean 2014], timing awareness [Tam et al. 2013], and posture [Tan et al. 2003, Zheng et al. 2013]. In Section 3.3.3, we discuss vocabulary development for more informative discrete communicative elements.
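
      A small cue vocabulary of this kind might be organized as follows; the event names, tactor layout, and timing values here are hypothetical, chosen only to show how discrete cues can be expanded into actuator commands.

```python
# Hypothetical cue vocabulary: each guidance event maps to a
# (tactor_index, pulse_duration_ms, repetitions) triple.
CUES = {
    "turn_left":  (0, 120, 2),   # left-wrist tactor, two short pulses
    "turn_right": (1, 120, 2),   # right-wrist tactor, two short pulses
    "slow_down":  (2, 400, 1),   # one long pulse on a back tactor
}

def schedule_cue(event, gap_ms=80):
    """Expand a named cue into (tactor, start_ms, duration_ms) commands,
    spacing repeated pulses by `gap_ms` of silence."""
    tactor, dur, reps = CUES[event]
    return [(tactor, i * (dur + gap_ms), dur) for i in range(reps)]
```

      Distinguishing cues by location (which tactor), rhythm (repetitions and gaps), and duration is a common strategy, since these parameters are among the most reliably perceived.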

      Haptics can provide spatial marking. Highly relevant to guiding interactions, the addition of spatially informative sensations to touched surfaces is becoming possible through several emerging technologies, whether the surface is co-located with a graphic display (touchscreen) or mapped to it (as with a trackpad accessed through fingertip or stylus, or a haptically enabled mouse). Most basically, a vibrotactile actuator can jolt an entire touched surface when a finger crosses a boundary; our brain attributes the “bump” to the touched point rather than the entire screen. Variable friction can render textures that mark regions of a surface [Levesque et al. 2011], but because the whole surface has the same coefficient of friction at a given instant, state changes are salient but not felt as edges under the finger.
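
      The boundary-crossing jolt reduces to a simple check between successive finger-position samples. This sketch (hypothetical, one-dimensional for clarity) returns the edges crossed since the last sample, each of which should trigger a brief whole-surface pulse.

```python
def edge_pulses(prev_x, new_x, edges):
    """Return the region-edge positions a finger crossed between two
    position samples; each should trigger one brief whole-surface
    vibrotactile pulse, which the user perceives as a localized 'bump'
    under the finger rather than a shake of the entire screen."""
    lo, hi = min(prev_x, new_x), max(prev_x, new_x)
    return [e for e in edges if lo < e <= hi]

# Moving from x=0.1 to x=0.55 across region edges at 0.25, 0.5, 0.75
# crosses the first two edges, so two pulses are fired.
crossed = edge_pulses(0.1, 0.55, [0.25, 0.5, 0.75])
```
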

      Marking traceable edges requires the capacity to independently display different haptic states to skin that touches a surface at different points, through multiple fingers, different parts of the hand, or adjacent points on one finger. Present efforts have not yet simultaneously achieved high resolution, high refresh rate, optical transparency, and low cost. Recent advances in shape display, using technologies ranging from shape memory polymers (http://www.blindpad.eu) to mechanical structures [Jang et al. 2016], are promising.

       Improving Specific Performance and General Quality

      Quantifiable performance improvements are always easier to value than more qualitative ones, whether they benefit safety, efficiency, or some other monetizable parameter. As with many interface innovations, however, performance improvement often manifests as a fluidity or reduction in effort that lessens fatigue over a period in which the user is doing many different things, and it can be difficult to isolate causally or to measure precisely.

      Exceptions arise when haptic feedback is applied to error suppression in situations where users are known to be particularly error-prone. For example, drivers often have difficulty with verbal left/right direction commands, whereas spatially delivered haptic cues are likely to improve performance without diverting visual or auditory attention from the driving task. Haptic feedback can also increase dexterity in surgical simulations and teleoperated environments, and facilitate simple pointing tasks on GUIs or touchscreens [Poupyrev and Maruyama 2003, Levesque et al. 2011]. These are all changes that can be measured, at least in controlled laboratory settings, with some transfer to real environments inferred.

      More broadly, haptic feedback is often found to contribute to the user’s sense of immersion through addition of a sensory modality, for gaming environments, virtual reality, and teleoperated or minimally invasive surgery. Immersion is generally accepted as beneficial, enabling secondary performance improvements by dint of focus and clarity, or greater engagement and enjoyment and thus product success.

       Affect or Emotion Display

      Haptic elements, both input and output, can be used for affective coloring of an interactive experience, whether as an overt user expression (as in “conviction widgets” [Chu et al. 2009]) or as deliberate conveyance of emotion to another person [Smith and MacLean 2007]. Incoming to the user, attention to affective haptic design can influence how signals are interpreted [Swindells et al. 2007], make them more understandable and memorable [Klatzky and Peck 2012, Seifi and MacLean 2013], and contribute to a sense of delight in the interaction [Levesque et al. 2011].

      Sometimes the primary purpose of a person-to-person communication is affective in nature. Haptics can contribute to such enrichment. Therapeutically, touch-centric mediums such as haptic social robots can act both socially and physiologically on a human to change emotional state [Inoue et al. 2012, Sefidgar et al. 2015].

      Designers of effective haptic sensations within a multimodal interaction must understand which properties of haptic signals are manipulable, how they are perceived, and what schemas exist for encoding meaning into them.

       3.3.1 The Sensation

      Delivered through a heterogeneous set of technologies, haptic sensations target different human mechanoreceptors, and further vary in energetic state and expressive properties.

      A sensation can be kinesthetic or tactile. The most common type of proprioceptively targeted haptic display is force feedback, in which the device exerts a force on the user’s body (often a hand) while the user moves the device through space (e.g., shaking hands with a robot, or teleoperated surgery). Vibrotactile actuators, alone or in arrays, produce the best-known tactile sensations. Others include programmable friction [Winfield et al. 2007, Levesque et al. 2011], ultrasonic sensations [Carter et al. 2013], and thermal feedback [Ho and Jones 2007].
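
      The canonical force-feedback primitive is the virtual wall, rendered by a penalty method: when the device tip penetrates the wall, a spring-damper pushes it back out. The sketch below is a textbook-style illustration (gains and geometry are assumptions, not tied to any cited system); a real device would evaluate it in a loop at roughly 1 kHz.

```python
def wall_force(x, v, wall_x=0.0, k=1000.0, b=5.0):
    """Penalty-based rendering of a virtual wall occupying x < wall_x.
    Inside the wall, push back with a spring-damper proportional to
    penetration depth and velocity; outside, render zero force.
    k (N/m) and b (N*s/m) are illustrative values."""
    depth = wall_x - x          # penetration depth (positive inside wall)
    if depth <= 0.0:
        return 0.0
    return k * depth - b * v    # stiffness term plus damping term
```

      The achievable wall stiffness is limited by the loop rate and device dynamics: too high a `k` at too low an update rate makes the wall feel buzzy or unstable rather than solid, which is one reason haptic loops run so much faster than graphics loops.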

      A sensation’s salience can vary, from intrusive to ambient. Haptic sensations can be designed to instantly capture the user’s attention (e.g., vibrotactile (VT) notifications) or be present
