Haptic shape cues, invariants, priors and interface design

 Vincent Hayward 


Introduction

Perception is often discussed by reference to cues as separate sources of information for the perceiver [1]. In vision and audition, the list of such known cues is quite extensive [2, 3]. For example, visual depth perception in humans is thought to rely on monocular, oculomotor and binocular cues. Monocular depth cues include motion parallax, color contrast, perspective, relative size, relative height, focus, occlusion, shading, texture gradient, shadows, interreflections, and others. Oculomotor cues include accommodation and convergence. Binocular cues include disparity-based stereopsis. Such collections have also been identified for other object qualities such as size or color. In audition, say for object localization, there are analogous notions, such as the interaural time difference, the interaural intensity difference, or spectral cues related to head-related transfer functions, in addition to monaural cues [4].
These cues are tied to the manner in which the sensory apparatus – physically and computationally – has evolved to account for the ambient physics. For example, sound localization obeys fundamental constraints related to the propagation of sound, such as wavelength and speed of propagation. Nature has developed marvelous mechanisms to cope with these constraints and, at the same time, to take advantage of them.
It is thus natural to propose that for touch, as for vision and audition, such physically and computationally specific cues must exist and can be identified. This chapter discusses some putative tactile cues that refer to shape, one of the object attributes that a perceiver could be interested in.
To this end, the notion of invariant will be used to identify a collection of possible tactile shape cues, and priors necessary for the processing of haptic shape are suggested from the analysis of experimental evidence. Examples of how these notions can be applied are described by examining two specific haptic detection tasks and how stereotypical movements can be interpreted.
Displays may be thought to operate like ‘mirrors’ of the perceptual system. The color channels of an LCD display ‘mirror’ the color channels of the visual system. The fast repetition of frames – a sampling process – ‘mirrors’ the computational spatiotemporal interpolation performed by the visual system – a reconstruction process. Such examples abound. For haptic interfaces one may adopt a similar viewpoint, and examples of how this approach can be applied are discussed later.
...