Implicit and explicit memory effects in haptic perception

Soledad Ballesteros


There is considerable evidence in the literature that memory is not a unitary entity. Early neuropsychological findings from different groups of patients and more recent laboratory studies agree on the major memory systems in the human brain. A distinction is made between declarative or explicit memory and nondeclarative or implicit memory. Of special interest within declarative memory is the distinction between episodic and semantic memory [1]. Episodic memory includes personal experiences and the conscious recollection of events in our past; this type of memory is defined by the capacity to voluntarily retrieve facts and events in their spatio-temporal context. Semantic memory, on the other hand, refers to our general knowledge, including the meaning of words and concepts.
Two decades ago, Graf and Schacter [2] used the terms implicit and explicit memory to refer to two different forms of memory as well as two ways of accessing previously encoded information. Explicit memory refers to the conscious recollection of previous experience with stimuli (words, pictures, objects, etc.), whereas implicit memory is inferred when previous experience with stimuli influences performance without intentional or conscious retrieval of the previously perceived information. Most research on implicit and explicit memory has focused on verbal stimuli or pictures presented visually, while studies presenting stimuli tactually have been very limited in number.
Researchers studying touch have drawn attention to the historical lack of interest in this modality [3, 4]. For example, Heller [3] pointed out that psychologists had emphasised the study of visual shape perception. As he recognised, touch does not function as efficiently as vision in detecting the outlines of shapes, because the tactile modality is slower than vision and scans the stimulus sequentially. These differences in mode of operation led researchers to regard touch as less important than vision. However, this situation has changed in recent years: during the last decade, a number of laboratories around the world have dedicated considerable effort and research resources to studying how touch works [5]. It is true that human vision is an outstanding perceptual modality that allows sighted people to rapidly gather highly precise information about objects in space and their spatial relations. However, when human perceivers actively explore objects with their hands, they too extract a wealth of high-quality sensory information for further processing [6].