
Multimodal Interactions: Tactile-Auditory

When we explore a texture with our hands or simply rub our hands together, the tactile percept is accompanied by simultaneous sounds. Although not always obvious in everyday life, this auditory feedback conveys useful information about surface properties. This entry describes examples of how auditory feedback influences tactile perception and some of the proposed mechanisms. Experiments can unmask such an interaction between auditory and tactile perception, for example in what Jousmäki and Hari called the parchment-skin illusion, demonstrated as follows: Subjects rubbed their palms together while the ensuing sound was played back to them over headphones. When the high frequencies of the sound (above 2,000 hertz, Hz) were selectively enhanced, subjects reported that the palmar skin felt dry and parchment-like, compared with trials in which the original sound was played back. This striking example of audiotactile interaction highlights an aspect of multisensory perception that has been studied much less than audiovisual interaction.
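The spectral manipulation behind the parchment-skin illusion can be sketched in a few lines of NumPy: all frequency components of the rubbing sound above 2 kHz are amplified before playback. The broadband-noise stand-in signal and the 15 dB gain are illustrative choices, not parameters reported in the original study.

```python
import numpy as np

def boost_high_frequencies(signal, sample_rate, cutoff_hz=2000.0, gain_db=15.0):
    """Amplify all frequency components above cutoff_hz by gain_db."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    spectrum[freqs > cutoff_hz] *= 10.0 ** (gain_db / 20.0)   # dB -> amplitude
    return np.fft.irfft(spectrum, n=len(signal))

# Stand-in for a recorded palm-rubbing sound: 1 second of broadband noise.
rate = 44100
rng = np.random.default_rng(0)
noise = rng.standard_normal(rate)
sharper = boost_high_frequencies(noise, rate)
```

Playing `sharper` instead of `noise` back to the subject reproduces the key feature of the manipulation: spectral content below 2 kHz is untouched while everything above it is emphasized.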

In another example, sounds combined with vibrotactile stimuli applied to the fingers were reported as louder than the same sounds presented alone. This effect was demonstrated using vibrotactile stimuli delivered via a tube that subjects grasped, stimulating their fingers and palms at a perceived intensity of 24 to 28 decibels (dB) above threshold. Tones were 10 dB above threshold, measured within masking noise. Both stimuli had the same frequency, 200 Hz. Subjects adjusted tones to match a reference tone and chose lower intensities in trials with simultaneous auditory and tactile stimulation than in trials with auditory-only stimulation. To demonstrate audiotactile interaction, this experiment kept stimuli at low intensity levels, in line with the inverse-effectiveness rule: Multisensory interactions are most evident when the stimulus in at least one sensory modality is of low intensity, making information from that modality alone unreliable.

It is widely accepted that two further rules govern the size of multisensory interaction effects: Interaction is most likely to be evident when stimuli occur simultaneously (temporal rule) and when they are perceived as arising from the same spatial location (spatial rule)—conditions that would typically be met when stimuli represent a single audiotactile object. Whether or not the spatial rule applies to audiotactile interaction has been questioned in recent studies. Moreover, it is controversial whether the three rules of multisensory integration, mostly based on experiments at the single-neuron level, fully apply to behavioral studies.

Which of the two senses is dominant when information from both modalities, auditory and tactile, is available to the subject? The weighting depends on several factors, one of them being the subject's long-term experience of how useful information from a given modality is for a certain task. However, subjects can adjust the weighting to task demands: It has been reported, for example, that subjects who use a probe to explore a texture weight auditory information more strongly than during exploration with bare fingers. This adjustment reflects the louder sounds produced by the probe and the lower reliability of tactile feedback during probe exploration.
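Such reliability-dependent weighting is commonly formalized as maximum-likelihood cue combination, in which each modality's estimate is weighted by its inverse variance. The sketch below illustrates that model; the numerical estimates and variances are hypothetical, not data from the studies described above.

```python
def combine_cues(est_t, var_t, est_a, var_a):
    """Combine a tactile and an auditory estimate of the same property,
    weighting each by its reliability (inverse variance).
    Returns the combined estimate and the tactile weight."""
    rel_t, rel_a = 1.0 / var_t, 1.0 / var_a
    w_t = rel_t / (rel_t + rel_a)                 # tactile weight
    combined = w_t * est_t + (1.0 - w_t) * est_a
    return combined, w_t

# Bare fingers: tactile feedback is reliable (low variance) and dominates.
est, w_t = combine_cues(est_t=1.0, var_t=0.1, est_a=2.0, var_a=0.4)
print(est, w_t)   # tactile weight is high (about 0.8)

# Probe: tactile variance rises, so the auditory estimate gains weight.
est, w_t = combine_cues(est_t=1.0, var_t=0.4, est_a=2.0, var_a=0.1)
print(est, w_t)   # tactile weight drops (about 0.2)
```

The same formula captures the inverse-effectiveness intuition: the noisier a modality's input, the smaller its influence on the combined percept.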

What may be the neural basis of audiotactile interaction that—as the previous examples illustrate—works both ways, with auditory stimuli biasing tactile perception and vice versa? One might consider two interesting parallels between the two modalities, as pointed out by von Békésy nearly 50 years ago: First, they both require neural processing with analyses of rich temporal structures, for example, when sound or vibration frequencies are discriminated. Second, the brain processes auditory as well as vibrotactile stimuli as input from receptors that transduce movement into neural activation: from auditory hair cells that are excited when sound waves make the basilar membrane of the inner ear flex, and from the rapidly adapting Pacinian corpuscles in the skin that are most sensitive to vibration in the range of 100 to 300 Hz—frequencies that are well within the sensitivity range of the human auditory system. So could it be that the auditory cortex, highly specialized in the analysis of temporal patterns (as they occur in speech and environmental sounds), contributes to the processing of temporally structured vibrotactile stimuli?

...
