Vibrotactile sleeves and multimodal armbands show promise as devices that transmit information to a user through the sense of touch. In this way, individuals can receive information haptically when the typical auditory or visual channels are preoccupied or unavailable. To achieve this, individuals must learn the mapping between haptic cues and informational icons through cross-modal associative learning. This process is limited both by users' perceptual capabilities and by the lack of neural markers for quantifying how well haptic learning is progressing. To optimize future wearable displays and the training methods needed to realize a haptic communication channel via the arm, we need to better understand how to improve the perceptibility of tactile cues transmitted to users, and to develop neural correlates that track haptic learning.
Evaluating the Effect of Stimulus Duration on Vibrotactile Cue Localizability With a Tactile Sleeve
Vibrotactile arrays are appealing as wearable haptic devices because designers can vary parameters such as cue location and duration to create distinct haptic icons representing a wide range of information. Vibrotactile sleeves have typically used cues lasting 100 to 400 ms, but it is not well understood how cue duration affects the localizability of stimuli. Using an experimental protocol typically employed to study how the visual system localizes stimuli, we examined localization of vibrotactile cues delivered by tactors at fixed locations along the forearm, using a custom Vibro-Tactile Sleeve (VT-Sleeve), while varying cue duration between 100 and 400 ms.
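As a minimal sketch of the analysis this kind of protocol supports, the snippet below tabulates proportion-correct localization per cue duration from trial records. The trial format (parallel lists of duration, target tactor, and response tactor) is a hypothetical simplification, not the study's actual data structure.

```python
import numpy as np

def localization_accuracy(durations_ms, target_tactor, response_tactor):
    """Proportion-correct localization per cue duration.

    Assumes a hypothetical trial format: three parallel sequences giving
    each trial's cue duration, the tactor that vibrated, and the tactor
    the participant reported.
    """
    durations_ms = np.asarray(durations_ms)
    correct = np.asarray(target_tactor) == np.asarray(response_tactor)
    # Average the correctness indicator within each duration condition.
    return {int(d): float(correct[durations_ms == d].mean())
            for d in np.unique(durations_ms)}

# Toy example: four trials at two durations over four tactor sites.
acc = localization_accuracy(
    durations_ms=[100, 100, 400, 400],
    target_tactor=[0, 1, 2, 3],
    response_tactor=[0, 2, 2, 3],
)
# acc maps each duration (ms) to the proportion of correctly localized cues.
```

Comparing these per-duration accuracies (e.g., with a confusion matrix per condition) is one straightforward way to quantify how duration influences localizability.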
“Evaluating the Effect of Stimulus Duration on Vibrotactile Cue Localizability with a Tactile Sleeve,” IEEE Transactions on Haptics, vol. 14, pp. 328–334, 2021.
Representational Similarity Analysis for Tracking Neural Correlates of Haptic Learning on a Multimodal Device
A goal of wearable haptic devices has been to enable haptic communication, in which individuals learn to map information typically processed visually or aurally onto haptic cues via a process known as cross-modal associative learning. Neural correlates have been used to evaluate haptic perception and may provide a more objective measure of association performance than the behavioral measures commonly used. In this project, we examined Representational Similarity Analysis (RSA) of electroencephalography (EEG) as a framework for evaluating how the neural representation of multifeatured haptic cues changes with association training.
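To make the RSA framework concrete, the sketch below shows its core computation: build a representational dissimilarity matrix (RDM) from condition-averaged neural patterns, then compare RDMs via rank correlation of their upper triangles. The array shapes and random data are illustrative assumptions, not the study's actual EEG pipeline.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.stats import spearmanr

def neural_rdm(patterns):
    """RDM (conditions x conditions) from condition-averaged patterns.

    `patterns` is a (conditions x features) array, e.g., one row per
    haptic cue with EEG channel amplitudes as features (assumed format).
    Dissimilarity is 1 - Pearson correlation between condition patterns.
    """
    return squareform(pdist(patterns, metric="correlation"))

def rdm_similarity(rdm_a, rdm_b):
    """Spearman correlation between the upper triangles of two RDMs."""
    iu = np.triu_indices_from(rdm_a, k=1)
    rho, _ = spearmanr(rdm_a[iu], rdm_b[iu])
    return rho

# Toy example: 4 haptic cue conditions, 32 EEG channel features.
rng = np.random.default_rng(0)
pre = rng.normal(size=(4, 32))                      # "pre-training" patterns
post = pre + rng.normal(scale=0.1, size=(4, 32))    # "post-training" patterns

sim = rdm_similarity(neural_rdm(pre), neural_rdm(post))
```

Tracking such RDM correlations against a model RDM (e.g., one derived from the cue features participants are trained to associate) before and after training is one way a neural marker of associative learning could be quantified.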