Syntacts: Open Source Framework for Audio-Controlled Vibrotactile Haptics
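Syntacts drives tactors through ordinary audio hardware by synthesizing vibration waveforms the way audio is synthesized: a carrier oscillator shaped by an amplitude envelope. The pure-Python sketch below illustrates that signal model only; it is not the Syntacts API, and the function names (`sine_carrier`, `asr_envelope`, `render_cue`) and default parameters are illustrative assumptions.

```python
import math

def sine_carrier(freq_hz, t):
    """Sine oscillator sampled at time t (seconds)."""
    return math.sin(2 * math.pi * freq_hz * t)

def asr_envelope(t, attack, sustain, release):
    """Attack-sustain-release amplitude envelope, ranging 0..1."""
    if t < 0:
        return 0.0
    if t < attack:
        return t / attack                      # ramp up
    if t < attack + sustain:
        return 1.0                             # hold
    if t < attack + sustain + release:
        return 1.0 - (t - attack - sustain) / release  # ramp down
    return 0.0

def render_cue(freq_hz=175.0, attack=0.05, sustain=0.2, release=0.05, fs=48000):
    """Sample a vibrotactile cue (carrier * envelope) at audio rate fs."""
    duration = attack + sustain + release
    n = int(duration * fs)
    return [sine_carrier(freq_hz, i / fs) * asr_envelope(i / fs, attack, sustain, release)
            for i in range(n)]

samples = render_cue()  # 0.3 s cue at a 175 Hz carrier, common for tactors
```

In a framework like Syntacts, a buffer of this kind would be streamed to a tactor channel through the sound card's DAC, so cue design reduces to composing and multiplying signals.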
Multisensory haptic cues have the potential to transmit a wider variety of information in the same amount of time as single-sensory cues. However, such cues can also interfere with one another, making each feel less salient. Because it is critical that the cues transmitted to a user remain conspicuous, we use the AIMS Testbed to investigate how multisensory haptic cues are perceived and how that perception changes when the cues are modified.
This research project focuses on delivering haptic guidance through cutaneous (skin stretch and squeeze) feedback to help people train for new tasks. Haptic devices are tremendously useful for giving customized feedback during training: they can simulate the forces associated with a real-world task, or provide guidance forces that help users complete the task more effectively and accurately. It has been shown, however, that providing task forces and guidance forces simultaneously through the same haptic interface can confuse users and worsen performance.
This project investigates human perception of haptic (touch) cues. The field of haptics needs a standardized method to characterize haptic cues and to assess human perception of them: most haptic devices are characterized with methods unique to a single experiment, making direct comparisons across studies difficult. To meet this need, we have developed the AIMS (Adjustable Instrumented Multisensory Stimuli) Testbed, a modular and instrumented platform that allows flexible testing of, and comparison between, haptic cues.
Robotic devices are excellent candidates for delivering the repetitive, intensive practice that can restore functional use of the upper limbs, even years after a stroke. Rehabilitation of the wrist and hand in particular is critical for recovery of function, since the hands are our primary interface with the world. However, robotic devices that focus on hand rehabilitation are held back by excessive cost, complexity, or limited functionality. A design and control strategy for such devices that bridges this gap is needed.
The objective of this research effort is to develop a rehabilitation robot and associated controllers to be used in both therapy and evaluation of subjects with incomplete spinal-cord injuries. We are working in collaboration with Dr. Gerard Francisco and Dr. Nuray Yozbatiran of TIRR-Memorial Hermann and UTHealth.
Vibrating muscle tendons at a range of frequencies is known to produce movement illusions in human subjects. Although the literature includes examples of vibrotactile actuators transmitting simple cues such as direction information, vibration-induced movement illusions have not been exploited as a method of providing illusory kinesthetic feedback. One promising application is artificial proprioception for prosthetic devices.
The primary goal of this research effort is to improve the effectiveness of skill transfer, rehabilitation, and collaboration via haptic devices. We hypothesize that mediating robotic interfaces (either serving as the expert or placed between a human expert and the novice) can facilitate and improve the effectiveness of skill transfer and collaboration in expert-novice pairs as well as in therapist-patient rehabilitation interactions. Various shared control system architectures for skill transfer are being studied in two phases.