Mechatronics and Haptic Interfaces Lab


Importance of Wrist Movement Direction in Performing Activities of Daily Living Efficiently

Neural activity modulations and motor recovery following brain-exoskeleton interface mediated stroke rehabilitation

A Multi-sensory Approach to Present Phonemes as Language through a Wearable Haptic Device

Spatially Separated Cutaneous Haptic Guidance for Training of a Virtual Sensorimotor Task

Syntacts: Open-Source Software and Hardware for Audio-Controlled Haptics

Simply Grasping Simple Shapes: Commanding a Humanoid Hand with a Shape-Based Synergy

Explorations of Wrist Haptic Feedback for AR/VR Interactions with Tasbi

MISSIVE: Multisensory Interface of Stretch, Squeeze and Integrated Vibration Elements

MISSIVE incorporates skin stretch, squeeze, and vibration cues presented simultaneously to the user in distinct patterns. Using multisensory cues allows us to design large discrete cue sets while maintaining a small, wearable form factor. With MISSIVE, we demonstrated language transmission via haptic phonemes: units of sound encoded as haptic cues with vibration, radial squeeze, and lateral skin stretch components.
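The haptic-phoneme idea above can be sketched in a few lines. This is a purely illustrative encoding, not MISSIVE's actual cue set: the phoneme symbols, channel values, and class names below are hypothetical.

```python
# Hypothetical sketch: each phoneme maps to a distinct multisensory cue
# combining vibration, radial squeeze, and lateral skin stretch.
from dataclasses import dataclass


@dataclass(frozen=True)
class HapticCue:
    vibration_hz: float  # vibrotactile frequency
    squeeze: float       # normalized radial squeeze intensity (0..1)
    stretch_dir: str     # lateral skin stretch direction: 'cw', 'ccw', or 'none'


# Illustrative cue set: distinctness comes from varying all three channels,
# so similar-sounding phonemes can still feel clearly different.
PHONEME_CUES = {
    'AE': HapticCue(vibration_hz=60.0, squeeze=0.0, stretch_dir='none'),
    'K':  HapticCue(vibration_hz=250.0, squeeze=0.5, stretch_dir='cw'),
    'T':  HapticCue(vibration_hz=250.0, squeeze=0.5, stretch_dir='ccw'),
}


def encode_word(phonemes):
    """Translate a phoneme sequence into the cue sequence to render."""
    return [PHONEME_CUES[p] for p in phonemes]


cues = encode_word(['K', 'AE', 'T'])  # the word "cat": three cues, one per phoneme
```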

Syntacts: Open Source Framework for Audio-Controlled Vibrotactile Haptics

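The core idea behind audio-controlled vibrotactile haptics is to synthesize tactor drive signals the same way one synthesizes audio: a carrier waveform shaped by an envelope. The sketch below illustrates that concept in plain Python; it assumes nothing about Syntacts' actual API, and the function names and parameter values are illustrative.

```python
import math


def sine(freq_hz, duration_s, sample_rate=44100, amplitude=1.0):
    """Synthesize a vibration waveform as audio-rate samples."""
    n = int(duration_s * sample_rate)
    return [amplitude * math.sin(2 * math.pi * freq_hz * i / sample_rate)
            for i in range(n)]


def asr_envelope(samples, attack_s, sustain_s, release_s, sample_rate=44100):
    """Shape a waveform with an attack-sustain-release envelope,
    a common way to give a tactile cue a distinct temporal profile."""
    a = int(attack_s * sample_rate)
    s = int(sustain_s * sample_rate)
    r = int(release_s * sample_rate)
    out = []
    for i, x in enumerate(samples[:a + s + r]):
        if i < a:
            gain = i / a                              # ramp up
        elif i < a + s:
            gain = 1.0                                # hold
        else:
            gain = max(0.0, 1.0 - (i - a - s) / r)    # ramp down
        out.append(x * gain)
    return out


# A 300 ms cue at 175 Hz (near peak skin sensitivity) with 50 ms ramps;
# the resulting sample list would be streamed to a tactor via audio hardware.
cue = asr_envelope(sine(175.0, 0.3), 0.05, 0.2, 0.05)
```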

Mechatronics and Haptic Interfaces Lab at Rice University

Mechanical Engineering Department, MS 656, 713-348-2300
Bioscience Research Collaborative 980, Houston, TX 77030