Humans use sensory information from the world to take action and complete tasks. While there are many strategies for augmenting human perception, touch-based systems provide a high-bandwidth, high-fidelity approach to improving human sensory capacity. However, this feedback often does not perfectly match what people see, and we have a limited understanding of how people integrate information during such sensory mismatch (e.g., seeing a mug while feeling a cylinder with different mechanical properties).
Postdoctoral Fellows
Volkan Patoglu, Ph.D., Professor
Mechatronics Programme
Faculty of Engineering and Natural Sciences
Sabanci University, Istanbul, Turkey
Ali Israr, Ph.D.
Disney Research, Meta Reality Labs
Graduate Students
Chris Bartley, M.S. (Draper Fellow), Air Force
Ben Black, M.M.E.; Ph.D. in ME, Georgia Institute of Technology; National Instruments
Kevin Bowen, M.S.M.E., ExxonMobil
Abhishek Gupta, Ph.D., Assistant Professor, Indian Institute of Technology Bombay
Research by Janelle Clark and Alix Macklin is featured.
Prototyping is in overdrive for our CDMRP project with UTHSC!