humancomputerintegration / touchfold

Touch&Fold: A Foldable Haptic Actuator for Rendering Touch in Mixed Reality

This is the repository for the PCB schematics and 3D-printing files used in the wearable device from the paper "Touch&Fold: A Foldable Haptic Actuator for Rendering Touch in Mixed Reality" (ACM CHI 2021).

hardware

  • DC motor (26:1 Sub-Micro Planetary Gearmotor 0.1 kg-cm, Pololu)
  • linear resonant actuator (LRA C10-100, Precision Micro Drives)
  • force sensor (FSR 400, Interlink Electronics)
  • photo interrupter (SG-105F, Kodenshi)

citing

When using or building upon this device in an academic publication, please consider citing as follows:

Shan-Yuan Teng, Pengyu Li, Romain Nith, Joshua Fonseca, Pedro Lopes. 2021. Touch&Fold: A Foldable Haptic Actuator for Rendering Touch in Mixed Reality. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (CHI ’21). Association for Computing Machinery, New York, NY, USA. https://doi.org/10.1145/3411764.3445099
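For convenience, the reference above can be written as a BibTeX entry (the citation key is our choice; all fields come from the reference and DOI above):

```bibtex
@inproceedings{teng2021touchfold,
  author    = {Teng, Shan-Yuan and Li, Pengyu and Nith, Romain and Fonseca, Joshua and Lopes, Pedro},
  title     = {Touch\&Fold: A Foldable Haptic Actuator for Rendering Touch in Mixed Reality},
  booktitle = {Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (CHI '21)},
  year      = {2021},
  publisher = {Association for Computing Machinery},
  address   = {New York, NY, USA},
  doi       = {10.1145/3411764.3445099}
}
```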

thanks / acknowledgments

We would like to thank our colleague Daniel Steinberg for assisting us in building a Project North Star headset. We also thank Leap Motion for open-sourcing that mixed reality headset, and Microsoft for open-sourcing the Mixed Reality Toolkit. We thank the University of Chicago’s Center for Data and Computing (CDAC) for their support with the HoloLens headset, and the University of Chicago’s Center for Unstoppable Computing (CERES) for their ongoing support. Finally, we sincerely thank our funding sources for making this work possible; this work was supported by the Sony Research Award program.

About

License: MIT License