Hand Tracking Visualization
jfResearchEng opened this issue
Feature
Hand tracking enables the use of hands as an input method on Oculus Quest headsets. Using hands as an input modality delivers a new sense of presence, enhances social engagement, and makes interactions more natural with fully tracked hands and articulated fingers.
We can use LabGraph to record the data captured from the Quest headset and visualize it (e.g., in Unity). The Oculus Hand Tracking API documentation is linked under Additional context below.
One set of Quest 2 and Link Cable could be provided for a US-based user who has contributed to LabGraph (subject to review/approval).
This task is a follow-up to #81, focusing on visualizing the obtained data.
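To make the intended pipeline concrete, here is a minimal sketch using LabGraph's publisher/subscriber API: a stand-in source node that emits hand-pose messages (in the real task this would replay the data recorded in #81) and a sink node that would hand the poses to the visualizer. The `HandPoseMessage` fields, node names, and the 24-joint shape are placeholders for illustration, not the actual Quest hand-tracking schema.

```python
import asyncio
import time
from typing import Tuple

import labgraph as lg
import numpy as np


# Placeholder message type: the real field layout should mirror whatever the
# #81 recording task emits for Quest hand-tracking data.
class HandPoseMessage(lg.Message):
    timestamp: float
    # Assumed layout: num_joints x 3 coordinates per sample.
    joint_positions: np.ndarray


class HandPoseSourceConfig(lg.Config):
    sample_rate: float  # Hz
    num_joints: int


# Stand-in source; the real node would read recorded/streamed Quest data
# instead of generating random poses.
class HandPoseSource(lg.Node):
    OUTPUT = lg.Topic(HandPoseMessage)
    config: HandPoseSourceConfig

    @lg.publisher(OUTPUT)
    async def publish_poses(self) -> lg.AsyncPublisher:
        while True:
            yield self.OUTPUT, HandPoseMessage(
                timestamp=time.time(),
                joint_positions=np.random.rand(self.config.num_joints, 3),
            )
            await asyncio.sleep(1 / self.config.sample_rate)


# Stand-in sink; the real node would forward poses to the chosen renderer
# (e.g., stream them to a Unity client or a labgraph_viz plot).
class HandPoseVisualizer(lg.Node):
    INPUT = lg.Topic(HandPoseMessage)

    @lg.subscriber(INPUT)
    def on_pose(self, message: HandPoseMessage) -> None:
        print(f"{message.timestamp:.3f}: pose shape {message.joint_positions.shape}")


class HandTrackingVizDemo(lg.Graph):
    SOURCE: HandPoseSource
    VISUALIZER: HandPoseVisualizer

    def setup(self) -> None:
        self.SOURCE.configure(HandPoseSourceConfig(sample_rate=30.0, num_joints=24))

    def connections(self) -> lg.Connections:
        return ((self.SOURCE.OUTPUT, self.VISUALIZER.INPUT),)

    def process_modules(self) -> Tuple[lg.Module, ...]:
        return (self.SOURCE, self.VISUALIZER)


if __name__ == "__main__":
    lg.run(HandTrackingVizDemo)
```

Running the module prints one line per simulated pose; swapping the source and sink for real Quest input and a Unity/labgraph_viz frontend is the actual scope of this task.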
Additional context
- An existing application can be found [here](https://developer.oculus.com/documentation/unity/unity-handtracking/)
- The code should be added under https://github.com/facebookresearch/labgraph/tree/main/devices/quest2/visualization
- Create setup.py and README.md; an example can be found at https://github.com/facebookresearch/labgraph/tree/main/extensions/labgraph_viz (a minimal setup.py sketch follows this list)
- Add GitHub Actions support; reference: https://github.com/facebookresearch/labgraph/actions/workflows/main.yml
- Add the proper license header (a sketch of the header also follows this list).
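For the setup.py item, a minimal sketch using setuptools is below. The package name, version, and dependency list are assumptions for illustration; the labgraph_viz extension linked above should be treated as the authoritative template.

```python
from setuptools import find_packages, setup

setup(
    name="quest2_visualization",  # placeholder name; follow the repo's naming convention
    version="1.0.0",              # placeholder version
    description="Visualization of Quest 2 hand tracking data recorded with LabGraph",
    packages=find_packages(),
    python_requires=">=3.6",
    install_requires=[
        "labgraph",  # assumed dependencies; pin versions as labgraph_viz does
        "numpy",
    ],
)
```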
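For the license-header item, facebookresearch projects such as LabGraph (MIT-licensed) typically prepend a header along these lines; copy the exact wording from existing LabGraph source files rather than from this sketch.

```python
# Copyright (c) Facebook, Inc. and its affiliates.
#
# This source code is licensed under the MIT license found in the
# LICENSE file in the root directory of this source tree.
```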