kuff / medslr

✌️ Exploring Sign Language Recognition in Virtual Reality using the Unity Barracuda framework

Home Page: https://youtu.be/jGk2umf45GM



This project is intended as a template for working with OVR hand tracking in Unity. The TrackingScene contains a minimal scene with everything needed to get started. In addition, the project comes with:

Where to put your code

To speed up compile times, the project separates your code from third-party code so that unchanged files are not recompiled. This means that your code should live in the Assets/Project/Runtime directory. It also means that you must update the Project.Runtime.asmdef Assembly Definition file whenever you import a third-party library. Some IDEs, such as Rider 2022, can do this for you automatically.
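For reference, an Assembly Definition file is a small JSON document. A minimal sketch of what Project.Runtime.asmdef might look like after adding a reference is shown below; the "ThirdParty.Library" entry is a hypothetical placeholder for whatever assembly the imported library defines, not a real package name:

```json
{
    "name": "Project.Runtime",
    "references": [
        "ThirdParty.Library"
    ]
}
```

Each entry in "references" must match the name declared in the third-party library's own .asmdef file; without it, code under Assets/Project/Runtime cannot see that library's types.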

How to launch the project

Due to a bug in the OVR hand tracking implementation of OpenXR, you must use different backends depending on how you launch the project. By default, the project uses the Legacy LibOVR+VRAPI backend, enabling you to run hand tracking in Unity PlayMode over Oculus Link. However, when you wish to build the project into an Android apk-file, you must change the backend to OpenXR. This is done through Oculus > Tools > OVR Ultilities PLugin > Set OVRPlugin to OpenXR. If you wish keep working in the editor afterwards you simply select Set OVRPlugin to Legacy LibOVR+VRAPI in the same menu. Remember to switch your platform to Android before building. You do not have to switch back afterwards.
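Once the project is running over Oculus Link, a quick way to verify that hand tracking works is a minimal component that queries the tracked hand each frame. The sketch below assumes the OVRHand component from the Oculus Integration is attached to the same GameObject; the class name is hypothetical:

```csharp
using UnityEngine;

// Hypothetical smoke-test component: logs whenever an index-finger
// pinch is detected on the hand this GameObject represents.
public class PinchLogger : MonoBehaviour
{
    private OVRHand _hand;

    private void Awake()
    {
        // OVRHand is provided by the Oculus Integration package.
        _hand = GetComponent<OVRHand>();
    }

    private void Update()
    {
        // Only query gestures while the hand is actively tracked.
        if (_hand != null && _hand.IsTracked &&
            _hand.GetFingerIsPinching(OVRHand.HandFinger.Index))
        {
            Debug.Log("Index pinch detected");
        }
    }
}
```

If the log message never appears in PlayMode, the backend is likely set to OpenXR instead of Legacy LibOVR+VRAPI, which is the symptom of the bug described above.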

About


License: Creative Commons Attribution-ShareAlike 4.0 International


Languages

C#: 67.9%
ShaderLab: 27.3%
HLSL: 4.7%