immersive-web / webxr-hand-input

A feature repo for working on hand input support in WebXR. Feature lead: Manish Goregaokar

Home Page: https://immersive-web.github.io/webxr-hand-input/


Expose hands and controllers at the same time

cabanier opened this issue

The Quest platform is able to provide information on hands and controllers at the same time.
Should we expose this to WebXR? If so, should we create a new feature string?

/tpac expose hands and controllers at the same time

This would mean that the XRInputSource just has both a gamepad and hand object on it, yes? We can add language to that effect.

I don't think this needs a new feature; it's part of hand input.
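
As an illustration of that shape, here is a minimal TypeScript sketch, assuming the existing 'hand-tracking' feature descriptor covers this case (per the comment above) and that WebXR type declarations such as @types/webxr are available; none of this is proposed spec text.

```ts
// Sketch only: request hand tracking with the existing feature descriptor
// (assumption: no new feature string is needed, per the comment above).
async function startSession(): Promise<XRSession> {
  return navigator.xr!.requestSession('immersive-vr', {
    optionalFeatures: ['hand-tracking'],
  });
}

// Under the model discussed above, a single XRInputSource could report a
// gamepad (controller buttons/axes) and a hand (articulated joints) at once.
function describeSource(source: XRInputSource): string {
  const hasController = source.gamepad != null;
  const hasHand = source.hand != null;
  return `${source.handedness}: controller=${hasController}, hand=${hasHand}`;
}
```

Calling describeSource for each entry in session.inputSources (or inside an inputsourceschange handler) would then show whether a given source reports one or both.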

The positions of the hand and the controller would be different; it's not just that you get the controller inputs.
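
A short sketch of that distinction, assuming a single input source exposes both objects: the controller pose comes from the source's grip space while the hand pose comes from its per-joint spaces, so the two transforms are independent measurements (the 'wrist' joint here is just a representative choice, and refSpace is whatever XRReferenceSpace the app set up earlier).

```ts
// Compare the controller pose with the hand's wrist pose for one input source.
function comparePoses(frame: XRFrame, source: XRInputSource, refSpace: XRReferenceSpace): void {
  if (!source.hand || !source.gripSpace) return;

  // Pose of the physical controller, from its grip space.
  const gripPose = frame.getPose(source.gripSpace, refSpace);

  // Pose of the tracked hand, from one of its joint spaces.
  const wristSpace = source.hand.get('wrist');
  const wristPose = wristSpace ? frame.getJointPose(wristSpace, refSpace) : undefined;

  if (gripPose && wristPose) {
    // These will generally differ, e.g. when the hand is not gripping the controller.
    console.log(source.handedness, gripPose.transform.position, wristPose.transform.position);
  }
}
```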

Ah, hmm. Interesting.

Yes, this is why it only works with Quest Pro controllers. They are self-tracked, so they don't need IR emitters and HMD camera tracking.

Apparently the ML2 browser already exposes hands and the controller at the same time.
Since XRSession.inputSources is a list of active input sources, it can contain more than two sources.
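
Under that other shape, where the user agent lists the hand and the controller as separate input sources (as the ML2 comment describes), a page would simply walk the full list; a rough sketch:

```ts
// inputSources is a live list of all active sources; with hands and
// controllers tracked simultaneously it may contain more than two entries.
function listInputSources(session: XRSession): void {
  for (const source of session.inputSources) {
    const kind = source.hand ? 'hand'
      : source.gamepad ? 'controller'
      : 'other (e.g. gaze or screen input)';
    console.log(`${source.handedness}: ${kind}`);
  }
}
```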