vrm-c / vrm-specification

vrm specification

Make animation includable in the specification

lalalune opened this issue · comments

Hello,

I am involved with several different projects using VRM for gaming, metaverse, digital collectibles, social apps, and even some card game work. I am also involved in interoperability groups and creative communities that leverage VRM, and I am a VTuber, hosting live coding sessions almost daily.

One piece of feedback I get over and over is that it is extremely inconvenient that VRM cannot include animations. There is no reason this would not be possible in glTF; in fact, we are already hacking animation into VRM as a stopgap by modifying the Blender plugin for export.

I understand why the specification does not include animations, but in practice that is not how people use the spec, and it severely hamstrings the format for UGC interchange scenarios. The users have spoken, and they want animation in the VRM format.

commented

The spec only says that animations are "not used", so you can include animation data in a VRM file.
However, there is no standard way to actually use that animation, because the spec does not define how the data should be applied.

https://github.com/vrm-c/vrm-specification/blob/master/specification/VRMC_vrm-1.0/README.md#unused-items

How would you want to use the animation data if it could be included in a VRM?

in fact we are already hacking animation into VRM as a stopgap by modifying the Blender plugin for export.

If you are using the VRM Add-on for Blender, the latest version, just released, now includes animations in the exported VRM file. Please try it out.

How would you want to use the animation data if it could be included in a VRM?

For single-character experiences, ideally we would pack the animations into the model. Animations can be retargeted in some but not all cases, and on the web especially we don't have fancy retargeting tools like Mecanim in Unity.
For video game experiences, it would be really handy to have the animations and rigs in the same format as our characters and wearables. We would probably store all of our animations on a skeleton with no skinned mesh and then use it as a template for retargeting or driving characters.
There are also cases where we could push VRM for previews; a good example is OpenSea. Without animation support, however, the VRM ends up frozen in T-pose, so being able to pack a preview idle or default animation directly onto the model and load it would be ideal. OpenSea already supports glTF.
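Since a .vrm file is a binary glTF (GLB) container, a viewer can detect whether a model already ships animation data just by reading the GLB JSON chunk, without any VRM-specific tooling. Below is a minimal stdlib-only Python sketch; the function names are ours, not part of any VRM library, and the `make_glb` helper only builds a toy container for demonstration:

```python
import json
import struct

GLB_MAGIC = 0x46546C67   # ASCII "glTF" as a little-endian uint32
CHUNK_JSON = 0x4E4F534A  # ASCII "JSON" as a little-endian uint32

def glb_has_animations(data: bytes) -> bool:
    """Return True if a GLB buffer (e.g. a .vrm file) declares animations.

    Parses the 12-byte GLB header, then scans chunks for the JSON chunk
    and checks whether its "animations" array is non-empty.
    """
    magic, _version, total_length = struct.unpack_from("<III", data, 0)
    if magic != GLB_MAGIC:
        raise ValueError("not a GLB container")
    offset = 12
    while offset < total_length:
        chunk_length, chunk_type = struct.unpack_from("<II", data, offset)
        offset += 8
        if chunk_type == CHUNK_JSON:
            doc = json.loads(data[offset:offset + chunk_length])
            return bool(doc.get("animations"))
        offset += chunk_length
    return False

def make_glb(doc: dict) -> bytes:
    """Build a minimal GLB with a single JSON chunk (for demonstration)."""
    payload = json.dumps(doc).encode()
    payload += b" " * (-len(payload) % 4)  # pad to 4 bytes per the glTF spec
    header = struct.pack("<III", GLB_MAGIC, 2, 12 + 8 + len(payload))
    chunk = struct.pack("<II", len(payload), CHUNK_JSON) + payload
    return header + chunk
```

A preview site could run a check like this on upload and fall back to a canned idle animation when the model carries none.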

By the way, we're working on an open source web-based character creator now, which is almost entirely VRM except for the animations: https://github.com/webaverse-studios/charactercreator

If you are using the VRM Add-on for Blender, the latest version, just released, now includes animations in the exported VRM file. Please try it out.

Didn't know this! Bravo, I'm using the latest version already and will have to try it out :)

Didn't know this! Bravo, I'm using the latest version already and will have to try it out :)

Thanks!
Exporting animations only works with VRM 1.0. Please check your settings.

commented

For single-character experiences, ideally we would pack the animations into the model. Animations can be retargeted in some but not all cases, and on the web especially we don't have fancy retargeting tools like Mecanim in Unity.
For video game experiences, it would be really handy to have the animations and rigs in the same format as our characters and wearables. We would probably store all of our animations on a skeleton with no skinned mesh and then use it as a template for retargeting or driving characters.
There are also cases where we could push VRM for previews; a good example is OpenSea. Without animation support, however, the VRM ends up frozen in T-pose, so being able to pack a preview idle or default animation directly onto the model and load it would be ideal. OpenSea already supports glTF.

I understand the motivation.

I would like to close this issue with the answer that this is possible at the spec level.
If you need further support for a particular implementation, please ask in that implementation's repository.