c-frame / aframe-extras

Add-ons and helpers for A-Frame VR.

Home Page: https://c-frame.github.io/aframe-extras/examples/

[animation-mixer] multiple external animations

Utopiah opened this issue

Please note that I'm not a professional animator, so this might be a dumb suggestion.

In https://readyplayerme.github.io/visage/?path=/story/components-avatar--animated and https://henryegloff.com/how-to-load-gltf-animations-as-separate-files-in-three-js/, the rigged glTF model is loaded together with animations from separate files, e.g. https://readyplayerme.github.io/visage/male-idle.glb in addition to https://readyplayerme.github.io//visage/male.glb, or https://henryegloff.com/demos/threejs/_models/look-around.glb in addition to https://henryegloff.com/demos/threejs/_models/robot-idle.glb.

I'm wondering if the animation-mixer should also support this.

See also #420, which is about playing the animations but not about loading them from different URLs or blobs.

Loading animations shouldn't be the responsibility of the animation-mixer component but of another component. I'm doing it in my player-info component, and when the animations are set I set the animation-mixer component with the IDLE animation.
You can look at an example of loading Mixamo FBX animations here:
https://github.com/akbartus/AFrame-Runtime-GLTF-Animations
For animations in glb, it's the same thing; just replace the FBXLoader with a GLTFLoader.
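
As a rough sketch of that glTF variant (assuming a bundler setup where GLTFLoader can be imported from the three.js addons, and a hypothetical idle.glb file containing the clip), the loading part could look like this:

import { GLTFLoader } from "three/addons/loaders/GLTFLoader.js";

const gltfLoader = new GLTFLoader();
// Inside an async function:
const gltf = await gltfLoader.loadAsync("/avatars/animations/idle.glb");
// gltf.animations is an array of THREE.AnimationClip, equivalent to
// asset.animations returned by FBXLoader in the example further down.
const clip = gltf.animations[0];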

If you're using an FBX Mixamo animation you just exported from the Mixamo site (I'm using FBX Binary, Without Skin, 30 Frames per Second, Keyframe Reduction "uniform" when exporting) on an RPM avatar, you may need to convert the animation to remove the mixamorig prefix from the bone names and multiply the position track values by a coefficient like 0.01 or 0.005, depending on the animation.
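
Just as an illustration of that conversion (the bone-name prefix and the 0.01 coefficient are the assumptions mentioned above), a minimal in-place sketch could look like this; the simpleRetargetClip function further down does the same kind of cleanup, but against the avatar's actual skeleton:

// Minimal sketch of the conversion described above, applied in place.
// Assumes clip is a THREE.AnimationClip exported from Mixamo,
// with track names like "mixamorigHips.position" or "mixamorigSpine.quaternion".
function convertMixamoClip(clip, positionMultiplier = 0.01) {
  clip.tracks.forEach((track) => {
    // "mixamorigHips.position" -> "Hips.position"
    track.name = track.name.replace(/^mixamorig/, "");
    if (track.name.endsWith(".position")) {
      // Scale the position values down (Mixamo exports use much larger units).
      for (let i = 0; i < track.values.length; i++) {
        track.values[i] *= positionMultiplier;
      }
    }
  });
  return clip;
}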

I took inspiration from
https://github.com/ApteroSAS/avatar-vrm-xr/blob/main/src/utils/loadMixamoAnimation.ts
and https://github.com/mrdoob/three.js/blob/dev/examples/jsm/utils/SkeletonUtils.js

I ended up with my own function (feel free to use it, it's MIT); here is a simplified version.
This code removes the position tracks except the Hips one (keeping it is really needed for AvatarSDK MetaPerson 1.0 avatars, not required for RPM) and removes all scale tracks as well. I have a more complex function, not included here, that also adds offsets to the arm quaternion tracks for AvatarSDK MetaPerson 1.0 avatars.

function getBones(skeleton) {
  return Array.isArray(skeleton) ? skeleton : skeleton.bones;
}

function getBoneByName(name, skeleton) {
  for (let i = 0, bones = getBones(skeleton); i < bones.length; i++) {
    if (name === bones[i].name) return bones[i];
  }
}

function simpleRetargetClip(target, clip, options = {}) {
  const names = options.names ?? {};
  const positionMultiplier = options.positionMultiplier ?? 1.0;
  const tracks = [];

  clip.tracks.forEach((track) => {
    const trackSplitted = track.name.split(".");
    const mixamoRigName = trackSplitted[0];
    const boneName = names[mixamoRigName] || mixamoRigName;
    const propertyName = trackSplitted[1];
    const boneTo = getBoneByName(boneName, target.skeleton);
    if (!boneTo) return;

    if (track instanceof THREE.VectorKeyframeTrack && track.name.endsWith("position") && options.hip === boneName) {
      const threetrack = new THREE.VectorKeyframeTrack(
        `${boneName}.${propertyName}`,
        track.times,
        track.values.map((v) => v * positionMultiplier)
      );
      tracks.push(threetrack);
    } else if (track instanceof THREE.QuaternionKeyframeTrack) {
      const times = track.times;
      const values = track.values;
      const threetrack = new THREE.QuaternionKeyframeTrack(`${boneName}.${propertyName}`, times, values);
      tracks.push(threetrack);
    }
  });

  return new THREE.AnimationClip(clip.name, clip.duration, tracks);
}

Here is how you can use it in an A-Frame component (this assumes FBXLoader is imported, e.g. from the three.js addons):

          events: {
            "model-loaded": function (evt) {
              (async () => {
                const model = evt.detail.model;
                const mesh = model.getObjectByName("Wolf3D_Avatar");
                const animations = [
                  ["Idle", "/avatars/animations/Breathing Idle.fbx"],
                  ["AnotherAnimation", "/avatars/animations/Another Animation.fbx", {positionMultiplier: 0.005}],
                ];
                const names = {};
                const bones = mesh.skeleton.bones;
                for (let i = 0; i < bones.length; ++i) {
                  const bone = bones[i];
                  names["mixamorig" + bone.name] = bone.name;
                }
                const fbxLoader = new FBXLoader();
                for (let [animationName, url, options] of animations) {
                  options = options ?? {};
                  const asset = await fbxLoader.loadAsync(url);
                  const clip = THREE.AnimationClip.findByName(asset.animations, "mixamo.com");
                  const newClip = simpleRetargetClip(mesh, clip, {
                    hip: "Hips",
                    names: names,
                    positionMultiplier: options.positionMultiplier ?? 0.01,
                  });
                  newClip.name = animationName;
                  model.animations.push(newClip);
                }
                this.el.setAttribute("animation-mixer", "clip:Idle;loop:repeat;crossFadeDuration:0.2");
              })();
            }
          }
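
Once the retargeted clips have been pushed onto model.animations, switching animation at runtime is just another setAttribute on the animation-mixer component, using the clip names registered above, for example:

// Later, e.g. in reaction to some event: cross-fade from Idle to the other clip.
this.el.setAttribute("animation-mixer", "clip: AnotherAnimation; loop: repeat; crossFadeDuration: 0.2");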

Complete working example of using realistic animated avatars in networked-aframe:
https://github.com/networked-aframe/naf-valid-avatars