caizhongang / SMPLer-X

Official Code for "SMPLer-X: Scaling Up Expressive Human Pose and Shape Estimation"

Home Page: https://caizhongang.github.io/projects/SMPLer-X/


bone point data of the human body

ghx2757 opened this issue · comments

This is really exciting work. Can I directly apply the human-body joint data from the model's inference results to my own, self-built human body model? I visualized the joint data from the inference results, but it did not produce anything resembling a skeleton. @caizhongang

Hi @ghx2757 , we use SMPL-X, a parametric mesh model, as the human representation. Currently we do not output joints, but you may intercept the keypoints here.

Please let us know if this resolves your problem. :)

Thank you very much for your reply! Following your hint, I mimicked the mesh visualization scheme and successfully visualized 25 keypoints of the human body. However, I am still unsure whether the order of these points matches orig_joints_name, since root_pose and body_pose here are 22 rotation vectors.
My goal is to use these raw rotation vectors to drive my own digital human. Thank you again.

Hello, I have also been studying the joints output by this work recently. Could you tell me whether the 22 output rotation vectors are consistent with the original SMPL-X?


Hi @ghx2757 , hope my reply is not too late.
joint_cam contains 25 joints, ordered like pos_joints_name.

root_pose and body_pose are different from joint_cam (joint_cam has no spine bones, while the poses do). Concatenating the poses gives you something ordered like orig_joints_name, with root_pose being the Pelvis.
So you cannot simply treat joint data and pose data as the same thing; they are two different kinds of representation.

To use the pose data to drive a digital human, I have a rough and naive approach:

  1. Modify the SMPLer-X code to output rotation matrices and export them as a pickle file. The detailed code change can be found here: [compare](https://github.com/zacida/SMPLer-X/compare/3e72f5d31fb875c525b97871072bbd3914fe2dce...52390a2c725ccbbe0b823fdf4dbf80e5af7803d7)
  2. Use the [Blender add-on from HybrIK](https://github.com/Jeff-sjtu/HybrIK/releases) to import the pickle file into Blender.
  3. This step should be something like rigging or remapping the armature. Sadly I don't know much about this, but at least you can see the animation in Blender, and I guess it would be easy to export a BVH file from there. :)
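A minimal, self-contained sketch of step 1, assuming the usual SMPL-X layout of one root (pelvis) rotation plus 21 body rotations in axis-angle form. The function names and the `pred_thetas` dictionary key are hypothetical illustrations, not the actual SMPLer-X or HybrIK interface:

```python
import pickle
import numpy as np

def rodrigues(aa):
    """Axis-angle vector (3,) -> 3x3 rotation matrix via Rodrigues' formula."""
    theta = np.linalg.norm(aa)
    if theta < 1e-8:
        return np.eye(3)  # near-zero rotation: identity
    k = aa / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])  # skew-symmetric cross-product matrix
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def poses_to_rotmats(root_pose, body_pose):
    """Stack root (1x3) + body (21x3) axis-angle vectors into (22, 3, 3) matrices."""
    aa = np.concatenate([np.reshape(root_pose, (1, 3)),
                         np.reshape(body_pose, (-1, 3))], axis=0)
    return np.stack([rodrigues(v) for v in aa])

# Zero poses give identity rotations for all 22 joints.
rotmats = poses_to_rotmats(np.zeros(3), np.zeros(21 * 3))

# One entry per frame; serialize for an importer (key name is an assumption).
blob = pickle.dumps({"pred_thetas": [rotmats.astype(np.float32)]})
```

In practice you would accumulate one `rotmats` array per video frame before pickling, and the real add-on expects whatever keys the compare link above introduces.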

export_to_blender

Thank you very much for your reply. You explained it very clearly, and I will try the approach you provided. Your guidance is extremely important to me. I wish you a happy life! @zacida


Hi, thanks for your solution, but I ran into a strange problem. When I try to import the .pk file, it says:

Python: Traceback (most recent call last):
File "C:\Users\Administrator\AppData\Roaming\Blender Foundation\Blender\3.6\scripts\addons\hybrik_blender_addon\__init__.py", line 69, in execute
load_bvh(res_db, root_path, gender)
File "C:\Users\Administrator\AppData\Roaming\Blender Foundation\Blender\3.6\scripts\addons\hybrik_blender_addon\convert2bvh.py", line 198, in load_bvh
ob, obname, arm_ob = init_scene(scene, root_path, gender)
File "C:\Users\Administrator\AppData\Roaming\Blender Foundation\Blender\3.6\scripts\addons\hybrik_blender_addon\convert2bvh.py", line 68, in init_scene
cam_ob = bpy.data.objects['Camera']
KeyError: 'bpy_prop_collection[key]: key "Camera" not found'

What causes that?



Do not delete everything at the start! Keep at least one camera in the scene.


Hi, sorry to bother you, but after that I ran into another weird error:

Python: Traceback (most recent call last):
File "C:\Users\Administrator\AppData\Roaming\Blender Foundation\Blender\3.6\scripts\addons\hybrik_blender_addon\__init__.py", line 69, in execute
load_bvh(res_db, root_path, gender)
File "C:\Users\Administrator\AppData\Roaming\Blender Foundation\Blender\3.6\scripts\addons\hybrik_blender_addon\convert2bvh.py", line 230, in load_bvh
apply_trans_pose_shape(
File "C:\Users\Administrator\AppData\Roaming\Blender Foundation\Blender\3.6\scripts\addons\hybrik_blender_addon\convert2bvh.py", line 160, in apply_trans_pose_shape
mrots, bsh = rodrigues2bshapes(pose)
File "C:\Users\Administrator\AppData\Roaming\Blender Foundation\Blender\3.6\scripts\addons\hybrik_blender_addon\convert2bvh.py", line 144, in rodrigues2bshapes
rod_rots = np.asarray(pose).reshape(24, 3)
ValueError: cannot reshape array of size 216 into shape (24,3)

As the log says, the pose array has size 216, yet it is being reshaped to (24, 3). The original code is:
image
Obviously, when the pose size is 216, it should take the other branch of that code and should not raise this error.
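The arithmetic behind the mismatch can be checked directly: 72 = 24 × 3 is 24 axis-angle vectors, while 216 = 24 × 9 is 24 flattened 3×3 rotation matrices, so a size-216 pose should be reshaped to (24, 3, 3) rather than (24, 3). A hypothetical sketch of the intended branching (not the add-on's actual code):

```python
import numpy as np

def reshape_pose(pose):
    """Dispatch on the flattened pose size:
    72  -> 24 axis-angle vectors, shape (24, 3);
    216 -> 24 rotation matrices, shape (24, 3, 3)."""
    pose = np.asarray(pose).ravel()
    if pose.size == 72:
        return pose.reshape(24, 3)       # axis-angle, needs Rodrigues later
    elif pose.size == 216:
        return pose.reshape(24, 3, 3)    # already rotation matrices
    raise ValueError(f"unexpected pose size {pose.size}")

aa = reshape_pose(np.zeros(72))      # (24, 3)
mats = reshape_pose(np.zeros(216))   # (24, 3, 3)
```

If the add-on hits `reshape(24, 3)` with a size-216 array, the rotation-matrix branch was skipped, which matches the ValueError in the log above.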

@AWangji , sadly I cannot reproduce your issue... maybe you could ask HybrIK's maintainer.

However, if you hit another error like "numpy64 cannot iterate", you can try removing the whole "# apply shape blendshapes" section; that is a separate issue.


Hi, thanks for your reply. I have solved all of the above problems.
However, the resulting animation jitters badly. What causes that, and can I smooth it?

Hi, @AWangji ,

I don't know; I'm new to this area. As I said before, it's just a rough and naive approach. If you want the best results, I think you have to write the code yourself or wait for the SMPLer-X authors to finish such a feature.


Excuse me, do your converted results also jitter? I would like to know whether it has something to do with the range of motion in my video.


Traceback (most recent call last):
File "inference.py", line 333, in <module>
main()
File "inference.py", line 160, in main
out = demoer.model(inputs, targets, meta_info, 'test')
File "/home/bruce/anaconda3/envs/smplerx/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1130, in _call_impl
return forward_call(*input, **kwargs)
File "/home/bruce/anaconda3/envs/smplerx/lib/python3.8/site-packages/torch/nn/parallel/data_parallel.py", line 169, in forward
return self.gather(outputs, self.output_device)
File "/home/bruce/anaconda3/envs/smplerx/lib/python3.8/site-packages/torch/nn/parallel/data_parallel.py", line 181, in gather
return gather(outputs, output_device, dim=self.dim)
File "/home/bruce/anaconda3/envs/smplerx/lib/python3.8/site-packages/torch/nn/parallel/scatter_gather.py", line 78, in gather
res = gather_map(outputs)
File "/home/bruce/anaconda3/envs/smplerx/lib/python3.8/site-packages/torch/nn/parallel/scatter_gather.py", line 69, in gather_map
return type(out)((k, gather_map([d[k] for d in outputs]))
File "/home/bruce/anaconda3/envs/smplerx/lib/python3.8/site-packages/torch/nn/parallel/scatter_gather.py", line 69, in <genexpr>
return type(out)((k, gather_map([d[k] for d in outputs]))
File "/home/bruce/anaconda3/envs/smplerx/lib/python3.8/site-packages/torch/nn/parallel/scatter_gather.py", line 73, in gather_map
return type(out)(map(gather_map, zip(*outputs)))
TypeError: expected a sequence of integers or a single integer, got '<map object at 0x7a29e0579c10>'

Excuse me, I got the error above after running your compare code. Do you know what caused it? Thank you very much!


load checkpoint from local path: ../pretrained_models/mmdet/faster_rcnn_r50_fpn_1x_coco_20200130-047c8118.pth
0%| | 0/275 [00:07<?, ?it/s]
Traceback (most recent call last):
File "inference.py", line 213, in <module>
main()
File "inference.py", line 136, in main
out = demoer.model(inputs, targets, meta_info, 'test')
File "/home/bruce/anaconda3/envs/smplerx/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1130, in _call_impl
return forward_call(*input, **kwargs)
File "/home/bruce/anaconda3/envs/smplerx/lib/python3.8/site-packages/torch/nn/parallel/data_parallel.py", line 169, in forward
return self.gather(outputs, self.output_device)
File "/home/bruce/anaconda3/envs/smplerx/lib/python3.8/site-packages/torch/nn/parallel/data_parallel.py", line 181, in gather
return gather(outputs, output_device, dim=self.dim)
File "/home/bruce/anaconda3/envs/smplerx/lib/python3.8/site-packages/torch/nn/parallel/scatter_gather.py", line 78, in gather
res = gather_map(outputs)
File "/home/bruce/anaconda3/envs/smplerx/lib/python3.8/site-packages/torch/nn/parallel/scatter_gather.py", line 69, in gather_map
return type(out)((k, gather_map([d[k] for d in outputs]))
File "/home/bruce/anaconda3/envs/smplerx/lib/python3.8/site-packages/torch/nn/parallel/scatter_gather.py", line 69, in <genexpr>
return type(out)((k, gather_map([d[k] for d in outputs]))
File "/home/bruce/anaconda3/envs/smplerx/lib/python3.8/site-packages/torch/nn/parallel/scatter_gather.py", line 73, in gather_map
return type(out)(map(gather_map, zip(*outputs)))
TypeError: expected a sequence of integers or a single integer, got '<map object at 0x73350d3719a0>'

Hello! First of all, thank you very much for your guidance. I encountered the problem above when running your code. Do you know how to solve it? Thank you very much!


The fix: change `out['my_body_pose_mat'] = my_body_pose_mat` to `out['my_body_pose_mat'] = torch.tensor(my_body_pose_mat).to("cuda:0")`.
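For context on why this one-line change works: nn.DataParallel.gather recurses through the output dict and can only merge torch tensors (and lists/dicts of tensors). When it reaches a plain numpy array it falls through to the `type(out)(map(gather_map, zip(*outputs)))` path, i.e. `np.ndarray(<map object>)`, which raises exactly the TypeError shown in the traceback above. A hedged sketch of the same idea (`make_gatherable` is a hypothetical helper, not part of SMPLer-X; the original fix moves the result to "cuda:0", here we default to CPU so the snippet runs anywhere):

```python
import numpy as np
import torch

def make_gatherable(out, device="cpu"):
    """Convert any numpy values in a model output dict to torch tensors,
    so that nn.DataParallel's gather step can merge them across GPUs."""
    return {k: torch.as_tensor(v).to(device) if isinstance(v, np.ndarray) else v
            for k, v in out.items()}

# Example: a rotation-matrix output that would otherwise break gather.
out = {"my_body_pose_mat": np.zeros((22, 3, 3), dtype=np.float32)}
out = make_gatherable(out)
```

Applying this (or the one-line change above) just before the forward pass returns its output dict should make the multi-GPU gather succeed.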