robotology / icub-models-generator

Resources and programs to generate models (URDF, SDF) of the iCub robot

Generate models with hands

traversaro opened this issue · comments

@Yeshasvitvs check the README in this repo and the discussion in robotology/community#137 .

I want to separate the problem of adding hands to the model (for which we can keep this issue) from the problem of adding eyes to the model, which is tracked in #37 .

Brief recap: the iCub URDF models are currently generated using two possible workflows, as described in the README of this repository: https://github.com/robotology-playground/icub-model-generator#icub-model-generator . Neither workflow supports exporting a model of the hands, for the following reasons:

  • The dh workflow extracts the parameters from the iDyn model, which does not include models for the fingers.
  • The simplified CAD model prepared by the mechanical team, which is used in the simmechanics workflow, includes the hand and all the fingers as a single fixed body.

However, some information about the hands is currently available in this form:

@claudiofantacci is also working on getting a reliable model of the hands.

My two cents: for the time being, I think the most reasonable solution is to isolate the hands from the VisLab model [2]; the resulting hand model could then easily be added to all the generated models, even automatically.
The tricky part is to identify the l_hand_dh_frame and r_hand_dh_frame frames, as defined in [1], in the VisLab model [2], to make sure that the transformations of the fingers with respect to the hand and the rest of the arm are consistent.

[1] : http://wiki.icub.org/wiki/ICub_Model_naming_conventions
[2] : https://github.com/vislab-tecnico-lisboa/icub-moveit/tree/master/icub_description
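
For illustration, the attachment described above could be expressed in URDF as a fixed joint rooted at the DH frame. A minimal sketch; apart from l_hand_dh_frame, the link and joint names here are hypothetical:

```xml
<!-- Hypothetical fragment: rigidly attach a separately-generated
     left hand model to the arm at the l_hand_dh_frame. -->
<joint name="l_hand_attach" type="fixed">
  <parent link="l_hand_dh_frame"/>
  <child link="l_hand_palm"/>
  <!-- Identity transform: the hand model is assumed to be expressed
       directly in l_hand_dh_frame coordinates. -->
  <origin xyz="0 0 0" rpy="0 0 0"/>
</joint>
```

If the isolated hand model is not expressed in that frame, the origin element would need to carry the corresponding offset.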

The differences between the two workflows and the related details are not 100% clear to me.
First things first: do we want to use/support URDF, SDF or both?
As of now, my understanding is to use SDF.

Here is what I know about the hand.
The models we have are simplified CAD models. Simplified in the sense that they do not correspond 1:1 to the Creo CAD: their meshes have fewer vertices, resembling cylinders for the phalanges and spheres for the fingertips. Even though these simplifications are applied, they have the correct frame poses coming from the DH parameters.

As of now, I'm quite sure that the DH parameters are:

  • wrong/very imprecise for the whole thumb (maybe excluding just the position of the very first frame)
  • imprecise for the index
  • missing/unimplemented for the ring and little fingers

In the superimpose-hand repo, I use the simplified CAD versions and they work reasonably well. Having said that, though, we are in the process of redoing the DH parameters and possibly having the CAD files updated.
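
For context on what those DH parameters encode: in the classical Denavit-Hartenberg convention, each (a, alpha, d, theta) tuple maps to a homogeneous transform between consecutive link frames, and a finger's kinematics is the chain of those transforms. A minimal, dependency-free sketch, illustrative only (the exact convention used by iDyn may differ):

```python
import math

def dh_transform(a, alpha, d, theta):
    """4x4 homogeneous transform for one joint in the classical
    Denavit-Hartenberg convention (a, alpha, d, theta)."""
    ct, st = math.cos(theta), math.sin(theta)
    ca, sa = math.cos(alpha), math.sin(alpha)
    return [
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ]

def matmul(A, B):
    """Chain two 4x4 transforms (e.g. consecutive phalanges)."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]
```

Getting any one of the (a, alpha, d, theta) tuples wrong propagates down the whole chain, which is why imprecise thumb/index parameters matter so much for the fingertip poses.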

First things first: do we want to use/support URDF, SDF or both?
As of now, my understanding is to use SDF.

Some of our software (including this generator) and most ROS-based software only support URDF, so for now we need to support both.

Having said that, though, we are on the process of making anew the DH parameters and possibly to have the CAD files updated.

Great! By "CAD files updated", do you mean having a shrinkwrap for each link in the hand? Is there any discussion about this on the iCub Facility's internal Redmine?

Sorry @claudiofantacci, I accidentally deleted your message. :(
However we can talk about this when we are back from vacation!

🙈
😂
No worries! We will discuss it when we are back from vacation!

Hi @traversaro and @claudiofantacci

Any news about this issue?

Thanks in advance,
Pedro Vicente

Hi @vicentepedro, not at this very moment, but we have a student who will start working on this in the next months.
Keep in touch!

Hi @traversaro

Any news on generating the iCub eyes and hands automatically from the CAD?

We are interested in using these models in PyBullet, but we could not find the camera reference frames in the model to create the virtual cameras in the simulator.
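
In case it is useful, one way to check which frames (links) a URDF actually declares, e.g. when looking for camera frames, is to parse it with a standard XML parser. A minimal sketch using Python's stdlib, with a toy URDF as a stand-in for the real model file:

```python
import xml.etree.ElementTree as ET

# Toy URDF standing in for a real iCub model file (illustration only).
URDF = """
<robot name="toy">
  <link name="head"/>
  <link name="eye_l_frame"/>
  <joint name="head_to_eye_l" type="fixed">
    <parent link="head"/>
    <child link="eye_l_frame"/>
  </joint>
</robot>
"""

def list_links(urdf_text):
    """Return the names of all links (frames) declared in a URDF string."""
    root = ET.fromstring(urdf_text)
    return [link.attrib["name"] for link in root.iter("link")]

print(list_links(URDF))  # ['head', 'eye_l_frame']
```

Running this over the actual model file would show whether any camera/eye frames are exported at all.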

We are testing the following repos:
https://github.com/diegoferigo/icub-model-pybullet from @diegoferigo
and
https://github.com/robotology-playground/pybullet-robot-envs

If this is not the right place to put the question, feel free to move it ;)

Hi @vicentepedro, I would say the most official model with eyes + hands (even if not completely automatically generated from the CAD) is iCubGazeboV2_5_visuomanip. It includes movable eyes and actuated hands, and you can see it in use in the icub-gazebo-grasping-sandbox. We had problems in the past using this model in PyBullet, but workarounds are relatively easy (see robotology/icub-models#12).

If you find a way to use that model in PyBullet, feel free to report your success, for example in robotology/community's "Show and Tell" category. If instead you run into any issue, feel free to open a new issue in https://github.com/robotology/icub-models, which is the public-facing repo for the iCub models (we should actually move all the issues from this repo to that one).

Thanks @traversaro

@Tiago-N will try that model instead, and we will report how it goes.

We are testing the following repos:
https://github.com/diegoferigo/icub-model-pybullet from @diegoferigo
and
https://github.com/robotology-playground/pybullet-robot-envs

The diegoferigo/icub-model-pybullet repository was an early experiment and is currently no longer maintained. For the applications we were interested in, we switched to Ignition Gazebo instead of PyBullet. I'm going to archive the repository.

As for robotology-playground/pybullet-robot-envs, I'm not sure whether it's still actively maintained. The main developer recently left IIT, and I don't know if there is an internal plan to continue its development.

As for robotology-playground/pybullet-robot-envs, I'm not sure whether it's still actively maintained. The main developer recently left IIT, and I don't know if there is an internal plan to continue its development.

Probably on this @xEnVrE may know something.

@Tiago-N will try that model instead, and we will report how it goes.

Ok, but pay attention to the issue in robotology/icub-models#12: it is quite critical, though it should be easy to work around.

As for robotology-playground/pybullet-robot-envs, I'm not sure whether it's still actively maintained. The main developer recently left IIT, and I don't know if there is an internal plan to continue its development.

Probably on this @xEnVrE may know something.

As far as I know, at the moment there are no planned activities for that repository.

Probably on this @xEnVrE may know something.

As far as I know, at the moment there are no planned activities for this repository.

And just to clarify, with "this repository", you mean https://github.com/robotology-playground/pybullet-robot-envs, right?

Probably on this @xEnVrE may know something.

As far as I know, at the moment there are no planned activities for this repository.

And just to clarify, with "this repository", you mean https://github.com/robotology-playground/pybullet-robot-envs, right?

Sorry for not being clear. Yes, I meant that repository. I think we can also ask @fbottarel for information on it.

I was never involved in https://github.com/robotology-playground/pybullet-robot-envs, and as far as I know no one uses or maintains it right now.

Let me try to revive this issue. Lately we have been working on the teleoperation pipeline using iCub3. It would be useful to have an iCubGazeboV3 robot model with hands.