
Neural Grasp Distance Fields for Robot Manipulation

Website  •  Paper

License: MIT  •  Code style: black  •  Meta AI  •  rpad

Setup

  1. Clone the repository: git clone --recursive git@github.com:facebookresearch/NGDF.git

  2. Create a conda environment and install package dependencies

    cd NGDF
    conda env create -f ngdf_env.yml
    conda activate ngdf
    pip install -e .
    

    Install PyTorch separately, based on your CUDA driver version. The command below was tested on an RTX 3080/3090 with CUDA 11.1 (a quick sanity check follows these setup steps):

    pip install torch==1.8.1+cu111 torchvision==0.9.1+cu111 -f https://download.pytorch.org/whl/torch_stable.html
    
  3. Setup submodules

    • ndf_robot
      cd ndf_robot && pip install -e .
      
      Set up a hook so ndf_robot is sourced whenever the conda env is activated:
      conda activate ngdf
      cd $CONDA_PREFIX
      mkdir -p ./etc/conda/activate.d
      touch ./etc/conda/activate.d/ndf_env.sh
      echo "cd /PATH/TO/ndf_robot && source ndf_env.sh && cd -" >> ./etc/conda/activate.d/ndf_env.sh
      
      Download pre-trained ndf_robot weights:
      cd NGDF/ndf_robot
      bash ndf_robot/scripts/download_demo_weights.sh
      
    • acronym
      cd acronym && pip install -e .
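
Once setup is complete, a quick import check can confirm that the environment is wired up. This is a hypothetical snippet, not part of the repo; the module names are assumptions based on each package's install.

    # Hypothetical sanity check (not shipped with the repo): verifies the CUDA
    # build of PyTorch and that the editable installs above are importable.
    import torch
    import ngdf           # installed via `pip install -e .` in NGDF
    import ndf_robot      # installed via the ndf_robot submodule
    import acronym_tools  # assumed module name from the acronym submodule

    print(torch.__version__, "CUDA available:", torch.cuda.is_available())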
      

Folder structure

NGDF
├── acronym                     # Submodule with utilities for ACRONYM dataset
├── contact_graspnet            # Submodule with ContactGraspnet for baselines
├── data                        # Datasets, models, and evaluation output
├── differentiable-robot-model  # Submodule for differentiable FK
├── ndf_robot                   # Submodule for pre-trained shape embedding
├── ngdf                        # Code for training and evaluating NGDF networks
├── OMG-Planner                 # Submodule with pybullet env, reach and grasp evaluation
├── scripts                     # Scripts for running training and evaluation
└── theseus                     # Submodule for differentiable FK and SE(3) ops

Grasp Level Set Optimization Evaluation

  1. Download datasets acronym_perobj and acronym_multobj from this Google Drive link. Place the datasets in data/.

    The datasets are required to compute the closest grasp metric (see the sketch at the end of this section).

  2. Run evaluation

    • Download pre-trained models and configs into data/models from this link
    • Download object rotations into data from this link
    • Run grasp level set evaluations:
    bash scripts/eval/grasp_level_set/perobj.sh
    bash scripts/eval/grasp_level_set/multobj.sh
    

    Results are stored in eval/ inside each model directory.

    To evaluate the grasps in pybullet, first install the dependencies from the following section, then run the above commands with the -p flag: bash scripts/eval/grasp_level_set/perobj.sh -p
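
    For reference, the closest grasp metric measures how far each evaluated grasp lands from the nearest ground-truth grasp in the downloaded datasets. Below is a minimal NumPy sketch of one common formulation; the function name, pose parameterization, and weighting between translation and rotation error are illustrative assumptions, not the repo's exact implementation (see ngdf/ for that).

    import numpy as np

    def closest_grasp_distance(pred_pose, grasp_poses, rot_weight=1.0):
        """Hypothetical closest-grasp metric: min distance from a predicted
        grasp (4x4 transform) to a set of ground-truth grasps (N, 4, 4)."""
        # Translation error to every ground-truth grasp
        t_err = np.linalg.norm(grasp_poses[:, :3, 3] - pred_pose[:3, 3], axis=1)
        # Geodesic rotation error: angle of R_pred^T @ R_i
        rel = np.einsum('ij,njk->nik', pred_pose[:3, :3].T, grasp_poses[:, :3, :3])
        cos = np.clip((np.trace(rel, axis1=1, axis2=2) - 1.0) / 2.0, -1.0, 1.0)
        r_err = np.arccos(cos)
        return float(np.min(t_err + rot_weight * r_err))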

Reaching and Grasping Evaluation

  1. Set up dependencies

    • OMG-Planner: follow the instructions in OMG-Planner/README.md

    • pytorch3d

      pip install "git+https://github.com/facebookresearch/pytorch3d.git@stable"
      
    • differentiable-robot-model

      cd differentiable-robot-model
      git remote add parent https://github.com/facebookresearch/differentiable-robot-model.git
      git fetch parent
      python setup.py develop
      
    • theseus-ai

      cd theseus
      pip install -e .
      
    • Contact-GraspNet

      cd contact_graspnet
      conda env update -f contact_graspnet_env_tf25.yml
      sh compile_pointnet_tfops.sh
      pip install -e .
      

      Download trained model scene_test_2048_bs3_hor_sigma_001 from here and copy it into the checkpoints/ folder.

  2. Run evaluation script

    bash scripts/eval/reach_and_grasp/perobj.sh
    

    The results are saved in data/pybullet_eval. Get summary results with the Jupyter notebook:

    jupyter notebook --notebook-dir=scripts/eval/reach_and_grasp
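
    Alternatively, a summary can be scripted. The sketch below is hypothetical: it assumes each run writes a JSON file containing a boolean "success" field, which may not match the actual output schema produced by the eval scripts.

    # Hypothetical results aggregation; the per-run file layout and the
    # "success" field are assumptions, not the repo's documented schema.
    import json
    from pathlib import Path

    runs = [json.loads(p.read_text()) for p in Path("data/pybullet_eval").rglob("*.json")]
    successes = sum(bool(r.get("success")) for r in runs)
    print(f"{successes}/{len(runs)} successful runs")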
    

NGDF Training

  1. Single-object model training:
    bash scripts/train/perobj_Bottle.sh
    bash scripts/train/perobj_Bowl.sh
    bash scripts/train/perobj_Mug.sh
    
  2. Multi-object model training:
    bash scripts/train/multobj_Bottle.sh
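
For orientation, NGDF is trained as a regression problem: given a shape embedding and a query gripper pose, the network predicts the distance to the nearest valid grasp, supervised by ground-truth distances computed against the ACRONYM grasp sets. The sketch below illustrates that objective only; layer sizes, the 7-D position-plus-quaternion pose encoding, and all names are assumptions, not the architecture in ngdf/.

    import torch
    import torch.nn as nn

    class GraspDistanceNet(nn.Module):
        """Illustrative stand-in for the NGDF network: maps (pose, shape
        latent) to a scalar distance to the nearest grasp."""
        def __init__(self, pose_dim=7, latent_dim=256, hidden=512):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(pose_dim + latent_dim, hidden), nn.ReLU(),
                nn.Linear(hidden, hidden), nn.ReLU(),
                nn.Linear(hidden, 1),
            )

        def forward(self, pose, latent):
            return self.net(torch.cat([pose, latent], dim=-1))

    model = GraspDistanceNet()
    opt = torch.optim.Adam(model.parameters(), lr=1e-4)

    # Dummy batch: query poses (position + quaternion), shape latents, and
    # ground-truth distances to the closest grasp in the dataset.
    pose, latent = torch.randn(8, 7), torch.randn(8, 256)
    target = torch.rand(8, 1)

    loss = nn.functional.l1_loss(model(pose, latent), target)
    loss.backward()
    opt.step()

Because the predicted distance is differentiable in the query pose, the same network supports the grasp level set optimization evaluated above: gradient descent can drive a pose toward the zero level set, i.e. a grasp.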
    

Bibtex

@article{weng2022ngdf,
  title={Neural Grasp Distance Fields for Robot Manipulation},
  author={Weng, Thomas and Held, David and Meier, Franziska and Mukadam, Mustafa},
  journal={arXiv preprint arXiv:2211.02647},
  year={2022}
}

License

The majority of NGDF is licensed under the MIT license; however, a portion of the project is available under separate license terms: ContactGraspNet is licensed under a non-commercial NVIDIA license.

Contributing

We actively welcome your pull requests! Please see CONTRIBUTING.md and CODE_OF_CONDUCT.md for more info.
