There are 23 repositories under the grasping topic.
Train robotic agents to learn to plan pushing and grasping actions for manipulation with deep reinforcement learning.
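As an illustration of the pixel-wise action-value formulation that this kind of pushing-and-grasping agent commonly uses, the sketch below selects the best action primitive and pixel from per-primitive Q-value maps. The map shapes and variable names are illustrative assumptions, not the repository's API.

```python
# Minimal sketch, assuming a pixel-wise Q-map formulation: the network predicts
# one Q-map per action primitive (push / grasp) over a rotated scene heightmap,
# and the agent executes the argmax action. Random arrays stand in for network output.
import numpy as np

q_maps = {
    "push": np.random.rand(16, 224, 224),   # 16 rotation bins x H x W (hypothetical shape)
    "grasp": np.random.rand(16, 224, 224),
}

best = max(
    ((primitive, np.unravel_index(np.argmax(q), q.shape), q.max())
     for primitive, q in q_maps.items()),
    key=lambda item: item[2],
)
primitive, (rot_idx, y, x), value = best
print(f"execute {primitive} at pixel ({x}, {y}), rotation bin {rot_idx}, Q={value:.3f}")
```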
A summary of key papers and blog posts for learning about diffusion models, plus a detailed list of all published diffusion robotics papers.
Baseline model for "GraspNet-1Billion: A Large-Scale Benchmark for General Object Grasping" (CVPR 2020)
Antipodal Robotic Grasping using GR-ConvNet. IROS 2020.
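For context, GR-ConvNet-style detectors output pixel-wise grasp quality, angle, and width maps. The sketch below shows one common way to turn such maps into a single grasp; the array names are hypothetical placeholders rather than the repository's API.

```python
# Minimal sketch, assuming pixel-wise quality/angle/width maps from a
# GR-ConvNet-style network; random arrays stand in for real network output.
import numpy as np

def decode_best_grasp(q_map, angle_map, width_map):
    """Pick the highest-quality pixel and read off its angle and width."""
    y, x = np.unravel_index(np.argmax(q_map), q_map.shape)  # best-quality pixel
    return {
        "center": (int(x), int(y)),        # grasp centre in image coordinates
        "angle": float(angle_map[y, x]),   # gripper rotation in radians
        "width": float(width_map[y, x]),   # gripper opening (image-scale units)
        "quality": float(q_map[y, x]),
    }

h, w = 224, 224
print(decode_best_grasp(np.random.rand(h, w),
                        np.random.uniform(-np.pi / 2, np.pi / 2, (h, w)),
                        np.random.rand(h, w)))
```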
Deep Reinforcement Learning for Robotic Grasping from Octrees
Python module for GQ-CNN training and deployment with ROS integration.
MIT-Princeton Vision Toolbox for Robotic Pick-and-Place at the Amazon Robotics Challenge 2017 - Robotic Grasping and One-shot Recognition of Novel Objects with Deep Learning.
[ICRA 2022] CaTGrasp: Learning Category-Level Task-Relevant Grasping in Clutter from Simulation
Pytorch implementation of diffusion models on Lie Groups for 6D grasp pose generation https://sites.google.com/view/se3dif/home
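As background on the Lie-group machinery such models rely on, the sketch below perturbs an SE(3) grasp pose with Gaussian noise sampled in the tangent space and mapped back through the exponential map. It is illustrative only, using SciPy rather than the repository's code, and the function name and noise scales are assumptions.

```python
# Illustrative sketch (not the repository's code) of tangent-space noising of an
# SE(3) pose, the basic operation behind diffusion on Lie groups.
import numpy as np
from scipy.spatial.transform import Rotation as R

def perturb_pose(rotation, translation, sigma_rot=0.1, sigma_trans=0.01):
    """Apply Gaussian tangent-space noise to an SE(3) pose (rotation is a scipy Rotation)."""
    noise_rotvec = np.random.normal(scale=sigma_rot, size=3)   # element of so(3)
    noisy_rotation = R.from_rotvec(noise_rotvec) * rotation    # exponential map, then compose
    noisy_translation = translation + np.random.normal(scale=sigma_trans, size=3)
    return noisy_rotation, noisy_translation

noisy_rot, noisy_trans = perturb_pose(R.identity(), np.zeros(3))
print(noisy_rot.as_rotvec(), noisy_trans)
```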
Detecting robot grasping positions with deep neural networks. The model is trained on the Cornell Grasping Dataset. The implementation is mainly based on the paper 'Real-Time Grasp Detection Using Convolutional Neural Networks' by Redmon and Angelova.
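The Cornell-style representation used by this line of work describes each grasp as an oriented rectangle (centre, angle, width, height). The hedged sketch below converts that 5-parameter form into image-frame corner points, independently of the repository's own code.

```python
# Minimal sketch of the 5-parameter grasp rectangle (x, y, theta, width, height)
# used by Cornell-style grasp detectors, converted to its four image-frame corners.
import numpy as np

def grasp_rect_corners(x, y, theta, width, height):
    """Return the 4 corners of a grasp rectangle given centre, angle, and size."""
    dx, dy = width / 2.0, height / 2.0
    # Corner offsets in the rectangle's own frame, rotated into the image frame.
    offsets = np.array([[-dx, -dy], [dx, -dy], [dx, dy], [-dx, dy]])
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    return offsets @ rot.T + np.array([x, y])

print(grasp_rect_corners(x=100, y=120, theta=np.pi / 6, width=60, height=20))
```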
An implementation of our RA-L work 'Real-world Multi-object, Multi-grasp Detection'
A paper list on robotic grasping and some related works.
Toolbox for our GraspNet-1Billion dataset.
ROS 2 Foxy and Humble repositories that provide ready-to-use Gazebo + MoveIt 2 simulation packages for various industrial and collaborative robots.
Official PyTorch implementation of Synergies Between Affordance and Geometry: 6-DoF Grasp Detection via Implicit Representations
An Optimization-based Motion and Grasp Planner
"Good Robot! Now Watch This!": Repurposing Reinforcement Learning for Task-to-Task Transfer; and “Good Robot!”: Efficient Reinforcement Learning for Multi-Step Visual Tasks with Sim to Real Transfer
Deep learning for grasp detection within MoveIt.
Robotic grasp dataset for multi-object, multi-grasp evaluation with RGB-D data. This dataset is annotated using the same protocol as the Cornell Dataset and can be used as a multi-object extension of the Cornell Dataset.
Collection of object models compatible with pybullet simulator https://github.com/bulletphysics/bullet3/tree/master/examples/pybullet
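A minimal sketch, assuming a standard PyBullet workflow, of how object models from such a collection are typically dropped into a simulation; here the duck model shipped with pybullet_data stands in for a model pulled from the repository.

```python
# Minimal PyBullet sketch: load a ground plane and one object URDF, then simulate.
import pybullet as p
import pybullet_data

p.connect(p.DIRECT)                       # headless; use p.GUI for a viewer
p.setAdditionalSearchPath(pybullet_data.getDataPath())
p.setGravity(0, 0, -9.81)
p.loadURDF("plane.urdf")                  # ground plane shipped with pybullet_data
# Stand-in object; replace with a URDF path from the model collection.
obj_id = p.loadURDF("duck_vhacd.urdf", basePosition=[0, 0, 0.5])
for _ in range(240):                      # simulate one second at the default 240 Hz
    p.stepSimulation()
print(p.getBasePositionAndOrientation(obj_id))
p.disconnect()
```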
Grasping in the Gazebo robotics simulator.
[CVPR 2019] Code to generate images from the ObMan dataset: synthetic renderings of hands holding objects (or hands in isolation).
Simple and unified interface to zero-shot computer vision models curated for robotics use cases.
Robot Learning of Shifting Objects for Grasping in Cluttered Environments
TransCG: A Large-Scale Real-World Dataset for Transparent Object Depth Completion and A Grasping Baseline