Baseline model for "CMG-Net: An End-to-End Contact-based Multi-Finger Dexterous Grasping Network" (ICRA 2023).
This repository provides CMG-Net, an end-to-end deep neural network for multi-finger dexterous grasping.
- Ubuntu 18.04 LTS
- Python 3.7
- PyTorch 1.7
- PyBullet
This code has been tested with Python 3.7, PyTorch 1.7, and CUDA 10.1.
Create the conda environment:
conda env create -f environment.yml
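Optionally, you can check that the environment matches the tested configuration. This snippet is not part of the repository; it only prints the versions listed above:

```python
# Optional sanity check: confirm Python, PyTorch, and CUDA versions match
# the tested configuration (Python 3.7, PyTorch 1.7, CUDA 10.1).
import sys
import torch

print("python :", sys.version.split()[0])      # expect 3.7.x
print("pytorch:", torch.__version__)           # expect 1.7.x
print("cuda   :", torch.version.cuda)          # expect 10.1
print("gpu ok :", torch.cuda.is_available())   # expect True
```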
Compile and install the pointnet2 operators (code adapted from VoteNet):
cd pointnet2
python setup.py install
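A quick smoke test for the compiled CUDA operators, as a sketch only: the `pointnet2_utils` import path and the `furthest_point_sample` call follow the VoteNet-style layout and may differ in this repository.

```python
# Smoke test for the compiled pointnet2 CUDA ops (a sketch; the import path
# follows the VoteNet-style layout and may differ in your checkout).
import torch
import pointnet2_utils  # may instead be `from pointnet2 import pointnet2_utils`

xyz = torch.rand(2, 1024, 3).cuda()                    # (batch, points, xyz)
idx = pointnet2_utils.furthest_point_sample(xyz, 128)  # sample 128 seed points
print(idx.shape)                                       # expect torch.Size([2, 128])
```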
Install Pyrender with OSMesa by following the instructions here.
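To confirm that headless rendering works, the following sketch renders a dummy box offscreen; the scene contents are arbitrary and unrelated to CMG-Net.

```python
# Headless-rendering check for Pyrender + OSMesa. The box scene below is a
# placeholder; only the OffscreenRenderer call matters.
import os
os.environ["PYOPENGL_PLATFORM"] = "osmesa"  # must be set before importing pyrender

import numpy as np
import trimesh
import pyrender

scene = pyrender.Scene()
scene.add(pyrender.Mesh.from_trimesh(trimesh.creation.box(extents=(0.1, 0.1, 0.1))))
camera_pose = np.eye(4)
camera_pose[2, 3] = 0.5  # pull the camera back along +z
scene.add(pyrender.PerspectiveCamera(yfov=np.pi / 3.0), pose=camera_pose)

renderer = pyrender.OffscreenRenderer(viewport_width=640, viewport_height=480)
color, depth = renderer.render(scene)
print(color.shape, depth.shape)  # (480, 640, 3) (480, 640)
renderer.delete()
```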
Download the trained model from here and extract it into the checkpoints/ folder.
Download the object models from here and extract them into the dataset/ folder with the following structure:
dataset
|-- urdfs
| |-- barrett_object
| |-- setup
Download a mini-dataset from here and extract it into the following folder:
dataset
|-- view_7
Download the 50 test scenes from here and extract them into the following folder:
dataset
|-- test
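After the downloads above, a quick check (not part of the repository) that everything landed in the expected folders:

```python
# Verify that the downloaded checkpoint and dataset folders are in place.
from pathlib import Path

expected = [
    "checkpoints",
    "dataset/urdfs/barrett_object",
    "dataset/urdfs/setup",
    "dataset/view_7",
    "dataset/test",
]
for folder in expected:
    print(f"{folder}: {'ok' if Path(folder).is_dir() else 'MISSING'}")
```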
CMG-Net outputs multi-finger hand configurations and grasp poses from a single-view observation of a cluttered scene.
To test CMG-Net on the test scenes and evaluate the Success Rate (SR) in simulation, run:
python3 test.py --use_normal --checkpoint_path checkpoints/36_vw7_1155_2048.pth
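The Success Rate is the fraction of attempted grasps that succeed in simulation. The sketch below only illustrates the metric; the (successes, attempts) tuples are a hypothetical result format, not the actual output of test.py.

```python
# Illustration of the Success Rate (SR) metric: successful grasps divided by
# attempted grasps over all test scenes. The per-scene tuples are hypothetical.
from typing import List, Tuple

def success_rate(results: List[Tuple[int, int]]) -> float:
    """results: (num_successes, num_attempts) per scene."""
    successes = sum(s for s, _ in results)
    attempts = sum(a for _, a in results)
    return successes / attempts if attempts else 0.0

print(success_rate([(7, 10), (9, 10), (8, 10)]))  # 0.8
```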
Start training with distributed data parallel (DDP):
python -m torch.distributed.launch --nproc_per_node=1 train.py --use_normal
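For reference, here is a minimal sketch of the DistributedDataParallel boilerplate that `torch.distributed.launch` expects in the launched script; the argument parsing and the model are placeholders, not the repository's actual train.py.

```python
# Minimal DDP boilerplate expected by torch.distributed.launch (PyTorch 1.7).
# The model below is a placeholder, not CMG-Net.
import argparse
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

parser = argparse.ArgumentParser()
parser.add_argument("--local_rank", type=int, default=0)  # set by the launcher
args, _ = parser.parse_known_args()

torch.cuda.set_device(args.local_rank)
dist.init_process_group(backend="nccl")

model = torch.nn.Linear(3, 3).cuda()             # placeholder for CMG-Net
model = DDP(model, device_ids=[args.local_rank])
# ... training loop ...
```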
Notes:
- Please refer to the code to adjust the relevant parameters.
- The camera and point-cloud parameters are in config/config.yaml (a loading sketch follows below).
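A minimal sketch of reading that file with PyYAML; the actual camera and point-cloud keys are not assumed here, so the snippet just prints whatever the file contains.

```python
# Read config/config.yaml and print its contents; the actual keys depend on
# the repository's config file.
import yaml

with open("config/config.yaml", "r") as f:
    cfg = yaml.safe_load(f)

for key, value in cfg.items():
    print(f"{key}: {value}")
```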
Please cite our paper in your publications if it helps your research:
@inproceedings{wei2023cmgnet,
  title     = {CMG-Net: An End-to-End Contact-Based Multi-Finger Dexterous Grasping Network},
  author    = {Mingze Wei and Yaomin Huang and Zhiyuan Xu and Ning Liu and Zhengping Che and Xinyu Zhang and Chaomin Shen and Feifei Feng and Chun Shan and Jian Tang},
  booktitle = {Proceedings of the 2023 IEEE International Conference on Robotics and Automation (ICRA)},
  year      = {2023},
}
This project uses the following third-party code:
- PointNet2: Licensed under the GraspNet-Baseline Software License Agreement.
- pytorch_barrett_hand_forward_kinematics_layer: Licensed under the MIT License.
- Volumetric Grasping Network: Licensed under the BSD-3-Clause.
- Contact-GraspNet: Licensed under the NVIDIA Source Code License for Contact-GraspNet.
This project is licensed under this license.