A repo for creating a visuotactile dataset using PyBullet and the TACTO vision-based tactile simulator.
Set your result directory path in tacto_pose.py and run it.
It will save the image from the front vision camera (displayed at the upper left) and the (left-hand) tactile image.
*Note that the gripper is open at first and that interactive mode is on by default, which means you are expected to control the pose of the grasped object manually.
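
A minimal sketch of how the result directory and per-frame file naming might look. The `RESULT_DIR` variable, the `frame_paths` helper, and the `vision_*`/`tactile_left_*` filename patterns are all hypothetical illustrations, not the actual names in tacto_pose.py; check the script itself for the real variable to set.

```python
# Hypothetical sketch of result-directory setup and per-frame file naming
# for the saved vision and tactile images. Names here are illustrative only;
# tacto_pose.py's actual variables may differ.
import os

RESULT_DIR = "./results"  # set this to your own result directory


def frame_paths(result_dir, frame_idx):
    """Return file paths for one frame's vision and left-hand tactile images."""
    vision = os.path.join(result_dir, f"vision_{frame_idx:06d}.png")
    tactile = os.path.join(result_dir, f"tactile_left_{frame_idx:06d}.png")
    return vision, tactile


os.makedirs(RESULT_DIR, exist_ok=True)
print(frame_paths(RESULT_DIR, 0))
```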