This repository has been archived by the owner on Jan 1, 2024. It is now read-only.

[Request/Support] Using Grasping Stability NN in real world #31

Open
robertokcanale opened this issue Jun 28, 2022 · 0 comments


robertokcanale commented Jun 28, 2022

Hi @wx405557858, my team and I have bought the DIGIT sensors (directly from GelSight). I have read in your paper about the grasping stability NN that you trained.

[screenshot: figure of the grasping stability NN from the paper]

We intend to use the tactile feedback from the sensors to establish the stability of a grasp (we use real-life YCB Dataset objects). We have some ideas of our own, but we would also like to test the NN you trained in the real world. We would probably implement it within ROS, so anything callable from Python should work.
Could you kindly walk me through, at the code level:

- How do we run inference with the NN?
- Would it be possible to have the inference script and the trained weights? (We would probably fine-tune on top of them.)
- How/where do we provide the DIGIT tactile images as input?
- What exactly is the output?
Regards
Roberto
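For context, this is the kind of inference wrapper we expect to put in a ROS node. It is only a sketch under assumptions: the architecture, checkpoint path, input resolution, and output head below are placeholders, not your actual code, so we mainly need to know where your implementation differs.

```python
# Hypothetical sketch of grasp-stability inference on a DIGIT frame.
# The architecture, checkpoint name, and 2-class output are ASSUMPTIONS;
# they must be replaced by the authors' actual model and weights.
import torch
import torch.nn as nn


class GraspStabilityNet(nn.Module):
    """Placeholder CNN classifier: tactile image -> stable / unstable."""

    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, 2)  # assumed: [unstable, stable]

    def forward(self, x):
        h = self.features(x).flatten(1)
        return self.classifier(h)


def predict_stability(model, image):
    """image: float tensor (3, H, W) in [0, 1], e.g. a 240x320 DIGIT frame.

    Returns the softmax probability that the grasp is stable.
    """
    model.eval()
    with torch.no_grad():
        logits = model(image.unsqueeze(0))       # add batch dimension
        probs = torch.softmax(logits, dim=1)
    return probs[0, 1].item()


model = GraspStabilityNet()
# model.load_state_dict(torch.load("grasp_stability.pth"))  # hypothetical path
frame = torch.rand(3, 240, 320)  # stand-in for a captured DIGIT image
p_stable = predict_stability(model, frame)
```

On the ROS side we would simply convert each incoming sensor frame to a tensor and call `predict_stability` in the image callback.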
