Touch-and-Go

This repository contains the official PyTorch implementation of our paper Touch and Go: Learning from Human-Collected Vision and Touch.

Touch and Go: Learning from Human-Collected Vision and Touch
Fengyu Yang, Chenyang Ma, Jiacheng Zhang, Jing Zhu, Wenzhen Yuan, Andrew Owens
University of Michigan and Carnegie Mellon University
In NeurIPS 2022 Datasets and Benchmarks Track

Todo

  • Visuo-tactile Self-supervised Learning (a minimal illustrative sketch follows below)
  • Tactile-driven Image Stylization
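
The sketch below illustrates the general idea behind visuo-tactile self-supervised learning: two encoders, one for images and one for tactile (GelSight) frames, trained so that paired samples embed close together under a symmetric InfoNCE-style contrastive loss. The class name VisuoTactileContrast, the ResNet-18 backbones, and the temperature value are illustrative assumptions, not the repository's actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
import torchvision.models as models

class VisuoTactileContrast(nn.Module):
    """Hypothetical sketch: separate vision and touch encoders trained with a
    symmetric InfoNCE loss on paired image / tactile frames."""
    def __init__(self, dim=128, temperature=0.07):
        super().__init__()
        self.temperature = temperature
        self.vision_enc = models.resnet18(weights=None)
        self.touch_enc = models.resnet18(weights=None)
        feat = self.vision_enc.fc.in_features
        # Replace the classification heads with projection layers.
        self.vision_enc.fc = nn.Linear(feat, dim)
        self.touch_enc.fc = nn.Linear(feat, dim)

    def forward(self, image, touch):
        v = F.normalize(self.vision_enc(image), dim=1)
        t = F.normalize(self.touch_enc(touch), dim=1)
        logits = v @ t.T / self.temperature          # (B, B) similarity matrix
        labels = torch.arange(v.size(0), device=v.device)
        # Symmetric cross-entropy: match image->touch and touch->image.
        return 0.5 * (F.cross_entropy(logits, labels) +
                      F.cross_entropy(logits.T, labels))

# Usage: paired batches of RGB images and tactile (GelSight) frames.
model = VisuoTactileContrast()
loss = model(torch.randn(8, 3, 224, 224), torch.randn(8, 3, 224, 224))
loss.backward()
```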

Citation

If you use this code for your research, please cite our paper.

@inproceedings{yang2022touch,
  title={Touch and Go: Learning from Human-Collected Vision and Touch},
  author={Fengyu Yang and Chenyang Ma and Jiacheng Zhang and Jing Zhu and Wenzhen Yuan and Andrew Owens},
  booktitle={Thirty-sixth Conference on Neural Information Processing Systems Datasets and Benchmarks Track},
  year={2022}
}

Acknowledgments

We thank Xiaofeng Guo and Yufan Zhang for their extensive help with the GelSight sensor, and Daniel Geng, Yuexi Du, and Zhaoying Pan for helpful discussions. This work was supported in part by Cisco Systems and the Wang Chu Chien-Wen Research Scholarship.
