
TLV-Link

🚀🚀🚀 The official implementation of Touch100k: A Large-Scale Touch-Language-Vision Dataset for Touch-Centric Multimodal Representation.

Home Page: https://cocacola-lab.github.io/Touch100k/

โค๏ธ Acknowledgement

  • LanguageBind: An open source, language-based multimodal pre-training framework. Thanks for their wonderful work.
  • OpenCLIP: An amazing open-sourced backbone.
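
For readers unfamiliar with the backbone, here is a minimal sketch of scoring an image-text pair with OpenCLIP. The checkpoint (`ViT-B-32` / `laion2b_s34b_b79k`), the image file name, and the caption are illustrative assumptions, not the configuration used by TLV-Link.

```python
# Minimal OpenCLIP image-text similarity sketch.
# Checkpoint, file name, and caption are hypothetical placeholders.
import torch
import open_clip
from PIL import Image

model, _, preprocess = open_clip.create_model_and_transforms(
    "ViT-B-32", pretrained="laion2b_s34b_b79k"
)
tokenizer = open_clip.get_tokenizer("ViT-B-32")
model.eval()

image = preprocess(Image.open("tactile_frame.png")).unsqueeze(0)  # hypothetical image
text = tokenizer(["a fingertip pressing a rough fabric surface"])  # hypothetical caption

with torch.no_grad():
    image_features = model.encode_image(image)
    text_features = model.encode_text(text)
    # L2-normalize, then take the dot product: CLIP-style cosine similarity.
    image_features = image_features / image_features.norm(dim=-1, keepdim=True)
    text_features = text_features / text_features.norm(dim=-1, keepdim=True)
    similarity = image_features @ text_features.T

print(similarity.item())
```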

🔒 License

Code License: This project is released under the MIT license.

Data License: The dataset is released under CC BY-NC 4.0, which permits non-commercial use only; models trained on the dataset must not be used outside of research purposes.

