invictus717 / Meta-Transformer

Meta-Transformer for Unified Multimodal Learning

Home Page: https://arxiv.org/abs/2307.10802


Hardware configurations for fine-tuning?

tctco opened this issue

Hi! I'm very interested in your research project and would like to fine-tune Meta-Transformer on my own dataset (mostly multimodal medical data). Could you provide a (minimal and recommended) hardware configuration for this purpose?

Thanks!

Thank you for your interest! If you wish to fine-tune our pretrained weights on your own dataset, a single GPU card should be enough.
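
For readers with the same question, here is a minimal single-GPU fine-tuning sketch. It assumes the released base encoder checkpoint can be loaded into a stack of standard `timm` ViT blocks and that you attach a small task head on top; the checkpoint filename, head size, and the choice to freeze the encoder are placeholders you should adapt to your own data and memory budget.

```python
import torch
import torch.nn as nn
from timm.models.vision_transformer import Block

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Hypothetical settings -- adjust to your checkpoint, tokenizer, and task.
CKPT_PATH = "Meta-Transformer_base_patch16_encoder.pth"  # assumed checkpoint filename
EMBED_DIM, DEPTH, NUM_HEADS = 768, 12, 12                # base-scale encoder config
NUM_CLASSES = 2                                          # e.g. a binary medical label

class FineTuner(nn.Module):
    """Shared encoder + mean pooling + a small task head."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(*[
            Block(dim=EMBED_DIM, num_heads=NUM_HEADS, mlp_ratio=4.0,
                  qkv_bias=True, norm_layer=nn.LayerNorm, act_layer=nn.GELU)
            for _ in range(DEPTH)
        ])
        self.head = nn.Linear(EMBED_DIM, NUM_CLASSES)

    def forward(self, tokens):               # tokens: (batch, seq_len, EMBED_DIM)
        feats = self.encoder(tokens)          # (batch, seq_len, EMBED_DIM)
        return self.head(feats.mean(dim=1))   # pool over tokens, then classify

model = FineTuner().to(device)
model.encoder.load_state_dict(torch.load(CKPT_PATH, map_location="cpu"))

# Freeze the encoder and train only the head to keep single-GPU memory low;
# unfreeze the last few blocks if your dataset is large enough.
for p in model.encoder.parameters():
    p.requires_grad = False
optimizer = torch.optim.AdamW(model.head.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()
scaler = torch.cuda.amp.GradScaler()  # mixed precision further reduces memory

def train_step(tokens, labels):
    tokens, labels = tokens.to(device), labels.to(device)
    optimizer.zero_grad()
    with torch.cuda.amp.autocast():
        loss = criterion(model(tokens), labels)
    scaler.scale(loss).backward()
    scaler.step(optimizer)
    scaler.update()
    return loss.item()
```

The `tokens` here are whatever modality-specific tokenizer output you feed the shared encoder; with the encoder frozen, the trainable state fits comfortably on one card, and gradient accumulation can substitute for a larger batch size if needed.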