adapter-hub / adapters

A Unified Library for Parameter-Efficient and Modular Transfer Learning

Home Page: https://docs.adapterhub.ml

Adapter for Dinov2 and ViT transformers

FBehrad opened this issue

Environment info

  • adapter-transformers version: ?
  • Platform: windows
  • Python version: 3.8
  • PyTorch version (GPU?): 2.0.1+cu117 / 2.1.0
  • peft: 0.5.0

Details

Hello,
How can we use the provided library for ViT and DINOv2?
I checked the documentation and found this page. However, I don't know how to use it in my code.
Also, as DINOv2 is a state-of-the-art model with promising performance on many tasks, it would be wonderful to have an adapter for it.

Hey @FBehrad, ViT is supported: you can use the ViTAdapterModel class, which you can load with from_pretrained just as you would with transformers. The model provides all the adapter functionality, such as adding, activating, and training adapters. You can also use the regular transformers model classes, as they provide the adapter functionality as well. To get started, it might be helpful to check out the quickstart in the documentation and the example notebooks.

But we currently don't support dinov2. I will change the label of this issue to enhancement and leave this open as a feature request.