ZhangYuanhan-AI / NOAH

Searching prompt modules for parameter-efficient transfer learning.

Question about Adapter

RaptorMai opened this issue · comments

Thank you so much for sharing the code. In the original Adapter paper, one Adapter layer is placed after the attention module and another is placed after the FFN layer. In your paper, the Adapter layer seems to be placed only after the FFN layer. Could you elaborate on the rationale behind this change? Thank you so much in advance, and I look forward to hearing back from you.

Hi Mai,

We follow the setting described in the VPT paper (https://arxiv.org/pdf/2203.12119.pdf, page 15).

As it says, [63, 64] exhaustively searched all possible configurations and found that inserting adapters only after the FFN "Add & LayerNorm" sub-layer works best. Therefore, we also use this setup in our implementation.
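
To make the placement concrete, below is a minimal PyTorch sketch of a transformer block with a single bottleneck adapter inserted after the FFN sub-layer. This is not the repository's actual code: the class names, the pre-norm layout, and the bottleneck width are illustrative assumptions. The original Adapter paper would additionally insert a second adapter after the attention sub-layer, which is omitted here.

```python
import torch
import torch.nn as nn


class Adapter(nn.Module):
    """Bottleneck adapter: down-project -> nonlinearity -> up-project, with a residual."""
    def __init__(self, dim, bottleneck_dim=64):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck_dim)
        self.act = nn.GELU()
        self.up = nn.Linear(bottleneck_dim, dim)

    def forward(self, x):
        return x + self.up(self.act(self.down(x)))


class BlockWithAdapter(nn.Module):
    """ViT-style block with a single adapter placed after the FFN sub-layer only.

    The original Adapter design would also add an adapter after the attention
    sub-layer; following the VPT ablation, that one is dropped here.
    """
    def __init__(self, dim, num_heads=8, mlp_ratio=4, bottleneck_dim=64):
        super().__init__()
        self.norm1 = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm2 = nn.LayerNorm(dim)
        self.mlp = nn.Sequential(
            nn.Linear(dim, dim * mlp_ratio),
            nn.GELU(),
            nn.Linear(dim * mlp_ratio, dim),
        )
        self.adapter = Adapter(dim, bottleneck_dim)

    def forward(self, x):
        # Attention sub-layer: no adapter in this configuration.
        h = self.norm1(x)
        x = x + self.attn(h, h, h, need_weights=False)[0]
        # FFN sub-layer, followed by the adapter.
        x = x + self.mlp(self.norm2(x))
        x = self.adapter(x)
        return x
```

Using one adapter per block instead of two also halves the number of adapter parameters per block, which is consistent with the parameter-efficient goal of the method.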