JulesBelveze / bert-squeeze

🛠️ Tools for Transformers compression using PyTorch Lightning ⚡

Home Page: https://julesbelveze.github.io/bert-squeeze/


Use Callback for 2 stage training in DeeBert

JulesBelveze opened this issue

DeeBert models need to be fine-tuned in a two-step fashion: first the final layer, then the ramps.
The current implementation requires the user to run two separate trainings. However, this could be achieved in one shot using a pl.Callback, as is already done for TheseusBert.
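A minimal sketch of what such a callback could look like, assuming the DeeBert `LightningModule` exposes the final classification head and the exit ramps as attributes (the names `final_classifier`, `ramps`, and the `switch_epoch` parameter below are hypothetical and would need to match the actual module):

```python
import pytorch_lightning as pl


class TwoStageDeeBertCallback(pl.Callback):
    """Runs DeeBert's two-stage fine-tuning in a single `Trainer.fit` call:
    stage 1 trains only the final layer, stage 2 freezes it and trains the ramps."""

    def __init__(self, switch_epoch: int = 5):
        super().__init__()
        self.switch_epoch = switch_epoch

    @staticmethod
    def _set_requires_grad(module, flag: bool) -> None:
        for param in module.parameters():
            param.requires_grad = flag

    def on_train_epoch_start(self, trainer, pl_module):
        # Before `switch_epoch`: train the final classifier, freeze the ramps.
        # From `switch_epoch` on: freeze the final classifier, train the ramps.
        in_stage_two = trainer.current_epoch >= self.switch_epoch
        # `final_classifier` and `ramps` are assumed attribute names.
        self._set_requires_grad(pl_module.final_classifier, not in_stage_two)
        for ramp in pl_module.ramps:
            self._set_requires_grad(ramp, in_stage_two)
```

The user would then pass it to the trainer once, e.g. `pl.Trainer(callbacks=[TwoStageDeeBertCallback(switch_epoch=5)], max_epochs=10)`, instead of launching two runs.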