AILab-CVC / SEED

Official implementation of SEED-LLaMA (ICLR 2024).

Home Page: https://ailab-cvc.github.io/seed

Codebook Training Epochs

Revliter opened this issue · comments

Hello,
Congratulations on the successful development of the SEED model! I am impressed by its capabilities and want to reproduce it locally. However, I am running into something confusing. The config for the codebook training of the SEED tokenizer says it takes up to 500 epochs over 500M data samples. I am wondering whether this is the correct config for the codebook training, since it would take a very large number of GPU-hours to finish. I would be grateful if you could clarify this or offer some advice. Thanks for your generous help.
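To illustrate why the config looks surprising, here is a back-of-envelope estimate of the GPU-hours implied by 500 epochs over 500M samples. The per-GPU throughput below is a hypothetical assumption, not a number from the SEED repo:

```python
# Rough GPU-hour estimate for codebook training.
# The throughput figure is a made-up assumption for illustration only.
samples = 500_000_000   # "500M data" from the config
epochs = 500            # epochs stated in the config
throughput = 2_000      # assumed samples/sec processed per GPU (hypothetical)

total_samples = samples * epochs
gpu_hours = total_samples / throughput / 3600
print(f"~{gpu_hours:,.0f} GPU-hours")  # tens of millions of GPU-hours
```

Even under generous throughput assumptions, the estimate comes out in the tens of millions of GPU-hours, which is why the 500-epoch setting looks like it may be a placeholder or a per-subset value rather than the actual training schedule.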

Have you got the data?