Divadi / SOLOFusion

Time Will Tell: New Outlooks and A Baseline for Temporal Multi-View 3D Object Detection

Why lr_config is None?

kaixinbear opened this issue · comments

I see that lr_config=None. Does that mean the learning rate does not decay during training?
That seems strange to me. Hoping for your reply!

Yes - we keep learning rate fixed throughout training. This is similar to what is done in some other works that use EMA (BEVDepth, Unbiased Teacher). Even if the non-EMA weights don't settle down, the EMA weights do.
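To illustrate the point above, here is a minimal sketch of an EMA weight update (hypothetical helper names, not the actual SOLOFusion/mmdetection hook): because the EMA copy is a running average of the raw weights, it smooths out their oscillations even when the learning rate never decays.

```python
def ema_update(ema_weights, model_weights, momentum=0.999):
    """Update the EMA (shadow) weights in place:
    w_ema <- momentum * w_ema + (1 - momentum) * w.

    With momentum close to 1, the EMA weights change slowly and
    average over many recent raw-weight states, so they settle down
    even if the raw weights keep bouncing around under a fixed lr.
    """
    for name, w in model_weights.items():
        ema_weights[name] = momentum * ema_weights[name] + (1.0 - momentum) * w
    return ema_weights


# Toy demonstration: raw weight oscillates, EMA stays near the mean.
ema = {"w": 0.0}
for step in range(2000):
    raw = {"w": 1.0 if step % 2 == 0 else -1.0}  # oscillating raw weight
    ema_update(ema, raw)
```

After many steps the EMA weight hovers near 0.0 (the average of the oscillating raw values), which is the intuition behind evaluating the EMA weights rather than the raw ones.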

Have you tried letting the lr decay during training with EMA? Would the performance be worse than with a fixed learning rate?

I haven't tried it myself, since that would require tuning the scheduler. BEVDepth has some relevant experiments in their repository that you can reference.

Thanks for your kind reply.