CoinCheung / BiSeNet

Add bisenetv2. My implementation of BiSeNet


Why is training over after one iter?

XXMxxm220 opened this issue · comments

commented

Why is training over after one iter?

commented

After training for one epoch, it does not continue.

Hello, has the problem been solved?

commented

Hello, has the problem been solved?

It's solved.

Could you share how?

commented

Could you share how?

Hi, the approach I used was to add for epoch in range(epochs): before the for it loop, where epochs is the number of training epochs I set myself, and to also move the save_model statement inside the for epoch loop, so that the model is saved once per epoch.
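A minimal sketch of what that change might look like, with a dummy dataset, model, and loss standing in for the repo's real ones (all names and values below are illustrative, not taken from the repo):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Dummy data and model as placeholders for the real dataloader and network.
dataset = TensorDataset(torch.randn(8, 3, 64, 64), torch.zeros(8, dtype=torch.long))
dl = DataLoader(dataset, batch_size=4)
model = torch.nn.Conv2d(3, 2, 1)
optim = torch.optim.SGD(model.parameters(), lr=0.01)

epochs = 3  # training period chosen by the user

for epoch in range(epochs):          # outer epoch loop added around the original loop
    for it, (img, lb) in enumerate(dl):   # the original inner iteration loop
        optim.zero_grad()
        loss = model(img).mean()     # placeholder loss
        loss.backward()
        optim.step()
    # saving moved inside the epoch loop, so a checkpoint is written every epoch
    torch.save(model.state_dict(), 'model_epoch_{}.pth'.format(epoch))
```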

Why do you need this? You can compute the total number of iterations as len(dataset) * n_epoches / batch_size, and set the total iterations to that result.
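For example, with Cityscapes-like numbers (the values below are illustrative, not taken from the repo's config):

```python
# Sketch of the suggested computation: total iterations = len(dataset) * n_epoches / batch_size
n_imgs = 2975        # e.g. size of the Cityscapes fine training set
n_epoches = 80       # desired number of passes over the data
batch_size = 16

max_iter = n_imgs * n_epoches // batch_size
print(max_iter)      # 14875; use this as the trainer's total iteration count
```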

commented

Why do you need this? You can compute the total number of iterations as len(dataset) * n_epoches / batch_size, and set the total iterations to that result.

You are right. Thank you for your response. I just wanted to use epochs, rather than iters, as the unit of training, so I chose that approach. It also does not affect the final output.