felipecode / coiltraine

Training framework for conditional imitation learning

Bug: dividing speed_loss by batch size twice

dHonerkamp opened this issue

In the file coiltraine/network/loss.py we find the following lines from line 56 onwards:

   loss_function = loss_branches_vec[0] + loss_branches_vec[1] + \
                   loss_branches_vec[2] + loss_branches_vec[3]

   speed_loss = loss_branches_vec[4] / (params['branches'][0].shape[0])

   return torch.sum(loss_function) / (params['branches'][0].shape[0]) \
          + torch.sum(speed_loss) / (params['branches'][0].shape[0]), \

It seems that speed_loss is divided by params['branches'][0].shape[0] (the batch size?) twice instead of only once, while the rest of the loss (loss_function) is divided only once.

Is this indeed a bug that changes the relative scaling of the different losses, or am I missing something?
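
To make the effect concrete, here is a minimal sketch (the shapes are hypothetical; it assumes each loss_branches_vec[i] is a per-sample loss tensor of shape (batch_size,)) showing that the speed term ends up scaled down by an extra factor of the batch size relative to the control branches:

   import torch

   # Hypothetical setup: each branch loss is a per-sample tensor of
   # shape (batch_size,), filled with ones for illustration.
   batch_size = 120
   loss_branches_vec = [torch.ones(batch_size) for _ in range(5)]

   # Control branches: summed, then divided by the batch size once.
   loss_function = (loss_branches_vec[0] + loss_branches_vec[1] +
                    loss_branches_vec[2] + loss_branches_vec[3])
   branch_term = torch.sum(loss_function) / batch_size   # -> 4.0

   # Speed branch: divided by the batch size here AND again in the return.
   speed_loss = loss_branches_vec[4] / batch_size
   speed_term = torch.sum(speed_loss) / batch_size       # -> 1/120 ~ 0.0083

   print(branch_term.item(), speed_term.item())

In other words, the double division is equivalent to weighting the speed loss by an implicit factor of 1/batch_size.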

I'm also confused about this point and would like to know.

Yes, I found this recently. It was indeed dividing twice.
Thanks for pointing this out. I wouldn't change it in the repository, since the current behaviour is necessary to reproduce the results.

This shows that the speed prediction should probably use a much smaller weight than the one I used originally.
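
For reference, a sketch of the mathematically consistent version (not applied to the repository, per the comment above; the function and parameter names are illustrative, not from the codebase). Setting speed_weight = 1.0 / batch_size reproduces the original double-division behaviour:

   import torch

   def weighted_branch_loss(loss_branches_vec, params, speed_weight=1.0):
       # Every term is divided by the batch size exactly once; the
       # relative importance of the speed prediction becomes an explicit,
       # tunable weight instead of an accidental 1/batch_size factor.
       batch_size = params['branches'][0].shape[0]
       loss_function = (loss_branches_vec[0] + loss_branches_vec[1] +
                        loss_branches_vec[2] + loss_branches_vec[3])
       speed_loss = loss_branches_vec[4]
       return (torch.sum(loss_function) / batch_size
               + speed_weight * torch.sum(speed_loss) / batch_size)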