BucolicWind commented
Hi,
Dimension 0 of `obj_mask` is the batch dimension (of size `batchsize`), and `b` is the index of the current sample within the batch in the for loop:
```python
obj_mask = torch.ones(batchsize, self.n_anchors,
                      fsize, fsize).type(dtype)
for b in range(batchsize):
    ...
    obj_mask[b, a, j, i] = 1
```
So `b` ranges from 0 to 39 if your batchsize is 40.
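As a minimal, self-contained sketch of the indexing described above (the shapes and the `a`, `j`, `i` values here are hypothetical placeholders, not the actual values computed in `yolo_layer.py`):

```python
import torch

# Assumed example sizes, not the real config values.
batchsize, n_anchors, fsize = 4, 3, 13

# Dimension 0 of obj_mask is the batch dimension.
obj_mask = torch.ones(batchsize, n_anchors, fsize, fsize)

for b in range(batchsize):  # b = 0 .. batchsize - 1
    # Hypothetical anchor index and grid cell for illustration.
    a, j, i = 1, 5, 7
    obj_mask[b, a, j, i] = 1

print(obj_mask.shape)       # torch.Size([4, 3, 13, 13])
print(batchsize - 1)        # last value b takes: 3
```

With `batchsize = 40`, the same loop would run `b` from 0 through 39.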
Which part of yolo_layer.py have you changed?
Originally posted by @hirotomusiker in #46 (comment)