kazuto1011 / deeplab-pytorch

PyTorch re-implementation of DeepLab v2 on COCO-Stuff / PASCAL VOC datasets

ValueError: Expected input batch_size (182) to match target batch_size (2).

yejg2017 opened this issue

When I run the DeepLab v2 model on COCO-Stuff 164k with batch_size=2, I get the error:

ValueError: Expected input batch_size (182) to match target batch_size (2).

I checked: the model's output shape is [2, 182, 41, 41] and the labels' shape is [2, 41, 41]. I think the problem is in the loss function, but I don't know how to fix it. Could you help me? Thanks
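
For context, here is a minimal, self-contained sketch (not the repository code) of why PyTorch raises this message: nn.CrossEntropyLoss expects logits of shape [N, C, H, W] and labels of shape [N, H, W], so if a single [C, H, W] slice is passed instead, the channel dimension (182) is read as the batch size.

import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()

logits = torch.randn(2, 182, 41, 41)          # [N, C, H, W], as the model outputs
labels = torch.randint(0, 182, (2, 41, 41))   # [N, H, W]

loss = criterion(logits, labels)              # OK: batch sizes match (2 == 2)

bad_logit = logits[0]                         # [182, 41, 41] -- batch dim lost
# criterion(bad_logit, labels)                # ValueError: Expected input batch_size (182)
#                                             # to match target batch_size (2).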

Did you run that on multiple GPUs?

> Did you run that on multiple GPUs?

No, I only ran it on a single GPU.

Could you provide the command to reproduce the issue?

It is likely that you instantiated the model with 2 output classes and attempted to load the COCO weights.

Thank you @toshas. I think @yejg2017 modified the code and iterated over the tensor of shape [2, 182, 41, 41] instead of the four-scale logits 4 x [2, 182, H, W]. The error is probably because the loss function received a sliced [182, 41, 41] logit together with [2, 41, 41] labels. Please check the tensors just before the loss function again. Don't switch the model to eval mode and don't remove the MSC class if you want to use the multi-scale training script as-is. Please reopen this issue if anyone still has the same problem and can provide additional information.
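
A quick, self-contained illustration of that mistake (the shapes are chosen to match this issue and the multi-scale sizes are only illustrative, not taken from the training script):

import torch

# One 4-D tensor, as the modified model apparently returned.
single_logit = torch.randn(2, 182, 41, 41)

# The multi-scale (MSC) output the training script expects: a list of logits,
# one per scale.
msc_logits = [torch.randn(2, 182, s, s) for s in (41, 31, 21, 41)]

for logit in single_logit:
    # WRONG: iterating over a 4-D tensor loops over the batch dimension,
    # so each `logit` is a [182, 41, 41] slice with no batch axis.
    print(logit.shape)  # torch.Size([182, 41, 41])

for logit in msc_logits:
    # Intended: each `logit` keeps its batch dimension, [2, 182, h, w].
    print(logit.shape)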

deeplab-pytorch/main.py

Lines 233 to 237 in 79cb390

for logit in logits:
    # Resize labels for {100%, 75%, 50%, Max} logits
    _, _, H, W = logit.shape
    labels_ = resize_labels(labels, size=(H, W))
    iter_loss += criterion(logit, labels_.to(device))
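
As a side note, here is a minimal sketch of what a resize_labels helper has to do; this is an assumption for illustration, not the repository's exact implementation: downsample the integer label maps to the logit resolution with nearest-neighbour sampling, so class indices are never blended into invalid values.

import torch
import torch.nn.functional as F

def resize_labels_sketch(labels: torch.Tensor, size: tuple) -> torch.Tensor:
    # F.interpolate expects a float [N, 1, H, W] tensor; cast back to long after.
    labels = labels.unsqueeze(1).float()
    labels = F.interpolate(labels, size=size, mode="nearest")
    return labels.squeeze(1).long()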