RuntimeError: stack expects each tensor to be equal size, but got [768, 768] at entry 0 and [768, 768, 3] at entry 1
LouisLepez23 opened this issue
Hi, I got the following error when running with the Cityscapes dataset:
Traceback (most recent call last):
  File "D:\DeepLabV3Plus-Pytorch\main.py", line 387, in <module>
    main()
  File "D:\DeepLabV3Plus-Pytorch\main.py", line 331, in main
    for (images, labels) in train_loader:
  File "C:\torch\utils\data\dataloader.py", line 633, in __next__
    data = self._next_data()
  File "C:\torch\utils\data\dataloader.py", line 1345, in _next_data
    return self._process_data(data)
  File "C:\torch\utils\data\dataloader.py", line 1371, in _process_data
    data.reraise()
  File "C:\torch\_utils.py", line 644, in reraise
    raise exception
RuntimeError: Caught RuntimeError in DataLoader worker process 0.
Original Traceback (most recent call last):
  File "C:\torch\utils\data\_utils\worker.py", line 308, in _worker_loop
    data = fetcher.fetch(index)
  File "C:\torch\utils\data\_utils\fetch.py", line 54, in fetch
    return self.collate_fn(data)
  File "C:\torch\utils\data\_utils\collate.py", line 265, in default_collate
    return collate(batch, collate_fn_map=default_collate_fn_map)
  File "C:\torch\utils\data\_utils\collate.py", line 142, in collate
    return [collate(samples, collate_fn_map=collate_fn_map) for samples in transposed]  # Backwards compatibility.
  File "C:\torch\utils\data\_utils\collate.py", line 142, in <listcomp>
    return [collate(samples, collate_fn_map=collate_fn_map) for samples in transposed]  # Backwards compatibility.
  File "C:\torch\utils\data\_utils\collate.py", line 119, in collate
    return collate_fn_map[elem_type](batch, collate_fn_map=collate_fn_map)
  File "C:\torch\utils\data\_utils\collate.py", line 171, in collate_numpy_array_fn
    return collate([torch.as_tensor(b) for b in batch], collate_fn_map=collate_fn_map)
  File "C:\torch\utils\data\_utils\collate.py", line 119, in collate
    return collate_fn_map[elem_type](batch, collate_fn_map=collate_fn_map)
  File "C:\torch\utils\data\_utils\collate.py", line 162, in collate_tensor_fn
    return torch.stack(batch, 0, out=out)
RuntimeError: stack expects each tensor to be equal size, but got [768, 768] at entry 0 and [768, 768, 3] at entry 1
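For what it's worth, the final error reproduces in isolation: default_collate ultimately calls torch.stack on the per-sample tensors, and stacking requires identical shapes, so a 2-D (single-channel) sample mixed with a 3-D (RGB) sample in the same batch fails exactly like this. A minimal sketch:

```python
import torch

# default_collate stacks the samples of a batch with torch.stack, which
# requires every tensor to have the same shape. A 2-D sample (e.g. a
# grayscale image) mixed with a 3-D sample (an RGB image) cannot stack.
gray = torch.zeros(768, 768)      # shape [768, 768]
rgb = torch.zeros(768, 768, 3)    # shape [768, 768, 3]

err = None
try:
    torch.stack([gray, rgb], 0)
except RuntimeError as e:
    err = e
    print(e)
```

So the root cause is almost certainly that one sample in the batch is being loaded with 2 dimensions while the others have 3.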
I ran the following command: $ python main.py --model deeplabv3plus_mobilenet --dataset cityscapes --enable_vis --vis_port 8097 --gpu_id 0 --lr 0.1 --crop_size 768 --batch_size 4 --output_stride 16 --data_root ./datasets/data/cityscapes
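This suggests one of the training images is being loaded as single-channel while the rest are RGB (e.g. a grayscale or palette-mode PNG). One workaround is to force every image to RGB at load time. The sketch below assumes Pillow is installed; load_rgb is a hypothetical helper, not code from this repo, and where to hook it into the dataset's __getitem__ is up to you:

```python
import os
import tempfile
from PIL import Image

def load_rgb(path):
    """Open an image and force 3-channel RGB so every sample collates
    to [H, W, 3] regardless of the file's native mode (L, P, RGBA, ...).
    Hypothetical helper for illustration, not part of this repo."""
    with Image.open(path) as im:
        return im.convert("RGB")

# Demo: a grayscale ("L") PNG comes back as 3-channel RGB.
path = os.path.join(tempfile.mkdtemp(), "gray.png")
Image.new("L", (8, 8)).save(path)
print(load_rgb(path).mode)  # prints "RGB"
```

Alternatively, scanning the dataset for any file whose Image.open(...).mode is not "RGB" should locate the offending sample directly.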
Does anyone have an idea how to solve this error?
Thanks