Why doesn't the model load the state dict when doing inference?
HIT-LiuChen opened this issue · comments
lccccccc commented
When running inference only, I found that the model doesn't call load_state_dict
after torch.load.
The relevant code is at lines 301 to 322 in './Conformer-main/main.py'.
Zhiliang Peng commented
Sorry for the late reply. I tried:
import torch
import models
model = models.Conformer_small_patch16().cuda()
model.load_state_dict(torch.load('./Conformer_small_patch16.pth'), strict=True)
and found no problem.
lccccccc commented
Thanks for your reply. What I mean is that when I use the command
CUDA_VISIBLE_DEVICES=0 python main.py --model Conformer_tiny_patch16 --eval --batch-size 64 \
--input-size 224 \
--data-set IMNET \
--num_workers 4 \
--data-path ../ImageNet_ILSVRC2012/ \
--epochs 100 \
--resume ../Conformer_tiny_patch16.pth
in run.sh, the model at inference time doesn't seem to load the state dict from the checkpoint path given by "--resume".
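A common cause of this symptom (illustrated here as an assumption, not a confirmed diagnosis of Conformer's main.py): DeiT-style training scripts save the weights under a 'model' key inside the checkpoint dict, so passing the raw result of torch.load to load_state_dict matches nothing. A minimal sketch of unwrapping the checkpoint before loading, using a toy nn.Linear in place of Conformer_tiny_patch16:

```python
import torch
import torch.nn as nn

def load_checkpoint(model: nn.Module, path: str) -> None:
    """Load a checkpoint that may wrap the weights under a 'model' key."""
    checkpoint = torch.load(path, map_location='cpu')
    # DeiT-style checkpoints store the state dict under 'model';
    # fall back to the raw object if it is already a plain state dict.
    state_dict = checkpoint['model'] if 'model' in checkpoint else checkpoint
    model.load_state_dict(state_dict, strict=True)

# Demo with a toy module instead of the real Conformer model.
model = nn.Linear(4, 2)
torch.save({'model': model.state_dict()}, '/tmp/toy_ckpt.pth')

fresh = nn.Linear(4, 2)
load_checkpoint(fresh, '/tmp/toy_ckpt.pth')

# Every parameter tensor should now match the saved model exactly.
ok = all(torch.equal(a, b) for a, b in
         zip(model.state_dict().values(), fresh.state_dict().values()))
print(ok)
```

If the checkpoint is wrapped this way, calling model.load_state_dict(torch.load(path)) directly would raise a strict-mode key mismatch or silently load nothing with strict=False, which matches the behavior described above.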
Zhiliang Peng commented
lccccccc commented
Thanks again for your reply. I will check the code and my running environment carefully.
Zhiliang Peng commented
OK, thank u~