Although there is no bug in this line, in my opinion '1 n d' should be '1 1 d' — the current pattern could be confusing.
KKIverson opened this issue · comments
KKIverson commented
vit-pytorch/vit_pytorch/vit.py
Line 117 in 9f87d1c
Phil Wang commented
@KKIverson ohh sure, i made the change
so the `b` has to be kept at the batch size, since we are concatting the cls tokens on the 1st dimension
KKIverson commented
> @KKIverson ohh sure, i made the change
>
> so the `b` has to be kept at the batch size, since we are concatting the cls tokens on the 1st dimension
Thanks a lot. Actually, '1' or 'n' makes no difference to the result, but as a rookie I found the '1 1 d' form easier to understand. ^_^