Small Bugs in how you populate the memory bank in Test.py
dzlin99 opened this issue · comments
dzlin99 commented
In test.py, on lines 26 and 84, you have the following lines:
temploader = torch.utils.data.DataLoader(trainloader.dataset, batch_size=100, shuffle=False, num_workers=1)
for batch_idx, (inputs, targets, indexes) in enumerate(temploader):
    batchSize = inputs.size(0)
    features = net(inputs)
    features = torch.nn.functional.normalize(features)
    trainFeatures[:, batch_idx * batchSize:batch_idx * batchSize + batchSize] = features.data.t()
However, say you have a batch size of 100 and a dataset of length 450. On the last iteration, batchSize will be 50, so instead of repopulating columns 400-450 of trainFeatures, you will overwrite columns 200-250, because the start offset batch_idx * batchSize evaluates to 4 * 50 = 200 instead of 4 * 100 = 400. It should instead be something like:
original_batch_size = 100
temploader = torch.utils.data.DataLoader(trainloader.dataset, batch_size=original_batch_size, shuffle=False, num_workers=1)
for batch_idx, (inputs, targets, indexes) in enumerate(temploader):
    batchSize = inputs.size(0)
    features = net(inputs)
    features = torch.nn.functional.normalize(features)
    trainFeatures[:, batch_idx * original_batch_size:batch_idx * original_batch_size + batchSize] = features.data.t()
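To sanity-check the arithmetic, here's a minimal sketch (plain Python, no torch; the variable names mirror the snippet above) of the start offsets each version computes for a dataset of 450 items and a batch size of 100:

```python
dataset_len = 450
original_batch_size = 100

starts_buggy = []
starts_fixed = []
for batch_idx, start in enumerate(range(0, dataset_len, original_batch_size)):
    # len of this batch: 100 for full batches, 50 for the final partial batch
    batchSize = min(original_batch_size, dataset_len - start)
    starts_buggy.append(batch_idx * batchSize)            # wrong once batchSize shrinks
    starts_fixed.append(batch_idx * original_batch_size)  # always the true offset

print(starts_buggy)  # [0, 100, 200, 300, 200] — last batch collides with batch 2
print(starts_fixed)  # [0, 100, 200, 300, 400] — last batch lands at 400 as intended
```

Alternatively, since the loader already yields `indexes` for each batch, one could also write `trainFeatures[:, indexes] = features.data.t()` and avoid the offset arithmetic entirely.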