num_classes during model creation
SupremeLobster opened this issue
Martí Mas Fullana commented
If my understanding is correct, when you do:
self.class_embed = nn.Linear(hidden_dim, num_classes)
in DeformableDETR() in deformable_detr.py, "num_classes" should instead be "num_classes + 1".
The same thing goes for:
self.class_embed.bias.data = torch.ones(num_classes) * bias_value
in the same DeformableDETR() function, where it should be "num_classes + 1" instead of "num_classes".
If not, then I'm just getting confused somewhere, but I'm pretty sure there should be an extra background class logit. That is how it was done in the original DETR code, anyway.
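For reference, a minimal sketch of what the proposed change would look like (hidden_dim and num_classes are the names from the snippets above; the concrete values and the prior_prob-based bias formula are illustrative assumptions, not taken from the issue):

```python
import math
import torch
import torch.nn as nn

hidden_dim, num_classes = 256, 91   # illustrative values only
prior_prob = 0.01                   # assumed prior used to compute bias_value

# Proposed: reserve one extra logit for the background / "no object" class,
# mirroring how the original DETR builds its classification head.
class_embed = nn.Linear(hidden_dim, num_classes + 1)

# The bias initialization would then need to match the widened output size.
bias_value = -math.log((1 - prior_prob) / prior_prob)
class_embed.bias.data = torch.ones(num_classes + 1) * bias_value
```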