aim-uofa / AdelaiDet

AdelaiDet is an open source toolbox for multiple instance-level detection and recognition tasks.

Home Page: https://git.io/AdelaiDet

Details about FCPose

JosonChan1998 opened this issue · comments

Thanks for your nice work! I have some questions about the implementation of FCPose.

  1. Does it use both the 2x-downsampled and the 8x-downsampled features to compute the MSE loss together? (See the sketch after this list.)
    if self.training and self.loss_on:
        x = torch.cat([x, p3_logits], dim=1)
        x = self.upsampler(x)
        p1_logits = self.p1_logits(x)
        p1_loss, p3_loss = compute_loss(p1_heatmap_list, p3_heatmap_list, p1_logits, p3_logits)
        losses['p1_loss'] = p1_loss * self.heatmap_loss_weight
        losses['p3_loss'] = p3_loss * self.heatmap_loss_weight

  2. Does it use the 2x-downsampled feature to compute the softmax loss? I notice that you use the upsampler to obtain the 2x-downsampled features.
    mask_logits = subnetworks_forward(offsets, weights, biases, n_inst).squeeze()
    mask_logits = mask_logits.reshape(-1, 17, H, W)
    larger_mask_logits = self.upsampler(mask_logits)

    keypoint_loss, direction_loss = \
        compute_loss_softmax(gt_bitmasks, larger_mask_logits,
                             num_loss, num_instance, direction, mask_logits, gt_keypoint,
                             max_ranges, self.distance_norm)
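
A minimal sketch of what question 1 describes, assuming `compute_loss` is essentially an MSE between predicted and target heatmaps at both resolutions (the function name and tensor shapes here are illustrative, not the repository's actual implementation):

    import torch
    import torch.nn.functional as F

    def heatmap_mse_losses(p1_heatmap_list, p3_heatmap_list, p1_logits, p3_logits):
        # Illustrative only: stack the per-image target heatmaps and take the
        # MSE against the predicted logits at the 1/2 (p1) and 1/8 (p3) scales.
        p1_targets = torch.cat(p1_heatmap_list, dim=0)
        p3_targets = torch.cat(p3_heatmap_list, dim=0)
        p1_loss = F.mse_loss(p1_logits, p1_targets)
        p3_loss = F.mse_loss(p3_logits, p3_targets)
        return p1_loss, p3_loss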

Looking forward to your reply! Thanks.

Yes, supervising on the larger (higher-resolution) feature can boost the performance.
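
For reference, a minimal sketch of a spatial-softmax keypoint loss taken on the upsampled (larger) logits: each K x H x W heatmap is treated as a classification over the H*W pixel locations, with the ground-truth keypoint pixel as the target class. The function name and the `keypoint_indices` tensor are assumptions for illustration, not the repository's `compute_loss_softmax`:

    import torch.nn.functional as F

    def spatial_softmax_keypoint_loss(mask_logits, keypoint_indices):
        # Illustrative only: mask_logits is (N, K, H, W); keypoint_indices is
        # (N, K) holding the flattened ground-truth pixel index (y * W + x).
        n, k, h, w = mask_logits.shape
        logits = mask_logits.reshape(n * k, h * w)   # one distribution per keypoint
        targets = keypoint_indices.reshape(n * k)    # one target pixel per keypoint
        return F.cross_entropy(logits, targets)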