nyukat / breast_cancer_classifier

Deep Neural Networks Improve Radiologists' Performance in Breast Cancer Screening

Home Page: https://ieeexplore.ieee.org/document/8861376

How do we manipulate tensors with requires_grad=True?

bhosalems opened this issue · comments

Hi,

I am writing a training procedure and was using the run_model() code as a reference; there, the probabilities across the views are averaged. But that only works if the tensor is detached as a NumPy array.

def run_model():
    ...
    batch_predictions = compute_batch_predictions(output, mode=parameters["model_mode"])
    pred_df = pd.DataFrame({k: v[:, 1] for k, v in batch_predictions.items()})
    pred_df.columns.names = ["label", "view_angle"]
    predictions = pred_df.T.reset_index().groupby("label").mean().T[LABELS.LIST].values
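The pandas-based averaging above forces the detach in the first place: pandas operates on NumPy arrays, and PyTorch refuses to convert a tensor that is still part of the autograd graph. A minimal sketch of that behavior (toy shapes, not the repository's actual outputs):

```python
import torch

# A toy model output that still participates in the autograd graph.
logits = torch.randn(4, 2, requires_grad=True)
probs = torch.softmax(logits, dim=1)

# Converting to NumPy (as pandas requires) fails on a graph-attached tensor:
try:
    probs.numpy()
except RuntimeError as e:
    print("numpy() failed:", e)

# detach() works, but severs the tensor from the graph.
arr = probs.detach().numpy()
print(arr.shape)  # (4, 2)
```

This is why any averaging done through a DataFrame cannot feed gradients back into the model.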

Can I do something like the following?

def run_model():
    ...
    gt = np.transpose(birads_labels['label'].values.reshape(predictions.shape[1], 1))
    gt = torch.tensor(gt, dtype=torch.float, requires_grad=True)
    predictions = torch.tensor(predictions, requires_grad=True)
    l = loss(predictions, gt)
    l.backward()
    optimizer.step()

So I was wondering: can we detach the tensor in compute_batch_predictions() but still use it later as a tensor with requires_grad=True to backpropagate?

If you mean to append the second part to the first, it won't work as you intend: you need to backpropagate through the entire model, and the detach breaks that path. For more information, check out the PyTorch autograd documentation.
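A quick way to see this: re-wrapping a detached array in a new tensor with requires_grad=True creates a fresh leaf that has no connection to the model's parameters, so backward() stops there. A sketch with a hypothetical one-layer model standing in for the classifier:

```python
import torch

# Hypothetical tiny "model": one linear layer standing in for the classifier.
model = torch.nn.Linear(3, 2)
x = torch.randn(5, 3)
out = model(x)

# Detach -> NumPy -> re-wrap, as in the snippet above:
rewrapped = torch.tensor(out.detach().numpy(), requires_grad=True)
loss = rewrapped.sum()
loss.backward()

print(rewrapped.grad is not None)  # True: gradient reaches the new leaf
print(model.weight.grad)           # None: nothing flows back into the model
```

optimizer.step() on the model's parameters would therefore do nothing, since their .grad fields are never populated.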

Yeah, once you detach you break the graph, so there's no backpropagation of gradients. I averaged directly on the tensors based on labels, and it is working now. Thanks anyway.
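Averaging directly on tensors can be sketched like this, staying entirely inside the autograd graph (shapes and variable names are illustrative, not the repository's):

```python
import torch

# Four per-view probability tensors of shape (batch, n_labels),
# e.g. one per mammography view; here they are random stand-ins.
view_probs = [torch.rand(8, 3, requires_grad=True) for _ in range(4)]

# Stack along a new "view" axis and average over it.
# torch.stack / .mean are differentiable, so gradients keep flowing.
predictions = torch.stack(view_probs, dim=0).mean(dim=0)  # shape (8, 3)

loss = predictions.sum()
loss.backward()
print(view_probs[0].grad.shape)  # each view still receives gradients
```

The same idea replaces the pandas groupby-mean: group the view tensors per label yourself and call torch.mean, instead of round-tripping through a DataFrame.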