BGU-CS-VIL / dtan

Official PyTorch implementation for our NeurIPS 2019 paper, Diffeomorphic Temporal Alignment Nets. A TensorFlow/Keras version is available on the tf_legacy branch.

IndexError: theta transpose

gauravkuppa opened this issue · comments

I am working off of the PyTorch branch, and I am trying to use different data with DTAN. For some data, the PyTorch branch works perfectly well.

For other data, I get the following error: IndexError: Dimension out of range (expected to be in range of [-1, 0], but got 1)

Here is the traceback:

Traceback (most recent call last):
  File "/home/dtan/examples/UCR_alignment.py", line 115, in <module>
    run_UCR_alignment(args, dataset_name='pcgg_first_fourth')#, dataset_name="pfi_transfer_ECG")
  File "/home/dtan/examples/UCR_alignment.py", line 102, in run_UCR_alignment
    model = train(args, train_loader, validation_loader, DTANargs, Experiment, print_model=True)
  File "/home/dtan/models/train_model.py", line 40, in train
    train_loss = train_epoch(train_loader, device, optimizer, model, channels, DTANargs)
  File "/home/dtan/models/train_model.py", line 72, in train_epoch
    loss = alignment_loss(output, target, thetas, channels, DTANargs)
  File "/home/dtan/DTAN/alignment_loss.py", line 40, in alignment_loss
    prior_loss += 0.1*smoothness_norm(DTANargs.T, theta, DTANargs.lambda_smooth, DTANargs.lambda_var, print_info=False)
  File "/home/dtan/DTAN/smoothness_prior.py", line 103, in smoothness_norm
    theta_T = torch.transpose(theta, 0, 1)
IndexError: Dimension out of range (expected to be in range of [-1, 0], but got 1)

Any idea what might be causing this? Is there anything I have to do to the data to make it work?

Thank you!

Completely missed this issue, terribly sorry!

The issue was fixed.
It happens when the per-class batch size is 1, which can occur for the validation set.
There was a redundant torch.squeeze in the prior computation that removed the batch dimension. I have removed it, and that seems to fix the issue.
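The failure mode can be reproduced in a few lines. This is a minimal sketch, not the project's actual code: the parameter shape (1, 32) is a hypothetical per-class batch of one theta vector, chosen only to illustrate how torch.squeeze drops the batch dimension and makes the later torch.transpose(theta, 0, 1) call raise the IndexError from the traceback.

```python
import torch

# Hypothetical transformation parameters: batch of 1, 32 params per sample.
theta = torch.randn(1, 32)

# torch.squeeze removes ALL size-1 dimensions, including the batch dim,
# so a (1, 32) tensor collapses to a 1-D tensor of shape (32,).
squeezed = torch.squeeze(theta)

# Transposing the 1-D tensor over dims (0, 1) is what raised:
#   IndexError: Dimension out of range (expected to be in range of [-1, 0], but got 1)
try:
    torch.transpose(squeezed, 0, 1)
except IndexError as e:
    print("squeezed tensor:", e)

# Without the squeeze, the batch dim survives and the transpose is valid.
theta_T = torch.transpose(theta, 0, 1)
print(theta_T.shape)  # torch.Size([32, 1])
```

With a batch size greater than 1 the squeeze is a no-op on the batch dimension, which is why the bug only surfaced when the per-class batch size was 1.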

Ron