WenjieDu / SAITS

The official PyTorch implementation of the paper "SAITS: Self-Attention-based Imputation for Time Series". A fast and state-of-the-art (SOTA) deep-learning neural network model for efficient time-series imputation (impute multivariate incomplete time series containing NaN missing data/values with machine learning). https://arxiv.org/abs/2202.08516

Home Page: https://doi.org/10.1016/j.eswa.2023.119619

Calculation of the loss function

x-phantaci opened this issue · comments

Thank you for your excellent work!

First, the imputation loss of MIT in the code does not seem to be calculated on the complement feature vector.

Secondly, the paper talks about using the raw data X, without artificial masking, as input to the MIT formula, but in the corresponding code I found that the artificially-masked X^ is used instead.

Is there something I am not understanding? I look forward to your clarification!

Hi there,

Thank you so much for your attention to SAITS! If you find SAITS helpful to your work, please star ⭐️ this repository. Your star is a form of recognition that lets others notice SAITS; it matters and is definitely a kind of contribution.

I have received your message and will respond ASAP. Thank you again for your patience! 😃

Best,
Wenjie

Hi,

Thank you for your feedback. I think you may have been confused by the code at modeling/SA_models.py#L199. We use X_tilde_3 to calculate the imputation loss of MIT. This is not a problem, because the imputation loss is computed only on the artificially-masked values, and all imputations in the complement feature vector come from X_tilde_3.
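
To make the equivalence explicit, here is a minimal sketch in the spirit of the paper's notation (tensor shapes and variable names are illustrative assumptions, not the repository's exact code): the complement feature vector keeps the observed values and fills every missing position, including the artificially-masked ones, from X_tilde_3, so a loss restricted to the artificially-masked positions gives the same value whether it is computed on the complement vector or on X_tilde_3.

```python
import torch

# Hypothetical toy shapes: batch of 8 series, 24 time steps, 5 features.
X = torch.randn(8, 24, 5)                               # model input (artificially-masked values zeroed out)
missing_mask = torch.randint(0, 2, (8, 24, 5)).float()  # 1 = observed in the model input, 0 = missing
X_tilde_3 = torch.randn(8, 24, 5)                       # the model's final estimation

# Complement feature vector: keep observed values, take imputations from X_tilde_3.
X_c = missing_mask * X + (1 - missing_mask) * X_tilde_3

# Artificially-masked positions have missing_mask == 0, so X_c equals X_tilde_3 there;
# a loss restricted to those positions is therefore identical for X_c and X_tilde_3.
```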

I don't quite understand your second point. In the line of code for the MIT loss calculation mentioned above, we use inputs['X_holdout'] as the target, namely the raw data X, rather than X_hat. Could you please point out the file and line of code you have questions about? Thank you.
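
For readers following the thread, here is a self-contained sketch of that MIT loss computation (the masked_mae helper, tensor shapes, and variable names are illustrative assumptions, not the repository's exact code): the target is the raw/held-out data X, and the error is averaged only over the artificially-masked positions marked by the indicating mask, so naturally-missing values never contribute to the loss.

```python
import torch

def masked_mae(estimation, target, mask):
    # Mean absolute error computed only where mask == 1.
    return torch.sum(torch.abs(estimation - target) * mask) / (torch.sum(mask) + 1e-9)

# Hypothetical toy tensors: batch of 8 series, 24 time steps, 5 features.
X_tilde_3 = torch.randn(8, 24, 5)                           # model estimation
X_holdout = torch.randn(8, 24, 5)                           # raw data X holding the ground truth of masked values
indicating_mask = torch.randint(0, 2, (8, 24, 5)).float()   # 1 = artificially-masked position

# MIT loss: compare the estimation against the raw/held-out data,
# restricted to the artificially-masked positions.
MIT_loss = masked_mae(X_tilde_3, X_holdout, indicating_mask)
```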

I really appreciate your reply.

Your reply resolved my doubts. The second question was caused by my own carelessness and has been solved now. Thank you.

Best regards,
Tian Yang

Absolutely my pleasure 😃