Attention to 2D input
raghavgurbaxani opened this issue
Hi @CyberZHG
Thank you for your work on this repo. I am trying to use it for a time series forecasting problem with a 2D tensor input to the attention module. My model:
```
Layer (type)                  Output Shape       Param #    Connected to
==========================================================================
features (InputLayer)         (None, 16, 1816)   0
__________________________________________________________________________
lstm_1 (LSTM)                 (None, 2048)       31662080   features[0][0]
__________________________________________________________________________
dense_2 (Dense)               (None, 1024)       2098176    lstm_1[0][0]
__________________________________________________________________________
leaky_re_lu_2 (LeakyReLU)     (None, 1024)       0          dense_2[0][0]
__________________________________________________________________________
dense_3 (Dense)               (None, 120)        123000     leaky_re_lu_2[0][0]
__________________________________________________________________________
feature_weights (InputLayer)  (None, 120)        0
__________________________________________________________________________
multiply_1 (Multiply)         (None, 120)        0          dense_3[0][0]
                                                            feature_weights[0][0]
==========================================================================
Total params: 33,883,256
Trainable params: 33,883,256
Non-trainable params: 0
```
However, your attention module requires a 3D input. Can you suggest the changes needed to make it work after the LSTM layer, i.e. with the 2D (None, 2048) output?
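For context on why a 3D input is needed: attention computes a weight per timestep and then a weighted sum over the time axis, so it must see the per-timestep LSTM outputs (in Keras, `return_sequences=True`, giving shape `(batch, timesteps, units)`) rather than only the final 2D state. Here is a minimal numpy sketch of that idea; the shapes (`batch=2, timesteps=16, units=8`) and the additive scoring vector `w` are illustrative placeholders, not the actual implementation in this repo:

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
batch, timesteps, units = 2, 16, 8  # units stands in for the 2048 LSTM units

# With return_sequences=True the LSTM yields one vector per timestep (3D)
h = rng.random((batch, timesteps, units))   # (batch, timesteps, units)

# Score each timestep with a learned vector, then normalize over time
w = rng.random(units)
scores = h @ w                               # (batch, timesteps)
alpha = softmax(scores, axis=1)              # attention weights sum to 1 per sample

# Weighted sum over the time axis collapses back to a 2D context vector
context = (alpha[..., None] * h).sum(axis=1) # (batch, units)
print(context.shape)                         # (2, 8)
```

The time axis is exactly what the weighting runs over, which is why a 2D `(None, 2048)` tensor (one vector per sample, no timesteps) gives the attention layer nothing to attend over.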
Is this still relevant? If so, what is blocking it? Is there anything you can do to help move it forward?
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs.