keras-team / keras-io

Keras documentation, hosted live at keras.io

Question about lack of positional encoding in timeseries_classification_transformer.py

Atousa-Kalantari opened this issue

Issue Type

Bug

Source

source

Keras Version

Keras 2.14

Custom Code

Yes

OS Platform and Distribution

No response

Python version

No response

GPU model and memory

No response

Current Behavior?

I noticed that positional encoding is not used in the timeseries_classification_transformer.py example. Since self-attention is permutation-invariant, the model has no built-in notion of timestep order, which seems important for time series data. Why was positional encoding omitted, and does its absence hurt the model's effectiveness for time series classification? I'd appreciate any insight into this design choice. Thank you.
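
For context, here is roughly what I expected to see: the sinusoidal encoding from "Attention Is All You Need", added to the inputs before the encoder blocks. This is only an illustrative sketch, not code from the tutorial: `positional_encoding`, `seq_len`, and `d_model` are names I've made up, and because the FordA inputs in the example have a single feature channel, I project them to a wider dimension first so the encoding can vary across channels.

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras


def positional_encoding(length, depth):
    # Sinusoidal positional encoding from "Attention Is All You Need".
    positions = np.arange(length)[:, np.newaxis]          # (length, 1)
    dims = np.arange(depth)[np.newaxis, :]                # (1, depth)
    angle_rates = 1.0 / np.power(10000.0, (2 * (dims // 2)) / np.float32(depth))
    angles = positions * angle_rates                      # (length, depth)
    angles[:, 0::2] = np.sin(angles[:, 0::2])             # sin on even channels
    angles[:, 1::2] = np.cos(angles[:, 1::2])             # cos on odd channels
    return tf.cast(angles[np.newaxis, ...], tf.float32)   # (1, length, depth)


# Hypothetical usage with the tutorial's FordA input shape (500 timesteps, 1 channel).
seq_len, d_model = 500, 64
inputs = keras.Input(shape=(seq_len, 1))
x = keras.layers.Dense(d_model)(inputs)        # project the single channel up to d_model
x = x + positional_encoding(seq_len, d_model)  # inject timestep order information
# ... x would then feed the tutorial's transformer_encoder blocks ...
```

A learned position embedding (e.g. `keras.layers.Embedding` over position indices, added to the inputs) would be another option; either way I would have expected some explicit order signal.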

Standalone code to reproduce the issue or tutorial link

https://github.com/keras-team/keras-io/blob/master/examples/timeseries/timeseries_classification_transformer.py

Relevant log output

No response