yandex-research / rtdl

Research on Tabular Deep Learning: Papers & Packages

How to get feature importance scores or attention heatmap

Yuntian9708 opened this issue · comments

Hi,
I am trying to produce interpretable visualizations from FT-Transformer, such as feature importance scores or attention heatmaps. I found some discussion of feature importance in Section 5.3 of the paper, but I don't know how to implement it. Is there a way to do this with the source code you published? Or could you provide an example implementation?
Thank you very much!

Please see this issue

P.S. The implementation of the paper is now located here: https://github.com/yandex-research/tabular-dl-revisiting-models

If you need further help, feel free to open a new issue in that repository or continue the discussion in the issue mentioned above.
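For reference, the importance scores in Section 5.3 of the paper are obtained by averaging the attention that the [CLS] token pays to each feature token, over all heads, layers, and samples. A minimal NumPy sketch of that averaging step, assuming you have already collected the attention probability maps from the model (e.g. via forward hooks; the array layout below is an assumption for illustration, not the repo's API):

```python
import numpy as np

def attention_importances(attn_maps):
    """Average [CLS] attention into per-feature importance scores.

    attn_maps: array of shape (n_samples, n_layers, n_heads, n_tokens, n_tokens),
    where each row of the last axis is a softmax distribution, token 0 is the
    [CLS] token, and tokens 1..n_tokens-1 are the feature tokens.
    Returns one non-negative score per feature, summing to 1.
    """
    # attention distribution of [CLS] over all tokens,
    # averaged over samples, layers, and heads
    cls_attn = attn_maps[:, :, :, 0, :].mean(axis=(0, 1, 2))
    # drop the [CLS]->[CLS] entry and renormalize over the features
    p = cls_attn[1:]
    return p / p.sum()

# toy example: 2 samples, 1 layer, 2 heads, 4 tokens ([CLS] + 3 features)
rng = np.random.default_rng(0)
logits = rng.normal(size=(2, 1, 2, 4, 4))
maps = np.exp(logits) / np.exp(logits).sum(axis=-1, keepdims=True)  # row-wise softmax
imp = attention_importances(maps)
print(imp)  # three scores summing to 1
```

The result is a global ranking of features, not a per-sample explanation; for per-sample heatmaps, skip the averaging over the sample axis.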