Attention Map
amindehnavi opened this issue · comments
Amin Dehnavi commented
Hi, is there a way to obtain the attention maps produced by the `model.transformer.decoder.layers[i].cross_attn` layer? When I trace through the referenced functions, I eventually reach the `MSDA.ms_deform_attn_forward` call inside the `forward` method of the `MSDeformAttnFunction` class (located in `./models/ops/functions/ms_deform_attn_func.py`), and I can't find any argument to set to `True` that would make it return the attention map.
For reference, the decoder layer in question is `DeformableTransformerDecoderLayer` in `./models/deformable_transformer_plus`.
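A possible workaround, since the CUDA kernel behind `MSDA.ms_deform_attn_forward` never returns its weights: in Deformable-DETR-style code the per-point attention weights are computed in Python inside `MSDeformAttn.forward` (an `attention_weights` `nn.Linear` followed by a softmax) *before* the kernel is called, so a forward hook on that `Linear` can capture them without touching the op. The sketch below is hedged: `DummyDeformAttn` is a hypothetical stand-in that only mimics the relevant projection, and the shapes (`n_heads=8`, `n_levels=4`, `n_points=4`, `d_model=256`) are assumed defaults, not values read from this repo.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DummyDeformAttn(nn.Module):
    """Hypothetical stand-in for MSDeformAttn: only the attention_weights
    projection is modeled; the real module would pass the softmaxed weights
    on to the CUDA kernel (MSDA.ms_deform_attn_forward)."""
    def __init__(self, d_model=256, n_heads=8, n_levels=4, n_points=4):
        super().__init__()
        self.n_heads, self.n_levels, self.n_points = n_heads, n_levels, n_points
        self.attention_weights = nn.Linear(d_model, n_heads * n_levels * n_points)

    def forward(self, query):
        n, len_q, _ = query.shape
        w = self.attention_weights(query)  # raw logits, (n, len_q, heads*levels*points)
        w = w.view(n, len_q, self.n_heads, self.n_levels * self.n_points)
        return F.softmax(w, -1)            # normalized weights over sampling points

captured = {}

def grab_attention(module, inputs, output):
    # The hook fires on the Linear, so `output` is the raw logits; apply the
    # same reshape + softmax the module itself uses to recover the attn map.
    n, len_q, _ = output.shape
    w = output.view(n, len_q, 8, 4 * 4)    # heads=8, levels*points=16 (assumed)
    captured["attn"] = F.softmax(w, -1).detach()

attn = DummyDeformAttn()
attn.attention_weights.register_forward_hook(grab_attention)

_ = attn(torch.randn(2, 10, 256))          # batch of 2, 10 queries
print(captured["attn"].shape)              # (2, 10, 8, 16)
```

On the real model you would register the same hook on `model.transformer.decoder.layers[i].cross_attn.attention_weights` (assuming `cross_attn` is an `MSDeformAttn` instance with that submodule name). Note these weights are over sampling points per head and level, not a dense query-to-pixel map; to visualize them spatially you would scatter them at the sampling locations, which the kernel also consumes but does not return.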