Attention block
JanPalasek opened this issue
Hi, could you please explain where you got the implementation for your attention block? I've studied the papers referenced for the global attention unit in semantic segmentation ([31] Pyramid Attention Network for Semantic Segmentation, https://arxiv.org/pdf/1805.10180.pdf, and [30] Attention Is All You Need, https://arxiv.org/pdf/1706.03762.pdf), and your implementation differs significantly from both.
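For reference, here is a minimal PyTorch sketch of the Global Attention Upsample (GAU) module as I read it from the PAN paper. The layer names, the extra 1x1 projection on the high-level path, and the choice of ReLU over sigmoid in the attention branch are my assumptions, not this repository's actual code:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GlobalAttentionUpsample(nn.Module):
    """Sketch of the Global Attention Upsample (GAU) module from the PAN
    paper (arXiv:1805.10180): low-level features are weighted by a channel
    attention vector computed from globally pooled high-level features,
    then fused with the upsampled high-level features by addition."""

    def __init__(self, high_channels: int, low_channels: int, out_channels: int):
        super().__init__()
        # Low-level path: 3x3 conv to align channels, as in the paper's figure.
        self.low_conv = nn.Sequential(
            nn.Conv2d(low_channels, out_channels, kernel_size=3, padding=1, bias=False),
            nn.BatchNorm2d(out_channels),
        )
        # High-level path: global average pooling (in forward) followed by a
        # 1x1 conv + BN producing a per-channel attention vector.
        self.attention = nn.Sequential(
            nn.Conv2d(high_channels, out_channels, kernel_size=1, bias=False),
            nn.BatchNorm2d(out_channels),
            nn.ReLU(inplace=True),  # assumption: some implementations use sigmoid here
        )
        # Assumption: a 1x1 projection so high-level features match out_channels.
        self.high_conv = nn.Conv2d(high_channels, out_channels, kernel_size=1, bias=False)

    def forward(self, high: torch.Tensor, low: torch.Tensor) -> torch.Tensor:
        # Channel attention from globally pooled high-level context: (N, C, 1, 1).
        att = self.attention(F.adaptive_avg_pool2d(high, 1))
        # Weight the low-level features channel-wise.
        low = self.low_conv(low) * att
        # Upsample the high-level features and fuse by element-wise addition.
        high = F.interpolate(self.high_conv(high), size=low.shape[2:],
                             mode="bilinear", align_corners=False)
        return low + high
```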
I contacted the authors; it seems to be the correct implementation.