External-Attention-pytorch

Pytorch implementation of "Beyond Self-attention: External Attention using Two Linear Layers for Visual Tasks"

Pytorch implementation of "Attention Is All You Need---NIPS2017"

Pytorch implementation of "Squeeze-and-Excitation Networks---CVPR2018"

Pytorch implementation of "Selective Kernel Networks---CVPR2019"


1. External Attention Usage

1.1. Paper

"Beyond Self-attention: External Attention using Two Linear Layers for Visual Tasks"

1.2. Overview

1.3. Code

from ExternalAttention import ExternalAttention
import torch

input = torch.randn(50, 49, 512)           # (batch, tokens, d_model)
ea = ExternalAttention(d_model=512, S=8)
output = ea(input)
print(output.shape)                         # same shape as the input
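
For intuition, here is a minimal sketch of the external-attention computation described in the paper: the input attends to two small learnable external memory units (implemented as linear layers) instead of to keys and values derived from itself, with a softmax followed by an extra normalization in between. The class and variable names below are illustrative; the repo's ExternalAttention module may differ in detail.

import torch
import torch.nn as nn

class ExternalAttentionSketch(nn.Module):
    # Minimal illustrative sketch: two linear layers act as external memories M_k and M_v.
    def __init__(self, d_model=512, S=64):
        super().__init__()
        self.mk = nn.Linear(d_model, S, bias=False)   # memory key unit
        self.mv = nn.Linear(S, d_model, bias=False)   # memory value unit

    def forward(self, x):                             # x: (batch, n, d_model)
        attn = torch.softmax(self.mk(x), dim=1)       # (batch, n, S), softmax over tokens
        attn = attn / (attn.sum(dim=2, keepdim=True) + 1e-9)  # double normalization
        return self.mv(attn)                          # (batch, n, d_model)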

2. Self Attention Usage

2.1. Paper

"Attention Is All You Need"

2.2. Overview

2.3. Code

from SelfAttention import ScaledDotProductAttention
import torch

input = torch.randn(50, 49, 512)           # (batch, tokens, d_model)
sa = ScaledDotProductAttention(d_model=512, d_k=512, d_v=512, h=8)
output = sa(input, input, input)           # queries, keys, values
print(output.shape)
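
For reference, the core computation from the paper is softmax(QK^T / sqrt(d_k)) V. Below is a minimal single-head sketch of that formula with learned projections; it omits the multi-head split (h heads) that the repo's ScaledDotProductAttention performs, and the names are illustrative.

import math
import torch
import torch.nn as nn

class SingleHeadAttentionSketch(nn.Module):
    # Illustrative single-head sketch of softmax(QK^T / sqrt(d_k)) V with learned projections.
    def __init__(self, d_model=512, d_k=512, d_v=512):
        super().__init__()
        self.wq = nn.Linear(d_model, d_k)
        self.wk = nn.Linear(d_model, d_k)
        self.wv = nn.Linear(d_model, d_v)
        self.wo = nn.Linear(d_v, d_model)
        self.d_k = d_k

    def forward(self, queries, keys, values):           # each: (batch, n, d_model)
        q, k, v = self.wq(queries), self.wk(keys), self.wv(values)
        scores = torch.matmul(q, k.transpose(-2, -1)) / math.sqrt(self.d_k)
        attn = torch.softmax(scores, dim=-1)             # attention weights over keys
        return self.wo(torch.matmul(attn, v))            # (batch, n, d_model)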

3. Simplified Self Attention Usage

3.1. Paper

None

3.2. Overview

3.3. Code

from SimplifiedSelfAttention import SimplifiedScaledDotProductAttention
import torch

input = torch.randn(50, 49, 512)           # (batch, tokens, d_model)
ssa = SimplifiedScaledDotProductAttention(d_model=512, h=8)
output = ssa(input, input, input)          # queries, keys, values
print(output.shape)
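
Since no paper is referenced, the exact simplification is an assumption: presumably the learned query/key/value projections are dropped and scaled dot-product attention is applied to the inputs directly. A hypothetical sketch under that assumption:

import math
import torch
import torch.nn as nn

class SimplifiedAttentionSketch(nn.Module):
    # Assumption: same softmax(QK^T / sqrt(d)) V computation, but without learned
    # projections, so the inputs are used directly as queries, keys, and values.
    def __init__(self, d_model=512):
        super().__init__()
        self.scale = math.sqrt(d_model)

    def forward(self, queries, keys, values):                      # each: (batch, n, d_model)
        scores = torch.matmul(queries, keys.transpose(-2, -1)) / self.scale
        return torch.matmul(torch.softmax(scores, dim=-1), values)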

4. Squeeze-and-Excitation Attention Usage

4.1. Paper

"Squeeze-and-Excitation Networks"

4.2. Overview

4.3. Code

from SEAttention import SEAttention
import torch

input = torch.randn(50, 512, 7, 7)           # (batch, channels, height, width)
se = SEAttention(channel=512, reduction=8)
output = se(input)
print(output.shape)
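
The Squeeze-and-Excitation block from the paper is compact enough to sketch in full: a global average pool squeezes each channel to a scalar, a two-layer bottleneck MLP ending in a sigmoid produces per-channel weights, and the input is rescaled channel-wise. A minimal sketch (illustrative, not necessarily the repo's exact code):

import torch
import torch.nn as nn

class SESketch(nn.Module):
    # Squeeze (global average pool) -> Excitation (FC-ReLU-FC-Sigmoid) -> channel-wise rescale.
    def __init__(self, channel=512, reduction=8):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Linear(channel, channel // reduction, bias=False),
            nn.ReLU(inplace=True),
            nn.Linear(channel // reduction, channel, bias=False),
            nn.Sigmoid(),
        )

    def forward(self, x):                        # x: (batch, channel, h, w)
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c))     # per-channel weights in [0, 1]
        return x * w.view(b, c, 1, 1)            # rescale each channel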

5. SK Attention Usage

5.1. Paper

"Selective Kernel Networks"

5.2. Overview

5.3. Code

from SKAttention import SKAttention
import torch

input = torch.randn(50, 512, 7, 7)           # (batch, channels, height, width)
sk = SKAttention(channel=512, reduction=8)
output = sk(input)
print(output.shape)
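
Selective Kernel attention follows a split-fuse-select pipeline: parallel convolution branches with different kernel sizes are summed, squeezed into a compact descriptor, and a softmax across branches produces per-channel weights that select between them. The condensed sketch below makes assumptions about details such as kernel sizes and grouped convolutions, which may differ from the repo's SKAttention.

import torch
import torch.nn as nn

class SKSketch(nn.Module):
    # Split: convolutions with different kernel sizes; Fuse: sum and squeeze to a compact
    # descriptor; Select: softmax across branches gives per-channel weights per branch.
    def __init__(self, channel=512, kernels=(3, 5), reduction=8):
        super().__init__()
        self.convs = nn.ModuleList(
            nn.Sequential(
                nn.Conv2d(channel, channel, k, padding=k // 2, groups=32, bias=False),
                nn.BatchNorm2d(channel),
                nn.ReLU(inplace=True),
            )
            for k in kernels
        )
        d = channel // reduction
        self.fc = nn.Sequential(nn.Linear(channel, d), nn.ReLU(inplace=True))
        self.fcs = nn.ModuleList(nn.Linear(d, channel) for _ in kernels)

    def forward(self, x):                                    # x: (batch, channel, h, w)
        feats = torch.stack([conv(x) for conv in self.convs], dim=0)   # (K, b, c, h, w)
        u = feats.sum(dim=0)                                 # fuse the branches
        s = u.mean(dim=(2, 3))                               # global average pool -> (b, c)
        z = self.fc(s)                                       # compact descriptor -> (b, d)
        weights = torch.stack([fc(z) for fc in self.fcs], dim=0)       # (K, b, c)
        weights = torch.softmax(weights, dim=0).unsqueeze(-1).unsqueeze(-1)
        return (weights * feats).sum(dim=0)                  # select: weighted sum of branches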
