The definition of row-wise attention and col-wise attention
CiaoHe opened this issue
Hi, Luci:
Sorry, it's me again.
I'm confused by the definition of row-wise and col-wise attention here:
alphafold2/alphafold2_pytorch/alphafold2.py, lines 208 to 209 at commit 586792d
alphafold2/alphafold2_pytorch/alphafold2.py, lines 219 to 220 at commit 586792d
Based on what I thought, the row-wise case `w_x` should be rearranged to `(b h) w d`, since once we fetch one row, each row has `w` (width) units.
So maybe the code here needs the inverse of the definition above? (See the sketch below for the convention I have in mind.)
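For clarity, here is a minimal, self-contained sketch of the convention I mean; it is not the repository's actual code, and the function name `axial_rearrange` and the `b h w d` input layout are assumptions for illustration. Row-wise attention folds the height axis into the batch and attends along the width, while column-wise attention folds the width axis into the batch and attends along the height.

```python
import torch
from einops import rearrange

def axial_rearrange(x, row_attn=True):
    # x: (b, h, w, d) feature map (hypothetical layout for this sketch)
    if row_attn:
        # row-wise: fold height into batch, so each of the h rows
        # attends over its w (width) units as the sequence
        return rearrange(x, 'b h w d -> (b h) w d')
    # column-wise: fold width into batch, so each of the w columns
    # attends over its h (height) units as the sequence
    return rearrange(x, 'b h w d -> (b w) h d')

x = torch.randn(2, 16, 32, 64)             # (batch, height, width, dim)
rows = axial_rearrange(x, row_attn=True)   # -> (2*16, 32, 64)
cols = axial_rearrange(x, row_attn=False)  # -> (2*32, 16, 64)
print(rows.shape, cols.shape)
```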
@CiaoHe you are right, this was not clear. Fixed in https://github.com/lucidrains/alphafold2/releases/tag/0.4.13!