MSA attention gated problem
CiaoHe opened this issue · comments
He Cao commented
Hi Phil:
Comparing the 'gating' unit with the original algorithm:
alphafold2/alphafold2_pytorch/alphafold2.py
Line 147 in 586792d
I think the gating here is just missing a 'sigmoid()'; could you please check?
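For context, a minimal sketch of the gated attention pattern being discussed: in the AlphaFold2 paper, the per-channel gate values are passed through a sigmoid before being multiplied into the attention output. The class and projection names below are illustrative, not the repository's actual code.

```python
import torch
import torch.nn as nn

class GatedAttention(nn.Module):
    """Illustrative gated self-attention; gates go through sigmoid()."""
    def __init__(self, dim, heads=8, dim_head=64):
        super().__init__()
        inner = heads * dim_head
        self.heads = heads
        self.scale = dim_head ** -0.5
        self.to_qkv = nn.Linear(dim, inner * 3, bias=False)
        self.to_gates = nn.Linear(dim, inner)  # gate projection (hypothetical name)
        self.to_out = nn.Linear(inner, dim)

    def forward(self, x):
        b, n, _ = x.shape
        h = self.heads
        q, k, v = self.to_qkv(x).chunk(3, dim=-1)
        # split heads: (b, n, h * d) -> (b, h, n, d)
        q, k, v = (t.reshape(b, n, h, -1).transpose(1, 2) for t in (q, k, v))
        attn = (q @ k.transpose(-2, -1) * self.scale).softmax(dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(b, n, -1)
        # the point raised in the issue: gates must pass through sigmoid()
        # before multiplying the attention output
        gates = self.to_gates(x).sigmoid()
        return self.to_out(out * gates)
```

Without the `sigmoid()`, the raw linear projection is unbounded, so it no longer acts as a (0, 1) gate on the attention output.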
Best,
Phil Wang commented