Daniel-Jiang358 / pytorch_d2l_transformer_repo

Transformer repo based on PyTorch and d2l


parallel calculation

V838Mon opened this issue · comments

Pardon me. Could you show me how to implement the parallel calculation of the multiple heads in the self-attention block using an explicit for loop? I have to implement it with for loops rather than the batched reshape. Thank you!
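For reference, here is a minimal sketch (not the repo's own code) of multi-head self-attention where each head is computed inside an explicit Python for loop instead of d2l's reshape-based batching; the class name, tensor shapes, and hyperparameters below are my own assumptions for illustration.

```python
import math
import torch
from torch import nn


class LoopedMultiHeadAttention(nn.Module):
    """Multi-head self-attention computed head by head in a for loop (sketch)."""

    def __init__(self, d_model, num_heads, dropout=0.0):
        super().__init__()
        assert d_model % num_heads == 0
        self.num_heads = num_heads
        self.d_head = d_model // num_heads
        self.W_q = nn.Linear(d_model, d_model, bias=False)
        self.W_k = nn.Linear(d_model, d_model, bias=False)
        self.W_v = nn.Linear(d_model, d_model, bias=False)
        self.W_o = nn.Linear(d_model, d_model, bias=False)
        self.dropout = nn.Dropout(dropout)

    def forward(self, X):
        # X: (batch_size, seq_len, d_model)
        Q, K, V = self.W_q(X), self.W_k(X), self.W_v(X)
        head_outputs = []
        for h in range(self.num_heads):
            # Slice out the h-th head's subspace of size d_head.
            s = slice(h * self.d_head, (h + 1) * self.d_head)
            q, k, v = Q[..., s], K[..., s], V[..., s]
            # Scaled dot-product attention for this head only.
            scores = torch.bmm(q, k.transpose(1, 2)) / math.sqrt(self.d_head)
            weights = self.dropout(torch.softmax(scores, dim=-1))
            head_outputs.append(torch.bmm(weights, v))
        # Concatenate the per-head results along the feature dimension.
        return self.W_o(torch.cat(head_outputs, dim=-1))


# Quick shape check with random data.
attn = LoopedMultiHeadAttention(d_model=32, num_heads=4)
X = torch.randn(2, 10, 32)
print(attn(X).shape)  # torch.Size([2, 10, 32])
```

Note that the loop gives the same result as the usual "parallel" implementation, which merges the head dimension into the batch dimension (as d2l does with `transpose_qkv`) so all heads are computed in one batched matrix multiply; the loop version is just easier to read and slower.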