lucidrains / point-transformer-pytorch

Implementation of the Point Transformer layer, in PyTorch


Question: the paper says self-attention is invariant to cardinality. How should this be understood?

swzaaaaaaa opened this issue
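
The claim refers to attention being a set operator: since the softmax weights are renormalized over whatever neighbor set is supplied, the output does not depend on the ordering of the points, and its scale does not grow with the number of points. Below is a minimal sketch of that property, using plain scalar dot-product attention rather than the paper's vector attention; the function `attention_pool` and all shapes are illustrative assumptions, not the repo's API:

```python
# Illustrative sketch (NOT the repo's layer): softmax-normalized attention
# over a neighbor set is a set operation. Permuting the neighbors, or
# changing how many there are, leaves the operation well-defined because
# the weights always renormalize to sum to 1 over the given set.
import torch

def attention_pool(q, k, v):
    # q: (d,) query; k, v: (n, d) for a neighbor set of arbitrary size n
    weights = torch.softmax(k @ q, dim=0)      # (n,), sums to 1 for any n
    return (weights.unsqueeze(-1) * v).sum(0)  # (d,) weighted mean of values

torch.manual_seed(0)
d = 8
q = torch.randn(d)
k, v = torch.randn(16, d), torch.randn(16, d)

out = attention_pool(q, k, v)
perm = torch.randperm(16)
out_perm = attention_pool(q, k[perm], v[perm])
print(torch.allclose(out, out_perm))  # True: neighbor order is irrelevant

out_small = attention_pool(q, k[:4], v[:4])  # smaller neighbor set (different cardinality)
print(out_small.shape)                       # torch.Size([8]): same output shape and scale
```

Because the output is a convex combination of the values for any set size, the operator behaves consistently whether a point has 4 neighbors or 16, which is the sense in which it is invariant to cardinality.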