mit-han-lab/hardware-aware-transformers Issues
- Used version of `fairseq` (updated)
- lower loss but the BLEU is 0 (updated)
- About the Quantization Friendly (updated)
- Quantization on HAT (closed, 1 comment)
- RAM in the used Raspberry Pi (closed, 4 comments)
- One question (closed, 2 comments)
[ACL'20] HAT: Hardware-Aware Transformers for Efficient Natural Language Processing