[TVM Integration] Support TVM conversion with FP16 data type
sxjscience opened this issue
Xingjian Shi commented
Description
We recently fixed the TVM integration in GluonNLP for the fp32 dtype, but fp16 is still unsupported. We should:
- Revise the test to add FP16:
gluon-nlp/tests/test_models.py
Lines 65 to 77 in e8d4c8a
- Revise the benchmark to add TVM + FP16:
gluon-nlp/scripts/benchmarks/benchmark_utils.py
Lines 610 to 623 in e8d4c8a
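As a rough sketch of what the dtype-parametrized check could look like: the key change is that the numerical comparison between the TVM output and the MXNet/GluonNLP reference needs dtype-dependent tolerances, since fp16 accumulates much more rounding error than fp32. The helper names and tolerance values below are illustrative assumptions, not taken from the repository.

```python
import numpy as np

def tolerances(dtype):
    """Comparison tolerances for verifying TVM output against the
    framework reference. fp16 needs looser bounds than fp32.
    (Values are illustrative, not from the GluonNLP test suite.)"""
    if dtype == "float16":
        return dict(rtol=1e-2, atol=1e-2)
    return dict(rtol=1e-4, atol=1e-4)

def check_outputs(tvm_out, ref_out, dtype):
    """Assert that the compiled-model output matches the reference
    within the dtype-appropriate tolerance."""
    np.testing.assert_allclose(tvm_out, ref_out, **tolerances(dtype))

# Example: an fp16 round-trip of fp32 data stays within the fp16
# tolerance but would typically violate the tighter fp32 tolerance.
ref = np.random.RandomState(0).uniform(size=(4, 8)).astype("float32")
approx = ref.astype("float16").astype("float32")  # simulates fp16 precision loss
check_outputs(approx, ref, "float16")
```

In the actual test this would pair with a `dtype` entry added to the existing `pytest.mark.parametrize` list, so each model is compiled and compared once per dtype.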