Vaibhavs10 / insanely-fast-whisper


torch_dtype only for torch.float16?

yumianhuli1 opened this issue · comments

Does inference currently only support `torch_dtype=torch.float16`? Will `int8_float16` and `int8` be supported?
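For context, `int8_float16` and `int8` are quantization modes from CTranslate2 (as used by faster-whisper), not PyTorch dtypes, so they cannot be passed to `torch_dtype` directly; in `transformers`, 8-bit weights come from bitsandbytes instead. A minimal sketch of the distinction (the commented pipeline call shows typical `transformers` usage, not necessarily this repo's exact code):

```python
import torch

# torch_dtype accepts real torch dtypes; float16 halves memory vs. float32.
half = torch.zeros(4, dtype=torch.float16)
assert half.element_size() == 2  # 2 bytes per element

# torch.int8 exists as a storage dtype, but "int8_float16" does not --
# that name is a CTranslate2 quantization mode, not a torch.dtype.
assert hasattr(torch, "int8") and not hasattr(torch, "int8_float16")

# With transformers, 8-bit loading goes through bitsandbytes rather than
# torch_dtype (assumption: CUDA GPU and bitsandbytes installed):
#
# from transformers import pipeline
# asr = pipeline(
#     "automatic-speech-recognition",
#     model="openai/whisper-large-v3",
#     torch_dtype=torch.float16,            # fp16 path
#     model_kwargs={"load_in_8bit": True},  # int8 path (bitsandbytes)
#     device="cuda:0",
# )
```

So fp16 works today via `torch_dtype`, while int8-style modes would need a separate quantization backend rather than a dtype flag.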