Purfview / whisper-standalone-win

Whisper & Faster-Whisper standalone executables for those who don't want to bother with Python.

Support for other language models

pulpul-s opened this issue

Is it possible to support models from, for example, Hugging Face? I'd like to try a Finnish-optimized model: https://huggingface.co/Finnish-NLP/whisper-large-finnish-v3

Whisper-Faster> .\whisper-faster.exe --model fin-large-v3 --sentence --language Finnish '..\test.mp4'

Standalone Faster-Whisper r167.4 running on: CUDA


Warning: 'large-v3' model may produce inferior results, better use 'large-v2'!

Traceback (most recent call last):
  File "D:\whisper-fast\__main__.py", line 1104, in <module>
  File "D:\whisper-fast\__main__.py", line 972, in cli
  File "faster_whisper\transcribe.py", line 136, in __init__
RuntimeError: Unsupported model binary version. This executable supports models with binary version v6 or below, but the model has binary version v149936. This usually means that the model was generated by a later version of CTranslate2. (Forward compatibility is not guaranteed.)
[35420] Failed to execute script '__main__' due to unhandled exception!

You can use fine-tuned models with Faster-Whisper, but the model in your link is for the original OpenAI Whisper.
To use it as-is, you can use the standalone Whisper build -> https://github.com/Purfview/whisper-standalone-win/releases/tag/Whisper-OpenAI

Or you can convert that model to be compatible with Faster-Whisper; there is a guide here: https://github.com/SYSTRAN/faster-whisper#model-conversion
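For the conversion route, a minimal sketch of the steps from that guide, assuming Python is available and that the linked repository is in the Hugging Face Transformers format (the output folder name and quantization below are just example choices):

# Install the conversion dependencies (ct2-transformers-converter ships with the ctranslate2 package)
pip install ctranslate2 "transformers[torch]>=4.23"

# Convert the Hugging Face model to the CTranslate2 format that Faster-Whisper loads
ct2-transformers-converter --model Finnish-NLP/whisper-large-finnish-v3 --output_dir whisper-large-finnish-v3-ct2 --copy_files tokenizer.json preprocessor_config.json --quantization float16

Note that the --copy_files list depends on which files the repository actually contains, and how you point whisper-faster.exe at the converted folder depends on your build, so check whisper-faster.exe --help for the local-model options.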