OpenNMT / CTranslate2

Fast inference engine for Transformer models

Home Page: https://opennmt.net/CTranslate2

faster-whisper inferencing problem

glory03023 opened this issue

Hello, how are you?
I am building a faster-whisper Windows POC with CTranslate2.
I created a ctranslate2::models::Whisper object named whisper_pool and wrote code like this:

std::vector<std::future<ctranslate2::models::WhisperGenerationResult>> results;
results = whisper_pool.generate(features, prompts, whisper_options);

How can I get the result strings from the results object?
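
For reference, here is roughly how I set everything up before the call. The model path, feature shape, and mel data below are placeholders rather than my real values, and I am not fully sure the constructor arguments are exactly right:

#include <ctranslate2/models/whisper.h>
#include <ctranslate2/storage_view.h>
#include <string>
#include <vector>

int main() {
  // Model directory produced by the CTranslate2 converter (placeholder path).
  ctranslate2::models::Whisper whisper_pool("whisper-base-ct2", ctranslate2::Device::CPU);

  // Dummy mel features: batch of 1, 80 mel bins, 3000 frames (one 30-second chunk).
  std::vector<float> mel(1 * 80 * 3000, 0.f);
  ctranslate2::StorageView features({1, 80, 3000}, mel);

  // One prompt per batch item, using the standard Whisper special tokens.
  std::vector<std::vector<std::string>> prompts = {
    {"<|startoftranscript|>", "<|en|>", "<|transcribe|>", "<|notimestamps|>"}
  };

  ctranslate2::models::WhisperOptions whisper_options;
  whisper_options.beam_size = 5;

  auto results = whisper_pool.generate(features, prompts, whisper_options);
  return 0;
}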

To get the results, you can follow this minimal example:

std::vector<ctranslate2::models::WhisperGenerationResult> outputs;
for (auto& result : results) {
  outputs.push_back(result.get());
}
// read the results from outputs
...
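
Each WhisperGenerationResult then holds the generated tokens. If I remember the struct correctly, it mirrors the Python API (sequences with token strings, sequences_ids with token ids), so something like this reads the best hypothesis:

#include <iostream>
#include <string>

// Sketch: print the first (best) hypothesis of each result.
// Assumes the C++ struct exposes sequences / sequences_ids like the Python API.
for (const auto& output : outputs) {
  std::string text;
  for (const auto& token : output.sequences[0])
    text += token;
  // Whisper uses byte-level BPE, so concatenating token strings only gives a rough
  // preview; for clean text, decode output.sequences_ids[0] with the Whisper
  // tokenizer (CTranslate2 itself does not ship one).
  std::cout << text << std::endl;
}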

Hi @minhthuc2502, thank you for your guidance.
Have you ever tried to run faster-whisper with pure C/C++ code?
I have made a Visual Studio project for the Windows POC.
Here is the link: https://github.com/glory03023/faster-whisper-windows.git
I am still getting an error when running it. Could you please take a look and guide me on what I should do to get it running?