idiap / pkwrap

A pytorch wrapper for LF-MMI training and parallel training in Kaldi

RuntimeError: CUDA out of memory. Tried to allocate… but memory is empty

pchampio opened this issue · comments

I was constantly getting an out-of-memory error even though I reduced my minibatch size.

I've replaced

minibatch_size="1:64", # TODO: this should come from a config

and
minibatch_size="1:64",

to

minibatch_size=f"1:{chain_opts.minibatch_size.split(',')[-1]}",

Is this correct?
Thanks for this toolkit!
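For reference, a minimal sketch of what the replacement f-string evaluates to. The `ChainOpts` stand-in class and the `"128,64"` spec value are assumed examples for illustration, not pkwrap's actual defaults (note the single quotes inside `split(',')` — nesting double quotes inside a double-quoted f-string is a syntax error before Python 3.12):

```python
class ChainOpts:
    """Stand-in for pkwrap's chain options object (hypothetical)."""
    minibatch_size = "128,64"  # assumed comma-separated minibatch spec

chain_opts = ChainOpts()

# Take the last alternative from the comma-separated spec and build
# a "1:N" range string, as in the replacement line above.
spec = f"1:{chain_opts.minibatch_size.split(',')[-1]}"
print(spec)  # → 1:64
```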

Right, that would work too. But before that, I would test with a small minibatch size like 8 or 16. That should resolve your error, imo.

thanks!