OpenNMT / OpenNMT

Open Source Neural Machine Translation in Torch (deprecated)

Home Page: https://opennmt.net/


Max 1800 tokens per batch?

getao opened this issue · comments

commented

I found that in the latest version there seems to be a maximum limit of 1800 tokens per batch. Can I change this setting? I noticed that GPU memory is not fully used.

Yes, you can change this with -max_tokens.
It all depends on the size of your network and your GPU memory.
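For reference, a minimal sketch of a training invocation that raises the cap, assuming the standard Torch training script (train.lua); the data and model paths and the value 3000 are placeholders, not taken from this thread:

    # Raise the per-batch token cap from the default 1800 to 3000
    # (illustrative value; paths are placeholders for your own data)
    th train.lua -data data/demo-train.t7 -save_model demo-model \
                 -max_tokens 3000 -gpuid 1

A larger value packs more tokens into each batch and consumes more GPU memory, so if you hit out-of-memory errors, lower it again until training fits on your device.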