ai-forever / ner-bert

BERT-NER (ner-bert) with Google BERT https://github.com/google-research.


This model uses a lot of memory

EricAugust opened this issue

After I train my model, loading BERT, the trained model, and the data together needs about 4.5 GB of memory, which is very large.
In this situation it is very hard to deploy online.
So is there any way to reduce memory use?

Reduce the number of your parameters ))

The BERT embedder alone needs about 2 GB of GPU memory (in eval mode, without any additional layers on top).
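For reference, here is a minimal sketch of the standard ways to cut inference memory: eval mode, `torch.no_grad()`, and fp16 weights, which roughly halve the GPU footprint. It assumes the Hugging Face `transformers` package rather than this repo's own loaders, and the model/tokenizer names are just illustrative choices:

```python
# Sketch: memory-lean BERT inference (assumes `transformers`, not this repo's code).
import torch
from transformers import BertModel, BertTokenizer

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

tokenizer = BertTokenizer.from_pretrained("bert-base-multilingual-cased")
model = BertModel.from_pretrained("bert-base-multilingual-cased")
model.eval()                # inference mode: disables dropout
if device.type == "cuda":
    model = model.half()    # fp16 weights roughly halve GPU memory
model.to(device)

inputs = tokenizer("Steve Jobs founded Apple.", return_tensors="pt").to(device)
with torch.no_grad():       # don't keep activations for backprop
    outputs = model(**inputs)

print(outputs.last_hidden_state.shape)
if device.type == "cuda":
    print(f"GPU memory allocated: {torch.cuda.memory_allocated() / 2**30:.2f} GiB")
```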

I only added one LSTM layer with 256 hidden units and a CRF layer.

That is the reality. Otherwise, you have to reduce the number of parameters.
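A rough illustration of why shrinking the head barely helps: counting the parameters of a hypothetical 256-unit LSTM head (standing in for the LSTM + CRF described above, with an assumed tag-set size) shows it is tiny next to the ~110 M parameters of bert-base, so the BERT body, not the head, dominates memory:

```python
import torch.nn as nn

def count_params(module: nn.Module) -> int:
    """Total number of parameters in a module."""
    return sum(p.numel() for p in module.parameters())

# Hypothetical head matching the description above: one LSTM with 256
# hidden units over 768-dim bert-base outputs, plus a linear layer
# standing in for the CRF emission scores (tag-set size is assumed).
lstm = nn.LSTM(input_size=768, hidden_size=256, batch_first=True)
emissions = nn.Linear(256, 9)  # 9 tags assumed, e.g. BIO over 4 entity types + O

head_params = count_params(lstm) + count_params(emissions)
print(f"LSTM + emission head: ~{head_params / 1e6:.2f} M parameters")
print("bert-base encoder:    ~110 M parameters (published figure)")
```

The head comes out around 1 M parameters, two orders of magnitude smaller than the encoder, which is why fp16 inference or a smaller pretrained encoder saves far more memory than trimming the LSTM or CRF.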