liyong1995 / BertModel

Generate sentence vectors or word vectors from a pretrained BERT model.

Pretrained BERT models

Google's pretrained BERT models:
BERT-Large, Uncased (Whole Word Masking): 24-layer, 1024-hidden, 16-heads, 340M parameters
BERT-Large, Cased (Whole Word Masking): 24-layer, 1024-hidden, 16-heads, 340M parameters
BERT-Base, Uncased: 12-layer, 768-hidden, 12-heads, 110M parameters
BERT-Large, Uncased: 24-layer, 1024-hidden, 16-heads, 340M parameters
BERT-Base, Cased: 12-layer, 768-hidden, 12-heads, 110M parameters
BERT-Large, Cased: 24-layer, 1024-hidden, 16-heads, 340M parameters
BERT-Base, Multilingual Cased (New, recommended): 104 languages, 12-layer, 768-hidden, 12-heads, 110M parameters
BERT-Base, Multilingual Uncased (Orig, not recommended; use Multilingual Cased instead): 102 languages, 12-layer, 768-hidden, 12-heads, 110M parameters
BERT-Base, Chinese: Chinese Simplified and Traditional, 12-layer, 768-hidden, 12-heads, 110M parameters

  1. Download the pretrained model you need.
  2. Set the model paths in conf.py.
  3. Call gen_sen_vec() in extract_sen_vec.py to generate sentence vectors, or gen_word_vec() to generate word vectors.
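The repo's gen_sen_vec() implementation is not shown here, but a common way to turn BERT's per-token output vectors into a single sentence vector is masked mean pooling: average only the real tokens and ignore padding. The NumPy sketch below illustrates that step; the function name, shapes, and toy values are illustrative assumptions, not the repo's actual API.

```python
import numpy as np

def mean_pool(token_vecs: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Average the vectors of real (non-padding) tokens.

    token_vecs: (seq_len, hidden) per-token BERT outputs
    mask:       (seq_len,) with 1 for real tokens, 0 for padding
    """
    m = mask[:, None].astype(token_vecs.dtype)  # (seq_len, 1) for broadcasting
    summed = (token_vecs * m).sum(axis=0)       # zero out padding rows, then sum
    count = m.sum()                             # number of real tokens
    return summed / count                       # (hidden,) sentence vector

# Toy example: 4 tokens (last one is padding), hidden size 3.
vecs = np.array([[1., 2., 3.],
                 [3., 2., 1.],
                 [2., 2., 2.],
                 [9., 9., 9.]])  # padding row, must be ignored
mask = np.array([1, 1, 1, 0])
sen_vec = mean_pool(vecs, mask)  # -> [2., 2., 2.]
```

An alternative, also widely used, is to take the vector of the [CLS] token as the sentence representation; mean pooling simply trades that for an average over the whole sequence.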

Languages

Language: Python 100.0%