pre-trained word embedding
TianlinZhang668 opened this issue
TianlinZhang668 commented
Hello, I would like to use pre-trained word embeddings. Which code should I change? Thank you.
Atul Kumar commented
You need to change the code here:
https://github.com/atulkum/pointer_summarizer/blob/master/data_util/data.py#L35
and here:
https://github.com/atulkum/pointer_summarizer/blob/master/training_ptr_gen/model.py#L45
You might need something like:
word_emb_matrix = get_word_embd(vocab, config)
embd_vector = torch.from_numpy(word_emb_matrix).float()
self.word_embeds = nn.Embedding.from_pretrained(embd_vector, freeze=False)
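For reference, here is a minimal sketch of what a get_word_embd helper could look like. The function name comes from the snippet above, but its signature and the vocab/pretrained interfaces are assumptions, not the repo's actual API: it builds a |V| x d NumPy matrix, copying pretrained vectors where available and randomly initializing out-of-vocabulary words, which you would then pass to torch.from_numpy as shown above.

```python
import numpy as np

def get_word_embd(vocab_words, pretrained, emb_dim):
    """Hypothetical helper: build a |V| x d embedding matrix.

    vocab_words: list of vocabulary words, index = embedding row
    pretrained:  dict mapping word -> np.ndarray of shape (emb_dim,),
                 e.g. parsed from a GloVe/word2vec text file
    """
    rng = np.random.default_rng(0)
    # Random init for words missing from the pretrained vectors
    matrix = rng.uniform(-0.1, 0.1, (len(vocab_words), emb_dim)).astype(np.float32)
    for i, word in enumerate(vocab_words):
        if word in pretrained:
            matrix[i] = pretrained[word]
    return matrix

# Toy example: only "the" has a pretrained vector
pretrained = {"the": np.ones(4, dtype=np.float32)}
vocab_words = ["[PAD]", "the", "cat"]
word_emb_matrix = get_word_embd(vocab_words, pretrained, 4)
print(word_emb_matrix.shape)  # (3, 4)
```

The resulting float32 matrix is exactly what the `torch.from_numpy(word_emb_matrix).float()` line above expects; set `freeze=False` in `nn.Embedding.from_pretrained` if you want the vectors fine-tuned during training.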
This link might help:
https://github.com/atulkum/sequence_prediction/blob/e1659b6414ca951a8229e737ae032a9040bde81d/neural_ner/model_utils.py#L81
Let me know how it goes.