yxuansu / TaCL

[NAACL'22] TaCL: Improving BERT Pre-training with Token-aware Contrastive Learning

Home Page: https://arxiv.org/abs/2111.04198

How can I get the embeddings?

mathshangw opened this issue · comments

Excuse me, how can I extract embeddings for specific sentences using your model? I previously used keras-bert, which provides an `extract_embeddings` function, and I would like to use your model as an embedding layer in Keras.
Appreciate your help, thanks.
Hi, thank you for your interest in our work. Could you be more specific about which layer's embeddings you would like to extract?

Feel free to reopen the issue.

Sorry for the late reply. I need to extract word embeddings.
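
For anyone landing here with the same question: since TaCL is a standard BERT encoder, its embeddings can be pulled out with the Hugging Face `transformers` API. Below is a minimal sketch, not an official snippet from the authors; the checkpoint id `cambridgeltl/tacl-bert-base-uncased` is assumed from the README and should be replaced with whichever checkpoint you actually use.

```python
# Minimal sketch: extract token/word embeddings from a TaCL BERT checkpoint.
# The Hub id below is an assumption; swap in your own checkpoint path if needed.
import torch
from transformers import AutoTokenizer, AutoModel

model_name = "cambridgeltl/tacl-bert-base-uncased"  # assumed Hub id
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
model.eval()

sentence = "TaCL improves BERT pre-training with token-aware contrastive learning."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Contextualized token embeddings from the last Transformer layer:
# shape (1, sequence_length, hidden_size), one vector per word-piece token.
token_embeddings = outputs.last_hidden_state

# Static (non-contextual) word-piece embeddings from the input embedding matrix.
word_embeddings = model.get_input_embeddings()(inputs["input_ids"])

# A single sentence vector via mean pooling over non-padding tokens.
mask = inputs["attention_mask"].unsqueeze(-1)
sentence_embedding = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1)

print(token_embeddings.shape, word_embeddings.shape, sentence_embedding.shape)
```

If you need these vectors inside a Keras model, one option is to run this PyTorch forward pass offline and feed the resulting arrays to Keras as precomputed features, rather than wrapping the model as a Keras layer directly.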