hlt-mt / FBK-fairseq

Repository containing the open source code of works published at the FBK MT unit.

About NE emb

Crabbit-F opened this issue · comments

Would you mind explaining 'NE emb' in detail? How do I get its vector? Thanks!

Hi, what do you mean exactly? The "+ NE emb." architecture from the paper corresponds to the conformer_with_tags architecture in the code with the --add-tags-embeddings argument (see the training command in https://github.com/hlt-mt/FBK-fairseq/blob/master/fbk_works/JOINT_ST_NER2023.md#parallel-joint-st-and-ner).

In the code, this corresponds to the ConformerWithTagsModel model, and the learned weights for the NE tags can be taken from the decoder (TransformerDecoderWithTags), specifically from its tags_embeddings field.
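If it helps, here is a minimal sketch of how those weights could be pulled out of a saved checkpoint. Note the state-dict key "decoder.tags_embeddings.weight" is an assumption based on the tags_embeddings field name in TransformerDecoderWithTags; inspect your checkpoint's keys if it differs. The demo below builds a dummy checkpoint so the snippet is self-contained.

```python
import os
import tempfile

import torch


def extract_tag_embeddings(checkpoint_path, key="decoder.tags_embeddings.weight"):
    """Load a checkpoint and return the NE tag embedding matrix.

    The key name is an assumption; fairseq checkpoints usually nest the
    model weights under the "model" entry of the saved dict.
    """
    state = torch.load(checkpoint_path, map_location="cpu")
    model_state = state.get("model", state)
    return model_state[key]


# Demo with a dummy checkpoint (hypothetical: 5 NE tags, embedding dim 8).
dummy = {"model": {"decoder.tags_embeddings.weight": torch.randn(5, 8)}}
with tempfile.NamedTemporaryFile(suffix=".pt", delete=False) as f:
    torch.save(dummy, f.name)
    path = f.name

emb = extract_tag_embeddings(path)
print(tuple(emb.shape))  # (5, 8): one row per NE tag
os.remove(path)
```

Each row of the returned matrix is the learned vector for one NE tag; the row order follows the tag dictionary used at training time.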

Thanks a lot! The ConformerWithTagsModel model is helpful to me.

Glad that it helped.