Question about the code
teorzhang opened this issue
Hwq commented
What do the "low" and "high" in low_encoder and high_encoder stand for in the code?
Hannibal046 commented
low: low-level features (token level)
high: high-level features (utterance level)
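To make the two levels concrete, here is a minimal sketch of a hierarchical encoding pass. All names and shapes are hypothetical (not taken from the SDDS code): a token-level encoder produces per-token states, which are pooled into one vector per utterance and then fed to an utterance-level encoder. Real encoders would be Transformer layers; identity functions stand in here.

```python
import numpy as np

# Hypothetical shapes: a dialogue of 3 utterances, each padded to 4 tokens,
# with 8-dimensional token embeddings.
rng = np.random.default_rng(0)
token_emb = rng.normal(size=(3, 4, 8))  # (utterances, tokens, hidden)

def low_encoder(x):
    # Stand-in for the token-level ("low") encoder: identity here.
    return x

def pool_utterances(token_states):
    # Mean-pool token states into one vector per utterance.
    return token_states.mean(axis=1)

def high_encoder(utt_states):
    # Stand-in for the utterance-level ("high") encoder: identity here.
    return utt_states

low_out = low_encoder(token_emb)      # token-level features
utt_vecs = pool_utterances(low_out)   # (utterances, hidden)
high_out = high_encoder(utt_vecs)     # utterance-level features
print(high_out.shape)
```

The key point is only the data flow: "low" operates on tokens within an utterance, "high" operates on the resulting utterance vectors.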
Hwq commented
Got it, thanks for the reply. One more question: is the Static-Dynamic Fusion Module implemented in GraphTransformerMultiHeadAttentionLayer?
Hannibal046 commented
Hi, for the concrete implementation see:
SDDS/src/model/graphtransformer.py
Lines 194 to 197 in 8ba2a10
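As a rough illustration of what a static-dynamic fusion inside a graph-transformer attention layer can look like, here is a sketch under assumptions of my own (it is not the SDDS implementation; consult the lines referenced above for the actual code): "static" scores come from a fixed discourse-graph adjacency, "dynamic" scores are the usual content-based QKᵀ logits, and one plausible fusion adds the graph scores as a weighted bias before the softmax.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Hypothetical inputs: 3 utterance nodes in a discourse graph.
static_graph = np.array([[0., 1., 0.],
                         [1., 0., 1.],
                         [0., 1., 0.]])  # fixed adjacency ("static")
rng = np.random.default_rng(0)
q = rng.normal(size=(3, 8))
k = rng.normal(size=(3, 8))
dynamic_logits = q @ k.T / np.sqrt(8)   # content-based ("dynamic")

# Fuse: add the static graph scores as a bias, with a mixing weight
# that would be a trainable parameter in a real model.
lam = 0.5
fused_attn = softmax(dynamic_logits + lam * static_graph)
```

Each row of `fused_attn` is a valid attention distribution, but edges present in the static graph receive a boost regardless of content.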
Hwq commented
Got it, thanks.
Hannibal046 commented
For BART I also used the Huggingface version of the source code: https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/modeling_bart.py
For the baseline code, see:
https://github.com/huggingface/transformers/tree/main/examples/pytorch/summarization
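For reference, a typical invocation of the linked example script looks roughly like the following (a sketch based on the example's README; check the current README for the exact flags, and treat the dataset and model choices here as placeholders):

```shell
python examples/pytorch/summarization/run_summarization.py \
    --model_name_or_path facebook/bart-large \
    --dataset_name cnn_dailymail \
    --dataset_config "3.0.0" \
    --do_train \
    --do_eval \
    --per_device_train_batch_size 4 \
    --predict_with_generate \
    --output_dir /tmp/bart-summarization \
    --overwrite_output_dir
```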
Hwq commented
OK, thanks.