Tao's repositories
PROP
PROP: Pre-training with Representative Words Prediction for Ad-hoc Retrieval
Apache-2.0
SimpleReDial-v1
The source code of the DR-BERT model and baselines
MIT
BERT_FP
Fine-grained Post-training for Improving Retrieval-based Dialogue Systems - NAACL 2021
DeepCT
DeepCT and HDCT use BERT to generate novel, context-aware bag-of-words term weights for documents and queries.
BSD-3-Clause