caskcsg / ir

ConTextual Mask Auto-Encoder for Dense Passage Retrieval


Information Retrieval Research

Code and models for our information retrieval research papers.

Knowledge Computing and Service Group, Institute of Information Engineering, Chinese Academy of Sciences.

Releases

CoT-MAE-qc: Query-as-context Pre-training for Dense Passage Retrieval. A simple yet effective pre-training scheme for single-vector dense passage retrieval. (Accepted to the EMNLP 2023 main conference.)

CoT-MAE: ConTextual Mask Auto-Encoder for Dense Passage Retrieval. CoT-MAE is a Transformer-based masked auto-encoder pre-training architecture designed for dense passage retrieval. (Accepted to AAAI 2023.)
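For orientation, below is a minimal sketch of how a single-vector dense retriever of this kind is typically used at inference time: encode the query and the passages with a Transformer encoder, take the [CLS] vector as the dense representation, and rank passages by dot-product similarity. The checkpoint name is a placeholder (`bert-base-uncased`), not the released CoT-MAE weights; see the per-release instructions for the actual checkpoints and training code.

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Placeholder checkpoint -- substitute the released retriever checkpoint from this repo.
MODEL_NAME = "bert-base-uncased"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
encoder = AutoModel.from_pretrained(MODEL_NAME)
encoder.eval()

def encode(texts):
    """Encode texts into single-vector dense representations via [CLS] pooling."""
    batch = tokenizer(texts, padding=True, truncation=True,
                      max_length=128, return_tensors="pt")
    with torch.no_grad():
        out = encoder(**batch)
    return out.last_hidden_state[:, 0]  # [CLS] embedding for each text

query_emb = encode(["what is dense passage retrieval"])
passage_embs = encode([
    "Dense passage retrieval encodes queries and passages into dense vectors.",
    "Sparse retrieval relies on exact lexical matching such as BM25.",
])

# Rank passages by dot-product similarity with the query vector.
scores = query_emb @ passage_embs.T
print(scores)
```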

About

ConTextual Mask Auto-Encoder for Dense Passage Retrieval

License: Apache License 2.0


Languages

Python 90.4%, Shell 9.6%