Fangkai Jiao's repositories
Self-Training-MRC
This is the PyTorch implementation of the ACL 2020 long paper: A Self-Training Method for Machine Reading Comprehension with Soft Evidence Extraction.
KK-s-Paperlist
A list of papers on machine learning, reinforcement learning, NLP, and other interesting topics
Retrieval-based-Pre-training-for-Machine-Reading-Comprehension
Source code of the paper - REPT: Bridging Language Models and Machine Reading Comprehension via Retrieval-Based Pre-training
MemoReader
MemoReader: Large-Scale Reading Comprehension through Neural Memory Controller
MnemonicReader
A PyTorch implementation of Mnemonic Reader for the Machine Comprehension task
NLPUtilsInPython
Some NLP utilities in Python
retrieval-based-mrc-pretraining
REPT: Bridging Language Models and Machine Reading Comprehension via Retrieval-Based Pre-training
awesome-multimodal-ml
Reading list for research topics in multimodal machine learning
coqa-baselines
The baselines used in the CoQA paper
FlowQA
Implementation of conversational QA model: FlowQA (with slight improvement)
learning_to_retrieve_reasoning_paths
The official implementation of ICLR 2020, "Learning to Retrieve Reasoning Paths over Wikipedia Graph for Question Answering".
lianjia-beike-spider
Housing price spider for Lianjia (lianjia.com) and Beike (ke.com), collecting housing price data (residential communities, second-hand homes, rentals, new homes) from 21 major cities including Beijing, Shanghai, Guangzhou, and Shenzhen. Stable, reliable, and fast! Supports storage in CSV, MySQL, MongoDB, Excel, and JSON; supports Python 2 and 3; visualizes data with charts; richly commented 🚁. Star to support!
MnemonicReaderAllennlp
PyTorch implementation of MnemonicReader based on Allennlp
NLP-Conferences-Code
NLP-Conferences-Code (ACL, EMNLP, NAACL, COLING, AAAI, IJCAI)
NLP-progress
Repository to track the progress in Natural Language Processing (NLP), including the datasets and the current state-of-the-art for the most common NLP tasks.
pytorch-template
A simple research template for fast coding
pytorch-tutorial
PyTorch Tutorial for Deep Learning Researchers
strategyqa
The official code of TACL 2021, "Did Aristotle Use a Laptop? A Question Answering Benchmark with Implicit Reasoning Strategies".
trade-dst
Source code for transferable dialogue state generator (TRADE, Wu et al., 2019). https://arxiv.org/abs/1905.08743