vivianzy1985's repositories
ALBERT_NER_KERAS
Sequence labeling with ALBERT and BiLSTM, implemented in Keras.
annoy
Approximate Nearest Neighbors in C++/Python optimized for memory usage and loading/saving to disk
bert-as-service
Mapping a variable-length sentence to a fixed-length vector using a BERT model
BERT-BiLSTM-CRF-NER
TensorFlow solution for the NER task using a BiLSTM-CRF model with Google BERT fine-tuning, plus private server services
bert-of-theseus-tf
TensorFlow version of BERT-of-Theseus
bert-utils
Generate sentence embeddings with BERT in one line of code; use BERT for text classification and text-similarity computation
bert4keras
Keras implementation of Transformers, for humans
CasRel
A Novel Cascade Binary Tagging Framework for Relational Triple Extraction. Accepted by ACL 2020.
Chatbot_CN
A chatbot for the finance and judicial domains (with chit-chat capability). Its main modules include information extraction, NLU, NLG, and a knowledge graph; the front end is integrated via Django, and RESTful APIs for the NLP and KG modules have been packaged.
DocBank
DocBank: A Benchmark Dataset for Document Layout Analysis
EDA_NLP_for_Chinese
An implementation of the EDA paper for Chinese corpora: an EDA data-augmentation tool for Chinese text. NLP data augmentation, with paper-reading notes.
ESIM-keras
Keras implementation of the paper "Enhanced LSTM for Natural Language Inference", tested on the Quora Question Pairs dataset.
FastBERT
The source code of FastBERT (ACL 2020)
Financial-Knowledge-Graphs
A pipeline for building a small financial knowledge graph
Graph-Bert
Source code of Graph-Bert
Jiagu
Jiagu, a deep-learning NLP toolkit for Chinese: knowledge-graph relation extraction, word segmentation, POS tagging, named entity recognition, sentiment analysis, new-word discovery, keyword extraction, text summarization, and text clustering
K-BERT
Source code of K-BERT
keras-bert
Implementation of BERT that could load official pre-trained models for feature extraction and prediction
keras_recompute
Saving memory in Keras by recomputing activations
LeetCode-Solution-Well-Formed
My LeetCode solution records; practicing intensively.
nlpcda
One-click Chinese data-augmentation package: Chinese EDA, NLP data augmentation, and NER data augmentation. Install with `pip install nlpcda`.
OpenNRE
An Open-Source Package for Neural Relation Extraction (NRE)
pretrained-models
Open Language Pre-trained Model Zoo
pycorrector
pycorrector is a toolkit for text error correction. It was developed to facilitate the designing, comparing, and sharing of deep text error correction models.
sequence_tagging
Sequence tagging using BiLSTM-CRF, BERT, and other methods
simbert
A BERT model for retrieval and generation
unilm
UniLM - Unified Language Model Pre-training / Pre-training for NLP and Beyond