arvinChen's repositories
gpt2-pretrain
Pre-training a Chinese GPT-2 model
awesome-DeepLearning
Introductory and advanced deep learning courses, special-topic courses, academic and industry case studies, a deep learning knowledge encyclopedia, and an interview question bank
Bert-Chinese-Text-Classification-Pytorch
Chinese text classification using BERT and ERNIE
bert-cosine-sim
Fine-tune BERT to generate sentence embeddings for cosine similarity
BERT-Relation-Extraction
PyTorch implementation for "Matching the Blanks: Distributional Similarity for Relation Learning" paper
CasEE
Source code for the ACL 2021 Findings paper "CasEE: A Joint Learning Framework with Cascade Decoding for Overlapping Event Extraction"
CasRelPyTorch
A PyTorch reimplementation of the CasRel model (Jilin University), trained and evaluated on the Baidu relation extraction dataset.
ChatGLM3
ChatGLM3 series: open bilingual chat LLMs
Chinese-Text-Classification-Pytorch
Chinese text classification with TextCNN, TextRNN, FastText, TextRCNN, BiLSTM_Attention, DPCNN, and Transformer; PyTorch-based and ready to use out of the box.
deep-learning-for-image-processing
Deep learning for image processing, including classification, object detection, etc.
drf-tutorial
A quick start with Django REST framework: learn to build your own RESTful API service with auto-generated API documentation. Video tutorial:
DSSM-Pytorch
PyTorch implementation of DSSM (Deep Structured Semantic Models)
FewRel
A Large-Scale Few-Shot Relation Extraction Dataset
flask_demo
Flask practice exercises
GitHub-Chinese-Top-Charts
:cn: GitHub Chinese Top Charts: discover high-quality Chinese projects and learn from the community's best work more efficiently; the charts are updated weekly.
gpt2-summarization
Text summarization with GPT-2
learn-nlp-with-transformers
A repository illustrating the usage of Transformers, with materials in Chinese
LLaMA-Factory
Easy-to-use LLM fine-tuning framework (LLaMA, BLOOM, Mistral, Baichuan, Qwen, ChatGLM)
ML
machine learning
nlp-tutorial
Natural Language Processing Tutorial for Deep Learning Researchers
rasa_chinese_book_code
The official companion code for the book Rasa in Action: Building Open Source Conversational AI
Sbert-ChineseExample
A Sentence-Transformers information retrieval example for Chinese
SimCSE-Chinese-Pytorch
A reproduction of SimCSE for Chinese, covering both supervised and unsupervised settings
transformer
PyTorch implementation of "Attention Is All You Need"
transformers
🤗 Transformers: state-of-the-art machine learning for PyTorch, TensorFlow, and JAX.
unilm
Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities