Zhimin Lin's repositories
tianchi-gaic-track3-share
Tianchi AI Innovation Competition, Track 3: share from Zhou Xingxing of team ch12hu
libtorch_tokenizer
BERT Tokenizer in C++
pyspellchecker
Pure Python Spell Checking http://pyspellchecker.readthedocs.io/en/latest/
2021_QQ_AIAC_Tack1_1st
1st-place solution for Track 1 of the QQ Browser 2021 AI Algorithm Competition
albumentations
Fast image augmentation library and easy to use wrapper around other libraries. Documentation: https://albumentations.ai/docs/ Paper about library: https://www.mdpi.com/2078-2489/11/2/125
chatglm_finetuning
ChatGLM-6B fine-tuning and Alpaca fine-tuning
EasyTransfer
EasyTransfer is designed to make the development of transfer learning in NLP applications easier.
EfficientNet-PyTorch
A PyTorch implementation of EfficientNet
gaic_track3_pair_sim
Champion solution for Track 3 of the Global AI Technology Innovation Competition
gpt-fast
Simple and efficient PyTorch-native transformer text generation in under 1,000 lines of Python.
HowToCook
A guide to cooking at home for programmers.
KDD_WinnieTheBest
First-place solution to the Multimodalities Recall track of the KDD Cup 2020 Challenges for Modern E-Commerce Platform
Landmark2019-1st-and-3rd-Place-Solution
The 1st Place Solution of the Google Landmark 2019 Retrieval Challenge and the 3rd Place Solution of the Recognition Challenge.
LLaMA-Factory
Easy-to-use LLM fine-tuning framework (LLaMA, BLOOM, Mistral, Baichuan, Qwen, ChatGLM)
MedQA-ChatGLM
🛰️ Fine-tuning ChatGLM on real medical dialogue data with LoRA, P-Tuning V2, Freeze, RLHF, and more; our sights are set beyond medical QA
NeZha_Chinese_PyTorch
NEZHA: Neural Contextualized Representation for Chinese Language Understanding
OpenRLHF
A Ray-based High-performance RLHF framework
Pretrained-Language-Model
Pretrained language model and its related optimization techniques developed by Huawei Noah's Ark Lab.
pretrained-models.pytorch
Pretrained ConvNets for PyTorch: NASNet, ResNeXt, ResNet, InceptionV4, InceptionResNetV2, Xception, DPN, etc.
pytorch-image-models
PyTorch image models, scripts, pretrained weights -- ResNet, ResNeXt, EfficientNet, EfficientNetV2, NFNet, Vision Transformer, MixNet, MobileNet-V3/V2, RegNet, DPN, CSPNet, and more
sentence-transformers
Multilingual Sentence & Image Embeddings with BERT
TextBrewer
A PyTorch-based knowledge distillation toolkit for natural language processing
x-transformers
A simple but complete full-attention transformer with a set of promising experimental features from various papers