There are 29 repositories under the pre-trained-model topic.
Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
A Lite BERT for Self-Supervised Learning of Language Representations; massive-scale Chinese pre-trained ALBERT models
Language Understanding Evaluation benchmark for Chinese: datasets, baselines, pre-trained models, corpus, and leaderboard
An Open-Source Framework for Prompt-Learning.
[CVPR 2021] Involution: Inverting the Inherence of Convolution for Visual Recognition, a brand new neural operator
Pre-trained Chinese ELECTRA models
Official Keras & PyTorch Implementation and Pre-trained Models for Models Genesis - MICCAI 2019
PyTorch code for "Prototypical Contrastive Learning of Unsupervised Representations"
PERT: Pre-training BERT with Permuted Language Model
A collection of Audio and Speech pre-trained models.
Code of the CVPR 2021 Oral paper: A Recurrent Vision-and-Language BERT for Navigation
PyTorch implementation of Lambda Network and pretrained Lambda-ResNet
Source code for our EMNLP'21 paper "Raise a Child in Large Language Model: Towards Effective and Generalizable Fine-tuning"
Official repository of the AAAI'2022 paper "GALAXY: A Generative Pre-trained Model for Task-Oriented Dialog with Semi-Supervised Learning and Explicit Policy Injection"
Pretrained model for Chinese Scientific Text
Simple, fast, and easy to read. Yes, we use the PyTorch framework!
Papers and datasets on cross-domain recommendation, transfer learning, pre-training, and self-supervised learning
Thank you BART! Rewarding Pre-Trained Models Improves Formality Style Transfer (ACL 2021)
Universal Joint Feature Extraction for P300 EEG Classification Using Multi-Task Autoencoder (IEEE Access)
Code for reproducing the experiments on large-scale pre-training and transfer learning for the paper "Effect of large-scale pre-training on full and few-shot transfer learning for natural and medical images" (https://arxiv.org/abs/2106.00116)
Dataset and code for "EATN: An Efficient Adaptive Transfer Network for Aspect-level Sentiment Analysis"
Meta-Learning for EEG, Sleep Staging, Transfer Learning, Pre-trained EEG, PSG datasets (IEEE Journal of Biomedical and Health Informatics)
Performance testing of 24 Machine Learning models on Raspberry Pi using TensorFlow Lite and Google Coral USB Accelerator
Powerful handwritten text recognition. A simple-to-use, unofficial implementation of the paper "TrOCR: Transformer-based Optical Character Recognition with Pre-trained Models".
Brain Tumor Classification Using Pre-trained Models
code for Scaling Laws for Language Transfer Learning
ml5 - Simple Image Classification using MobileNet
PyTorch implementation of LS-CNN: Characterizing Local Patches at Multiple Scales for Face Recognition
A collection of multi-quality policies for continuous control tasks.