There are 22 repositories under the self-training topic.
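As a rough orientation for readers new to the topic: the recipe these repositories share is a pseudo-labeling loop — train on labeled data, label the unlabeled points the model is confident about, and retrain on the enlarged set. A minimal sketch using a toy 1-D nearest-centroid classifier (all names and the confidence threshold are illustrative, not taken from any listed repo):

```python
def fit_centroids(points, labels):
    """Per-class mean of 1-D points (a toy nearest-centroid classifier)."""
    sums, counts = {}, {}
    for x, y in zip(points, labels):
        sums[y] = sums.get(y, 0.0) + x
        counts[y] = counts.get(y, 0) + 1
    return {y: sums[y] / counts[y] for y in sums}

def predict(centroids, x):
    """Return (label, confidence); confidence is the margin between
    the nearest and second-nearest class centroid, normalized to [0, 1]."""
    dists = sorted((abs(x - c), y) for y, c in centroids.items())
    (d0, y0), (d1, _) = dists[0], dists[1]
    margin = (d1 - d0) / (d1 + d0 + 1e-12)
    return y0, margin

def self_train(labeled, unlabeled, rounds=3, threshold=0.5):
    """Iteratively adopt confident pseudo-labels, then retrain."""
    data = list(labeled)        # (x, y) pairs
    pool = list(unlabeled)      # unlabeled x values
    for _ in range(rounds):
        centroids = fit_centroids(*zip(*data))
        keep = []
        for x in pool:
            y, conf = predict(centroids, x)
            if conf >= threshold:
                data.append((x, y))   # adopt the pseudo-label
            else:
                keep.append(x)        # leave it unlabeled for now
        pool = keep
    return fit_centroids(*zip(*data))
```

The listed projects (ST++, IAST, Noisy Student, etc.) refine exactly the fragile parts of this loop: how confidence is estimated, which pseudo-labels are kept, and how noise is injected between rounds.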
Transfer Learning Library for Domain Adaptation, Task Adaptation, and Domain Generalization
BOND: BERT-Assisted Open-Domain Named Entity Recognition with Distant Supervision
[CVPR 2022] ST++: Make Self-training Work Better for Semi-supervised Semantic Segmentation
A repository containing more than 12 implementations of common statistical machine learning algorithms (principles and implementations of common machine learning algorithms).
This repository collects papers for "A Survey on Knowledge Distillation of Large Language Models". We break down KD into Knowledge Elicitation and Distillation Algorithms, and explore the Skill & Vertical Distillation of LLMs.
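The distillation-algorithms side of that survey builds on the classic soft-target loss: the student matches the teacher's temperature-softened output distribution. A minimal sketch of that loss (function names are illustrative, not from the survey's code; the T² scaling follows Hinton et al.'s soft-target formulation):

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL(teacher || student) on softened distributions, scaled by T^2
    so gradients stay comparable across temperatures."""
    p = softmax(teacher_logits, temperature)   # fixed target
    q = softmax(student_logits, temperature)   # being trained
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl
```

The loss is zero when the student reproduces the teacher's logits and grows as their softened distributions diverge; in practice it is mixed with a standard cross-entropy term on ground-truth labels.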
PyTorch code for MUST
Invest your attention for 1,000 hours, and you can master anything you need.
IAST: Instance Adaptive Self-training for Unsupervised Domain Adaptation (ECCV 2020) https://teacher.bupt.edu.cn/zhuchuang/en/index.htm
Self6D++: Occlusion-Aware Self-Supervised Monocular 6D Object Pose Estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI) 2021.
SLAM-Supported Semi-Supervised Learning for 6D Object Pose Estimation
PyTorch implementation of our paper: Adapting OCR with Limited Labels
Exploring prompt tuning with pseudolabels for multiple modalities, learning settings, and training strategies.
[IEEE TETCI] "ADAST: Attentive Cross-domain EEG-based Sleep Staging Framework with Iterative Self-Training"
:speech_balloon: Official PyTorch Implementation for CVPR'23 Paper, "The Dialog Must Go On: Improving Visual Dialog via Generative Self-Training"
Self-Distillation with weighted ground-truth targets; ResNet and Kernel Ridge Regression
Earth observations, especially satellite data, have produced a wealth of methods and results for meeting global challenges, often presented in unstructured texts such as papers or reports. Accurate extraction of satellite and instrument entities from these unstructured texts can help link and reuse Earth observation resources.
Synthetic QA generation for long documents.
Deep Bayesian Self-Training [official implementation]
Implementation of COLING 2022 paper "Adaptive Unsupervised Self-training for Disfluency Detection"
[EACL 2021] Self-training Pretrained LMs for Zero- and Few-shot Arabic Sequence Labeling
Applied Noisy Student training to the in-bed classification task
Official implementation of paper "RelationMatch: Matching In-batch Relationships for Semi-supervised Learning" (https://arxiv.org/abs/2305.10397)