There are 25 repositories under the distribution-shift topic.
Lightweight, useful implementation of conformal prediction on real data.
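Conformal prediction, the technique this repository implements, can be illustrated with a minimal split-conformal sketch (generic NumPy code, not that repository's API; the function name and toy numbers below are illustrative):

```python
import numpy as np

def split_conformal_interval(cal_preds, cal_labels, test_pred, alpha=0.1):
    """Split conformal prediction: wrap a point prediction in an
    interval with ~(1 - alpha) marginal coverage, using a held-out
    calibration set of (prediction, label) pairs."""
    # Nonconformity scores: absolute residuals on the calibration set.
    scores = np.abs(cal_labels - cal_preds)
    n = len(scores)
    # Finite-sample-corrected quantile level ceil((n+1)(1-alpha))/n.
    q_level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
    q = np.quantile(scores, q_level, method="higher")
    return test_pred - q, test_pred + q

lo, hi = split_conformal_interval(
    cal_preds=np.array([1.0, 2.0, 3.0, 4.0]),
    cal_labels=np.array([1.1, 1.8, 3.3, 3.9]),
    test_pred=2.5,
)
# With these residuals (max 0.3), the interval is (2.2, 2.8).
```

The finite-sample correction is what gives the coverage guarantee; with only four calibration points the quantile level saturates at 1.0, so the interval uses the largest residual.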
Out-of-distribution detection, robustness, and generalization resources. This repository contains a professionally curated list of papers, tutorials, books, videos, articles, open-source libraries, and more.
Collection of awesome test-time (domain/batch/instance) adaptation methods
Frouros: an open-source Python library for drift detection in machine learning systems.
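Drift detection of the kind such libraries provide typically wraps a two-sample statistical test. A generic sketch (not Frouros's actual API; assumes SciPy is available) using the Kolmogorov-Smirnov test on a single feature:

```python
import numpy as np
from scipy.stats import ks_2samp

def detect_drift(reference, current, alpha=0.05):
    """Flag drift when the current window's distribution differs from
    the training-time reference at significance level alpha, per a
    two-sample Kolmogorov-Smirnov test."""
    stat, p_value = ks_2samp(reference, current)
    return p_value < alpha

rng = np.random.default_rng(0)
reference = rng.normal(0.0, 1.0, size=2000)   # training-time feature
same = rng.normal(0.0, 1.0, size=500)         # fresh draw, no shift
shifted = rng.normal(1.5, 1.0, size=500)      # mean shift of 1.5 sigma

drift_on_same = detect_drift(reference, same)
drift_on_shifted = detect_drift(reference, shifted)  # True
```

In a production system this check runs per feature over sliding windows; multivariate and streaming detectors replace the KS test but keep the same reference-vs-current structure.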
A repository and benchmark for online test-time adaptation.
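One of the simplest baselines such test-time adaptation benchmarks include is recalibrating batch-norm statistics on the unlabeled test batch. A NumPy sketch under stated assumptions (the function name, momentum value, and toy batch are illustrative, not any benchmark's code):

```python
import numpy as np

def bn_adapt(features, train_mean, train_var, momentum=0.3, eps=1e-5):
    """Blend training-time running statistics with the test batch's
    own mean/variance, then normalize with the blended statistics.
    momentum controls how much the test batch is trusted."""
    batch_mean = features.mean(axis=0)
    batch_var = features.var(axis=0)
    mean = (1 - momentum) * train_mean + momentum * batch_mean
    var = (1 - momentum) * train_var + momentum * batch_var
    return (features - mean) / np.sqrt(var + eps)

# Toy test batch whose features drifted away from the training
# statistics (train_mean=0, train_var=1).
test_batch = np.array([[5.0, -3.0], [6.0, -2.0], [7.0, -1.0]])
adapted = bn_adapt(test_batch, train_mean=np.zeros(2), train_var=np.ones(2))
```

Blending (rather than fully replacing) the statistics is a common compromise: small test batches make pure batch statistics noisy, while pure training statistics ignore the shift entirely.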
Domain Adaptation for Time Series Under Feature and Label Shifts
A curated list of papers and resources about the distribution shift in machine learning.
A graph reliability toolbox based on PyTorch and PyTorch Geometric (PyG).
This repository contains the code of the distribution shift framework presented in A Fine-Grained Analysis on Distribution Shift (Wiles et al., 2022).
The official API of DoubleAdapt (KDD'23), an incremental learning framework for online stock trend forecasting, WITHOUT dependencies on the qlib package.
"Towards Semi-supervised Learning with Non-random Missing Labels" by Yue Duan (ICCV 2023)
The official implementation for ICLR23 paper "GNNSafe: Energy-based Out-of-Distribution Detection for Graph Neural Networks"
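The energy score at the heart of this line of work is simple to state; a minimal sketch of the score itself (not GNNSafe's graph-propagation machinery):

```python
import numpy as np

def energy_score(logits):
    """Energy score E(x) = -logsumexp(logits). Lower energy means a
    confident, in-distribution-like prediction; higher energy flags a
    likely out-of-distribution input."""
    m = logits.max(axis=-1, keepdims=True)  # stabilize exp
    return -(m.squeeze(-1) + np.log(np.exp(logits - m).sum(axis=-1)))

confident = np.array([[10.0, 0.0, 0.0]])   # peaked logits
uncertain = np.array([[0.1, 0.0, -0.1]])   # nearly flat logits
# energy_score(confident) is far lower than energy_score(uncertain)
```

Unlike max-softmax confidence, the energy score is not squashed into [0, 1], which the energy-based OOD literature argues makes it better separated between in- and out-of-distribution inputs.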
Library for the training and evaluation of object-centric models (ICML 2022)
[NeurIPS21] TTT++: When Does Self-supervised Test-time Training Fail or Thrive?
[ICLR'23] Implementation of "Empowering Graph Representation Learning with Test-Time Graph Transformation"
A python package providing a benchmark with various specified distribution shift patterns.
"Shift-Robust GNNs: Overcoming the Limitations of Localized Graph Training Data" (NeurIPS '21)
[ICLR 2023] Official Tensorflow implementation of "Distributionally Robust Post-hoc Classifiers under Prior Shifts"
Reinforcement Learning Environments for Sustainable Energy Systems
Code for "How Well Does GPT-4V(ision) Adapt to Distribution Shifts? A Preliminary Investigation"
A curated list of Robust Machine Learning papers/articles and recent advancements.
NeurIPS22 "RankFeat: Rank-1 Feature Removal for Out-of-distribution Detection"
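The rank-1 removal that gives RankFeat its name can be sketched in a few lines: subtract the dominant singular component from a batch's feature matrix before the classifier head. A NumPy illustration (the function name and toy matrix are mine, not the paper's code):

```python
import numpy as np

def rankfeat(features):
    """RankFeat-style rank-1 removal: subtract s1 * u1 @ v1^T, the
    dominant singular component of the (batch x dim) feature matrix,
    which the paper argues disproportionately carries OOD signal."""
    u, s, vt = np.linalg.svd(features, full_matrices=False)
    rank1 = s[0] * np.outer(u[:, 0], vt[0, :])
    return features - rank1

# For a diagonal feature matrix the top component is the largest
# diagonal entry, so only the smaller one survives.
x = np.array([[3.0, 0.0], [0.0, 1.0]])
out = rankfeat(x)  # [[0, 0], [0, 1]]
```

The rank-1 product s1 * u1 v1^T is invariant to the SVD's sign ambiguity (the signs of u1 and v1 flip together), so the subtraction is well defined.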
Resources for the NeurIPS 2022 paper "Evaluating Latent Space Robustness and Uncertainty of EEG-ML Models under Realistic Distribution Shifts".
The official code of the IEEE S&P 2024 paper "Why Does Little Robustness Help? A Further Step Towards Understanding Adversarial Transferability". We study how to train surrogate models that boost transfer attacks.
A Python Library for Biquality Learning