Repositories under the double-descent topic:
A curated list of papers with interesting empirical studies and insights on deep learning. Continually updated...
MDL Complexity computations and experiments from the paper "Revisiting complexity and the bias-variance tradeoff".
Code for the arXiv paper "Double Descent Demystified: Identifying, Interpreting & Ablating the Sources of a Deep Learning Puzzle".
Explore the double-descent phenomenon in the context of system identification. Companion code to the paper (https://arxiv.org/abs/2012.06341).
This repository is the official implementation of "Optimization Variance: Delve into the Epoch-Wise Double Descent of DNNs".
Interpolating Neural Networks in Asset Pricing Data. Supports Distributed Training in TensorFlow.
Double descent results for FCNNs on MNIST, extended with label noise (following "Reconciling Modern Machine-Learning Practice and the Classical Bias–Variance Trade-Off").
This project outlines four experiments exploring how several settings affect the bias-variance tradeoff curve.
ICLR 2022: Phenomenology of Double Descent in Finite-width Neural Networks
Assignments from my CST Part II Deep Neural Networks unit.
Toy dataset to study double descent optimization patterns in machine learning.
A Review of Preetum Nakkiran's "More Data Can Hurt for Linear Regression: Sample-wise Double Descent"