Repositories under the regularization topic:
Python code for《Deep Learning》(the "flower book"): mathematical derivations, analysis of principles, and source-level code implementations
Notes, programming assignments and quizzes from all courses within the Coursera Deep Learning specialization offered by deeplearning.ai: (i) Neural Networks and Deep Learning; (ii) Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization; (iii) Structuring Machine Learning Projects; (iv) Convolutional Neural Networks; (v) Sequence Models
Early stopping for PyTorch
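A minimal sketch of the early-stopping idea in plain Python (the repository's actual API and class names may differ): stop training once the validation loss has not improved for `patience` consecutive checks.

```python
class EarlyStopping:
    """Track validation loss and signal when training should stop."""

    def __init__(self, patience=3, min_delta=0.0):
        self.patience = patience    # checks to wait after the last improvement
        self.min_delta = min_delta  # minimum decrease that counts as improvement
        self.best = float("inf")
        self.counter = 0
        self.should_stop = False

    def step(self, val_loss):
        if val_loss < self.best - self.min_delta:
            self.best = val_loss    # improvement: remember it and reset the counter
            self.counter = 0
        else:
            self.counter += 1       # no improvement this check
            if self.counter >= self.patience:
                self.should_stop = True
        return self.should_stop
```

Typical usage calls `step(val_loss)` once per validation pass and breaks out of the training loop when it returns `True`.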
Official PyTorch implementation of the CutMix regularizer
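A hedged NumPy sketch of the CutMix idea (not the official implementation): cut a rectangular patch from one image, paste it into another, and mix the labels in proportion to the patch area. The function name and signature here are illustrative.

```python
import numpy as np

def cutmix(img_a, img_b, label_a, label_b, lam, rng=None):
    """img_*: (H, W, C) arrays; lam in [0, 1] is the area fraction kept from img_a."""
    if rng is None:
        rng = np.random.default_rng(0)
    h, w = img_a.shape[:2]
    # patch side lengths follow sqrt(1 - lam) so the patch area is ~(1 - lam)
    cut_h, cut_w = int(h * np.sqrt(1 - lam)), int(w * np.sqrt(1 - lam))
    cy = rng.integers(0, h - cut_h + 1)
    cx = rng.integers(0, w - cut_w + 1)
    mixed = img_a.copy()
    mixed[cy:cy + cut_h, cx:cx + cut_w] = img_b[cy:cy + cut_h, cx:cx + cut_w]
    # recompute lambda from the exact patch area actually pasted
    lam_exact = 1 - (cut_h * cut_w) / (h * w)
    mixed_label = lam_exact * label_a + (1 - lam_exact) * label_b
    return mixed, mixed_label
```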
Training neural models with structured signals.
Code for reproducing Manifold Mixup results (ICML 2019)
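The core Manifold Mixup operation can be sketched in a few lines (a simplification of the paper's method, which applies this at a randomly chosen hidden layer): interpolate hidden representations and labels with the same coefficient.

```python
import numpy as np

def manifold_mixup(h_a, h_b, y_a, y_b, lam):
    """Interpolate hidden states h_a/h_b and labels y_a/y_b with weight lam."""
    mixed_h = lam * h_a + (1 - lam) * h_b
    mixed_y = lam * y_a + (1 - lam) * y_b
    return mixed_h, mixed_y
```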
Simple Implementation of many GAN models with PyTorch.
[CVPR 2023] DiffusioNeRF: Regularizing Neural Radiance Fields with Denoising Diffusion Models
Deep Learning Specialization courses by Andrew Ng, deeplearning.ai
Repo for "Benchmarking Robustness of 3D Point Cloud Recognition against Common Corruptions" https://arxiv.org/abs/2201.12296
Machine Learning - Coursera - Andrew Ng - Python + MATLAB code implementations
Code and datasets for the RecSys'20 paper "SSE-PT: Sequential Recommendation Via Personalized Transformer" and the NeurIPS'19 paper "Stochastic Shared Embeddings: Data-driven Regularization of Embedding Layers"
Deep Learning Specialization course on Coursera. Covers neural networks, deep learning, hyperparameter tuning, regularization, optimization, data processing, convolutional neural networks, and sequence models.
[ICLR'21] Neural Pruning via Growing Regularization (PyTorch)
This repository contains solutions to the quizzes and lab assignments of the Machine Learning Specialization (2022) from DeepLearning.AI on Coursera, taught by Andrew Ng, Eddy Shyu, Aarti Bagul, and Geoff Ladwig.
[NeurIPS 2021] Well-tuned Simple Nets Excel on Tabular Datasets
The official code for the paper "Delving Deep into Label Smoothing", IEEE TIP 2021
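Standard label smoothing, which the paper above studies in depth, can be sketched in one function (illustrative, not the paper's code): replace a one-hot target with `1 - eps` on the true class and spread `eps / K` over all `K` classes.

```python
import numpy as np

def smooth_labels(one_hot, eps=0.1):
    """Soften a one-hot target vector; rows still sum to 1."""
    k = one_hot.shape[-1]
    return one_hot * (1 - eps) + eps / k
```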
The tools and syntax you need to code neural networks from day one.
AI Learning Hub for Machine Learning, Deep Learning, Computer Vision and Statistics
[NeurIPS 2023] The PyTorch Implementation of Scheduled (Stable) Weight Decay.
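The repository above is about how to schedule the decay factor; the decoupled weight-decay update it builds on can be sketched on a plain SGD step (an illustrative sketch, not the repository's optimizer): the decay shrinks the weight directly rather than being folded into the gradient.

```python
def sgd_decoupled_wd(w, grad, lr=0.1, wd=0.01):
    # decoupled: decay is applied to the current weight,
    # not added to the gradient before the update
    return w - lr * grad - lr * wd * w
```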
A C++ toolkit for Convex Optimization (Logistic Loss, SVM, SVR, Least Squares etc.), Convex Optimization algorithms (LBFGS, TRON, SGD, AdaGrad, CG, Nesterov etc.) and Classifiers/Regressors (Logistic Regression, SVMs, Least Squares Regression etc.)
Implementations of key neural-network concepts in NumPy
An Interactive Approach to Understanding Deep Learning with Keras