Repositories under the neural-tangent-kernel topic.
A curated list of papers with interesting empirical studies and insights on deep learning. Continually updated...
{KFAC,EKFAC,Diagonal,Implicit} Fisher Matrices and finite width NTKs in PyTorch
Multi-framework implementation of Deep Kernel Shaping and Tailored Activation Transformations, which are methods that modify neural network models (and their initializations) to make them easier to train.
codebase for "A Theory of the Inductive Bias and Generalization of Kernel Regression and Wide Neural Networks"
Neural Tangent Kernel (NTK) module for the scikit-learn library
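As a hedged sketch of how such a module plugs in (this is not the listed repo's API): scikit-learn estimators like SVC accept any callable `kernel(X, Y)` returning a Gram matrix, so an NTK-style kernel can be used directly. Here the order-1 arc-cosine kernel (Cho & Saul, 2009), the infinite-width ReLU network kernel, serves as a stand-in NTK-style kernel.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split

def arccos1_kernel(X, Y):
    # Order-1 arc-cosine kernel: depends only on norms and the angle theta
    # between inputs, like an infinite-width ReLU network's kernel.
    nx = np.linalg.norm(X, axis=1)
    ny = np.linalg.norm(Y, axis=1)
    u = np.clip((X @ Y.T) / np.outer(nx, ny), -1.0, 1.0)
    theta = np.arccos(u)
    return np.outer(nx, ny) * (np.sin(theta) + (np.pi - theta) * np.cos(theta)) / np.pi

X, y = make_moons(n_samples=200, noise=0.2, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

# scikit-learn calls the kernel internally to build the Gram matrix.
clf = SVC(kernel=arccos1_kernel).fit(Xtr, ytr)
acc = clf.score(Xte, yte)
```

Any kernel written this way, including an empirical NTK evaluated pairwise, slots into the same `kernel=` hook.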
We propose a lossless compression algorithm for DNNs based on the NTK matrix. The compressed network yields asymptotically the same NTK as the original (dense and unquantized) network, with its weights and activations taking values only in {0, 1, -1} up to scaling.
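To illustrate the ternary-weight idea only (this is generic magnitude-based ternarization, not the paper's NTK-preserving algorithm), a layer's weights can be projected onto {-1, 0, 1} up to a per-layer scale:

```python
import numpy as np

def ternarize(W, sparsity=0.5):
    # Zero out the smallest-magnitude fraction of weights, keep the signs
    # of the rest, and fit a single scale alpha to the surviving entries.
    thresh = np.quantile(np.abs(W), sparsity)
    T = np.sign(W) * (np.abs(W) > thresh)          # entries in {-1, 0, 1}
    alpha = np.abs(W[T != 0]).mean() if np.any(T) else 0.0
    return alpha, T                                 # W is approximated by alpha * T

rng = np.random.default_rng(0)
W = rng.normal(size=(16, 16))
alpha, T = ternarize(W)
```

The paper's contribution is choosing such a compression so that the compressed network's NTK matches the original's asymptotically; the sketch above shows only the value set, not that guarantee.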
Study of the paper 'Neural Thompson Sampling' published in October 2020
TF2 implementation of Physics-Informed Neural Networks and the Neural Tangent Kernel
A Python implementation of the Neural Tangent Kernel (Jacot et al., 2018)
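A minimal sketch of the object such a repo computes (assumptions: a toy two-layer network and finite differences, not the repo's code): the empirical finite-width NTK is Theta(x, x') = J(x) J(x')^T, where J(x) is the Jacobian of the network output with respect to all parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny 2-layer MLP with scalar output; parameters packed in one vector.
d_in, d_h = 3, 8
shapes = [(d_h, d_in), (d_h,), (1, d_h), (1,)]
sizes = [int(np.prod(s)) for s in shapes]

def unpack(theta):
    parts, i = [], 0
    for s, n in zip(shapes, sizes):
        parts.append(theta[i:i + n].reshape(s))
        i += n
    return parts

def f(theta, x):
    W1, b1, W2, b2 = unpack(theta)
    h = np.tanh(W1 @ x + b1)          # smooth activation, stable numerics
    return (W2 @ h + b2)[0]

def jacobian(theta, x, eps=1e-5):
    # Central finite differences of f w.r.t. every parameter.
    J = np.zeros_like(theta)
    for k in range(theta.size):
        e = np.zeros_like(theta); e[k] = eps
        J[k] = (f(theta + e, x) - f(theta - e, x)) / (2 * eps)
    return J

theta = rng.normal(size=sum(sizes)) / np.sqrt(d_h)
X = rng.normal(size=(4, d_in))
J = np.stack([jacobian(theta, x) for x in X])   # (n_points, n_params)
K = J @ J.T                                     # empirical NTK Gram matrix
```

In practice the Jacobian comes from autograd (e.g. `torch.func.jacrev` or JAX) rather than finite differences, but the Gram construction is the same.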
Code accompanying the paper "On the adaptation of recurrent neural networks for system identification"
Implementation of "Deep Learning in Random Neural Fields: Numerical Experiments via Neural Tangent Kernel"
Code for "Learnware Reduced Kernel Mean Embedding Specification Based on Neural Tangent Kernel"
Empirical analysis of the reproducing kernel Hilbert spaces (RKHS) of the Laplace and neural tangent kernels
Senior Project for Statistics & Data Science at Yale University
Understand the spectral bias of deep learning through the study of NTK
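The mechanism behind this spectral bias can be sketched in a few lines (a stand-in RBF Gram matrix is assumed here, not any particular repo's NTK): under kernel gradient flow, the residual along the Gram matrix's i-th eigenvector decays as exp(-lambda_i * t), so high-eigenvalue (typically low-frequency) modes are fit first.

```python
import numpy as np

n = 64
x = np.linspace(0, 2 * np.pi, n)
K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2)   # stand-in kernel Gram
lam, V = np.linalg.eigh(K)                          # eigenvalues ascending

y = np.sin(x) + 0.3 * np.sin(8 * x)                 # low + high frequency target
t = 5.0

coef = V.T @ y                       # target decomposed into kernel eigenmodes
decay = np.exp(-lam * t)             # per-mode residual factor at time t
resid = decay * coef                 # what is left unlearned along each mode
```

Modes with larger eigenvalues have exponentially smaller remaining residual, which is exactly the NTK explanation of why networks learn smooth components before oscillatory ones.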
Yale S&DS 432 final project studying lazy training dynamics for differentiable optimization problems
Implementation of Approximate Smooth Kernel Value Iteration
Official repository of our work "Finding Lottery Tickets in Vision Models via Data-driven Spectral Foresight Pruning" accepted at CVPR 2024
Code for paper