There are 11 repositories under the fast-weight-programmers topic; a minimal sketch of the fast-weight update rule these projects build on follows the list.
Official repository for the paper "A Modern Self-Referential Weight Matrix That Learns to Modify Itself" (ICML 2022 & NeurIPS 2021 Deep RL Workshop) and "Accelerating Neural Self-Improvement via Bootstrapping" (ICLR 2023 Workshop)
Official repository for the paper "Going Beyond Linear Transformers with Recurrent Fast Weight Programmers" (NeurIPS 2021)
Official Code Repository for the paper "Key-value memory in the brain"
Official repository for the paper "Neural Differential Equations for Learning to Program Neural Nets Through Continuous Learning Rules" (NeurIPS 2022)
PyTorch Language Modeling Toolkit for Fast Weight Programmers
Official repository for the paper "Automating Continual Learning"
Official repository for the paper "Images as Weight Matrices: Sequential Image Generation Through Synaptic Learning Rules" (ICLR 2023)
Official repository for the paper "Blending Complementary Memory Systems in Hybrid Quadratic-Linear Transformers"
Official repository for the paper "Practical Computational Power of Linear Transformers and Their Recurrent and Self-Referential Extensions" (EMNLP 2023)
PyTorch implementation of DCT fast weight RNNs
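For context on what unites these repositories, below is a minimal sketch (not taken from any repository above) of the core fast-weight-programmer operation as described in the linear-Transformer formulation: a slow network emits a key, a value, and a learning rate, and a fast weight matrix is updated by an outer-product (delta) rule. All names, shapes, and the toy usage are illustrative assumptions.

```python
import torch

def fwp_delta_rule_step(W, k, v, beta):
    """One fast-weight update with the delta rule (illustrative sketch).

    W:    (d_v, d_k) fast weight matrix
    k:    (d_k,) key vector (here normalized with softmax)
    v:    (d_v,) new value to associate with k
    beta: scalar learning rate
    """
    v_old = W @ k                              # value currently retrieved for key k
    W = W + beta * torch.outer(v - v_old, k)   # outer-product (delta-rule) update
    return W

# Toy usage: write one key/value association, then read it back.
d_k, d_v = 4, 3
W = torch.zeros(d_v, d_k)
k = torch.softmax(torch.randn(d_k), dim=0)
v = torch.randn(d_v)
W = fwp_delta_rule_step(W, k, v, beta=1.0)
# Retrieval moves toward v; it equals v exactly when beta * ||k||^2 == 1.
print(W @ k)
```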