There are 8 repositories under the weight-sharing topic.
DeepErwin is a Python 3.8+ package that implements and optimizes JAX-based wave function models for numerical solutions to the multi-electron Schrödinger equation. DeepErwin supports weight-sharing when optimizing wave functions for multiple nuclear geometries, as well as the use of pre-trained neural network weights to accelerate optimization.
The official repository for our paper "Are Neural Nets Modular? Inspecting Functional Modularity Through Differentiable Weight Masks". We develop a method for analyzing emerging functional modularity in neural networks based on differentiable weight masks and use it to point out important issues in current-day neural networks.
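A minimal sketch of the core idea behind differentiable weight masks: each frozen weight gets a real-valued mask logit, and the forward pass multiplies the weight by a soft gate in (0, 1), so gradient descent on the logits can switch individual weights on or off. All names here are illustrative assumptions, not the paper's actual API.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
weights = rng.normal(size=(4, 4))   # frozen, pre-trained weights
mask_logits = np.zeros((4, 4))      # learnable mask parameters (illustrative)

def masked_forward(x, weights, mask_logits):
    gate = sigmoid(mask_logits)     # soft gate in (0, 1), differentiable in the logits
    return x @ (weights * gate)     # only gated weights contribute to the output

x = rng.normal(size=(1, 4))
y = masked_forward(x, weights, mask_logits)
print(y.shape)  # (1, 4)
```

Training the logits with a sparsity penalty and then thresholding the gates yields the near-binary masks used to probe which weights implement which sub-function.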
PyTorch implementation of "Lessons on Parameter Sharing across Layers in Transformers"
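Cross-layer parameter sharing can be sketched in a few lines: every "layer" holds a reference to the same weight matrix, so an N-layer stack has the parameter count of a single layer. This illustrates the general technique only, not the specific sharing strategies studied in the paper; the class and variable names are made up for the example.

```python
import numpy as np

class TiedLayer:
    """Toy layer whose weight is a shared reference, not a per-layer copy."""
    def __init__(self, weight):
        self.weight = weight        # shared object, no duplication

    def __call__(self, x):
        return np.tanh(x @ self.weight)

shared_w = np.random.default_rng(0).normal(size=(8, 8)) * 0.1
layers = [TiedLayer(shared_w) for _ in range(6)]   # 6 layers, 1 weight matrix

# Count distinct weight tensors: 6 layers collapse to one 8x8 matrix.
n_params = len({id(l.weight) for l in layers}) * shared_w.size
print(n_params)  # 64, not 6 * 64
```

In PyTorch the same effect is achieved by registering one `nn.Parameter` (or one sub-module) and reusing it in every layer's forward pass.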
CAMERO: Consistency Regularized Ensemble of Perturbed Language Models with Weight Sharing (ACL 2022)
A comprehensive PyTorch-based toolkit for weight-sharing in the text classification setting.
Compressing deep neural networks with pruning, trained quantization and Huffman coding
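The weight-sharing step of this pipeline can be sketched as follows: cluster a layer's weights into k centroids with 1-D k-means, then store only the small codebook plus a per-weight cluster index. This is a hedged toy illustration of the quantization stage, assuming a simple k-means, not the full pruning + quantization + Huffman coding pipeline.

```python
import numpy as np

def kmeans_1d(values, k, iters=20, seed=0):
    """Tiny 1-D k-means: returns (codebook of k centroids, per-value index)."""
    rng = np.random.default_rng(seed)
    centroids = rng.choice(values, size=k, replace=False)
    for _ in range(iters):
        # Assign each weight to its nearest centroid.
        idx = np.argmin(np.abs(values[:, None] - centroids[None, :]), axis=1)
        # Move each centroid to the mean of its assigned weights.
        for j in range(k):
            if np.any(idx == j):
                centroids[j] = values[idx == j].mean()
    return centroids, idx

weights = np.random.default_rng(1).normal(size=256)
codebook, codes = kmeans_1d(weights, k=16)
quantized = codebook[codes]         # every weight snapped to a shared centroid
print(np.unique(quantized).size)    # at most 16 distinct values
```

Storing 4-bit indices plus a 16-entry codebook instead of 256 floats is what makes this form of weight sharing a compression technique.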
Code for our ASP-DAC 2022 paper "A Heuristic Exploration to Retraining-free Weight Sharing for CNN Compression"
Neural Network Compression