Lu Yin's starred repositories
transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
mixture-of-experts
PyTorch Re-Implementation of "The Sparsely-Gated Mixture-of-Experts Layer" by Noam Shazeer et al. https://arxiv.org/abs/1701.06538
mixture-of-experts
A PyTorch implementation of Sparsely-Gated Mixture of Experts, for massively increasing the parameter count of language models
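Both mixture-of-experts repositories above build on the same routing primitive from Shazeer et al. (2017): a gating network scores every expert, but only the top-k experts are activated per input. A minimal pure-Python sketch of that top-k gating step (function name and inputs are illustrative, not taken from either repository):

```python
import math

def top_k_gating(logits, k):
    """Sketch of sparsely-gated MoE routing: keep only the k largest
    gate logits, softmax over those, and assign weight 0 to the rest,
    so only k experts run per input."""
    # Indices of the k largest gate logits.
    top = sorted(range(len(logits)), key=lambda i: logits[i], reverse=True)[:k]
    # Softmax over the surviving logits only; all others get weight 0.
    exps = {i: math.exp(logits[i]) for i in top}
    z = sum(exps.values())
    return [exps[i] / z if i in exps else 0.0 for i in range(len(logits))]

# With k=2, only the two highest-scoring experts get nonzero weight,
# and the nonzero weights sum to 1.
weights = top_k_gating([2.0, 1.0, 0.1, -1.0], k=2)
```

Because the unselected experts receive exactly zero weight, their forward passes can be skipped entirely, which is how these layers grow parameter count without a matching growth in compute.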
html-resume
A single-page resume template completely typeset with HTML & CSS.
Learning-Loss-for-Active-Learning
Reproducing experimental results of LL4AL [Yoo et al. 2019 CVPR]
DynamicReLU
Implementation of Dynamic ReLU in PyTorch
BERT-Tickets
[NeurIPS 2020] "The Lottery Ticket Hypothesis for Pre-trained BERT Networks", Tianlong Chen, Jonathan Frankle, Shiyu Chang, Sijia Liu, Yang Zhang, Zhangyang Wang, Michael Carbin
Random_Pruning
[ICLR 2022] The Unreasonable Effectiveness of Random Pruning: Return of the Most Naive Baseline for Sparse Training by Shiwei Liu, Tianlong Chen, Xiaohan Chen, Li Shen, Decebal Constantin Mocanu, Zhangyang Wang, Mykola Pechenizkiy
git-re-basin-pytorch
Git Re-Basin: Merging Models modulo Permutation Symmetries in PyTorch
FreeTickets
[ICLR 2022] "Deep Ensembling with No Overhead for either Training or Testing: The All-Round Blessings of Dynamic Sparsity" by Shiwei Liu, Tianlong Chen, Zahra Atashgahi, Xiaohan Chen, Ghada Sokar, Elena Mocanu, Mykola Pechenizkiy, Zhangyang Wang, Decebal Constantin Mocanu
Junk_DNA_Hypothesis
"Junk DNA Hypothesis: A Task-Centric Angle of LLM Pre-trained Weights through Sparsity" Lu Yin, Shiwei Liu, Ajay Jaiswal, Souvik Kundu, Zhangyang Wang
Selfish-RNN
[ICML 2021] "Selfish Sparse RNN Training" by Shiwei Liu, Decebal Constantin Mocanu, Yulong Pei, Mykola Pechenizkiy
Knowledge-Elicitation-using-Deep-Metric-Learning-and-Psychometric-Testing
Code for "Knowledge Elicitation using Deep Metric Learning and Psychometric Testing" (ECML 2020)
Generating-the-simple-shape-dataset
Generate a simple shape dataset with different colors, shapes, thicknesses, and heights.
SET-MLP-ONE-MILLION-NEURONS
[Neural Computing and Applications] "Sparse evolutionary Deep Learning with over one million artificial neurons on commodity hardware" by Shiwei Liu, Decebal Constantin Mocanu, Mykola Pechenizkiy