Lu Yin (luuyin)

Company: University of Aberdeen

Location: UK

Home Page: https://scholar.google.com/citations?user=G4Xe1NkAAAAJ

Lu Yin's starred repositories

transformers

🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.

Language: Python | License: Apache-2.0 | Stargazers: 128013 | Issues: 1099 | Issues: 15047

al-folio

A beautiful, simple, clean, and responsive Jekyll theme for academics

Language: HTML | License: MIT | Stargazers: 9732 | Issues: 24 | Issues: 519

openrazer

Open source driver and user-space daemon to control Razer lighting and other features on GNU/Linux

Language: C | License: GPL-2.0 | Stargazers: 3503 | Issues: 66 | Issues: 1744

mixture-of-experts

PyTorch Re-Implementation of "The Sparsely-Gated Mixture-of-Experts Layer" by Noam Shazeer et al. https://arxiv.org/abs/1701.06538

Language: Python | License: GPL-3.0 | Stargazers: 885 | Issues: 4 | Issues: 26

mixture-of-experts

A Pytorch implementation of Sparsely-Gated Mixture of Experts, for massively increasing the parameter count of language models

Language: Python | License: MIT | Stargazers: 566 | Issues: 5 | Issues: 10
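
Both mixture-of-experts repositories above implement sparsely gated routing: a small gating network scores the experts for each input and only the top-k experts are actually evaluated. Below is a minimal PyTorch sketch of that idea, written for this listing rather than taken from either repository; the dimensions, expert count, and the omission of load balancing and capacity limits are assumptions.

```python
# Minimal top-k ("sparsely gated") mixture-of-experts layer.
# All sizes are illustrative assumptions; noisy gating, load-balancing losses,
# and expert-capacity limits from the original paper are omitted.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyMoE(nn.Module):
    def __init__(self, dim=64, num_experts=4, k=2):
        super().__init__()
        self.k = k
        self.gate = nn.Linear(dim, num_experts)   # router: one score per expert
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.ReLU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        ])

    def forward(self, x):                          # x: (batch, dim)
        scores = self.gate(x)                      # (batch, num_experts)
        topk_vals, topk_idx = scores.topk(self.k, dim=-1)
        weights = F.softmax(topk_vals, dim=-1)     # renormalize over the k chosen experts
        out = torch.zeros_like(x)
        for slot in range(self.k):                 # send each token to its k experts
            idx = topk_idx[:, slot]
            w = weights[:, slot].unsqueeze(-1)
            for e, expert in enumerate(self.experts):
                hit = idx == e
                if hit.any():
                    out[hit] += w[hit] * expert(x[hit])
        return out

print(TinyMoE()(torch.randn(8, 64)).shape)  # torch.Size([8, 64])
```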

html-resume

A single-page resume template completely typeset with HTML & CSS.

Language: HTML | License: Apache-2.0 | Stargazers: 537 | Issues: 12 | Issues: 6

SLaK

[ICLR 2023] "More ConvNets in the 2020s: Scaling up Kernels Beyond 51x51 using Sparsity"; [ICML 2023] "Are Large Kernels Better Teachers than Transformers for ConvNets?"

Language: HTML | License: MIT | Stargazers: 256 | Issues: 6 | Issues: 19

Learning-Loss-for-Active-Learning

Reproducing experimental results of LL4AL [Yoo et al. 2019 CVPR]

DynamicReLU

Implementation of Dynamic ReLU in PyTorch

BERT-Tickets

[NeurIPS 2020] "The Lottery Ticket Hypothesis for Pre-trained BERT Networks", Tianlong Chen, Jonathan Frankle, Shiyu Chang, Sijia Liu, Yang Zhang, Zhangyang Wang, Michael Carbin

Language: Python | License: MIT | Stargazers: 138 | Issues: 12 | Issues: 8

Random_Pruning

[ICLR 2022] The Unreasonable Effectiveness of Random Pruning: Return of the Most Naive Baseline for Sparse Training by Shiwei Liu, Tianlong Chen, Xiaohan Chen, Li Shen, Decebal Constantin Mocanu, Zhangyang Wang, Mykola Pechenizkiy

Language: Python | Stargazers: 70 | Issues: 3 | Issues: 0
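
The Random_Pruning entry above studies the simplest sparsification baseline: prune every layer randomly before training. A minimal sketch of that baseline, assuming a uniform layerwise ratio and a toy MLP (not code from the repository):

```python
# Prune every Linear/Conv2d layer to a uniform random sparsity at initialization.
# The toy MLP and the uniform 90% ratio are assumptions; the paper also studies
# non-uniform layerwise ratios (e.g. ERK) and dynamic sparse training.
import torch
import torch.nn as nn

def random_prune_(model: nn.Module, sparsity: float = 0.9) -> None:
    """Zero a random `sparsity` fraction of each prunable weight tensor, in place."""
    for module in model.modules():
        if isinstance(module, (nn.Linear, nn.Conv2d)):
            mask = (torch.rand_like(module.weight) > sparsity).float()
            module.weight.data.mul_(mask)
            module.register_buffer("prune_mask", mask)  # keep mask to re-apply during training

model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))
random_prune_(model, sparsity=0.9)
kept = sum((m.weight != 0).sum().item() for m in model if isinstance(m, nn.Linear))
total = sum(m.weight.numel() for m in model if isinstance(m, nn.Linear))
print(f"density after pruning: {kept / total:.3f}")  # roughly 0.100
```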

git-re-basin-pytorch

Git Re-Basin: Merging Models modulo Permutation Symmetries in PyTorch

Language: Python | License: MIT | Stargazers: 68 | Issues: 4 | Issues: 8

OWL

Official Pytorch Implementation of "Outlier Weighed Layerwise Sparsity (OWL): A Missing Secret Sauce for Pruning LLMs to High Sparsity"

Language: Python | License: MIT | Stargazers: 40 | Issues: 2 | Issues: 8
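
OWL, per the description above, allocates non-uniform layerwise sparsity so that layers with more outlier weights are pruned less. The sketch below only illustrates that allocation step under simplifying assumptions; it scores outliers by raw weight magnitude (the paper's criterion is activation-aware), and the threshold multiplier, allocation rule, and toy model are assumptions of this sketch rather than the official implementation.

```python
# Simplified OWL-style layerwise sparsity allocation: layers with more "outlier"
# weights receive lower sparsity. M, the magnitude-only outlier score, and the
# clamp-based allocation rule are assumptions of this sketch.
import torch
import torch.nn as nn

M = 5.0  # assumed outlier threshold multiplier

def outlier_ratio(weight: torch.Tensor) -> float:
    """Fraction of entries whose magnitude exceeds M times the layer's mean magnitude."""
    mag = weight.abs()
    return (mag > M * mag.mean()).float().mean().item()

def allocate_sparsity(model: nn.Module, target: float = 0.7, lam: float = 0.08) -> dict:
    """Assign per-layer sparsity in [target - lam, target + lam], lower for outlier-heavy layers."""
    layers = [m for m in model.modules() if isinstance(m, nn.Linear)]
    ratios = torch.tensor([outlier_ratio(m.weight) for m in layers])
    spread = (ratios - ratios.mean()) / (ratios.max() - ratios.min() + 1e-8)
    sparsities = (target - 2 * lam * spread).clamp(target - lam, target + lam)
    return {i: round(s, 3) for i, s in enumerate(sparsities.tolist())}

# Freshly initialized weights have almost no outliers, so every layer lands near the
# target; pretrained LLM weights are heavy-tailed and yield a non-uniform allocation.
model = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 128), nn.ReLU(), nn.Linear(128, 64))
print(allocate_sparsity(model))
```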

Net2Net

Net2Net implementation in PyTorch for any vision layer.

DSify

Boosting Driving Scene Understanding with Advanced Vision-Language Models

Language: Python | License: BSD-3-Clause | Stargazers: 31 | Issues: 2 | Issues: 5

Firefly

Official repo for Firefly Neural Architecture Descent: a General Approach for Growing Neural Networks. Accepted at NeurIPS 2020.

Language: Python | License: MIT | Stargazers: 28 | Issues: 1 | Issues: 2

FreeTickets

[ICLR 2022] "Deep Ensembling with No Overhead for either Training or Testing: The All-Round Blessings of Dynamic Sparsity" by Shiwei Liu, Tianlong Chen, Zahra Atashgahi, Xiaohan Chen, Ghada Sokar, Elena Mocanu, Mykola Pechenizkiy, Zhangyang Wang, Decebal Constantin Mocanu

GraNet

[NeurIPS 2021] Sparse Training via Boosting Pruning Plasticity with Neuroregeneration

DCTpS

Code for testing DCT plus Sparse (DCTpS) networks

Language: Python | License: MIT | Stargazers: 14 | Issues: 1 | Issues: 1

OwLore

Official Pytorch Implementation of "OwLore: Outlier-weighed Layerwise Sampled Low-Rank Projection for Memory-Efficient LLM Fine-tuning" by Pengxiang Li, Lu Yin, Xiaowei Gao, Shiwei Liu

Language: Python | Stargazers: 13 | Issues: 0 | Issues: 0

Junk_DNA_Hypothesis

"Junk DNA Hypothesis: A Task-Centric Angle of LLM Pre-trained Weights through Sparsity" Lu Yin, Shiwei Liu, Ajay Jaiswal, Souvik Kundu, Zhangyang Wang

Language: Python | Stargazers: 13 | Issues: 12 | Issues: 0

Selfish-RNN

[ICML 2021] "Selfish Sparse RNN Training" by Shiwei Liu, Decebal Constantin Mocanu, Yulong Pei, Mykola Pechenizkiy

UGTs-LoG

This is the official code for UGTs.

Language: Python | Stargazers: 11 | Issues: 0 | Issues: 0

Knowledge-Elicitation-using-Deep-Metric-Learning-and-Psychometric-Testing

Codes for "Knowledge Elicitation using Deep Metric Learning and Psychometric Testing" (ECML 2020)

Language: Python | Stargazers: 5 | Issues: 2 | Issues: 0

chase

Dynamic Sparsity Is Channel-Level Sparsity Learner [NeurIPS 2023]

Generating-the-simple-shape-dataset

Generate a simple shape dataset with different colors, shapes, thicknesses, and heights.

Language: Jupyter Notebook | Stargazers: 4 | Issues: 0 | Issues: 0

SET-MLP-ONE-MILLION-NEURONS

[Neural Computing and Applications] "Sparse evolutionary Deep Learning with over one million artificial neurons on commodity hardware" by Shiwei Liu, Decebal Constantin Mocanu, Mykola Pechenizkiy

Language: HTML | Stargazers: 3 | Issues: 1 | Issues: 0

ILM-VP

[CVPR23] "Understanding and Improving Visual Prompting: A Label-Mapping Perspective" by Aochuan Chen, Yuguang Yao, Pin-Yu Chen, Yihua Zhang, and Sijia Liu

Stargazers: 1 | Issues: 0 | Issues: 0