Lebeau's repositories
Addressing-Class-Imbalance-FL
This is the code for the paper "Addressing Class Imbalance in Federated Learning" (AAAI 2021).
amc
[ECCV 2018] AMC: AutoML for Model Compression and Acceleration on Mobile Devices
Asynchronous-Federated-Unlearning
A new scalable federated learning research framework
BalanceFL
Repository for the IPSN 2022 paper "BalanceFL: Addressing Class Imbalance in Long-Tail Federated Learning".
early_exit_dnn_analysis
This repository contains all the code developed to analyze early-exit DNNs in an edge-cloud architecture.
Edge-Computing-Dataset
MEC, edge service, edge application, service computing.
EdgeViT
This is an unofficial PyTorch implementation of EdgeViT from "EdgeViTs: Competing Light-weight CNNs on Mobile Devices with Vision Transformers" (arXiv 2022).
Game-Theoretic-Deep-Reinforcement-Learning
Code for the paper "Joint Task Offloading and Resource Optimization in NOMA-based Vehicular Edge Computing: A Game-Theoretic DRL Approach" (JSA 2022).
GBLM-Pruner
Is gradient information useful for pruning LLMs?
GFL
Galaxy Federated Learning Framework (星际联邦学习框架)
GNN-RL-Model-Compression
GNN-RL Compression: Topology-Aware Network Pruning using Multi-stage Graph Embedding and Reinforcement Learning
graph_nets
PyTorch Implementation and Explanation of Graph Representation Learning papers: DeepWalk, GCN, GraphSAGE, ChebNet & GAT.
Limited-Data-Rolling-Bearing-Fault-Diagnosis-with-Few-shot-Learning
This is the repository accompanying the paper "Limited Data Rolling Bearing Fault Diagnosis with Few-shot Learning".
llm-kick
[ICLR 2024] Jaiswal, A., Gan, Z., Du, X., Zhang, B., Wang, Z., & Yang, Y. Compressing llms: The truth is rarely pure and never simple.
LLM-Pruner
[NeurIPS 2023] LLM-Pruner: On the Structural Pruning of Large Language Models. Support LLaMA, Llama-2, BLOOM, Vicuna, Baichuan, etc.
LTP-token-pruning
[KDD'22] Learned Token Pruning for Transformers
MTFL-For-Personalised-DNNs
Code for 'Multi-Task Federated Learning for Personalised Deep Neural Networks in Edge Computing', published in IEEE TPDS.
Multi-agent-path-planning
Deep learning model powered by Graph Neural Networks and Reinforcement Learning for Multi-agent path planning at @Inria
Neural-Network-Diffusion
We introduce a novel approach for parameter generation, named neural network diffusion (p-diff, where p stands for parameter), which employs a standard latent diffusion model to synthesize a new set of network parameters.
NIID-Bench
Federated Learning on Non-IID Data Silos: An Experimental Study
PABEE
Code for the paper "BERT Loses Patience: Fast and Robust Inference with Early Exit".
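PABEE's core idea is "patience": run the model layer by layer and stop as soon as a fixed number of consecutive internal classifiers agree on the same prediction. A minimal sketch of that exit rule (illustrative only, not the repo's code; `layer_logits` and `patience` are assumed names):

```python
import numpy as np

def pabee_infer(layer_logits, patience=2):
    """Patience-based early exit: stop once `patience` consecutive
    internal classifiers predict the same class.

    layer_logits: list of per-layer logit vectors, one classifier per layer.
    Returns (predicted_class, index_of_exit_layer).
    """
    streak, prev = 0, None
    for i, logits in enumerate(layer_logits):
        pred = int(np.argmax(logits))
        streak = streak + 1 if pred == prev else 1  # extend or reset the streak
        prev = pred
        if streak >= patience:
            return pred, i                 # exit early at this layer
    return prev, len(layer_logits) - 1     # fell through: use the final layer
```

Larger `patience` trades speed for robustness: the model must be more "sure" (more layers in agreement) before it commits.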
PFL-Non-IID
Personalized federated learning simulation platform with Non-IID and unbalanced dataset
retraining-free-pruning
[NeurIPS 2022] A Fast Post-Training Pruning Framework for Transformers
sparsegpt
Code for the ICML 2023 paper "SparseGPT: Massive Language Models Can Be Accurately Pruned in One-Shot".
Vehicular-Trajectories-Processing-for-Didi-Open-Data
Vehicular trajectories processing for Didi GAIA Open Data Set
wanda
A simple and effective LLM pruning approach.
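Wanda scores each weight by the product of its magnitude and the L2 norm of the corresponding input activation over a calibration batch, then zeroes the lowest-scoring weights per output row. A minimal NumPy sketch of that metric (illustrative only; function and argument names are assumptions, not the repo's API):

```python
import numpy as np

def wanda_prune(W, X, sparsity=0.5):
    """Prune W by the Wanda score |W_ij| * ||X_j||_2.

    W: (out_features, in_features) weight matrix of a linear layer.
    X: (n_samples, in_features) calibration activations feeding that layer.
    Weights are compared per output row, dropping the lowest-scoring fraction.
    """
    norms = np.linalg.norm(X, axis=0)           # ||X_j||_2 per input feature
    score = np.abs(W) * norms                   # elementwise Wanda metric
    k = int(W.shape[1] * sparsity)              # weights to drop per row
    pruned = W.copy()
    if k > 0:
        idx = np.argsort(score, axis=1)[:, :k]  # lowest-scoring per row
        np.put_along_axis(pruned, idx, 0.0, axis=1)
    return pruned
```

The activation norm is what distinguishes this from plain magnitude pruning: a small weight multiplying a large activation can still matter, so it is kept.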