Heterogeneous Federated Learning (HtFL)
Standard federated learning methods, e.g., FedAvg, assume that all participating clients train local models with the same architecture, which limits their utility in real-world scenarios. In practice, each client may build its own model with an architecture tailored to its specific local task.
Scenarios and datasets
Here, we only show the MNIST dataset in the label-skew scenario generated via a Dirichlet distribution as an example. Please refer to my other repository, PFLlib, for more help.
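To illustrate the label-skew idea, here is a minimal, stdlib-only sketch of Dirichlet-based label partitioning. It is illustrative only and the function name and defaults are assumptions, not PFLlib's actual generator, which may differ in details such as minimum-sample guarantees.

```python
import random
from collections import defaultdict

def dirichlet_label_partition(labels, num_clients, alpha=0.1, seed=0):
    """Split sample indices among clients with label skew.

    For each class, client shares are drawn from a Dirichlet(alpha)
    distribution (sampled here as normalized Gamma draws): a small
    alpha yields highly skewed per-client label distributions.
    Illustrative sketch; not PFLlib's actual implementation.
    """
    rng = random.Random(seed)
    by_class = defaultdict(list)
    for idx, y in enumerate(labels):
        by_class[y].append(idx)

    client_indices = [[] for _ in range(num_clients)]
    for cls, idxs in by_class.items():
        rng.shuffle(idxs)
        # Dirichlet proportions via normalized Gamma(alpha, 1) draws.
        gammas = [rng.gammavariate(alpha, 1.0) for _ in range(num_clients)]
        total = sum(gammas) or 1.0
        props = [g / total for g in gammas]
        # Turn proportions into cut points over this class's samples.
        cuts, acc = [], 0.0
        for p in props[:-1]:
            acc += p
            cuts.append(int(acc * len(idxs)))
        start = 0
        for c, end in enumerate(cuts + [len(idxs)]):
            client_indices[c].extend(idxs[start:end])
            start = end
    return client_indices

# Toy example: 1000 samples over 10 balanced classes, 5 clients.
labels = [i % 10 for i in range(1000)]
parts = dirichlet_label_partition(labels, num_clients=5, alpha=0.1)
assert sum(len(p) for p in parts) == len(labels)
```

With a small alpha (e.g., 0.1) most clients end up holding samples from only a few classes, which is exactly the label-skew setting described above.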
You can also modify the code in PFLlib to support model-heterogeneity scenarios, but that requires considerable effort. In this repository, you only need to configure system/main.py to support model-heterogeneity scenarios.
Note: if your program crashes accidentally, you may need to manually clean the checkpoint files in the temp/ folder via system/clean_temp_files.py. You can also set your own checkpoint folder with the -sfn command-line argument to prevent automatic deletion.
Data-free algorithms with code (updating)
- Local — Each client trains its model locally without federation.
- FedDistill — Federated Knowledge Distillation 2020
- FML — Federated Mutual Learning 2020
- LG-FedAvg — Think Locally, Act Globally: Federated Learning with Local and Global Representations 2020
- FedGen — Data-Free Knowledge Distillation for Heterogeneous Federated Learning ICML 2021
- FedProto — FedProto: Federated Prototype Learning across Heterogeneous Clients AAAI 2022
- FedKD — Communication-efficient federated learning via knowledge distillation Nature Communications 2022
- FedGH — FedGH: Heterogeneous Federated Learning with Generalized Global Header ACM MM 2023
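Among the methods above, FedProto illustrates why these algorithms tolerate heterogeneous architectures: clients exchange only class prototypes (mean feature vectors), never model weights. Below is a minimal sketch of the server-side prototype averaging, assuming each client uploads a {class label: feature vector} dict; the function name and data layout are illustrative assumptions, not this repository's API.

```python
from collections import defaultdict

def aggregate_prototypes(client_protos):
    """Average per-class prototypes uploaded by heterogeneous clients.

    client_protos: list of {class_label: feature_vector} dicts.
    Feature vectors must share a dimension even though the local
    model architectures that produced them may differ.
    Returns the global {class_label: mean_vector} prototypes.
    Illustrative sketch of the FedProto idea, not the paper's code.
    """
    sums = {}
    counts = defaultdict(int)
    for protos in client_protos:
        for cls, vec in protos.items():
            if cls not in sums:
                sums[cls] = list(vec)
            else:
                sums[cls] = [a + b for a, b in zip(sums[cls], vec)]
            counts[cls] += 1
    return {cls: [v / counts[cls] for v in vec] for cls, vec in sums.items()}

# Two clients with different local models but a shared 3-d feature space.
global_protos = aggregate_prototypes([
    {0: [1.0, 0.0, 0.0], 1: [0.0, 2.0, 0.0]},
    {0: [3.0, 0.0, 0.0]},
])
assert global_protos[0] == [2.0, 0.0, 0.0]  # mean of the two class-0 uploads
assert global_protos[1] == [0.0, 2.0, 0.0]  # only one client saw class 1
```

Because only fixed-size prototypes travel between clients and server, communication cost is independent of model size, which is also why FedProto appears in the data-free category above.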