FL_RD

Project: Lossy Gradient Compression: How Much Accuracy Can One Bit Buy?

Acknowledgment: We used parts of the federated learning code from the repository linked below to implement our simulations.

https://github.com/nicolagulmini

About

This code implements federated learning in which the gradients communicated by the different users are compressed at rate R. We use a rate-distortion approach to define a distortion metric for the compressed gradients, and we study the accuracy of a DNN trained in this setup as a function of the compression rate.
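
As a rough illustration of the setup described above (not code taken from this repository), the sketch below quantizes each client's gradient to R bits per coordinate with a uniform scalar quantizer, measures distortion as the normalized mean-squared error between the original and reconstructed gradients, and averages the compressed gradients at the server. The function names (`quantize_gradient`, `gradient_distortion`, `fedavg_round`) and the specific choice of quantizer and distortion metric are assumptions made for the example only.

```python
import numpy as np

def quantize_gradient(grad, rate_bits):
    """Uniform scalar quantization of a gradient vector to `rate_bits` bits
    per coordinate; returns the reconstructed (lossy) gradient."""
    levels = 2 ** rate_bits
    g_min, g_max = grad.min(), grad.max()
    if g_max == g_min:                          # constant gradient: nothing to quantize
        return grad.copy()
    step = (g_max - g_min) / (levels - 1)
    indices = np.round((grad - g_min) / step)   # integer codewords in [0, levels - 1]
    return g_min + indices * step               # dequantized gradient

def gradient_distortion(grad, grad_hat):
    """Normalized mean-squared error between original and compressed gradients."""
    return np.mean((grad - grad_hat) ** 2) / (np.mean(grad ** 2) + 1e-12)

def fedavg_round(client_grads, rate_bits):
    """One federated-averaging step in which every client uploads a gradient
    compressed at `rate_bits` bits per coordinate."""
    compressed = [quantize_gradient(g, rate_bits) for g in client_grads]
    return np.mean(compressed, axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    grads = [rng.normal(size=1000) for _ in range(10)]   # toy per-client gradients
    for R in (1, 2, 4, 8):
        avg_grad = fedavg_round(grads, R)
        d = np.mean([gradient_distortion(g, quantize_gradient(g, R)) for g in grads])
        print(f"R = {R} bits: mean per-client distortion = {d:.4f}")
```

With R = 1 bit per coordinate (the case highlighted in the project title), each gradient entry is reduced to one of two reconstruction levels; increasing R lowers the distortion at the cost of a higher communication rate.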


Languages

Python 100.0%