epfml / ChocoSGD

Decentralized SGD and Consensus with Communication Compression: https://arxiv.org/abs/1907.09356

Choco-SGD

This repository provides code for communication-efficient decentralized ML training, covering both deep learning (compatible with PyTorch) and traditional convex machine learning models.

We provide code for the main experiments in the papers listed under References.

Please refer to the folders convex_code and dl_code for more details.
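For a rough sense of the algorithm the code implements, here is a minimal NumPy sketch of the CHOCO-Gossip consensus step from the ICML 2019 paper (this is an illustrative toy, not the repository's implementation; the `top_k` compressor and the function names are hypothetical choices for the sketch):

```python
import numpy as np

def top_k(v, k):
    """Hypothetical compressor Q: keep the k largest-magnitude coordinates, zero the rest."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

def choco_gossip(x0, W, gamma=0.05, k=1, iters=2000):
    """Toy CHOCO-Gossip consensus sketch (after Koloskova et al., ICML 2019).

    x0: (n, d) array of initial node values.
    W:  symmetric, doubly-stochastic mixing matrix of the communication graph.
    Each node i keeps its private value x_i and a public estimate xhat_i that
    neighbours can reconstruct from the compressed messages q_i.
    """
    x = x0.copy()
    xhat = np.zeros_like(x)  # public copies, initialized to zero everywhere
    n = x.shape[0]
    for _ in range(iters):
        # each node compresses the gap between its value and its public copy
        q = np.array([top_k(x[i] - xhat[i], k) for i in range(n)])
        # all nodes apply the received compressed updates to the public copies
        xhat = xhat + q
        # gossip step on the public copies; row-stochastic W makes
        # (W @ xhat - xhat)[i] equal to sum_j w_ij * (xhat_j - xhat_i)
        x = x + gamma * (W @ xhat - xhat)
    return x
```

Because W is doubly stochastic, the node average is preserved exactly while the disagreement between nodes shrinks, so all nodes converge toward the initial mean; the step size `gamma` trades convergence speed against robustness to the compression error.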

References

If you use the code, please cite the following papers:

@inproceedings{koloskova2019choco,
    title = {Decentralized Stochastic Optimization and Gossip Algorithms with Compressed Communication},
    author = {Anastasia Koloskova and Sebastian U. Stich and Martin Jaggi},
    booktitle = {ICML 2019 - Proceedings of the 36th International Conference on Machine Learning},
    url = {http://proceedings.mlr.press/v97/koloskova19a.html},
    publisher = {PMLR}, 
    volume = {97},
    pages = {3479--3487},
    year = {2019}
}

and

@inproceedings{koloskova2020decentralized,
  title={Decentralized Deep Learning with Arbitrary Communication Compression},
  author={Anastasia Koloskova* and Tao Lin* and Sebastian U Stich and Martin Jaggi},
  booktitle={ICLR 2020 - International Conference on Learning Representations},
  year={2020},
  url={https://openreview.net/forum?id=SkgGCkrKvH}
}

About

License: Apache License 2.0
