EPFL Machine Learning and Optimization Laboratory (epfml)

Location: Lausanne, Switzerland

Home Page: mlo.epfl.ch

EPFL Machine Learning and Optimization Laboratory's repositories

ML_course

EPFL Machine Learning Course, Fall 2024

Language: Jupyter Notebook | Stargazers: 1237 | Issues: 93 | Issues: 21

OptML_course

EPFL Course - Optimization for Machine Learning - CS-439

Language: Jupyter Notebook | Stargazers: 1117 | Issues: 75 | Issues: 10

attention-cnn

Source code for "On the Relationship between Self-Attention and Convolutional Layers"

Language: Python | License: Apache-2.0 | Stargazers: 1077 | Issues: 27 | Issues: 10

landmark-attention

Landmark Attention: Random-Access Infinite Context Length for Transformers

Language: Python | License: Apache-2.0 | Stargazers: 405 | Issues: 40 | Issues: 15

collaborative-attention

Code for Multi-Head Attention: Collaborate Instead of Concatenate

Language: Python | License: Apache-2.0 | Stargazers: 148 | Issues: 14 | Issues: 6
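As a rough illustration of the idea named in the paper title (not the repository's actual API), collaborative heads share a single query/key projection and differ only through a per-head mixing vector, instead of each head owning its own projections. The sketch below uses made-up dimension and function names.

```python
import torch

def collaborative_attention_scores(x, W_Q, W_K, mixing):
    """Sketch: every head uses the same shared query/key projections;
    head h re-weights the shared dimensions with its mixing vector m_h,
    giving scores Q diag(m_h) K^T instead of per-head projections."""
    Q = x @ W_Q                  # (seq_len, d_shared)
    K = x @ W_K                  # (seq_len, d_shared)
    return torch.stack([(Q * m) @ K.t() for m in mixing])  # (heads, seq, seq)

# toy usage with hypothetical sizes
x = torch.randn(8, 64)                       # 8 tokens, model dimension 64
W_Q, W_K = torch.randn(64, 32), torch.randn(64, 32)
mixing = torch.randn(4, 32)                  # one mixing vector per head
scores = collaborative_attention_scores(x, W_Q, W_K, mixing)
```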

disco

DISCO is a code-free and installation-free browser platform that allows any non-technical user to collaboratively train machine learning models without sharing any private data.

Language: TypeScript | License: Apache-2.0 | Stargazers: 142 | Issues: 12 | Issues: 311

powersgd

Practical low-rank gradient compression for distributed optimization: https://arxiv.org/abs/1905.13727

Language: Python | License: MIT | Stargazers: 140 | Issues: 11 | Issues: 18
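To make the one-line description concrete, here is a minimal rank-1 sketch of low-rank gradient compression via a single power-iteration step, in the spirit of the linked paper; the function names and the omission of warm starting and error feedback are simplifications for illustration, not the package's interface.

```python
import torch

def rank1_compress(grad, q):
    """One power-iteration step: approximate the gradient matrix as p @ q_new^T.
    Only the two vectors p and q_new need to be communicated."""
    p = grad @ q
    p = p / (p.norm() + 1e-8)        # normalize the left factor
    q_new = grad.t() @ p
    return p, q_new

def rank1_decompress(p, q_new):
    return torch.outer(p, q_new)

# toy usage: a 256x512 gradient (131072 numbers) is sent as 256 + 512 numbers
grad = torch.randn(256, 512)
q = torch.randn(512)                 # in practice this factor is reused across steps
approx = rank1_decompress(*rank1_compress(grad, q))
```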

optML-pku

Summer school materials

error-feedback-SGD

SGD with compressed gradients and error-feedback: https://arxiv.org/abs/1901.09847

Language: Jupyter Notebook | License: MIT | Stargazers: 29 | Issues: 7 | Issues: 2
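As a rough sketch of the error-feedback mechanism named in the description (with a hypothetical top-k compressor and made-up parameter names, not the notebook's code), the part of the gradient lost to compression is kept locally and re-injected into the next gradient before compressing again:

```python
import torch

def top_k(x, k):
    """Keep only the k largest-magnitude entries (a simple compressor)."""
    out = torch.zeros_like(x)
    idx = x.abs().topk(k).indices
    out[idx] = x[idx]
    return out

class ErrorFeedbackSGD:
    """Error feedback: the compression error is stored and added back
    to the next gradient before it is compressed again."""
    def __init__(self, param, lr=0.1, k=10):
        self.param, self.lr, self.k = param, lr, k
        self.error = torch.zeros_like(param)

    def step(self, grad):
        corrected = grad + self.error            # re-inject the previous error
        compressed = top_k(corrected, self.k)    # what would actually be transmitted
        self.error = corrected - compressed      # remember what was dropped
        self.param -= self.lr * compressed
```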

relaysgd

Code for the paper “RelaySum for Decentralized Deep Learning on Heterogeneous Data”

Language: Jupyter Notebook | License: MIT | Stargazers: 10 | Issues: 7 | Issues: 0

easy-summary

Difficulty-guided text summarization

Language: Python | License: Apache-2.0 | Stargazers: 5 | Issues: 6 | Issues: 0

personalized-collaborative-llms

Exploration of on-device, self-supervised, collaborative fine-tuning of large language models with limited local data, using Low-Rank Adaptation (LoRA). We introduce three trust-weighted gradient aggregation schemes: weight-similarity-based, prediction-similarity-based, and validation-performance-based.

Language: Python | License: Apache-2.0 | Stargazers: 2 | Issues: 3 | Issues: 1
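As an illustration of one of the three schemes (the weight-similarity variant), a trust-weighted aggregation step might look as follows; the function name, softmax normalization, and flattened-update representation are assumptions for the sketch, not the repository's implementation.

```python
import torch

def weight_similarity_aggregate(local_update, peer_updates, temperature=1.0):
    """Sketch: peers whose (flattened) LoRA updates are more similar to the
    local update receive a larger trust weight in the aggregated update."""
    sims = torch.stack([
        torch.nn.functional.cosine_similarity(local_update, u, dim=0)
        for u in peer_updates
    ])
    trust = torch.softmax(sims / temperature, dim=0)   # similarities -> trust weights
    return sum(w * u for w, u in zip(trust, peer_updates))

# toy usage with hypothetical dimensions
local = torch.randn(1024)
peers = [torch.randn(1024) for _ in range(3)]
aggregated = weight_similarity_aggregate(local, peers)
```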

cifar

MLO-internal CIFAR-10/100 reference implementation: single machine, variable batch sizes, with optional gradient compression. It needs clear documentation so that it is easy to use and we don't lose time searching for hyperparameters. We can later keep it in sync with mlbench, but self-contained is even better.

Language: Python | Stargazers: 0 | Issues: 14 | Issues: 1

DoGE

Codebase for the ICML submission "DoGE: Domain Reweighting with Generalization Estimation"

Stargazers: 0 | Issues: 1 | Issues: 0

epfml-utils

Tools for experimentation and for working with run:ai. These aim to be small, self-contained utilities used by multiple people.

Language: Python | License: Apache-2.0 | Stargazers: 0 | Issues: 8 | Issues: 1