ssi-research / BayesAgg_MTL

Code that accompanies the paper "Bayesian Uncertainty for Gradient Aggregation in Multi-Task Learning", accepted to ICML 2024.

Bayesian Uncertainty for Gradient Aggregation in Multi-Task Learning [ICML 2024]

As machine learning becomes more prominent, there is a growing demand to perform several inference tasks in parallel. Running a dedicated model for each task is computationally expensive, so there is great interest in multi-task learning (MTL). MTL aims to learn a single model that solves several tasks efficiently. Optimizing MTL models is often done by computing a single gradient per task and aggregating the gradients to obtain a combined update direction. However, these approaches do not consider an important aspect: the sensitivity of the individual gradient dimensions. Here, we introduce a novel gradient aggregation approach based on Bayesian inference. We place a probability distribution over the task-specific parameters, which in turn induces a distribution over the gradients of the tasks. This additional information allows us to quantify the uncertainty in each gradient dimension, which can then be factored in when aggregating the gradients. We empirically demonstrate the benefits of our approach on a variety of datasets, achieving state-of-the-art performance.

[Paper]
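To give a rough intuition for the idea described above, the sketch below shows a generic precision-weighted gradient aggregation: each task contributes a gradient mean and a per-dimension variance, and dimensions where a task is more certain (lower variance) get more weight in the combined update. This is only an illustrative simplification under our own assumptions, not the exact BayesAgg-MTL update from the paper; the function name and signature are hypothetical.

```python
import numpy as np

def aggregate_gradients(grad_means, grad_vars, eps=1e-8):
    """Illustrative precision-weighted aggregation of per-task gradients.

    grad_means, grad_vars: lists of arrays (one per task), each the mean and
    per-dimension variance of that task's gradient over the shared parameters.
    Dimensions with low variance (high certainty) dominate the combination.
    """
    # Precision = inverse variance; eps guards against division by zero.
    precisions = [1.0 / (v + eps) for v in grad_vars]
    total = sum(precisions)
    # Per-dimension convex combination of task gradients, weighted by precision.
    return sum(m * p for m, p in zip(grad_means, precisions)) / total
```

For example, if task 1 is confident only in dimension 0 and task 2 only in dimension 1, the aggregated gradient follows each task in the dimension it is confident about.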

Installation Instructions

  1. Install the repo:

conda create -n "BayesAgg-MTL" python=3.9
conda activate BayesAgg-MTL
conda install pytorch==2.1.1 torchvision==0.16.1 pytorch-cuda=11.8 -c pytorch -c nvidia
pip install -e .

  2. Download the UTKFace dataset from the official repository [Link] or Kaggle [Link] and place it under experiments/utkface/dataset
  3. Download the ChestX-Ray14 dataset from Kaggle [Link] and place it under experiments/ChestX_ray14/dataset

Running the code

cd experiments/xxx
python trainer.py

where xxx is one of {utkface, ChestX_ray14, CIFAR_MTL}.

Citation

Please cite this paper if you use the code in your work:

@inproceedings{achituve2024bayes,
  author       = {Idan Achituve and
                  Idit Diamant and
                  Arnon Netzer and
                  Gal Chechik and
                  Ethan Fetaya},
  title        = {Bayesian Uncertainty for Gradient Aggregation in Multi-Task Learning},
  booktitle    = {Forty-first International Conference on Machine Learning, {ICML} 2024,
                  Vienna, Austria, July 21-27, 2024},
  publisher    = {OpenReview.net},
  year         = {2024},
}
