XeniaLLL / IAPTT-GM

Code repository for the NeurIPS 2021 paper "Towards Gradient-based Bilevel Optimization with Non-convex Followers and Beyond".


Towards Gradient-based Bilevel Optimization with Non-convex Followers and Beyond

This repo contains code accompanying the paper Towards Gradient-based Bilevel Optimization with Non-convex Followers and Beyond (Liu et al., NeurIPS 2021). It includes code for running the numerical example, few-shot classification, and data hyper-cleaning experiments.

Abstract

In recent years, Bi-Level Optimization (BLO) techniques have received extensive attention from both the learning and vision communities. A variety of BLO models in complex and practical tasks are of non-convex follower structure in nature (a.k.a., without Lower-Level Convexity, LLC for short). However, this challenging class of BLOs lacks developments in both efficient solution strategies and solid theoretical guarantees. In this work, we propose a new algorithmic framework, named Initialization Auxiliary and Pessimistic Trajectory Truncated Gradient Method (IAPTT-GM), to partially address the above issues. In particular, by introducing an auxiliary as initialization to guide the optimization dynamics and designing a pessimistic trajectory truncation operation, we construct a reliable approximate version of the original BLO in the absence of the LLC hypothesis. Our theoretical investigations establish the convergence of solutions returned by IAPTT-GM towards those of the original BLO without LLC. As an additional bonus, we also theoretically justify the quality of our IAPTT-GM embedded with Nesterov's accelerated dynamics under LLC. The experimental results confirm both the convergence of our algorithm without LLC and the theoretical findings under LLC.
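
To make the two ingredients above concrete, the sketch below is a minimal, illustrative reading of the abstract in plain PyTorch: an auxiliary variable z serves as the lower-level initialization, and the pessimistic truncation keeps the iterate along the lower-level trajectory with the largest upper-level value before differentiating through it. The toy objectives, dimensions, step sizes, and function names are assumptions for illustration only; they are not the repository's implementation.

import torch

def iaptt_gm_step(x, z, K=20, lr_y=0.1, lr_xz=0.05):
    # x: upper-level variable; z: auxiliary initialization for the lower-level variable.
    # Toy bilevel objectives (assumptions for illustration; the lower one is non-convex in y).
    def lower_objective(x, y):
        return torch.sum((y - x) ** 2) + torch.sum(torch.sin(y))

    def upper_objective(x, y):
        return torch.sum((y - 1.0) ** 2) + 0.1 * torch.sum(x ** 2)

    # Run K lower-level gradient steps starting from the auxiliary z,
    # keeping the whole trajectory differentiable with respect to (x, z).
    y, trajectory = z, [z]
    for _ in range(K):
        g = torch.autograd.grad(lower_objective(x, y), y, create_graph=True)[0]
        y = y - lr_y * g
        trajectory.append(y)

    # Pessimistic trajectory truncation: keep the iterate with the LARGEST
    # upper-level value along the trajectory, then differentiate through it.
    upper_values = torch.stack([upper_objective(x, y_k) for y_k in trajectory])
    k_star = int(torch.argmax(upper_values))
    loss = upper_objective(x, trajectory[k_star])

    # Hyper-gradient step on the upper-level variable x and the auxiliary z.
    grad_x, grad_z = torch.autograd.grad(loss, (x, z))
    with torch.no_grad():
        x -= lr_xz * grad_x
        z -= lr_xz * grad_z
    return loss.item()

x = torch.zeros(5, requires_grad=True)   # toy dimension, an assumption
z = torch.zeros(5, requires_grad=True)
for step in range(100):
    loss = iaptt_gm_step(x, z)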

Dependencies

You can run the following command to install the dependencies automatically:

pip install -r requirement.txt

The main dependencies are listed in requirement.txt.

Data Preparation

You can download the Omniglot, MiniImageNet, TieredImageNet, MNIST, and FashionMNIST datasets from the attached link, and put each dataset in the corresponding experiment/data/dataset_name folder.
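
If it helps, here is a small helper for checking that layout before running the experiments. The folder names below are assumptions inferred from the dataset list above; adjust them to whatever the downloaded archives actually use.

from pathlib import Path

datasets = ["omniglot", "miniimagenet", "tieredimagenet", "mnist", "fashionmnist"]
for name in datasets:
    folder = Path("experiment") / "data" / name   # e.g. experiment/data/omniglot
    folder.mkdir(parents=True, exist_ok=True)     # create the folder if it is missing
    has_files = any(folder.iterdir())             # True once a dataset is unpacked here
    print(f"{folder}: {'ready' if has_files else 'empty'}")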

Usage

You can run the Python script for each application as shown below:

python Few_shot.py             # For few-shot classification tasks.
python Data_hyper_cleaning.py  # For data hyper-cleaning tasks.
python Numerical.py            # For the non-convex numerical examples.

Citation

If you use IAPTT-GM in academic research, you are highly encouraged to cite the paper Towards Gradient-based Bilevel Optimization with Non-convex Followers and Beyond (Liu et al., NeurIPS 2021).

License

MIT License

Copyright (c) 2021 Vision Optimization Group

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
