awesome-mixup

This repo is a collection of AWESOME things about mixup, including papers, code, etc. Feel free to star and fork. We borrow a lot from openmixup, Awesome-Mixup, awesome-domain-adaptation, and PromptPapers.

Some of these papers are summarized in tables in a Google Sheet. Please find the link here: Summary (Restricted)

Basics

This section collects papers that explore improvements to the original input-space mixup; a minimal sketch of vanilla mixup follows.
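
For orientation, here is a minimal PyTorch sketch of input mixup (entry 1 below). The function name and the within-batch pairing are illustrative choices, not from any official implementation.

```python
import torch

def mixup_batch(x, y_onehot, alpha=1.0):
    """Input mixup: convex-combine random pairs of examples and their
    one-hot labels with lambda ~ Beta(alpha, alpha)."""
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    perm = torch.randperm(x.size(0))                       # random partner per example
    x_mixed = lam * x + (1 - lam) * x[perm]                # interpolate inputs
    y_mixed = lam * y_onehot + (1 - lam) * y_onehot[perm]  # interpolate labels
    return x_mixed, y_mixed
```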

  1. [mixup'18] mixup: Beyond Empirical Risk Minimization. ICLR 2018. [code]

    Hongyi Zhang, Moustapha Cisse, Yann N. Dauphin, David Lopez-Paz.

  2. [Manifold Mixup'19] Manifold Mixup: Better Representations by Interpolating Hidden States. ICML 2019. [code]

    Vikas Verma, Alex Lamb, Christopher Beckham, Amir Najafi, Ioannis Mitliagkas, David Lopez-Paz, Yoshua Bengio

  3. [AdaMixup'19] MixUp as Locally Linear Out-Of-Manifold Regularization. AAAI 2019.

    Hongyu Guo, Yongyi Mao, Richong Zhang.

  4. [CutMix'19] CutMix: Regularization Strategy to Train Strong Classifiers with Localizable Features. ICCV 2019. [code] (a minimal sketch appears after this list)

    Sangdoo Yun, Dongyoon Han, Seong Joon Oh, Sanghyuk Chun, Junsuk Choe, Youngjoon Yoo.

  5. [AugMix'20] AugMix: A Simple Data Processing Method to Improve Robustness and Uncertainty. ICLR 2020. [code]

    Dan Hendrycks, Norman Mu, Ekin D. Cubuk, Barret Zoph, Justin Gilmer, Balaji Lakshminarayanan.

  6. [SnapMix'21] SnapMix: Semantically Proportional Mixing for Augmenting Fine-grained Data. AAAI 2021. [code]

    Shaoli Huang, Xinchao Wang, Dacheng Tao.

  7. [PuzzleMix'20] Puzzle Mix: Exploiting Saliency and Local Statistics for Optimal Mixup. ICML 2020. [code]

    Jang-Hyun Kim, Wonho Choo, Hyun Oh Song.

  8. [SaliencyMix'21] SaliencyMix: A Saliency Guided Data Augmentation Strategy for Better Regularization. ICLR 2021. [code]

    A F M Shahab Uddin, Mst. Sirazam Monira, Wheemyung Shin, TaeChoong Chung, Sung-Ho Bae.

  9. [CoMixup'21] Co-Mixup: Saliency Guided Joint Mixup with Supermodular Diversity. ICLR 2021. [code]

    Jang-Hyun Kim, Wonho Choo, Hosan Jeong, Hyun Oh Song.

  10. [NFM'22] Noisy Feature Mixup. ICLR 2022. [code]

    Soon Hoe Lim, N. Benjamin Erichson, Francisco Utrera, Winnie Xu, Michael W. Mahoney

  11. [AlignMix'22] AlignMix: Improving Representations by Interpolating Aligned Features. CVPR 2022. [code]

    Shashanka Venkataramanan, Ewa Kijak, Laurent Amsaleg, Yannis Avrithis.

  12. [TransMix'22] TransMix: Attend to Mix for Vision Transformers. CVPR 2022. [code]

    Jie-Neng Chen, Shuyang Sun, Ju He, Philip Torr, Alan Yuille, Song Bai.

  13. [GenLabel'22] GenLabel: Mixup Relabeling using Generative Models. ICML 2022. [code]

    Jy-yong Sohn, Liang Shang, Hongxu Chen, Jaekyun Moon, Dimitris Papailiopoulos, Kangwook Lee.

  14. [VLMixer'22] VLMixer: Unpaired Vision-Language Pre-training via Cross-Modal CutMix. ICML 2022. [code]

    Teng Wang, Wenhao Jiang, Zhichao Lu, Feng Zheng, Ran Cheng, Chengguo Yin, Ping Luo

  15. [AutoMix'22] AutoMix: Unveiling the Power of Mixup for Stronger Classifiers. ECCV 2022. [code]

    Zicheng Liu, Siyuan Li, Di Wu, Zihan Liu, Zhiyuan Chen, Lirong Wu, Stan Z. Li.

  16. [TokenMix'22] TokenMix: Rethinking Image Mixing for Data Augmentation in Vision Transformers. ECCV 2022. [code]

    Jihao Liu, Boxiao Liu, Hang Zhou, Hongsheng Li, Yu Liu

  17. [MDD'22] Towards Understanding the Data Dependency of Mixup-style Training. ICLR 2022. [code]

    Muthu Chidambaram, Xiang Wang, Yuzheng Hu, Chenwei Wu, Rong Ge.

  18. [WH-Mixup'22] When and How Mixup Improves Calibration. ICML 2022.

    Linjun Zhang, Zhun Deng, Kenji Kawaguchi, James Zou.

  19. [RegMixup'22] RegMixup: Mixup as a Regularizer Can Surprisingly Improve Accuracy and Out-of-Distribution Robustness. NeurIPS 2022. [code]

    Francesco Pinto, Harry Yang, Ser-Nam Lim, Philip H.S. Torr, Puneet K. Dokania.

  20. [RecursiveMix'22] RecursiveMix: Mixed Learning with History. NeurIPS 2022. [code]

    Lingfeng Yang, Xiang Li, Borui Zhao, Renjie Song, Jian Yang.

  21. [MSDA'22] A Unified Analysis of Mixed Sample Data Augmentation: A Loss Function Perspective. NeurIPS 2022. [code]

    Chanwoo Park, Sangdoo Yun, Sanghyuk Chun.
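
As a concrete example of the patch-based variants above, here is a minimal CutMix-style sketch (entry 4), assuming NCHW image batches and one-hot labels; the helper name and box-sampling details are illustrative.

```python
import torch

def cutmix_batch(x, y_onehot, alpha=1.0):
    """CutMix sketch: paste a random box from a shuffled copy of the
    batch and re-weight labels by the actual pasted area."""
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    perm = torch.randperm(x.size(0))
    H, W = x.shape[-2:]
    # Box with target area ratio (1 - lam), centered at a random pixel.
    rh, rw = int(H * (1 - lam) ** 0.5), int(W * (1 - lam) ** 0.5)
    cy, cx = torch.randint(H, (1,)).item(), torch.randint(W, (1,)).item()
    y1, y2 = max(cy - rh // 2, 0), min(cy + rh // 2, H)
    x1, x2 = max(cx - rw // 2, 0), min(cx + rw // 2, W)
    x_mixed = x.clone()
    x_mixed[..., y1:y2, x1:x2] = x[perm][..., y1:y2, x1:x2]
    lam_eff = 1 - (y2 - y1) * (x2 - x1) / (H * W)  # fraction of original kept
    return x_mixed, lam_eff * y_onehot + (1 - lam_eff) * y_onehot[perm]
```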

Contrastive Learning with Mixup

  1. [MixCo'20] MixCo: Mix-up Contrastive Learning for Visual Representation. NeurIPS 2020 Workshop. [code]

    Sungnyun Kim, Gihun Lee, Sangmin Bae, Se-Young Yun.

  2. [MoCHi'20] Hard Negative Mixing for Contrastive Learning. NeurIPS 2020. [code]

    Yannis Kalantidis, Mert Bulent Sariyildiz, Noe Pion, Philippe Weinzaepfel, Diane Larlus.

  3. [i-Mix'21] i-Mix: A Domain-Agnostic Strategy for Contrastive Representation Learning. ICLR 2021. [code] (see the sketch after this list)

    Kibok Lee, Yian Zhu, Kihyuk Sohn, Chun-Liang Li, Jinwoo Shin, Honglak Lee.

  4. [FT'21] Improving Contrastive Learning by Visualizing Feature Transformation. ICCV 2021 (Oral). [code]

    Rui Zhu, Bingchen Zhao, Jingen Liu, Zhenglong Sun, Chang Wen Chen.

  5. [Core-tuning'21] Unleashing the Power of Contrastive Self-Supervised Visual Models via Contrast-Regularized Fine-Tuning. NeurIPS 2021. [code]

    Yifan Zhang, Bryan Hooi, Dapeng Hu, Jian Liang, Jiashi Feng.

  6. [MixSiam'22] MixSiam: A Mixture-based Approach to Self-supervised Representation Learning. AAAI 2022.

    Xiaoyang Guo, Tianhao Zhao, Yutian Lin, Bo Du.

  7. [Un-Mix'22] Un-Mix: Rethinking Image Mixtures for Unsupervised Visual Representation. AAAI 2022. [code]

    Zhiqiang Shen, Zechun Liu, Zhuang Liu, Marios Savvides, Trevor Darrell, Eric Xing.

  8. [Metrix'22] It Takes Two to Tango: Mixup for Deep Metric Learning. ICLR 2022. [code]

    Shashanka Venkataramanan, Bill Psomas, Ewa Kijak, Laurent Amsaleg, Konstantinos Karantzalos, Yannis Avrithis.

  9. [ProGCL'22] ProGCL: Rethinking Hard Negative Mining in Graph Contrastive Learning. ICML 2022. [code]

    Jun Xia, Lirong Wu, Ge Wang, Jintao Chen, Stan Z. Li.

  10. [M-Mix'22] M-Mix: Generating Hard Negatives via Multi-sample Mixing for Contrastive Learning. KDD 2022. [code]

    Shaofeng Zhang, Meng Liu, Junchi Yan, Hengrui Zhang, Lingxiao Huang, Pinyan Lu, Xiaokang Yang.
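
Many of the papers above inject mixup into instance discrimination. A minimal sketch in the spirit of i-Mix (entry 3): mix the query inputs and mix the batch's virtual one-hot labels the same way. `embed` is an assumed encoder returning L2-normalized embeddings; reusing it for both queries and keys is a simplification.

```python
import torch
import torch.nn.functional as F

def imix_loss(embed, x, alpha=1.0, tau=0.2):
    """i-Mix-style sketch: each example's 'virtual label' is its own
    index in the batch; mixing inputs mixes those labels too."""
    n = x.size(0)
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    perm = torch.randperm(n)
    x_mixed = lam * x + (1 - lam) * x[perm]
    eye = torch.eye(n, device=x.device)
    targets = lam * eye + (1 - lam) * eye[perm]  # mixed virtual labels
    q = embed(x_mixed)                           # queries from mixed inputs
    k = embed(x).detach()                        # clean keys, no gradient
    logits = q @ k.t() / tau                     # (n, n) similarities
    return -(targets * F.log_softmax(logits, dim=1)).sum(dim=1).mean()
```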

Semi-supervised Learning with Mixup

  1. [ICT'19] Interpolation Consistency Training for Semi-Supervised Learning. IJCAI 2019. [code] (see the sketch after this list)

    Vikas Verma, Kenji Kawaguchi, Alex Lamb, Juho Kannala, Yoshua Bengio, David Lopez-Paz

  2. [MixMatch'19] MixMatch: A Holistic Approach to Semi-Supervised Learning. NeurIPS 2019. [code]

    David Berthelot, Nicholas Carlini, Ian Goodfellow, Nicolas Papernot, Avital Oliver, Colin Raffel.

  3. [P3MIX'22] Who Is Your Right Mixup Partner in Positive and Unlabeled Learning. ICLR 2022. [code]

    Changchun Li, Ximing Li, Lei Feng, Jihong Ouyang.
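
A minimal sketch of the ICT-style consistency term (entry 1): the student's prediction on a mixed unlabeled pair should match the corresponding mix of teacher predictions. `student` and `teacher` are assumed classifier modules returning logits; names are illustrative.

```python
import torch
import torch.nn.functional as F

def ict_consistency(student, teacher, x_unlabeled, alpha=1.0):
    """ICT sketch: prediction on a mixed input should match the mix of
    the teacher's predictions on the two originals."""
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    perm = torch.randperm(x_unlabeled.size(0))
    with torch.no_grad():
        p = F.softmax(teacher(x_unlabeled), dim=1)            # soft teacher targets
    x_mixed = lam * x_unlabeled + (1 - lam) * x_unlabeled[perm]
    q = F.softmax(student(x_mixed), dim=1)
    return F.mse_loss(q, lam * p + (1 - lam) * p[perm])       # consistency penalty
```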

Mixup in NLP

  1. [mixup-text'19] Augmenting Data with Mixup for Sentence Classification: An Empirical Study. arXiv 2019. [code]

    Hongyu Guo, Yongyi Mao, Richong Zhang.

  2. [TMix'20] MixText: Linguistically-Informed Interpolation of Hidden Space for Semi-Supervised Text Classification. ACL 2020. [code] (see the hidden-state mixup sketch after this list)

    Jiaao Chen, Zichao Yang, and Diyi Yang.

  3. [Mixup-Transformer'20] Mixup-Transformer: Dynamic Data Augmentation for NLP Tasks. COLING 2020.

    Lichao Sun, Congying Xia, Wenpeng Yin, Tingting Liang, Philip S. Yu, Lifang He.

  4. [AdvAug'20] AdvAug: Robust Adversarial Augmentation for Neural Machine Translation. ACL 2020.

    Yong Cheng, Lu Jiang, Wolfgang Macherey, Jacob Eisenstein.

  5. [SL'20] Sequence-Level Mixed Sample Data Augmentation. EMNLP 2020.

    Demi Guo, Yoon Kim, Alexander Rush.

  6. [BRMC'21] Better Robustness by More Coverage: Adversarial and Mixup Data Augmentation for Robust Finetuning. ACL 2021.

    Chenglei Si, Zhengyan Zhang, Fanchao Qi, Zhiyuan Liu, Yasheng Wang, Qun Liu, Maosong Sun.

  7. [HYPMIX'21] HYPMIX: Hyperbolic Interpolative Data Augmentation. EMNLP 2021. [code]

    Ramit Sawhney, Megh Thakkar, Shivam Agarwal, Di Jin, Diyi Yang, Lucie Flek

  8. [SSMix'21] SSMix: Saliency-Based Span Mixup for Text Classification. ACL Findings 2021. [code]

    Soyoung Yoon, Gyuwan Kim, Kyumin Park

  9. [Multilingual Mix'22] Multilingual Mix: Example Interpolation Improves Multilingual Neural Machine Translation. ACL 2022.

    Yong Cheng, Ankur Bapna, Orhan Firat, Yuan Cao, Pidong Wang, and Wolfgang Macherey

  10. [DMix'22] DMIX: Adaptive Distance-aware Interpolative Mixup. ACL 2022 (Short).

    Ramit Sawhney, Megh Thakkar, Shrey Pandit, Ritesh Soun, Di Jin, Diyi Yang, Lucie Flek

  11. [STEMM'22] STEMM: Self-learning with Speech-text Manifold Mixup for Speech Translation. ACL 2022. [code]

    Qingkai Fang, Rong Ye, Lei Li, Yang Feng, Mingxuan Wang.

  12. [CsaNMT'22] Learning to Generalize to More: Continuous Semantic Augmentation for Neural Machine Translation. ACL 2022. [code]

    Xiangpeng Wei, Heng Yu, Yue Hu, Rongxiang Weng, Weihua Luo, Jun Xie, Rong Jin.

  13. [AUMS'22] On the Calibration of Pre-trained Language Models using Mixup Guided by Area Under the Margin and Saliency. ACL 2022. [code]

    Seo Yeon Park and Cornelia Caragea.

  14. [XAIMix'22] Explainability-Based Mix-Up Approach for Text Data Augmentation. TKDD 2022.

    Soonki Kwon, Younghoon Lee.

  15. [TreeMix'22] TreeMix: Compositional Constituency-based Data Augmentation for Natural Language Understanding. NAACL 2022. [code]

    Le Zhang, Zichao Yang, Diyi Yang.

  16. [X-Mixup'22] Enhancing Cross-lingual Transfer by Manifold Mixup. ICLR 2022. [code]

    Huiyun Yang, Huadong Chen, Hao Zhou, Lei Li.
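
A recurring pattern in this section is interpolating hidden states rather than raw tokens (e.g., TMix in entry 2, HYPMIX, X-Mixup). A minimal sketch, assuming a stack of Transformer-style blocks mapping (batch, seq, dim) to the same shape; the class and argument names are illustrative.

```python
import torch.nn as nn

class HiddenMixEncoder(nn.Module):
    """Run two inputs through the lower blocks separately, interpolate
    their hidden states at a sampled layer, then continue on the mix."""
    def __init__(self, blocks):
        super().__init__()
        self.blocks = nn.ModuleList(blocks)

    def forward(self, h_a, h_b, lam, mix_layer):
        for i, block in enumerate(self.blocks):
            if i <= mix_layer:
                h_a, h_b = block(h_a), block(h_b)  # encode both streams
            else:
                h_a = block(h_a)                   # continue on the mixed stream
            if i == mix_layer:
                h_a = lam * h_a + (1 - lam) * h_b  # interpolate hidden states
        return h_a
```

In MixText, for instance, lam is drawn from Beta(alpha, alpha) and the mixing layer is sampled from a small preset subset of layers.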

Other Applications

  1. [SMFM'22] Boosting Factorization Machines via Saliency-Guided Mixup. 2022. [code]

    Chenwang Wu, Defu Lian, Yong Ge, Min Zhou, Enhong Chen, Dacheng Tao.

  2. [MIX-TS'22] Mixing Up Contrastive Learning: Self-Supervised Representation Learning for Time Series. PRL 2022. [code] (see the sketch below)

    Kristoffer Wickstrøm, Michael Kampffmeyer, Karl Øyvind Mikalsen, Robert Jenssen.
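
A minimal sketch of the time-series augmentation in entry 2, where the mixing ratio lam itself becomes the prediction target of the contrastive pretext task; the function name and batch layout are illustrative assumptions.

```python
import torch

def mix_series(x, alpha=0.2):
    """Mix random pairs from a batch of series shaped (batch, channels, time).
    Returns the mixed batch, the partner indices, and lam, so a contrastive
    head can be trained to recover the mixing ratio."""
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    perm = torch.randperm(x.size(0))
    return lam * x + (1 - lam) * x[perm], perm, lam
```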
