alejandrods / Analysis-of-classifiers-robust-to-noisy-labels

Analysis of robust classification algorithms for overcoming class-dependent labelling noise: Forward, Importance Reweighting and T-revision. We demonstrate methods for estimating the transition matrix in order to obtain better classifier performance when working with noisy data.


Analysis of classifiers robust to noisy labels

Paper: [arXiv:2106.00274](https://arxiv.org/abs/2106.00274)

Code: Open In Colab

We explore contemporary robust classification algorithms for overcoming class-dependent labelling noise: Forward, Importance Reweighting and T-revision. The classifiers are trained and evaluated on data with class-conditional random label noise, while the final test data is clean. We demonstrate methods for estimating the transition matrix in order to obtain better classifier performance when working with noisy data. We apply deep learning to three datasets and derive an end-to-end analysis with unknown noise on the CIFAR dataset from scratch. We analyse the effectiveness and robustness of the classifiers, and compare and contrast the results of each experiment using top-1 accuracy as our criterion.
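As a rough illustration of the ideas above (not the notebook's actual implementation), the sketch below shows the three ingredients in plain NumPy: estimating the transition matrix `T` with the anchor-point heuristic, the Forward correction of the loss, and the per-example weights used by Importance Reweighting. Function names and the toy data are our own assumptions for illustration.

```python
import numpy as np

def estimate_transition_matrix(noisy_posteriors):
    # Anchor-point heuristic: for each clean class i, take the sample the
    # noisy-label model assigns to class i most confidently; its vector of
    # noisy-class probabilities approximates row i of T.
    n_classes = noisy_posteriors.shape[1]
    T = np.zeros((n_classes, n_classes))
    for i in range(n_classes):
        anchor = np.argmax(noisy_posteriors[:, i])
        T[i] = noisy_posteriors[anchor]
    return T

def forward_corrected_nll(clean_probs, noisy_labels, T):
    # Forward correction: push the model's clean-class posterior through T,
    # P(noisy = j | x) = sum_i P(clean = i | x) * T[i, j],
    # then take cross-entropy against the observed noisy labels.
    noisy_probs = clean_probs @ T
    picked = noisy_probs[np.arange(len(noisy_labels)), noisy_labels]
    return -np.mean(np.log(picked + 1e-12))

def importance_weights(clean_probs, noisy_probs, noisy_labels):
    # Importance reweighting: weight each example by the ratio of the clean
    # to the noisy posterior evaluated at its observed (noisy) label.
    idx = np.arange(len(noisy_labels))
    return clean_probs[idx, noisy_labels] / (noisy_probs[idx, noisy_labels] + 1e-12)

# Toy check with a known symmetric-noise transition matrix (flip rate 0.2).
T_true = np.array([[0.8, 0.1, 0.1],
                   [0.1, 0.8, 0.1],
                   [0.1, 0.1, 0.8]])
clean = np.eye(3)                  # three perfectly classified anchor samples
noisy_post = clean @ T_true        # their noisy-class posteriors
T_hat = estimate_transition_matrix(noisy_post)
```

T-revision then treats an estimate like `T_hat` as an initialisation and fine-tunes a small additive correction to it jointly with the classifier, since anchor points are rarely exact in practice.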

Authors

Alex Díaz
Damian Steele

Cite

@misc{díaz2021analysis,
      title={Analysis of classifiers robust to noisy labels}, 
      author={Alex Díaz and Damian Steele},
      year={2021},
      eprint={2106.00274},
      archivePrefix={arXiv},
      primaryClass={cs.LG}
}



Languages

Jupyter Notebook 100.0%