AutoML Toolkit for Deep Learning
AutoGluon automates machine learning tasks, enabling you to easily achieve strong predictive performance in your applications. With just a few lines of code, you can train and deploy high-accuracy deep learning models on tabular, image, and text data.
Example
# First install package from terminal: pip install mxnet autogluon
from autogluon import TabularPrediction as task

# Load training and test data directly from a URL (CSV files with a 'class' column)
train_data = task.Dataset(file_path='https://autogluon.s3.amazonaws.com/datasets/Inc/train.csv')
test_data = task.Dataset(file_path='https://autogluon.s3.amazonaws.com/datasets/Inc/test.csv')

# Fit models to predict the 'class' column, then evaluate accuracy on held-out test data
predictor = task.fit(train_data=train_data, label='class')
performance = predictor.evaluate(test_data)
Resources
See the AutoGluon Website for instructions on:
- Installing AutoGluon
- Learning with tabular data
- Learning with image data
- Learning with text data
- More advanced topics such as Neural Architecture Search
Scientific Publications
- AutoGluon-Tabular: Robust and Accurate AutoML for Structured Data (arXiv, 2020)
Articles
- AutoGluon for tabular data: 3 lines of code to achieve top 1% in Kaggle competitions (AWS Open Source Blog, Mar 2020)
- Accurate image classification in 3 lines of code with AutoGluon (Medium, Feb 2020)
- AutoGluon overview & example applications (Towards Data Science, Dec 2019)
Supplementary Notebooks
Citing AutoGluon
If you use AutoGluon in a scientific publication, please cite the following paper:
Erickson, Nick, et al. "AutoGluon-Tabular: Robust and Accurate AutoML for Structured Data." arXiv preprint arXiv:2003.06505 (2020).
BibTeX entry:
@article{agtabular,
  title={AutoGluon-Tabular: Robust and Accurate AutoML for Structured Data},
  author={Erickson, Nick and Mueller, Jonas and Shirkov, Alexander and Zhang, Hang and Larroy, Pedro and Li, Mu and Smola, Alexander},
  journal={arXiv preprint arXiv:2003.06505},
  year={2020}
}
License
This library is licensed under the Apache 2.0 License.
Contributing to AutoGluon
We actively welcome code contributions to the AutoGluon project. If you are interested in contributing, please read the Contributing Guide to get started.