Made With ML
Applied ML · MLOps · Production

Join 20K+ developers in learning how to responsibly deliver value with ML.

If you need a refresher on ML algorithms, check out our ML Foundations repository (🔥 among the top ML repositories on GitHub).
| 📦 Product | 🔢 Data | 📈 Modeling |
| --- | --- | --- |
| Objective | Annotation | Baselines |
| Solution | Exploratory data analysis | Experiment tracking |
| Evaluation | Splitting | Optimization |
| Iteration | Preprocessing | |

| 📝 Scripting | (cont.) | 📦 Application | ✅ Testing |
| --- | --- | --- | --- |
| Organization | Styling | CLI | Code |
| Packaging | Makefile | API | Data |
| Documentation | Logging | | Models |

| ♻️ Reproducibility | 🚀 Production | (cont.) |
| --- | --- | --- |
| Git | Dashboard | Feature stores |
| Pre-commit | CI/CD | Deployment |
| Versioning | Pipelines | Monitoring |
| Docker | | |
📆 New lessons every month!
Subscribe for our monthly updates on new content.
Directory structure
```
app/
├── api.py       - FastAPI app
├── cli.py       - CLI app
└── schemas.py   - API model schemas
tagifai/
├── config.py    - configuration setup
├── data.py      - data processing components
├── eval.py      - evaluation components
├── main.py      - training/optimization pipelines
├── models.py    - model architectures
├── predict.py   - inference components
├── train.py     - training components
└── utils.py     - supplementary utilities
```
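As a rough sketch of how these modules fit together (the `PredictPayload` schema and the stubbed `predict_tags` below are illustrative assumptions, not the repository's actual definitions), a request schema from `app/schemas.py` might feed an inference component in `tagifai/predict.py`:

```python
from dataclasses import dataclass


@dataclass
class PredictPayload:
    """Hypothetical request schema (app/schemas.py) for a prediction endpoint."""
    texts: list


def predict_tags(texts):
    """Hypothetical inference component (tagifai/predict.py); the model is stubbed out."""
    return [{"input_text": text, "predicted_tags": []} for text in texts]


payload = PredictPayload(texts=["Transfer learning with BERT"])
predictions = predict_tags(payload.texts)
print(predictions)
```

In the real app, `api.py` and `cli.py` would both validate inputs against the shared schema before calling into `tagifai`, which keeps the ML code independent of how it is served.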
Documentation for this application can be found here.
Workflows
Use existing model
- Set up the environment.

```bash
export venv_name="venv"
make venv name=${venv_name} env="dev"
source ${venv_name}/bin/activate
```

- Pull the latest model.

```bash
dvc pull
```

- Run the application.

```bash
make app env="dev"
```
You can interact with the API directly or explore via the generated documentation at http://0.0.0.0:5000/docs.
Update model (CI/CD)
Coming soon after the CI/CD lesson, where the entire application will be retrained and deployed whenever we push new data (or trigger manual reoptimization/training). The deployed model, with performance comparisons to previously deployed versions, will be ready in a PR to merge into the main branch.
Update model (manual)
- Set up the development environment.

```bash
export venv_name="venv"
make venv name=${venv_name} env="dev"
source ${venv_name}/bin/activate
```

- Pull versioned data and model artifacts.

```bash
dvc pull
```
- Optimize using the distributions specified in `tagifai.main.objective`. This also writes the best trial's parameters to `config/params.json`.

```bash
tagifai optimize \
    --params-fp config/params.json \
    --study-name optimization \
    --num-trials 100
```
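The real `optimize` command uses the search distributions defined in `tagifai.main.objective`. As a hedged, stdlib-only stand-in (the hyperparameter names, ranges, and scoring function below are invented for illustration), the trial loop and the write to `params.json` look roughly like:

```python
import json
import pathlib
import random
import tempfile


def objective(params):
    # Toy stand-in for tagifai.main.objective: score a trial (higher is better).
    return -(params["lr"] - 0.01) ** 2 - (params["dropout"] - 0.5) ** 2


def optimize(num_trials, params_fp):
    random.seed(0)  # reproducible trials for this sketch
    best_score, best_params = float("-inf"), None
    for _ in range(num_trials):
        # Sample a trial from the (assumed) search distributions.
        params = {
            "lr": random.uniform(1e-4, 1e-1),
            "dropout": random.uniform(0.0, 0.8),
        }
        score = objective(params)
        if score > best_score:
            best_score, best_params = score, params
    # Persist the best trial's parameters, analogous to config/params.json.
    pathlib.Path(params_fp).write_text(json.dumps(best_params, indent=2))
    return best_params


params_fp = pathlib.Path(tempfile.mkdtemp()) / "params.json"
best = optimize(num_trials=100, params_fp=params_fp)
print(best)
```

In the actual project the sampling and pruning are handled by an optimization library driving `objective`; the essential contract is the same: many trials in, one best parameter set written to `config/params.json`.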
We'll cover how to train using compute instances on the cloud from Amazon Web Services (AWS) or Google Cloud Platform (GCP) in later lessons. In the meantime, if you don't have access to GPUs, check out the optimize.ipynb notebook for how to train on Colab and transfer the results locally: we run optimization, train the best model, then download and transfer its artifacts.
- Train a model (and save all its artifacts) using the parameters in `config/params.json` and publish metrics to `model/performance.json`. You can view the full run's details inside `experiments/{experiment_id}/{run_id}` or via the API (`GET /runs/{run_id}`).

```bash
tagifai train-model \
    --params-fp config/params.json \
    --model-dir model \
    --experiment-name best \
    --run-name model
```
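To make the artifact layout concrete, here is a minimal sketch of writing a run's artifacts under `experiments/{experiment_id}/{run_id}`. Only the run-directory pattern and `performance.json` come from the text above; the helper name and the exact files written are assumptions:

```python
import json
import pathlib
import tempfile
import uuid

root = pathlib.Path(tempfile.mkdtemp())  # stands in for the project root


def save_run(experiment_id, params, performance):
    """Write one run's artifacts under experiments/{experiment_id}/{run_id}."""
    run_id = str(uuid.uuid4())
    run_dir = root / "experiments" / experiment_id / run_id
    run_dir.mkdir(parents=True)
    (run_dir / "params.json").write_text(json.dumps(params, indent=2))
    (run_dir / "performance.json").write_text(json.dumps(performance, indent=2))
    return run_dir


run_dir = save_run("best", {"lr": 0.01}, {"f1": 0.83})
print(sorted(p.name for p in run_dir.iterdir()))
```

A `GET /runs/{run_id}` endpoint then only needs to resolve `run_id` to this directory and serialize its JSON files.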
- Predict tags for an input sentence. This uses the best model saved from `train-model`, but you can also specify a `run-id` to choose a specific model.

```bash
tagifai predict-tags --text "Transfer learning with BERT"  # test with the CLI app
make app env="dev"  # run the API and test there as well
```
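The actual prediction comes from the trained classifier; this keyword-lookup stand-in (the keyword-to-tag map is made up) only illustrates the text-in, tags-out interface that both the CLI and the API expose:

```python
# Hypothetical keyword→tag map; the real model learns these associations from data.
KEYWORD_TAGS = {
    "bert": "natural-language-processing",
    "transfer learning": "transfer-learning",
}


def predict_tags(text):
    """Return sorted tags whose keyword appears in the (lowercased) input text."""
    text = text.lower()
    return sorted(tag for keyword, tag in KEYWORD_TAGS.items() if keyword in text)


print(predict_tags("Transfer learning with BERT"))
# → ['natural-language-processing', 'transfer-learning']
```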
- View improvements. Once you're done training the best model with the current data version, best hyperparameters, etc., we can view the performance differences.

```bash
tagifai diff
```
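A minimal sketch of what a metrics diff computes, assuming `performance.json` holds a flat dict of metric names to scores (the metric names and values here are examples, not the project's real output):

```python
def diff_performance(prev, curr):
    """Per-metric delta for metrics present in both runs."""
    return {k: round(curr[k] - prev[k], 4) for k in prev if k in curr}


prev = {"f1": 0.78, "precision": 0.81}  # previously deployed run
curr = {"f1": 0.83, "precision": 0.80}  # current run
print(diff_performance(prev, curr))
# → {'f1': 0.05, 'precision': -0.01}
```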
- Commit to git. This will clean and update versioned assets (data, experiments), run tests, apply styling, etc.

```bash
git add .
git commit -m ""
git tag -a <TAG_NAME> -m ""
git push origin <BRANCH_NAME>
```
Commands
Docker

```bash
make docker  # docker build -t tagifai:latest -f Dockerfile .
             # docker run -p 5000:5000 --name tagifai tagifai:latest
```
Application

```bash
make app       # uvicorn app.api:app --host 0.0.0.0 --port 5000 --reload --reload-dir tagifai --reload-dir app
make app-prod  # gunicorn -c config/gunicorn.py -k uvicorn.workers.UvicornWorker app.api:app
```
Streamlit dashboard

```bash
make streamlit  # streamlit run streamlit/app.py
```
MLflow

```bash
make mlflow  # mlflow server -h 0.0.0.0 -p 5000 --backend-store-uri stores/model/
```
MkDocs

```bash
make docs  # python -m mkdocs serve
```
Testing

```bash
make great-expectations  # great_expectations checkpoint run [projects, tags]
make test                # pytest --cov tagifai --cov app --cov-report html
make test-non-training   # pytest -m "not training"
```
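`make test-non-training` works because slow tests are tagged with a pytest marker. A sketch, assuming a `training` marker is registered in the project's pytest configuration (the test names and bodies below are placeholders):

```python
import pytest


@pytest.mark.training  # deselected by: pytest -m "not training"
def test_full_training_run():
    assert True  # placeholder for a slow end-to-end training test


def test_fast_unit():
    assert True  # unmarked tests still run under -m "not training"
```

This lets CI run the cheap unit tests on every push while reserving marked training tests for scheduled or manual runs.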
Start JupyterLab

```bash
python -m ipykernel install --user --name=tagifai
jupyter labextension install @jupyter-widgets/jupyterlab-manager
jupyter labextension install @jupyterlab/toc
jupyter lab
```
You can also run all notebooks on Google Colab.
FAQ
Why is this free?
While this content is for everyone, it's especially targeted towards people who don't have as much opportunity to learn. I firmly believe that creativity and intelligence are randomly distributed but opportunity is siloed. I want to enable more people to create and contribute to innovation.
Who is the author?
- I've deployed large-scale ML systems at Apple, as well as smaller, constrained systems at startups, and I want to share the common principles I've learned along the way.
- I created Made With ML so that the community can explore, learn, and build ML, and along the way I learned how to grow it into an end-to-end product that's currently used by over 20K monthly active users.
- Connect with me on Twitter and LinkedIn.
To cite this course, please use:
```bibtex
@article{madewithml,
    title  = "Applied ML - Made With ML",
    author = "Goku Mohandas",
    url    = "https://madewithml.com/courses/mlops/",
    year   = "2021",
}
```