
mlcourse.ai – Open Machine Learning Course

License: CC BY-NC-SA 4.0

mlcourse.ai is an open Machine Learning course by OpenDataScience. The course is designed to balance theory and practice, and you can take part in several Kaggle Inclass competitions held during the course. From spring 2017 to fall 2019, six sessions of mlcourse.ai took place: 26k participants applied, 10k passed the first assignment, and about 1500 finished the course. Currently, the course is in self-paced mode. Check out the thorough Roadmap guiding you through self-paced mlcourse.ai.

Mirrors (🇬🇧-only): mlcourse.ai (main site), Kaggle Dataset (the same notebooks as Kaggle Notebooks)

Self-paced passing

The Roadmap will guide you through 11 weeks of mlcourse.ai. For each week, from Pandas to Gradient Boosting, it lists which articles to read, which lectures to watch, and which assignments to complete.

Articles

This is the list of articles published on medium.com 🇬🇧 and habr.com 🇷🇺, along with notebooks in Chinese 🇨🇳 and links to Kaggle Notebooks (in English). Icons are clickable.

  1. Exploratory Data Analysis with Pandas πŸ‡¬πŸ‡§ πŸ‡·πŸ‡Ί πŸ‡¨πŸ‡³, Kaggle Notebook
  2. Visual Data Analysis with Python πŸ‡¬πŸ‡§ πŸ‡·πŸ‡Ί πŸ‡¨πŸ‡³, Kaggle Notebooks: part1, part2
  3. Classification, Decision Trees and k Nearest Neighbors πŸ‡¬πŸ‡§ πŸ‡·πŸ‡Ί πŸ‡¨πŸ‡³, Kaggle Notebook
  4. Linear Classification and Regression πŸ‡¬πŸ‡§ πŸ‡·πŸ‡Ί πŸ‡¨πŸ‡³, Kaggle Notebooks: part1, part2, part3, part4, part5
  5. Bagging and Random Forest πŸ‡¬πŸ‡§ πŸ‡·πŸ‡Ί πŸ‡¨πŸ‡³, Kaggle Notebooks: part1, part2, part3
  6. Feature Engineering and Feature Selection πŸ‡¬πŸ‡§ πŸ‡·πŸ‡Ί πŸ‡¨πŸ‡³, Kaggle Notebook
  7. Unsupervised Learning: Principal Component Analysis and Clustering πŸ‡¬πŸ‡§ πŸ‡·πŸ‡Ί πŸ‡¨πŸ‡³, Kaggle Notebook
  8. Vowpal Wabbit: Learning with Gigabytes of Data πŸ‡¬πŸ‡§ πŸ‡·πŸ‡Ί πŸ‡¨πŸ‡³, Kaggle Notebook
  9. Time Series Analysis with Python, part 1 πŸ‡¬πŸ‡§ πŸ‡·πŸ‡Ί πŸ‡¨πŸ‡³. Predicting future with Facebook Prophet, part 2 πŸ‡¬πŸ‡§, πŸ‡¨πŸ‡³ Kaggle Notebooks: part1, part2
  10. Gradient Boosting πŸ‡¬πŸ‡§ πŸ‡·πŸ‡Ί, πŸ‡¨πŸ‡³, Kaggle Notebook
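As a taste of the very first topic, here is a minimal Pandas EDA sketch. It is not taken from the course materials; the toy dataset and column names (`age`, `churn`) are purely illustrative.

```python
import pandas as pd

# Toy stand-in for a real dataset: customer age and a binary churn label
df = pd.DataFrame({
    "age": [25, 32, 47, 51, 38],
    "churn": [0, 0, 1, 1, 0],
})

# Basic shape and summary statistics
print(df.shape)
print(df.describe())

# Group-wise aggregation: mean age per churn class
print(df.groupby("churn")["age"].mean())
```

The same `describe`/`groupby` pattern scales directly to the larger datasets used in the course assignments.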

Lectures

Video lectures are uploaded to this YouTube playlist. Introduction: video, slides

  1. Exploratory data analysis with Pandas, video
  2. Visualization, main plots for EDA, video
  3. Decision trees: theory and practical part
  4. Logistic regression: theoretical foundations, practical part (baselines in the "Alice" competition)
  5. Ensembles and Random Forest – part 1. Classification metrics – part 2. Example of a business task, predicting a customer payment – part 3
  6. Linear regression and regularization - theory, LASSO & Ridge, LTV prediction - practice
  7. Unsupervised learning - Principal Component Analysis and Clustering
  8. Stochastic Gradient Descent for classification and regression - part 1, part 2 TBA
  9. Time series analysis with Python (ARIMA, Prophet) - video
  10. Gradient boosting: basic ideas - part 1, key ideas behind Xgboost, LightGBM, and CatBoost + practice - part 2
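To illustrate the core idea behind lecture 8 (stochastic gradient descent for regression), here is a minimal pure-Python sketch. It is not course code; the noiseless toy data and the learning rate are assumptions chosen for the example.

```python
# Stochastic gradient descent for 1-D linear regression: fit y = w*x + b
# by taking a small gradient step on one sample at a time.
xs = [i / 10 for i in range(50)]
ys = [2.0 * x + 1.0 for x in xs]  # noiseless target: true w = 2, b = 1

w, b, lr = 0.0, 0.0, 0.05
for epoch in range(200):
    for x, y in zip(xs, ys):
        err = (w * x + b) - y  # prediction error on this single sample
        w -= lr * err * x      # gradient of squared error w.r.t. w
        b -= lr * err          # gradient of squared error w.r.t. b

print(w, b)  # should approach the true values 2.0 and 1.0
```

On noiseless data the per-sample updates converge to the true coefficients; with real data one would shuffle samples and decay the learning rate, as discussed in the lecture.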

Assignments

  1. Exploratory data analysis with Pandas, nbviewer, Kaggle Notebook, solution
  2. Analyzing cardiovascular disease data, nbviewer, Kaggle Notebook, solution
  3. Decision trees with a toy task and the UCI Adult dataset, nbviewer, Kaggle Notebook, solution
  4. Sarcasm detection, Kaggle Notebook, solution. Linear Regression as an optimization problem, nbviewer, Kaggle Notebook
  5. Logistic Regression and Random Forest in the credit scoring problem, nbviewer, Kaggle Notebook, solution
  6. Exploring OLS, Lasso and Random Forest in a regression task, nbviewer, Kaggle Notebook, solution
  7. Unsupervised learning, nbviewer, Kaggle Notebook, solution
  8. Implementing online regressor, nbviewer, Kaggle Notebook, solution
  9. Time series analysis, nbviewer, Kaggle Notebook, solution
  10. Beating baseline in a competition, Kaggle Notebook

Kaggle competitions

  1. Catch Me If You Can: Intruder Detection through Webpage Session Tracking. Kaggle Inclass
  2. DotA 2 winner prediction. Kaggle Inclass

Citing mlcourse.ai

If you happen to cite mlcourse.ai in your work, you can use the following BibTeX entry:

@misc{mlcourse_ai,
    author = {Kashnitsky, Yury},
    title = {mlcourse.ai – Open Machine Learning Course},
    year = {2020},
    publisher = {GitHub},
    journal = {GitHub repository},
    howpublished = {\url{https://github.com/Yorko/mlcourse.ai}},
}

Community

Discussions are held in the #mlcourse_ai channel of the OpenDataScience (ods.ai) Slack team.

The course is free, but you can support the organizers by making a pledge on Patreon (monthly support) or a one-time payment on Ko-fi. This helps foster the spread of Machine Learning in the world!

