![ODS stickers](https://github.com/Yorko/mlcourse.ai/blob/main/img/ods_stickers.jpg)

**[mlcourse.ai](https://mlcourse.ai) – Open Machine Learning Course**

[![License: CC BY-NC-SA 4.0](https://img.shields.io/badge/license-CC%20BY--NC--SA%204.0-green)](https://creativecommons.org/licenses/by-nc-sa/4.0/) [![Slack](https://img.shields.io/badge/slack-ods.ai-orange)](https://opendatascience.slack.com/archives/C91N8TL83/p1567408586359500) [![Donate](https://img.shields.io/badge/support-patreon-red)](https://www.patreon.com/ods_mlcourse) [![Donate](https://img.shields.io/badge/support-ko--fi-red)](https://ko-fi.com/mlcourse_ai)

[mlcourse.ai](https://mlcourse.ai) is an open Machine Learning course by [OpenDataScience (ods.ai)](https://ods.ai/), led by [Yury Kashnitsky (yorko)](https://yorko.github.io/). Holding both a Ph.D. in applied math and a Kaggle Competitions Master tier, Yury aimed to design an ML course with a good balance of theory and practice. Thus, the course greets you with math formulae in lectures and plenty of practice in the form of assignments and Kaggle Inclass competitions. Currently, the course is in a **self-paced mode**. Here we guide you through the self-paced [mlcourse.ai](https://mlcourse.ai).

__Bonus:__ Additionally, you can purchase a Bonus Assignments pack with the best non-demo versions of [mlcourse.ai](https://mlcourse.ai/) assignments. Select the ["Bonus Assignments" tier](https://www.patreon.com/ods_mlcourse); see the main page [mlcourse.ai](https://mlcourse.ai/) for the details of the deal.

Mirrors (:uk:-only): [mlcourse.ai](https://mlcourse.ai) (main site), [Kaggle Dataset](https://www.kaggle.com/kashnitsky/mlcourse) (the same notebooks as Kaggle Notebooks)

### Self-paced passing

You are guided through 10 weeks of [mlcourse.ai](https://mlcourse.ai). For each week, from Pandas to Gradient Boosting, instructions specify which articles to read, which lectures to watch, and which assignments to complete.

### Articles

Below is the list of articles published on medium.com :uk: and habr.com :ru:. Notebooks in Chinese :cn: and Kaggle Notebooks (in English) are also linked. Icons are clickable.

  1. Exploratory Data Analysis with Pandas :uk: :ru: :cn:, Kaggle Notebook
  2. Visual Data Analysis with Python :uk: :ru: :cn:, Kaggle Notebooks: part1, part2
  3. Classification, Decision Trees and k Nearest Neighbors :uk: :ru: :cn:, Kaggle Notebook
  4. Linear Classification and Regression :uk: :ru: :cn:, Kaggle Notebooks: part1, part2, part3, part4, part5
  5. Bagging and Random Forest :uk: :ru: :cn:, Kaggle Notebooks: part1, part2, part3
  6. Feature Engineering and Feature Selection :uk: :ru: :cn:, Kaggle Notebook
  7. Unsupervised Learning: Principal Component Analysis and Clustering :uk: :ru: :cn:, Kaggle Notebook
  8. Vowpal Wabbit: Learning with Gigabytes of Data :uk: :ru: :cn:, Kaggle Notebook
  9. Time Series Analysis with Python, part 1 :uk: :ru: :cn:; Predicting the future with Facebook Prophet, part 2 :uk: :cn:. Kaggle Notebooks: part1, part2
  10. Gradient Boosting :uk: :ru: :cn:, Kaggle Notebook
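To give a flavor of where the course starts, here is a minimal sketch of the kind of exploratory analysis covered in the first topic. The toy DataFrame below is made up for illustration; the article itself works with the UCI Adult dataset.

```python
import pandas as pd

# A tiny made-up dataset standing in for the UCI Adult data used in Topic 1.
df = pd.DataFrame({
    "age": [39, 50, 38, 53, 28],
    "education": ["Bachelors", "Bachelors", "HS-grad", "11th", "Bachelors"],
    "salary": [">50K", "<=50K", "<=50K", "<=50K", ">50K"],
})

# Typical first EDA steps: dataset shape and class balance of the target...
print(df.shape)
print(df["salary"].value_counts())

# ...and group-wise aggregation, e.g. mean age per salary bracket.
mean_age = df.groupby("salary")["age"].mean()
print(mean_age)
```

The article walks through these primitives (`value_counts`, `groupby`, crosstabs, sorting) in far more depth on real data.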

### Lectures

Video lectures are uploaded to this YouTube playlist. Introduction: video, slides

  1. Exploratory data analysis with Pandas, video
  2. Visualization, main plots for EDA, video
  3. Decision trees: theory and practical part
  4. Logistic regression: theoretical foundations, practical part (baselines in the "Alice" competition)
  5. Ensembles and Random Forest – part 1. Classification metrics – part 2. Example of a business task, predicting a customer payment – part 3
  6. Linear regression and regularization - theory, LASSO & Ridge, LTV prediction - practice
  7. Unsupervised learning - Principal Component Analysis and Clustering
  8. Stochastic Gradient Descent for classification and regression - part 1, part 2 TBA
  9. Time series analysis with Python (ARIMA, Prophet) - video
  10. Gradient boosting: basic ideas - part 1, key ideas behind Xgboost, LightGBM, and CatBoost + practice - part 2

### Assignments

The following are demo assignments. Additionally, within the "Bonus Assignments" tier you can get access to the non-demo versions.

  1. Exploratory data analysis with Pandas, nbviewer, Kaggle Notebook, solution
  2. Analyzing cardiovascular disease data, nbviewer, Kaggle Notebook, solution
  3. Decision trees with a toy task and the UCI Adult dataset, nbviewer, Kaggle Notebook, solution
  4. Sarcasm detection, Kaggle Notebook, solution. Linear Regression as an optimization problem, nbviewer, Kaggle Notebook
  5. Logistic Regression and Random Forest in the credit scoring problem, nbviewer, Kaggle Notebook, solution
  6. Exploring OLS, Lasso and Random Forest in a regression task, nbviewer, Kaggle Notebook, solution
  7. Unsupervised learning, nbviewer, Kaggle Notebook, solution
  8. Implementing online regressor, nbviewer, Kaggle Notebook, solution
  9. Time series analysis, nbviewer, Kaggle Notebook, solution
  10. Beating baseline in a competition, Kaggle Notebook
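As a taste of what assignment 8 ("Implementing online regressor") is about, here is a bare-bones online SGD update for linear regression with squared loss. This is only an illustration of the idea, not the assignment's actual template; the data and hyperparameters below are made up.

```python
import numpy as np

def sgd_regressor(X, y, lr=0.05, n_epochs=500):
    """Plain online SGD for linear regression with squared loss:
    the weights are updated one sample at a time."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(n_epochs):
        for xi, yi in zip(X, y):
            err = xi @ w + b - yi   # prediction error on a single sample
            w -= lr * err * xi      # gradient step for the weights
            b -= lr * err           # gradient step for the bias
    return w, b

# Recover y = 2x + 1 from noiseless data.
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = 2 * X.ravel() + 1
w, b = sgd_regressor(X, y)
print(w, b)  # approaches w ≈ 2, b ≈ 1
```

The actual assignment adds the pieces that make this practical: feature scaling, a hold-out validation scheme, and comparison against scikit-learn's `SGDRegressor`.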

### Bonus assignments

Additionally, you can purchase a Bonus Assignments pack with the best non-demo versions of mlcourse.ai assignments. Select the "Bonus Assignments" tier on Patreon or a similar tier on Boosty (rus).

  

#### Details of the deal

mlcourse.ai is still in self-paced mode, but we offer Bonus Assignments with solutions for a contribution of $17/month. The idea is that you pay for roughly 1-5 months while studying the course materials, but even a single contribution opens your access to the bonus pack.

Note: the first payment is charged when you join the tier on Patreon, and the next payment is charged on the 1st day of the following month, so it's better to purchase the pack in the first half of the month.

mlcourse.ai is never meant to go fully monetized (it was created in the wonderful open ODS.ai community and will remain open and free), but the contributions help cover some operational costs, and Yury also put quite some effort into assembling the best assignments into one pack. Please note that, unlike the rest of the course content, the Bonus Assignments are copyrighted. Informally, Yury is fine with you sharing the pack with 2-3 friends, but public sharing of the Bonus Assignments pack is prohibited.


The bonus pack contains 10 assignments. In some of them you are challenged to beat a baseline in a Kaggle competition under thorough guidance ("Alice" and "Medium"); in others, to implement an algorithm from scratch: an efficient stochastic gradient descent classifier and gradient boosting.
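To illustrate the kind of from-scratch work involved, here is a compact sketch of gradient boosting for squared loss built on decision stumps. It is a toy illustration of the idea only, not the bonus assignment's code; all names and data below are made up.

```python
import numpy as np

def fit_stump(x, residuals):
    """Best single-feature threshold split minimizing squared error.
    Assumes x has at least two distinct values."""
    best = (np.inf, 0.0, 0.0, 0.0)
    for thr in np.unique(x)[:-1]:  # skip the max so the right side is never empty
        left, right = residuals[x <= thr], residuals[x > thr]
        sse = ((left - left.mean())**2).sum() + ((right - right.mean())**2).sum()
        if sse < best[0]:
            best = (sse, thr, left.mean(), right.mean())
    _, thr, left_val, right_val = best
    return lambda q: np.where(q <= thr, left_val, right_val)

def gradient_boost(x, y, n_rounds=100, lr=0.1):
    """For squared loss the negative gradient is just the residual,
    so each round fits a stump to what the ensemble still gets wrong."""
    base = y.mean()
    pred = np.full_like(y, base, dtype=float)
    stumps = []
    for _ in range(n_rounds):
        stump = fit_stump(x, y - pred)  # fit the current residuals
        pred += lr * stump(x)           # shrink each stump's contribution by lr
        stumps.append(stump)
    return lambda q: base + lr * sum(s(q) for s in stumps)

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([1.0, 1.0, 3.0, 3.0])
model = gradient_boost(x, y)
print(model(x))  # approaches [1, 1, 3, 3]
```

The bonus assignments go well beyond this sketch, e.g. handling other loss functions and making the implementation efficient.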

### Kaggle competitions

  1. Catch Me If You Can: Intruder Detection through Webpage Session Tracking. Kaggle Inclass
  2. Predicting popularity of a Medium article. Kaggle Inclass
  3. DotA 2 winner prediction. Kaggle Inclass

### Citing mlcourse.ai

If you happen to cite mlcourse.ai in your work, you can use this BibTeX record:

```
@misc{mlcourse_ai,
    author = {Kashnitsky, Yury},
    title = {mlcourse.ai – Open Machine Learning Course},
    year = {2020},
    publisher = {GitHub},
    journal = {GitHub repository},
    howpublished = {\url{https://github.com/Yorko/mlcourse.ai}},
}
```

### Community

Discussions are held in the **#mlcourse_ai_eng** channel of the [OpenDataScience (ods.ai)](https://ods.ai) Slack team. However, as of September 2022, ODS Slack cannot invite new users and retains only 90 days of history; a transition to Matrix is in progress.

*The course is free, but you can support the organizers by making a pledge on [Patreon](https://www.patreon.com/ods_mlcourse) (monthly support) or a one-time payment on [Ko-fi](https://ko-fi.com/mlcourse_ai).*


