From 4946e9c59d6bd9ffd8f050d62f333a16c31b0d30 Mon Sep 17 00:00:00 2001
From: Piotr
Date: Tue, 9 Apr 2019 14:09:43 +0200
Subject: [PATCH] Update README.md

---
 README.md | 24 +++++++++++++++++++-----
 1 file changed, 19 insertions(+), 5 deletions(-)

diff --git a/README.md b/README.md
index 59fd7293..cc6e0552 100644
--- a/README.md
+++ b/README.md
@@ -5,13 +5,19 @@
 [![Coverage Status](https://coveralls.io/repos/github/mljar/mljar-supervised/badge.svg?branch=master)](https://coveralls.io/github/mljar/mljar-supervised?branch=master)
 [![PyPI pyversions](https://img.shields.io/pypi/pyversions/mljar-supervised.svg)](https://pypi.python.org/pypi/mljar-supervised/)
 
-[![Machine Learning for Humans](images/the-mljar.svg =100x)](https://mljar.com)
+[![Machine Learning for Humans](images/the-mljar.svg)](https://mljar.com)
 
-**The new standard in Machine Learning!** Always have best model which is selected and tuned.
+## The new standard in Machine Learning!
 
-Collection of supervised methods (including processing), used in MLJAR AutoML solution.
+Thanks to Automated Machine Learning, you don't need to worry about different machine learning interfaces. You don't need to know all the algorithms and their hyper-parameters. With AutoML, model tuning and training are painless.
+
+In the current version only binary classification is supported, with optimization of the LogLoss metric.
 
-With this package you can handle any supervised ML algorithm with the same interface.
+## Example
+
+
+## The tuning algorithm
 
 ## Installation
 
@@ -33,7 +39,7 @@ Python 3.6 is required.
 
 ## Usage
 
-This is Automated Machine Learning package, so all hard tasks is done for you. The interface is simple but if necessary it gives you ability to control the training process.
+This is an Automated Machine Learning package, so all the hard tasks are done for you. The interface is simple, but if necessary it gives you the ability to control the training process.
 
 #### Train and predict
 
@@ -55,6 +61,14 @@ By the default, the training should finish in less than 1 hour and as ML algorit
 
 The parameters that you can use to control the training process are:
 - **total_time_limit** - it is a total time limit that AutoML can spend for searching to the best ML model. It is in seconds. _Default is set to 3600 seconds._
+- **learner_time_limit** - the time limit for training a single model; in the case of `k`-fold cross-validation, the time spent on training is `k*learner_time_limit`. This parameter is only considered when `total_time_limit` is set to None. _Default is set to 120 seconds._
+- **algorithms** - the list of algorithms that will be checked. _Default is set to ["CatBoost", "Xgboost", "RF", "LightGBM", "NN"]._
+- **start_random_models** - the number of models to check with the _not so random_ algorithm. _Default is set to 10._
+- **hill_climbing_steps** - the number of hill climbing steps used in model tuning. _Default is set to 3._
+- **top_models_to_improve** - the number of models considered for improvement in each hill climbing step. _Default is set to 5._
+- **train_ensemble** - decides whether an ensemble model is trained at the end of the AutoML fit procedure. _Default is set to True._
+- **verbose** - controls printouts. _Default is set to True._
+
 ## Development
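
The parameters above describe a two-phase search: first evaluate a batch of starting candidates, then repeatedly improve the best of them by hill climbing. The loop they imply can be sketched in plain Python; everything below (including the toy `evaluate` scoring function and all helper names) is illustrative only and is not part of the mljar-supervised API:

```python
import random

def evaluate(params):
    # Stand-in for training a model and measuring LogLoss (lower is better).
    return (params["depth"] - 5) ** 2 + abs(params["lr"] - 0.1)

def random_params(rng):
    # One candidate hyper-parameter set.
    return {"depth": rng.randint(1, 10), "lr": rng.choice([0.01, 0.05, 0.1, 0.2])}

def mutate(params, rng):
    # One hill-climbing move: perturb a single hyper-parameter.
    new = dict(params)
    if rng.random() < 0.5:
        new["depth"] = max(1, new["depth"] + rng.choice([-1, 1]))
    else:
        new["lr"] = rng.choice([0.01, 0.05, 0.1, 0.2])
    return new

def search(start_random_models=10, hill_climbing_steps=3,
           top_models_to_improve=5, seed=0):
    rng = random.Random(seed)
    # Phase 1: score `start_random_models` starting candidates.
    results = [(evaluate(p), p)
               for p in (random_params(rng) for _ in range(start_random_models))]
    # Phase 2: in each step, mutate the current top candidates and keep the results.
    for _ in range(hill_climbing_steps):
        results.sort(key=lambda r: r[0])
        for _, params in results[:top_models_to_improve]:
            candidate = mutate(params, rng)
            results.append((evaluate(candidate), candidate))
    # Return the best (score, params) pair found.
    return min(results, key=lambda r: r[0])

best_score, best_params = search()
```

In the real package a time budget (`total_time_limit` / `learner_time_limit`) would bound this loop rather than a fixed step count, and scoring a candidate means cross-validated training rather than a formula.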