Update docs
thieu1995 committed Mar 27, 2024
1 parent 914eada commit 54622d9
Showing 3 changed files with 9 additions and 9 deletions.
2 changes: 1 addition & 1 deletion docs/source/conf.py
@@ -30,7 +30,7 @@
author = 'Thieu'

# The full version, including alpha/beta/rc tags
-release = '1.1.0'
+release = '1.1.1'


# -- General configuration ---------------------------------------------------
2 changes: 1 addition & 1 deletion docs/source/index.rst
@@ -6,7 +6,7 @@
Welcome to IntelELM's documentation!
====================================

-.. image:: https://img.shields.io/badge/release-1.1.0-yellow.svg
+.. image:: https://img.shields.io/badge/release-1.1.1-yellow.svg
:target: https://github.com/thieu1995/intelelm/releases

.. image:: https://img.shields.io/pypi/wheel/gensim.svg
14 changes: 7 additions & 7 deletions docs/source/pages/quick_start.rst
@@ -6,7 +6,7 @@ There are so many ways to install our library. For example:

* Install from the `PyPI release <https://pypi.python.org/pypi/intelelm />`_::

-$ pip install intelelm==1.1.0
+$ pip install intelelm==1.1.1


* Install directly from source code::
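A common way to install directly from source, assuming the GitHub repository linked elsewhere on this page (the exact commands in the docs may differ):

    $ pip install git+https://github.com/thieu1995/intelelm.git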
@@ -63,7 +63,7 @@ process consists of the following steps:
data.y_test = scaler_y.transform(np.reshape(data.y_test, (-1, 1)))
## Define the model
-model = ElmRegressor(hidden_size=10, act_name="elu")
+model = ElmRegressor(hidden_size=10, act_name="elu", seed=42)
## Test the model
model.fit(data.X_train, data.y_train)
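For context, a self-contained sketch of this regression workflow with the newly added seed argument; the sklearn dataset and split are stand-ins for the tutorial's own data object and scalers, and predict is assumed to follow the scikit-learn convention that the RFE example later on this page relies on:

    from sklearn.datasets import make_regression
    from sklearn.model_selection import train_test_split
    from intelelm import ElmRegressor

    # Stand-in data (the tutorial uses its own data object and scalers).
    X, y = make_regression(n_samples=500, n_features=8, noise=0.1, random_state=42)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

    model = ElmRegressor(hidden_size=10, act_name="elu", seed=42)  # seed is the argument added in this commit
    model.fit(X_train, y_train)
    y_pred = model.predict(X_test)  # assumed sklearn-style predict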
@@ -95,7 +95,7 @@ If you want to use the Whale Optimization-based ELM (WO-ELM) model, you can chan
opt_paras = {"name": "WOA", "epoch": 100, "pop_size": 30}
model = MhaElmRegressor(hidden_size=10, act_name="elu", obj_name="MSE",
optimizer="OriginalWOA", optimizer_paras=opt_paras, verbose=False)
optimizer="OriginalWOA", optimizer_paras=opt_paras, verbose=False, seed=42)
In the example above, I imported the MhaElmRegressor class, which contains all the metaheuristics-based ELM models for regression problems.
I then defined the parameters for the Whale Optimization algorithm and for the Whale Optimization-based ELM model itself.
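Presumably the new seed argument fixes both the random hidden weights and the metaheuristic search, making runs reproducible; a hedged check of that assumption, reusing the stand-in regression data from the sketch above:

    import numpy as np
    from intelelm import MhaElmRegressor

    opt_paras = {"name": "WOA", "epoch": 100, "pop_size": 30}
    kwargs = dict(hidden_size=10, act_name="elu", obj_name="MSE",
                  optimizer="OriginalWOA", optimizer_paras=opt_paras,
                  verbose=False, seed=42)
    m1 = MhaElmRegressor(**kwargs)
    m2 = MhaElmRegressor(**kwargs)
    m1.fit(X_train, y_train)
    m2.fit(X_train, y_train)
    # With a fixed seed, both runs should presumably yield identical predictions.
    np.testing.assert_allclose(m1.predict(X_test), m2.predict(X_test))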
@@ -110,7 +110,7 @@ MhaElmClassifier class (these are hybrid models combining metaheuristics algorit
from intelelm import ElmClassifier
-model = ElmClassifier(hidden_size=10, act_name="elu")
+model = ElmClassifier(hidden_size=10, act_name="elu", seed=42)
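A minimal classification counterpart under the same assumptions (stand-in dataset, sklearn-style fit and score):

    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from intelelm import ElmClassifier

    # Stand-in classification data.
    X, y = make_classification(n_samples=500, n_features=10, n_classes=2, random_state=42)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

    clf = ElmClassifier(hidden_size=10, act_name="elu", seed=42)  # seed added in this commit
    clf.fit(X_train, y_train)
    print(clf.score(X_test, y_test))  # assumed sklearn-style accuracy score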
@@ -120,7 +120,7 @@ MhaElmClassifier class (these are hybrid models combining metaheuristics algorit
opt_paras = {"name": "GA", "epoch": 100, "pop_size": 30}
model = MhaElmClassifier(hidden_size=10, act_name="elu", obj_name="BSL",
optimizer="BaseGA", optimizer_paras=opt_paras, verbose=False)
optimizer="BaseGA", optimizer_paras=opt_paras, verbose=False, seed=42)
-------------------
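And the GA-based hybrid classifier with the same new argument, reusing the stand-in classification data from the sketch above; obj_name="BSL" is copied verbatim from the hunk:

    from intelelm import MhaElmClassifier

    opt_paras = {"name": "GA", "epoch": 100, "pop_size": 30}
    clf = MhaElmClassifier(hidden_size=10, act_name="elu", obj_name="BSL",
                           optimizer="BaseGA", optimizer_paras=opt_paras,
                           verbose=False, seed=42)
    clf.fit(X_train, y_train)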
@@ -168,7 +168,7 @@ library to split and scale the data.
print(MhaElmClassifier.SUPPORTED_OPTIMIZERS)
print(MhaElmClassifier.SUPPORTED_CLS_OBJECTIVES)
opt_paras = {"name": "GA", "epoch": 10, "pop_size": 30}
-classifier = MhaElmClassifier(hidden_size=10, act_name="elu", obj_name="KLDL", optimizer="BaseGA", optimizer_paras=opt_paras)
+classifier = MhaElmClassifier(hidden_size=10, act_name="elu", obj_name="KLDL", optimizer="BaseGA", optimizer_paras=opt_paras, seed=42)
#### Step 6: Train the model
classifier.fit(X_train, y_train)
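A possible follow-up evaluation step (not part of this diff), assuming the classifier exposes an sklearn-style predict and that X_test and y_test come from the train/test split earlier in the tutorial:

    from sklearn.metrics import accuracy_score

    y_pred = classifier.predict(X_test)  # assumed predict method
    print("Accuracy:", accuracy_score(y_test, y_pred))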
@@ -217,7 +217,7 @@ In the example below, we use the Whale Optimization-based ELM model as the base
# create model and selector
opt_paras = {"name": "GA", "epoch": 100, "pop_size": 30}
model = MhaElmRegressor(hidden_size=10, act_name="relu", obj_name="MSE",
optimizer="BaseGA", optimizer_paras=opt_paras, verbose=False)
optimizer="BaseGA", optimizer_paras=opt_paras, verbose=False, seed=42)
selector = RFE(estimator=model)
selector.fit(X_train, y_train)
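Once selector.fit has run, the selection results come from the standard scikit-learn RFE attributes; nothing below is specific to intelelm beyond the estimator defined above:

    print(selector.support_)                       # boolean mask of the selected features
    print(selector.ranking_)                       # rank 1 marks the selected features
    X_train_reduced = selector.transform(X_train)  # keep only the selected columns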
