MIHA

An optimizer for configuring neural network hyperparameters.

What does this library do? - The module optimizes the hyperparameters of a neural network with a pre-defined architecture.

What deep learning libraries can this module work with? - PyTorch.

What algorithm is used for optimization? - An evolutionary algorithm with mutation and crossover operators. The neural network is trained continuously during the evolution process.
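
MIHA's actual interface lives in its docstrings and examples; purely to illustrate the general idea of evolutionary hyperparameter search with mutation and crossover, here is a minimal self-contained sketch (all names are hypothetical, not MIHA's API):

```python
import random

# Hypothetical search space: each individual is one hyperparameter configuration.
SEARCH_SPACE = {
    "lr": [1e-4, 1e-3, 1e-2],
    "batch_size": [16, 32, 64],
    "hidden_size": [64, 128, 256],
}

def random_config():
    return {key: random.choice(values) for key, values in SEARCH_SPACE.items()}

def crossover(parent_a, parent_b):
    # Uniform crossover: each hyperparameter is inherited from a random parent.
    return {key: random.choice([parent_a[key], parent_b[key]]) for key in SEARCH_SPACE}

def mutate(config, rate=0.3):
    # With probability `rate`, resample a hyperparameter from the search space.
    return {
        key: random.choice(SEARCH_SPACE[key]) if random.random() < rate else value
        for key, value in config.items()
    }

def fitness(config):
    # Placeholder: in practice this would train the network briefly with
    # `config` and return, e.g., the negative validation loss.
    return -(config["lr"] - 1e-3) ** 2 - 1.0 / config["hidden_size"]

def evolve(pop_size=10, generations=5):
    population = [random_config() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]  # selection: keep the best half
        children = [
            mutate(crossover(random.choice(parents), random.choice(parents)))
            for _ in range(pop_size - len(parents))
        ]
        population = parents + children
    return max(population, key=fitness)

print(evolve())
```

In MIHA the fitness evaluation is the network training itself, which is why the network keeps training throughout the evolution.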

The main concept

[Figure: the main concept of MIHA]

Requirements

python >= 3.7
numpy
cudatoolkit == 10.2
torchvision == 0.7.0
pytorch == 1.6.0
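
These pins follow conda naming (on PyPI the PyTorch package is called torch), so one way to set up the environment, assuming the official pytorch conda channel, is:

```
conda install pytorch==1.6.0 torchvision==0.7.0 cudatoolkit=10.2 -c pytorch
conda install numpy
```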

Documentation

Description of the submodules: for now, all the necessary descriptions can be found in the docstrings.

How to use

How to run the algorithm is shown in the examples (a minimal sketch of the typical evaluation step follows below):

Comparison with competing solutions (Jupyter notebooks)
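
The notebooks show the real workflow; the general pattern is a PyTorch model whose architecture is fixed while its hyperparameters vary between candidates, each candidate being scored by a short training run. A sketch under these assumptions (names are illustrative, not MIHA's API):

```python
import torch
import torch.nn as nn

def build_model(hidden_size: int) -> nn.Module:
    # Pre-defined architecture; only the hyperparameters differ between candidates.
    return nn.Sequential(
        nn.Linear(28 * 28, hidden_size),
        nn.ReLU(),
        nn.Linear(hidden_size, 10),
    )

def evaluate(config: dict, data: torch.Tensor, target: torch.Tensor) -> float:
    # A short training run serves as the fitness of one configuration.
    model = build_model(config["hidden_size"])
    optimizer = torch.optim.Adam(model.parameters(), lr=config["lr"])
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(10):  # a few steps are enough to rank configurations
        optimizer.zero_grad()
        loss = loss_fn(model(data), target)
        loss.backward()
        optimizer.step()
    return -loss.item()  # higher fitness = lower final loss

# Example call with random stand-in data:
data = torch.randn(32, 28 * 28)
target = torch.randint(0, 10, (32,))
print(evaluate({"hidden_size": 128, "lr": 1e-3}, data, target))
```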

Contacts

Feel free to contact us:
