# ML-DL-implementation


A Machine Learning and Deep Learning library in Python, using NumPy and Matplotlib.

## Why this repository?


This repository gives beginners and newcomers in the field of AI and ML a chance to understand the inner workings of popular learning algorithms. It presents simple, easy-to-analyze implementations of ML and DL algorithms in pure Python, using only NumPy as a backend for linear algebraic computations for the sake of efficiency.

The goal of this repository is not to create the most efficient implementation but the most transparent one, so that anyone with little knowledge of the field can contribute and learn.

## Installation

You can install the library by running the following command:

`python3 setup.py install`

For development purposes, you can use the `develop` option instead:

`python3 setup.py develop`
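If the install succeeds, the package should be importable. Here is a minimal sanity check, assuming the top-level package is named `MLlib` (an assumption based on the repository layout; adjust it if `setup.py` names the package differently):

```bash
# Sanity check after installation.
# NOTE: the package name MLlib is an assumption; adjust to match setup.py.
python3 -c "import MLlib; print(MLlib.__name__)"
```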

## Testing

To test your patch locally, follow the steps given below (the commands are summarized after this list):

1. Install `pytest-cov`. Skip this step if you already have the package.
2. Run `python3 -m pytest --doctest-modules --cov=./ --cov-report=html`. Then open `htmlcov/index.html` in your browser to view the coverage report. Try to ensure that the coverage does not decrease by more than 1% for your patch.
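Summarized as shell commands (assuming `pip` points at your Python 3 environment):

```bash
# Step 1: install the coverage plugin for pytest (skip if already installed)
pip install pytest-cov

# Step 2: run the test suite with doctests and write an HTML coverage report
python3 -m pytest --doctest-modules --cov=./ --cov-report=html
```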

## Contributing to the repository

Follow these steps to get started with contributing to the repository.

- Clone the project to your local environment. Use `git clone https://github.com/RoboticsClubIITJ/ML-DL-implementation/` to get a local copy of the source code.

- Install dependencies: you can use pip to install the dependencies on your computer. To install them, run `pip install -r requirements.txt`.

- Installation: use `python setup.py develop` if you want to set up for development, or `python setup.py install` if you only want to try and test out the repository.

- Make changes: work on an existing issue or create a new one. Once assigned, you can start working on it.

- While you are working, please make sure you follow standard programming guidelines. When you send us a PR, your code will be checked for PEP8 formatting, and tests will soon be added so that your changes do not break existing code. Use tools like flake8 to check your code for correct formatting (see the example after this list).
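For example, a local style check with flake8 could look like this (it runs over the whole working tree; narrow the path if you only want to check the files you changed):

```bash
# Install the style checker if it is not already available
pip install flake8

# Report PEP8 / style violations for the whole working tree
flake8 .
```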

## Algorithms Implemented

| Activations | Location | Optimizers | Location | Models | Location | Backend | Location | Utils | Location |
|---|---|---|---|---|---|---|---|---|---|
| ACTIVATION FUNCTIONS | | OPTIMIZERS | | MODELS | | BACKEND | | PRE-PROCESSING METHODS | |
| Sigmoid | activations.py | Gradient Descent | optimizers.py | Linear Regression | models.py | Autograd | autograd.py | Bell Curve | preprocessor_utils.py |
| Tanh | activations.py | Stochastic Gradient Descent | optimizers.py | Logistic Regression | models.py | Tensor | tensor.py | Standard_Scaler | preprocessor_utils.py |
| Softmax | activations.py | Mini Batch Gradient Descent | optimizers.py | Decision Tree Classifier | models.py | Functions | functional.py | MaxAbs_Scaler | preprocessor_utils.py |
| Softsign | activations.py | Momentum Gradient Descent | optimizers.py | KNN Classifier/Regressor | models.py | | | Z_Score_Normalization | preprocessor_utils.py |
| Relu | activations.py | Nesterov Accelerated Descent | optimizers.py | Naive Bayes | models.py | | | Mean_Normalization | preprocessor_utils.py |
| Leaky Relu | activations.py | Adagrad | optimizers.py | Gaussian Naive Bayes | models.py | | | Min Max Normalization | preprocessor_utils.py |
| Elu | activations.py | Adadelta | optimizers.py | Multinomial Naive Bayes | models.py | | | Feature Clipping | preprocessor_utils.py |
| Swish | activations.py | Adam | optimizers.py | Polynomial Regression | models.py | | | | |
| Unit Step | activations.py | | | Bernoulli Naive Bayes | models.py | | | | |
| | | | | Random Forest Classifier | models.py | | | | |
| | | | | K Means Clustering | models.py | | | | |
| | | | | Divisive Clustering | models.py | | | | |
| | | | | Agglomerative Clustering | models.py | | | | |
| | | | | Bayes Optimization | models.py | | | | |
| | | | | Numerical Outliers | models.py | | | | |
| | | | | Principal Component Analysis | models.py | | | | |
| | | | | Z_Score | models.py | | | | |
| | | | | Sequential Neural Network | models.py | | | | |
| Loss Functions | Location | Regularizer | Location | Metrics | Location |
|---|---|---|---|---|---|
| LOSS FUNCTIONS | | REGULARIZER | | METRICS | |
| Mean Squared Error | loss_func.py | L1_Regularizer | regularizer.py | Confusion Matrix | metrics.py |
| Logarithmic Error | loss_func.py | L2_Regularizer | regularizer.py | Precision | metrics.py |
| Absolute Error | loss_func.py | | | Accuracy | metrics.py |
| Cosine Similarity | loss_func.py | | | Recall | metrics.py |
| Log_cosh | loss_func.py | | | F1 Score | metrics.py |
| Huber | loss_func.py | | | F-B Theta | metrics.py |
| Mean Squared Log Error | loss_func.py | | | Specificity | metrics.py |
| Mean Absolute Percentage Error | loss_func.py | | | | |
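To give a flavour of the style these implementations aim for, here is a standalone NumPy sketch of gradient descent applied to linear regression. It is deliberately independent of this library's actual API (class and function names in the package may differ; check `models.py` and `optimizers.py` for the real interfaces) and only illustrates the kind of transparent, NumPy-only implementation listed above.

```python
import numpy as np

# Toy data: y = 3x + 2 with a little noise
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(100, 1))
y = 3 * X[:, 0] + 2 + 0.1 * rng.standard_normal(100)

# Parameters to learn: weight and bias
w, b = 0.0, 0.0
lr = 0.1  # learning rate

for epoch in range(200):
    y_pred = w * X[:, 0] + b
    error = y_pred - y
    # Gradients of the mean squared error loss w.r.t. w and b
    grad_w = 2 * np.mean(error * X[:, 0])
    grad_b = 2 * np.mean(error)
    # Plain gradient descent update
    w -= lr * grad_w
    b -= lr * grad_b

print(f"learned w={w:.3f}, b={b:.3f}")  # should end up close to w=3, b=2
```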