Binary Logistic Regressor

I have implemented Logistic Regression for binary classification from scratch with all three types of gradient descent (batch, stochastic, and mini-batch), and also included hyper-parameter tuning of lambda (the regularization coefficient) and the learning rate.

===> Binary_Logistic_Regression : the file that contains the Binary Logistic Regressor

The code is quite generic and can easily be used to fit any binary classification dataset.
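For context, all three gradient-descent variants share the same regularized cross-entropy update and differ only in how many training rows feed each step. The sketch below is a minimal illustration with hypothetical helper names, not the repo's actual API:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_step(w, X_batch, y_batch, lr, lamda, regularizer="l2"):
    """One weight update on a batch of rows (hypothetical helper, not the repo's API)."""
    preds = sigmoid(X_batch @ w)                          # predicted probabilities
    grad = X_batch.T @ (preds - y_batch) / len(y_batch)   # cross-entropy gradient
    if regularizer == "l2":
        grad += lamda * w                                 # L2 penalty gradient
    else:
        grad += lamda * np.sign(w)                        # L1 penalty (sub)gradient
    return w - lr * grad

# Batch GD passes the full training set per step, stochastic GD a single row
# (e.g. X[i:i+1]), and mini-batch GD a small slice (e.g. X[i:i+32]).
```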

#NOTE: As the title suggests, this is for binary classification problems only.

#Parameters (a hypothetical usage sketch follows this list):

  1. main_gradient_descent : can accept any of the three values ('batch', 'stochastic', 'mini-batch')
  2. regularizer : can accept either of the two values ('l1', 'l2')
  3. hyper_paremeters_assign : can accept either of the two values ('auto-assign', 'self-assign'). When 'auto-assign' is chosen, there is no need to provide 'lamda_value' and 'lr_value'; they will be chosen by hyper-parameter tuning. When 'self-assign' is chosen, 'lamda_value' and 'lr_value' must be provided.
  4. hyper_parameters_tuning_gradient_descent : can accept any of the three values ('batch', 'stochastic', 'mini-batch')
  5. max_iter : can accept any integer value
  6. early_stopping : can accept either of the two values (True, False)
  7. lamda_value : can accept any float value. Should only be given when 'self-assign' is chosen at (3)
  8. lr_value : can accept any float value. Should only be given when 'self-assign' is chosen at (3)
  9. monitor : can accept either of the two values ('val_error', 'val_accuracy')
  10. paitiance : can accept any integer value
  11. error_roundoff : can accept any integer value
  12. acc_roundoff : can accept any integer value
  13. acc_change : can accept any numerical value (integer or float)
  14. error_change : can accept any numerical value (integer or float)
  15. verbose : can accept either of the two values (True, False)
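A hypothetical instantiation (the class name and exact call signature are assumptions; check Binary_Logistic_Regression for the actual spelling) might look like:

```python
# Assumed class name; the parameter names below are taken from the list above.
model = Binary_Logistic_Regression(
    main_gradient_descent="mini-batch",
    regularizer="l2",
    hyper_paremeters_assign="auto-assign",             # lamda_value / lr_value picked by tuning
    hyper_parameters_tuning_gradient_descent="mini-batch",
    max_iter=1000,
    early_stopping=True,
    monitor="val_accuracy",
    paitiance=5,
    verbose=True,
)

# With 'self-assign', the regularization coefficient and learning rate are fixed up front:
model = Binary_Logistic_Regression(
    main_gradient_descent="batch",
    regularizer="l1",
    hyper_paremeters_assign="self-assign",
    lamda_value=0.01,
    lr_value=0.001,
    max_iter=500,
    early_stopping=False,
)
```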

Libraries used inside the Binary Logistic Regression file:
==> Scikit-learn: for train_test_split, confusion_matrix, classification_report and accuracy reporting
==> NumPy: since NumPy calculations are faster, all calculations are done with NumPy
==> Random: to initialize the weights randomly at the beginning
==> OS: used by the save-model functionality
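A minimal, self-contained sketch of how the scikit-learn utilities above are typically wired around such a model (the fit/predict step is stubbed out here because the actual method names are not documented in this README):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix, classification_report, accuracy_score

# Toy binary dataset purely for illustration
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# In real use, fit the regressor on (X_train, y_train) and predict on X_test;
# a trivial rule stands in for the model's predictions so this snippet runs on its own.
y_pred = (X_test[:, 0] + X_test[:, 1] > 0).astype(int)

print(confusion_matrix(y_test, y_pred))
print(classification_report(y_test, y_pred))
print("accuracy:", accuracy_score(y_test, y_pred))
```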
