
GradientDescent

Overview/Summary

This is an implementation of the gradient descent algorithm that finds the values of the parameters (m, b) which minimize the cost function, i.e. the line of best fit for the data.

Visualize data

(figure: scatter plot of the raw data)

Cost Function - Sum of Squared Error

A simple sum-of-squared-error (SSE) cost, averaged over the number of examples (i.e. the mean squared error):

def CostFunction(m, b, data):
    """Return the averaged squared error of the line y = m*x + b on the data."""
    m_examples = len(data)  # number of rows (examples) in the dataset
    sumError = 0
    for itr in range(m_examples):
        feature = data[itr, 0]
        label = data[itr, 1]
        predLabel = (m * feature) + b
        sumError += (label - predLabel) ** 2
    return sumError / m_examples
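The gradient-descent update itself is not listed in this README. A minimal sketch of one common variant (batch gradient descent with a fixed learning rate; the function names, learning rate, and iteration count below are assumptions, not taken from this repository) might look like:

```python
import numpy as np

def gradient_step(m, b, data, learning_rate):
    """One batch update of (m, b) against the gradient of the averaged SSE."""
    n = len(data)
    m_grad = 0.0
    b_grad = 0.0
    for i in range(n):
        x = data[i, 0]
        y = data[i, 1]
        error = y - ((m * x) + b)
        # Partial derivatives of mean((y - (m*x + b))**2) w.r.t. m and b
        m_grad += (-2.0 / n) * x * error
        b_grad += (-2.0 / n) * error
    # Move the parameters a small step against the gradient
    return m - learning_rate * m_grad, b - learning_rate * b_grad

def run_gradient_descent(data, m=0.0, b=0.0, learning_rate=0.05, iterations=5000):
    for _ in range(iterations):
        m, b = gradient_step(m, b, data, learning_rate)
    return m, b

# Tiny made-up example: points on the line y = 2x + 1
data = np.array([[0.0, 1.0], [1.0, 3.0], [2.0, 5.0], [3.0, 7.0]])
m, b = run_gradient_descent(data)
```

On this toy dataset the loop converges to m close to 2 and b close to 1; real data usually needs the learning rate and iteration count tuned.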

Visualize data plus fitting line

  • Before fitting the parameters (figure: data with the initial, unfitted line)

  • After running gradient descent (figure: data with the line of best fit)

Plot - Error

How the error decreases as the parameters move toward the minimum (figure: error vs. iteration).
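A curve like this can be produced by recording the cost after every update. A vectorized sketch with NumPy (all names and hyperparameters here are assumptions for illustration):

```python
import numpy as np

def descend_with_history(data, learning_rate=0.05, iterations=200):
    """Run batch gradient descent and record the mean squared error per step."""
    x, y = data[:, 0], data[:, 1]
    n = len(data)
    m = b = 0.0
    history = []
    for _ in range(iterations):
        error = y - (m * x + b)
        # Same gradient as the loop version, written with array operations
        m -= learning_rate * (-2.0 / n) * np.sum(x * error)
        b -= learning_rate * (-2.0 / n) * np.sum(error)
        history.append(np.mean((y - (m * x + b)) ** 2))
    return m, b, history

# Made-up points on y = 2x + 1, just to exercise the loop
data = np.array([[0.0, 1.0], [1.0, 3.0], [2.0, 5.0], [3.0, 7.0]])
m, b, history = descend_with_history(data)
```

Plotting `history` with `matplotlib.pyplot.plot(history)` then gives the decreasing error curve.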

Dependencies

  • numpy
  • pandas (to read the dataset)
  • matplotlib (for plotting)

References

Siraj Raval - The Math of Intelligence, Intro (YouTube)

Andrew Trask - A Neural Network in 13 Lines of Python (Part 2: Gradient Descent)