tutor-grad-mlp

An explicit backpropagation example in numpy for an MLP on MNIST. It uses gradient descent with momentum to reach acceptable accuracy on the flattened (vectorized) MNIST images. Check out the notebook for an example of how to create the network, train it, and evaluate it.
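As a rough illustration of the gradient-descent-with-momentum update mentioned above (a minimal sketch; the names `velocities`, `lr`, and `momentum` are illustrative and not the repository's actual identifiers):

```python
import numpy as np

def sgd_momentum_step(params, grads, velocities, lr=0.01, momentum=0.9):
    """One gradient-descent-with-momentum update.

    params, grads, velocities are lists of numpy arrays with matching shapes.
    Arrays are updated in place.
    """
    for p, g, v in zip(params, grads, velocities):
        v *= momentum   # decay the running velocity
        v -= lr * g     # fold in the current gradient step
        p += v          # move the parameters along the velocity
    return params, velocities
```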



Currently supported activation functions:

  • softmax (last layer only)
  • sigmoid
  • relu
  • tanh
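
A minimal numpy sketch of how these activations and their derivatives might look in an explicit backward pass (function names are illustrative; the repository's own implementations may differ):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(y):           # y = sigmoid(x)
    return y * (1.0 - y)

def relu(x):
    return np.maximum(0.0, x)

def relu_grad(x):
    return (x > 0).astype(x.dtype)

def tanh_grad(y):              # y = np.tanh(x)
    return 1.0 - y ** 2

def softmax(x):
    # numerically stable softmax over the last axis (used on the output layer only)
    z = x - x.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)
```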

Currently supported layer functions:

  • fully connected with bias
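
Below is a hedged sketch of the forward and backward pass for a fully connected layer with bias, in the explicit-backprop style described above (shapes and names are assumptions for illustration):

```python
import numpy as np

def fc_forward(x, W, b):
    """x: (batch, in_dim), W: (in_dim, out_dim), b: (out_dim,)"""
    return x @ W + b

def fc_backward(x, W, grad_out):
    """grad_out: gradient of the loss w.r.t. the layer output, (batch, out_dim)."""
    grad_W = x.T @ grad_out         # (in_dim, out_dim)
    grad_b = grad_out.sum(axis=0)   # (out_dim,)
    grad_x = grad_out @ W.T         # (batch, in_dim), passed to the previous layer
    return grad_x, grad_W, grad_b
```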
