This repository contains projects and practice exercises for Neural Networks and Deep Learning.
In this practice I built a vanilla neural network trained with mini-batch stochastic gradient descent (SGD). The network was designed with configurable settings such as the network structure, the number of training epochs, the mini-batch size, and the learning rate. Finally, the network was trained on the XOR, Iris, and MNIST datasets.
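As a rough sketch of that training loop (not the repository's actual code; `Network.update_mini_batch` is a hypothetical method standing in for the backpropagation-and-update step), a mini-batch SGD run might look like this:

```python
import random

def sgd(network, training_data, epochs, mini_batch_size, learning_rate):
    """Mini-batch SGD: shuffle each epoch, split into batches,
    and take one gradient step per batch."""
    n = len(training_data)
    for epoch in range(epochs):
        random.shuffle(training_data)
        mini_batches = [training_data[k:k + mini_batch_size]
                        for k in range(0, n, mini_batch_size)]
        for mini_batch in mini_batches:
            # Hypothetical method: average the gradients over the
            # batch via backpropagation and update weights and biases
            network.update_mini_batch(mini_batch, learning_rate)
```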
This project is meant to enhance and expand on what was achieved in project 1. Building upon the vanilla neural network, I added more features that let me explore the network's performance with more flexibility (a few of these are sketched after the list):
- Early Stopping Criterion
- Activation Functions
  * Sigmoid
  * Tanh
  * ReLU
  * Softmax
- Cost Functions
  * Quadratic
  * Cross-Entropy
  * Negative Log-Likelihood
- L2 Regularization
- Momentum Parameter Updates
- Returning Cost and Accuracy for Plotting
- Returning Learned Network
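As a rough illustration of several of these features (a hypothetical NumPy sketch, not the repository's code), the listed activations, two of the cost functions, a momentum update with L2 regularization, and a patience-based early-stopping check could look like this:

```python
import numpy as np

# --- Activation functions ---
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def tanh(z):
    return np.tanh(z)

def relu(z):
    return np.maximum(0.0, z)

def softmax(z):
    e = np.exp(z - np.max(z))  # shift by max for numerical stability
    return e / np.sum(e)

# --- Cost functions (a: network output, y: target; both NumPy arrays) ---
def quadratic_cost(a, y):
    return 0.5 * np.sum((a - y) ** 2)

def cross_entropy_cost(a, y):
    # assumes outputs strictly between 0 and 1 (e.g. sigmoid)
    return -np.sum(y * np.log(a) + (1 - y) * np.log(1 - a))

# --- Momentum update with L2 regularization ---
def momentum_l2_step(w, v, grad_w, lr, mu, lmbda, n):
    """One weight update: L2 adds (lmbda / n) * w to the gradient,
    and the velocity v (kept between calls) smooths successive steps."""
    v = mu * v - lr * (grad_w + (lmbda / n) * w)
    return w + v, v

# --- Early stopping ---
def should_stop(val_accuracies, patience):
    """Stop once validation accuracy has not improved
    for `patience` consecutive epochs."""
    if len(val_accuracies) <= patience:
        return False
    best_epoch = int(np.argmax(val_accuracies))
    return len(val_accuracies) - 1 - best_epoch >= patience
```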