Implemented a feed-forward neural network from scratch in Python with NumPy, using ReLU activations in the hidden layers and a softmax output layer. Designed an object-oriented, general-purpose network class so the layer count and sizes are configurable rather than hard-coded. Achieved 79% accuracy classifying handwritten digits from the MNIST dataset.
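The forward pass described above (ReLU in the hidden layers, softmax at the output) might be sketched as follows; the class name, layer sizes, and initialization scheme here are illustrative assumptions, not the project's actual code:

```python
import numpy as np

def relu(z):
    # ReLU: elementwise max(0, z)
    return np.maximum(0.0, z)

def softmax(z):
    # Numerically stable softmax over the last axis
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

class FeedForwardNet:
    """Minimal fully connected network: ReLU hidden layers, softmax output.

    `sizes` lists the layer widths, e.g. [784, 64, 10] for MNIST
    (28x28 input pixels, one hidden layer, 10 digit classes).
    """
    def __init__(self, sizes, seed=0):
        rng = np.random.default_rng(seed)
        # He-style initialization, a common choice for ReLU layers
        self.weights = [rng.normal(0.0, np.sqrt(2.0 / m), size=(m, n))
                        for m, n in zip(sizes[:-1], sizes[1:])]
        self.biases = [np.zeros(n) for n in sizes[1:]]

    def forward(self, x):
        # Hidden layers apply ReLU; the final layer applies softmax
        for W, b in zip(self.weights[:-1], self.biases[:-1]):
            x = relu(x @ W + b)
        return softmax(x @ self.weights[-1] + self.biases[-1])
```

Because the softmax normalizes each row of the output, every row of `forward`'s result is a probability distribution over the 10 digit classes.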
Future steps are to support multiple activation function options, add automatic cross-validation for hyperparameter tuning, and extend the design to a convolutional neural network to better classify images.