Two-Layer Neural Network

About the project:

This project was adapted from the Stanford CS231n Convolutional Neural Networks for Visual Recognition course. The goal was to implement a two-layer neural network for CIFAR-10 classification, which involved implementing the forward and backward passes, testing different loss functions, and tuning hyperparameters. The network consists of a fully connected layer applied to the input, followed by a ReLU activation, a second fully connected layer, and a softmax classifier.
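To make the architecture concrete, below is a minimal NumPy sketch of that fc -> ReLU -> fc -> softmax forward pass. The function and variable names (two_layer_forward, W1, b1, W2, b2, the hidden size of 50) are illustrative assumptions, not the actual assignment code.

```python
import numpy as np

def two_layer_forward(X, W1, b1, W2, b2, y=None):
    """Forward pass; returns class scores, or (loss, probs) if labels y are given."""
    hidden = np.maximum(0, X @ W1 + b1)          # first fully connected layer + ReLU
    scores = hidden @ W2 + b2                    # second fully connected layer

    if y is None:
        return scores

    # Softmax loss, with the usual max-shift for numerical stability.
    shifted = scores - scores.max(axis=1, keepdims=True)
    exp_scores = np.exp(shifted)
    probs = exp_scores / exp_scores.sum(axis=1, keepdims=True)
    loss = -np.log(probs[np.arange(X.shape[0]), y]).mean()
    return loss, probs

# Tiny usage example with random data shaped like flattened CIFAR-10 images (3072 features).
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3072))
y = rng.integers(0, 10, size=5)
W1, b1 = 1e-3 * rng.standard_normal((3072, 50)), np.zeros(50)
W2, b2 = 1e-3 * rng.standard_normal((50, 10)), np.zeros(10)
loss, _ = two_layer_forward(X, W1, b1, W2, b2, y)
print(loss)  # roughly ln(10) ~ 2.3 for small random weights, as expected before training
```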

The focus of this project was implementing each layer and its associated operations from scratch. The code was then tested in a modular fashion by calling code in a Google Colab notebook: each block of the network produced only minimal error against reference values, confirming the correctness of the implementation.
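One common way such per-block checks are done (and the style of check CS231n encourages) is comparing an analytic gradient against a centered numerical gradient and reporting the relative error. The sketch below illustrates that idea for a ReLU layer; numerical_gradient and relu_backward are hypothetical stand-ins, not the assignment's own utilities.

```python
import numpy as np

def numerical_gradient(f, x, h=1e-5):
    """Centered finite-difference gradient of a scalar function f at array x."""
    grad = np.zeros_like(x)
    for idx in np.ndindex(x.shape):
        old = x[idx]
        x[idx] = old + h
        fp = f(x)
        x[idx] = old - h
        fm = f(x)
        x[idx] = old                      # restore the original value
        grad[idx] = (fp - fm) / (2 * h)
    return grad

def relu_backward(dout, x):
    """Analytic ReLU gradient: pass the upstream gradient through where x > 0."""
    return dout * (x > 0)

rng = np.random.default_rng(1)
x = rng.standard_normal((4, 5))
dout = rng.standard_normal((4, 5))

# Numerical gradient of the scalar sum(ReLU(x) * dout) should match the
# analytic relu_backward output up to a tiny relative error.
num_grad = numerical_gradient(lambda z: np.sum(np.maximum(0, z) * dout), x)
ana_grad = relu_backward(dout, x)
rel_error = np.abs(num_grad - ana_grad).max() / (np.abs(num_grad) + np.abs(ana_grad)).max()
print(rel_error)  # expected to be very small, e.g. below 1e-8
```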


Skills:

Software: Python | Google Colab