
felixmpaulus/CNN-MNIST


Building a basic NN from scratch.

I am building a (convolutional) neural network from scratch to get a better understanding of the underlying fundamentals.

Resources

  • Using
  • Debugging: determining network size

Architecture

  • the network uses classic backpropagation and a bias for each neuron.

  • the final error is calculated using the mean squared error (MSE) formula.

  • so far, sigmoid, ReLU, and leaky ReLU are implemented as activation functions (see the sketch after this list).

  • Ideas:
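
A minimal sketch of these building blocks, assuming NumPy; the function names and the leaky slopes are illustrative, not taken from this repository:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_prime(x):
    s = sigmoid(x)
    return s * (1.0 - s)

def relu(x):
    return np.maximum(0.0, x)

def relu_prime(x):
    return (x > 0).astype(x.dtype)

def leaky_relu(x, slope=0.01):
    # slope is the factor applied to negative inputs (0.01x or 0.001x below)
    return np.where(x > 0, x, slope * x)

def leaky_relu_prime(x, slope=0.01):
    return np.where(x > 0, 1.0, slope)

def mse(target, output):
    # mean squared error over the output neurons
    return np.mean((target - output) ** 2)
```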

XOR

  • the network consists of 2 inputs, 1 hidden layer with 2 neurons, and 1 output (see the training sketch below).
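
A minimal end-to-end sketch of that 2-2-1 setup, assuming NumPy and sigmoid activations; the learning rate and weight interval are one of the settings examined below, and the code illustrates the idea rather than reproducing this repository's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR truth table: 4 input pairs and their targets
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# 2 inputs -> 2 hidden neurons -> 1 output; weights drawn from [-0.5, 0.5]
W1 = rng.uniform(-0.5, 0.5, size=(2, 2))
b1 = np.zeros((1, 2))
W2 = rng.uniform(-0.5, 0.5, size=(2, 1))
b2 = np.zeros((1, 1))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

lr = 0.5
for _ in range(120_000):
    # forward pass
    h = sigmoid(X @ W1 + b1)    # hidden activations, shape (4, 2)
    out = sigmoid(h @ W2 + b2)  # network output, shape (4, 1)

    # backward pass: MSE derivative through the sigmoids
    # (the constant factor 2/n is folded into the learning rate)
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # gradient step on the weights and the per-neuron biases
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(out.round(3))  # a converged run approaches [0, 1, 1, 0]
```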

Examining the network

  • the non-convergence rate is examined across the following parameters (a sweep sketch follows this list)
    • learning rate
      • 0.5
      • 0.3
      • 0.1
      • 0.01
    • activation function
      • sigmoid
      • ReLU
      • leaky ReLU
    • interval of initial weights
      • [0, 1]
      • [-0.5, 0.5]
      • [0, 0.5]
      • [0.5, 1]
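
The grid of settings above can be enumerated like this; a minimal sketch where the activation labels are plain strings (the results below further split leaky ReLU into 0.01x and 0.001x slopes):

```python
from itertools import product

learning_rates = [0.5, 0.3, 0.1, 0.01]
activations = ["sigmoid", "ReLU", "leaky ReLU"]
weight_intervals = [(0, 1), (-0.5, 0.5), (0, 0.5), (0.5, 1)]

# every combination of the three parameters: 4 * 3 * 4 = 48 settings
for lr, act, interval in product(learning_rates, activations, weight_intervals):
    print(lr, act, interval)
```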

Results

  • the percentage of non-converging runs is shown below; 30 samples were taken for each setting, and each network was trained for 120,000 iterations (a measurement sketch follows the figures).

  • sigmoid:

  • ReLU:

  • leaky ReLU (0.01x):

  • leaky ReLU (0.001x):
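
A sketch of how the non-convergence percentage for one setting can be measured, condensing the XOR training loop from above into a helper; the convergence criterion (final MSE below 0.01) is an assumed threshold, not necessarily the one used here:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_once(rng, lr=0.5, lo=-0.5, hi=0.5, steps=120_000):
    # one run of the 2-2-1 XOR network; returns the final MSE
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)
    W1, W2 = rng.uniform(lo, hi, (2, 2)), rng.uniform(lo, hi, (2, 1))
    b1, b2 = np.zeros((1, 2)), np.zeros((1, 1))
    for _ in range(steps):
        h = sigmoid(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        W2 -= lr * h.T @ d_out
        b2 -= lr * d_out.sum(0, keepdims=True)
        W1 -= lr * X.T @ d_h
        b1 -= lr * d_h.sum(0, keepdims=True)
    return float(np.mean((out - y) ** 2))

rng = np.random.default_rng(42)
losses = [train_once(rng) for _ in range(30)]  # 30 samples per setting
stuck = sum(loss > 0.01 for loss in losses)    # assumed convergence threshold
print(f"non-convergence: {100 * stuck / 30:.0f}%")
```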

The settings that resulted in the lowest non-convergence rate were the following:

  • ReLU, weights in [0, 1], learning rate 0.01 at 7% non-convergence
  • ReLU, weights in [0.5, 1], learning rate 0.01 at 7% non-convergence
  • leaky ReLU with 0.01x, weights in [0.5, 1], learning rate 0.01 at 7% non-convergence
  • leaky ReLU with 0.001x, weights in [0, 1], learning rate 0.01 at 7% non-convergence

Loss during training

  • the following graphs show the decreasing loss for the settings listed above (a logging and plotting sketch follows this list).

  • ReLU, weights in [0, 1], learning rate 0.01:

  • ReLU, weights in [0.5, 1], learning rate 0.01:

  • leaky ReLU with 0.01x, weights in [0.5, 1], learning rate 0.01:

  • leaky ReLU with 0.001x, weights in [0, 1], learning rate 0.01:
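
Curves like these can be produced by logging the MSE periodically during training; a self-contained sketch that reuses the XOR training loop from above with a logging hook added, assuming matplotlib for plotting (the repository's actual plotting setup is not specified here):

```python
import numpy as np
import matplotlib.pyplot as plt

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)
rng = np.random.default_rng(1)
W1, W2 = rng.uniform(-0.5, 0.5, (2, 2)), rng.uniform(-0.5, 0.5, (2, 1))
b1, b2 = np.zeros((1, 2)), np.zeros((1, 1))

lr, history = 0.5, []
for step in range(120_000):
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    if step % 1000 == 0:
        history.append(np.mean((out - y) ** 2))  # log the MSE every 1000 steps
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(0, keepdims=True)

plt.plot(np.arange(len(history)) * 1000, history)
plt.xlabel("training step")
plt.ylabel("MSE loss")
plt.show()
```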

Visualization (not of this project):

other stuff

  • kaggle.com
