This repository contains implementations of basic neural network architectures using NumPy only. The code was prepared for neural network classes at Wroclaw University of Science and Technology.
At the current state, the implementation supports the following (linked) modules:
- Architectures
  - Multilayer Perceptron (MLP)
  - Convolutional Neural Network (CNN)
- Layers
- Activations
- Losses
- Optimizers
- Initializers
NOTE: Architectures can easily be extended or added by composing existing layers or implementing new ones.
The implementation uses OOP and a computational-graph approach (with manual gradient flow), and was inspired by the article: Nothing but NumPy: Understanding & Creating Neural Networks with Computational Graphs from Scratch. Several concepts regarding the architecture of the solution were also taken from PyTorch. Within the scope of this project, several study experiments on the MNIST dataset investigating neural networks were implemented.
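The layer-composition and manual-gradient-flow ideas above can be sketched as follows. This is only an illustrative NumPy sketch: the class names (`Linear`, `ReLU`) and method signatures are assumptions for the example, not the repository's actual API.

```python
import numpy as np

class Linear:
    """Fully connected layer: y = x @ W + b (illustrative, not the repo's API)."""
    def __init__(self, n_in, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(0.0, 0.1, (n_in, n_out))
        self.b = np.zeros(n_out)

    def forward(self, x):
        self.x = x                      # cache input for the backward pass
        return x @ self.W + self.b

    def backward(self, grad_out):
        # Manually computed gradients w.r.t. the parameters
        self.dW = self.x.T @ grad_out
        self.db = grad_out.sum(axis=0)
        # Gradient flowing back to the previous node in the graph
        return grad_out @ self.W.T

class ReLU:
    def forward(self, x):
        self.mask = x > 0
        return x * self.mask

    def backward(self, grad_out):
        return grad_out * self.mask

# Composing layers into a tiny MLP: forward pass left-to-right,
# backward pass right-to-left through the computational graph.
layers = [Linear(4, 8), ReLU(), Linear(8, 2)]
x = np.ones((3, 4))
for layer in layers:
    x = layer.forward(x)

grad = np.ones_like(x)                  # stand-in for a loss gradient
for layer in reversed(layers):
    grad = layer.backward(grad)
print(grad.shape)                       # gradient w.r.t. the network input
```

A new architecture is then just a different composition of such layer objects, which is the extension mechanism the note above refers to.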
```
└── src
    ├── data_processing
    │   ├── notebooks
    │   └── scripts
    ├── datasets
    ├── experiments
    │   ├── notebooks
    │   ├── one_hidden_mnist
    │   ├── optimizers_mnist
    │   └── scripts
    ├── metrics
    ├── nn
    │   ├── activations
    │   ├── layers
    │   ├── losses
    │   ├── networks
    │   └── optimizers
    └── utils
```
Author: Jakub Binkowski