- Pure, simple deep neural-network implementation in plain NumPy. Supports an arbitrary number of layers, each with any number of neurons, and any number of outputs (see the forward-pass sketch below).
- Can achieve a score of around 0.9 on the MNIST dataset.
- Currently uses the sigmoid function for activation.
- Multiple activation functions, possibly including custom ones if their derivative is supplied (see the second sketch below).
- More implementations.
- Different types of networks: convolutional, recurrent, etc.
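
To illustrate the layered architecture described above, here is a minimal sketch of a fully connected forward pass with sigmoid activations in NumPy. The names (`weights`, `biases`, `forward`) and the layer sizes are illustrative assumptions, not the actual API of this repository.

```python
import numpy as np

def sigmoid(z):
    # Logistic sigmoid, applied element-wise.
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative layer sizes: 784 inputs (e.g. flattened 28x28 MNIST images),
# one hidden layer of 64 units, 10 outputs. These are assumptions, not the
# defaults of this project.
layer_sizes = [784, 64, 10]
rng = np.random.default_rng(0)
weights = [rng.standard_normal((m, n)) * 0.01
           for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]
biases = [np.zeros(n) for n in layer_sizes[1:]]

def forward(x, weights, biases):
    # Propagate an input batch (rows are examples) through every layer.
    a = x
    for W, b in zip(weights, biases):
        a = sigmoid(a @ W + b)
    return a

x = rng.standard_normal((5, 784))          # a dummy batch of 5 inputs
print(forward(x, weights, biases).shape)   # -> (5, 10)
```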
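
The roadmap item about custom activations is commonly handled by letting the caller pass the function together with its derivative, since backpropagation needs the derivative to compute gradients. A hedged sketch of that idea (the `Activation` pair and the names below are assumptions, not code from this repository):

```python
import numpy as np
from collections import namedtuple

# An activation is modelled here as a (function, derivative) pair, because
# backpropagation needs the derivative to convert the gradient at a layer's
# output into the gradient at its pre-activation input.
Activation = namedtuple("Activation", ["fn", "deriv"])

def _sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

sigmoid = Activation(fn=_sigmoid,
                     deriv=lambda z: _sigmoid(z) * (1.0 - _sigmoid(z)))
relu = Activation(fn=lambda z: np.maximum(0.0, z),
                  deriv=lambda z: (z > 0).astype(float))

# In the backward pass, the activation only appears as an element-wise
# product of the upstream gradient with its derivative at the pre-activation z:
z = np.array([-1.0, 0.5, 2.0])
upstream_grad = np.array([0.1, -0.2, 0.3])
delta = upstream_grad * relu.deriv(z)   # chain rule through the activation
print(delta)  # [ 0.  -0.2  0.3]
```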