Releases · azimonti/MNIST-classification
Version 1.1
Version 1.0
New Features:
- MNIST Dataset Programs:
  - Implemented two separate programs for neural network training and testing using:
    - Stochastic Gradient Descent (SGD): Efficient convergence with good accuracy on MNIST (92.16% after 5 epochs).
    - Genetic Algorithm (GA): Experimental approach showcasing slow convergence, highlighting the limitations of GA for MNIST, but providing flexibility for other potential tasks (a sketch contrasting the two update strategies follows this list).
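For context, the sketch below contrasts the two update strategies on a toy loss: SGD follows the gradient, while a GA generation mutates copies of the weights and keeps the best performer. It is illustrative only; the toy loss and all names are assumptions, not code from this repository.

```cpp
// Illustrative sketch only: contrasts a gradient step with a mutate-and-select
// generation on a toy quadratic loss. Nothing here is taken from this repository.
#include <iostream>
#include <random>
#include <vector>

// Toy loss standing in for the network's training error (minimum at w_i = 1).
static double loss(const std::vector<double>& w)
{
    double s = 0.0;
    for (double v : w) s += (v - 1.0) * (v - 1.0);
    return s;
}

// SGD-style step: move each weight along the analytic gradient of the toy loss.
static void sgdStep(std::vector<double>& w, double lr)
{
    for (double& v : w) v -= lr * 2.0 * (v - 1.0);  // d/dv (v-1)^2 = 2(v-1)
}

// GA-style generation: mutate copies of the weights and keep the best performer.
static void gaGeneration(std::vector<double>& w, std::mt19937& rng,
                         int population = 50, double sigma = 0.1)
{
    std::normal_distribution<double> noise(0.0, sigma);
    std::vector<double> best = w;
    double bestLoss = loss(w);
    for (int p = 0; p < population; ++p) {
        std::vector<double> child = w;
        for (double& v : child) v += noise(rng);  // random mutation
        const double l = loss(child);
        if (l < bestLoss) { bestLoss = l; best = child; }
    }
    w = best;
}

int main()
{
    std::mt19937 rng(42);
    std::vector<double> wSgd(10, 0.0), wGa(10, 0.0);
    for (int i = 0; i < 100; ++i) {
        sgdStep(wSgd, 0.1);
        gaGeneration(wGa, rng);
    }
    std::cout << "SGD loss: " << loss(wSgd) << "  GA loss: " << loss(wGa) << '\n';
    return 0;
}
```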
- Command-Line Arguments:
  - `--training_start`: Start training from scratch.
  - `--training_continue`: Continue training from a saved state.
  - `--testing`: Run the trained model in testing mode.
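A minimal sketch of how these three modes could be dispatched is shown below. Only the flag names come from the list above; the program structure and function names are hypothetical and not taken from this repository.

```cpp
// Minimal sketch of dispatching the three documented modes.
// Only the flag names come from the release notes; everything else is hypothetical.
#include <cstring>
#include <iostream>

void trainFromScratch()    { std::cout << "training from scratch\n"; }   // placeholder
void trainFromCheckpoint() { std::cout << "continuing training\n"; }     // placeholder
void runTesting()          { std::cout << "testing trained model\n"; }   // placeholder

int main(int argc, char** argv)
{
    if (argc < 2) {
        std::cerr << "usage: program --training_start | --training_continue | --testing\n";
        return 1;
    }
    if      (std::strcmp(argv[1], "--training_start") == 0)    trainFromScratch();
    else if (std::strcmp(argv[1], "--training_continue") == 0) trainFromCheckpoint();
    else if (std::strcmp(argv[1], "--testing") == 0)           runTesting();
    else {
        std::cerr << "unknown option: " << argv[1] << '\n';
        return 1;
    }
    return 0;
}
```

The sketch assumes the three modes are mutually exclusive per run, which matches how the flags are described above.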
Known Limitations:
- GA Performance: Slow convergence (~30% accuracy after 3000 generations) on MNIST, as expected given that GA is poorly suited to this task. However, the library can be adapted to tasks where SGD is not applicable, such as simulating interactions in games.