A simple implementation of a neural network with gradient descent and backpropagation, roughly based on torch.autograd.

Dive into the inner workings of neural networks by implementing everything by hand. You will come to understand backpropagation and gradient descent, the algorithms that make learning possible. We revisit Calculus 101 to calculate gradients (derivatives) manually at each step of the process, and as the network scales up, we automate the entire learning process. This hands-on approach builds a deep understanding of how these algorithms operate at a fundamental level. The sketches below illustrate each stage.
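
To make the "Calculus 101" stage concrete, here is a minimal sketch of hand-computed gradients for a tiny linear model: the derivatives of the mean-squared-error loss are worked out on paper and plugged straight into a gradient-descent loop. The data, model, and hyperparameters are illustrative, not taken from this repository.

```python
# Manual-gradient stage: fit y = w*x + b by gradient descent,
# with the gradients of the loss derived by hand.

xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 5.0, 7.0, 9.0]   # generated from y = 2x + 1

w, b = 0.0, 0.0
learning_rate = 0.01

for epoch in range(2000):
    # Forward pass: predictions and mean-squared-error loss.
    preds = [w * x + b for x in xs]
    loss = sum((p - y) ** 2 for p, y in zip(preds, ys)) / len(xs)

    # Backward pass, worked out by hand:
    #   dL/dw = (2/N) * sum((w*x + b - y) * x)
    #   dL/db = (2/N) * sum(w*x + b - y)
    dw = 2 * sum((p - y) * x for p, y, x in zip(preds, ys, xs)) / len(xs)
    db = 2 * sum(p - y for p, y in zip(preds, ys)) / len(xs)

    # Gradient-descent update: step against the gradient.
    w -= learning_rate * dw
    b -= learning_rate * db

print(w, b)  # approaches w = 2, b = 1
```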
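Once the network grows, writing out every derivative by hand stops scaling, and the automated stage takes over with a small autograd engine in the spirit of torch.autograd. The Value class below is a hedged sketch of that idea, not the repository's actual implementation: each operation records how to propagate gradients to its inputs, and backward() applies the chain rule over the recorded graph in reverse topological order.

```python
import math

class Value:
    """A scalar that records the operations producing it, so gradients
    can be propagated backward through the expression graph."""

    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # d(out)/d(self) = 1 and d(out)/d(other) = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # Product rule: each input's gradient is scaled by the other input.
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def tanh(self):
        t = math.tanh(self.data)
        out = Value(t, (self,))
        def _backward():
            # d/dx tanh(x) = 1 - tanh(x)^2
            self.grad += (1 - t ** 2) * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule in reverse.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for node in reversed(topo):
            node._backward()

# A single neuron: out = tanh(w*x + b). Gradients are filled in automatically.
x, w, b = Value(2.0), Value(-0.5), Value(0.3)
out = (w * x + b).tanh()
out.backward()
print(x.grad, w.grad, b.grad)
```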
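
As a sanity check, the same single-neuron example can be run through torch.autograd itself; if PyTorch is installed, the gradients should match those produced by the hand-rolled engine above.

```python
import torch

# The same neuron, out = tanh(w*x + b), using torch.autograd for comparison.
x = torch.tensor(2.0, requires_grad=True)
w = torch.tensor(-0.5, requires_grad=True)
b = torch.tensor(0.3, requires_grad=True)

out = torch.tanh(w * x + b)
out.backward()  # populates .grad on the leaf tensors

print(x.grad.item(), w.grad.item(), b.grad.item())
```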