mini-neural-net

A simple implementation of a neural network with gradient descent and backpropagation, roughly based on torch.autograd. The goal is to dive into the inner workings of neural networks by implementing everything by hand, building an understanding of backpropagation and gradient descent, the algorithms that make learning possible. We revisit Calculus 101 to compute gradients (derivatives) manually at each step of the process, and as the network scales up, we automate the entire backward pass. This hands-on approach gives a deep understanding of how these algorithms operate at a fundamental level.
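
To make this concrete, below is a minimal, micrograd-style sketch of what such a hand-rolled autograd engine can look like. The `Value` class, its operators, and the toy expression are illustrative assumptions for this README, not the actual API of this repository: each operation records how to push its output gradient back to its inputs, and `backward()` replays those rules in reverse topological order, i.e. the chain rule from Calculus 101.

```python
# Illustrative sketch only: a tiny scalar autograd engine in the spirit of torch.autograd.
# The Value class and the toy expression below are assumptions, not this repo's actual code.

class Value:
    """A scalar that remembers which operations produced it, so gradients
    can be propagated backwards with the chain rule."""

    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None   # how to push this node's gradient to its parents
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))

        def _backward():
            # d(out)/d(self) = 1 and d(out)/d(other) = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))

        def _backward():
            # d(out)/d(self) = other.data, d(out)/d(other) = self.data
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically order the computation graph, then apply the chain rule in reverse.
        topo, visited = [], set()

        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)

        self.grad = 1.0
        for node in reversed(topo):
            node._backward()


# Toy expression: L = a * b + a
a, b = Value(2.0), Value(-3.0)
L = a * b + a
L.backward()
print(a.grad, b.grad)   # dL/da = b + 1 = -2.0, dL/db = a = 2.0
a.data -= 0.1 * a.grad  # one gradient-descent step on a
```

With an engine like this, gradient descent reduces to repeatedly running the forward pass, calling `backward()`, and nudging each parameter against its gradient (as in the last line above).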
