Nanograd

Toy backpropagation framework built around scalar 1-D vectors. It automatically differentiates expressions and computes all the necessary gradients. Written in C++.

[Image: full_graph_autograd, a rendering of the full autograd computation graph]

TODO

  • Add support for multi-dimensional scalar vectors

Usage

```cpp
Vector<float> a = 2;     // construct from a value
Vector<float> b(4, "b"); // construct from a value and a label
Vector<float> c = a + b;

Vector<float> d = ((a * b).pow(c)) + a;

a.m_label = "a"; c.m_label = "c"; d.m_label = "d";

d.backward(); // backpropagation; updates all the gradients accordingly
d.print();    // recursively prints the whole chain of operations up to d, including data & gradients

std::cout << a.m_grad << std::endl; // prints only the gradient of vector a
```
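
For a single multiplication the expected gradients are easy to verify by hand, which makes a quick sanity check. A minimal sketch, assuming the Vector API behaves as in the example above:

```cpp
Vector<float> x = 3;
Vector<float> y(5, "y");
Vector<float> z = x * y; // z = 15
x.m_label = "x"; z.m_label = "z";

z.backward();
// Chain rule: dz/dx = y = 5 and dz/dy = x = 3
std::cout << x.m_grad << std::endl; // expected: 5
std::cout << y.m_grad << std::endl; // expected: 3
```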

Implementation

Based on:
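
An engine in this style is typically built from three pieces: a graph node holding a value, a gradient, and pointers to the nodes it was computed from; a per-operation closure that applies the chain rule; and a backward pass that visits nodes in reverse topological order. The following is a minimal, self-contained C++ sketch of that design (hypothetical code with illustrative names like Node, add, and mul, not nanograd's actual internals):

```cpp
#include <functional>
#include <iostream>
#include <memory>
#include <unordered_set>
#include <vector>

// One scalar node in the computation graph: a value, its gradient,
// the nodes it was computed from, and a closure applying the chain rule.
struct Node {
    float data;
    float grad = 0.0f;
    std::vector<std::shared_ptr<Node>> prev;
    std::function<void()> backward_fn = [] {};
    explicit Node(float d) : data(d) {}
};

using NodePtr = std::shared_ptr<Node>;

NodePtr add(const NodePtr& a, const NodePtr& b) {
    auto out = std::make_shared<Node>(a->data + b->data);
    out->prev = {a, b};
    Node* o = out.get(); // raw pointer avoids a shared_ptr cycle through the closure
    out->backward_fn = [a, b, o] {
        a->grad += o->grad; // d(a+b)/da = 1
        b->grad += o->grad; // d(a+b)/db = 1
    };
    return out;
}

NodePtr mul(const NodePtr& a, const NodePtr& b) {
    auto out = std::make_shared<Node>(a->data * b->data);
    out->prev = {a, b};
    Node* o = out.get();
    out->backward_fn = [a, b, o] {
        a->grad += b->data * o->grad; // d(a*b)/da = b
        b->grad += a->data * o->grad; // d(a*b)/db = a
    };
    return out;
}

// Backpropagation: topologically sort the graph reachable from `root`,
// seed root->grad with 1, then let each node push gradients to its inputs.
void backward(const NodePtr& root) {
    std::vector<NodePtr> topo;
    std::unordered_set<Node*> visited;
    std::function<void(const NodePtr&)> build = [&](const NodePtr& n) {
        if (!visited.insert(n.get()).second) return;
        for (const auto& p : n->prev) build(p);
        topo.push_back(n);
    };
    build(root);

    root->grad = 1.0f; // d(root)/d(root) = 1
    for (auto it = topo.rbegin(); it != topo.rend(); ++it) (*it)->backward_fn();
}

int main() {
    auto a = std::make_shared<Node>(2.0f);
    auto b = std::make_shared<Node>(4.0f);
    auto d = add(mul(a, b), a); // d = a*b + a

    backward(d);
    std::cout << a->grad << std::endl; // dd/da = b + 1 = 5
    std::cout << b->grad << std::endl; // dd/db = a = 2
}
```

With a = 2 and b = 4, d = a*b + a backpropagates to dd/da = b + 1 = 5 and dd/db = a = 2, matching what the program prints.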
