# Nanograd

A toy backpropagation framework built around scalar, one-dimensional vectors. It automatically differentiates expressions and computes all the necessary gradients. Written in C++.

*(Figure: full_graph_autograd)*

## TODO

- Add support for multi-dimensional scalar vectors

## Usage

```cpp
Vector<float> a = 2;
Vector<float> b(4, "b");
Vector<float> c = a + b;

Vector<float> d = ((a * b).pow(c)) + a;

a.m_label = "a"; c.m_label = "c"; d.m_label = "d";

d.backward(); // backpropagation; updates all the gradients accordingly
d.print();    // recursively prints the whole chain of operations up to d, including data & gradients

std::cout << a.m_grad << std::endl; // prints only the gradient of vector a
```
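
Conceptually, `d.backward()` seeds the gradient of `d` with 1 and propagates it back through the recorded operations via the chain rule. Below is a minimal, self-contained sketch of that idea; it is a hypothetical illustration, not Nanograd's actual implementation, and the names `Node`, `add`, `mul`, and `backward` are made up for this example:

```cpp
#include <functional>
#include <iostream>
#include <memory>
#include <vector>

// One scalar node in the computation graph: value, gradient, parents,
// and a closure that pushes this node's gradient onto its parents.
struct Node {
    double data = 0.0;
    double grad = 0.0;
    std::vector<std::shared_ptr<Node>> parents;
    std::function<void()> backward_local;

    explicit Node(double d) : data(d) {}
};

using NodePtr = std::shared_ptr<Node>;

// d(a+b)/da = 1, d(a+b)/db = 1
NodePtr add(NodePtr a, NodePtr b) {
    auto out = std::make_shared<Node>(a->data + b->data);
    out->parents = {a, b};
    Node* o = out.get(); // raw pointer avoids a shared_ptr cycle in the closure
    out->backward_local = [a, b, o]() {
        a->grad += o->grad;
        b->grad += o->grad;
    };
    return out;
}

// d(a*b)/da = b, d(a*b)/db = a
NodePtr mul(NodePtr a, NodePtr b) {
    auto out = std::make_shared<Node>(a->data * b->data);
    out->parents = {a, b};
    Node* o = out.get();
    out->backward_local = [a, b, o]() {
        a->grad += b->data * o->grad;
        b->grad += a->data * o->grad;
    };
    return out;
}

// Reverse-mode pass: topologically sort the graph, then apply the chain rule
// from the output back to the leaves.
void backward(NodePtr root) {
    std::vector<NodePtr> order, seen;
    std::function<void(NodePtr)> visit = [&](NodePtr n) {
        for (auto& s : seen) if (s == n) return;
        seen.push_back(n);
        for (auto& p : n->parents) visit(p);
        order.push_back(n);
    };
    visit(root);

    root->grad = 1.0; // d(root)/d(root) = 1
    for (auto it = order.rbegin(); it != order.rend(); ++it)
        if ((*it)->backward_local) (*it)->backward_local();
}

int main() {
    auto a = std::make_shared<Node>(2.0);
    auto b = std::make_shared<Node>(4.0);
    auto d = add(mul(a, b), a); // d = a*b + a

    backward(d);
    std::cout << "dd/da = " << a->grad << "\n"; // b + 1 = 5
    std::cout << "dd/db = " << b->grad << "\n"; // a     = 2
}
```

Each operation records a small closure holding its local derivative, so the backward pass only needs to replay those closures in reverse topological order; Nanograd's `Vector` API follows the same reverse-mode idea, though its internal data structures may differ.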

## Implementation

Based on: