
# Basic neural network library supporting auto-differentiation

## Motivation

Most deep learning courses aim to teach the math behind networks, their architectures, and their applications.

However, few courses discuss how to design and implement a deep learning library from scratch.

This final project aims to build such a library and, during its development, learn how and why the prior frameworks (TensorFlow, PyTorch, etc.) are designed the way they are.

## Target

Based on the Autograd project, build a similar library: the user simply defines a function, and the library automatically computes that function's derivative.

Build the computational graph as the function is called, then compute backward propagation with respect to a variable or placeholder (in TensorFlow terms).
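The idea above can be sketched in a few dozen lines. This is a minimal illustration of reverse-mode autodiff, not the project's actual API: the `Var` class and its methods are assumptions made for this example. Each operation records its inputs together with a local-gradient rule, so the computational graph is built as the function runs; `backward()` then applies the chain rule through the recorded graph in reverse.

```python
# Minimal reverse-mode autodiff sketch (illustrative only; the class and
# method names are assumptions, not the project's real interface).

class Var:
    def __init__(self, value, parents=()):
        self.value = value      # forward value
        self.grad = 0.0         # accumulated d(output)/d(self)
        self.parents = parents  # pairs of (parent Var, local gradient)

    def __add__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        # d(a+b)/da = 1, d(a+b)/db = 1
        return Var(self.value + other.value,
                   parents=((self, 1.0), (other, 1.0)))

    def __mul__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        # d(a*b)/da = b, d(a*b)/db = a
        return Var(self.value * other.value,
                   parents=((self, other.value), (other, self.value)))

    def backward(self, upstream=1.0):
        # Chain rule: accumulate upstream gradient times local gradient.
        self.grad += upstream
        for parent, local in self.parents:
            parent.backward(upstream * local)

# The user simply defines a function by composing operations:
x = Var(3.0)
y = x * x + x      # f(x) = x^2 + x
y.backward()
print(x.grad)      # df/dx = 2x + 1 = 7.0
```

A production library would traverse the graph in topological order instead of recursing, but the recursive path-sum shown here computes the same gradient on small graphs.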

Provide benchmarks comparing this library against TensorFlow and PyTorch.

If time allows, provide a simple multi-layer perceptron (neural network) interface, along with criteria (loss functions), optimizers, datasets, and dataloaders, as the prior frameworks do.
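To make the intended interface concrete, here is a hypothetical sketch of what that PyTorch-like surface could look like. Every name here (`Linear`, `MSELoss`, `SGD`, `DataLoader`) is an assumption mirroring the prior frameworks, not this project's actual API, and the gradients for the single linear layer are hand-derived since no autodiff engine is wired in:

```python
# Hypothetical API sketch (names are assumptions modeled on PyTorch).
import random

random.seed(0)

class Linear:                         # one fully connected layer, 1-in/1-out
    def __init__(self):
        self.w, self.b = random.uniform(-1, 1), 0.0
    def __call__(self, x):
        return self.w * x + self.b

class MSELoss:                        # criterion: squared error per sample
    def __call__(self, pred, target):
        return (pred - target) ** 2

class SGD:                            # optimizer: vanilla gradient descent
    def __init__(self, model, lr=0.05):
        self.model, self.lr = model, lr
    def step(self, dw, db):
        self.model.w -= self.lr * dw
        self.model.b -= self.lr * db

class DataLoader:                     # iterate over (x, y) pairs
    def __init__(self, data):
        self.data = data
    def __iter__(self):
        return iter(self.data)

# Fit y = 2x + 1 on a toy dataset.
data = [(x, 2 * x + 1) for x in [0.0, 1.0, 2.0, 3.0]]
model, criterion, opt = Linear(), MSELoss(), SGD(Linear())
opt.model = model                     # optimize the live model
for epoch in range(200):
    for x, y in DataLoader(data):
        pred = model(x)
        loss = criterion(pred, y)
        # Hand-derived gradients of (wx + b - y)^2:
        dw = 2 * (pred - y) * x
        db = 2 * (pred - y)
        opt.step(dw, db)
print(round(model.w, 2), round(model.b, 2))   # ≈ 2.0 and 1.0
```

With the autodiff engine in place, the hand-derived `dw`/`db` lines would be replaced by a single `loss.backward()` call, which is exactly the gap this project intends to fill.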

## Project Link

AutoDiff-from-scratch

## TODO

- benchmark
- documentation

## Reference

- Source code
- Lecture
- Documents