Most deep learning courses teach the mathematics behind neural networks, common architectures, and their applications.
However, few courses discuss how to design and implement a deep learning library from scratch.
For this final project, we wish to build such a library and, in the process, learn how and why prior frameworks (e.g., TensorFlow and PyTorch) are designed the way they are.
Modeled on the Autograd project, we will build a similar library in which the user simply defines a function, and the library automatically computes that function's derivative.
The library will build a computational graph as the function is called, then run backpropagation to compute gradients with respect to each variable or placeholder (in TensorFlow terms).
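The core idea can be sketched in a few lines of Python. The class and method names below (`Var`, `backward`) are hypothetical and only illustrate the technique, not the final design: each arithmetic operation records its inputs and the local partial derivatives, implicitly building the graph, and `backward` walks that graph applying the chain rule.

```python
# Minimal reverse-mode autodiff sketch (hypothetical API, not the final design).
# Each op records (parent, local_gradient) pairs; backward() recursively
# propagates the upstream gradient to every parent via the chain rule.
# (A real implementation would traverse in topological order for efficiency.)

class Var:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # tuple of (parent_var, local_grad) pairs
        self.grad = 0.0

    def __add__(self, other):
        # d(a+b)/da = 1, d(a+b)/db = 1
        return Var(self.value + other.value,
                   parents=((self, 1.0), (other, 1.0)))

    def __mul__(self, other):
        # d(a*b)/da = b, d(a*b)/db = a
        return Var(self.value * other.value,
                   parents=((self, other.value), (other, self.value)))

    def backward(self, grad=1.0):
        self.grad += grad
        for parent, local in self.parents:
            parent.backward(grad * local)

# Usage: z = x*y + x, so dz/dx = y + 1 and dz/dy = x.
x = Var(3.0)
y = Var(4.0)
z = x * y + x
z.backward()
print(z.value, x.grad, y.grad)  # 15.0 5.0 3.0
```

The key design choice, shared by Autograd and PyTorch, is that the graph is built dynamically as operations execute, rather than declared up front.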
Provide benchmarks comparing our library against TensorFlow and PyTorch.
If time allows, provide a simple multi-layer perceptron (neural network) interface, along with criteria (loss functions), optimizers, datasets, and data loaders, similar to those in the prior frameworks.
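To make the planned interface concrete, here is one possible shape for the layer/criterion/optimizer split, loosely following PyTorch's conventions. All class names (`Linear`, `MSELoss`, `SGD`) and their signatures are assumptions for illustration; gradients are written out by hand here since the sketch does not use the autodiff engine.

```python
# Hypothetical layer / criterion / optimizer interface (illustration only).
import numpy as np

class Linear:
    """Fully connected layer; caches its input so backward() can form grads."""
    def __init__(self, in_dim, out_dim):
        rng = np.random.default_rng(0)
        self.W = rng.normal(0.0, 0.1, (in_dim, out_dim))
        self.b = np.zeros(out_dim)

    def forward(self, x):
        self.x = x
        return x @ self.W + self.b

    def backward(self, grad_out):
        self.dW = self.x.T @ grad_out          # dL/dW
        self.db = grad_out.sum(axis=0)         # dL/db
        return grad_out @ self.W.T             # dL/dx for earlier layers

class MSELoss:
    """Criterion: mean squared error."""
    def forward(self, pred, target):
        self.diff = pred - target
        return float((self.diff ** 2).mean())

    def backward(self):
        return 2.0 * self.diff / self.diff.size

class SGD:
    """Optimizer: plain gradient descent over a list of layers."""
    def __init__(self, layers, lr=0.01):
        self.layers, self.lr = layers, lr

    def step(self):
        for layer in self.layers:
            layer.W -= self.lr * layer.dW
            layer.b -= self.lr * layer.db

# Usage: fit y = 2x with one linear layer.
layer, loss_fn = Linear(1, 1), MSELoss()
opt = SGD([layer], lr=0.1)
x = np.array([[1.0], [2.0], [3.0]])
y = 2.0 * x
for _ in range(500):
    loss = loss_fn.forward(layer.forward(x), y)
    layer.backward(loss_fn.backward())
    opt.step()
```

In the full project, the hand-written `backward` methods would be replaced by the automatic differentiation engine described above.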
- benchmarks
- documentation