Following the tutorials from aymericdamien's TensorFlow Examples
Reading material from Neural Networks and Deep Learning
Implemented with a basic knowledge of Data Analytics and Machine Learning
Also read the awesome tutorial on backpropagation from the same book (Chapter 2) and why it works so well for neural nets
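The backpropagation equations from Chapter 2 can be sketched in plain NumPy on a tiny network; the 2-3-1 layer sizes, sigmoid activation, and quadratic cost here are illustrative assumptions, not the repo's actual models:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    s = sigmoid(z)
    return s * (1 - s)

# Tiny 2-3-1 network with fixed random weights (illustrative sizes).
rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((3, 2)), np.zeros((3, 1))
W2, b2 = rng.standard_normal((1, 3)), np.zeros((1, 1))
x = np.array([[0.5], [-0.2]])
y = np.array([[1.0]])

# Forward pass, keeping the weighted inputs z for the backward pass.
z1 = W1 @ x + b1; a1 = sigmoid(z1)
z2 = W2 @ a1 + b2; a2 = sigmoid(z2)

# Backward pass for the quadratic cost C = 0.5 * ||a2 - y||^2.
# Output error: delta_L = (a_L - y) * sigma'(z_L)
delta2 = (a2 - y) * sigmoid_prime(z2)
# Propagate back: delta_l = (W_{l+1}^T delta_{l+1}) * sigma'(z_l)
delta1 = (W2.T @ delta2) * sigmoid_prime(z1)
# Gradients w.r.t. weights and biases
grad_W2, grad_b2 = delta2 @ a1.T, delta2
grad_W1, grad_b1 = delta1 @ x.T, delta1
```

A finite-difference check on any single weight confirms the backpropagated gradients, which is the standard sanity test for a hand-rolled implementation.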
- CNN with Abstraction using Tensorflow (Needs a fix) (tutorial)
Read about the learning slowdown caused by the quadratic (L2) cost function, from Chapter 3
- Cross Entropy
- SoftMax
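The fix for that slowdown is sketched below in plain NumPy (the logit values are assumed examples): with a softmax output and cross-entropy cost, the gradient with respect to the logits collapses to `p - y`, with no `sigma'(z)` factor to vanish when a neuron saturates:

```python
import numpy as np

def softmax(z):
    z = z - z.max()          # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

def cross_entropy(p, y):
    # y is a one-hot target vector
    return -float(np.sum(y * np.log(p)))

z = np.array([2.0, 1.0, 0.1])        # example logits (assumed values)
y = np.array([1.0, 0.0, 0.0])
p = softmax(z)

# Chapter 3's key point: the gradient of cross-entropy w.r.t. the
# logits is simply (p - y), so learning never slows down the way it
# does with the quadratic cost.
grad_z = p - y
```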
Read about Regularization and why it works. Simple is better, but not necessarily.
- Over Fitting
- No Free Lunches
- L1 Regularization: makes the network smaller, with fewer connections
- L2 Regularization: keeps the weights from growing too large
- Dropout: works similarly to averaging multiple nets
- Data Augmentation: makes the network more robust to changes in the input
Convolutional Neural Networks (textbook chapter 4)
Cats vs Dogs - Here's an example using a vanilla CNN
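At its core, the vanilla CNN slides small filters over the image; here is a minimal valid 2-D convolution (really cross-correlation, as in most deep-learning libraries) in plain NumPy, with a toy image and edge filter as assumed examples:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode cross-correlation of a 2-D image with a 2-D kernel."""
    H, W = image.shape
    kh, kw = kernel.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A vertical-edge detector on a toy 4x4 image (values are assumptions).
img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 0, 1, 1]], dtype=float)
edge = np.array([[-1.0, 1.0],
                 [-1.0, 1.0]])
fmap = conv2d(img, edge)   # responds only where the dark/bright edge sits
```

The feature map is strongest exactly over the vertical edge, which is the intuition behind learned convolutional filters.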
The timesteps are added to the LSTM cell in the RNN example. For more info, read this blog on Understanding LSTMs.
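Adding timesteps mostly means reshaping the input: in the usual RNN-on-MNIST setup each 28-pixel row of an image becomes one timestep, so a flat batch is reshaped to (batch, timesteps, features). A sketch with NumPy (the batch size is an assumed example value):

```python
import numpy as np

batch, n_steps, n_input = 32, 28, 28   # batch size is an assumed example

# A flat batch of images, as it comes out of a typical input pipeline.
flat = np.arange(batch * n_steps * n_input, dtype=float).reshape(batch, n_steps * n_input)

# Reshape to (batch, timesteps, features): the LSTM then consumes one
# 28-pixel row per timestep, 28 steps in total.
seq = flat.reshape(batch, n_steps, n_input)
```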
- Saving and Restoring a Model: here's an excellent blog post on saving and restoring models
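The checkpoint idea itself is framework-agnostic; here is the same save/restore round trip sketched with plain NumPy (`np.savez`/`np.load` standing in for a TensorFlow saver; the parameter names and file name are arbitrary examples):

```python
import os
import tempfile
import numpy as np

rng = np.random.default_rng(0)
params = {"W1": rng.standard_normal((3, 2)), "b1": np.zeros(3)}

# Save: write all named parameters to a single checkpoint file.
ckpt = os.path.join(tempfile.mkdtemp(), "model.npz")
np.savez(ckpt, **params)

# Restore: load them back by name into a fresh dict.
with np.load(ckpt) as data:
    restored = {name: data[name] for name in data.files}
```

A real TensorFlow saver works the same way conceptually: variables are stored and retrieved by name, which is why variable naming matters when restoring.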
Special thanks to Naresh for his awesome backpropagation of my errors!