This repository has been archived by the owner on Jul 21, 2020. It is now read-only.

week04_[recap]_deep_learning

Note: This week's materials cover the basics of neural nets and deep learning and teach you how to use auto-diff frameworks. If you're already fluent in tensorflow OR pytorch OR theano, feel free to skip this week entirely.

Materials

Bonus materials

Practice

Colab url (pytorch)

From now on, we'll have two tracks: theano and tensorflow. We'll also add pytorch seminars as soon as they're ready.

Please pick seminar_theano.ipynb, seminar_tensorflow.ipynb or seminar_pytorch.ipynb.

Note: in this and all following weeks you're only required to complete the practice in one of the frameworks. Looking into the alternatives is great for self-education, but it is never mandatory.

What to choose?

  • The simplest choice is PyTorch: it's basically ye olde numpy with automatic gradients and a lot of pre-implemented DL stuff... except all the functions have different names.

  • If you want to be familiar with production-related stuff from day 1, choose TensorFlow. It's much more convenient to deploy (to non-python or to mobiles). The catch is that all those conveniences become inconveniences once you want to write something simple in jupyter.

  • Theano works like tensorflow, but it offers a numpy-compatible interface and comes with built-in graph optimization. The downside is that theano is not as popular as the first two. It is also not meant as a production framework, so deploying to mobiles may be a problem.

  • It's not like choosing a house at Hogwarts: once you master the underlying principles, you'll be able to switch between frameworks easily.
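To make the "numpy with automatic gradients" point concrete, here is a minimal sketch of what PyTorch autograd looks like (assuming a standard `torch` installation; the tensor values are arbitrary illustration):

```python
import torch

# A tensor, created much like a numpy array, but with gradient tracking on.
x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)

# Write ordinary numpy-style math...
y = (x ** 2).sum()

# ...and get gradients for free: d(sum(x^2))/dx = 2x.
y.backward()

print(x.grad)  # tensor([2., 4., 6.])
```

The same computation in the graph-based frameworks (theano, tensorflow 1.x) requires declaring symbolic variables and compiling a function first, which is exactly the jupyter-friendliness trade-off described above.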