This repository contains all the documents and links for the Fidle training.
Fidle (for "Formation Introduction au Deep Learning") is a 2-day training session
co-organized by the CNRS Formation Permanente and the Resinfo/SARI and DevLOG CNRS networks.
The objectives of this training are:
- Understand the basics of Deep Learning neural networks
- Gain a first hands-on experience through simple and representative examples
- Understand TensorFlow/Keras and JupyterLab technologies
- Become familiar with Tier-1 and Tier-2 academic computing environments with powerful GPUs
For more information, see https://fidle.cnrs.fr :
- Fidle site
- Presentation of the training
- Program 2021/2022
- Subscribe to the mailing list to stay informed!
- Find us on YouTube
For more information, you can contact us at :
Current Version : 2.0.29
- Course slides: the course in PDF format (12 MB)
- Notebooks: get a Zip or clone this repository (40 MB)
- Datasets: all the needed datasets (1.2 GB)
- Videos: our YouTube channel
Have a look at How to get and install these notebooks and datasets.
- LINR1 - Linear regression with direct resolution
Low-level implementation, using numpy, of a direct resolution for a linear regression
- GRAD1 - Linear regression with gradient descent
Low-level implementation of a solution by gradient descent, with basic and stochastic approaches
- POLR1 - Complexity Syndrome
Illustration of the complexity problem with polynomial regression
- LOGR1 - Logistic regression
Simple example of logistic regression with a sklearn solution
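As a taste of what LINR1 and GRAD1 cover, here is a minimal numpy sketch comparing a direct resolution (normal equation) with gradient descent on the same least-squares problem. The toy data, learning rate, and iteration count are illustrative choices, not the notebooks' actual values:

```python
import numpy as np

# Toy data: y = 3x + 2 plus a little noise (values are illustrative)
rng = np.random.default_rng(42)
x = rng.uniform(0, 10, size=100)
y = 3.0 * x + 2.0 + rng.normal(0, 0.1, size=100)

# Direct resolution (normal equation): solve (X^T X) theta = X^T y
X = np.column_stack([x, np.ones_like(x)])        # add a bias column
theta_direct = np.linalg.solve(X.T @ X, X.T @ y)

# Gradient descent on the same mean-squared-error cost
theta = np.zeros(2)
lr = 0.01
for _ in range(5000):
    grad = 2.0 / len(y) * X.T @ (X @ theta - y)  # gradient of the MSE
    theta -= lr * grad

print(theta_direct)   # close to [3., 2.]
print(theta)          # converges toward the same solution
```

Both approaches recover roughly the same slope and intercept; the direct resolution is exact in one step, while gradient descent illustrates the iterative optimization used by neural networks.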
- PER57 - Perceptron Model 1957
Example of the use of a Perceptron, with sklearn and the IRIS dataset of 1936!
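In the spirit of PER57, a minimal sklearn sketch fitting a Perceptron to the Iris dataset (the split ratio and random seeds are illustrative choices, not necessarily the notebook's):

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import Perceptron
from sklearn.model_selection import train_test_split

# Load Fisher's 1936 Iris dataset and hold out a test set
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

# Rosenblatt's 1957 model, as implemented in sklearn
clf = Perceptron(random_state=0)
clf.fit(X_train, y_train)

score = clf.score(X_test, y_test)
print(f"test accuracy: {score:.2f}")
```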
- BHPD1 - Regression with a Dense Network (DNN)
Simple example of a regression with the Boston Housing Prices Dataset (BHPD)
- BHPD2 - Regression with a Dense Network (DNN) - Advanced code
A more advanced implementation of the previous example
- MNIST1 - Simple classification with DNN
An example of classification using a dense neural network for the famous MNIST dataset
- MNIST2 - Simple classification with CNN
An example of classification using a convolutional neural network for the famous MNIST dataset
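Both MNIST notebooks share the same preprocessing step before training. A numpy sketch of it, using random stand-in images instead of the real MNIST download (shapes and the 255 scaling match the dataset's 28x28 uint8 format):

```python
import numpy as np

# Stand-in for MNIST images: 28x28 uint8 grayscale (values here are random,
# only the shapes and dtypes reflect the real dataset)
rng = np.random.default_rng(0)
x = rng.integers(0, 256, size=(6, 28, 28), dtype=np.uint8)
y = np.array([5, 0, 4, 1, 9, 2])

# Normalize pixel values to [0, 1]
x_norm = x.astype("float32") / 255.0

# For a dense network (MNIST1): flatten each image into a 784-vector
x_dense = x_norm.reshape(len(x_norm), -1)          # shape (6, 784)

# For a convolutional network (MNIST2): add a channel axis
x_cnn = x_norm.reshape(len(x_norm), 28, 28, 1)     # shape (6, 28, 28, 1)

# One-hot encode the labels (10 digit classes)
y_onehot = np.eye(10, dtype="float32")[y]
```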
- GTSRB1 - Dataset analysis and preparation
Episode 1: Analysis of the GTSRB dataset and creation of an enhanced dataset
- GTSRB2 - First convolutions
Episode 2: First convolutions and first classification of our traffic signs
- GTSRB3 - Training monitoring
Episode 3: Monitoring, analysis and checkpoints during a training session
- GTSRB4 - Data augmentation
Episode 4: Improving our results by adding data through data augmentation when we lack it
- GTSRB5 - Full convolutions
Episode 5: A lot of models, a lot of datasets and a lot of results
- GTSRB6 - Full convolutions as a batch
Episode 6: To scale up computation, run your notebook in batch mode
- GTSRB7 - Batch reports
Episode 7: Displaying our jobs report, and the winner is...
- GTSRB10 - OAR batch script submission
Bash script for an OAR batch submission of an ipython code
- GTSRB11 - SLURM batch script
Bash script for a Slurm batch submission of an ipython code
- IMDB1 - Sentiment analysis with one-hot encoding
A basic example of sentiment analysis with sparse encoding, using a dataset from the Internet Movie Database (IMDB)
- IMDB2 - Sentiment analysis with text embedding
A very classical example of word embedding with a dataset from the Internet Movie Database (IMDB)
- IMDB3 - Reload and reuse a saved model
Retrieving a saved model to perform a sentiment analysis (movie review)
- IMDB4 - Reload embedded vectors
Retrieving embedded vectors from our trained model
- IMDB5 - Sentiment analysis with a RNN network
Still the same problem, but with a network combining embedding and RNN
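The sparse "one-hot" encoding used in IMDB1 can be sketched in a few lines of numpy. The reviews below are hypothetical word-index lists, made up to mimic the shape of the real IMDB data:

```python
import numpy as np

# Hypothetical reviews already converted to word-index lists,
# as the IMDB dataset provides them
reviews = [[1, 14, 22, 16], [1, 530, 14], [1, 22, 9000]]
vocab_size = 10000

def multi_hot(sequences, dim):
    """Sparse one-hot (multi-hot) encoding: one column per word index."""
    out = np.zeros((len(sequences), dim), dtype="float32")
    for i, seq in enumerate(sequences):
        out[i, seq] = 1.0
    return out

x = multi_hot(reviews, vocab_size)   # shape (3, 10000), mostly zeros
```

Each review becomes a fixed-size vector with a 1 at every word index it contains, which a dense network can then consume directly; IMDB2 replaces this sparse representation with a learned embedding.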
- LADYB1 - Prediction of a 2D trajectory via RNN
Artificial dataset generation and prediction attempt via a recurrent network
- SYNOP1 - Preparation of data
Episode 1: Data analysis and preparation of a usable meteorological dataset (SYNOP)
- SYNOP2 - First predictions at 3h
Episode 2: RNN training session for a weather prediction attempt at 3h
- SYNOP3 - 12h predictions
Episode 3: Attempt at longer-term predictions
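A common preparation step behind this kind of sequence prediction is cutting the series into (past window, future value) pairs. A numpy sketch, with a made-up sine series standing in for the meteorological data and illustrative window/horizon values:

```python
import numpy as np

def make_windows(series, window, horizon):
    """Cut a 1D series into (past window, value `horizon` steps ahead) pairs."""
    X, y = [], []
    for i in range(len(series) - window - horizon + 1):
        X.append(series[i : i + window])
        y.append(series[i + window + horizon - 1])
    return np.array(X), np.array(y)

# Hypothetical series standing in for an hourly weather measurement
temp = np.sin(np.linspace(0, 20, 200))

# Predict 3 steps ahead from the last 16 observations
X, y = make_windows(temp, window=16, horizon=3)
```

Each row of `X` is a short history an RNN can read step by step, and the matching entry of `y` is the value it must predict; increasing `horizon` (e.g. from a 3h to a 12h prediction) makes the task harder.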
- AE1 - Prepare a noisy MNIST dataset
Episode 1: Preparation of a noisy MNIST dataset
- AE2 - Building and training an AE denoiser model
Episode 2: Construction of a denoising autoencoder and its training with a noisy MNIST dataset
- AE3 - Playing with our denoiser model
Episode 3: Using the previously trained autoencoder to denoise data
- AE4 - Denoiser and classifier model
Episode 4: Construction of a denoiser and classifier model
- AE5 - Advanced denoiser and classifier model
Episode 5: Construction of an advanced denoiser and classifier model
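The "noisy MNIST" preparation of AE1 boils down to adding noise and clipping back to the valid pixel range. A numpy sketch, with random stand-in images and an illustrative noise level:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for normalized MNIST images in [0, 1]
x = rng.random((8, 28, 28)).astype("float32")

# Add Gaussian noise, then clip back to [0, 1] so the result
# is still a valid image batch
noise_level = 0.3
x_noisy = x + noise_level * rng.normal(size=x.shape)
x_noisy = np.clip(x_noisy, 0.0, 1.0).astype("float32")
```

The denoising autoencoder of AE2 is then trained with `x_noisy` as input and the clean `x` as target.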
- VAE1 - First VAE, using functional API (MNIST dataset)
Construction and training of a VAE, using the functional API, with a latent space of small dimension
- VAE2 - VAE, using a custom model class (MNIST dataset)
Construction and training of a VAE, using model subclassing, with a latent space of small dimension
- VAE3 - Analysis of the VAE's latent space of MNIST dataset
Visualization and analysis of the VAE's latent space for the MNIST dataset
- VAE5 - Another game play: About the CelebA dataset
Episode 1: Presentation of the CelebA dataset and problems related to its size
- VAE6 - Generation of a clustered dataset
Episode 2: Analysis of the CelebA dataset and creation of a clustered and usable dataset
- VAE7 - Checking the clustered dataset
Episode 3: Clustered dataset verification and testing of our data generator
- VAE8 - Training session for our VAE
Episode 4: Training with our clustered datasets in notebook or batch mode
- VAE9 - Data generation from latent space
Episode 5: Exploring the latent space to generate new data
- VAE10 - SLURM batch script
Bash script for a SLURM batch submission of VAE8 notebooks
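The sampling step at the heart of every VAE above is the reparameterization trick: the encoder outputs a mean and a log-variance, and a latent point is drawn as z = mu + sigma * eps with eps ~ N(0, I). A numpy sketch with made-up encoder outputs (batch of 4, latent dimension 2):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical encoder outputs for a batch of 4 samples, latent dim 2
z_mean = np.array([[0.0, 1.0]] * 4)
z_log_var = np.array([[0.0, -1.0]] * 4)

# Reparameterization trick: z = mu + sigma * eps, eps ~ N(0, I).
# Writing sigma as exp(0.5 * log_var) keeps the sampling differentiable
# with respect to the encoder's outputs.
eps = rng.standard_normal(z_mean.shape)
z = z_mean + np.exp(0.5 * z_log_var) * eps
```

Exploring this latent space, as in VAE3 and VAE9, means decoding points `z` chosen by hand rather than sampled from an encoded image.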
- DCGAN01 - A first DCGAN to Draw a Sheep
Episode 1 : Draw me a sheep, revisited with a DCGAN
- ACTF1 - Activation functions
Some activation functions, with their derivatives
- NP1 - A short introduction to Numpy
Numpy is an essential tool for scientific Python
- SCRATCH1 - Scratchbook
A scratchbook for small examples
- TSB1 - Tensorboard with/from Jupyter
4 ways to use Tensorboard from the Jupyter environment
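In the spirit of ACTF1, a few classic activation functions with their derivatives, written in plain numpy (a small illustrative selection, not the notebook's full list):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def d_sigmoid(x):
    s = sigmoid(x)
    return s * (1.0 - s)          # derivative: sigma(x) * (1 - sigma(x))

def relu(x):
    return np.maximum(0.0, x)

def d_relu(x):
    return (x > 0).astype(float)  # 0 for negative inputs, 1 for positive

def tanh(x):
    return np.tanh(x)

def d_tanh(x):
    return 1.0 - np.tanh(x) ** 2  # derivative: 1 - tanh(x)^2
```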
Have a look at How to get and install these notebooks and datasets.
[en] Attribution - NonCommercial - ShareAlike 4.0 International (CC BY-NC-SA 4.0)
[Fr] Attribution - Pas d’Utilisation Commerciale - Partage dans les Mêmes Conditions 4.0 International
See License.
See Disclaimer.