Module 1 Assignment: Learning Feature Representation (WASP course)

Investigating the effect of using unsupervised pre-training on the object classification downstream task.

Unsupervised pre-training

Completing pre-training saves the network's weights in the "stored" folder (which is created if it does not already exist).
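For reference, the saving step amounts to roughly the following (a minimal sketch; the placeholder encoder and the file name are illustrative assumptions, not the repository's exact code):

    import os
    import torch
    import torch.nn as nn

    encoder = nn.Linear(8, 8)  # placeholder for the pre-trained network

    # Create the "stored" folder if it does not already exist, then dump the weights.
    os.makedirs("stored", exist_ok=True)
    torch.save(encoder.state_dict(), os.path.join("stored", "encoder_weights.pth"))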

DIM

Unsupervised pre-training on Tiny ImageNet using the Deep InfoMax (DIM) framework (https://arxiv.org/abs/1808.06670)

python pre_train_DIM.py
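At its core, DIM trains the encoder by maximising mutual information between a global feature vector and the local feature map of the same image. The sketch below illustrates the local objective with a Jensen-Shannon estimator; the discriminator architecture, names, and shapes are assumptions for illustration and do not reproduce the linked implementation.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class LocalDiscriminator(nn.Module):
        # Scores (global, local) pairs by broadcasting the global vector over
        # every spatial position of the local feature map.
        def __init__(self, global_dim, local_dim):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(global_dim + local_dim, 256, 1), nn.ReLU(),
                nn.Conv2d(256, 1, 1),
            )

        def forward(self, g, m):
            g = g[:, :, None, None].expand(-1, -1, m.size(2), m.size(3))
            return self.net(torch.cat([g, m], dim=1))

    def dim_local_loss(disc, global_vec, local_map):
        # Positive pairs: global and local features from the same image.
        # Negative pairs: global features paired with the local maps of other
        # images (obtained here by rolling the batch).
        pos = disc(global_vec, local_map)
        neg = disc(global_vec, local_map.roll(1, dims=0))
        # Jensen-Shannon mutual-information lower bound, written as a loss.
        return F.softplus(-pos).mean() + F.softplus(neg).mean()

    # Example: a batch of 4 images with a 64-d global vector and a 128-channel 8x8 local map.
    disc = LocalDiscriminator(64, 128)
    loss = dim_local_loss(disc, torch.randn(4, 64), torch.randn(4, 128, 8, 8))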

dAE

Unsupervised pre-training on Tiny ImageNet using a denoising autoencoder (dAE)

python pre_train_dAE.py
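A denoising autoencoder corrupts the input (here with Gaussian noise) and trains the network to reconstruct the clean image; the encoder is then reused for the downstream task. The architecture and noise level below are illustrative assumptions, not the repository's exact settings.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class DenoisingAE(nn.Module):
        def __init__(self):
            super().__init__()
            self.encoder = nn.Sequential(
                nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),
            )
            self.decoder = nn.Sequential(
                nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
                nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Sigmoid(),
            )

        def forward(self, x):
            return self.decoder(self.encoder(x))

    def dae_loss(model, clean, noise_std=0.1):
        noisy = clean + noise_std * torch.randn_like(clean)  # corrupt the input
        return F.mse_loss(model(noisy), clean)               # reconstruct the clean target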

Fine Tuning

Supervised fine-tuning on CIFAR-10

python fine_tune_cifar10.py
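Fine-tuning loads the pre-trained encoder from "stored", attaches a linear classifier head, and trains on CIFAR-10 with cross-entropy. The sketch below uses a placeholder encoder and a hypothetical weights file name; the hyper-parameters are assumptions, not the repository's settings.

    import torch
    import torch.nn as nn
    from torch.utils.data import DataLoader
    from torchvision import datasets, transforms

    # Placeholder encoder; in practice this is the network pre-trained by DIM or the dAE.
    encoder = nn.Sequential(
        nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    )
    # encoder.load_state_dict(torch.load("stored/encoder_weights.pth"))  # hypothetical file name

    model = nn.Sequential(encoder, nn.Linear(64, 10))  # 10 CIFAR-10 classes
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
    criterion = nn.CrossEntropyLoss()

    # Downloads CIFAR-10 into data/cifar10, matching the layout described below.
    train_set = datasets.CIFAR10("data/cifar10", train=True, download=True,
                                 transform=transforms.ToTensor())
    loader = DataLoader(train_set, batch_size=128, shuffle=True)

    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()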

Dataset format

The data directory should look as follows. The cifar10 folder is created automatically when running any of the training scripts, while the imagenet_tiny structure must be created manually using the "image_tensor.bin" file provided in the assignment description (a minimal setup sketch follows the tree below).

    ├── data
    │   ├──imagenet_tiny
    │   │   ├── image_tensor.bin        
    │   ├──cifar10
    └── ...
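The setup can be scripted along these lines (a minimal sketch; the dtype and image shape used when reading image_tensor.bin are assumptions, since the exact binary format comes from the assignment description):

    import os
    import numpy as np

    os.makedirs("data/imagenet_tiny", exist_ok=True)  # place image_tensor.bin in here manually
    os.makedirs("data/cifar10", exist_ok=True)        # also created by the training scripts

    # Assumed example read: uint8 pixels of 64x64 RGB Tiny ImageNet images.
    raw = np.fromfile("data/imagenet_tiny/image_tensor.bin", dtype=np.uint8)
    images = raw.reshape(-1, 3, 64, 64)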

Acknowledgements

The DIM pre-training uses the PyTorch DIM implementation from https://github.com/DuaneNielsen/DeepInfomaxPytorch.
