
# Awesome work implemented in PyTorch

This repository aims to reimplement some interesting work and provides examples for self-study and customization in your own applications.

## Contents

### Object detection

  1. [AAAI 2019] Gradient Harmonized Single-stage Detector. We modify the GHM-C loss from this paper for multi-class classification problems. [code]
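
The multi-class adaptation of the GHM-C loss can be sketched as follows. This is a minimal illustration under our own assumptions, not the repository's actual code: it approximates each sample's gradient norm as |p_true − 1| (where p_true is the softmax probability of the ground-truth class), bins samples by that norm, and reweights the per-sample cross-entropy by inverse gradient density, as in the paper.

```python
import torch
import torch.nn.functional as F

def ghm_c_loss_multiclass(logits, targets, bins=10):
    """Sketch of a GHM-C loss generalized to multi-class classification.

    Samples are binned by their approximate gradient norm and reweighted
    by inverse gradient density, so that very easy and very hard samples
    contribute less to the total loss.
    """
    probs = F.softmax(logits, dim=1)
    p_true = probs.gather(1, targets.unsqueeze(1)).squeeze(1)
    g = (p_true - 1.0).abs().detach()            # gradient norm in [0, 1]

    n = logits.size(0)
    weights = torch.zeros(n, device=logits.device)
    edges = torch.linspace(0, 1, bins + 1, device=logits.device)
    edges[-1] += 1e-6                            # include g == 1 in the last bin
    for i in range(bins):
        in_bin = (g >= edges[i]) & (g < edges[i + 1])
        count = in_bin.sum().item()
        if count > 0:
            # beta_i = N / GD(g_i), with GD(g) ~ count * bins
            weights[in_bin] = n / (count * bins)

    loss = F.cross_entropy(logits, targets, reduction="none")
    return (loss * weights).mean()
```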

### Image classification

  1. [CVPR 2019] Bag of Tricks for Image Classification with Convolutional Neural Networks. We tested some of these tricks on the CIFAR10 dataset; see Experiments on CIFAR10 below for details.

  2. [CVPR 2019] Selective Kernel Networks. [code]. [our impl.]

### Optimization

  1. [ICLR 2019] Decoupled Weight Decay Regularization. [code]
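
Decoupled weight decay is available directly in PyTorch as `torch.optim.AdamW`, which applies the decay to the parameters themselves rather than folding it into the gradient as plain Adam does. A minimal usage sketch (the model and hyperparameters here are illustrative only):

```python
import torch

# AdamW decouples weight decay from the gradient-based update,
# as proposed in "Decoupled Weight Decay Regularization".
model = torch.nn.Linear(10, 2)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3, weight_decay=1e-2)

x = torch.randn(4, 10)
loss = model(x).pow(2).mean()

optimizer.zero_grad()
loss.backward()
optimizer.step()
```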

## AutoML

We use the NNI (Neural Network Intelligence) toolkit to search for the best hyperparameters (learning rate, label smoothing, etc.) on a local machine. Our code is here; the NNI experiment (including the search space and config) is here.

We tuned three hyperparameters with the NNI toolkit: learning rate, label smoothing, and alpha (for the mixup trick). As in the manual experiment, training used a ResNet32 with the CosineAnnealing, Mixup, and LabelSmoothing tricks. The table below compares the manually chosen and NNI-searched hyperparameters, reporting the best mAP@1.

| Hyperparameter | Manual | NNI search |
| --- | --- | --- |
| Initial learning rate | 0.1 | 0.4888396525811341 |
| alpha (for mixup trick) | 1.0 | 0.47897353118618013 |
| Label smoothing | 0.1 | 0.11541420525458237 |
| mAP@1 | 94.38% | 94.71% |
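
One training step combining the three tricks named above can be sketched as follows. This is an illustrative outline, not our training script: the model, batch, and hyperparameter values are placeholders, and the `label_smoothing` argument of `F.cross_entropy` assumes PyTorch 1.10 or later.

```python
import torch
import torch.nn.functional as F

def mixup(x, y, alpha=1.0):
    """Mixup: a convex combination of two samples; the loss is mixed likewise."""
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    idx = torch.randperm(x.size(0))
    return lam * x + (1 - lam) * x[idx], y, y[idx], lam

model = torch.nn.Linear(32, 10)                  # stand-in for ResNet32
opt = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
sched = torch.optim.lr_scheduler.CosineAnnealingLR(opt, T_max=100)

x, y = torch.randn(8, 32), torch.randint(0, 10, (8,))
xm, ya, yb, lam = mixup(x, y, alpha=1.0)
logits = model(xm)
# label smoothing applied through cross_entropy; loss mixed with the same lambda
loss = lam * F.cross_entropy(logits, ya, label_smoothing=0.1) \
     + (1 - lam) * F.cross_entropy(logits, yb, label_smoothing=0.1)

opt.zero_grad()
loss.backward()
opt.step()
sched.step()                                     # cosine-annealed learning rate
```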

## Experiments on CIFAR10

Image classification tricks on CIFAR10, following [1].

| Heuristic | mAP@1 | Lift |
| --- | --- | --- |
| ResNet32 | 92.40% | +0.00% (baseline) |
| ResNet32+ZeroGamma | 93.39% | +0.99% |
| ResNet32D | 93.06% | +0.66% |
| ResNet32D+ZeroGamma | 93.05% | +0.65% |
| ResNet32+CosineAnnealing | 93.66% | +1.26% |
| ResNet32+LabelSmoothing | 92.92% | +0.52% |
| ResNet32+ZeroGamma+CosineAnnealing | 93.63% | +1.23% |
| ResNet32+CosineAnnealing+Mixup | 94.18% | +1.78% |
| ResNet32+CosineAnnealing+Mixup+LabelSmoothing | 94.38% | +1.98% |
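
The ZeroGamma trick in the table initializes the scale (gamma) of the last BatchNorm layer in each residual block to zero, so every block starts as an identity mapping. A minimal sketch, assuming each block exposes its final BatchNorm as an attribute named `bn2` (an illustrative convention, not necessarily this repository's):

```python
import torch.nn as nn

def zero_gamma_init(model, last_bn_attr="bn2"):
    """Zero-initialize the gamma of each residual block's final BatchNorm."""
    for m in model.modules():
        # assumption: each residual block stores its final BN as `bn2`
        if hasattr(m, last_bn_attr):
            bn = getattr(m, last_bn_attr)
            if isinstance(bn, nn.BatchNorm2d):
                nn.init.zeros_(bn.weight)
```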