This repository aims to reimplement some interesting work and provides examples for self-study and customization in your own applications.
- [AAAI 2019] Gradient Harmonized Single-stage Detector. We modify the GHM-C loss from this paper for multi-class classification problems (see the sketch after this list). [code]
- [CVPR 2019] Bag of Tricks for Image Classification with Convolutional Neural Networks. We tested some of the tricks on the CIFAR10 dataset; see Experiments on CIFAR10 for more details.
- [CVPR 2019] Selective Kernel Networks. [code]. [our impl.]
- [ICLR 2019] Decoupled Weight Decay Regularization (see the usage sketch after this list). [code]
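Below is a minimal sketch of a GHM-C-style loss adapted to multi-class classification, assuming PyTorch. The binning and EMA scheme follow the paper; the class name `GHMCMultiClass` and the exact shape handling are illustrative, not a copy of our implementation.

```python
import torch
import torch.nn.functional as F


class GHMCMultiClass(torch.nn.Module):
    """GHM-C loss adapted from binary detection to multi-class
    classification: samples are re-weighted inversely to the
    density of their gradient norms."""

    def __init__(self, bins=10, momentum=0.0):
        super().__init__()
        self.bins = bins
        self.momentum = momentum
        # Bin edges over the gradient-norm range [0, 1].
        self.register_buffer("edges", torch.linspace(0, 1, bins + 1))
        self.edges[-1] += 1e-6
        if momentum > 0:
            # Running (EMA) estimate of per-bin sample counts.
            self.register_buffer("acc_sum", torch.zeros(bins))

    def forward(self, logits, target):
        # Gradient norm of softmax CE w.r.t. the true-class logit: |p_t - 1|.
        probs = F.softmax(logits.detach(), dim=1)
        p_true = probs.gather(1, target.unsqueeze(1)).squeeze(1)
        g = (p_true - 1.0).abs()

        n = logits.size(0)
        weights = torch.zeros_like(g)
        valid_bins = 0
        for i in range(self.bins):
            in_bin = (g >= self.edges[i]) & (g < self.edges[i + 1])
            count = in_bin.sum().item()
            if count > 0:
                if self.momentum > 0:
                    self.acc_sum[i] = (self.momentum * self.acc_sum[i]
                                       + (1 - self.momentum) * count)
                    weights[in_bin] = n / self.acc_sum[i]
                else:
                    weights[in_bin] = n / count
                valid_bins += 1
        if valid_bins > 0:
            weights = weights / valid_bins

        # Weighted cross entropy, normalized by batch size.
        loss = F.cross_entropy(logits, target, reduction="none")
        return (loss * weights).sum() / n
```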
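For decoupled weight decay, PyTorch ships the optimizer directly; the sketch below just shows the standard `torch.optim.AdamW` entry point. The model and the `lr`/`weight_decay` values are placeholders, not settings from this repository.

```python
import torch

model = torch.nn.Linear(128, 10)  # stand-in model
# AdamW decays the weights directly instead of folding the decay
# into the gradient, as proposed in the paper.
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3, weight_decay=1e-2)
```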
We use the NNI (Neural Network Intelligence) toolkit to search for the best hyperparameters (learning rate, label smoothing, etc.) on a local machine. Our code is here; the NNI experiment (including the search space and config) is here.
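A minimal sketch of how a trial script talks to NNI, using the standard `nni` Python API; `train_and_evaluate` is a hypothetical stand-in for the real training loop, and the defaults mirror the manual values in the table below.

```python
import random

import nni


def train_and_evaluate(lr, label_smoothing, alpha, epoch):
    """Hypothetical stand-in for one epoch of training + evaluation;
    the real experiment trains ResNet32 on CIFAR10 here."""
    return random.random()  # placeholder top-1 accuracy


def main():
    # Defaults are overridden by whatever the NNI tuner suggests
    # from the search space (lr, label_smoothing, alpha).
    params = {"lr": 0.1, "label_smoothing": 0.1, "alpha": 1.0}
    params.update(nni.get_next_parameter())

    acc = 0.0
    for epoch in range(200):
        acc = train_and_evaluate(params["lr"], params["label_smoothing"],
                                 params["alpha"], epoch)
        nni.report_intermediate_result(acc)  # per-epoch metric for the tuner
    nni.report_final_result(acc)             # final metric NNI optimizes


if __name__ == "__main__":
    main()
```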
We searched three hyperparameters with the NNI toolkit: the initial learning rate, label smoothing, and alpha (for the mixup trick). As in the manual experiment, each NNI trial trains a ResNet32 with the CosineAnnealing, Mixup, and LabelSmoothing tricks. The table below compares the manually tuned hyperparameters against those found by the NNI search, together with the best mAP@1 each achieves.
| Hyperparameter | Manual | NNI Search |
|---|---|---|
| init learning rate | 0.1 | 0.4888396525811341 |
| alpha (for mixup trick) | 1.0 | 0.47897353118618013 |
| label smoothing | 0.1 | 0.11541420525458237 |
| mAP@1 | 94.38% | 94.71% |
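For reference, a minimal sketch of how the three searched hyperparameters enter a training step, assuming PyTorch. `mixup_batch` and `train` are illustrative helpers following the standard mixup/label-smoothing/cosine-annealing recipes; the default values are the (rounded) NNI-searched hyperparameters from the table above.

```python
import numpy as np
import torch
import torch.nn.functional as F


def mixup_batch(x, y, alpha):
    """Standard mixup: convex-combine pairs of images and keep both labels."""
    lam = np.random.beta(alpha, alpha) if alpha > 0 else 1.0
    perm = torch.randperm(x.size(0))
    return lam * x + (1 - lam) * x[perm], y, y[perm], lam


def train(model, loader, epochs, lr=0.4888, label_smoothing=0.1154, alpha=0.4790):
    # Defaults are the rounded NNI-searched values from the table above.
    optimizer = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9,
                                weight_decay=5e-4)
    # CosineAnnealing trick: decay the lr along a cosine over training.
    scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=epochs)

    for _ in range(epochs):
        for x, y in loader:
            x, y_a, y_b, lam = mixup_batch(x, y, alpha)
            logits = model(x)
            # Label smoothing applies to both mixed targets.
            loss = (lam * F.cross_entropy(logits, y_a,
                                          label_smoothing=label_smoothing)
                    + (1 - lam) * F.cross_entropy(logits, y_b,
                                                  label_smoothing=label_smoothing))
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
        scheduler.step()
```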
Image classification tricks on CIFAR10 (refer to [1]):
| Heuristic | mAP@1 | Lift |
|---|---|---|
| ResNet32 | 92.40% | +0.00% (baseline) |
| ResNet32 + ZeroGamma | 93.39% | +0.99% |
| ResNet32D | 93.06% | +0.66% |
| ResNet32D + ZeroGamma | 93.05% | +0.65% |
| ResNet32 + CosineAnnealing | 93.66% | +1.26% |
| ResNet32 + LabelSmoothing | 92.92% | +0.52% |
| ResNet32 + ZeroGamma + CosineAnnealing | 93.63% | +1.23% |
| ResNet32 + CosineAnnealing + Mixup | 94.18% | +1.78% |
| ResNet32 + CosineAnnealing + Mixup + LabelSmoothing | 94.38% | +1.98% |
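As a concrete example of the ZeroGamma trick from [1], a minimal sketch assuming a PyTorch ResNet whose residual blocks expose their last BatchNorm as `bn2` (a hypothetical attribute name; adapt it to the actual block definition):

```python
import torch.nn as nn


def apply_zero_gamma(model):
    """ZeroGamma trick: initialize the scale (gamma) of the last BatchNorm
    in every residual block to zero, so each block starts out as an
    identity mapping and the network is easier to train early on."""
    for module in model.modules():
        # `bn2` is assumed to be the final BN of a basic residual block.
        if hasattr(module, "bn2") and isinstance(module.bn2, nn.BatchNorm2d):
            nn.init.zeros_(module.bn2.weight)
```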