---
layout: default
---
Large-scale Distance Metric Learning with Uncertainty.[CVPR'18] pdf
We show that representations can be optimized with triplets defined on multiple centers of each class rather than on the original examples.
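A minimal sketch of a triplet loss defined on class centers rather than individual examples (the margin value, the nearest-center selection, and the function name are illustrative assumptions, not the paper's exact formulation):

```python
import numpy as np

def center_triplet_loss(anchor, centers, label, margin=0.2):
    """Triplet loss on class centers: pull the anchor toward the
    nearest center of its own class and push it away from the
    nearest center of any other class (margin is a hypothetical
    choice)."""
    # Distance from the anchor to every center of every class.
    d = np.linalg.norm(centers - anchor, axis=2)    # (num_classes, k)
    pos = d[label].min()                            # closest own-class center
    d_other = np.delete(d, label, axis=0)
    neg = d_other.min()                             # closest other-class center
    return max(0.0, pos - neg + margin)
```

With multiple centers per class, the nearest own-class center absorbs intra-class variation that a single per-example triplet would have to model directly.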
We show that the conventional cross-entropy loss with a normalized softmax operator is equivalent to a triplet loss defined on class proxies. Based on this analysis, we propose an improved loss that encodes multiple proxies for each class.
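The multi-proxy idea could be sketched as follows (the max-aggregation over a class's proxies, the temperature value, and the function name are assumptions for illustration; the paper's actual loss may aggregate differently):

```python
import numpy as np

def multi_proxy_softmax_loss(embedding, proxies, label, tau=0.1):
    """Softmax loss over multiple proxies per class.

    embedding: (d,) L2-normalized feature.
    proxies:   (num_classes, k, d) k L2-normalized proxies per class.
    label:     ground-truth class index.
    tau:       temperature (hypothetical default).
    """
    # Similarity of the embedding to every proxy of every class.
    sims = proxies @ embedding                       # (num_classes, k)
    # Represent each class by its most similar proxy; a soft
    # (log-sum-exp) aggregation is another common choice.
    class_sims = sims.max(axis=1)                    # (num_classes,)
    logits = class_sims / tau
    logits = logits - logits.max()                   # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum())
    return -log_probs[label]
```

With k=1 this reduces to the standard normalized softmax, which is exactly the proxy-triplet comparison described above: the embedding against its own class proxy versus every other class's proxy.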
We propose an algorithm that balances the performance across classes in the source domain during pre-training. A balanced model can transfer better when fine-tuned on a different domain.
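One simple way to balance classes in pre-training is inverse-frequency sampling, sketched below (this is an illustrative baseline, not necessarily the algorithm the paper proposes):

```python
import numpy as np

def balanced_sampling_weights(labels):
    """Weight each example inversely to its class frequency, so every
    class contributes equally in expectation when sampling by these
    weights (a common balancing heuristic; hypothetical here)."""
    classes, counts = np.unique(labels, return_counts=True)
    freq = dict(zip(classes, counts))
    w = np.array([1.0 / freq[y] for y in labels])
    return w / w.sum()
```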
Dash: Semi-Supervised Learning with Dynamic Thresholding.[ICML'21] pdf
We propose a novel thresholding strategy to select appropriate unlabeled data for semi-supervised learning.
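The dynamic-thresholding idea could be sketched as follows (the decay schedule, hyperparameter names `rho0` and `gamma`, and default values are assumptions for illustration):

```python
import numpy as np

def select_unlabeled(losses, step, rho0=1.0, gamma=1.1):
    """Keep an unlabeled example only if its loss with respect to its
    pseudo-label falls below a threshold that shrinks as training
    proceeds, so the selection becomes stricter over time."""
    threshold = rho0 * gamma ** (-step)   # monotonically decreasing
    return losses < threshold             # boolean mask over the batch
```

Early in training the loose threshold admits most pseudo-labeled data; later, only confidently fit examples pass.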
We show that representations learned by deep networks are closely related to the training task. With only coarse labels, it is possible to approach the performance of fully supervised training.
Robust Optimization over Multiple Domains.[AAAI'19] pdf
Given examples from multiple domains, we propose an algorithm that effectively optimizes for the worst-performing domain.
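A common way to realize this minimax objective is to maintain a weight per domain and shift weight toward domains with larger loss; a sketch (the exponentiated-gradient update and step size `eta` are assumptions, not necessarily the paper's exact update):

```python
import numpy as np

def update_domain_weights(weights, domain_losses, eta=0.5):
    """Adversarial weighting step for robust multi-domain training:
    domains with larger loss receive multiplicatively larger weight.
    The model is then trained on the resulting weighted loss."""
    w = weights * np.exp(eta * domain_losses)
    return w / w.sum()
```

Alternating this weight update with ordinary gradient steps on the weighted loss drives the model toward improving its worst domain.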
To tackle the class-imbalance issue in object detection, we convert the classification problem into a ranking problem and propose a novel loss accordingly.
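A pairwise hinge form illustrates the classification-to-ranking conversion (the hinge surrogate and margin value are illustrative assumptions; the paper's actual loss may use a different surrogate):

```python
import numpy as np

def pairwise_ranking_loss(pos_scores, neg_scores, margin=0.5):
    """Rank every foreground (positive) score above every background
    (negative) score by a margin. Because the loss compares pairs
    rather than classifying each anchor independently, the flood of
    easy background anchors no longer dominates the objective."""
    # diffs[i, j] = margin + neg_j - pos_i; positive entries are violations.
    diffs = margin + neg_scores[None, :] - pos_scores[:, None]
    return np.maximum(diffs, 0).mean()
```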