Open-Set


Updated 2023-03-14

Boosting Open-Set Domain Adaptation with Threshold Self-Tuning and Cross-Domain Mixup

Authors: Xinghong Liu, Yi Zhou, Tao Zhou, Jie Qin, Shengcai Liao

Open-set domain adaptation (OSDA) aims not only to recognize target samples belonging to common classes shared by the source and target domains but also to detect unknown-class samples. Existing OSDA methods suffer from two obstacles. First, most OSDA approaches require a tedious process of manually tuning a threshold hyperparameter to separate common and unknown classes, and it is difficult to determine a proper threshold when the target domain data is unlabeled. Second, most OSDA methods rely only on confidence values to distinguish common from unknown classes and train models with limited source and target samples, leading to unsatisfactory performance when the target domain consists mostly of unknown classes. Our studies demonstrate that exploiting multiple criteria within a more continuous latent space benefits the model's performance. In this paper, we design a novel threshold self-tuning and cross-domain mixup (TSCM) method to overcome these two drawbacks. TSCM automatically tunes a proper threshold using unlabeled target samples rather than manually setting an empirical hyperparameter. Our method considers multiple criteria instead of only the confidence and uses the self-generated threshold to separate common and unknown classes in the target domain. Moreover, we introduce a cross-domain mixup method designed for OSDA scenarios to learn domain-invariant features in a more continuous latent space. Comprehensive experiments illustrate that our method consistently achieves superior performance on different benchmarks compared with various state-of-the-art methods.
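To make the two ideas in the abstract concrete, here is a minimal, hypothetical PyTorch sketch. `self_tuned_threshold` derives a common/unknown cutoff from confidence statistics of unlabeled target predictions (the paper combines multiple criteria; a simple confidence-midpoint rule stands in here), and `cross_domain_mixup` interpolates source and target inputs with a Beta-distributed coefficient, as in standard mixup. The function names and the exact thresholding rule are illustrative assumptions, not the paper's formulation.

```python
import torch
import torch.nn.functional as F

def self_tuned_threshold(logits_tgt):
    """Hypothetical threshold self-tuning: derive a common/unknown cutoff
    from unlabeled target statistics instead of a hand-set hyperparameter.
    As a stand-in for the paper's multi-criteria rule, take the midpoint
    between the mean confidences of the lower and upper halves of samples."""
    conf = F.softmax(logits_tgt, dim=1).max(dim=1).values
    sorted_conf, _ = conf.sort()
    half = len(sorted_conf) // 2
    low = sorted_conf[:half].mean()   # likely unknown-class samples
    high = sorted_conf[half:].mean()  # likely common-class samples
    return ((low + high) / 2).item()

def cross_domain_mixup(x_src, x_tgt, alpha=1.0):
    """Standard mixup interpolation applied across domains, so the model
    sees samples on the path between source and target distributions,
    encouraging a more continuous, domain-invariant latent space."""
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    x_mix = lam * x_src + (1.0 - lam) * x_tgt
    return x_mix, lam  # lam would weight the source-side loss on x_mix
```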
PDF

Click here to view paper screenshots
