Presenter: Noo-ri Kim
Presentation date: 2022-08-31
Authors: Noo-ri Kim and Jee-Hyong Lee
Venue: CVPR 2022

Semi-supervised learning (SSL) is a method for building better models by using a large amount of easily accessible unlabeled data along with a small amount of labeled data obtained at high cost. Most existing SSL studies focus on cases where a sufficient number of labeled samples is available, tens to hundreds of labeled samples per class, which still requires considerable labeling cost. In this paper, we focus on an SSL environment with extremely scarce labeled samples, only one or two labeled samples per class, where most existing methods fail to learn. We propose a propagation regularizer that enables efficient and effective learning with extremely scarce labeled samples by suppressing confirmation bias. In addition, for realistic model selection in the absence of a validation dataset, we also propose a model selection method based on our propagation regularizer. The proposed methods achieve 70.9%, 30.3%, and 78.9% accuracy on the CIFAR-10, CIFAR-100, and SVHN datasets with just one labeled sample per class, improvements of 8.9% to 120.2% over existing approaches. Our proposed methods also perform well on a higher-resolution dataset, STL-10.
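The abstract specifies the setting (pseudo-label-based SSL with one or two labeled samples per class, where confirmation bias dominates) but not the regularizer's exact form. The PyTorch sketch below shows the generic confidence-thresholded pseudo-labeling objective such methods build on, with a simple batch-entropy term standing in where the paper's propagation regularizer would go. The function name, threshold, and reg_weight are illustrative assumptions, not the paper's implementation.

    import torch
    import torch.nn.functional as F

    def ssl_step(model, x_labeled, y_labeled, x_unlabeled,
                 threshold=0.95, reg_weight=1.0):
        """One training step of confidence-thresholded pseudo-labeling (sketch)."""
        # Supervised loss on the (extremely scarce) labeled samples.
        sup_loss = F.cross_entropy(model(x_labeled), y_labeled)

        # Predictions on unlabeled data; pseudo-labels are taken without gradients.
        logits_u = model(x_unlabeled)
        with torch.no_grad():
            probs = F.softmax(logits_u, dim=1)
            conf, pseudo = probs.max(dim=1)
            mask = conf.ge(threshold).float()

        # Unsupervised loss, counted only for confidently pseudo-labeled samples.
        unsup_loss = (F.cross_entropy(logits_u, pseudo, reduction="none") * mask).mean()

        # Placeholder regularizer (NOT the paper's propagation regularizer):
        # maximize the entropy of the batch-averaged class distribution so the
        # model does not collapse onto a few pseudo-labeled classes.
        mean_probs = F.softmax(logits_u, dim=1).mean(dim=0)
        neg_entropy = (mean_probs * mean_probs.clamp_min(1e-8).log()).sum()

        return sup_loss + unsup_loss + reg_weight * neg_entropy

In this kind of loop, confirmation bias arises because wrong pseudo-labels reinforce themselves; the paper's contribution is a regularizer designed to suppress that effect when only one or two labels per class are available.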


2022

    Towards Unsupervised Domain Generalization
    2022.10.12
    Presenter: 이진섭     Date: 2022-10-12     Authors: Haiyang Yang*, Linjun Zhou*, Renzhe Xu, Peng Cui†, Zheyan Shen, Haoxin Liu     Venue: CVPR 2022

    Bootstrapped Meta-Learning
    2022.09.21
    Presenter: 안재한     Date: 2022-09-21     Authors: Sebastian Flennerhag, Yannick Schroecker, Tom Zahavy, Hado van Hasselt, David Silver, Satinder Singh     Venue: ICLR 2022

    Federated Multi-Target Domain Adaptation
    2022.09.07
    Presenter: 강용훈     Date: 2022-09-07     Authors: Chun-Han Yao, Boqing Gong, Hang Qi, Yin Cui, Yukun Zhu, Ming-Hsuan Yang     Venue: WACV 2022