Class-Imbalanced Semi-Supervised Learning

Presenter: 김누리
Presentation date: 2021-08-11
Authors: Hyun, Minsung, Jisoo Jeong, and Nojun Kwak
Venue: arXiv preprint arXiv:2002.06815 (2020)
Semi-Supervised Learning (SSL) has achieved great success in overcoming the difficulty of labeling and in making full use of unlabeled data. However, SSL commonly assumes that the number of samples in each class is balanced, and many SSL algorithms perform worse on datasets with imbalanced class distributions. In this paper, we introduce the task of class-imbalanced semi-supervised learning (CISSL), i.e., semi-supervised learning with class-imbalanced data, considering class imbalance in both the labeled and the unlabeled sets. First, we analyze existing SSL methods in imbalanced environments and examine how class imbalance affects them. We then propose Suppressed Consistency Loss (SCL), a regularization method robust to class imbalance. Our method outperforms conventional methods in the CISSL setting; in particular, the more severe the class imbalance and the smaller the labeled set, the better our method performs.
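The abstract names Suppressed Consistency Loss (SCL) but does not give its form. Below is a minimal PyTorch-style sketch of the general idea, assuming a per-class suppression weight that down-weights the unlabeled consistency term for samples predicted as minority classes. The function name, the weighting schedule torch.pow(beta, 1 - class frequency), and the use of a mean-squared consistency term are illustrative assumptions, not the paper's exact formulation.

import torch
import torch.nn.functional as F


def suppressed_consistency_loss(logits_weak, logits_strong, class_counts, beta=0.5):
    # logits_weak, logits_strong: (B, C) outputs for two augmented views of the
    # same unlabeled batch (names are illustrative, not the paper's notation).
    # class_counts: (C,) labeled-sample count per class.
    # beta: hypothetical hyperparameter; per-sample weights range from beta (rarest
    #       class) up to 1 (most frequent class).
    with torch.no_grad():
        probs = F.softmax(logits_weak, dim=1)
        pred = probs.argmax(dim=1)                         # pseudo-class per sample
        freq = class_counts.float() / class_counts.max()   # relative class frequency
        weight = torch.pow(beta, 1.0 - freq)               # (C,) suppression weights
        sample_w = weight[pred]                            # (B,) per-sample weight

    # Standard mean-squared consistency between the two views' predictions,
    # scaled per sample by the suppression weight.
    consistency = F.mse_loss(
        F.softmax(logits_strong, dim=1), probs, reduction="none"
    ).mean(dim=1)
    return (sample_w * consistency).mean()


# Example use: a 10-class problem with a long-tailed labeled set.
counts = torch.tensor([500, 300, 200, 120, 80, 50, 30, 20, 10, 5])
lw, ls = torch.randn(64, 10), torch.randn(64, 10)
loss = suppressed_consistency_loss(lw, ls, counts)

With sample_w fixed to 1 everywhere this reduces to a plain consistency loss; the suppression factor only changes how strongly consistency is enforced on unlabeled samples predicted as minority classes.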
