Semantics-driven attentive few-shot learning over clean and noisy samples

Creative Commons License


NEUROCOMPUTING, vol.513, pp.59-69, 2022 (SCI-Expanded)

  • Publication Type: Article
  • Volume: 513
  • Publication Date: 2022
  • Doi Number: 10.1016/j.neucom.2022.09.121
  • Journal Name: NEUROCOMPUTING
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Academic Search Premier, PASCAL, Applied Science & Technology Source, Biotechnology Research Abstracts, Compendex, Computer & Applied Sciences, EMBASE, INSPEC, zbMATH
  • Page Numbers: pp.59-69
  • Keywords: Few-shot learning, Vision and language integration
  • Middle East Technical University Affiliated: Yes


Over the last couple of years, few-shot learning (FSL) has attracted significant attention towards minimizing the dependency on labeled training examples. An inherent difficulty in FSL is handling ambiguities resulting from having too few training samples per class. To tackle this fundamental challenge in FSL, we aim to train meta-learner models that can leverage prior semantic knowledge about novel classes to guide the classifier synthesis process. In particular, we propose semantically-conditioned feature attention and sample attention mechanisms that estimate the importance of representation dimensions and training instances. We also study the problem of sample noise in FSL, towards utilizing meta-learners in more realistic and imperfect settings. Our experimental results demonstrate the effectiveness of the proposed semantic FSL model with and without sample noise.

(c) 2022 Elsevier B.V. All rights reserved.
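To make the core idea of semantically-conditioned feature attention concrete, the sketch below shows one minimal interpretation: a class's semantic embedding (e.g., a word vector for the class name) is mapped through a small learned layer to per-dimension gating weights, which reweight the visual features before prototype-based classification. All shapes, the sigmoid gating, and the cosine-similarity classifier are illustrative assumptions, not the paper's exact architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def feature_attention(semantic_vec, W, b):
    """Map a class's semantic embedding to per-dimension attention
    weights in (0, 1) via a linear layer + sigmoid.
    (Gating form and shapes are assumptions for illustration.)"""
    logits = W @ semantic_vec + b
    return 1.0 / (1.0 + np.exp(-logits))

# Toy dimensions (hypothetical): 5 classes, 16-dim visual features,
# 8-dim semantic embeddings, 3 support samples per class.
n_cls, d_vis, d_sem, n_shot = 5, 16, 8, 3
W = rng.normal(size=(d_vis, d_sem)) * 0.1  # stands in for learned params
b = np.zeros(d_vis)

support = rng.normal(size=(n_cls, n_shot, d_vis))  # few-shot support set
semantics = rng.normal(size=(n_cls, d_sem))        # prior class semantics

# Build attended prototypes: reweight feature dimensions per class,
# then average the (attended) support samples.
prototypes = np.empty((n_cls, d_vis))
for c in range(n_cls):
    a = feature_attention(semantics[c], W, b)      # (d_vis,)
    prototypes[c] = (a * support[c]).mean(axis=0)  # attended class mean

# Classify a query by cosine similarity to the attended prototypes.
query = support[2, 0] + 0.01 * rng.normal(size=d_vis)
sims = prototypes @ query / (
    np.linalg.norm(prototypes, axis=1) * np.linalg.norm(query) + 1e-8)
pred = int(np.argmax(sims))
```

The paper's sample attention mechanism would follow the same pattern, but produce one weight per support instance (instead of per feature dimension) so that noisy samples contribute less to the prototype.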