Task guided representation learning using compositional models for zero-shot domain adaptation


Liu S., Ozay M.

Neural Networks, vol.165, pp.370-380, 2023 (SCI-Expanded)

  • Publication Type: Article
  • Volume: 165
  • Publication Date: 2023
  • Doi Number: 10.1016/j.neunet.2023.05.030
  • Journal Name: Neural Networks
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Academic Search Premier, PASCAL, Applied Science & Technology Source, BIOSIS, Biotechnology Research Abstracts, Communication Abstracts, Compendex, Computer & Applied Sciences, EMBASE, INSPEC, MEDLINE, Psycinfo, zbMATH
  • Page Numbers: pp.370-380
  • Keywords: Domain adaptation, Representation learning, Zero-shot
  • Middle East Technical University Affiliated: Yes

Abstract

Zero-shot domain adaptation (ZDA) methods aim to transfer knowledge about a task learned in a source domain to a target domain when task-relevant data from the target domain are not available. In this work, we address learning feature representations that are invariant across and shared among different domains, taking task characteristics into account for ZDA. To this end, we propose a method for task-guided ZDA (TG-ZDA) which employs multi-branch deep neural networks to learn feature representations by exploiting their domain invariance and shareability properties. The proposed TG-ZDA models can be trained end-to-end without requiring synthetic tasks and data generated from estimated representations of target domains. The proposed TG-ZDA has been examined on benchmark ZDA tasks using image classification datasets. Experimental results show that our proposed TG-ZDA outperforms state-of-the-art ZDA methods across different domains and tasks.
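To make the multi-branch idea from the abstract concrete, the following is a minimal sketch of a network with one shared branch (intended to capture domain-invariant features) and one branch per domain (domain-specific features), whose outputs are concatenated and fed to a task classifier. All layer sizes, the two-branch layout, and the function names here are illustrative assumptions for exposition, not the architecture actually used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

# Hypothetical dimensions, chosen only for illustration.
in_dim, feat_dim, n_domains, n_classes = 16, 8, 2, 4

# Shared branch: one weight matrix used for inputs from every domain,
# meant to encode features shared across domains.
W_shared = rng.standard_normal((in_dim, feat_dim)) * 0.1

# Domain branches: one weight matrix per domain, meant to encode
# features specific to that domain.
W_domain = [rng.standard_normal((in_dim, feat_dim)) * 0.1
            for _ in range(n_domains)]

# Task classifier operating on the concatenated shared + domain features.
W_cls = rng.standard_normal((2 * feat_dim, n_classes)) * 0.1

def forward(x, domain_idx):
    z_shared = relu(x @ W_shared)           # domain-invariant features
    z_dom = relu(x @ W_domain[domain_idx])  # domain-specific features
    z = np.concatenate([z_shared, z_dom], axis=1)
    return z @ W_cls                        # class logits

x = rng.standard_normal((5, in_dim))
logits = forward(x, domain_idx=0)
print(logits.shape)  # (5, 4)
```

At test time on an unseen target domain, only the shared branch's features are guaranteed to transfer, which is why the paper's emphasis is on learning representations that are both domain-invariant and shared across domains.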