Nonlinear supervised dimensionality reduction via smooth regular embeddings

Ornek C., Vural E.

PATTERN RECOGNITION, vol.87, pp.55-66, 2019 (SCI-Expanded)

  • Publication Type: Article
  • Volume: 87
  • Publication Date: 2019
  • Doi Number: 10.1016/j.patcog.2018.10.006
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus
  • Page Numbers: pp.55-66
  • Keywords: Manifold learning, Dimensionality reduction, Supervised learning, Out-of-sample, Nonlinear embeddings, Eigenmaps
  • Middle East Technical University Affiliated: Yes


The recovery of the intrinsic geometric structure of data collections is an important problem in data analysis. Supervised extensions of several manifold learning approaches have been proposed in recent years. However, existing methods focus primarily on embedding the training data, and the generalization of the embedding to initially unseen test data is largely ignored. In this work, we build on recent theoretical results on the generalization performance of supervised manifold learning algorithms. Motivated by these performance bounds, we propose a supervised manifold learning method that computes a nonlinear embedding while constructing a smooth and regular interpolation function that extends the embedding to the whole data space in order to achieve satisfactory generalization. The embedding and the interpolator are learnt jointly, so that the Lipschitz regularity of the interpolator is imposed while the separation between different classes is ensured. Experimental results on several image data sets show that, in most settings, the proposed method outperforms traditional classifiers and the supervised dimensionality reduction algorithms used for comparison in terms of classification accuracy. (C) 2018 Elsevier Ltd. All rights reserved.
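The two-stage idea described in the abstract — compute a supervised nonlinear embedding of the training data, then extend it to unseen points with a smooth interpolator whose regularity controls generalization — can be sketched in a few lines. The code below is a minimal toy illustration, not the paper's actual algorithm: it uses a supervised Laplacian-eigenmaps-style embedding (same-class attraction, different-class repulsion via a trade-off weight `mu`) and a ridge-regularized Gaussian RBF interpolator, where the ridge parameter `lam` plays the role of the smoothness (Lipschitz-regularity) constraint on the out-of-sample extension. All function names and parameters are illustrative assumptions.

```python
import numpy as np

def supervised_embedding(X, y, d=2, gamma=1.0, mu=0.5):
    """Toy supervised Laplacian-eigenmaps-style embedding (illustrative only).

    Builds a within-class affinity graph W and a between-class graph B, then
    minimizes tr(Y^T L_w Y) - mu * tr(Y^T L_b Y) subject to orthonormality:
    same-class samples are pulled together, different classes pushed apart.
    """
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # pairwise sq. dists
    K = np.exp(-gamma * D2)                               # Gaussian affinities
    same = (y[:, None] == y[None, :]).astype(float)
    W, B = K * same, K * (1.0 - same)
    L_w = np.diag(W.sum(1)) - W                           # within-class Laplacian
    L_b = np.diag(B.sum(1)) - B                           # between-class Laplacian
    vals, vecs = np.linalg.eigh(L_w - mu * L_b)
    return vecs[:, :d]            # eigenvectors of the d smallest eigenvalues

def rbf_interpolator(X, Y, sigma=1.0, lam=1e-3):
    """Fit a Gaussian RBF map f(x) = sum_i c_i k(x, x_i) carrying X to Y.

    The ridge term lam smooths the coefficients, echoing the paper's idea of a
    regular (Lipschitz-controlled) interpolator for out-of-sample points.
    """
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-D2 / (2 * sigma ** 2))
    C = np.linalg.solve(K + lam * np.eye(len(X)), Y)      # ridge-regularized fit
    def f(Xnew):
        D2n = ((Xnew[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        return np.exp(-D2n / (2 * sigma ** 2)) @ C
    return f

# Two well-separated synthetic classes in 5 dimensions.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.3, (20, 5)),
               rng.normal(2.0, 0.3, (20, 5))])
y = np.array([0] * 20 + [1] * 20)

Y = supervised_embedding(X, y, d=2)       # embed the training data
f = rbf_interpolator(X, Y)                # smooth extension to the whole space

x_test = rng.normal(2.0, 0.3, (1, 5))     # a fresh point drawn near class 1
z = f(x_test)                             # out-of-sample embedding
pred = y[np.argmin(((Y - z) ** 2).sum(1))]  # nearest neighbour in embedded space
```

In this sketch, classification of a test point happens entirely in the low-dimensional space: the interpolator maps the point into the embedding, and a nearest-neighbour rule assigns the label, mirroring the out-of-sample evaluation setting the abstract describes.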