Ornek C., VURAL E.

IEEE 28th International Workshop on Machine Learning for Signal Processing (MLSP), Aalborg, Denmark, 17 - 20 September 2018

  • Publication Type: Conference Paper / Full Text
  • City: Aalborg
  • Country: Denmark
  • Keywords: Supervised manifold learning, supervised dimensionality reduction, out-of-sample extensions, Lipschitz-regular interpolators, generalization bounds, nonlinear dimensionality reduction, eigenmaps
  • Middle East Technical University Affiliated: Yes


Many supervised dimensionality reduction methods have been proposed in recent years. Linear manifold learning methods often have limited flexibility in learning effective representations, whereas nonlinear methods mainly focus on embedding the training samples and do not consider how well the embedding generalizes to initially unseen test samples. In this paper, we build on recent theoretical results on the generalization performance of supervised manifold learners, which state that achieving good generalization requires a trade-off between the separation of different classes in the embedding and the possibility of constructing out-of-sample interpolators with good Lipschitz regularity. In light of these results, we propose a new supervised manifold learning algorithm that computes an embedding of the training samples along with a smooth interpolation function that generalizes the embedding to the whole space. Our method is based on a learning objective that explicitly takes the generalization performance on novel test samples into account. Experimental results show that the proposed method achieves high classification accuracy in comparison with state-of-the-art supervised manifold learning algorithms.
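The out-of-sample extension described in the abstract can be illustrated with a minimal sketch. This is not the paper's algorithm, only an assumed Gaussian RBF interpolator that maps new samples into a precomputed embedding; the function name `rbf_out_of_sample` and its parameters are hypothetical. The kernel width `sigma` controls the trade-off the abstract mentions: a smaller `sigma` fits the training embedding more tightly but yields an interpolator with a larger Lipschitz constant, hurting generalization.

```python
import numpy as np

def rbf_out_of_sample(X_train, Y_train, X_test, sigma=1.0, reg=1e-6):
    """Extend an embedding Y_train of the points X_train to X_test
    using a Gaussian RBF interpolator f(x) = sum_i c_i k(x, x_i).
    A smaller sigma fits the training embedding more tightly but
    increases the Lipschitz constant of the interpolator."""
    # Kernel matrix on the training samples
    d2 = ((X_train[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2 * sigma ** 2))
    # Solve (K + reg*I) C = Y_train for the interpolation coefficients;
    # reg is a small ridge term for numerical stability
    C = np.linalg.solve(K + reg * np.eye(len(X_train)), Y_train)
    # Evaluate the interpolator at the test samples
    d2_te = ((X_test[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2_te / (2 * sigma ** 2)) @ C
```

In this sketch the smooth interpolation function is fixed a priori, whereas the paper's objective jointly optimizes the embedding and its out-of-sample interpolator.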