OPTIMIZATION METHODS & SOFTWARE, vol. 25, no. 6, pp. 937-970, 2010 (SCI-Expanded)
As data become heterogeneous, multiple kernel learning methods may help to classify them. To overcome the restriction of multiple kernel learning to a finite choice of kernels, we propose a novel method of 'infinite' kernel combinations for learning problems with the help of infinite and semi-infinite optimization. Considering all infinitesimally fine convex combinations of the kernels from an infinite kernel set, the margin is maximized subject to an infinite number of constraints with a compact index set and an additional (Riemann-Stieltjes) integral constraint arising from the combinations. After a parametrization in the space of probability measures, we obtain a semi-infinite programming problem. We analyse regularity conditions (reduction ansatz) and discuss the types of density functions in the constraints and the bilevel optimization problem derived. The proposed approach is implemented with the conceptual reduction method and tested on homogeneous and heterogeneous data; it yields better accuracy than single-kernel learning on the heterogeneous data. We analyse the structure of the problems obtained and discuss structural frontiers, trade-offs and research challenges.
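As an illustrative sketch only (the notation below is assumed for exposition and is not fixed by the abstract: Omega denotes a compact kernel parameter set, k_omega the parametrized kernels, beta the combining probability measure, and alpha the dual variables of a support-vector classifier with box bound C), the infinite kernel combination and a semi-infinite reformulation of the margin maximization, in the spirit of SILP-type multiple kernel learning, may be written as follows:

% Sketch with assumed notation: combined kernel as a Riemann-Stieltjes
% integral over an infinite, compactly parametrized kernel family.
\[
  k_\beta(x, x') \;=\; \int_{\Omega} k_\omega(x, x')\, d\beta(\omega),
  \qquad \int_{\Omega} d\beta(\omega) \;=\; 1 .
\]
% A corresponding semi-infinite program: the compact index set of the
% infinitely many constraints is the feasible dual set A.
\[
  \min_{\theta,\; \beta \in \mathcal{P}(\Omega)} \; \theta
  \quad \text{s.t.} \quad
  \int_{\Omega} S(\omega, \alpha)\, d\beta(\omega) \;\le\; \theta
  \quad \text{for all } \alpha \in A,
\]
\[
  S(\omega, \alpha) \;=\; \sum_{i=1}^{n} \alpha_i
  \;-\; \tfrac{1}{2} \sum_{i,j=1}^{n} \alpha_i \alpha_j\, y_i y_j\, k_\omega(x_i, x_j),
  \qquad
  A \;=\; \bigl\{ \alpha : 0 \le \alpha_i \le C,\; \textstyle\sum_i \alpha_i y_i = 0 \bigr\}.
\]

In this sketch the integral constraint over the probability measure beta is the Riemann-Stieltjes constraint mentioned in the abstract, and discretizing beta exchanges the roles of finitely and infinitely indexed constraints, which is where reduction-type (conceptual reduction) methods enter.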