Efficient Multiclass Boosting Classification with Active Learning


Huang J., Ertekin Ş., Song Y., Zha H., Giles C. L.

7th SIAM International Conference on Data Mining, Minnesota, United States Of America, 26 - 28 April 2007, pp.297-308

  • Publication Type: Conference Paper / Full Text
  • City: Minnesota
  • Country: United States Of America
  • Page Numbers: pp.297-308
  • Middle East Technical University Affiliated: No

Abstract

We propose a novel multiclass classification algorithm, Gentle Adaptive Multiclass Boosting Learning (GAMBLE). The algorithm naturally extends the two-class Gentle AdaBoost algorithm to multiclass classification by using the multiclass exponential loss and the multiclass response encoding scheme. Unlike other multiclass algorithms, which reduce the K-class classification task to K binary classifications, GAMBLE handles the task directly and symmetrically, with only one committee classifier. We formally derive the GAMBLE algorithm with the quasi-Newton method, and prove the structural equivalence of the two regression trees in each boosting step.
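For reference, a standard form of the symmetric multiclass response encoding and multiclass exponential loss used in this line of work (the paper's exact notation may differ) is: for a K-class problem, an example with true class $c$ is encoded as the vector $y \in \mathbb{R}^K$ with

$$
y_k =
\begin{cases}
1, & k = c \\[4pt]
-\dfrac{1}{K-1}, & k \neq c,
\end{cases}
$$

and the loss of a committee output $f(x) \in \mathbb{R}^K$, constrained so that $\sum_{k=1}^{K} f_k(x) = 0$, is

$$
L\bigl(y, f(x)\bigr) = \exp\!\left(-\frac{1}{K}\, y^{\top} f(x)\right),
$$

which reduces to the familiar two-class exponential loss when $K = 2$.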