Şen, Mehmet Umut and Erdoğan, Hakan (2013) Linear classifier combination and selection using group sparse regularization and hinge loss. Pattern Recognition Letters, 34 (3). pp. 265-274. ISSN 0167-8655
This is the latest version of this item.
Official URL: http://dx.doi.org/10.1016/j.patrec.2012.10.008
Abstract
The main principle of stacked generalization is to use a second-level generalizer to combine the outputs of base classifiers in an ensemble. In this paper, after presenting a short survey of the literature on stacked generalization, we propose regularized empirical risk minimization (RERM) as a framework for learning the weights of the combiner; it generalizes earlier proposals and enables improved learning methods. Our main contribution is the use of group sparsity for regularization to facilitate classifier selection. In addition, we propose and analyze using the hinge loss instead of the conventional least squares loss. We performed experiments on three ensemble setups with differing diversities on 13 real-world datasets from various applications. Results show the power of group sparse regularization over conventional norm regularization: we are able to reduce the number of selected classifiers of the diverse ensemble without sacrificing accuracy, and with the non-diverse ensembles we even gain accuracy on average. In addition, we show that the hinge loss outperforms the least squares loss, which was used in previous studies of stacked generalization.
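To make the idea concrete, below is a minimal sketch (not the paper's implementation) of the combiner described in the abstract: a linear combiner over stacked base-classifier scores, trained by proximal subgradient descent on the hinge loss with a group-lasso penalty, where each group collects the weights attached to one base classifier so that shrinking a whole group to zero deselects that classifier. All names, the toy data, and the hyperparameter values are illustrative assumptions.

```python
import numpy as np

def group_soft_threshold(w, groups, t):
    """Prox operator of t * sum_g ||w_g||_2: block-wise shrinkage that
    zeroes out an entire group (i.e. deselects a classifier) at once."""
    out = w.copy()
    for g in groups:
        norm = np.linalg.norm(w[g])
        out[g] = 0.0 if norm <= t else (1.0 - t / norm) * w[g]
    return out

def fit_group_sparse_combiner(S, y, groups, lam=0.5, lr=0.05, n_iter=1000):
    """Learn combiner weights by proximal subgradient descent on
    mean hinge loss + lam * sum_g ||w_g||_2 (binary case for simplicity).
    S: (n, d) matrix of stacked base-classifier scores; y in {-1, +1}."""
    n, d = S.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        margins = y * (S @ w)
        active = margins < 1.0  # examples inside the hinge
        # Subgradient of the mean hinge loss w.r.t. w.
        grad = -(S[active] * y[active, None]).sum(axis=0) / n
        w = group_soft_threshold(w - lr * grad, groups, lr * lam)
    return w

# Toy ensemble of 3 classifiers, 2 score columns each (hypothetical data):
# classifier 0 is informative, classifiers 1 and 2 output pure noise.
rng = np.random.default_rng(0)
n = 200
y = rng.choice([-1.0, 1.0], size=n)
S = rng.normal(size=(n, 6))
S[:, 0] = y + 0.1 * rng.normal(size=n)  # classifier 0, first score
S[:, 1] = y + 0.1 * rng.normal(size=n)  # classifier 0, second score
groups = [np.arange(0, 2), np.arange(2, 4), np.arange(4, 6)]

w = fit_group_sparse_combiner(S, y, groups)
accuracy = np.mean(np.sign(S @ w) == y)
```

On this toy data the penalty drives the weight groups of the two noise classifiers to zero while keeping the informative classifier's group, which is the classifier-selection effect the abstract describes.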
Item Type: Article
Uncontrolled Keywords: Classifier combination; Group sparsity; Classifier selection; Regularized empirical risk minimization; Hinge loss
Divisions: Faculty of Engineering and Natural Sciences > Academic programs > Electronics; Faculty of Engineering and Natural Sciences
Depositing User: Hakan Erdoğan
Date Deposited: 21 Jan 2014 12:13
Last Modified: 01 Aug 2019 14:56
URI: https://research.sabanciuniv.edu/id/eprint/22983
Available Versions of this Item
- Linear classifier combination and selection using group sparse regularization and hinge loss. (deposited 06 Dec 2012 14:55)
- Linear classifier combination and selection using group sparse regularization and hinge loss. (deposited 21 Jan 2014 12:13) [Currently Displayed]