[full paper]
Csaba Szepesvári, Kornél Kovács, András Kocsor
In this paper we consider two novel kernel machine based feature extraction algorithms in a regression setting. The first method is derived from the principles underlying the recently introduced Maximum Margin Discrimination Analysis (MMDA) algorithm. Here it is shown that the orthogonalization principle employed by the original MMDA algorithm can be motivated by the well-known ambiguity decomposition, thus providing firm grounds for the good performance of the algorithm. The second algorithm combines kernel machines with average derivative estimation and is derived from the assumption that the true regressor function depends only on a subspace of the original input space. The proposed algorithms are evaluated in preliminary experiments on artificial and real datasets.
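For intuition, the ambiguity decomposition invoked above (in the form due to Krogh and Vedelsby) states that for a convex combination of regressors the ensemble's squared error splits exactly into the weighted average member error minus the weighted average disagreement with the ensemble; a brief statement, with notation chosen here for illustration:

$$\bigl(\bar f(x) - y\bigr)^2 \;=\; \sum_i w_i \bigl(f_i(x) - y\bigr)^2 \;-\; \sum_i w_i \bigl(f_i(x) - \bar f(x)\bigr)^2, \qquad \bar f = \sum_i w_i f_i,\quad \sum_i w_i = 1,\; w_i \ge 0.$$

Since the second (ambiguity) term is nonnegative, encouraging member predictors to disagree, e.g. by orthogonalizing the extracted features, can only lower the ensemble error for a fixed average member error.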
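To illustrate the second idea, the following is a minimal sketch (not the paper's algorithm) of average derivative estimation on synthetic data: when the true regressor depends only on a subspace, the averaged outer products of its gradients concentrate on that subspace, so its leading eigenvectors recover the relevant directions. All names (`g`, `b`, `b_hat`) are hypothetical, and gradients are taken by finite differences on a known test function rather than estimated from noisy data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed setting: the true regressor g depends on a single (unknown)
# direction b of the 3-D input space.
b = np.array([1.0, 2.0, 0.0])
b /= np.linalg.norm(b)

def g(X):
    return np.sin(X @ b)

X = rng.normal(size=(2000, 3))

# Approximate the gradient of g at each sample by central differences.
eps = 1e-4
grads = np.empty_like(X)
for j in range(X.shape[1]):
    e = np.zeros(X.shape[1])
    e[j] = eps
    grads[:, j] = (g(X + e) - g(X - e)) / (2 * eps)

# Average the outer products of the gradients; its leading eigenvector
# spans the effective subspace (here: recovers b up to sign).
M = grads.T @ grads / len(X)
w, V = np.linalg.eigh(M)
b_hat = V[:, -1]
print(abs(b_hat @ b))  # close to 1
```

In practice the gradient of the unknown regressor must itself be estimated (e.g. from a fitted kernel machine), which is where the combination with kernel methods proposed in the paper comes in.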
Keywords: feature extraction, ensemble learning, kernel methods, Support Vector Machine
Citation: Csaba Szepesvári, Kornél Kovács, András Kocsor: Kernel Machine Based Feature Extraction Algorithms for Regression Problems. In R. López de Mántaras and L. Saitta (eds.): ECAI 2004, Proceedings of the 16th European Conference on Artificial Intelligence, IOS Press, Amsterdam, 2004, pp. 1093-1094.