Regularized Slope Function Networks for Microarray Data Analysis
Merler, Stefano; Jurman, Giuseppe
2004-01-01
Abstract
We propose a novel algorithm, Regularized Slope Function Networks (RSFN), for classification and feature ranking in the family of Support Vector Machines. The main improvement is that the kernel is automatically determined by the training examples: it is built as a function of local classifiers, called slope functions, obtained by separating oppositely labeled pairs of training points. The algorithm, although possessing a meaningful geometrical interpretation, is derived in the framework of Tikhonov regularization theory. Its only free parameter is the regularization parameter, which represents a trade-off between empirical error and solution complexity. A theoretical bound on the generalization error is also derived, together with the Vapnik-Chervonenkis dimension. Performance is tested on a number of synthetic and real data sets, with emphasis on the microarray case.
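The abstract does not specify the construction of the slope-function kernel, but the Tikhonov regularization framework it invokes can be illustrated with a generic kernel machine. The sketch below is a minimal regularized least-squares classifier; the Gaussian kernel, the `lam` value, and all function names are illustrative assumptions, not the RSFN kernel itself, which would be built from the training data.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=1.0):
    # Placeholder Gaussian kernel; RSFN instead builds its kernel
    # from slope functions defined by oppositely labeled point pairs.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_tikhonov(K, y, lam):
    # Tikhonov-regularized least squares: solve (K + lam*n*I) c = y.
    # lam is the single free parameter trading empirical error
    # against solution complexity, as in the abstract.
    n = K.shape[0]
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def predict(K_test, c):
    # Class label is the sign of the kernel expansion.
    return np.sign(K_test @ c)

# Toy binary classification problem (synthetic, for illustration only)
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)

K = rbf_kernel(X, X)
c = fit_tikhonov(K, y, lam=1e-3)
train_acc = (predict(K, c) == y).mean()
```

The single regularization parameter `lam` plays the role described in the abstract: small values favor fitting the training data, large values favor smoother, lower-complexity solutions.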