Non-parallel support vector classifiers with different loss functions

Siamak Mehrkanoon*, Xiaolin Huang, Johan A. K. Suykens

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review

34 Citations (Web of Science)

Abstract

This paper introduces a general framework of non-parallel support vector machines, which involves a regularization term, a scatter loss and a misclassification loss. When dealing with binary problems, the framework with proper losses covers some existing non-parallel classifiers, such as the multisurface proximal support vector machine via generalized eigenvalues, twin support vector machines, and their least squares version. The possibility of incorporating different existing scatter and misclassification loss functions into the general framework is discussed. Moreover, in contrast with the mentioned methods, which apply kernel-generated surfaces, we directly apply the kernel trick in the dual and thereby obtain nonparametric models. Therefore, one does not need to formulate two different primal problems for the linear and nonlinear kernels, respectively. In addition, experimental results are given to illustrate the performance of different loss functions. © 2014 Elsevier B.V. All rights reserved.
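To make the non-parallel idea concrete, the sketch below implements the linear least squares variant mentioned in the abstract (the least squares twin SVM), which is one instance of the paper's framework: each of the two hyperplanes is fit by a least squares scatter loss on its own class plus a least squares misclassification loss on the other class, with a small ridge term standing in for the regularization term. This is an illustrative reconstruction, not the paper's dual/kernel formulation; the function names and the toy data are ours.

```python
import numpy as np

def lstsvm_fit(A, B, c=1.0, reg=1e-6):
    """Linear least squares twin SVM sketch.

    Plane 1 (w1, b1) hugs class A while pushing class B toward signed
    distance -1; plane 2 (w2, b2) hugs class B while pushing class A
    toward +1.  Each plane is the solution of a small linear system:
        z1 = -c (G'G + c H'H + reg I)^{-1} H'e
        z2 =  c (H'H + c G'G + reg I)^{-1} G'e
    with G = [A, e], H = [B, e], z = [w; b].
    """
    G = np.hstack([A, np.ones((A.shape[0], 1))])  # augmented class-A matrix
    H = np.hstack([B, np.ones((B.shape[0], 1))])  # augmented class-B matrix
    I = np.eye(G.shape[1])                        # ridge term (regularization)
    z1 = -c * np.linalg.solve(G.T @ G + c * H.T @ H + reg * I,
                              H.T @ np.ones(H.shape[0]))
    z2 = c * np.linalg.solve(H.T @ H + c * G.T @ G + reg * I,
                             G.T @ np.ones(G.shape[0]))
    return z1, z2

def lstsvm_predict(X, z1, z2):
    """Assign each row of X to the class whose hyperplane is nearer."""
    Xe = np.hstack([X, np.ones((X.shape[0], 1))])
    d1 = np.abs(Xe @ z1) / np.linalg.norm(z1[:-1])  # distance to plane 1
    d2 = np.abs(Xe @ z2) / np.linalg.norm(z2[:-1])  # distance to plane 2
    return np.where(d1 <= d2, 1, -1)
```

Because both losses are squared, training reduces to two linear solves instead of quadratic programs; swapping the misclassification loss for a hinge or pinball loss, as the paper discusses, would replace these solves with (quadratic) optimization problems.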

Original language: English
Pages (from-to): 294-301
Number of pages: 8
Journal: Neurocomputing
Volume: 143
DOIs
Publication status: Published - 2 Nov 2014
Externally published: Yes

Keywords

  • Non-parallel classifiers
  • Least squares loss
  • Pinball loss
  • Hinge loss
  • Kernel trick