Smoothly approximated support vector domain description

Abstract

Support vector domain description (SVDD) is a well-known tool for pattern analysis when only positive examples are reliable. The SVDD model is usually fitted by solving a quadratic programming problem, which is time-consuming. This paper attempts to fit SVDD in the primal form directly. However, the primal objective function of SVDD is not differentiable, which prevents well-behaved gradient-based optimization methods from being applied. We therefore propose to approximate the primal objective function of SVDD by a differentiable function, and a conjugate gradient method is applied to minimize the smoothly approximated objective. Extensive experiments on pattern classification were conducted; compared to the quadratic programming based SVDD, the proposed approach is much more computationally efficient and yields similar classification performance on these problems.
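The idea in the abstract can be illustrated with a minimal sketch: the primal SVDD objective contains a hinge-type term max(0, ||x_i - a||^2 - R^2), which is non-differentiable; replacing it with a softplus-style smooth surrogate makes the objective amenable to conjugate gradient. The smoothing function, parameter names, and `scipy` usage below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np
from scipy.optimize import minimize

def svdd_smooth_objective(params, X, C, p=10.0):
    """Smoothed primal SVDD objective (illustrative, not the paper's exact form).

    params = [center a (d entries), squared radius r2]
    The hinge max(0, t) is approximated by softplus: log(1 + exp(p*t)) / p,
    which tends to the hinge as the smoothing parameter p grows.
    """
    a, r2 = params[:-1], params[-1]
    d2 = np.sum((X - a) ** 2, axis=1)      # squared distances to the center
    t = d2 - r2                            # positive for points outside the ball
    softplus = np.logaddexp(0.0, p * t) / p  # numerically stable smoothing
    return r2 + C * np.sum(softplus)

# Toy positive-class data and a conjugate gradient fit of the smoothed primal.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
x0 = np.append(X.mean(axis=0), 1.0)        # init: data centroid, unit radius^2
res = minimize(svdd_smooth_objective, x0, args=(X, 1.0), method='CG')
center, r2 = res.x[:-1], res.x[-1]
```

Because the smoothed objective is differentiable everywhere, standard nonlinear conjugate gradient applies directly, avoiding the quadratic program over Lagrange multipliers that the dual formulation requires.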

Department(s)

Mathematics

Document Type

Article

DOI

https://doi.org/10.1016/j.patcog.2015.07.003

Keywords

Support vector domain description, Smooth approximation, Quadratic programming, Conjugate gradient

Publication Date

2016

Journal Title

Pattern Recognition
