A fast iterative algorithm for support vector data description

Abstract

Support vector data description (SVDD) is a well-known model for pattern analysis when only positive examples are reliable. SVDD is usually trained by solving a quadratic programming problem, which is time-consuming. This paper formulates the Lagrangian of a slightly modified SVDD model as a differentiable convex function over the nonnegative orthant. The resulting minimization problem can be solved by a simple iterative algorithm. The proposed algorithm is easy to implement, without requiring any particular optimization toolbox. Theoretical and experimental analyses show that the algorithm converges r-linearly to the unique minimum point. Extensive experiments on pattern classification were conducted; compared to the quadratic programming based SVDD (QP-SVDD), the proposed approach is much more computationally efficient (hundreds of times faster) and yields similar performance in terms of the receiver operating characteristic curve. Furthermore, the proposed method and QP-SVDD extract almost the same set of support vectors.
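The abstract describes replacing the QP solver with a simple iterative scheme that minimizes a differentiable convex function over the nonnegative orthant. The paper's exact iteration is not given here, so the sketch below is only an illustrative stand-in: it minimizes the (negated) standard SVDD dual objective, with the equality constraint on the multipliers replaced by a quadratic penalty (in the spirit of the "Penalty function method" keyword), using projected gradient descent with box projection onto [0, C]^n. All names, the penalty weight `rho`, and the step-size rule are assumptions for illustration, not the authors' algorithm.

```python
import numpy as np

def rbf_kernel(X, gamma=0.5):
    """Gaussian (RBF) kernel matrix for the rows of X."""
    sq = np.sum(X ** 2, axis=1)
    return np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))

def svdd_projected_gradient(K, C=1.0, rho=10.0, iters=2000):
    """Illustrative SVDD solver (NOT the paper's algorithm).

    Minimizes  f(a) = a^T K a - a^T diag(K) + rho * (sum(a) - 1)^2
    over the box 0 <= a_i <= C by projected gradient descent.
    The quadratic penalty softly enforces the usual constraint
    sum(a) = 1 so that only the simple box projection remains.
    """
    n = K.shape[0]
    alpha = np.full(n, 1.0 / n)          # feasible starting point
    d = np.diag(K)
    # Safe step size: Hessian is 2K + 2*rho*(ones matrix), so a
    # Lipschitz bound on the gradient is 2*(lambda_max(K) + rho*n).
    lr = 1.0 / (2.0 * (np.linalg.eigvalsh(K)[-1] + rho * n))
    for _ in range(iters):
        grad = 2.0 * K @ alpha - d + 2.0 * rho * (alpha.sum() - 1.0)
        alpha = np.clip(alpha - lr * grad, 0.0, C)  # project onto [0, C]^n
    return alpha
```

Support vectors are then the points with nonzero multipliers (in practice, `alpha > tol`); the abstract's observation that both solvers recover almost the same support-vector set could be checked by comparing this index set against a QP solver's output.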

Department(s)

Mathematics

Document Type

Article

DOI

https://doi.org/10.1007/s13042-018-0796-7

Keywords

Lagrangian dual function, Penalty function method, Quadratic programming, Support vector data description, Support vectors

Publication Date

5-1-2019

Journal Title

International Journal of Machine Learning and Cybernetics
