A fast iterative algorithm for support vector data description
Support vector data description (SVDD) is a well-known model for pattern analysis when only positive examples are reliable. SVDD is usually trained by solving a quadratic programming problem, which is time-consuming. This paper formulates the Lagrangian of a slightly modified SVDD model as a differentiable convex function over the nonnegative orthant. The resulting minimization problem can be solved by a simple iterative algorithm. The proposed algorithm is easy to implement, without requiring any particular optimization toolbox. Theoretical and experimental analyses show that the algorithm converges r-linearly to the unique minimum point. Extensive experiments on pattern classification were conducted; compared to the quadratic programming based SVDD (QP-SVDD), the proposed approach is much more computationally efficient (hundreds of times faster) and yields similar performance in terms of the receiver operating characteristic curve. Furthermore, the proposed method and QP-SVDD extract almost the same set of support vectors.
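To make the idea concrete, the following is a minimal sketch of solving an SVDD-style dual by simple iteration. It is not the paper's algorithm: the paper derives a specific fixed-point iteration for its modified Lagrangian, whereas this sketch uses generic projected gradient ascent on the standard SVDD dual, L(α) = Σᵢ αᵢK(xᵢ,xᵢ) − αᵀKα, with the box constraint 0 ≤ αᵢ ≤ C enforced by clipping (the paper's penalty-based modification leaves only the nonnegative orthant). The function name, RBF kernel choice, and step size are illustrative assumptions.

```python
import numpy as np

def svdd_dual_sketch(X, C=1.0, lr=0.01, n_iter=500):
    """Projected gradient ascent on the SVDD dual (illustrative sketch,
    not the paper's fixed-point iteration)."""
    # RBF kernel matrix with unit bandwidth (an assumption for this sketch)
    sq = np.sum(X**2, axis=1)
    K = np.exp(-(sq[:, None] + sq[None, :] - 2.0 * X @ X.T))
    n = X.shape[0]
    alpha = np.full(n, 1.0 / n)  # feasible starting point
    for _ in range(n_iter):
        # Gradient of L(alpha) = sum_i alpha_i K_ii - alpha^T K alpha
        grad = np.diag(K) - 2.0 * K @ alpha
        # Ascent step, then project back onto the box [0, C]^n
        alpha = np.clip(alpha + lr * grad, 0.0, C)
    return alpha

# Examples with nonzero alpha act as the support vectors describing the data.
```

Each iteration costs one kernel-matrix multiply, so no QP solver or optimization toolbox is needed, which mirrors the abstract's claim of a simple, easily implemented update.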
Lagrangian dual function, Penalty function method, Quadratic programming, Support vector data description, Support vectors
Zheng, Songfeng. "A fast iterative algorithm for support vector data description." International Journal of Machine Learning and Cybernetics 10, no. 5 (2019): 1173-1187.