Iteratively reweighted least square for asymmetric L2-loss support vector regression

Abstract

In the support vector regression (SVR) model, using the squared ϵ-insensitive loss function makes the objective function of the optimization problem strictly convex and yields a more concise solution. However, the formulation leads to a quadratic programming problem, which is expensive to solve. This paper reformulates the optimization problem by absorbing the constraints into the objective function, and the new formulation shares similarity with the weighted least squares regression problem. Based on this formulation, we propose an iteratively reweighted least squares approach to train the L2-loss SVR, for both linear and nonlinear models. The proposed approach is easy to implement, without requiring any additional computing package other than basic linear algebra operations. Numerical studies on real-world datasets show that, compared to the alternatives, the proposed approach can achieve similar prediction accuracy with substantially higher time efficiency.
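To make the idea in the abstract concrete, the following is a minimal sketch (not the authors' exact algorithm) of an iteratively reweighted least squares loop for a linear L2-loss SVR. It assumes the standard objective (1/2)||w||² + C·Σ max(|rᵢ| − ϵ, 0)², and uses the fact that, on the points lying outside the ϵ-tube, the squared ϵ-insensitive loss equals an ordinary squared residual against ϵ-shifted targets, so each iteration reduces to a regularized least squares solve. The function name and default parameters are illustrative.

```python
import numpy as np

def irls_l2_svr(X, y, C=1.0, eps=0.1, max_iter=100, tol=1e-8):
    """Illustrative IRLS for linear L2-loss (squared eps-insensitive) SVR.

    Minimizes 0.5*||w||^2 + C * sum_i max(|r_i| - eps, 0)^2,
    with residuals r_i = y_i - (w @ x_i + b).
    """
    n, d = X.shape
    Xb = np.hstack([X, np.ones((n, 1))])   # absorb the bias b into the weights
    w = np.zeros(d + 1)
    R = np.eye(d + 1)
    R[-1, -1] = 0.0                        # do not regularize the bias term
    for _ in range(max_iter):
        r = y - Xb @ w
        active = np.abs(r) > eps           # points outside the eps-tube
        if not np.any(active):
            break                          # all residuals inside the tube
        # On the active set, max(|r|-eps, 0)^2 == (r - eps*sign(r))^2,
        # so this step is a ridge-regularized least squares solve.
        t = y - eps * np.sign(r)           # eps-shifted targets
        Xa, ta = Xb[active], t[active]
        w_new = np.linalg.solve(R + 2 * C * Xa.T @ Xa, 2 * C * Xa.T @ ta)
        if np.linalg.norm(w_new - w) < tol * (1 + np.linalg.norm(w)):
            w = w_new
            break
        w = w_new
    return w[:-1], w[-1]
```

Only `numpy.linalg.solve` is needed, which matches the abstract's point that the method requires nothing beyond basic linear algebra operations rather than a quadratic programming solver.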

Department(s)

Mathematics

Document Type

Article

DOI

https://doi.org/10.1080/03610918.2019.1599016

Keywords

Quadratic programming, Squared ϵ-insensitive loss function, Support vector regression, Weighted least squares

Publication Date

1-1-2019

Journal Title

Communications in Statistics: Simulation and Computation
