Iteratively reweighted least square for asymmetric L2-loss support vector regression
In the support vector regression (SVR) model, using the squared ϵ-insensitive loss function makes the objective function of the optimization problem strictly convex and yields a more concise solution. However, the formulation leads to a quadratic programming problem, which is expensive to solve. This paper reformulates the optimization problem by absorbing the constraints into the objective function, and the new formulation resembles a weighted least squares regression problem. Based on this formulation, we propose an iteratively reweighted least squares approach to train the L2-loss SVR, for both linear and nonlinear models. The proposed approach is easy to implement, requiring no computing package beyond basic linear algebra operations. Numerical studies on real-world datasets show that, compared to the alternatives, the proposed approach achieves similar prediction accuracy with substantially higher time efficiency.
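The abstract's idea can be illustrated with a small sketch. Assuming the squared ϵ-insensitive loss, the identity (|r| − ϵ)²₊ = (r − ϵ·sign(r))² for points outside the ϵ-tube lets each iteration reduce to a regularized least squares solve over the currently active points. The function below is a hypothetical illustration of this kind of iteration for the symmetric linear case (the paper's asymmetric variant would additionally weight positive and negative residuals differently); all names and defaults are the author's own choices, not the paper's.

```python
import numpy as np

def irls_l2_svr(X, y, C=1.0, eps=0.1, max_iter=100, tol=1e-6):
    """Sketch of an IRLS-style solver for linear squared
    eps-insensitive SVR: min (1/2)||w||^2 + C * sum (|r_i| - eps)_+^2."""
    n, d = X.shape
    # Augment with a bias column; beta = [w; b]
    A = np.hstack([X, np.ones((n, 1))])
    beta = np.zeros(d + 1)
    # Ridge term acts on w only, not on the bias
    R = np.eye(d + 1)
    R[-1, -1] = 0.0
    for _ in range(max_iter):
        r = y - A @ beta
        active = np.abs(r) > eps          # points outside the eps-tube
        if not active.any():              # every residual inside the tube
            break
        # Shift targets by eps toward the residual so the squared
        # eps-insensitive loss becomes an ordinary squared loss
        y_adj = y[active] - eps * np.sign(r[active])
        Aa = A[active]
        # Normal equations of the regularized least squares subproblem
        beta_new = np.linalg.solve(R + 2 * C * Aa.T @ Aa,
                                   2 * C * Aa.T @ y_adj)
        if np.linalg.norm(beta_new - beta) < tol * (1 + np.linalg.norm(beta)):
            beta = beta_new
            break
        beta = beta_new
    return beta[:-1], beta[-1]
```

As a sanity check, fitting noisy data generated from y = 2x + 1 recovers coefficients close to (2, 1); each iteration costs only one linear solve, which is the source of the time efficiency the abstract reports.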
Quadratic programming, Squared ϵ-insensitive loss function, Support vector regression, Weighted least squares
Zheng, Songfeng. "Iteratively reweighted least square for asymmetric L2-loss support vector regression." Communications in Statistics - Simulation and Computation (2019): 1-17.