KLERC: kernel Lagrangian expectile regression calculator
As a generalization of ordinary least squares regression, expectile regression predicts conditional expectiles and is fitted by minimizing an asymmetric squared loss function on the training data. In the literature, the idea of the support vector machine was introduced to expectile regression to increase the flexibility of the model, resulting in support vector expectile regression (SVER). This paper reformulates the Lagrangian function of SVER as a differentiable convex function over the nonnegative orthant, which can be minimized by a simple iterative algorithm. The proposed algorithm is easy to implement, requiring no particular optimization toolbox beyond basic matrix operations. Theoretical and experimental analyses show that the algorithm converges r-linearly to the unique minimum point. The proposed method was compared to alternative algorithms on simulated and real-world data, and we observed that it is much more computationally efficient while yielding similar prediction accuracy.
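To make the asymmetric squared loss concrete, the sketch below computes a sample expectile by a simple fixed-point iteration on the loss's first-order condition. This is only an illustration of the loss underlying expectile regression, under assumed names (`asymmetric_sq_loss`, `expectile`); it is not the paper's kernelized SVER algorithm.

```python
import numpy as np

def asymmetric_sq_loss(r, tau):
    """Asymmetric squared loss: weight tau on positive residuals,
    (1 - tau) on negative ones; tau = 0.5 recovers the squared loss
    (up to a factor of 1/2)."""
    w = np.where(r < 0, 1.0 - tau, tau)
    return w * r ** 2

def expectile(x, tau, tol=1e-10, max_iter=1000):
    """Minimize sum_i asymmetric_sq_loss(x_i - m, tau) over m by
    iterating the weighted-mean fixed point implied by setting the
    derivative to zero."""
    x = np.asarray(x, dtype=float)
    m = x.mean()  # tau = 0.5 solution as the starting point
    for _ in range(max_iter):
        w = np.where(x < m, 1.0 - tau, tau)
        m_new = np.sum(w * x) / np.sum(w)
        if abs(m_new - m) < tol:
            break
        m = m_new
    return m
```

For tau = 0.5 the minimizer is the sample mean, and larger tau shifts the expectile upward, which mirrors how expectile regression generalizes least squares.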
Kernel function, Lagrangian dual function, Linear complementarity problem, Quadratic programming
Zheng, Songfeng, "KLERC: kernel Lagrangian expectile regression calculator" (2020). Articles by College of Natural and Applied Sciences Faculty. 1647.