KLERC: kernel Lagrangian expectile regression calculator

Abstract

As a generalization of ordinary least squares regression, expectile regression predicts conditional expectiles and is fitted by minimizing an asymmetric squared loss function on the training data. In the literature, the support vector machine framework was introduced into expectile regression to increase the model's flexibility, resulting in support vector expectile regression (SVER). This paper reformulates the Lagrangian function of SVER as a differentiable convex function over the nonnegative orthant, which can be minimized by a simple iterative algorithm. The proposed algorithm is easy to implement, requiring no particular optimization toolbox beyond basic matrix operations. Theoretical and experimental analyses show that the algorithm converges r-linearly to the unique minimum point. We compared the proposed method with alternative algorithms on simulated and real-world data, and observed that it is much more computationally efficient while yielding similar prediction accuracy.
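To illustrate the asymmetric squared loss underlying expectile regression (not the paper's kernelized SVER algorithm), the following is a minimal sketch of linear expectile regression fitted by iteratively reweighted least squares; the function names and the IRLS scheme are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def expectile_loss(residuals, tau):
    # Asymmetric squared loss: positive residuals weighted by tau,
    # negative residuals by 1 - tau. With tau = 0.5 this is the usual
    # (scaled) squared loss of ordinary least squares.
    w = np.where(residuals >= 0, tau, 1.0 - tau)
    return np.mean(w * residuals ** 2)

def fit_linear_expectile(X, y, tau=0.5, n_iter=100):
    # Illustrative IRLS fit of a linear model for the tau-expectile:
    # at each step, residual-dependent weights are fixed and a
    # weighted least-squares problem is solved in closed form.
    Xb = np.column_stack([np.ones(len(X)), X])  # add intercept column
    beta = np.zeros(Xb.shape[1])
    for _ in range(n_iter):
        r = y - Xb @ beta
        w = np.where(r >= 0, tau, 1.0 - tau)
        # Weighted normal equations: (X' W X) beta = X' W y
        beta = np.linalg.solve(Xb.T @ (w[:, None] * Xb), Xb.T @ (w * y))
    return beta
```

With `tau = 0.5` the fit reduces to ordinary least squares (the conditional mean); larger `tau` shifts the fitted line toward the upper part of the conditional distribution, which is what makes expectile regression a generalization of OLS.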

Department(s)

Mathematics

Document Type

Article

DOI

https://doi.org/10.1007/s00180-020-01003-0

Keywords

Kernel function, Lagrangian dual function, Linear complementarity problem, Quadratic programming

Publication Date

1-1-2020

Journal Title

Computational Statistics
