Gradient Descent Algorithms for Quantile Regression with Smooth Approximation

Abstract

Gradient-based optimization methods often converge quickly to a local optimum. However, the check loss function used by the quantile regression model is not everywhere differentiable, which prevents gradient-based optimization methods from being applied directly. This paper therefore introduces a smooth function that approximates the check loss, so that gradient-based optimization methods can be employed to fit the quantile regression model. The properties of the smooth approximation are discussed, and two algorithms are proposed for minimizing the smoothed objective function. The first method applies gradient descent directly, yielding the gradient descent smooth quantile regression model; the second minimizes the smoothed objective function within the framework of functional gradient descent, updating the fitted model along the negative gradient direction at each iteration, which yields the boosted smooth quantile regression algorithm. Extensive experiments on simulated and real-world data show that, compared with alternative quantile regression models, the proposed smooth quantile regression algorithms achieve higher prediction accuracy and are more efficient at removing noninformative predictors.
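The abstract does not reproduce the approximation itself. A common smooth surrogate for the check loss ρ_τ(u) = u(τ − I(u < 0)) is the logistic form τu + α·log(1 + e^(−u/α)), which recovers the check loss as α → 0. The sketch below illustrates the gradient descent variant for a linear model under that surrogate; the function names, step size, and the choice of this particular surrogate are assumptions for illustration, not necessarily the paper's exact formulation.

```python
import numpy as np

def smooth_check_loss(u, tau, alpha=0.2):
    # Logistic surrogate: tau*u + alpha*log(1 + exp(-u/alpha)).
    # As alpha -> 0 this tends to the check loss rho_tau(u).
    z = np.clip(-u / alpha, -30.0, 30.0)  # clip to avoid overflow in exp
    return tau * u + alpha * np.log1p(np.exp(z))

def fit_smooth_qr(X, y, tau, alpha=0.2, lr=0.05, n_iter=2000):
    """Plain gradient descent on the smoothed quantile loss, linear model."""
    n, p = X.shape
    Xb = np.hstack([np.ones((n, 1)), X])      # prepend intercept column
    beta = np.zeros(p + 1)
    for _ in range(n_iter):
        u = y - Xb @ beta                     # residuals
        z = np.clip(u / alpha, -30.0, 30.0)
        # d(loss)/du = tau - sigmoid(-u/alpha); the chain rule through
        # u = y - X beta contributes the leading minus sign below.
        dldu = tau - 1.0 / (1.0 + np.exp(z))
        grad = -(Xb.T @ dldu) / n
        beta -= lr * grad
    return beta
```

For τ = 0.5 and symmetric noise, the fitted coefficients should approach the conditional-median regression line; smaller α tracks the check loss more closely at the cost of a less smooth objective.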

Department(s)

Mathematics

Document Type

Article

DOI

https://doi.org/10.1007/s13042-011-0031-2

Keywords

quantile regression, gradient descent, boosting, variable selection

Publication Date

2011

Journal Title

International Journal of Machine Learning and Cybernetics
