Author

Shennan Ye

Date of Graduation

Summer 2012

Degree

Master of Science in Mathematics

Department

Mathematics

Committee Chair

Songfeng Zheng

Abstract

Because of its simplicity and elegant theoretical properties, Least Squares (LS) regression has long been a primary tool in data analysis. However, it is well known that LS regression is very sensitive to the presence of unusual points in the data used to fit a model. As an alternative to LS regression, this thesis studied the properties of the Least Absolute Deviations (LAD) regression model and algorithms for fitting it. Two optimization methods were investigated. The first used a weighted median computation in each step and iteratively estimated the coefficients that minimize the loss function. The second applied functional gradient descent in each iteration and built the model incrementally. Experiments were performed on simulated and real-world data, and the two methods were compared in terms of algorithmic efficiency and prediction accuracy. The comparison showed that although both techniques minimize the loss function well, the gradient descent method outperforms the weighted median method in both efficiency and prediction accuracy.
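To make the two fitting strategies mentioned in the abstract concrete, the sketch below gives minimal Python/NumPy illustrations of each idea. It is not the thesis's code: the function names, the coordinate-wise update scheme, the use of small regression trees as base learners, and the step size are all assumptions made for illustration. The first function solves each one-coefficient LAD subproblem exactly with a weighted median; the second performs functional gradient descent on the absolute loss, whose negative gradient is the sign of the residual.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor


def weighted_median(values, weights):
    """Smallest v whose cumulative weight reaches half the total weight;
    this v minimizes sum(weights * |values - v|)."""
    order = np.argsort(values)
    values, weights = values[order], weights[order]
    cum = np.cumsum(weights)
    idx = np.searchsorted(cum, 0.5 * weights.sum())
    return values[idx]


def lad_coordinate_descent(X, y, n_iter=100):
    """Illustrative LAD fit of y ~ X @ beta by cycling through coordinates.
    With the other coefficients fixed, the optimal beta_j is the weighted
    median of the partial residual ratios r_i / X_ij with weights |X_ij|."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            # partial residual with coordinate j removed
            r = y - X @ beta + X[:, j] * beta[j]
            mask = X[:, j] != 0  # skip observations that give no information on beta_j
            beta[j] = weighted_median(r[mask] / X[mask, j], np.abs(X[mask, j]))
    return beta


def lad_boosting(X, y, n_rounds=200, learning_rate=0.1, max_depth=2):
    """Illustrative functional gradient descent for the absolute loss:
    each round fits a small tree to sign(y - F), the negative gradient,
    and adds it to the current fit with a small step."""
    F = np.full(len(y), np.median(y))  # start from the unconditional median
    trees = []
    for _ in range(n_rounds):
        negative_gradient = np.sign(y - F)
        tree = DecisionTreeRegressor(max_depth=max_depth).fit(X, negative_gradient)
        F += learning_rate * tree.predict(X)
        trees.append(tree)
    return np.median(y), trees
```

A fuller LAD boosting implementation would typically also replace each leaf's prediction with the median residual of the training points falling in that leaf; the version above is only the plain gradient step.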

Keywords

regression, least absolute deviation, weighted median, gradient descent and boosting

Subject Categories

Mathematics

Copyright

© Shennan Ye
