Please use this identifier to cite or link to this item: http://dx.doi.org/10.18419/opus-9387
Author(s): Farooq, Muhammad
Titel: Kernel-based expectile regression
Issue date: 2017
Document type: Dissertation
Pages: viii, 138
URI: http://nbn-resolving.de/urn:nbn:de:bsz:93-opus-ds-94043
http://elib.uni-stuttgart.de/handle/11682/9404
http://dx.doi.org/10.18419/opus-9387
Abstract: Conditional expectiles are becoming an increasingly important tool in finance as well as in other areas of application, such as demography, when the goal is to explore the conditional distribution beyond the conditional mean. In this thesis, we consider a support vector machine (SVM) type approach with the asymmetric least squares loss for estimating conditional expectiles. Firstly, we establish learning rates for this approach that are minimax optimal modulo a logarithmic factor if Gaussian RBF kernels are used and the desired expectile is smooth in a Besov sense. It turns out that our learning rates, as a special case, improve the best known rates for kernel-based least squares regression in the aforementioned scenario. As key ingredients of our statistical analysis, we establish a general calibration inequality for the asymmetric least squares loss, a corresponding variance bound, and an improved entropy number bound for Gaussian RBF kernels. Furthermore, we establish optimal learning rates in the case of a generic kernel under the assumption that the target function lies in a real interpolation space. Secondly, we complement the theoretical results of our SVM approach with empirical findings. For this purpose, we use a sequential minimal optimization (SMO) method and design an SVM-like solver for expectile regression with Gaussian RBF kernels. We conduct various experiments to investigate the behavior of the designed solver with respect to different combinations of initialization strategies, working set selection strategies, stopping criteria, and numbers of nearest neighbors, and then identify the best combination of them. We further compare the results of our solver to the recent R package ER-Boost and find that our solver exhibits better test performance. In terms of training time, our solver is more sensitive to the training set size and less sensitive to the dimension of the data set, whereas ER-Boost behaves the other way around. In addition, our solver is faster than a similarly implemented solver for quantile regression. Finally, we show the convergence of the designed solver.
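For orientation, a brief sketch of the setting described in the abstract; the notation H, \lambda, and f_{D,\lambda} is used here only for illustration and is not taken verbatim from the thesis. For a fixed level \tau \in (0,1), the asymmetric least squares loss is

    L_\tau(y,t) = \bigl|\tau - \mathbf{1}_{\{y < t\}}\bigr|\,(y - t)^2 ,

whose population risk is minimized by the \tau-expectile of the conditional distribution. The SVM-type estimator is then, schematically, the regularized empirical risk minimizer over a reproducing kernel Hilbert space H,

    f_{D,\lambda} = \arg\min_{f \in H} \Bigl( \lambda \|f\|_H^2 + \tfrac{1}{n} \sum_{i=1}^{n} L_\tau\bigl(y_i, f(x_i)\bigr) \Bigr),

with H induced, e.g., by a Gaussian RBF kernel; the precise conditions on the kernel, the regularization parameter, and the data are those stated in the thesis.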
Appears in collections: 08 Fakultät Mathematik und Physik

Files in this item:
farooq2017KernelBasedExpectileRegression.pdf (2.12 MB, Adobe PDF)


All resources in this repository are protected by copyright.