Please use this link to cite or link to this publication: https://hdl.handle.net/10419/239039
Authors:
Year of Publication:
2019
Citation:
[Journal:] Journal of Risk and Financial Management [ISSN:] 1911-8074 [Volume:] 12 [Issue:] 3 [Publisher:] MDPI [Place:] Basel [Year:] 2019 [Pages:] 1-16
Publisher:
MDPI, Basel
Abstract:
Model selection and model averaging are popular approaches for handling modeling uncertainty. The existing literature offers a unified framework for variable selection via penalized likelihood, and the choice of the tuning parameter is vital for consistent selection and optimal estimation. Few studies have explored the finite-sample performance of the class of ordinary least squares (OLS) post-selection estimators with the tuning parameter determined by different selection approaches. We aim to supplement the literature by studying this class of OLS post-selection estimators. Inspired by the shrinkage averaging estimator (SAE) and the Mallows model averaging (MMA) estimator, we further propose a shrinkage MMA (SMMA) estimator for averaging high-dimensional sparse models. Our Monte Carlo design features an expanding sparse parameter space and further considers the effects of the effective sample size and the degree of model sparsity on the finite-sample performance of the estimators. We find that the OLS post-smoothly clipped absolute deviation (SCAD) estimator with the tuning parameter selected by the Bayesian information criterion (BIC) outperforms most penalized estimators in finite samples, and that the SMMA performs better when averaging high-dimensional sparse models.
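The Mallows model averaging step referred to in the abstract chooses candidate-model weights by minimizing Mallows' criterion over the weight simplex. The following is a minimal illustrative sketch of that general idea only, not the authors' SMMA estimator; the data-generating process, the nested candidate models, and all variable names are assumptions made here for illustration.

```python
# Minimal sketch of Mallows model averaging (MMA) over nested OLS candidates.
# Illustrative only: the sparse design below and all names are assumptions,
# not the paper's simulation setup or its SMMA estimator.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Hypothetical sparse linear model: only the first few coefficients are nonzero.
n, p = 100, 12
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:4] = [1.0, 0.8, 0.5, 0.3]
y = X @ beta + rng.normal(size=n)

# Nested candidate models: model m uses the first m regressors.
fits, sizes = [], []
for m in range(1, p + 1):
    Xm = X[:, :m]
    coef, *_ = np.linalg.lstsq(Xm, y, rcond=None)
    fits.append(Xm @ coef)
    sizes.append(m)
fits = np.column_stack(fits)          # n x M matrix of candidate fitted values
sizes = np.array(sizes, dtype=float)  # k_m: number of parameters in model m

# Estimate sigma^2 from the largest candidate model.
resid_full = y - fits[:, -1]
sigma2 = resid_full @ resid_full / (n - p)

# Mallows criterion: C(w) = ||y - F w||^2 + 2 * sigma^2 * w' k,
# minimized over the weight simplex (w >= 0, sum(w) = 1).
def mallows(w):
    r = y - fits @ w
    return r @ r + 2.0 * sigma2 * (sizes @ w)

M = fits.shape[1]
w0 = np.full(M, 1.0 / M)
res = minimize(mallows, w0, method="SLSQP",
               bounds=[(0.0, 1.0)] * M,
               constraints={"type": "eq", "fun": lambda w: w.sum() - 1.0})
w_hat = res.x
y_hat_mma = fits @ w_hat  # model-averaged fitted values
print("MMA weights:", np.round(w_hat, 3))
```

Here the weights are obtained with a generic constrained optimizer; in practice the Mallows weights are often computed by quadratic programming, and the paper's SMMA estimator combines this averaging step with shrinkage, which the sketch does not attempt to reproduce.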
Keywords:
Mallows criterion
model averaging
model selection
shrinkage
tuning parameter choice
Persistent Identifier of the first edition:
Creative Commons License:
cc-by
Document Type:
Article

Publications in EconStor are protected by copyright.