Please use this identifier to cite or link to this item: https://hdl.handle.net/10419/239039 
Year of Publication: 
2019
Citation: 
[Journal:] Journal of Risk and Financial Management [ISSN:] 1911-8074 [Volume:] 12 [Issue:] 3 [Publisher:] MDPI [Place:] Basel [Year:] 2019 [Pages:] 1-16
Publisher: 
MDPI, Basel
Abstract: 
Model selection and model averaging are popular approaches for handling modeling uncertainty. The existing literature offers a unified framework for variable selection via penalized likelihood, and tuning parameter selection is vital for consistent selection and optimal estimation. Few studies have explored the finite-sample performance of the class of ordinary least squares (OLS) post-selection estimators with the tuning parameter determined by different selection approaches. We aim to supplement the literature by studying this class of OLS post-selection estimators. Inspired by the shrinkage averaging estimator (SAE) and the Mallows model averaging (MMA) estimator, we further propose a shrinkage MMA (SMMA) estimator for averaging high-dimensional sparse models. Our Monte Carlo design features an expanding sparse parameter space and further considers the effects of the effective sample size and the degree of model sparsity on the finite-sample performance of the estimators. We find that the OLS post-smoothly clipped absolute deviation (SCAD) estimator with the tuning parameter selected by the Bayesian information criterion (BIC) outperforms most penalized estimators in finite samples, and that the SMMA performs better when averaging high-dimensional sparse models.
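
The Mallows model averaging (MMA) step referenced in the abstract chooses weights over candidate models by minimizing the Mallows criterion on the weight simplex. The sketch below is not the authors' implementation; it is a minimal illustration assuming simulated data, nested candidate models, and a general-purpose SLSQP solver, with all variable names and sizes chosen for illustration only.

    import numpy as np
    from scipy.optimize import minimize

    # Hypothetical simulated data: sparse linear model (sizes are illustrative).
    rng = np.random.default_rng(0)
    n, p = 100, 10
    X = rng.standard_normal((n, p))
    beta = np.concatenate([np.array([1.5, -1.0, 0.5]), np.zeros(p - 3)])
    y = X @ beta + rng.standard_normal(n)

    # Candidate models: nested, using the first k regressors (k = 1, ..., p).
    fits, ks = [], []
    for k in range(1, p + 1):
        Xk = X[:, :k]
        bk, *_ = np.linalg.lstsq(Xk, y, rcond=None)
        fits.append(Xk @ bk)
        ks.append(k)
    F = np.column_stack(fits)          # n x M matrix of candidate fitted values
    ks = np.array(ks, dtype=float)

    # Error variance estimated from the largest candidate model.
    sigma2 = np.sum((y - F[:, -1]) ** 2) / (n - p)

    # Mallows criterion: C(w) = ||y - F w||^2 + 2 * sigma2 * sum_m w_m * k_m,
    # minimized over the weight simplex (w_m >= 0, sum_m w_m = 1).
    def mallows(w):
        resid = y - F @ w
        return resid @ resid + 2.0 * sigma2 * (ks @ w)

    M = F.shape[1]
    cons = {"type": "eq", "fun": lambda w: np.sum(w) - 1.0}
    bounds = [(0.0, 1.0)] * M
    w0 = np.full(M, 1.0 / M)
    res = minimize(mallows, w0, bounds=bounds, constraints=cons)
    w_hat = res.x

    # Model-averaged fitted values follow as F @ w_hat.
    print("MMA weights:", np.round(w_hat, 3))

The shrinkage MMA (SMMA) estimator proposed in the paper builds on this criterion for high-dimensional sparse models; the simplex-constrained quadratic minimization shown here is only the basic MMA weight-selection step.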
Subjects: 
Mallows criterion
model averaging
model selection
shrinkage
tuning parameter choice
Persistent Identifier of the first edition: 
Creative Commons License: 
cc-by
Document Type: 
Article
