Please use this link to cite or to link to this publication: https://hdl.handle.net/10419/296448
Year of publication:
2023
Citation:
[Journal:] Theoretical Economics [ISSN:] 1555-7561 [Volume:] 18 [Issue:] 4 [Year:] 2023 [Pages:] 1585-1622
Publisher:
The Econometric Society, New Haven, CT
Abstract:
We show that Bayesian posteriors concentrate on the outcome distributions that approximately minimize the Kullback-Leibler divergence from the empirical distribution, uniformly over sample paths, even when the prior does not have full support. This generalizes Diaconis and Freedman's (1990) uniform convergence result to, e.g., priors that have finite support, are constrained by independence assumptions, or have a parametric form that cannot match some probability distributions. The concentration result lets us provide a rate of convergence for Berk's (1996) result on the limiting behavior of posterior beliefs when the prior is misspecified. We provide a bound on approximation errors in "anticipated-utility" models, and extend our analysis to outcomes that are perceived to follow a Markov process.
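As a rough formal sketch of the concentration claim (the notation here is ours, not necessarily the paper's): write $\hat\nu_t$ for the empirical distribution after $t$ observations, $Q_\theta$ for the outcome distribution under parameter $\theta \in \Theta$, and $\pi_t$ for the Bayesian posterior. The abstract's statement is that posterior mass concentrates on the set of approximate Kullback-Leibler minimizers
\[
\Theta_t^{\varepsilon} \;=\; \Big\{\theta \in \Theta : D_{\mathrm{KL}}\big(\hat\nu_t \,\|\, Q_\theta\big) \;\le\; \inf_{\theta' \in \Theta} D_{\mathrm{KL}}\big(\hat\nu_t \,\|\, Q_{\theta'}\big) + \varepsilon \Big\},
\]
in the sense that $\pi_t\big(\Theta \setminus \Theta_t^{\varepsilon}\big) \to 0$ uniformly over sample paths, without requiring the prior to have full support.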
Keywords:
Bayesian consistency
Misspecified learning
JEL: 
D83
D90
Persistent identifier of the first edition:
Creative Commons license:
cc-by-nc
Document type:
Article

File(s):
405.41 kB
Publications in EconStor are protected by copyright.