Please use this identifier to cite or link to this item: https://hdl.handle.net/10419/267977 
Year of Publication: 
2022
Series/Report no.: 
Working Paper No. WP 2022-26
Publisher: 
Federal Reserve Bank of Chicago, Chicago, IL
Abstract: 
Economists typically make simplifying assumptions to keep the solution and estimation of their highly complex models feasible. These simplifications include approximating the true nonlinear dynamics of the model, disregarding aggregate uncertainty, or assuming that all agents are identical. While relaxing these assumptions is well known to give rise to complicated curse-of-dimensionality problems, it is often unclear how seriously these simplifications distort the dynamics and predictions of the model. We leverage recent advances in machine learning to develop a neural-network-based solution and estimation method that does not require these strong assumptions. We apply the method to a nonlinear Heterogeneous Agents New Keynesian (HANK) model with a zero lower bound (ZLB) constraint on the nominal interest rate and show that it is much more efficient than existing global solution methods and that the estimation converges to the true parameter values. The application also demonstrates how effectively the method simultaneously handles a large number of state variables and parameters, nonlinear dynamics, heterogeneity, and aggregate uncertainty.
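To give a rough sense of the kind of neural-network-based global solution approach the abstract refers to (this is not the authors' algorithm; the model, shock process, parameters, and function names below are illustrative assumptions), the sketch trains a small network to approximate a policy function by minimizing Euler-equation residuals on simulated states instead of building a grid over the state space:

```python
# Minimal sketch, assuming a simple consumption-savings problem; all values illustrative.
import torch
import torch.nn as nn

torch.manual_seed(0)
beta, gamma, r = 0.96, 2.0, 0.03   # discount factor, risk aversion, interest rate

# Small MLP mapping the state (assets, income) to a savings share in (0, 1)
policy = nn.Sequential(
    nn.Linear(2, 32), nn.Tanh(),
    nn.Linear(32, 32), nn.Tanh(),
    nn.Linear(32, 1), nn.Sigmoid(),
)
opt = torch.optim.Adam(policy.parameters(), lr=1e-3)

def euler_residual(state):
    """Mean squared residual of the consumption Euler equation under the current policy."""
    assets, income = state[:, :1], state[:, 1:]
    cash = (1 + r) * assets + income
    savings = policy(state) * cash
    c_today = cash - savings
    # One-step-ahead income draws stand in for the expectation (illustrative shock process)
    income_next = 0.9 * income + 0.1 * torch.randn_like(income).exp()
    state_next = torch.cat([savings, income_next], dim=1)
    cash_next = (1 + r) * savings + income_next
    c_next = cash_next - policy(state_next) * cash_next
    resid = beta * (1 + r) * c_next.pow(-gamma) - c_today.pow(-gamma)
    return resid.pow(2).mean()

for step in range(2000):
    # Sampling states rather than discretizing them is what sidesteps the
    # curse of dimensionality when the state vector is large.
    state = torch.cat([torch.rand(256, 1) * 10, torch.rand(256, 1) * 2 + 0.5], dim=1)
    loss = euler_residual(state)
    opt.zero_grad()
    loss.backward()
    opt.step()
```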
Subjects: 
Machine learning
neural networks
Bayesian estimation
global solution
heterogeneous agents
nonlinearities
aggregate uncertainty
HANK model
zero lower bound
JEL: 
C11
C45
D31
E32
E52
Document Type: 
Working Paper
