Dynamic Stochastic General Equilibrium Models

15th Workshop on Methods and Applications for Dynamic Stochastic General Equilibrium Models

The 15th Workshop on Methods and Applications for Dynamic Stochastic General Equilibrium Models took place October 12-13 in Chicago. Research Associates Jesús Fernández-Villaverde and Frank Schorfheide, both of the University of Pennsylvania; Leonardo Melosi of the Federal Reserve Bank of Chicago; and Research Associate Giorgio Primiceri of Northwestern University organized the meeting. These researchers' papers were presented and discussed:

Francesco Bianchi, Duke University and NBER, and Howard Kung and Mikhail Tirskikh, London Business School

The Origins and Effects of Macroeconomic Uncertainty

Bianchi, Kung, and Tirskikh construct and estimate a dynamic stochastic general equilibrium model that features demand- and supply-side uncertainty. Using term structure and macroeconomic data, the researchers find sizable effects of uncertainty on risk premia and business cycle fluctuations. Both demand-side and supply-side uncertainty imply large contractions in real activity and an increase in term premia, but supply-side uncertainty shocks have larger effects on inflation and investment. Bianchi, Kung, and Tirskikh introduce a novel analytical decomposition to illustrate how five distinct risk propagation channels can account for these differences. The two uncertainty shocks are strongly correlated at the beginning of the sample period but decouple in the aftermath of the Great Recession.

Taeyoung Doh and A. Lee Smith, Federal Reserve Bank of Kansas City

Reconciling VAR-based Forecasts with Survey Forecasts

Doh and Smith propose a novel Bayesian approach to jointly modeling realized data and survey forecasts of the same variable in a vector autoregression (VAR). In particular, their method imposes a prior distribution on the consistency between the forecast implied by the VAR and the survey forecast of the same variable. When the prior is placed on unconditional forecasts from the VAR, it shapes the posterior of the reduced-form VAR coefficients. When the prior is placed on conditional forecasts (i.e., impulse responses), it shapes the posterior of the structural VAR coefficients. To implement the prior, Doh and Smith combine importance sampling with a maximum entropy prior for forecast consistency to obtain posterior draws of VAR parameters at low computational cost. The researchers use two empirical examples to illustrate potential applications of the methodology: (i) the evolution of tail risks for inflation in a time-varying parameter VAR model, and (ii) the identification of credible forward guidance shocks using sign and forecast-consistency restrictions in a monetary VAR model.
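The importance-sampling step described above can be sketched in a deliberately stylized scalar setting. Everything here is an illustrative assumption, not the authors' specification: a single AR(1) coefficient stands in for the VAR parameters, and a Gaussian tilt toward the survey forecast stands in for the maximum entropy forecast-consistency prior.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical posterior draws of an AR(1) coefficient for inflation,
# as if obtained from a standard (unconstrained) Bayesian estimation step.
draws = rng.normal(loc=0.5, scale=0.2, size=5000)

y_t = 2.0       # latest realized inflation (illustrative)
survey = 1.8    # survey forecast of next-period inflation (illustrative)
lam = 10.0      # tightness of the forecast-consistency prior (assumed)

# Model-implied one-step forecast for each draw, and importance weights
# that penalize disagreement with the survey forecast.
implied = draws * y_t
weights = np.exp(-0.5 * lam * (implied - survey) ** 2)
weights /= weights.sum()

# The reweighted posterior mean shifts toward coefficient values whose
# implied forecast is consistent with the survey.
post_mean = np.sum(weights * draws)
```

Because draws from the unconstrained posterior are merely reweighted rather than re-simulated, the tilt comes at low computational cost, which is the appeal of the importance-sampling implementation.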

Florin O. Bilbiie, Paris School of Economics

A Catch-22 for HANK Models: No Puzzles, No Amplification

Bilbiie shows how New Keynesian (NK) models with heterogeneous agents (HA) can deliver aggregate demand amplification (monetary-fiscal multipliers and deep recessions) through a New Keynesian Cross -- but only if hand-to-mouth agents' income elasticity to aggregate income is larger than one. This "hand-to-mouth channel" delivers static amplification through the within-the-period elasticity of aggregate demand to policies. A dynamic, complementary "self-insurance channel" magnifies the effects when households are hand-to-mouth only occasionally: the aggregate Euler equation features discounting when the elasticity is lower than one, but compounding when it is larger. The former channel matters most for short-lived shocks, the latter for persistent shocks or news about the future. Yet amplification has a dark side: the very condition that delivers it also aggravates the NK troubles (the forward guidance puzzle, neo-Fisherian effects, and the paradox of flexibility) and creates new ones (insufficiency of the Taylor principle and a paradox of thrift). This tension also holds in a quantitative, empirically relevant DSGE version of the model that Bilbiie builds. Phrased positively: adding HA can solve all these NK troubles, but doing so rules out amplification and multipliers -- a Catch-22.
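The discounting-versus-compounding logic can be illustrated with a back-of-the-envelope calculation. In a generic discounted Euler equation of the form c_t = delta * E_t[c_{t+1}] - sigma * r_t, iterating forward implies that a one-period rate cut announced k periods ahead moves consumption today by delta**k * sigma. The functional form and parameter values below are illustrative stand-ins, not Bilbiie's calibration.

```python
import numpy as np

sigma = 1.0  # interest-rate sensitivity of demand (assumed)

def response_today(delta, k, sigma=1.0):
    """Impact on current consumption of a unit rate cut k periods ahead,
    under the forward-iterated Euler equation c_t = delta*E_t[c_{t+1}] - sigma*r_t."""
    return delta ** k * sigma

horizons = np.arange(0, 21)
# Discounting (elasticity below one): news about the far future matters less,
# which tames the forward guidance puzzle but forgoes amplification.
discounting = response_today(0.97, horizons)
# Compounding (elasticity above one): far-future news is amplified,
# delivering multipliers but aggravating the puzzle.
compounding = response_today(1.03, horizons)
```

The representative-agent benchmark corresponds to delta = 1, where the horizon of the announcement is irrelevant -- the knife-edge that separates the two regimes in the Catch-22.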

Pablo A. Guerrón-Quintana, Boston College, and Grey Gordon, Indiana University

A Quantitative Theory of Hard and Soft Sovereign Defaults

Empirical research on sovereign default shows that "hard defaults" -- defined as defaults with above-average haircuts -- have worse outcomes for GDP growth than "soft defaults," and that sovereigns continue to borrow post-default. Guerrón-Quintana and Gordon propose a model capable of capturing these and other empirical regularities. In it, the sovereign makes period-by-period decisions about whether to make the prescribed debt payments. Hard defaults arise when the sovereign repeatedly chooses not to pay over the course of many years. Unlike in the standard model, default does not exogenously result in autarky. Rather, autarky-like conditions arise endogenously as the shocks leading to default result in higher spreads than the sovereign is willing to pay. The calibrated model predicts that growth shocks are the main determinant of whether default is hard or soft. The researchers use the model and the particle filter to decompose how much of the empirical correlation between default intensity and output growth is selection and how much is causal. Decomposition of model forces shows that one-third (one-tenth) of hard (soft) defaults are explained by actual default costs, with the rest explained by selection. A historical decomposition of shocks reveals that transitory shocks and trend shocks were the primary drivers of the Argentinean defaults in the 1980s and the 2000s, respectively. Estimated haircuts were 20 percentage points higher in the 2001 default than in the one in the 1980s, consistent with the data. The researchers' estimated productivity shocks coincide with major events such as the convertibility plan and the Asian crisis.
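The particle filter mentioned above can be sketched in a minimal scalar state-space model. The specification here -- a single persistent latent state observed with noise, filtered by a bootstrap particle filter -- is a generic stand-in for the authors' trend/transitory decomposition, with all parameter values invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative model: latent state z_t = rho*z_{t-1} + eps_t,
# observed growth y_t = z_t + eta_t. Parameters are assumptions.
rho, sig_eps, sig_eta = 0.9, 0.5, 0.3
T, N = 50, 2000  # sample length, number of particles

# Simulate a data set from the model.
z = np.zeros(T)
for t in range(1, T):
    z[t] = rho * z[t - 1] + sig_eps * rng.normal()
y = z + sig_eta * rng.normal(size=T)

# Bootstrap particle filter: propagate, reweight by the likelihood, resample.
particles = np.zeros(N)
filtered = np.zeros(T)
for t in range(T):
    particles = rho * particles + sig_eps * rng.normal(size=N)  # propagate
    w = np.exp(-0.5 * ((y[t] - particles) / sig_eta) ** 2)      # likelihood
    w += 1e-300                                                 # guard underflow
    w /= w.sum()
    filtered[t] = w @ particles                                 # filtered mean
    particles = particles[rng.choice(N, size=N, p=w)]           # resample
```

In the paper's application the filtered latent shocks, rather than a simulated series, are what allow the researchers to attribute historical default episodes to transitory versus trend disturbances.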

Michael D. Cai, Marco Del Negro, and Abhi Gupta, Federal Reserve Bank of New York; Marc Giannoni, Federal Reserve Bank of Dallas; Pearl Li, Stanford University; and Erica Moszkowski, Harvard University

DSGE Forecasts of the Lost Recovery

The years following the Great Recession were challenging for forecasters. Unlike other deep downturns, this recession was not followed by a swift recovery, but generated a sizable and persistent output gap that was not accompanied by deflation, as a traditional Phillips curve relationship would have predicted. Moreover, the zero lower bound and unconventional monetary policy generated an unprecedented policy environment. Cai, Del Negro, Giannoni, Gupta, Li, and Moszkowski document the actual real-time forecasting performance of the New York Fed dynamic stochastic general equilibrium (DSGE) model during this period and explain the results using the pseudo real-time forecasting performance of a battery of DSGE models. They find the New York Fed DSGE model's forecasting accuracy to be comparable to that of private forecasters and notably better, for output growth, than the median forecasts from the Federal Open Market Committee's Summary of Economic Projections. The model's financial frictions were key in obtaining these results, as they implied a slow recovery following the financial crisis.

Dario Caldara, Chiara Scotti, and Molin Zhong, Federal Reserve Board

Uncertainty and Financial Stability: A VAR Analysis

Caldara, Scotti, and Zhong study the relative importance of three shocks for macroeconomic and financial conditions: real, financial, and uncertainty shocks. They find that shocks to financial uncertainty play a supplementary role in addition to real and financial shocks, and lead to a deterioration of the macroeconomic and financial outlook. Asset valuations in equities, corporate markets, and real estate decline. Businesses and households increase their savings and start a long-lasting process of deleveraging. The supply of credit to the economy retrenches, with underwriting standards tightening especially for commercial real estate and commercial and industrial loans. The financial sector experiences deleveraging among banks and a buildup of leverage among nonbank financial institutions.

Jesper Lindé, Sveriges Riksbank, and Mathias Trabandt, Freie Universität Berlin

Resolving the Missing Deflation Puzzle

Lindé and Trabandt propose a resolution of the missing deflation puzzle, i.e., the fact that inflation fell very little during the Great Recession against the backdrop of the large and persistent fall in GDP. The resolution stresses the importance of nonlinearities in price and wage setting using Kimball (1995) aggregation. The researchers show that a nonlinear macroeconomic model with Kimball aggregation resolves the missing deflation puzzle. Importantly, the linearized version of the underlying nonlinear model fails to do so. In addition, the nonlinear model reproduces the skewness and kurtosis of inflation observed in post-war U.S. data. All told, the results caution against the common practice of using linearized models to study inflation dynamics when the economy is exposed to large shocks.
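The role of the nonlinearity can be conveyed with a toy calculation: a Phillips curve that flattens as the output gap turns negative generates much less deflation in a deep recession than its own linearization predicts. The exponential functional form and all parameters below are invented for illustration; they are not the Kimball-aggregation specification of the paper.

```python
import numpy as np

kappa, psi = 0.3, 4.0  # slope at zero gap and curvature (both assumed)

def phillips_nonlinear(gap):
    """Toy convex Phillips curve: slope kappa*exp(psi*gap) shrinks as the
    gap turns negative, muting deflation in deep recessions."""
    return kappa * (np.exp(psi * gap) - 1.0) / psi

def phillips_linear(gap):
    """First-order approximation around gap = 0, with the same slope kappa."""
    return kappa * gap

gap = -0.10  # a deep recession: output 10 percent below trend
pi_nl = phillips_nonlinear(gap)   # modest deflation
pi_lin = phillips_linear(gap)     # linearization overpredicts deflation
```

For small gaps the two curves are indistinguishable, which is why the puzzle only bites -- and the linearized model only fails -- when shocks are large.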

Pablo Ottonello, University of Michigan, and Thomas Winberry, University of Chicago and NBER

Financial Heterogeneity and the Investment Channel of Monetary Policy (NBER Working Paper No. 24221)

Ottonello and Winberry study the role of heterogeneity in firms' financial positions in determining the investment channel of monetary policy. Empirically, they show that firms with low leverage or high credit ratings are the most responsive to monetary policy shocks. The researchers develop a heterogeneous firm New Keynesian model with default risk to interpret these facts and study their aggregate implications. In the model, firms with high default risk are less responsive to monetary shocks because their marginal cost of external finance is high. The aggregate effect of monetary policy therefore depends on the distribution of default risk across firms.