My main fields of interest are Time Series Econometrics and Macroeconometrics, with a particular focus on persistence in structural models. Structural macroeconomic models are surely far from the data generating process, but I believe we can still learn from them: even an epsilon step forward improves our understanding of reality.
with Francisco Blasques
Abstract: Parameter estimates of structural economic models are often difficult to interpret in the light of the underlying economic theory. Bayesian methods have become increasingly popular as a tool for conducting inference on structural models, since priors offer a way to exert control over the estimation results. This paper proposes a penalized indirect inference estimator that allows researchers to obtain economically meaningful parameter estimates in a frequentist setting. The asymptotic properties of the estimator are established for both correctly and incorrectly specified models. A Monte Carlo study reveals the role of the penalty function in shaping the finite-sample distribution of the estimator. The advantages of the estimator are highlighted in an empirical study of a state-of-the-art dynamic stochastic general equilibrium model.
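The idea of penalizing an indirect inference objective can be illustrated with a toy sketch (my own illustration, not the paper's estimator): an MA(1) parameter is estimated by matching a lag-1 autocorrelation, the statistic an AR(1) auxiliary model would deliver, and a quadratic penalty pulls the estimate toward an economically motivated value. The function names, the penalty form, the prior mean, and the grid search are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def aux_stat(x):
    """Auxiliary statistic: lag-1 autocorrelation (what an AR(1) auxiliary model targets)."""
    x = x - x.mean()
    return float(x[1:] @ x[:-1] / (x @ x))

def simulate_ma1(theta, eps):
    """MA(1) simulator: x_t = eps_t + theta * eps_{t-1}."""
    return eps[1:] + theta * eps[:-1]

# "Observed" data from an MA(1) with theta0 = 0.5
theta0 = 0.5
x_obs = simulate_ma1(theta0, rng.standard_normal(2001))
rho_obs = aux_stat(x_obs)

# Common random numbers keep the simulated objective smooth in theta
eps_sim = rng.standard_normal(20001)

def objective(theta, lam, prior_mean=0.5):
    """Indirect inference distance plus a quadratic penalty toward prior_mean
    (the penalty plays the role the priors play in Bayesian estimation)."""
    rho_sim = aux_stat(simulate_ma1(theta, eps_sim))
    return (rho_obs - rho_sim) ** 2 + lam * (theta - prior_mean) ** 2

# Grid search over the invertible region, avoiding an external optimizer
grid = np.linspace(-0.99, 0.99, 397)
theta_hat = grid[np.argmin([objective(t, lam=0.1) for t in grid])]
```

Restricting the grid to (-1, 1) sidesteps the usual MA(1) identification issue (theta and 1/theta imply the same autocorrelation), so the penalty here only shapes the estimate within the invertible region.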
Abstract: We study hypothesis testing in the presence of a possibly singular covariance matrix. We propose an alternative way to handle possible non-regularity of the covariance matrix in a Wald test: using the identity matrix as the weighting matrix when calculating the quadratic form. The resulting test statistic is not pivotal, but its asymptotic distribution can be approximated using bootstrap methods. To prove the validity of these approximations, we show that the square root of a positive semi-definite matrix is a continuously differentiable transformation of the elements of the matrix, a result needed for the continuous mapping theorem to apply. We use two types of approximations. The first relies on the parametric bootstrap and draws from the asymptotic distribution of the restrictions with an estimated covariance matrix. The second applies the residual bootstrap to obtain the distribution of the test statistic and delivers critical values that control size and show good empirical power even in small samples. In contrast to regularization approaches, the test statistic considered in this paper involves no arbitrary truncation parameters, for which no practical guidelines are available, and does not modify the information in the data.
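A rough sketch of the parametric-bootstrap variant (my own illustration under simplifying assumptions, not the paper's code): the statistic is the identity-weighted quadratic form, and critical values come from draws of ||Z||^2 with Z ~ N(0, Sigma_hat), generated through the matrix square root — the operation that remains well defined when Sigma_hat is singular. All function names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def identity_wald(r_hat, T):
    """Quadratic form T * r' I r: no inverse of the covariance matrix is needed,
    so singularity of Sigma causes no problem (at the cost of pivotality)."""
    return T * float(r_hat @ r_hat)

def parametric_bootstrap_cv(Sigma_hat, n_draws=10_000, level=0.95, rng=rng):
    """Approximate the asymptotic distribution ||Z||^2, Z ~ N(0, Sigma_hat).
    The eigendecomposition-based square root exists for any PSD matrix,
    including singular ones; eigenvalues are clipped at zero for safety."""
    w, V = np.linalg.eigh(Sigma_hat)
    sqrt_Sigma = V @ np.diag(np.sqrt(np.clip(w, 0.0, None))) @ V.T
    Z = rng.standard_normal((n_draws, Sigma_hat.shape[0])) @ sqrt_Sigma
    return np.quantile(np.sum(Z**2, axis=1), level)

# Toy example: a rank-deficient 2x2 covariance matrix (rank 1)
Sigma = np.array([[1.0, 1.0], [1.0, 1.0]])
cv = parametric_bootstrap_cv(Sigma)
```

In this rank-1 example Z = (z, z) with z standard normal, so ||Z||^2 = 2 z^2 and the 95% critical value sits near 7.7, which the bootstrap draws recover without any truncation parameter.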
with Franz Palm and Jean-Pierre Urbain
Abstract: Macroeconomic models are inherently multivariate, and many macroeconomic variables are non-stationary. Many estimation methods therefore rely on a preliminary step of making the data stationary, typically by univariate filtering techniques. In this paper, we focus on the role of detrending in the estimation of a simple Dynamic Stochastic General Equilibrium (DSGE) model. While many have studied the implications of different univariate filtering techniques for DSGE models, less is known about the effects of multivariate filtering techniques. DSGE models assume that the system of macroeconomic variables develops along a stable growth path (the great ratios are stationary). In terms of the multivariate properties of the system, this implies a common deterministic trend in the data and, depending on the presence of a unit root in the system, cointegration. We hypothesize that multivariate detrending techniques that exploit these time-series properties of the data improve the efficiency of the estimators of the structural parameters of DSGE models. We use restricted estimators of the deterministic trend parameters to improve the efficiency of the existing quasi-differencing estimation method; we propose the multivariate Beveridge-Nelson decomposition as a filter; and we augment the data with quantities that do not depend on the deterministic trend. Results show that knowledge of the deterministic component makes a difference for the efficiency of the estimators. Simulations suggest that the proposed estimators improve upon existing methods in terms of root mean square error and bias when the shocks are highly persistent yet stationary. Researchers should be careful: at sample sizes currently available in macroeconomics, estimates of the deterministic trend can be far from the true values, given possible unit roots in the data.
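One ingredient, the multivariate Beveridge-Nelson decomposition, can be sketched in a few lines, assuming a VAR(1) fitted to first differences (the VAR(1) choice and the function name are my illustrative assumptions, not the paper's implementation). The BN trend is the long-run forecast of the series net of drift, tau_t = y_t + Phi (I - Phi)^{-1} (dy_t - mu), and the cycle is the remainder.

```python
import numpy as np

def bn_decomposition_var1(y):
    """Multivariate Beveridge-Nelson decomposition based on a VAR(1) fitted by
    OLS to the first differences of y (a T x n array).
    Returns (trend, cycle) for the last T-2 observations."""
    dy = np.diff(y, axis=0)                                # first differences
    X = np.hstack([np.ones((len(dy) - 1, 1)), dy[:-1]])    # regressors: const, lag
    B, *_ = np.linalg.lstsq(X, dy[1:], rcond=None)
    c, Phi = B[0], B[1:].T                                 # intercept, n x n VAR matrix
    mu = np.linalg.solve(np.eye(len(c)) - Phi, c)          # long-run drift
    # BN adjustment: Phi (I - Phi)^{-1} (dy_t - mu) for each t
    adj = (Phi @ np.linalg.solve(np.eye(len(c)) - Phi, (dy[1:] - mu).T)).T
    trend = y[2:] + adj                                    # long-run forecast net of drift
    cycle = y[2:] - trend
    return trend, cycle

# Usage: a bivariate random walk, whose BN cycle should be close to zero
rng = np.random.default_rng(1)
y = np.cumsum(rng.standard_normal((200, 2)), axis=0)
trend, cycle = bn_decomposition_var1(y)
```

Because the trend and cycle are built from the same fitted VAR, the decomposition is exact by construction; what the multivariate fit adds over univariate BN filters is that cross-series dynamics in Phi feed into each series' trend.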
with Francisco Blasques, André Lucas
with Alexey Gorn