Working papers results

2001 - n° 204 04/03/2003

This paper presents firm-level evidence on changes in non-manual wage premia and employment shares in Italian manufacturing during the nineties. We find that the relative stability of aggregate wage premia and employment shares hides offsetting disaggregate forces. First, while technical progress raises the relative demand for skilled labor within firms, demand changes associated with exports reduce the relative demand for skills. Second, within the class of non-manual workers, wage premia and employment shares of executives rise substantially, whereas those of clerks fall in a similar proportion. We also find that the export status of firms plays a key role in explaining labor market dynamics, as exporters account for most of both demand-related and technology-related shifts. Overall, our results for Italy question the general validity of the conventional view that emphasizes the role of labor market institutions, as opposed to trade and technology, in determining wage and employment dynamics in continental Europe.
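The contrast between the within-firm and between-firm forces can be made concrete with a standard shift-share decomposition (a generic textbook formulation, introduced here for illustration rather than taken from the paper):

    \Delta S \;=\; \underbrace{\sum_i \bar{s}_i \, \Delta S_i}_{\text{within-firm}} \;+\; \underbrace{\sum_i \bar{S}_i \, \Delta s_i}_{\text{between-firm}}

where S is the aggregate non-manual employment (or wage-bill) share, S_i the corresponding share within firm i, s_i firm i's weight in total employment, and bars denote averages over the two periods. Technology-related skill upgrading shows up in the within-firm term, while trade-related reallocation across firms shows up in the between-firm term.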

Paolo Manasse (IGIER and University of Bologna), Luca Stanca and Alessandro Turrini
2001 - n° 203 04/03/2003

Time series models are often adopted for forecasting because of their simplicity and good performance. The number of parameters in these models increases quickly with the number of variables modelled, so that usually only univariate or small-scale multivariate models are considered. Yet, data are now readily available for a very large number of macroeconomic variables that are potentially useful when forecasting. Hence, in this paper we construct a large macroeconomic data set for the UK, with about 80 variables, model it using a dynamic factor model, and compare the resulting forecasts with those from a set of standard time series models. We find that just six factors are sufficient to explain 50% of the variability of all the variables in the data set. Moreover, these factors, which can be considered as the main driving forces of the economy, are related to key variables such as interest rates, monetary aggregates, prices, housing and labour market variables, and stock prices. Finally, the factor-based forecasts are shown to improve upon standard benchmarks for prices, real aggregates, and financial variables, at virtually no additional modelling or computational cost.
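As a rough illustration of the factor-extraction step (a minimal sketch using principal components on a simulated standardized panel; the data, the use of scikit-learn and the six-factor choice are stand-ins, not the paper's actual implementation):

    import numpy as np
    from sklearn.decomposition import PCA

    # Hypothetical T x N panel standing in for ~80 standardized UK macro series
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 80))

    # Static principal-component factors, as commonly used to estimate
    # approximate dynamic factor models on large panels
    pca = PCA(n_components=6)
    factors = pca.fit_transform(X)          # T x 6 matrix of estimated factors

    # Share of total panel variance captured by the first six factors
    # (on real macro data a handful of factors typically captures a large share)
    print(f"variance explained: {pca.explained_variance_ratio_.sum():.2f}")

    # A factor-based forecast then regresses the target on the estimated factors,
    # e.g. y_{t+h} = a + b'F_t + e_{t+h}, estimated by OLS.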

Michael Artis (Dept. of Economics, European University Institute), Anindya Banerjee (Dept. of Economics, European University Institute) and Massimiliano Marcellino (Istituto di Economia Politica, Università Bocconi, IGIER)
2001 - n° 202 04/03/2003

This paper studies within-family decision making regarding investment in income protection for surviving spouses using a simple and tractable Nash-bargaining model. A change in US pension law (the Retirement Equity Act of 1984) is used as an instrument to derive predictions from the bargaining model and to contrast these with the predictions of the classical single-utility-function model of the household. This law change gave spouses of married pension-plan participants the right to survivor benefits unless they explicitly waived this right. The classical view of household behavior predicts that this would have had no effect on choices, while the bargaining model predicts an increase in spousal survivor protection. In the empirical part of the paper, the predictions of the classical model regarding the amount of life-insurance protection and the likelihood of a pensioner selecting survivor benefits are rejected in favor of the predictions of the Nash-bargaining model. The paper thus provides evidence for the need to take the existence of multiple decision makers into account when studying household behavior.
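Schematically, the two competing views can be written as follows (generic notation introduced here for illustration, not the paper's own specification): the classical model maximizes a single household utility U(x), whereas the Nash-bargaining model solves

    \max_{x} \; \big(U_h(x) - T_h\big)\,\big(U_w(x) - T_w\big)

where x is the chosen allocation (including survivor and life-insurance protection) and T_h, T_w are the spouses' threat points. A law change that strengthens the wife's default entitlement plausibly raises T_w and therefore shifts the bargained allocation toward more spousal survivor protection, while leaving the single-utility-function optimum unchanged.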

Saku Aura (IGIER and IEP Bocconi University)
2001 - n° 201 04/03/2003

This paper investigates time series methods for forecasting four Euro-area wide aggregate variables: real GDP, industrial production, price inflation, and the unemployment rate. We consider two empirical questions arising from this problem. First, is it better to build aggregate Euro-area wide forecasting models for these variables, or are there gains from aggregating country-specific forecasts for the component country variables? Second, are there gains from using information from additional predictors beyond simple univariate time series forecasts, and if so, how large are these gains, and how are these gains best achieved? It turns out that typically there are gains from forecasting these series at the country level, then pooling the forecasts, relative to forecasting at the aggregate level. This suggests that structural macroeconometric modeling of the Euro area is appropriately done at the country-specific level, rather than directly at the aggregate level. Moreover, our simulated out-of-sample forecast experiment provides little evidence that forecasts from multivariate models are more accurate than forecasts from univariate models. If we restrict attention to multivariate models, the forecasts obtained from a dynamic factor model appear to be somewhat more accurate than the other methods.
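A minimal sketch of the comparison between aggregate-level and pooled country-level forecasting (the AR(1) models, simulated series and weights below are illustrative assumptions, not the paper's specifications):

    import numpy as np

    def ar1_forecast(y):
        # OLS AR(1) fit and one-step-ahead forecast
        X = np.column_stack([np.ones(len(y) - 1), y[:-1]])
        coef, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
        return coef[0] + coef[1] * y[-1]

    rng = np.random.default_rng(1)
    T, n_countries = 120, 4
    weights = np.array([0.35, 0.30, 0.20, 0.15])      # hypothetical aggregation weights
    countries = rng.standard_normal((T, n_countries)).cumsum(axis=0)
    aggregate = countries @ weights                    # area-wide aggregate series

    # Strategy 1: model the aggregate directly
    fc_aggregate = ar1_forecast(aggregate)

    # Strategy 2: forecast each country separately, then pool with the same weights
    fc_pooled = weights @ np.array([ar1_forecast(countries[:, i]) for i in range(n_countries)])

    print(f"aggregate-model forecast: {fc_aggregate:.3f}, pooled forecast: {fc_pooled:.3f}")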

Massimiliano Marcellino (Istituto di Economia Politica, Università Bocconi, IGIER), James H. Stock (Kennedy School of Government, Harvard University and the NBER) and Mark W. Watson (Department of Economics and Woodrow Wilson School, Princeton University and the NBER)
2001 - n° 200 04/03/2003

The rate of inflation in the US has declined from an average of 4.5% in the period 1960-79 to an average of 3.6% in 1980-98. Between those two periods, the standard deviations of inflation and the output gap have also declined. These facts can be attributed to the interaction of three possible factors: a shift in central bank preferences, a reduction in the variability of aggregate supply shocks, and a more efficient conduct of monetary policy. In this paper we identify the relative roles of these factors. Our framework is based on the estimation of a small structural macro model for the US economy jointly with the first-order conditions that solve the intertemporal optimization problem faced by the Fed. Overall, our results indicate that the policy preferences of the Fed, and in particular the (implicit) inflation target, changed drastically with the advent of the Volcker-Greenspan era. In addition, we find that the variance of supply shocks has been lower and that monetary policy has been conducted more efficiently during this period.
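The intertemporal optimization problem referred to above is typically of the following form (standard notation, assumed here for illustration; the paper's exact specification may differ):

    \min_{\{i_t\}} \; E_t \sum_{j=0}^{\infty} \beta^{j} \Big[ (\pi_{t+j} - \pi^{*})^{2} + \lambda\, y_{t+j}^{2} + \mu\, (i_{t+j} - i_{t+j-1})^{2} \Big]

subject to the estimated aggregate demand and supply equations, where \pi^{*} is the implicit inflation target, y the output gap, and \lambda, \mu the relative weights on output stabilization and interest-rate smoothing. In this framework the identification question is which of \pi^{*}, the preference weights, and the shock variances changed between the pre- and post-1980 subperiods.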

Carlo Ambrogio Favero (IGIER-Università Bocconi and CEPR), Riccardo Rovelli (Università di Bologna)
2001 - n° 199 04/03/2003

Cross-country evidence on inflation and income inequality suggests that they are positively related. I explore the hypothesis that this correlation is the outcome of a distributional conflict underlying the determination of fiscal policy. I study a bargaining model of the political system in which inflation and inequality are positively correlated due to the relative vulnerability of low-income households to inflation.

Stefania Albanesi (Università Bocconi, IGIER)
2001 - n° 198 04/03/2003

We examine whether standard monetary general equilibrium models with benevolent monetary authorities acting under discretion can generate persistent episodes of high and low inflation. Specifically, we ask whether private agents' expectations of high or low inflation can lead them to take actions which then make it optimal for monetary authorities to validate these expectations. We find that this is the case for a large class of economies and that the result depends importantly on the properties of money demand.

Stefania Albanesi (Università Bocconi, IGIER), V.V. Chari (University of Minnesota), Lawrence J. Christiano (Northwestern University)
2001 - n° 197 04/03/2003

This paper explores the extent to which the predictability of asset returns can be exploited for dynamic portfolio allocation among seven assets, taking model uncertainty explicitly into account. We consider the problem of a representative fund manager who allocates funds between stocks and bonds in three geographical areas: Europe, the USA and Japan. Model uncertainty is addressed by implementing thick modelling, that is, by deriving the average portfolio allocation generated by the recursively selected top fifty per cent of models in terms of adjusted R-squared. The portfolio allocation based on this strategy systematically over-performs the optimal allocation based on the predictions of the single best model as selected by the adjusted R-squared. Such over-performance is mainly attributable to a reduction in the volatility of the returns on the selected portfolios. Thick modelling also leads to systematic replication, but not to over-performance, of a typical benchmark portfolio for our asset allocation problem.
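A minimal sketch of the thick-modelling step described above (the predictive regressions, simulated data and the naive allocation rule are illustrative assumptions, not the paper's actual specification): estimate all candidate models, rank them by adjusted R-squared, keep the top half, and average the allocations they imply.

    import numpy as np
    import statsmodels.api as sm
    from itertools import combinations

    rng = np.random.default_rng(2)
    T, k = 150, 4
    predictors = rng.standard_normal((T, k))      # hypothetical return predictors
    returns = predictors @ np.array([0.2, 0.0, 0.1, 0.0]) + rng.standard_normal(T)

    models = []
    for size in range(1, k + 1):
        for cols in combinations(range(k), size):
            X = sm.add_constant(predictors[:, list(cols)])
            fit = sm.OLS(returns, X).fit()
            # fitted value at the last date, used as a stand-in for a one-step forecast
            signal = fit.predict(X[-1:])[0]
            models.append((fit.rsquared_adj, signal))

    # Thick modelling: keep the top 50% of models ranked by adjusted R-squared ...
    models.sort(key=lambda m: m[0], reverse=True)
    top_half = models[: len(models) // 2]

    # ... and average the allocations they imply (naive rule: weight on the risky
    # asset equal to the return signal, truncated to the [0, 1] interval)
    allocations = [min(max(s, 0.0), 1.0) for _, s in top_half]
    print(f"thick-modelling weight on the risky asset: {np.mean(allocations):.2f}")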

Carlo Ambrogio Favero (Università Bocconi, IGIER), Marco Aiolfi (Università Bocconi, IGIER), Giorgio Primiceri (Princeton University)
2001 - n° 196 04/03/2003

Observed policy rates are smooth. Why should central banks smooth interest rates? We investigate whether model uncertainty and parameter instability are a valid reason. We do so by implementing a novel thick recursive modelling approach within the framework of small structural macroeconomic models. At each point in time we estimate all models generated by the combinations of a base set of k observable regressors, so that our econometric procedure delivers 2^k models for aggregate demand and supply at any point in time. We compute optimal monetary policies for each of these specifications and then take their average as our benchmark optimal monetary policy. We then compare observed policy rates with those generated by the traditional thin modelling approach to optimal monetary policy and with those generated by our proposed thick modelling approach. Our results confirm the difficulty of recovering the deep parameters describing the preferences of monetary policy makers from their observed behaviour. However, they also show that thick recursive modelling can, at least partially, explain the observed interest rate smoothness.
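To fix orders of magnitude (an illustrative count, not a figure taken from the paper): with k candidate regressors the number of distinct subsets, and hence of specifications estimated per equation at each recursion date, is

    \sum_{j=0}^{k} \binom{k}{j} = 2^{k}

so that, for example, k = 10 regressors already imply 1,024 aggregate demand and 1,024 aggregate supply specifications, each with its own implied optimal policy rate, which are then averaged.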

Carlo Ambrogio Favero (Università Bocconi, IGIER), Fabio Milani (Università Bocconi, IGIER)
Keywords: model uncertainty, optimal monetary policy, interest rate smoothing
2001 - n° 195 04/03/2003

The expectations model of the term structure of interest rates has been subjected to numerous empirical tests and almost invariably rejected. However, the vast majority of the empirical evidence is based on the estimation of single-equation models and on the assumption that realized returns are a valid proxy for expected returns. A recent strand of the macroeconomic literature has analyzed monetary policy by including the central bank reaction function in small empirical macro models. By simulating these models forward it is possible to derive the full forward path of short-term interest rates and hence to construct any long-term yield using model-based forecasts. A test of the theory can then be performed by comparing observed long-term yields with those simulated and with the associated 95 per cent confidence interval. The application of this framework to the US term structure in the nineties does not lead to the rejection of the expectations model.
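The testable implication exploited here is the standard expectations-hypothesis relation (written in generic notation for illustration): the n-period yield equals the average of current and expected future short rates, possibly plus a constant term premium,

    R_t^{(n)} = \frac{1}{n} \sum_{j=0}^{n-1} E_t\, r_{t+j} + c_n

where the expected short rates E_t r_{t+j} are generated by simulating the estimated macro model, including the policy reaction function, forward, rather than proxied by realized future rates.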

Carlo Ambrogio Favero (Università Bocconi, IGIER)