
Working papers

IGIER fellows and affiliates publish books and articles in academic journals. Their current research projects are featured in the Working Paper series. 

2006 - n° 307
Robust control allows policymakers to formulate policies that guard against
model misspecification. The principal tools used to solve robust control problems
are state-space methods (see Hansen and Sargent, 2006, and Giordani and
Soderlind, 2004). In this paper we show that the structural-form methods
developed by Dennis (2006) to solve control problems with rational expectations
can also be applied to robust control problems, with the advantage that they
bypass the often onerous task of having to express the reference model in
state-space form. Interestingly, because state-space forms and structural forms
are not unique, the two approaches do not necessarily return the same equilibria
for robust control problems. We apply both state-space and structural solution
methods to an empirical New Keynesian business cycle model and find that the
differences between the methods are both qualitatively and quantitatively important.
In particular, with the structural-form solution methods the specification errors generally
involve changes to the conditional variances in addition to the conditional means of the
shock processes.

Richard Dennis, Kai Leitemo, and Ulf Soderstrom
Keywords: Robust control, Misspecification, Optimal policy
2006 - n° 306
The estimation of structural dynamic factor models (DFMs) for large sets of variables
is attracting considerable attention. In this paper we briefly review the underlying
theory and then compare the impulse response functions resulting from two alternative
estimation methods for the DFM. Finally, as an example, we reconsider the issue of
the identification of the driving forces of the US economy, using data for about 150
macroeconomic variables.

George Kapetanios and Massimiliano Marcellino
Keywords: Factor models, Principal components, Subspace algorithms, Structural Identification, Structural VAR
2006 - n° 305
The estimation of dynamic factor models for large sets of variables has attracted
considerable attention recently, due to the increased availability of large datasets. In
this paper we propose a new parametric methodology for estimating factors from large
datasets based on state space models and discuss its theoretical properties. In particular,
we show that it is possible to estimate consistently the factor space. We also
develop a consistent information criterion for the determination of the number of factors
to be included in the model. Finally, we conduct a set of simulation experiments
that show that our approach compares well with existing alternatives.

George Kapetanios and Massimiliano Marcellino
Keywords: Factor models, Principal components, Subspace algorithms
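
The paper's own estimator is state-space based; purely as a point of reference, the following minimal Python sketch (with illustrative names and simulated data, not drawn from the paper) shows the standard principal-components benchmark for extracting factors from a large panel, one of the existing alternatives that such estimators are typically compared with.

# Minimal sketch: principal-components estimation of r common factors from a
# T x N panel X, the standard benchmark for large-dataset factor models.
# All variable names and the simulated data are illustrative.
import numpy as np

def pc_factors(X, r):
    # Standardize each series, then take the r leading eigenvectors of the
    # T x T second-moment matrix; rescale so that F'F/T = I.
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    eigval, eigvec = np.linalg.eigh(Z @ Z.T / Z.shape[1])
    order = np.argsort(eigval)[::-1][:r]          # keep the r largest eigenvalues
    F = eigvec[:, order] * np.sqrt(Z.shape[0])    # T x r estimated factors
    Lam = Z.T @ F / Z.shape[0]                    # N x r estimated loadings
    return F, Lam

# Toy panel: two true factors driving 150 series over 200 periods.
rng = np.random.default_rng(0)
T, N, r = 200, 150, 2
F_true = rng.normal(size=(T, r))
Lam_true = rng.normal(size=(N, r))
X = F_true @ Lam_true.T + rng.normal(size=(T, N))

F_hat, Lam_hat = pc_factors(X, r)
print(F_hat.shape, Lam_hat.shape)                 # (200, 2) (150, 2)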
2006 - n° 304
This paper develops a dynamic general equilibrium model that
integrates labor market search and matching into an otherwise
standard New Keynesian model. I allow for changes of the labor
input at both the extensive and the intensive margin and develop
two alternative specifications of the bargaining process. Under
efficient bargaining (EB) hours are determined jointly by the firm
and the worker as a part of the same Nash bargain that determines
wages. With right to manage (RTM), instead, firms retain the right to
set hours of work unilaterally. I show that introducing search and
matching frictions affects the cyclical behavior of real marginal costs
by way of two different channels: a wage channel under RTM and an
extensive margin channel under EB. In both cases, the presence of
search and matching frictions may cause a lower elasticity of marginal
costs with respect to output and thus help to account for the observed
inertia in inflation.

Antonella Trigari
Keywords: Labor Market Search, Wage Bargaining, Business Cycles, Inflation, Monetary Policy Shocks
2006 - n° 303
We investigate identifiability issues in DSGE models and their consequences for
parameter estimation and model evaluation when the objective function measures
the distance between estimated and model impulse responses. We show that
observational equivalence, partial and weak identification problems are widespread, that
they lead to biased estimates, unreliable t-statistics and may induce investigators to
select false models. We examine whether different objective functions affect identification
and study how small samples interact with parameters and shock identification.
We provide diagnostics and tests to detect identification failures and apply them to a
state-of-the-art model.

Fabio Canova (ICREA and UPF) and Luca Sala (IEP, IGIER and Università Bocconi)
Keywords: identification, DSGE models
2006 - n° 302
Does democracy promote economic development? This paper reviews recent
attempts to address this question that exploit within-country variation.
It shows that the answer is largely positive, but also depends on the details
of democratic reforms. First, the sequence of economic vs political reforms
matters: countries liberalizing their economy before extending political rights
do better. Second, different forms of democratic government lead to different
economic policies, and this might explain why presidential democracy leads
to faster growth than parliamentary democracy. Third, it is important to distinguish
between expected and actual political reforms. Taking expectations of regime
change into account helps identify a stronger growth effect of democracy.

T. Persson (Stockholm University) and G. Tabellini (Università Bocconi and IGIER)
Keywords: Democracy; Reform; Growth; Institutions; Difference in Difference
2005 - n° 301

The Italian economy is often said to be on a declining path. In this paper, we document that:
(i) Italy's current decline is a labor productivity problem; (ii) the labor productivity slowdown
stems from declining productivity growth in all industries but utilities (with manufacturing
contributing about one half of the reduction) and diminished inter-industry reallocation of
workers from agriculture to market services; (iii) the labor productivity slowdown has been
mostly driven by declining TFP, with roughly unchanged capital deepening. The only mild
decline of capital deepening is due to the rise in the value added share of capital that
counteracted declining capital accumulation.

Francesco Daveri (Università di Parma and IGIER) and Cecilia Jona-Lasinio (ISTAT)
Keywords: Productivity growth, Productivity slowdown, TFP, decline, Italy
2005 - n° 300

We lay out a tractable model for fiscal and monetary policy analysis in
a currency union, and analyze its implications for the optimal design of such
policies. Monetary policy is conducted by a common central bank, which sets
the interest rate for the union as a whole. Fiscal policy is implemented at
the country level, through the choice of government spending level. The model
incorporates country-specific shocks and nominal rigidities. Under our assumptions,
the optimal monetary policy requires that inflation be stabilized at the
union level. On the other hand, the relinquishment of an independent monetary
policy, coupled with nominal price rigidities, generates a stabilization role
for fiscal policy, one beyond the efficient provision of public goods. Interestingly,
the stabilizing role for fiscal policy is shown to be desirable not only from
the viewpoint of each individual country, but also from that of the union as
a whole. In addition, our paper offers some insights on two aspects of policy
design in currency unions: the conditions for equilibrium determinacy and
the effects of exogenous government spending variations.

Jordi Galí and Tommaso Monacelli
Keywords: monetary union, sticky prices, countercyclical policy, inflation differentials
2005 - n° 299

Pooling forecasts obtained from different procedures typically reduces
the mean square forecast error and more generally improves the quality
of the forecast. In this paper we evaluate whether pooling interpolated
or backdated time series obtained from different procedures can also
improve the quality of the generated data. Both simulation results
and empirical analyses with macroeconomic time series indicate that
pooling plays a positive and important role also in this context.

Massimiliano Marcellino
Keywords: Pooling, Interpolation, Factor Model, Kalman Filter, Spline
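
As a stylized illustration of why pooling helps (this is not the paper's interpolation or backdating procedure, and all numbers are illustrative), the Python sketch below shows that an equal-weight average of two unbiased but noisy estimates of the same target attains a lower mean squared error than either estimate alone when their errors are imperfectly correlated.

# Stylized illustration of the pooling logic: averaging two imperfect,
# unbiased estimates of the same target reduces the mean squared error.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
truth = rng.normal(size=n)                       # unobserved target values
est_a = truth + rng.normal(scale=1.0, size=n)    # e.g. estimates from method A
est_b = truth + rng.normal(scale=1.0, size=n)    # e.g. estimates from method B
pooled = 0.5 * (est_a + est_b)                   # equal-weight pooling

def mse(est):
    return np.mean((est - truth) ** 2)

print(f"MSE method A: {mse(est_a):.3f}")
print(f"MSE method B: {mse(est_b):.3f}")
print(f"MSE pooled:   {mse(pooled):.3f}")        # roughly half the individual MSEs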
2005 - n° 298

In this paper we assess the possibility of producing unbiased forecasts for fiscal variables in the
euro area by comparing a set of procedures that rely on different information sets and
econometric techniques. In particular, we consider ARMA models, VARs, small scale
semi-structural models at the national and euro area level, institutional forecasts (OECD), and
pooling. Our small scale models are characterized by the joint modelling of fiscal and monetary
policy using simple rules, combined with equations for the evolution of all the relevant
fundamentals for the Maastricht Treaty and the Stability and Growth Pact. We rank models on
the basis of their forecasting performance using the mean square and mean absolute error
criteria at different horizons. Overall, simple time series methods and pooling work well and are
able to deliver unbiased forecasts, or slightly upward biased forecasts for the debt-GDP
dynamics. This result is mostly due to the short sample available, the robustness of simple
methods to structural breaks, and to the difficulty of modelling the joint behaviour of several
variables in a period of substantial institutional and economic changes. A bootstrap experiment
highlights that, even when the data are generated using the estimated small scale multi
country model, simple time series models can produce more accurate forecasts, due to
their parsimonious specification.

Carlo A. Favero and Massimiliano Marcellino
Keywords: Fiscal forecasting, Forecasting comparison, Fiscal rules, Euro area
2005 - n° 297

Many countries, especially developing ones, follow procyclical fiscal policies: spending goes up (taxes go down) in booms and spending goes down (taxes go up) in recessions. We provide an explanation for this suboptimal fiscal policy based upon political distortions and incentives for less-than-benevolent governments to appropriate rents. Voters have incentives similar to the classic "starving the Leviathan"
argument, and demand more public goods or fewer taxes to prevent governments from appropriating rents when the economy is doing well.
We test this argument against more traditional explanations based purely on borrowing constraints, with a reasonable amount of success.

Alberto Alesina (Harvard) and Guido Tabellini (IGIER, Bocconi)
2005 - n° 296

Do countries gain by coordinating their monetary policies if they have different economic structures? We address this issue in the context of a new open-economy macro model with a traded and a non-traded sector and, more importantly, with an across-country asymmetry in the size of the traded sector. We study optimal monetary policy under independent and cooperating central banks, based on analytical expressions for welfare objectives derived from quadratic approximations to individual preferences. In the presence of asymmetric structures, a new source of gains from coordination emerges due to a terms-of-trade externality. This externality adversely affects the country that is more exposed to trade, and its effects tend
to be overlooked when national central banks act independently. The welfare gains from coordination are sizable and increase with the degree of asymmetry across countries and the degree of openness, and decrease with the within-country correlation of sectoral shocks.

Evi Pappa (LSE, CEP and IGIER) and Zheng Liu (Emory University)
Keywords: Optimal Monetary Policy; International Policy Coordination; Multiple Sectors; Asymmetric Structures; Sticky Prices
2005 - n° 295

We study whether fiscal restrictions affect volatilities and correlations of macrovariables
and the probability of excessive debt for a sample of 48 US states. Fiscal constraints are
characterized with a number of indicators and volatility and correlations are computed in several
ways. The second moments of macroeconomic variables in states with different fiscal constraints
are economically and statistically similar. Excessive debt and the mechanism linking budget
deficit and excessive debts are independent of whether tight or loose fiscal constraints are in
place. Creative budget accounting may account for the results.

Fabio Canova (IGIER, Universitat Pompeu Fabra, and CEPR) and Evi Pappa (London School of Economics, CEP and IGIER)
Keywords: Fiscal restrictions, Excessive Debt, Business cycles, US states
2005 - n° 294

We study how constrained fiscal policy can affect regional inflation and output in a two-region model of a monetary union with sticky prices and distortionary taxation. Both government expenditure and taxes can be used to stabilize regional variables; however, the best welfare outcome is obtained under constant taxes and constant regional inflations. With cooperation, debt and deficit constraints reduce regional inflation variability, but the path of output is suboptimal. Under non-cooperation, the opposite occurs due to a trade-off between taxation and inflation variability. Decentralized rules, rather than constraints, stabilize regional inflation and output. They imply more fiscal action for smaller union members.

Evi Pappa (LSE, CEP and IGIER)
Keywords: Inflation Differentials, Monetary Union, Budgetary Restrictions, Fiscal rules
2005 - n° 293

We study the mechanics of transmission of fiscal shocks to labor markets. We
characterize a set of robust implications following government consumption, investment
and employment shocks in an RBC and a New-Keynesian model and use part of them to
identify shocks in the data. In line with the New-Keynesian story, shocks to government
consumption and investment increase real wages and employment contemporaneously
both in US aggregate and in US state data. The dynamics in response to employment
shocks are mixed, but in many cases are inconsistent with the predictions of the RBC
model.

Evi Pappa
2005 - n° 292

Does culture have a causal effect on economic development? The data on European
regions suggest that it does. Culture is measured by indicators of individual values
and beliefs, such as trust and respect for others, and confidence in individual self-determination.
To isolate the exogenous variation in culture, I rely on two historical
variables used as instruments: the literacy rate at the end of the XIXth century, and
the political institutions in place over the past several centuries. The political and
social history of Europe provides a rich source of variation in these two variables at a
regional level. The exogenous component of culture due to history is strongly
correlated with current regional economic development, after controlling for
contemporaneous education, urbanization rates around 1850 and national effects.
Moreover, the data do not reject the over-identifying assumption that the two
historical variables used as instruments only influence regional development through
culture. The indicators of culture used in this paper are also strongly correlated with
economic development and with available measures of institutions in a cross-country
setting.

Guido Tabellini (IGIER, Università Bocconi and CEPR)
Keywords: culture, economic development, trust, literacy, institutions
2005 - n° 291

Consumption is striking back. Some recent evidence indicates that
the well-known asset pricing puzzles generated by the difficulties of
matching fluctuations in asset prices with high frequency fluctuations
in consumption might be solved by considering consumption in
the long-run. A first strand of the literature concentrates on multiperiod
differences in log consumption, a second concentrates on the
cointegrating relation for consumption. Interestingly, only the (multiperiod)
Euler Equation for the consumer optimization problem is
considered by the first strand of the literature, while the cointegration-based
literature concentrates exclusively on the (linearized) intertemporal
budget constraint. In this paper, we show that using the first
order condition in the linearized budget constraint to derive an explicit
long-run consumption function delivers an even more striking
strike back.

Carlo A. Favero (IGIER, Università Bocconi and CEPR)
Keywords: Cointegrating consumption function, long-run stock market returns, elasticity of intertemporal substitution
2005 - n° 290

This paper studies how a central bank's preference for robustness against
model misspecification affects the design of monetary policy in a New-Keynesian
model of a small open economy. Due to the simple model structure,
we are able to solve analytically for the optimal robust policy rule, and we
separately analyze the effects of robustness against misspecification concerning
the determination of inflation, output and the exchange rate. We show that
an increased central bank preference for robustness makes monetary policy
respond more aggressively or more cautiously to shocks, depending on the
type of shock and the source of misspecification.

Kai Leitemo (Norwegian School of Management) and Ulf Soderstrom (IGIER and Bocconi University)
Keywords: Knightian uncertainty, model uncertainty, robust control, minmax policies
2005 - n° 289

This paper introduces underground activities and tax evasion into a one-sector dynamic general equilibrium model with external effects. The model presents a novel mechanism driving the self-fulfilling prophecies, which is triggered by the reallocation of resources to the underground sector to avoid the excess tax burden. This mechanism differs from the customary one, and it is complementary to it. In addition, the explicit introduction of an (even tiny) underground sector makes it possible to reduce the aggregate degree of increasing returns required for indeterminacy and to obtain well-behaved input demand schedules (in the sense that they slope down).

Journal of Economic Literature Classification Numbers: O40, E260

Francesco Busato (University of Aarhus), Bruno Chiarini (University of Naples, Parthenope) and Enrico Marchetti (La Sapienza, Rome)
Keywords: Indeterminacy and Sunspots, Tax Evasion and Underground Activities
2005 - n° 288

A central problem for the game theoretic analysis of voting is that voting games
have very many Nash equilibria. In this paper, we consider a new refinement
concept for voting games that combines two ideas that appear reasonable for voting
games: First, trembling hand perfection (voters sometimes make mistakes when
casting their vote) and second, coordination of voters with similar interests. We
apply this refinement to an analysis of multicandidate elections under plurality rule
and runoff rule.
For plurality rule, we show that our refinement implies Duverger's law: In all
equilibria, (at most) two candidates receive a positive number of votes. For the case
of 3 candidates, we can completely characterize the set of equilibria. Often, there
exists a unique equilibrium satisfying our refinement; surprisingly, this is even true
if there is no Condorcet winner. We also consider the equilibria under a runoff rule
and analyze when plurality rule and runoff rule yield different outcomes.

Matthias Messner and Mattias Polborn
Keywords: strategic voting, runoff rule, plurality rule, equilibrium refinement, trembling hand perfection, coalition-proofness
2005 - n° 287

Building on recent work on dynamic interactive epistemology, we
extend the analysis of extensive-form psychological games (Geanakoplos,
Pearce & Stacchetti, Games and Economic Behavior, 1989) to
include conditional higher-order beliefs and enlarged domains of pay-off
functions. The approach allows modeling dynamic psychological
effects (such as sequential reciprocity, psychological forward induction,
and regret) that are ruled out when epistemic types are identified with
hierarchies of initial beliefs. We define a notion of psychological sequential
equilibrium, which generalizes the sequential equilibrium notion for
traditional games, for which we prove existence under mild assumptions.
Our framework also allows us to directly formulate assumptions about
"dynamic" rationality and interactive beliefs in order to explore strategic
interaction without assuming that players' beliefs are coordinated on an
equilibrium. In particular, we provide an exploration of (extensive-form)
rationalizability in psychological games.

Pierpaolo Battigalli (Bocconi University) and Martin Dufwenberg (University of Arizona)
Keywords: psychological games, belief-dependent motivation, extensive-form solution concepts, dynamic interactive epistemology
2005 - n° 286

We provide an updated summary guide for the construction, use and evaluation of
leading indicators, and an assessment of the most relevant recent developments in this
field of economic forecasting. To begin with, we analyze the problem of selecting a
target coincident variable for the leading indicators, which requires coincident indicator
selection, construction of composite coincident indexes, choice of filtering methods,
and business cycle dating procedures to transform the continuous target into a binary
expansion/recession indicator. Next, we deal with criteria for choosing good leading
indicators, and simple non-model based methods to combine them into composite indexes.
Then, we examine models and methods to transform the leading indicators into
forecasts of the target variable. Finally, we consider the evaluation of the resulting
leading indicator based forecasts, and review the recent literature on the forecasting
performance of leading indicators.

Massimiliano Marcellino (Università Bocconi and IGIER)
Keywords: Business Cycles, Leading Indicators, Coincident Indicators, Turning Points, Forecasting
2005 - n° 285

Iterated multiperiod ahead time series forecasts are made using a one-period ahead model, iterated forward for the desired number of periods, whereas direct forecasts are made using a horizon-specific estimated model, where the dependent variable is the multi-period ahead value being forecasted. Which approach is better is an empirical matter: in theory, iterated forecasts are more efficient if correctly specified, but direct forecasts are more robust to model misspecification. This paper compares empirical iterated and direct forecasts from linear univariate and bivariate models by applying simulated out-of-sample methods to 171 U.S. monthly macroeconomic time series spanning 1959 - 2002. The iterated forecasts typically outperform the direct forecasts, particularly if the models can select long lag specifications. The relative performance of the iterated forecasts improves with the forecast horizon.

Massimiliano Marcellino (Università Bocconi and IGIER), James Stock (Harvard University and NBER) and Mark Watson (Princeton University and NBER)
Keywords: multistep forecasts, VAR forecasts, forecast comparisons
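
The following minimal Python sketch (not taken from the paper; the AR(1) data, horizon and variable names are illustrative) contrasts the two approaches described in the abstract above: an iterated forecast obtained by estimating a one-step model and iterating it forward, and a direct forecast obtained by regressing the h-step-ahead value on the current one.

# Minimal sketch: iterated vs. direct h-step-ahead forecasts from an AR(1).
import numpy as np

rng = np.random.default_rng(0)

# Simulate an AR(1) series as a stand-in for a macroeconomic variable.
T, phi = 300, 0.7
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi * y[t - 1] + rng.normal()

h = 12  # forecast horizon, e.g. 12 months ahead

def ols(x, z):
    # Intercept and slope from regressing z on x (with a constant).
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, z, rcond=None)
    return beta

# Iterated: estimate the one-step model y_t = c + b*y_{t-1}, iterate h times.
c1, b1 = ols(y[:-1], y[1:])
f_iter = y[-1]
for _ in range(h):
    f_iter = c1 + b1 * f_iter

# Direct: regress y_{t+h} on y_t and apply the horizon-specific fit once.
ch, bh = ols(y[:-h], y[h:])
f_direct = ch + bh * y[-1]

print(f"iterated {h}-step forecast: {f_iter:.3f}")
print(f"direct   {h}-step forecast: {f_direct:.3f}")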
2005 - n° 284

We analyse the panel of the Greenbook forecasts (sample 1970-1996) and a
large panel of monthly variables for the US (sample 1970-2003) and show that
the bulk of dynamics of both the variables and their forecasts is explained by two
shocks. Moreover, a two factor model which exploits, in real time, information
on many time series to extract a two dimensional signal, produces a degree of
forecasting accuracy of the federal funds rate similar to that of the markets, and,
for output and inflation, similar to that of the Greenbook forecasts. This leads us
to conclude that the stochastic dimension of the US economy is two. We also show
that dimension two is generated by a real and nominal shock, with output mainly
driven by the real shock and inflation by the nominal shock. The implication is
that, by tracking any forecastable measure of real activity and price dynamics, the
Central Bank can track all fundamental dynamics in the economy.

Domenico Giannone (ECARES, Université Libre de Bruxelles), Lucrezia Reichlin (ECARES, Université Libre de Bruxelles and CEPR) and Luca Sala (IGIER and IEP, Università Bocconi)
2005 - n° 283

How does the relationship between an investor and entrepreneur depend on the legal
system? In a double moral hazard framework, we show how optimal contracts,
corporate governance, and investor actions depend on the legal system. With better
legal protection, investors give more non-contractible support, demand more downside
protection, and exercise more governance. Investors in better legal systems develop
stronger governance and support competencies. Therefore, when investing in a different
legal system, they behave differently than local investors. We test these predictions
using a hand-collected dataset of European venture capital deals. The empirical
results confirm the predictions of the model.

Laura Bottazzi, Marco Da Rin and Thomas Hellmann
2005 - n° 282

We employ Markov-switching regression methods to estimate fiscal policy feedback rules
in the U.S. for the period 1960-2002. Our approach allows us to capture policy regime changes
endogenously. We reach three main conclusions. First, fiscal policy may be characterized,
according to Leeper's (1991) terminology, as active from the 1960s throughout the 1980s, switching
gradually to passive in the early 1990s and switching back to active in early 2001. Second,
regime-switching fiscal rules are capable of tracking the time-series behaviour of the U.S. primary
deficit better than rules based on a constant parameter specification. Third, regime-switches in
monetary and fiscal policy rules do not exhibit any degree of synchronization. Our results are
at odds with the view that the post-war U.S. fiscal policy regime may be classified as passive at
all times, and seem to pose a challenge for the specification of the correct monetary-fiscal mix
within recent optimizing macroeconomic models considered suitable for policy analysis.

Carlo Favero (IGIER, Università Bocconi and CEPR) and Tommaso Monacelli (IGIER, Università Bocconi and CEPR)
Keywords: active and passive fiscal policy rule, Markov-switching estimation, monetary policy rule
2005 - n° 281

We explore the determinants of yield differentials between sovereign bonds in the Euro
area. There is a common trend in yield differentials, which is correlated with a measure
of the international risk factor. In contrast, liquidity differentials display sizeable heterogeneity
and no common factor. We present a model that predicts that yield differentials
should increase in both liquidity and risk, with an interaction term whose magnitude and
sign depends on the size of the liquidity differential with respect to the reference country.
Testing these predictions on daily data, we find that the international risk factor is consistently
priced, while liquidity differentials are priced only for a subset of countries and
their interaction with the risk factor is crucial to detect their effect.

Carlo Favero, Marco Pagano and Ernst-Ludwig von Thadden
2005 - n° 280

This paper brings together two strands of the empirical macro literature:
the reduced-form evidence that the yield spread helps in forecasting output
and the structural evidence on the difficulties of estimating the effect of monetary
policy on output in an intertemporal Euler equation. We show that
including a short-term interest rate and inflation in the forecasting equation
improves the forecasting performance of the spread for future output but the
coefficients on the short rate and inflation are difficult to interpret using a
standard macroeconomic framework. A decomposition of the yield spread
into an expectations-related component and a term premium allows a better
understanding of the forecasting model. In fact, the best forecasting model for
output is obtained by considering the term premium, the short-term interest
rate and inflation as predictors. We provide a possible structural interpretation
of these results by allowing for time-varying risk aversion, linearly related
to our estimate of the term premium, in an intertemporal Euler equation for
output.

Carlo Favero, Iryna Kaminska and Ulf Soderstrom
Keywords: Yield curve, term structure of interest rates, predictability, forecasting, GDP growth, estimated Euler equation
2005 - n° 279

We study optimal monetary policy in two prototype economies with sticky prices and credit
market frictions. In the first economy, credit frictions apply to the financing of the capital stock,
generate acceleration in response to shocks and the financial markup (i.e., the premium on
external funds) is countercyclical and negatively correlated with the asset price. In the second
economy, credit frictions apply to the flow of investment, generate persistence, and the financial
markup is procyclical and positively correlated with the asset price. We model monetary policy
in terms of welfare-maximizing interest rate rules. The main finding of our analysis is that strict
inflation stabilization is a robust optimal monetary policy prescription. The intuition is that, in
both models, credit frictions work in the direction of dampening the cyclical behavior of inflation
relative to its credit-frictionless level. Thus neither economy, despite yielding different inflation
and investment dynamics, generates a trade-off between price and financial markup stabilization.
A corollary of this result is that reacting to asset prices does not bear any independent welfare
role in the conduct of monetary policy.

Ester Faia (Universitat Pompeu Fabra) and Tommaso Monacelli (IGIER, Università Bocconi and CEPR)
Keywords: Optimal monetary policy rules, financial distortions, price stability, asset prices
2005 - n° 278

We provide a long term perspective on the individual retirement behavior
and on the future of early retirement. In a cross-country sample, we
find that total pension spending depends positively on the degree of early
retirement and on the share of elderly in the population, which increase
the proportion of retirees but have hardly any effect on per-capita pension
benefits. We show that in a Markovian political economic theoretical
framework, in which incentives to retire early are embedded, a political
equilibrium is characterized by an increasing sequence of social security
contribution rates converging to a steady state and early retirement. Comparative
statics suggest that aging and productivity slow-downs lead to
higher taxes and more early retirement. However, when income effects
are factored in, the model suggests that periods of stagnation - characterized
by decreasing labor income - may lead middle aged individuals to
postpone retirement.

J. Ignacio Conde-Ruiz (Prime Minister's Economic Bureau and FEDEA), Vincenzo Galasso (IGIER, Universita' Bocconi and CEPR) and Paola Profeta (Universita' di Pavia and Universita' Bocconi)
Keywords: pensions, lifetime income effect, tax burden, politico-economic Markovian equilibrium
2004 - n° 277

Using a structural Vector Autoregression approach, this paper compares the
macroeconomic effects of the three main government spending tools: government
investment, consumption, and transfers to households, both in terms of the size
and the speed of their effects on GDP and its components. Contrary to a common
opinion, there is no evidence that government investment shocks are more
effective than government consumption shocks in boosting GDP: this is true both
in the short and, perhaps more surprisingly, in the long run. In fact, government
investment appears to crowd out private investment, especially in dwellings and in
machinery and equipment. There is no evidence that government investment pays
for itself in the long run, as proponents of the Golden Rule implicitly or explicitly
argue. The positive effects of government consumption itself are rather limited,
and defense purchases have even smaller (or negative) effects on GDP and private
investment. There is also no evidence that government transfers are more effective
than government consumption in stimulating demand.

Roberto Perotti (IGIER, Università Bocconi)
2004 - n° 276

This paper studies the effects of fiscal policy on GDP, inflation and interest rates
in 5 OECD countries, using a structural Vector Autoregression approach. Its main
results can be summarized as follows: 1) The effects of fiscal policy on GDP tend
to be small: government spending multipliers larger than 1 can be estimated only
in the US in the pre-1980 period. 2) There is no evidence that tax cuts work faster
or more effectively than spending increases. 3) The effects of government spending
shocks and tax cuts on GDP and its components have become substantially weaker
over time; in the post-1980 period these effects are mostly negative, particularly on
private investment. 4) Only in the post-1980 period is there evidence of positive
effects of government spending on long interest rates. In fact, when the real interest
rate is held constant in the impulse responses, much of the decline in the response
of GDP in the post-1980 period in the US and UK disappears. 5) Under plausible
values of its price elasticity, government spending typically has small effects on
inflation. 6) Both the decline in the variance of the fiscal shocks and the change
in their transmission mechanism contribute to the decline in the variance of GDP
after 1980.

Roberto Perotti (IGIER, Università Bocconi)
2004 - n° 275

Focusing on signaling games, I illustrate the relevance of the rationalizability
approach for the analysis of multistage games with incomplete
information. I define a class of iterative solution procedures, featuring a
notion of forward induction: the Receiver tries to explain the Sender's
message in a way which is consistent with the Sender's strategic sophistication
and certain given restrictions on beliefs. The approach is applied to
some numerical examples and economic models. In a standard model with
verifiable messages a full disclosure result is obtained. In a model of job
market signaling the best separating equilibrium emerges as the unique
rationalizable outcome only when the high and low types are sufficiently
different. Otherwise, rationalizability only puts bounds on the education
choices of different types.

Pierpaolo Battigalli (Bocconi University and IGIER)
Keywords: incomplete information, signaling, rationalization
2004 - n° 274

This paper suggests that the main (and possibly unique) source of β- and σ- convergence
in GDP per worker (i.e. labor productivity) across Italian regions over the
1980-2000 period is the change in technical and allocative efficiency, i.e. convergence
in relative TFP levels. To reach this conclusion, I construct an approximation of
the production frontier at different points in time using Data Envelope Analysis
(DEA), and measure efficiency as the output-based distance from the frontier. This
method is entirely data-driven, and does not require the specification of any particular
functional form for technology. Changes in GDP per worker can be decomposed
into changes in relative efficiency, changes due to overall technological progress, and
changes due to capital deepening. My results suggest that: (i) differences in relative
TFP are quantitatively important; (ii) while technological progress and capital
deepening are the main, and equally important, forces behind the rightward shift
in the distribution of GDP per worker, convergence in relative TFP is the main
determinant of the change in the distribution's shape.

Marco Maffezzoli (Università Bocconi and IGIER)
Keywords: Italian regions, regional convergence, Total Factor Productivity, Data Envelope Analysis
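
One standard way of writing the tripartite decomposition mentioned in the abstract above is the following (the notation is illustrative and not taken from the paper):

\[
\frac{y_{1}}{y_{0}}
= \underbrace{\frac{e_{1}}{e_{0}}}_{\text{efficiency change}}
\times \underbrace{\frac{\bar{y}_{1}(k_{1})}{\bar{y}_{0}(k_{1})}}_{\text{technological progress}}
\times \underbrace{\frac{\bar{y}_{0}(k_{1})}{\bar{y}_{0}(k_{0})}}_{\text{capital deepening}}
\]

where \(y_t\) is GDP per worker, \(k_t\) the capital-labor ratio, \(\bar{y}_t(k)\) frontier output per worker at capital-labor ratio \(k\) in period \(t\), and \(e_t = y_t/\bar{y}_t(k_t)\) the output-based efficiency level (the distance from the frontier). The three ratios correspond, respectively, to changes in relative efficiency, shifts of the frontier, and movements along the frontier due to capital deepening.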
2004 - n° 273

We study the effects of model uncertainty in a simple New-Keynesian
model using robust control techniques. Due to the simple model structure, we
are able to find closed-form solutions for the robust control problem, analyzing
both instrument rules and targeting rules under different timing assumptions.
In all cases but one, an increased preference for robustness makes monetary
policy respond more aggressively to cost shocks but leaves the response to
demand shocks unchanged. As a consequence, inflation is less volatile and
output is more volatile than under the non-robust policy. Under one particular
timing assumption, however, increasing the preference for robustness has no
effect on the optimal targeting rule (nor on the economy).

Kai Leitemo (Norwegian School of Management) and Ulf Soderstrom (Dept. of Economics and IGIER, Università Bocconi)
Keywords: Knightian uncertainty, model uncertainty, robust control, minmax policies
2004 - n° 272

The existing studies of unemployment benefit and unemployment duration suggest that reforms
that lower either the level or the duration of benefits should reduce unemployment. Despite the
large number of such reforms implemented in Europe in the past decades, this paper presents
evidence that shows no correlation between the reforms and the evolution of unemployment.
This paper also provides an explanation for this fact by exploring the
interactions between unemployment benefits and social assistance programmes. Unemployed
workers who are also eligible, or expect to become eligible, for some social assistance
programmes are less concerned about their benefits being reduced or terminated. They will not
search particularly intensively around the time of benefit exhaustion, nor will they become particularly
less choosy about job offers by reducing their reservation wages. Data from the European
Community Household Panel (ECHP) are used to provide evidence to support this argument.
Results show that, in fact, for social assistance recipients the probability of finding a job is not
particularly higher during the last months of entitlement.

Michele Pellizzari
Keywords: Unemployment duration, unemployment insurance, social assistance
2004 - n° 271

We document the presence of a trade-off between unemployment benefits (UB) and employment protection legislation (EPL) in the provision of insurance against labor market risk. Different countries' locations along this trade-off represent stable, hard to modify, politico-economic equilibria. We develop a model in which voters are required to cast a ballot over the strictness of EPL, the generosity of UBs and the amount of redistribution involved in the financing of unemployment insurance. Agents are heterogeneous along two dimensions: employment status - insiders and outsiders - and skills - low and high. Unlike previous work on EPL, we model employment protection as an institution redistributing among insiders, notably in favour of the low-skill workers. A key implication of the model is that configurations with strict EPL and low UB should emerge in the presence of compressed wage structures. Micro data on wage premia on educational attainments and on the strictness of EPL are in line with our results. We also find empirical support for the substantive assumptions of the model on the effects of EPL.

Tito Boeri (Università Bocconi, IGIER and Fondazione Rodolfo Debenedetti), J. Ignacio Conde-Ruiz (FEDEA) and Vincenzo Galasso (IGIER, Università Bocconi and CEPR)
Keywords: employment protection, unemployment insurance, political equilibria
2004 - n° 270

We study how public policy can contribute to increase the share of early stage and
high-tech venture capital investments, thus helping the development of active venture
capital markets. A simple extension of the seminal model by Holmstrom and Tirole
(1997) provides a theoretical base for our analysis. We then explore a unique panel of
data for 14 European countries between 1988 and 2001. We have several novel findings.
First, the opening of stock markets targeted at entrepreneurial companies positively
affects the shares of early stage and high-tech venture capital investments; reductions
in capital gains tax rates have a similar, albeit weaker, effect. Second, a reduction in
labor regulation results in a higher share of high-tech investments. Finally, we find no
evidence of a shortage of supply of venture capital funds in Europe, and no evidence
of an effect of increased public R&D spending on the share of high-tech or early stage
venture capital investments.

Marco Da Rin (Università di Torino, ECGI and IGIER), Giovanna Nicodano (Università di Torino) and Alessandro Sembenelli (Università di Torino)
Keywords: Venture Capital, Capital Gains Tax, Public R&D Expenditure, Barriers to Entrepreneurship, Stock Markets, Public Policy
2004 - n° 269

The aim of this paper is to propose a new method for forecasting Italian
inflation. We expand on a standard factor model framework (see Stock and
Watson (1998)) along several dimensions. To start with we pay special
attention to the modeling of the autoregressive component of inflation.
Second, we apply forecast combination (Granger (2000) and Pesaran and
Timmermann (2001)) and generate our forecast by averaging the predictions
of a large number of models. Third, we allow for time variation in parameters
by applying rolling regression techniques, with a window of three years of
monthly data. Backtesting shows that our strategy outperforms both the
benchmark model (i.e. a factor model which does not allow for model
uncertainty) and additional univariate (ARMA) and multivariate (VAR)
models. Our strategy proves to improve on alternative models also when
applied to turning point prediction.

Carlo A. Favero (IGIER, Università Bocconi), Ottavio Ricchi (Ministero dell'Economia e delle Finanze) and Cristian Tegami (CONSIP SpA)
2004 - n° 268

This paper integrates a theory of equilibrium unemployment into a monetary model
with nominal price rigidities. The model is used to study the dynamic response of the
economy to a monetary policy shock. The labor market displays search and matching
frictions and bargaining over real wages and hours of work. Search frictions generate unemployment in equilibrium. Wage bargaining introduces a microfounded real wage
rigidity. First, I study a Nash bargaining model. Then, I develop an alternative
bargaining model, which I refer to as right-to-manage bargaining. Both models have
similar predictions in terms of real wage dynamics: bargaining significantly reduces
the volatility of the real wage. But they have different implications for inflation
dynamics: under right-to-manage, the real wage rigidity also results in smaller
fluctuations of inflation. These findings are consistent with recent evidence
suggesting that real wages and inflation only vary by a moderate amount in
response to a monetary shock. Finally, the model can explain important features of
labor-market fluctuations. In particular, a monetary expansion leads to a rise in job
creation and to a hump-shaped decline in unemployment.

Antonella Trigari
Keywords: Monetary Policy, Labor Market Search, Business Cycles, Inflation
2004 - n° 267

This paper explores the quantitative plausibility of three candidate explanations for the
European productivity slowdown with respect to the US. The empirical plausibility of the
common wisdom on the topic (the "IT usage" hypothesis) is found to crucially depend on
how IT-using industries are defined. If a narrow definition is chosen, the IT usage
hypothesis no longer explains the whole of the EU productivity slowdown but just about
55% of it, with the remaining part to be attributed to factors other than IT, as argued in the
IT irrelevance view. No room is left, instead, for IT-producing industries as another potential
vehicle for the US-EU productivity growth gap.

Francesco Daveri (University of Parma and IGIER)
2004 - n° 266

Financial intermediaries can choose the extent to which they want to be active
investors, providing valuable services like advice, support and corporate governance.
We examine the determinants of the decision to become an active financial
intermediary using a hand-collected dataset on European venture capital deals. We
find organizational specialization to be a key driver. Venture firms which are
independent and focused on venture capital alone get more involved with their
companies. The human capital of venture partners is another key driver of active
financial intermediation. Venture firms whose partners have prior business
experience or a scientific education provide more support and governance. These
results have implications for prevailing views of financial intermediation, which largely
abstract from issues of specialization and human capital.

Laura Bottazzi (Bocconi University, IGIER and CEPR), Marco Da Rin (Turin University, ECGI and IGIER) and Thomas Hellmann (University of British Columbia)
Keywords: Venture Capital, Corporate Governance, Human Capital, Organizations
2004 - n° 265

This paper discusses the recent literature on the role of the state in economic development.
It concludes that government incentives to enact sound policies are key to economic success.
It also discusses the evidence on what happens after episodes of economic and political
liberalizations, asking whether political liberalizations strengthen government incentives to
enact sound economic policies. The answer is mixed. Most episodes of economic
liberalizations are indeed preceded by political liberalizations. But the countries that have
done better are those that have managed to open up the economy first, and only later have
liberalized their political system.

Guido Tabellini
Keywords: Growth, Institutions and Democracy
2004 - n° 264

This paper studies empirically the effects of and the interactions amongst economic and
political liberalizations. Economic liberalizations are measured by a widely used indicator
that captures the scope of the market in the economy, and in particular of policies
towards freer international trade (cf. Sachs and Warner 1995, Wacziarg and Welch 2003).
Political liberalizations correspond to the event of becoming a democracy. Using a
difference-in-difference estimation, we ask what are the effects of liberalizations on
economic performance, on macroeconomic policy and on structural policies. The main
results concern the quantitative relevance of the feedback and interaction effects
between the two kinds of reforms. First, we find positive feedback effects between
economic and political reforms. The timing of events indicates that causality is more
likely to run from political to economic liberalizations, rather than vice versa, but we
cannot rule out feedback effects in both directions. Second, the sequence of reforms
matters. Countries that first liberalize and then become democracies do much better
than countries that pursue the opposite sequence, in almost all dimensions.

Francesco Giavazzi (IGIER, Università Bocconi) and Guido Tabellini (IGIER, Università Bocconi)
Keywords: development, democracy, economic reform, growth
2004 - n° 263

We develop a structural model of a small open economy with gradual exchange rate pass-through and endogenous inertia in inflation and output. We then estimate the model by matching the implied impulse responses with those obtained from a VAR model estimated on Swedish data. Although our model is highly stylized it captures very well the responses of output, domestic and imported inflation, the interest rate, and the real exchange rate. However, in order to account for the observed persistence in the real exchange rate and
the large deviations from UIP, we need a large and volatile premium on foreign exchange.

Jesper Lindé (Sveriges Riksbank), Marianne Nessén (Sveriges Riksbank) and Ulf Soderstrom (IGIER and Università Bocconi)
Keywords: Structural open-economy model, new open-economy macroeconomics, estimation, calibration
2004 - n° 262

Firing frictions and renegotiation costs affect worker and firm preferences
for rigid wages versus individualized Nash bargaining in a standard
model of equilibrium unemployment, in which workers vary by
observable skill. Rigid wages permit savings on renegotiation costs and
prevent workers from exploiting the firing friction. For standard calibrations,
the model can account for political support for wage rigidity
by both workers and firms, especially in labor markets for intermediate
skills. The firing friction is necessary for this effect, and reinforces
the impact of both turbulence and other labor market institutions on
preferences for rigid wages.

Tito Boeri and Michael Burda
Keywords: Wage rigidities, job protection, firing taxes, renegotiation costs, equilibrium unemployment
2004 - n° 261

We analyse the evolution of the business cycle in the accession countries, after a careful examination of the seasonal properties of the available series and the required modification of the cycle dating procedures. We then focus on the degree of cyclical concordance within the group of accession countries, which turns out to be in general lower than that between the existing EU countries (the Baltic countries constitute an exception). With respect to the Eurozone, the indications of synchronization are also generally low and lower relative to the position obtaining for countries taking part in previous enlargements (with the exceptions of Poland, Slovenia and Hungary). In the light of the optimal currency area literature, these results cast doubts on the usefulness of adopting the euro in the near future for most accession countries, though other criteria such as the extent of trade and the gains in credibility may point in a different direction.

Michael Artis, Massimiliano Marcellino and Tommaso Proietti
Keywords: Business cycles, dating algorithms, cycle synchronization, EU enlargement, seasonal adjustment
2004 - n° 260

The accession of ten countries into the European Union makes the
forecasting of their key macroeconomic indicators such as GDP
growth, inflation and interest rates an exercise of some importance.
Because of the transition period, only short spans of reliable time series
are available which suggests the adoption of simple time series models
as forecasting tools, because of their parsimonious specification and
good performance. Nevertheless, despite this constraint on the span of
data, a large number of macroeconomic variables (for a given time
span) are available which are of potential use in forecasting, making the
class of dynamic factor models a reasonable alternative forecasting tool.
We compare the relative performance of the two forecasting approaches,
first by means of simulation experiments and then by using data for five
Acceding countries. We also evaluate the role of Euro-area information for
forecasting, and the usefulness of robustifying techniques such as
intercept corrections and second differencing. We find that factor models
work well in general, even though there are marked differences across
countries. Robustifying techniques are useful in a few cases, while
Euro-area information is virtually irrelevant.

Anindya Banerjee, Massimiliano Marcellino and Igor Masten
Keywords: Factor models, forecasts, time series models, Acceding countries
2004 - n° 259

The hazard rate of investment is derived within a real option model, and its properties
are analyzed in order to directly study the relation between uncertainty and investment.
Maximum likelihood estimates of the hazard are calculated using a sample of MNEs that
have invested in Central and Eastern Europe over the period 1990-1998. Employing a
standard, non-parametric specification of the hazard, our measure of uncertainty has a
negative effect on investment, but the reduced-form model is unable to control for nonlinearities
in the relationship. The structural estimation of the option-based hazard is
instead able to account for the non-linearities and exhibits a significant value of waiting,
though the latter is independent from our measure of uncertainty. This finding supports
the existence of alternative channels through which uncertainty can affect investment.

Carlo Altomonte and Enrico Pennings
Keywords: hazard rates, uncertainty, foreign investment
2004 - n° 258

Equilibrium business cycle models typically have fewer shocks than variables.
As pointed out by Altug (1989) and Sargent (1989), if variables are measured with
error, this characteristic implies that the model solution for measured variables has
a factor structure. This paper compares estimation performance for the impulse
response coefficients based on a VAR approximation to this class of models and
an estimation method that explicitly takes into account the restrictions implied
by the factor structure. Bias and mean squared error for both factor based and
VAR based estimates of impulse response functions are quantified using, as data
generating process, a calibrated standard equilibrium business cycle model. We
show that, at short horizons, VAR estimates of impulse response functions are less
accurate than factor estimates while the two methods perform similarly at medium
and long run horizons.

Domenico Giannone, Lucrezia Reichlin and Luca Sala
2004 - n° 257

This paper aims to test some implications of the Fiscal theory of
the price level (FTPL). We develop a model similar to Leeper (1991)
and Woodford (1996), but extended so as to generate real effects of fiscal
policy also in the "Ricardian" regime, via an OLG demographic
structure. We test on the data the predictions of the FTPL as incorporated
in the model. We find that the US fiscal policy in the period
1960-1979 can be classified as "Non-Ricardian", while it has been "Ricardian"
since 1990. According to our analysis, the fiscal theory of the
price level characterizes one phase of the post-war US history.

Luca Sala
Keywords: Fiscal theory of the price level, monetary and fiscal policy interaction, VAR models, fiscal shocks
2004 - n° 256

We use a quantitative model of the U.S. economy to analyze the response
of long-term interest rates to monetary policy, and compare the model results
with empirical evidence. We find that the strong and time-varying yield curve
response to monetary policy innovations found in the data can be explained by
the model. A key ingredient in explaining the yield curve response is central
bank private information about the state of the economy or about its own
target for inflation.

Tore Ellingsen (Stockholm School of Economics) and Ulf Soderstrom (IGIER, Università Bocconi)
Keywords: Term structure of interest rates, yield curve, central bank private information, excess sensitivity
2004 - n° 255

In this paper a simple dynamic optimization problem is solved with the help of
the recursive saddle point method developed by Marcet and Marimon (1999). According
to Marcet and Marimon, their technique should yield a full characterization
of the set of solutions for this problem. We show, though, that while their method
allows us to calculate the true value of the optimization program, not all solutions
which it admits are correct. Indeed, some of the policies which it generates as
solutions to our problem, are either suboptimal or do not even satisfy feasibility.
We identify the reasons underlying this failure and discuss its implications for the
numerous existing applications.

Matthias Messner (Bocconi University and IGIER) and Nicola Pavoni (University College London)
Keywords: Recursive saddle point, recursive contracts, dynamic programming
2004 - n° 254

We analyze welfare maximizing monetary policy in a dynamic general equilibrium two-country
model with price stickiness and imperfect competition. In this context, a typical terms
of trade externality affects policy interaction between independent monetary authorities. Unlike
the existing literature, we remain consistent with a public finance approach through an explicit
consideration of all the distortions that are relevant to the Ramsey planner. This strategy entails
two main advantages. First, it allows an accurate characterization of optimal policy in an economy
that evolves around a steady state which is not necessarily efficient. Second, it allows us to describe
a full range of alternative dynamic equilibria when price setters in both countries are completely
forward-looking and households' preferences are not restricted. We study optimal policy both in
the long-run and in response to shocks, and we compare commitment under Nash competition
and under cooperation. By deriving a second order accurate solution to the policy functions,
we also characterize the welfare gains from international policy cooperation.

Ester Faia (Universitat Pompeu Fabra) and Tommaso Monacelli (IGIER, Università Bocconi and CEPR)
Keywords: optimal monetary policy, Ramsey planner, Nash equilibrium, cooperation, sticky prices, imperfect competition
2004 - n° 253

In this paper we concentrate on the hypothesis that the empirical rejections of the Expectations Theory (ET) of the term structure of interest rates can be caused by improper modelling of expectations. Our starting point is an interesting anomaly found by Campbell and Shiller (1987), who, by taking a VAR approach, abandon the limited-information approach to testing the ET in which realized returns are taken as a proxy for expected returns. We use financial factors and macroeconomic information to construct a test of the theory based on simulating investors' effort to use the model in 'real time' to forecast future monetary policy rates. Our findings suggest that the importance of fluctuations in risk premia in explaining deviations from the ET is reduced when a forecasting model for short-term rates is adopted and a proper evaluation of the uncertainty associated with policy rate forecasts is considered.

Andrea Carriero (IGIER, Università Bocconi), Carlo Favero (IGIER, CEPR and Università Bocconi) and Iryna Kaminska (IGIER, Università Bocconi)
Keywords: Expectations Theory, Macroeconomic Information in Finance
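
To fix ideas on the VAR-based logic behind such tests, the sketch below (synthetic data) implements the Campbell-Shiller step of computing the spread implied by the Expectations Theory from VAR forecasts and comparing it with the observed spread; the series, the VAR order and the discount factor `delta` are illustrative assumptions, not the authors' specification.

```python
# Minimal sketch of a Campbell-Shiller style VAR check of the Expectations Theory.
# Synthetic data stand in for the term spread S_t and the short-rate change dr_t.
import numpy as np
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
T = 400
S = np.zeros(T)   # term spread
dr = np.zeros(T)  # change in the short rate
for t in range(1, T):
    S[t] = 0.8 * S[t - 1] + rng.normal(scale=0.1)
    dr[t] = 0.3 * dr[t - 1] + 0.2 * S[t - 1] + rng.normal(scale=0.1)

z = np.column_stack([S, dr])
A = VAR(z).fit(1).coefs[0]              # VAR(1) coefficient matrix for z_t = (S_t, dr_t)'

delta = 0.98                            # discount factor tied to the long bond's duration (assumption)
k = A.shape[0]
# Perpetuity version of the ET restriction: S*_t = e_dr' delta*A (I - delta*A)^{-1} z_t
M = delta * A @ np.linalg.inv(np.eye(k) - delta * A)
S_theory = (z @ M.T)[:, 1]              # element corresponding to dr, i.e. e_dr' M z_t

print("corr(actual, ET-implied spread):", np.corrcoef(S, S_theory)[0, 1].round(3))
print("sd(ET-implied)/sd(actual):      ", (S_theory.std() / S.std()).round(3))
```
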
2003 - n° 252

Employment protection legislation (EPL) is not enforced uniformly across the board. There are a number of exemptions to the coverage
of these provisions: firms below a given threshold scale and workers with temporary contracts are not subject to the most restrictive provisions. This within-country variation in enforcement allows us to make inferences on the impact of EPL which go beyond the usual cross-country approach. In this paper we develop a simple model which explains why these exemptions are in place to start with. Then we empirically assess the effects of EPL on dismissal probabilities, based on a double-difference approach. Our results are in line with the predictions of the theoretical model. Workers in firms exempted from EPL are more likely to be laid off. We do not observe this effect in the case of temporary workers. There is no effect of the exemption threshold on the growth of firms.

Tito Boeri (Università Bocconi-IGIER) and Juan F. Jimeno (FEDEA and Universidad de Alcalá)
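
As an illustration of the double-difference idea only (not the authors' dataset or estimation), a dismissal indicator can be regressed on an exemption dummy, a post-reform dummy and their interaction; all variable names below are hypothetical and the data are simulated.

```python
# Sketch of a double-difference (diff-in-diff) regression for dismissal probabilities.
# Variable names (dismissed, exempt, post) are illustrative assumptions.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 5000
df = pd.DataFrame({
    "exempt": rng.integers(0, 2, n),   # 1 = worker in a firm exempted from EPL
    "post": rng.integers(0, 2, n),     # 1 = observation after the legislative change
})
# Simulated outcome: exempted firms dismiss more, especially after the change
p = 0.05 + 0.03 * df.exempt + 0.01 * df.post + 0.02 * df.exempt * df.post
df["dismissed"] = rng.binomial(1, p)

X = sm.add_constant(pd.DataFrame({
    "exempt": df.exempt,
    "post": df.post,
    "exempt_x_post": df.exempt * df.post,   # double-difference term of interest
}))
print(sm.OLS(df.dismissed, X).fit(cov_type="HC1").summary().tables[1])
```
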
2003 - n° 251

We present a theoretical model of a parliamentary democracy, where
party structures, government coalitions and fiscal policies are endogenously
determined. The model predicts that, relative to proportional elections, majoritarian
elections reduce government spending because they reduce party
fragmentation and, therefore, the incidence of coalition governments. Party
fragmentation can persist under majoritarian rule if party supporters are
unevenly distributed across electoral districts. Economic and political data,
from up to 50 post-war parliamentary democracies, strongly support our
joint predictions, from the electoral rule to the party system, to the type of
government, and to government spending.

Torsten Persson (IIES, Stockholm University, CEPR and NBER), Gerard Roland (UC Berkeley, CEPR and WDI) and Guido Tabellini (IGIER, Bocconi University, CEPR and CES-Ifo)
Keywords: electoral rules, party systems, coalition governments, fiscal policy, electoral accountability
2003 - n° 250

While there is consensus on the need to raise the time spent in the market by
European women, it is not clear how these goals should be achieved. Tax wedges,
assistance in the job search process, and part-time jobs are policy instruments that
are widely debated in policy circles. The paper presents a simple model of labour
supply with market frictions and heterogenous home production where the effects of
these policies can be coherently analysed. We show that subsidies to labour market
entry increase women's entry into the labour market, but they also increase exits from
the labour market, with an ambiguous effect on employment. Subsidies to part-time work do
increase employment, but they have ambiguous effects on hours and market production.
Finally, reductions in taxes on market activities that are highly substitutable with home
production have unambiguous positive effects on market employment and production.

Pietro Garibaldi (Bocconi University, IGIER and CEPR) and Etienne Wasmer (ECARES, Free University of Brussels, University of Metz and CEPR)
2003 - n° 249

We examine a model of contracting where parties interact repeatedly and can contract
at any point in time, but writing enforceable contracts is costly. A contract can
describe contingencies and actions at a more or less detailed level, and the cost of writing
a contract is proportional to the amount of detail. We consider both formal (externally
enforced) and informal (self-enforcing) contracts. The presence of writing costs has important
implications both for the optimal structure of formal contracts, particularly the
tradeoff between contingent and spot contracts, and for the interaction between formal
and informal contracting. Our model sheds light on these implications and generates a
rich set of predictions about the determinants of the optimal mode of contracting.

Pierpaolo Battigalli (Bocconi University, IEP and IGIER) and Giovanni Maggi (Princeton University)
Keywords: writing costs, contingent vs spot contracting, formal vs informal contracts
2003 - n° 248

This paper presents a simple model of imperfect labor markets with endogenous labor market participation and home production. We show that a two-sector economy (home and market) implies a three-state labor market when labor market imperfections take the form of an irreversible entry cost incurred by workers. This simple framework brings several results. First, it delivers an expression for the employment rate and, as side-products, a measure of the unemployment rate and the size of the labour force. Second, it rationalizes several empirical works on the definition of unemployment in labor force surveys. Third, it derives endogenously all flows between three labour market states. Fourth, a calibration of the model rationalizes differences in employment rates: in the US, we find a market productivity premium of +30% and market frictions of -15% compared to France. Finally, the model is a very simple reduced form of search models with which it is fully consistent: the irreversible entry cost is the opportunity cost of search and depends on aggregate conditions.

Pietro Garibaldi (Università Bocconi, CEPR and fRDB) and Etienne Wasmer (ECARES, Free University of Brussels, University of Metz and CEPR)
2003 - n° 247

The existing literature ignores the fact that in most European countries the
strictness of Employment Protection Legislation (EPL) varies across the firm size
distribution. In Italy firms are obliged to rehire an unfairly dismissed worker only
if they employ more than 15 employees. Theoretically, the paper solves a
baseline model of EPL with threshold effects, and shows that firms close to the
threshold are characterized by an increase in inaction and by a reluctance to
grow. Empirically, the paper estimates transition probability matrices on firm
level employment using a longitudinal data set based on Italian Social Security
(INPS) records, and finds two results. First, firms close to the 15 employees
threshold experience an increase in persistence of 1.5 percent with respect to a
baseline statistical model. Second, firms with 15 employees are more likely to
move backward than upward. Finally, the paper tests the effect of a 1990 reform
which tightened the regulation on individual dismissal only for small firms. It
finds that the persistence of small firms relative to large firms increased
significantly. Overall, these threshold effects are significant and robust, but
quantitatively small.

Pietro Garibaldi (IGIER, Università Bocconi, CEPR and fRDB), Lia Pacelli (Università di Torino, LABORatorio R. Revelli) and Andrea Borgarello (LABORatorio R. Revelli)
Keywords: Employment Protection Legislation, Firm Size
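
A minimal sketch of how such transition probability matrices can be tabulated from a firm-level panel (simulated data and illustrative size classes, not the INPS records used in the paper):

```python
# Sketch: empirical transition matrix across firm-size classes between two years.
# Column names and size bins are illustrative assumptions.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
n_firms = 2000
size_t = rng.integers(5, 30, n_firms)                                # employment in year t
size_t1 = np.clip(size_t + rng.integers(-3, 4, n_firms), 1, None)    # employment in year t+1

bins = [0, 10, 14, 15, 20, np.inf]                  # classes around the 15-employee threshold
labels = ["1-10", "11-14", "15", "16-20", "21+"]
df = pd.DataFrame({
    "class_t": pd.cut(size_t, bins=bins, labels=labels),
    "class_t1": pd.cut(size_t1, bins=bins, labels=labels),
})

# Row-normalized cross-tabulation = estimated transition probabilities
P = pd.crosstab(df.class_t, df.class_t1, normalize="index").round(3)
print(P)
# Persistence (diagonal) and the probability of moving backward vs. forward at the threshold
print("P(stay at 15)      :", P.loc["15", "15"])
print("P(move down | 15)  :", P.loc["15", ["1-10", "11-14"]].sum())
print("P(move up   | 15)  :", P.loc["15", ["16-20", "21+"]].sum())
```
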
2003 - n° 246

We consider a society that has to elect an official who provides a public service
for the citizens. Potential candidates differ in their competence and every potential
candidate has private information about his opportunity cost to perform the task
of the elected official. We develop a new citizen candidate model with a unique
equilibrium to analyze citizens' candidature decisions.
Under some weak additional assumptions, bad candidates run with a higher
probability than good ones, and for unattractive positions, good candidates free-ride
on bad ones. We also analyze the comparative static effects of wage increases
and of the cost of running on the potential candidates' entry decisions.

Matthias Messner (Bocconi University and IGIER) and Matthias Polborn (UWO and University of Illinois)
Keywords: Citizen-candidate model, political economy, private provision of public goods, wage for politicians
2003 - n° 245

This paper examines competition in a liberalized market, with reference to some key features of the natural gas industry. Each firm has a low (zero) marginal cost core capacity, due to long term contracts with take or pay obligations, and additional capacity at higher marginal costs. The market is decentralized and the firms decide which customers to serve, competing then in prices. We show that under both sequential and simultaneous entry, there is a strong incentive to segment the market: when take-or-pay obligations are still to be covered, entering and competing for the same customers implies low margins. If instead a firm is left as a monopolist on a fraction of the market, exhausting its obligation, it has no further incentive to enter a second market, where the rival will be monopolist as well. Hence, we obtain entry without competition. Antitrust ceilings do not prevent such an outcome, while a wholesale pool market induces generalized competition and low margins in the retail segment.

Michele Polo (Università Bocconi, IGIER and SET) and Carlo Scarpa (University of Brescia and SET)
2003 - n° 244
What is the future of social security systems in OECD countries? In our view, the answer
belongs to the realm of politics. We evaluate how political constraints shape the social
security system in six countries - France, Germany, Italy, Spain, the UK and the US -
under population aging. Two main aspects of the aging process are relevant to the
analysis. First, the increase in the dependency ratio - the ratio of retirees to workers
- reduces the average profitability of the unfunded social security system, thereby
inducing the agents to reduce the size of the system by substituting their claims
towards future pensions with more private savings. Second, an aging electorate leads
to larger systems, since it increases the relevance of pension spending on the
policy-makers' agenda. The overall assessment from our simulations is that the political
aspect dominates in all countries, albeit with some differences. Spain, the fastest aging
country, faces the largest increase in the social security contribution rate. When labor
market considerations are introduced, the political effect still dominates, but it is less
sizeable. Country specific characteristics (not accounted for in our simulations), such as
the degree of redistribution in the pension system and the existence of family ties in
the society, may also matter. Our simulations deliver a strong policy implication: an
increase in the effective retirement age always decreases the size of the system chosen
by the voters, while often increasing its generosity. Finally, delegation of pension policy
to the EC may reduce political accountability and hence help to reform the systems.

Vincenzo Galasso (IGIER, Bocconi University and CEPR) and Paola Profeta
Keywords: Political Equilibria, Demographic Dynamics, Retirement Age
2003 - n° 243
We offer a simple explanation for oligopolistic reaction based on Bayesian learning by
rival firms operating in an uncertain environment. We test the implications of the model
through a discrete choice panel data sample of MNEs that have invested in Central and
Eastern Europe over the period 1990-1997. Interacting the measure of rivals' investment
in country-industry pairs with uncertainty, we find strong evidence for oligopolistic reaction,
especially through the channel of Bayesian learning postulated by the model. The
findings are robust with respect to different model specifications.

Carlo Altomonte (Università Bocconi and KU Leuven) and Enrico Pennings (Università Bocconi and IGIER)
Keywords: discrete choice panel data, uncertainty, FDI, oligopolistic reaction
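
The kind of discrete-choice specification described above can be sketched as a probit of the entry decision on rivals' investment, uncertainty and their interaction; the data and variable names below are purely illustrative assumptions, not the authors' panel.

```python
# Sketch: probit of an MNE's entry decision on rivals' investment, uncertainty,
# and their interaction (the channel for oligopolistic reaction).
# Data and variable names are illustrative assumptions.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 3000
df = pd.DataFrame({
    "rivals_inv": rng.poisson(2, n),          # rival investments in the country-industry pair
    "uncertainty": rng.gamma(2.0, 0.5, n),    # e.g. a demand-volatility proxy
})
latent = (-1.0 + 0.2 * df.rivals_inv + 0.15 * df.rivals_inv * df.uncertainty
          - 0.1 * df.uncertainty + rng.normal(size=n))
df["enter"] = (latent > 0).astype(int)

X = sm.add_constant(pd.DataFrame({
    "rivals_inv": df.rivals_inv,
    "uncertainty": df.uncertainty,
    # Interaction term: reaction to rivals is stronger under uncertainty (learning channel)
    "rivals_x_unc": df.rivals_inv * df.uncertainty,
}))
print(sm.Probit(df.enter, X).fit(disp=0).summary().tables[1])
```
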
2003 - n° 242
This paper presents firm-level evidence on the dynamics of the relative demand for non-manual workers in Italian manufacturing during the 1990s. The analysis provides a number of interesting results. First, the rise within firms in the share of non-manual workers in both employment and hours worked (within-firm skill upgrading) is the main determinant of the increase in the relative demand for skilled workers. By contrast, demand changes associated with trade have mitigated such a rise by shifting employment away from skill-intensive firms. Second, while the relative number of hours worked by skilled workers within firms has risen, the hourly wage premium has fallen. Third, within-firm skill upgrading is strongly and significantly related to investment in computers and R&D. Fourth, we find that technical progress has raised the relative productivity of skilled workers (the skill-bias of technical progress is positive). Finally, we show that the standard approach that measures annual, rather than hourly, relative wages produces a downward bias in the estimate of the skill-bias of technical progress.

Paolo Manasse (University of Bologna) and Luca Stanca (University of Milan-Bicocca)
Keywords: wage differentials, skill bias, technical progress, globalization
2003 - n° 241
We present a dynamic comparative advantage model in which moderate reductions in
trade costs can generate sizable increases in trade volumes over time. A fall in trade
costs has two effects on the volume of trade. First, for given factor endowments, it
raises the degree of specialization of countries, leading to a larger volume of trade
in the short run. Second, it raises the factor price of each country's abundant
production factor, leading to diverging paths of relative factor endowments across
countries and a rising degree of specialization. A simulation exercise shows that
a fall in trade costs over time produces a non-linear increase in the trade share of
output as in the data. Even when elasticities of substitution are not particularly
high, moderate reductions in trade costs lead to large trade volumes over time.

Alejandro Cunat (LSE and CEPR) and Marco Maffezzoli (Università Bocconi and IGIER)
Keywords: International Trade, Heckscher-Ohlin
2003 - n° 240
Employees of "globalized" firms face a riskier, but potentially more rewarding,
menu of labor market outcomes. We document this neglected trade-off of
globalization for a sample of Indian manufacturing firms. On the one hand,
the employees of firms subject to foreign competition face a more uncertain
stream of earnings and riskier employment prospects. On the other, they enjoy
a more rapid career and/or have more opportunities to train and upgrade
their skills. The negative uncertainty costs and the positive incentive effects
of globalization are thus twinned. Concentrating on just one side
of the coin gives a misleading picture of globalization.

Francesco Daveri (University of Parma and IGIER), Paolo Manasse (University of Bologna and IGIER) and Danila Serra (London School of Economics)
Keywords: Globalization, Uncertainty, Trade and wages, Wages, Employment, India, Training, Promotions, Labor markets
2003 - n° 239
We document the presence of a trade-off between unemployment benefits (UB) and
employment protection legislation (EPL) in the provision of insurance against labour
market risk. The mix of quantity restrictions and price regulations adopted by the
various countries would seem to correspond to a stable politico-economic equilibrium.
We develop a model in which voters are required to cast a ballot over the strictness of
EPL and over the generosity of UB. Agents are heterogeneous along two dimensions:
employment status - there are insiders and outsiders - and skills - low and high skills.
We show that if there exists a majority of low-skill insiders, the voting game has a
politico-economic equilibrium with low UB and high EPL; otherwise, the equilibrium
features high UB and low EPL. Another testable implication of the model is that a
larger share of elderly workers increases the demand for EPL. Panel data on institutions
and on the age and educational structures of the populations are broadly in line with
our results. We also find that those favouring EPL over UB in a public opinion poll
carried out in 2001 in Italy have precisely the characteristics predicted by our model.

Tito Boeri (Università Bocconi, IGIER, and Fondazione Rodolfo Debenedetti), J.Ignacio Conde-Ruiz (FEDEA) and Vincenzo Galasso (IGIER, Università Bocconi and CEPR)
Keywords: employment protection, unemployment insurance, political equilibria
2003 - n° 238

Policies are typically chosen by politicians and bureaucrats. This paper investigates the criteria that should lead a society to allocate policy tasks to elected policymakers (politicians) or non-elected bureaucrats. Politicians tend to be preferable for tasks that have the following features: they do not involve too much specific technical ability relative to effort; there is uncertainty ex ante about the ex post preferences of the public and flexibility is valuable; time inconsistency is not an issue; small but powerful vested interests do not have large stakes in the policy outcome; effective decisions over policies require taking into account policy complementarities and compensating the losers; the policies imply redistributive conflicts among large groups of voters. The reverse applies to the attribution of prerogatives to bureaucrats.

Alberto Alesina (Harvard University) and Guido Tabellini (Bocconi University)
2003 - n° 237
In this paper we compare alternative approaches for dating the Euro area business cycle and analyzing its characteristics. First, we extend a commonly used dating procedure to allow for length, size and amplitude restrictions, and to compute the probability of a phase change. Second, we apply the modified algorithm for dating both the classical Euro area cycle and the deviation cycle, where the latter is obtained by a variety of methods, including a modified HP filter that reproduces the features of the BK filter but avoids end-point problems, and a production function based approach. Third, we repeat the dating exercise for the main Euro area countries, evaluate the degree of synchronization, and compare the results with the UK and the US. Fourth, we construct indices of business cycle diffusion, and assess how widespread cyclical movements are throughout the economy. Finally, we repeat the dating exercise using monthly industrial production data, to evaluate whether the higher sampling frequency can compensate for the higher variability of the series and produce a more accurate dating.

Michael Artis (European University Institute and CEPR), Massimiliano Marcellino (Università Bocconi, Igier and CEPR) and Tommaso Proietti (Università di Udine and European University Institute)
Keywords: Business cycle, Euro area, cycle dating, cycle synchronization
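
To illustrate the deviation-cycle step only (not the authors' modified dating algorithm), the sketch below extracts an HP-filter cycle from a simulated quarterly series and flags candidate turning points subject to a minimum phase length; the series and all window parameters are illustrative.

```python
# Sketch: HP-filter deviation cycle and naive turning-point detection with a
# minimum phase-length restriction. Illustrative data and parameters.
import numpy as np
from statsmodels.tsa.filters.hp_filter import hpfilter

rng = np.random.default_rng(4)
T = 120  # quarterly observations
log_gdp = np.cumsum(0.005 + 0.01 * np.sin(np.arange(T) / 6) + rng.normal(0, 0.005, T))

cycle, trend = hpfilter(log_gdp, lamb=1600)   # deviation cycle around the HP trend

def turning_points(c, window=2, min_phase=2):
    """Local peaks/troughs of the cycle, at least `min_phase` quarters apart."""
    points, last = [], -min_phase
    for t in range(window, len(c) - window):
        seg = c[t - window: t + window + 1]
        is_peak = c[t] == seg.max()
        is_trough = c[t] == seg.min()
        if (is_peak or is_trough) and t - last >= min_phase:
            points.append((t, "peak" if is_peak else "trough"))
            last = t
    return points

print(turning_points(cycle))
```
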
2003 - n° 236
In this paper we evaluate the relative merits of three approaches to information extraction
from a large data set for forecasting, namely, the use of an automated model selection
procedure, the adoption of a factor model, and single-indicator-based forecast pooling. The
comparison is conducted using a large set of indicators for forecasting US inflation and GDP
growth. We also compare our large set of leading indicators with purely autoregressive
models, using an evaluation procedure that is particularly relevant for policy making. The
evaluation is conducted both ex-post and in a pseudo real time context, for several forecast
horizons, and using both recursive and rolling estimation. The results indicate a preference for
simple forecasting tools, with a good relative performance of pure autoregressive models, and
substantial instability in the leading characteristics of the indicators.

Anindya Banerjee (European University Institute) and Massimiliano Marcellino (IEP-Bocconi University, IGIER)
Keywords: leading indicator, factor model, model selection, GDP growth, inflation
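
A stripped-down pseudo-real-time version of the comparison between an autoregressive benchmark and pooled single-indicator forecasts might look as follows; the data are synthetic and the one-step horizon, lag choices and equal pooling weights are assumptions, not the paper's design.

```python
# Sketch: recursive pseudo-real-time comparison of an AR(1) benchmark with an
# equal-weight pool of single-indicator forecasts. Synthetic data, one-step horizon.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
T, n_ind = 200, 10
indicators = rng.normal(size=(T, n_ind))
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.5 * y[t-1] + 0.3 * indicators[t-1, 0] + rng.normal(scale=0.5)

err_ar, err_pool = [], []
for t in range(100, T - 1):                     # recursive estimation window
    # AR(1) benchmark
    ar = sm.OLS(y[1:t+1], sm.add_constant(y[:t])).fit()
    f_ar = ar.params[0] + ar.params[1] * y[t]
    # One bivariate model per indicator, then equal-weight pooling
    f_ind = []
    for j in range(n_ind):
        X = sm.add_constant(np.column_stack([y[:t], indicators[:t, j]]))
        b = sm.OLS(y[1:t+1], X).fit().params
        f_ind.append(b[0] + b[1] * y[t] + b[2] * indicators[t, j])
    f_pool = np.mean(f_ind)
    err_ar.append(y[t+1] - f_ar)
    err_pool.append(y[t+1] - f_pool)

print("RMSE AR(1):  ", np.sqrt(np.mean(np.square(err_ar))))
print("RMSE pooled: ", np.sqrt(np.mean(np.square(err_pool))))
```
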
2003 - n° 235
In this paper we evaluate the role of a set of variables as leading indicators for Euro-area
inflation and GDP growth. Our evaluation is based on using the variables in the ECB Euro-area
model database, plus a set of similar variables for the US. We compare the forecasting
performance of each indicator with that of purely autoregressive models, using an evaluation
procedure that is particularly relevant for policy making. The evaluation is conducted both ex-post
and in a pseudo real-time context, for several forecast horizons, and using both recursive
and rolling estimation. We also analyze three different approaches to combining the
information from several indicators. First, we discuss the use as indicators of the estimated
factors from a dynamic factor model for all the indicators. Second, an automated model
selection procedure is applied to models with a large set of indicators. Third, we consider
pooling the single indicator forecasts. The results indicate that single indicator forecasts are on
average better than those derived from more complicated methods, but for them to beat the
autoregression a different indicator has to be used in each period. A simple real-time
procedure for indicator-selection produces good results.


Anindya Banerjee (European University Institute), Massimiliano Marcellino (IEP, IGIER, Bocconi University) and Igor Masten (European University Institute)
Keywords: leading indicator, factor model, model selection, GDP growth, inflation
2003 - n° 234
There has been a lot of interest recently in developing small-scale rule-based empirical macro models for the analysis of monetary policy. These models, based on the conventional view that inflation stabilization should be a concern of monetary policy only, have typically neglected the role of fiscal policy. We start with the evidence that a baseline VAR-augmented Taylor rule can deliver recurrent mispredictions of inflation in the U.S. before 1987. We then show that a fiscal feedback rule, in which the primary deficit reacts to both the output gap and the government debt, can well characterize the behavior of fiscal policy throughout the sample. However, by employing Markov-switching methods, we find evidence of substantial instability across fiscal regimes, and this instability occurs precisely before 1987. We then augment the monetary VAR with a fiscal policy rule and control for the endogenous regime switches for both rules. We find that only over time windows belonging to the pre-1987 period does the model based on the two rules predict the behavior of inflation better than the one based on the monetary policy rule alone. After 1987, when fiscal policy is estimated to switch to a regime of fiscal discipline, the monetary-fiscal mix can be appropriately described as a regime of monetary dominance. Over this period a model based on a monetary policy rule alone is always a better predictor of inflation than one comprising both a monetary and a fiscal rule.

Carlo A. Favero (IGIER, Università Bocconi and CEPR)
Keywords: Monetary and Fiscal Policy Rules, Markov Switching, Inflation
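
The regime-switching ingredient can be illustrated with a small sketch: a fiscal rule in which the primary deficit responds to the output gap and to debt, estimated with Markov-switching coefficients. The data and variable names are synthetic assumptions, not the paper's estimates, and for brevity debt enters contemporaneously rather than with a lag.

```python
# Sketch: Markov-switching estimation of a fiscal feedback rule of the form
#   deficit_t = a(s_t) + b(s_t) * gap_t + c(s_t) * debt_t + e_t,   s_t in {0, 1}
# Synthetic, illustrative data.
import numpy as np
import pandas as pd
from statsmodels.tsa.regime_switching.markov_regression import MarkovRegression

rng = np.random.default_rng(6)
T = 200
gap = rng.normal(size=T)
debt = np.cumsum(rng.normal(scale=0.2, size=T)) + 50.0
regime = (np.arange(T) > 110).astype(int)            # a single shift, e.g. towards fiscal discipline
deficit = np.where(regime == 0,
                   1.0 - 0.2 * gap + 0.04 * debt,
                   0.2 - 0.5 * gap - 0.01 * debt) + rng.normal(scale=0.3, size=T)

endog = pd.Series(deficit, name="primary_deficit")
exog = pd.DataFrame({"output_gap": gap, "debt": debt})
res = MarkovRegression(endog, k_regimes=2, exog=exog, switching_variance=True).fit()
print(res.summary())

# Smoothed probability of each regime at each date (regime labels are arbitrary)
probs = np.asarray(res.smoothed_marginal_probabilities)
print(probs[-5:].round(2))
```
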
2003 - n° 233
Within a small open economy we derive a tractable framework for the analysis
of the optimal monetary policy design problem as well as of simple feedback
rules. The international relative price channel is emphasized as the one peculiar
to the open economy dimension of monetary policy. Flexibility in
the nominal exchange rate therefore enhances this channel. We first show that a feature
of the optimal policy under commitment, unlike the one under discretion,
is to entail stationary nominal exchange rate and price level. We show that
this property characterizes also a regime of fixed exchange rates. Hence, in
evaluating the desirability of such a regime, this benefit needs to be weighed
against the cost of excess smoothness in the terms of trade. We show that
there exist combinations of the parameter values that make a regime of fixed
exchange rates more desirable than the discretionary optimal policy. When the
economy is sufficiently open, this happens for a high relative weight assigned to
output gap variability in the Central Bank's loss function and for high values of
the elasticity of substitution between domestic and foreign goods. From this we
draw interesting conclusions for a modern version of the optimal currency area
literature.

Tommaso Monacelli (IGIER, Università Bocconi)
Keywords: Optimal monetary policy, commitment, discretion, fixed exchange rates
2003 - n° 232
Do fiscal policy variables - overall spending, revenue, deficits and
welfare-state spending - display systematic patterns in the vicinity of
elections? And do such electoral cycles differ among political systems?
We investigate these questions in a data set encompassing sixty democracies
from 1960-98. Without conditioning on the political system, we find
that taxes are cut before elections, painful fiscal adjustments are postponed
until after the elections, while welfare-state spending displays no
electoral cycle. Our subsequent results show that pre-election tax cuts
are a universal phenomenon. The post-election fiscal adjustments (spending
cuts, tax hikes and rises in surplus) are, however, only present in
presidential democracies. Moreover, majoritarian electoral rules alone are
associated with pre-electoral spending cuts, while proportional electoral
rules are associated with expansions of welfare spending both before and
after elections.

Torsten Persson (IIES,Stockholm University) and Guido Tabellini (IGIER, Bocconi University)
Keywords: Elections, constitution, form of government, electoral rules, fiscal policy
2003 - n° 231
We construct and numerically solve a dynamic Heckscher-Ohlin model which, depending
on the distribution of production factors in the world and parameter values, allows for
worldwide factor price equalization or complete specialization. We explore the dynamics
of the model under different parameter values, and relate our theoretical results to the
empirical literature that studies the determinants of countries' income per capita growth
and levels. In general, the model is capable of generating predictions in accordance with
the most important findings in the empirical growth literature. At the same time, it
avoids some of the most serious problems of the (autarkic) neoclassical growth model.

Alejandro Cunat (LSE, CEP, CEPR) and Marco Maffezzoli (IGIER, Università Bocconi)
Keywords: International Trade, Heckscher-Ohlin, Economic Growth, Convergence, Simulation
2003 - n° 230
In this paper we review the recent liberalization process in energy markets promoted by the European Commission in the late Nineties and implemented in all the member countries. The electricity and gas industries are characterized by a predominant role of network infrastructures, and by upstream and downstream segments that can be opened to competition. The key issues that must be addressed to design a liberalization plan include the horizontal and vertical structure of the industry, the access to the transport facilities, the organization of a wholesale market and the development of competition in the liberalized segments. We analyze the liberalization policies in the EU as a two-step approach: the Directives and the national liberalization plans have focussed so far on the goal of creating a level playing field for newcomers through Third Party Access to the network infrastructure, the unbundling of monopolized from competitive activities of the incumbent and the opening of demand. Today, within a heterogeneous picture, all the member countries are implementing this phase. The second step refers to the development of a competitive environment in the liberalized markets, a goal that requires, but is not implied by, the creation of fair entry conditions for newcomers. The reduction of the market power of the incumbent through divestitures and the entry process, and the design of the market rules, are the crucial issues, and neither the Directives nor the national plans have in most cases been very effective on this front. As a result, while we can start appreciating the entry of new operators in both the electricity and the gas industry, the effects on consumers' choice and final prices are rather limited, in particular in the gas industry.
In the second part of the paper we move our attention to the Italian case, describing the national liberalization plans and the policy issues still open. Both the electricity and the gas reforms are more advanced than the minimum standards required by the Directives, and include in some cases interesting innovations. In particular, the Bersani Decree on electricity requires capacity divestitures in generation and adopts a proprietary unbundling of the transport network, while the Letta Decree on gas introduces antitrust ceilings and a very quick schedule towards complete demand opening. Among the more relevant open issues, in the electricity industry the incumbent firm can maintain a market share of 50% in generation, with likely distortions in the wholesale market. There are two possible ways out of this central problem: a "market solution" that requires further reductions in the generation capacity of the dominant firm and an improvement in transborder interconnection capacity, together with the start-up of the wholesale market; and an "administrative solution" that tries to limit the effects of the incumbent's market power on prices by assigning the foreign low-cost energy to some categories of (large) customers and introducing bid caps on prices, while delaying the opening of the wholesale market. It is not clear which choice has been made by the Government, even if the latter emerges from many recent decisions. In the gas industry the insufficient unbundling of the dominant firm is the most serious obstacle to developing competition. The antitrust ceilings may even have perverse effects, with the new firms acting as (upstream) customers and (downstream) competitors of the dominant firm. Moreover, access to international transmission capacity seems a crucial issue. Finally, the nature of competition with take-or-pay contracts suggests that a wholesale market for gas would be necessary. The last open issues are institutional: we argue that the recent assignment of energy policy to the regional level and the prospected reduction of independence of the energy authority are two institutional reforms with a very negative impact on the liberalization process.

Michele Polo (Università di Sassari and IGIER) and Carlo Scarpa (Università di Brescia)
2003 - n° 229
We propose a theory of international agreements on product standards. The key feature of the model is that agreements are viewed as incomplete contracts. In particular, these do not specify standards for products that may arise in the future. One potential remedy to contractual incompleteness is a dispute settlement procedure (DSP) that provides arbitration in states of the world that are not covered by the ex ante agreement. We identify conditions under which a DSP can provide ex-ante efficiency gains, and examine how these gains depend on the fundamentals of the problem. Another potential remedy to contractual incompleteness is given by rigid rules, i.e. rules that are not product-specific. We argue that the nondiscrimination rule is the only rule of this kind that increases ex-ante efficiency for any probability distribution over potential products. Finally we show that, under relatively weak conditions, the optimal ex-ante agreement is structured in three parts: (i) a set of clauses that specify standards for existing products; (ii) a rigid nondiscrimination rule, and (iii) a dispute settlement procedure. Although the model focuses on the case of product standards, the analysis suggests a more general incomplete-contracting theory of trade agreements.

Pierpaolo Battigalli (Bocconi University and IGIER) and Giovanni Maggi (Princeton University and NBER)
Keywords: Trade Agreements, Standards, Incomplete contracts, Dispute Settlement Procedure, Nondiscrimination
2003 - n° 228
We study the effects on the optimal monetary policy design problem of allowing for deviations from the law of one price in import goods prices. We reach three basic results. First, we show that incomplete pass-through renders the analysis of monetary policy of an open economy fundamentally different from the one of a closed economy, unlike canonical models with perfect pass-through which emphasize a type of isomorphism. Second, and in response to efficient productivity shocks, incomplete pass-through has the effect of generating endogenously a short-run tradeoff between the stabilization of inflation and of the output gap. This holds independently of the measure of inflation being targeted by the monetary authority. Third, in studying the optimal program under commitment relative to discretion, we show that the former entails a smoothing of the deviations from the law of one price, in stark contrast with the established empirical evidence. In addition, an optimal commitment policy always requires, relative to discretion, more stable nominal and real exchange rates.

Tommaso Monacelli (IGIER, Bocconi University)
Keywords: deviations from the law of one price, policy trade-off, gains from commitment, exchange rate channel
2003 - n° 227
The extraordinary success of the U.S. economy and the parallel growth slowdown of the large European countries and Japan in the 1990s have a simple rationale. The United States has eventually benefited from the effective adoption of information technologies. The introduction of the newly installed IT capital has not instead enhanced aggregate capital accumulation and TFP growth in Europe and Japan. At least on impact, IT capital has mainly displaced existing capital and methods of production rather than supplementing them. The limited growth-enhancing effects from information technologies in countries other than the United States have occurred in the IT-producing sectors, while the IT-using industries have contributed the bulk of productivity gains in the United States.

Francesco Daveri (Università di Parma, and IGIER)
Keywords: Labor productivity growth, G-7, Information technology, Sector productivity
2002 - n° 226
This paper presents evidence on the geographical dimension of the IT revolution in the U.S. economy. BEA and Census data show that, although neither IT diffusion nor the productivity revival was geographically narrow, the matching of the two across the U.S. states has been far from perfect. The late 1990s productivity acceleration mostly occurred in those states specialized in the production of IT goods & services as well as of non-IT durable goods. When those states are excluded from the sample, the remaining states do not exhibit any significant acceleration in productivity. In particular, the association between productivity gains and IT use is at best weak at the state level. This contrasts with previous aggregate and sector evidence, where the importance of both IT production and use was stressed.

Francesco Daveri and Andrea Mascotto
Keywords: Productivity growth, US states, IT revolution, Information technology, USA
2002 - n° 225
We derive a set of stylized facts on the effects of non-systematic fiscal policy in the four largest countries of the Euro area, and discuss their implications for the fiscal policy coordination debate, for the effectiveness of fiscal shocks in stabilizing the economies, and for the interaction of fiscal and monetary policy. We find relevant differences across countries in the effects of non-systematic fiscal policy, and substantial uncertainty about the size of these effects, which casts doubts on the possibility of fiscal coordination. Moreover, expenditure shocks are usually rather ineffective in increasing output growth or reducing its volatility, and can require deficit financing. Tax policies also appear to have minor effects on output, and tax cuts could also require deficit financing. Finally, fiscal shocks appear to have an impact on interest rates, either directly or through the output gap and inflation, while, in general, the effects of monetary policy on disbursements and receipts seem to be minor.

Massimiliano Marcellino (Istituto di Economia Politica, Università Bocconi, IGIER)
Keywords: Fiscal policy, Policy coordination, Stabilization policy, Monetary policy
2002 - n° 223
Two competing methods have been recently developed to estimate large-scale dynamic factor models based, respectively, on static and dynamic principal components. In this paper we use two large datasets of macroeconomic variables for the US and for the Euro area to evaluate in practice the relative performance of the two approaches to factor model estimation. The comparison is based both on the relative goodness of fit of the models, and on the usefulness of the factors when used in the estimation of forward looking Taylor rules, and as additional regressors in monetary VARs. It turns out that dynamic principal components provide a more parsimonious summary of the information, but the overall performance of the two methods is very similar, in particular when a common information set is adopted. Moreover, the information extracted from the large datasets turns out to be quite useful for the empirical analysis of monetary policy.

Carlo Ambrogio Favero (IEP-Bocconi University, IGIER and CEPR), Massimiliano Marcellino (IEP-Bocconi University, IGIER and CEPR) and Francesca Neglia (IGIER)
Keywords: factor model, principal component, Taylor rule, monetary shock
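
For concreteness, the static principal-components step can be sketched as follows on a synthetic panel (not the US or Euro-area datasets used in the paper): standardize the series, take the leading eigenvectors of the sample covariance matrix, and use the resulting factors as additional regressors or instruments.

```python
# Sketch: static principal-components estimation of common factors from a large
# macro panel (synthetic data). The factors could then enter Taylor rules or VARs.
import numpy as np

rng = np.random.default_rng(7)
T, N, r = 200, 120, 2                     # time periods, variables, number of factors
F_true = rng.normal(size=(T, r))          # latent factors
Lambda = rng.normal(size=(N, r))          # loadings
X = F_true @ Lambda.T + rng.normal(scale=1.0, size=(T, N))

# Standardize each series, then take the r leading principal components
Z = (X - X.mean(axis=0)) / X.std(axis=0)
eigval, eigvec = np.linalg.eigh(Z.T @ Z / T)
order = np.argsort(eigval)[::-1]
V = eigvec[:, order[:r]]                  # leading eigenvectors (static loadings up to rotation)
factors = Z @ V                           # estimated static factors, shape (T, r)

# The factor space is identified only up to rotation: check the fit of the true factors
coef, *_ = np.linalg.lstsq(factors, F_true, rcond=None)
resid = F_true - factors @ coef
print("R^2 of true factors on estimated factors:",
      1 - resid.var(axis=0) / F_true.var(axis=0))
```
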
2002 - n° 222
Has the spurt of IT-centered innovations of the 1990s resulted in sizably higher productivity growth? This question, first raised in the US and later on in Europe and the rest of the world, has not been given a firm answer yet. This paper adds to the evidence on Europe by looking at a seemingly ideal new economy laboratory, i.e. the sectors of Finland. We find three main results. First, Nokia was absolutely crucial in getting it all started. Second, much the same as in the US, TFP gains spilled over onto only a few other sectors, and cyclical factors did play a role in boosting productivity in the second part of the 1990s. Third, nevertheless, the timing and the sector distribution of productivity gains are strongly and negatively related to the dynamics of the machinery and equipment sector price deflator. This suggests that productivity gains cannot simply be the side effect of fortunate cyclical circumstances.

Francesco Daveri (Università di Parma and IGIER) and Olmo Silva (European University Institute)
Keywords: Productivity growth, Total Factor Productivity, Information technology, Finland, Europe
2002 - n° 221
Recent financial research has provided evidence on the predictability of asset returns. In this paper we consider the results contained in Pesaran and Timmermann (1995), which provided evidence on predictability over the sample 1959-1992. We show that the extension of the sample to the nineties weakens considerably the statistical and economic significance of the predictability of stock returns based on earlier data. We propose an extension of their framework, based on the explicit consideration of model uncertainty under rich parameterizations for the predictive models.
We propose a novel methodology to deal with model uncertainty based on thick modelling, i.e. on considering a multiplicity of predictive models rather than a single predictive model. We show that portfolio allocations based on a thick modelling strategy systematically outperform thin modelling.

Marco Aiolfi (Bocconi University) and Carlo Ambrogio Favero (Bocconi University and CEPR)
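
A minimal sketch of the thick- versus thin-modelling comparison (synthetic data; not the paper's predictive models or portfolio rules): estimate many small predictive regressions on different predictor subsets, then compare the forecast of the single best in-sample model with the average forecast over a sizeable set of models.

```python
# Sketch: thin vs thick modelling for return prediction. For each date we estimate
# many small predictive regressions on different predictor subsets, then either
# keep the best in-sample model (thin) or average the forecasts (thick).
import itertools
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)
T, K = 240, 6
X = rng.normal(size=(T, K))                       # candidate predictors
ret = 0.2 * X[:, 0] - 0.1 * X[:, 1] + rng.normal(scale=1.0, size=T)

subsets = [list(c) for k in (1, 2) for c in itertools.combinations(range(K), k)]
err_thin, err_thick = [], []
for t in range(120, T - 1):
    fits = []
    for s in subsets:
        Z = sm.add_constant(X[:t, s])
        m = sm.OLS(ret[1:t+1], Z).fit()           # predict ret_{u+1} with X_u
        fcast = float(np.dot(np.r_[1.0, X[t, s]], m.params))
        fits.append((m.rsquared, fcast))
    fits.sort(key=lambda p: p[0], reverse=True)
    f_thin = fits[0][1]                                        # best in-sample model only
    f_thick = np.mean([f for _, f in fits[: len(fits) // 2]])  # average over the top half
    err_thin.append(ret[t+1] - f_thin)
    err_thick.append(ret[t+1] - f_thick)

print("RMSE thin :", np.sqrt(np.mean(np.square(err_thin))))
print("RMSE thick:", np.sqrt(np.mean(np.square(err_thick))))
```
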
2002 - n° 220
Recent empirical evidence suggests a negative relationship between trade integration and income per capita convergence. We show that moderate reductions in trade costs can generate sizable increases in income per capita divergence in a neoclassical two-country model of trade and growth. The welfare of both countries, however, rises with trade integration due to changes in their consumption time paths. Our setup sheds light on the striking nonlinear growth in the trade share of output since World War II: a linear fall in trade costs over time produces an exponential increase in the trade share of GDP. Concerning the empirical relationship between openness and technological progress, we perform an exercise that cautions against the use of aggregate production functions to obtain Solow residuals: two countries that reduce their trade costs and experience no technological progress are measured to have positive TFP growth rates if an aggregate production function is used for that purpose.

Alejandro Cunat (LSE, CEP and CEPR) and Marco Maffezzoli (IEP - Università Bocconi)
Keywords: International Trade, Heckscher-Ohlin, Economic Growth
2002 - n° 219
A feature of new economic geography models is their mathematical intractability. This intractability results from the fact that the functional relationship between the indirect utility differential and the state variable cannot be found explicitly. We illustrate three methods that can be used to approximate the unknown function. These methods are simple and give a remarkable improvement in the precision of approximation with respect to the commonly used Lagrange approximation. Precision of approximation is important in models that feature catastrophic behavior. We apply these methods to the core-periphery model. Naturally, they can be applied to all cases of unknown functional relationships.

Marco Maffezzoli (Istituto di Economia Politica, Università Bocconi) and Federico Trionfetti (King's College London and CEPII, Paris)
Keywords: projection methods, spatial models, economic geography
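
As a small, generic illustration of the payoff from better approximation methods (not the specific core-periphery application or the three methods of the paper), a Chebyshev fit at Chebyshev nodes typically improves markedly on a low-order polynomial fit at equispaced points; the target function below is an arbitrary stand-in for the unknown relationship.

```python
# Sketch: approximating an unknown function on an interval with a Chebyshev
# polynomial fitted at Chebyshev nodes, versus a low-order polynomial fit on
# equispaced points. The target function is illustrative.
import numpy as np
from numpy.polynomial import chebyshev as C

f = lambda x: np.tanh(8 * (x - 0.3)) + 0.2 * x**2   # stand-in for the unknown relationship

# Chebyshev nodes on [-1, 1] and a degree-15 Chebyshev fit
deg = 15
nodes = np.cos((2 * np.arange(deg + 1) + 1) * np.pi / (2 * (deg + 1)))
cheb_coef = C.chebfit(nodes, f(nodes), deg)

# Naive alternative: low-order ordinary polynomial on equispaced points
xs_eq = np.linspace(-1, 1, 6)
poly_coef = np.polyfit(xs_eq, f(xs_eq), 5)

grid = np.linspace(-1, 1, 1001)
err_cheb = np.max(np.abs(C.chebval(grid, cheb_coef) - f(grid)))
err_poly = np.max(np.abs(np.polyval(poly_coef, grid) - f(grid)))
print("max error, Chebyshev collocation:", err_cheb)
print("max error, low-order polynomial :", err_poly)
```
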
2002 - n° 218
The creation of Europe's new stock markets represents a major experiment in market design with important implications for the ability to support innovative, fast-growing companies. We evaluate the success of these markets based on a large number of measures of firm performance and strategy which extend to several pre- and post-listing years. Our hand-collected database is obtained from the listing prospectuses and annual reports of 538 companies which listed on the Neuer Markt, Nouveau Marché, and Nuovo Mercato from 1996 through 2001. Three findings stand out. First, these companies experience a dramatic change after the IPO, rebalancing their capital structure, increasing their debt and investment, accelerating growth, and becoming less profitable. These changes are consistent with the existence of credit constraints, and are greater than for companies listing on the main markets. Second, we document considerable variation in post-IPO growth rates and corporate strategy, across both companies and markets. This variation is largely due to the ability to raise equity capital at the IPO. Third, the adoption of US GAAP accounting standards substantially increases firms' ability to raise capital. While Europe's new markets have provided high-growth companies with an unprecedented opportunity to finance their growth, the adoption (and enforcement) of tighter standards of disclosure is crucial for their success.

Laura Bottazzi (Università Bocconi, IGIER and CEPR) and Marco Da Rin (Università di Torino and IGIER)
Keywords: Initial Public Offerings (IPOs), Corporate disclosure, Going public, Stock markets, Ownership, Operating performance, Accounting standards
2002 - n° 217

In a Common Currency Area (CCA) the Common Central Bank sets a uniform rate of inflation across countries, taking into account the area's economic conditions. If countries in recession favor a more expansionary policy than countries in expansion, a conflict of interest between members arises when national business cycles are not fully synchronized. If governments of member countries have an informational advantage over the state of their domestic economy, such a conflict may create an adverse selection problem: national authorities overemphasize their shocks in order to shape the common policy towards their needs. This creates an inefficiency over and above the one-policy-fits-all cost discussed in the optimal currency area literature. In order to minimize this extra burden of asymmetric information, monetary policy must over-react to large symmetric shocks and under-react to small asymmetric ones. The result is sub-optimal volatility of inflation.

Laura Bottazzi (Università Bocconi, IGIER and CEPR) and Paolo Manasse (Università di Bologna and IGIER)
2002 - n° 216

After the creation of the European Monetary Union (EMU), both the European Commission (EC) and the European Central Bank (ECB) are focusing more and more on the evolution of the EMU as a whole, rather than on single member countries. A particularly relevant issue from a policy point of view is the availability of reliable forecasts for the key macroeconomic variables. Hence, both the fiscal and the monetary authorities have developed aggregate forecasting models, along the lines previously adopted for the analysis of single countries. A similar approach will likely be followed in empirical analyses on, e.g., the existence of an aggregate Taylor rule or the evaluation of the aggregate impact of monetary policy shocks, where linear specifications are usually adopted. Yet, it is uncertain whether standard linear models provide the proper statistical framework to address these issues. The process of aggregation across countries can produce smoother series, better suited for analysis with linear models, by averaging out country-specific shocks. But the method of construction of the aggregate series, which often involves time-varying weights, and the presence of common shocks across the countries, such as the deflation in the early 1980s and the convergence process in the early 1990s, can introduce substantial non-linearity into the generating process of the aggregate series. To evaluate whether this is the case, we fit a variety of non-linear and time-varying models to aggregate EMU macroeconomic variables, and compare them with linear specifications. Since non-linear models often over-fit in sample, we assess their performance in a real-time forecasting framework. It turns out that for several variables linear models are beaten by non-linear specifications, a result that questions the use of standard linear methods for forecasting and modeling EMU variables.

Massimiliano Marcellino (IEP- Bocconi University, IGIER and CEPR)
Keywords: European Monetary Union, Forecasting, Time-varying models, Non-linear models, Instability, Non-linearity
2002 - n° 215
The aim of this paper is to estimate the effect of research externalities across space in generating innovation. We do so by using R&D and patent data for eighty-six European regions over the 1977-1995 period. We find that spillovers exist for regions within a distance of 300 km from each other. The estimates are robust to simultaneity, omitted variable bias, different specifications of distance functions, and country and border effects. The size of these spillovers is small, though. Doubling R&D spending in a region would increase the output of new ideas in other regions within 300 km only by 2-3%, while it would increase the innovation of the region itself by 80-90%. Given the small size and the limited range of diffusion, we interpret these externalities as the result of local diffusion of non-codified knowledge, embodied in people and spreading via personal contacts. This interpretation is reinforced by the finding that the spillovers are somewhat weaker across national borders.

Laura Bottazzi (Università Bocconi, IGIER and CEPR) and Giovanni Peri (University of California, Davis and CESifo)
Keywords: Innovation, R&D Spillovers, Europe, Regions
2002 - n° 214
The objective of this study is to investigate the behaviour of monetary and fiscal authorities in the Euro area. Our main contribution is the joint modelling of the behaviour of the two authorities. Our investigation highlights a number of facts. The systematic monetary policies adopted by the non-German authorities in the seventies were not capable of stabilizing inflation. Such results were achieved in the eighties and the nineties by anchoring domestic monetary policy more tightly to German monetary policy. None of the main episodes of expansionary fiscal policy that occurred in the course of the eighties and the nineties in Europe can be explained by the systematic behaviour of fiscal authorities. Stabilization of inflation has been achieved independently of the lack of fiscal discipline. There are important interactions between the two authorities, but they depend exclusively on the responses of governments' expenditures and receipts to interest rate payments on the public debt.

Carlo Favero (Università Bocconi and IGIER)
2002 - n° 213

Revised version: June 28, 2002

Despite the fast catching-up in ICT diffusion experienced by most EU countries in the last few years, information technologies have so far delivered little productivity gains in Europe. In the second half of the past decade, the growth contributions from ICT capital rose in six EU countries only (the UK, Denmark, Finland, Sweden, Ireland and Greece). Quite unlike the United States, this has not generally been associated with higher labour or total factor productivity growth rates, the only exceptions being Ireland and Greece. Particularly worrisome, the large countries in Continental Europe (Germany, France, Italy and Spain) showed stagnating or mildly declining growth contributions from ICT capital, together with definite declines in TFP growth compared to the first half of the 1990s. It looks as if the celebrated Solow paradox on the lack of correlation between ICT investment and productivity growth has fled the US and migrated to Europe.

Francesco Daveri (Università di Parma and IGIER)
2002 - n° 212

It is rather common to have several competing forecasts for the same variable, and many methods have been suggested to pick the best on the basis of past forecasting performance. As an alternative, the forecasts can be combined to obtain a pooled forecast, and several options are available to select which forecasts should be pooled and how to determine their relative weights. In this paper we compare the relative performance of alternative pooling methods, using a very large dataset of about 500 macroeconomic variables for the countries in the European Monetary Union. In this case the forecasting exercise is further complicated by the short time span available, due to the need to collect a homogeneous dataset. For each variable in the dataset, we consider 58 forecasts produced by a range of linear, time-varying and non-linear models, plus 16 pooled forecasts. Our results indicate that on average combination methods work well. Yet, a more disaggregate analysis reveals that single non-linear models can outperform combination forecasts for several series, even though they perform rather badly for other series, so that on average their performance is not as good as that of pooled forecasts. Similar results are obtained for a subset of unstable series, for which the pooled forecasts behave only slightly better, and for three key macroeconomic variables, namely industrial production, unemployment and inflation.

Massimiliano Marcellino
2002 - n° 211

In this paper we evaluate the relative performance of linear, non-linear and time-varying models for about 500 macroeconomic variables for the countries in the Euro area, using a real-time forecasting methodology. It turns out that linear models work well for about 35% of the series under analysis, time-varying models for another 35%, and non-linear models for the remaining 30% of the series. The gains in forecasting accuracy from the choice of the best model can be substantial, in particular for longer forecast horizons. These results emerge from a detailed disaggregated analysis, while they are hidden when an average loss function is used. To explore in more detail the issue of parameter instability, we then apply a battery of tests, detecting non-constancy in about 20-30% of the time series. For these variables the forecasting performance of the time-varying and non-linear models further improves, with larger gains for a larger fraction of the series. Finally, we evaluate whether non-linear models perform better for three key macroeconomic variables: industrial production, inflation and unemployment. It turns out that this is often the case. Hence, overall, our results indicate that there is a substantial amount of instability and non-linearity in the EMU, and suggest that it can be worth going beyond linear models for several EMU macroeconomic variables.

Massimiliano Marcellino
2002 - n° 210
This paper introduces Heckscher-Ohlin trade features into a two-country DSGE model, and studies the international transmission of productivity shocks through trade in goods. This framework improves upon existing international real business cycle models in generating business cycle properties comparable with the empirical evidence concerning the terms of trade and the trade balance.

Alejandro Cunat (LSE, CEP and CEPR) and Marco Maffezzoli (Istituto di Economia Politica, Università Bocconi)
Keywords: International Trade, Heckscher-Ohlin, Business Cycles, Productivity Shocks
2002 - n° 209

Index tracking requires building a portfolio of stocks (a replica) whose behavior is as close as possible to that of a given stock index. Typically, far fewer stocks should appear in the replica than in the index, and there should be no low-frequency (persistent) components in the tracking error. Unfortunately, the latter property is not satisfied by many commonly used methods for index tracking. These are based on the in-sample minimization of a loss function, but do not take into account the dynamic properties of the index components. Instead, we represent the index components with a dynamic factor model, and develop a procedure that, in a first step, builds a replica that is driven by the same persistent factors as the index. In a second step, it is also possible to refine the replica so that it minimizes a loss function, as in the traditional approach. Both Monte Carlo simulations and an application to the EuroStoxx50 index provide substantial support for our approach.

Francesco Corielli (IMQ-Università Bocconi) and Massimiliano Marcellino (Istituto di Economia Politica, Università Bocconi, IGIER)
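
The two-step logic can be sketched on synthetic returns (this is not the authors' dynamic factor model or the EuroStoxx50 application): extract a common factor from the constituents, select the stocks most exposed to it, and then refine the replica weights by minimizing in-sample tracking error.

```python
# Sketch: factor-based index tracking. Step 1: estimate a common factor from the
# constituents and pick the stocks most exposed to it. Step 2: refine the replica
# weights by minimizing in-sample tracking error via least squares.
import numpy as np

rng = np.random.default_rng(9)
T, N, n_replica = 500, 50, 8
common = rng.normal(size=T)                                  # common (persistent) driver
beta = rng.uniform(0.5, 1.5, N)
returns = np.outer(common, beta) + rng.normal(scale=1.0, size=(T, N))
index = returns.mean(axis=1)                                 # equally weighted index, for simplicity

# Step 1: first principal component of the constituents and exposures to it
Z = returns - returns.mean(axis=0)
_, _, Vt = np.linalg.svd(Z, full_matrices=False)
factor = Z @ Vt[0]
exposures = np.array([np.cov(Z[:, j], factor)[0, 1] for j in range(N)])
chosen = np.argsort(-np.abs(exposures))[:n_replica]          # stocks most driven by the factor

# Step 2: least-squares weights on the chosen stocks to track the index in sample
W, *_ = np.linalg.lstsq(returns[:, chosen], index, rcond=None)
tracking_error = index - returns[:, chosen] @ W
print("chosen stocks:", chosen)
print("in-sample tracking-error std:", tracking_error.std())
```
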
2001 - n° 208
Nowadays a considerable amount of information on the behavior of the economy is readily available, in the form of large datasets of macroeconomic variables. Central bankers can be expected to base their decisions on this very large information set, so that it can be difficult to track their decisions using small models, such as standard Taylor rules. Small scale structural VARs can suffer from a similar problem when used to highlight stylized facts or for policy simulation exercises. On the other hand, large scale structural models are hardly manageable, and still suffer from those identification problems that led to the success of VARs. In this paper we combine recent time-series techniques for the analysis of large datasets with more traditional small scale models to analyze monetary policy in Europe. In particular, we model hundreds of macroeconomic variables with a dynamic factor model, and summarize their informational content with a few estimated factors. These factors are then used as instruments in the estimation of forward looking Taylor rules, and as additional regressors in structural VARs. The latter are then used to evaluate the effects of unexpected and systematic monetary policy.

Carlo Ambrogio Favero (Università Bocconi and IGIER) and Massimiliano Marcellino (Istituto di Economia Politica, Università Bocconi, IGIER)