Working papers
IGIER fellows and affiliates publish books and articles in academic journals. Their current research projects are featured in the Working Paper series.
-
All robust equilibria of plurality voting games satisfy Duverger's Law: In any robust equilibrium, exactly two candidates receive a positive number of votes. Moreover, robustness (only) rules out a victory of the Condorcet loser.
All robust equilibria under runoff rule satisfy Duverger's Hypothesis: First round votes are (almost always) dispersed over more than two alternatives. Robustness has strong implications for equilibrium outcomes under runoff rule: For large parts of the parameter space, the robust equilibrium outcome is unique.
-
'Crises feed uncertainty. And uncertainty affects behaviour, which feeds the crisis.'
Olivier Blanchard, The Economist, January 29, 2009
-
2000 Mathematics Subject Classification: Primary 28A12, 28A25, 46G12; Secondary 91B06
-
Data on people's subjective expectations of returns as well as on their schooling decisions allow me to directly estimate and compare cost distributions of poor and rich individuals. I find that poor individuals require significantly higher expected returns to be induced to attend college, implying that they face higher costs than individuals with wealthy parents. I then test predictions of a model of college attendance choice in the presence of credit constraints, using parental income and wealth as a proxy for the household's (unobserved) interest rate. I find that poor individuals with high expected returns are particularly responsive to changes in direct costs, which is consistent with credit constraints playing an important role. Evaluating potential welfare implications by applying the Local Instrumental Variables approach of Heckman and Vytlacil (2005) to my model, I find that a sizeable fraction of poor individuals would change their decision in response to a reduction in direct costs. Individuals at the margin have expected returns that are as high or higher than the individuals already attending college, suggesting that government policies such as fellowship programs could lead to large welfare gains.
Furthermore, we show how the extremal transfers can be put to use in mechanism design problems where Revenue Equivalence does not hold. To this end we first explore the role of extremal transfers when the agents with type dependent outside options are free to participate in the mechanism. Finally, we consider the question of budget balanced implementation. We show that an allocation rule can be implemented in an incentive compatible, individually rational and ex post budget balanced mechanism if and only if there exists an individually rational extremal transfer scheme that delivers an ex ante budget surplus.
-
facilitated the introduction of structural reforms, defined as deregulation
in the product markets and liberalization and deregulation in the labor
markets. After reviewing the theoretical arguments that may link the
adoption of the Euro and structural reforms, we investigate the empirical
evidence. We find that the adoption of the Euro has been associated with
an acceleration of the pace of structural reforms in the product market.
The adoption of the Euro does not seem to have accelerated labor market
reforms in the "primary labor market;" however, the run up to the Euro
adoption seems to have been accompanied by wage moderation. We also
investigate issues concerning the sequencing of goods and labor market
reforms.
two alternative theories - children as consumption vs. investment good. We use
as a natural experiment the Italian pension reforms of the 90s that introduced a clear
discontinuity in the treatment across workers. This policy experiment is particularly
well suited, since the consumption motive predicts that lower future pensions reduce
fertility, while the old-age security motive predicts that they increase it. Our empirical analysis identifies
a clear and robust positive effect of less generous future pensions on post-reform
fertility. These findings are consistent with the old-age security motive even for contemporary
fertility.
ex-ante optimality requires intergenerational risk sharing. We compare the level
of time-consistent intergenerational risk sharing chosen by a social planner and by office-seeking
politicians. In the political setting, the transfer of resources across generations
- a PAYG pension system - is determined as a Markov equilibrium of a probabilistic
voting game. Negative shocks represented by low realized returns on the risky asset
induce politicians to compensate the old through a PAYG system. Unless the young are
crucial to win the election, this political system generates more intergenerational risk
sharing than the (time consistent) social optimum. In particular, these transfers are
more persistent and less responsive to the realization of the shock than optimal. This is
because politicians anticipate their current transfers to the elderly to be compensated
through offsetting transfers by future politicians, and thus have an incentive to overspend.
Perhaps surprisingly, aging increases the socially optimal transfer but makes
politicians less likely to overspend, by making it more costly for future politicians to
compensate the current young.
We develop and estimate a medium scale macroeconomic model that allows for unemployment
and staggered nominal wage contracting. In contrast to most existing quantitative models,
employment adjustment is on the extensive margin and the employment of existing workers is
efficient. Wage rigidity, however, affects the hiring of new workers. The former is introduced
via the staggered Nash bargaining setup of Gertler and Trigari (2006). A robust finding is that
the model with wage rigidity provides a better description of the data than does a flexible wage
version. Overall, the model fits the data roughly as well as existing quantitative macroeconomic
models, such as Smets and Wouters (2007) or Christiano, Eichenbaum and Evans (2005). More
work is necessary, however, to ensure a robust identification of the key labor market parameters.
and productivity growth in the Italian manufacturing industries in 1995-2003.
Our results indicate that the off-shoring of intermediates within the same
industry (narrow off-shoring) is beneficial for productivity growth, while
the off-shoring of services is not. We also find that the way in which off-
shoring is measured may matter considerably. The positive relation between off-
shoring of intermediates and productivity growth is present with our direct
measures based on input-output data, but it disappears when either a broad measure
or the Feenstra-Hanson off-shoring measure employed in other studies is used
instead.
their preferences concerning an irreversible social decision. Voters can either implement
the project in the first period, or they can postpone the decision to the
second period. We analyze the effects of different majority rules. Individual first
period voting behavior may become "less conservative" under supermajority rules,
and it is even possible that a project is implemented in the first period under a
supermajority rule that would not be implemented under simple majority rule.
We characterize the optimal majority rule, which is a supermajority rule. In
contrast to individual investment problems, society may be better off if the option
to postpone the decision did not exist. These results are qualitatively robust to
natural generalizations of our model.
If successful, the innovative effort allows firms to take new actions that may be ex-post
welfare enhancing (legal) or decreasing (illegal). Deterrence in this setting works by affecting
the incentives to invest in innovation (average deterrence). Type-I errors, through over-
enforcement, discourage innovative effort while type-II errors (under-enforcement) spur it.
The ex-ante expected welfare effect of innovations shapes the optimal policy design. When
innovations are ex-ante welfare improving, laissez-faire is chosen. When innovations are
instead welfare decreasing, law enforcement should limit them through average deterrence.
We consider several policy environments differing in the instruments available. Enforcement
effort is always positive and fines are (weakly) increasing in the social loss of innovations. In
some cases accuracy is not implemented, contrary to the traditional model where it always
enhances (marginal) deterrence, while in others it is improved selectively only on type-II
errors (asymmetric protocols of investigation).
cointegration and dynamic factor models. It introduces the Factor-augmented Error
Correction Model (FECM), where the factors estimated from a large set of variables in levels
are jointly modelled with a few key economic variables of interest. With respect to the standard
ECM, the FECM protects, at least in part, from omitted variable bias and the dependence of
cointegration analysis on the specific limited set of variables under analysis. It may also be in
some cases a refinement of the standard Dynamic Factor Model (DFM), since it allows us to
include the error-correction terms in the equations and, by allowing for cointegration, prevents
the errors from being non-invertible moving average processes. In addition, the FECM is a
natural generalization of factor augmented VARs (FAVAR) considered by Bernanke, Boivin and
Eliasz (2005) inter alia, which are specified in first differences and are therefore misspecified in
the presence of cointegration. The FECM has a vast range of applicability. A set of Monte Carlo
experiments and two detailed empirical examples highlight its merits in finite samples relative to
standard ECM and FAVAR models. The analysis is conducted primarily within an in-sample
framework, although the out-of-sample implications are also explored.
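As a rough numerical illustration of the FECM idea — not the authors' implementation; the panel is simulated, and the single-factor, single-equation setup is a deliberate simplification — one can extract a factor from a large dataset in levels and include the implied error-correction term in the equation of interest:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated panel: 20 I(1) series sharing one common stochastic trend (levels)
T, N = 200, 20
trend = np.cumsum(rng.normal(size=T))
X = trend[:, None] + rng.normal(scale=0.5, size=(T, N))
y = 0.8 * trend + rng.normal(scale=0.3, size=T)   # cointegrated with the trend

# Step 1: estimate the factor from the data in LEVELS (first principal component)
Xc = X - X.mean(axis=0)
U, S, _ = np.linalg.svd(Xc, full_matrices=False)
factor = U[:, 0] * S[0]

# Step 2: error-correction term = residual of y on the level of the factor
F = np.column_stack([factor, np.ones(T)])
ec = y - F @ np.linalg.lstsq(F, y, rcond=None)[0]

# Step 3: ECM-style equation: change in y on the lagged EC term and the
# factor's change; a negative loading signals adjustment toward equilibrium
Z = np.column_stack([ec[:-1], np.diff(factor), np.ones(T - 1)])
coef, *_ = np.linalg.lstsq(Z, np.diff(y), rcond=None)
print(coef[0] < 0)
```

Because the simulated series are cointegrated with the common trend, the estimated loading on the lagged error-correction term comes out negative, which is the mechanism the FECM adds relative to a FAVAR in first differences.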
diffusion index-based methods in short samples with structural change. We
consider several data generation processes, to mimic different types of
structural change, and compare the relative forecasting performance of factor
models and more traditional time series methods. We find that changes in the
loading structure of the factors into the variables of interest are extremely
important in determining the performance of factor models. We complement
the analysis with an empirical evaluation of forecasts for the key
macroeconomic variables of the Euro area and Slovenia, for which relatively
short samples are officially available and structural changes are likely. The
results are coherent with the findings of the simulation exercise, and confirm
the relatively good performance of factor-based forecasts also in short samples
with structural change.
models that can handle unbalanced datasets. Due to the different release lags of business cycle
indicators, data unbalancedness often emerges at the end of multivariate samples, which is some-
times referred to as the 'ragged edge' of the data. Using a large monthly dataset of the German
economy, we compare the performance of different factor models in the presence of the ragged edge:
static and dynamic principal components based on realigned data, the Expectation-Maximisation
(EM) algorithm and the Kalman smoother in a state-space model context. The monthly factors
are used to estimate current quarter GDP, called the 'nowcast', using different versions of what
we call factor-based mixed-data sampling (Factor-MIDAS) approaches. We compare all possible
combinations of factor estimation methods and Factor-MIDAS projections with respect to now-
cast performance. Additionally, we compare the performance of the nowcast factor models with
the performance of quarterly factor models based on time-aggregated and thus balanced data,
which neglect the most timely observations of business cycle indicators at the end of the sample.
Our empirical findings show that the factor estimation methods do not differ much with respect
to nowcasting accuracy. Concerning the projections, the most parsimonious MIDAS projection
performs best overall. Finally, quarterly models are in general outperformed by the nowcast factor
models that can exploit ragged-edge data.
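A stripped-down version of an unrestricted Factor-MIDAS projection can be sketched as follows; the monthly factor is simulated here (in the paper's setting it would be extracted from a large monthly panel), and the loadings and nowcast inputs are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

# A hypothetical monthly factor standing in for one estimated from a panel
n_q = 60                               # quarters in the estimation sample
f = rng.normal(size=3 * n_q)           # three monthly observations per quarter

# Quarterly GDP growth loading on the quarter's three monthly factor values
F = f.reshape(n_q, 3)                  # skip-sampled factor: one row per quarter
gdp = F @ np.array([0.2, 0.3, 0.5]) + 0.1 * rng.normal(size=n_q)

# Unrestricted MIDAS projection: regress quarterly GDP on the skip-sampled
# monthly factor values plus an intercept
X = np.column_stack([F, np.ones(n_q)])
coef, *_ = np.linalg.lstsq(X, gdp, rcond=None)

# Nowcast the current quarter once its monthly factor values are in hand
f_now = np.array([0.1, -0.2, 0.4])
nowcast = np.append(f_now, 1.0) @ coef
print(nowcast)
```

The "ragged edge" matters in the last step: in practice only one or two of the current quarter's monthly values may be released, and the competing factor estimators (realigned PCA, EM, Kalman smoother) differ precisely in how they fill that edge before the projection is run.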
work of the standard neoclassical growth model. The short-run revenue loss after an in-
come tax cut is partly -- or, depending on parameter values, even completely -- offset
by growth in the long-run, due to the resulting incentives to further accumulate capital.
We study how the dynamic response of government revenue to a tax cut changes if we
allow a Ramsey economy to engage in international trade: the open economy's ability to
reallocate resources between labor-intensive and capital-intensive industries reduces the
negative effect of factor accumulation on factor returns, thus encouraging the economy to
accumulate more than it would do under autarky. We explore the quantitative implica-
tions of this intuition for the US in terms of two issues recently treated in the literature:
dynamic scoring and the Laffer curve. Our results demonstrate that international trade
enhances the response of government revenue to tax cuts by a relevant amount. In our
benchmark calibration, a reduction in the capital-income tax rate has virtually no effect
on government revenue in steady state.
variable most directly related to current and expected monetary policy,
the yield on long term government bonds. We find that the level of longterm
rates in Europe is almost entirely explained by U.S. shocks and by
the systematic response of U.S. and European variables (inflation, short
term rates and the output gap) to these shocks. Our results suggest in
particular that U.S. variables are more important than local variables
in the policy rule followed by European monetary authorities: this was
true for the Bundesbank before EMU and has remained true for the
ECB, at least so far. Using closed economy models to analyze monetary
policy in the Euro area is thus inconsistent with the empirical evidence on the
determinants of Euro area long-term rates. It is also inconsistent with
the way the Governing Council of the ECB appears to make actual policy
decisions.
the functioning of current institutions? This paper argues that individual
values and convictions about the scope of application of norms
of good conduct provide the "missing link". Evidence from a variety
of sources points to two main findings. First, individual values consistent
with generalized (as opposed to limited) morality are widespread
in societies that were ruled by non-despotic political institutions in
the distant past. Second, well functioning institutions are often observed
in countries or regions where individual values are consistent
with generalized morality, and under different identifying assumptions
this suggests a causal effect from values to institutional outcomes. The
paper ends with a discussion of the implications for future research.
the evolution of models estimated to evaluate the macroeconomic impact of
monetary policy. We argue that the main challenge for the
econometrics of monetary policy is the combination of theoretical models and
information from the data to construct empirical models. The failure of the
large econometric models at the beginning of the 1970s might be explained by
their inability to take proper account of both these aspects. The famous
critiques by Lucas and Sims have generated an alternative approach which, at
least initially, has been almost entirely dominated by theory. The LSE
approach has instead concentrated on the properties of the statistical models
and on the best way of incorporating information from the data into the
empirical models, paying little attention to the economic foundation of the
adopted specification. The realization that the solution of a DSGE model can
be approximated by a restricted VAR, which is also a statistical model, has
generated a potential link between the two approaches. The open question is
which type of VARs are most appropriate for the econometric analysis of
monetary policy.
This paper studies a theoretical model where individuals respond
to incentives but are also influenced by norms of good conduct inherited
from earlier generations. Parents rationally choose what values to
transmit to their offspring, and this choice is influenced by the quality
of external enforcement and the pattern of likely future transactions.
The equilibrium displays strategic complementarities between values
and current behavior, which reinforce the effects of changes in the
external environment. Values evolve gradually over time, and if the
quality of external enforcement is chosen under majority rule, there is
hysteresis: adverse initial conditions may lead to a unique equilibrium
path where external enforcement remains weak and individual values
discourage cooperation.
This paper reconsiders the developments of model evaluation in macroeconometrics over the last forty years. Our analysis starts from the failure of early empirical macroeconomic models caused by stagflation in the seventies. The different diagnoses of this failure are then analyzed and classified into two groups: explanations related to problems in the theoretical models, which lead to problems in the identification of the relevant econometric model, and explanations related to problems in the underlying statistical model, which lead to misspecification of the relevant econometric model. Developments in macroeconometric model evaluation after the failure of the Cowles Foundation models are then discussed to illustrate how the different critiques have initiated different approaches in macroeconometrics. The evolution of what has been considered the consensus approach to macroeconometric model evaluation over the last thirty years is then followed. The criticism leveled at Cowles Foundation models in the early seventies might apply almost exactly to DSGE-VAR model evaluation in the first decade of
the new millennium. However, the combination of a general statistical model, such as a Factor-Augmented VAR, with a DSGE model seems to produce forecasts that perform better than those based exclusively on either the theoretical or the statistical model.
of Uganda began to publish newspaper ads on the timing and amount of funds
disbursed to the districts. The intent of the campaign was to boost schools' and
parents' ability to monitor the local officials in charge of disbursing funds to the
schools. The mass information campaign was successful. But since newspaper
penetration varies greatly across districts, the exposure to information about the
program, and thus funding, differ across districts. I use this variation in program
exposure between districts to evaluate whether public funds have an effect on
student performance. The results show that money matters: On average, stu-
dents in districts highly exposed to the information campaign, and hence to the
grant program, scored 0.40 standard deviations better in the Primary Leaving
Exam (PLE) than students in districts less exposed to information. The results
are robust to controlling for a broad range of confounding factors.
sidered attractive by the profession not only from the theoretical perspec-
tive but also from an empirical standpoint. As a consequence of this
development, methods for diagnosing the fit of these models are being
proposed and implemented. In this article we illustrate how the concept
of statistical identification, that was introduced and used by Spanos (1990)
to criticize traditional evaluation methods of Cowles Commission models,
could be relevant for DSGE models. We conclude that the recently pro-
posed model evaluation method, based on the DSGE-VAR(λ), might not satisfy
the condition for statistical identification. However, our appli-
cation also shows that the adoption of a FAVAR as a statistically identified
benchmark leaves unaltered the support of the data for the DSGE model
and that a DSGE-FAVAR can be an optimal forecasting model.
bonds in the Euro area. There is a common trend in yield differentials, which
is correlated with a measure of aggregate risk. In contrast, liquidity differentials
display sizeable heterogeneity and no common factor. We propose a simple model
with endogenous liquidity demand, where a bond's liquidity premium depends both
on its transaction cost and on investment opportunities. The model predicts that
yield differentials should increase in both liquidity and risk, with an interaction
term of the opposite sign. Testing these predictions on daily data, we find that
the aggregate risk factor is consistently priced, liquidity differentials are priced for
a subset of countries, and their interaction with the risk factor is in line with the
model's prediction and crucial to detect their effect.
We estimate the effect of political regime transitions on growth with semi-parametric methods, combining difference in differences with
matching, that have not been used in macroeconomic settings. Our semi-parametric estimates suggest that previous parametric estimates
may have seriously underestimated the growth effects of democracy. In particular, we find an average negative effect on growth of leav-
ing democracy on the order of -2 percentage points implying effects on income per capita as large as 45 percent over the 1960-2000 panel.
Heterogeneous characteristics of reforming and non-reforming countries appear to play an important role in driving these results.
nology is biased in favor of a country's abundant production factors. We provide an expla-
nation to this finding based on the Heckscher-Ohlin model. Countries trade and specialize
in the industries that use intensively the production factors they are abundantly endowed
with. For given factor endowment ratios, this implies smaller international differences in
factor price ratios than under autarky. Thus, when measuring the factor bias of technol-
ogy with the same aggregate production function for all countries, they appear to have
an abundant-factor bias in their technologies.
factor techniques, to produce composite coincident indices (CCIs) at the sectoral
level for the European countries and for Europe as a whole. Few CCIs are available
for Europe compared to the US, and most of them use macroeconomic variables and
focus on aggregate activity. However, there are often delays in the release of macroeconomic
data, later revisions, and differences in the definition of the variables across
countries, while surveys are available in a timely fashion, not subject to revision, and fully comparable
across countries. Moreover, there are substantial discrepancies in activity at
the sectoral level, which justifies the interest in a sectoral disaggregation. Compared
to the Confidence Indicators produced by the European Commission, which are based
on a simple average of the aggregate survey answers, we show that factor based CCIs,
using survey answers at a more disaggregate level, produce higher correlation with the
reference series for the majority of sectors and countries.
but also for the decisions of private agents, consumers and firms. Since it is difficult
to identify a single variable that provides a good measure of current economic
conditions, it can be preferable to consider a combination of several coincident indicators,
i.e., a composite coincident index (CCI). In this paper, we review the main
statistical techniques for the construction of CCIs, propose a new pooling-based
method, and apply the alternative techniques for constructing CCIs for the largest
European countries in the euro area and for the euro area as a whole. We find that
different statistical techniques yield comparable CCIs, so that it is possible to reach
a consensus on the status of the economy.
We provide a unified state-space modelling framework that encom-
passes different existing discrete-time yield curve models. Within such a
framework we analyze the impact on forecasting performance of two
crucial modelling choices, i.e. the imposition of no-arbitrage restric-
tions and the size of the information set used to extract factors. Using
US yield curve data, we find that: a. macro factors are very useful in
forecasting at medium/long forecasting horizon; b. financial factors
are useful in short run forecasting; c. no-arbitrage models are effec-
tive in shrinking the dimensionality of the parameter space and, when
supplemented with additional macro information, are very effective in
forecasting; d. within no-arbitrage models, assuming time-varying risk
price is more favorable than assuming constant risk price for medium
horizon-maturity forecast when yield factors dominate the informa-
tion set, and for short horizon and long maturity forecast when macro
factors dominate the information set; e. however, given the complex-
ity and the highly non-linear parameterization of no-arbitrage models,
it is very difficult to exploit within this type of models the additional
information offered by large macroeconomic datasets.
a common weakness: taxes, government spending and interest rates
are assumed to respond to various macroeconomic variables but not
to the level of the public debt; moreover, the impact of fiscal shocks
on the dynamics of the debt-to-GDP ratio is not tracked. We ana-
lyze the effects of fiscal shocks allowing for a direct response of taxes,
government spending and the cost of debt service to the level of the
public debt. We show that omitting such a feedback can result in
incorrect estimates of the dynamic effects of fiscal shocks. In par-
ticular the absence of an effect of fiscal shocks on long-term interest
rates - a frequent finding in research based on VARs that omit a debt
feedback - can be explained by their mis-specification, especially over
samples in which the debt dynamics appears to be unstable. Using
data for the U.S. economy and the identification assumption proposed
by Blanchard and Perotti (2002) we reconsider the effects of fiscal
policy shocks correcting for these shortcomings.
U.K. Compared to the closed economy, the presence of an exchange rate channel for
monetary policy not only produces new trade-offs for monetary policy, but it also
introduces an additional source of specification errors. We find that exchange rate
shocks are an important contributor to volatility in the model, and that the exchange
rate equation is particularly vulnerable to model misspecification, along with the
equation for domestic inflation. However, when policy is set with discretion, the
cost of insuring against model misspecification appears reasonably small.
systems will have to be retrenched. In particular, retirement age will have to be substantially
increased. Yet, is this policy measure feasible in OECD countries? Since the answer
belongs mainly to the realm of politics, I evaluate the political feasibility of postponing
retirement under aging in France, Italy, the UK, and the US. Simulations for the year
2050 steady state demographic, economic and political scenario suggest that retirement
age will be postponed in all countries, while the social security contribution rate will
rise in all countries, but Italy. The political support for increasing the retirement age
stems mainly from the negative income effect induced by aging, which reduces the
profitability of the existing social security system, and thus individuals' net social
security wealth.
action or seek a profitable innovation that may enhance or reduce welfare. The legislator
sets fines calibrated to the harmfulness of unlawful actions. The range of fines defines norm
flexibility. Expected sanctions guide firms' choices among unlawful actions (marginal deter-
rence) and/or stunt their initiative altogether (average deterrence). With loyal enforcers,
maximum norm flexibility is optimal, so as to exploit both marginal and average deterrence.
With corrupt enforcers, instead, the legislator should prefer more rigid norms that prevent
bribery and misreporting, at the cost of reducing marginal deterrence and stunting private
initiative. The greater the potential corruption, the more rigid the optimal norms.
als, which we call the generalized fractionalization index, that uses information on similarities
among individuals. We show that the generalized index is a natural extension of the
widely used ethno-linguistic fractionalization index and is also simple to compute. The
paper offers some empirical illustrations on how the new index can be operationalized and
what difference it makes as compared to standard indices. These applications pertain to
the pattern of diversity in the United States across states. Journal of Economic Literature
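The index described above is indeed simple to compute. A minimal sketch — the group labels and the binary similarity matrix are hypothetical, and the definition assumed here is one minus the average pairwise similarity — is:

```python
import numpy as np

def generalized_fractionalization(similarity):
    """One minus the average similarity over all ordered pairs of individuals.

    `similarity` is an n x n matrix with entries in [0, 1]; entry (i, j)
    measures how similar individuals i and j are (1 = identical)."""
    n = similarity.shape[0]
    return 1.0 - similarity.sum() / (n * n)

# With a binary similarity (1 if two individuals share a group, 0 otherwise)
# the index reduces to the standard ethno-linguistic fractionalization
# index 1 - sum_k p_k^2.
groups = np.array([0, 0, 1, 1, 2])  # hypothetical group labels
sim = (groups[:, None] == groups[None, :]).astype(float)
print(round(generalized_fractionalization(sim), 2))  # 0.64
```

With group shares 0.4, 0.4, and 0.2, the standard index gives 1 - (0.16 + 0.16 + 0.04) = 0.64, matching the generalized computation; richer similarity measures (e.g. linguistic distances) simply replace the 0/1 entries.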
fiscal policy by considering the Italian case. Empirical analysis has been so
far rather inconclusive on this important topic. We ascribe such evidence
to three problems: identification, regime-switching and maturity effects. All
these aspects are particularly relevant to the Italian case.
We propose a parsimonious model with three factors to
represent the whole yield curve, and we consider yield
differentials between Italian and German Government bonds.
To take into account the possibility of regime-switching, we explicitly include
a hidden two-state Markov chain that represents market expectations. The
model is estimated using Bayesian econometric techniques. We find that government
debt and its evolution significantly influence the yield of government
bonds, and that such effects are both maturity-dependent and regime-dependent. Hence
when investigating the effect of fiscal policy on the term-structure it is of crucial
importance to allow for multiple regimes in the estimation.
Keywords: Fiscal Policy, Term Structure, regime switching, Bayesian estimation
utility from decision problems under exogenous uncertainty to choice in strategic
environments. Interactive uncertainty is modeled both explicitly - using
hierarchies of preference relations, the analogue of beliefs hierarchies - and
implicitly - using preference structures, the analogue of type spaces à la
Harsanyi - and it is shown that the two approaches are equivalent.
Preference structures can be seen as those sets of hierarchies arising when certain
restrictions on preferences, along with the players' common certainty of
the restrictions, are imposed. Preferences are a priori assumed to satisfy only
very mild properties (reflexivity, transitivity, and monotone continuity).
Thus, the results provide a framework for the analysis of behavior in games
under essentially any axiomatic structure. An explicit characterization is
given for Savage's axioms, and it is shown that a hierarchy of relatively
simple preference relations uniquely identifies the decision maker's
utilities and beliefs of all orders. Connections with the literature on beliefs
hierarchies and correlated equilibria are discussed.
Keywords: Subjective probability, Preference hierarchies, Type spaces, Beliefs
hierarchies, Common belief, Expected utility, Incomplete information,
Correlated equilibria