Working papers

2015 - n° 558
This paper proposes and discusses an instrumental variable estimator that can be of particular relevance when many instruments are available and/or the number of instruments is large relative to the total number of observations. Intuition and recent work (see, e.g., Hahn (2002)) suggest that parsimonious devices used in the construction of the final instruments may provide effective estimation strategies. Shrinkage is a well-known approach that promotes parsimony. We consider a new shrinkage 2SLS estimator. We derive a consistency result for this estimator under general conditions, and via Monte Carlo simulation we show that this estimator has good potential for inference in small samples. (A schematic illustration of the shrinkage-first-stage idea follows this entry.)

A. Carriero, G. Kapetanios, and M. Marcellino
Keywords: Instrumental Variable Estimation, 2SLS, Shrinkage, Bayesian Regression
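
A minimal sketch of the first-stage-shrinkage idea on simulated data (my illustration of the general technique, with assumed data-generating values; not the authors' estimator): a ridge penalty shrinks the many first-stage coefficients before the fitted values enter the second stage.

    import numpy as np

    rng = np.random.default_rng(0)
    n, k = 100, 50                        # many instruments relative to observations
    Z = rng.standard_normal((n, k))       # instruments
    u = rng.standard_normal(n)            # structural error
    v = 0.8 * u + rng.standard_normal(n)  # first-stage error, correlated with u
    x = Z @ np.full(k, 0.1) + v           # endogenous regressor
    y = 1.0 * x + u                       # true coefficient is 1.0

    # Ridge (shrinkage) first stage: shrink the k first-stage coefficients,
    # then use the fitted values as the instrument in the second stage.
    lam = 10.0                            # shrinkage intensity (a tuning choice)
    pi_hat = np.linalg.solve(Z.T @ Z + lam * np.eye(k), Z.T @ x)
    x_hat = Z @ pi_hat
    beta_hat = (x_hat @ y) / (x_hat @ x)  # shrinkage-2SLS point estimate
    print(beta_hat)                       # near 1.0; OLS of y on x is biased upward here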
2015 - n° 557
The question of how economic inequality changed during the centuries leading up to the industrial revolution has been attracting a growing amount of research effort. Nevertheless, a complete picture of the tendencies in economic inequality throughout pre-industrial Europe has remained out of our grasp. This paper begins to resolve this problem by comparing long-term changes in inequality between Central and Northern Italy on the one hand and the Southern and Northern Low Countries on the other hand. Based on new archival material, we reconstruct regional estimates of economic inequality between 1500 and 1800 and analyze them in the light of the Little Divergence debate, assessing the role of economic growth, urbanization, proletarianization, and political institutions. We argue that different explanations should be invoked to understand the early modern growth of inequality throughout Europe, since several factors conspired to make for a society in which it was much easier for inequality to rise than to fall. We also argue that although there was apparently a 'Little Convergence' in inequality, at least some parts of southern and northern Europe diverged in terms of inequality extraction ratios.

Guido Alfani and Wouter Ryckbosch
Keywords: Economic inequality; early modern period; Sabaudian State; Florentine State; Italy; Low Countries; Belgium; The Netherlands; inequality extraction; wealth concentration; fiscal state; proletarianization
2015 - n° 556
The triplet-based risk analysis of Kaplan and Garrick (1981) is the keystone of state-of-the-art probabilistic risk assessment in several applied fields. This paper performs a sharp embedding of the elements of this framework into that of formal decision theory, which is mainly concerned with the methodological and modelling issues of rational decision making. In order to show the applicability of such an embedding, we also explicitly develop it within a nuclear probabilistic risk assessment, as prescribed by the U.S. NRC. The aim of this exercise is twofold: on the one hand, it gives risk analysis direct access to the rich toolbox that decision theory has developed, in recent decades, to deal with complex layers of uncertainty; on the other, it exposes decision theory to the challenges of risk analysis, thus providing it with broader scope and new stimuli.

E. Borgonovo, V. Cappelli, F. Maccheroni, M. Marinacci
2015 - n° 555
Government spending at the zero lower bound (ZLB) is not necessarily welfare enhancing, even when its output multiplier is large. We illustrate this point in the context of a standard New Keynesian model. In that model, when government spending provides direct utility to the household, its optimal level is at most 0.5-1 percent of GDP for recessions of -4 percent; the numbers are higher for deeper recessions. When spending does not provide direct utility, it is generically welfare-detrimental: it should be kept unchanged at its long-run optimal value. These results are confirmed in a medium-scale DSGE version of the model featuring sticky wages and equilibrium unemployment.

Florin Bilbiie, Tommaso Monacelli, Roberto Perotti
Keywords: Government spending multiplier, zero lower bound, welfare
2015 - n° 554
This paper tests the commonly used assumption that people apply a single discount rate to the utility from different sources of consumption. Using survey data from Uganda with both hypothetical and incentivized choices over different goods, we elicit time preferences from about 2,400 subjects. We reject the null of equal discount rates across goods; the average person in our sample is more impatient about sugar, meat and starchy plantains than about money and a list of other goods. We review the assumptions needed to recover discount rates from experimental choices for the case of good-specific discounting. Consistent with the theoretical framework, we find convergence in discount rates across goods for two groups expected to engage in or think about arbitraging the rewards: traders and individuals with large quantities of the good at home. As an application, we evaluate empirically the conditions under which good-specific discounting could predict a low-asset poverty trap. (The restriction being tested is sketched after this entry.)

Diego Ubfal
Keywords: time preferences, good-specific discounting, narrow bracketing, self-control problems, poverty traps
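
In generic notation (mine, not necessarily the paper's), the hypothesis under test can be written as follows: under good-specific exponential discounting, a dated reward (x_c, t) of good c is valued as

    U(x_c, t) = \delta_c^{\,t}\, u_c(x_c), \qquad 0 < \delta_c \le 1,

and the standard single-rate assumption is the restriction

    H_0:\ \delta_c = \delta \quad \text{for all goods } c,

which the paper rejects. The arbitrage logic is that an agent who can freely convert good c into money at a fixed price should be indifferent between delaying the good and delaying its money value, pushing \delta_c toward the money rate \delta_m for traders and for holders of large stocks of the good.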
2015 - n° 553
We study decision problems in which the consequences of the alternative actions depend on states determined by a generative mechanism representing some natural or social phenomenon. Model uncertainty arises because decision makers may not know this mechanism. Two types of uncertainty result: state uncertainty within models and model uncertainty across them. We discuss some two-stage static decision criteria proposed in the literature that address state uncertainty in the first stage and model uncertainty in the second one (by considering subjective probabilities over models). We consider two approaches to the Ellsberg-type phenomena that these decision problems feature: a Bayesian approach based on the distinction between subjective attitudes toward the two kinds of uncertainty, and a non-Bayesian one that permits multiple subjective probabilities. Several applications are used to illustrate concepts as they are introduced. (A canonical two-stage criterion is written out after this entry.)

Massimo Marinacci
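
For concreteness, one canonical two-stage criterion of this kind is the smooth ambiguity model of Klibanoff, Marinacci and Mukerji (2005), written here in generic notation (this listing does not specify which criteria the paper actually covers):

    V(f) = \int_M \phi\!\left( \int_S u\bigl(f(s)\bigr)\, dP_m(s) \right) d\mu(m)

The inner integral is expected utility under a given model m (state uncertainty within models); the outer integral averages across models with a subjective prior \mu (model uncertainty across models). A concave \phi captures aversion to model uncertainty, while a linear \phi reduces the criterion to Bayesian expected utility.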
2015 - n° 552
We prove that a subtle but substantial bias exists in a standard measure of the conditional dependence of present outcomes on streaks of past outcomes in sequential data. The magnitude of this novel form of selection bias generally decreases as the sequence gets longer, but increases in streak length, and remains substantial for a range of sequence lengths often used in empirical work. The bias has important implications for the literature that investigates incorrect beliefs in sequential decision making - most notably the Hot Hand Fallacy and the Gambler's Fallacy. Upon correcting for the bias, the conclusions of prominent studies in the hot hand fallacy literature are reversed. The bias also provides a novel structural explanation for how belief in the law of small numbers can persist in the face of experience. (A short simulation illustrating the bias follows this entry.)

Joshua B. Miller and Adam Sanjurjo
Keywords: Law of Small Numbers; Alternation Bias; Negative Recency Bias; Gambler's Fallacy; Hot Hand Fallacy; Hot Hand Effect; Sequential Decision Making; Sequential Data; Selection Bias; Finite Sample Bias; Small Sample Bias
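
A short simulation of the selection bias on fair-coin data (my illustration, not the authors' code): for each finite sequence, compute the proportion of heads immediately following a streak of three heads, then average that per-sequence statistic across sequences.

    import numpy as np

    rng = np.random.default_rng(1)
    n_seq, seq_len, streak = 10_000, 100, 3   # sequence and streak lengths typical of empirical work

    props = []
    for _ in range(n_seq):
        flips = rng.integers(0, 2, seq_len)   # 1 = heads, fair coin
        # outcomes that immediately follow `streak` consecutive heads
        follows = [flips[i] for i in range(streak, seq_len)
                   if flips[i - streak:i].all()]
        if follows:                           # keep sequences in which the streak occurs
            props.append(np.mean(follows))

    # Despite the coin being fair, the average per-sequence proportion is
    # noticeably below 0.5 (roughly 0.46 for these parameters).
    print(np.mean(props))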
2015 - n° 551
We study the optimal selling strategies of a seller who is poorly informed about the buyer's value for the object. When the maxmin seller knows only that the mean of the distribution of the buyer's valuations belongs to some interval, nature can hold him to a payoff of zero no matter how much information the seller has about the mean. However, when the seller has information about the mean and the variance, or the mean and the upper bound of the support, the seller optimally commits to a randomization over prices and obtains a strictly positive payoff. In such a case additional information about the mean and/or the variance affects his payoff. (A standard construction showing why randomizing over prices helps is sketched after this entry.)

Nenad Kos and Matthias Messner
Keywords: Optimal mechanism design, Robustness, Incentive compatibility, Individual rationality, Ambiguity aversion
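
A standard construction showing why commitment to a random price can guarantee positive revenue (a generic sketch under an assumed price range [a, b], not necessarily the paper's optimal mechanism): draw the posted price P with density

    f(p) = \frac{1}{p \,\ln(b/a)}, \qquad p \in [a, b],

so that expected revenue against a buyer with value v is

    \int_a^{\min(v,\,b)} p\, f(p)\, dp = \frac{\min(v, b) - a}{\ln(b/a)},

which is linear in v up to the cap b. In particular, if b is a known upper bound of the support and m is the known mean, the guaranteed expected revenue equals (m - a)/\ln(b/a) > 0 for any a below m, whereas any deterministic price can be driven to (near) zero revenue by an adversarial distribution.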
2015 - n° 550
This paper proposes a Bayesian estimation framework for a typical multi-factor model with time-varying risk exposures to macroeconomic risk factors and corresponding premia to price U.S. publicly traded assets. The model assumes that risk exposures and idiosyncratic volatility follow a break-point latent process, allowing for changes at any point in time but not restricting them to change at all points. The empirical application to 40 years of U.S. data and 23 portfolios shows that the approach yields sensible results compared to previous two-step methods based on naive recursive estimation schemes, as well as to a set of alternative model restrictions. A variance decomposition test shows that although most of the predictable variation comes from the market risk premium, a number of additional macroeconomic risks, including real output and inflation shocks, are significantly priced in the cross-section. A Bayes factor analysis massively favors the proposed change-point model. (A simulation sketch of the latent break-point process follows this entry.)

Daniele Bianchi, Massimo Guidolin and Francesco Ravazzolo
Keywords: Structural breaks, Stochastic volatility, Multi-factor linear models, Asset Pricing
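
A minimal simulation of the kind of latent break-point process described (my parameterization, not the paper's estimation code): an exposure follows a random walk whose innovations arrive only when a Bernoulli break indicator fires, so it can change at any date but typically does not.

    import numpy as np

    rng = np.random.default_rng(2)
    T = 500
    p_break = 0.05                 # per-period break probability
    sigma_eta = 0.30               # size of a break when it occurs

    # beta_t = beta_{t-1} + kappa_t * eta_t, with kappa_t ~ Bernoulli(p_break)
    kappa = rng.random(T) < p_break
    eta = rng.normal(0.0, sigma_eta, T)
    beta = 1.0 + np.cumsum(kappa * eta)   # piecewise-constant risk exposure

    # One-factor returns under this exposure: r_t = beta_t * f_t + e_t
    f = rng.standard_normal(T)            # macroeconomic factor realization
    r = beta * f + 0.5 * rng.standard_normal(T)
    print(int(kappa.sum()), "breaks over", T, "periods")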
2015 - n° 549
We characterize the consistency of a large class of nonexpected utility preferences (including mean-variance preferences and prospect theory preferences) with stochastic orders (for example, stochastic dominances of different degrees). Our characterization rests on a novel decision-theoretic result that provides a behavioral interpretation of the set of all derivatives of the functional representing the decision maker's preferences. As an illustration, we consider in some detail prospect theory and choice-acclimating preferences, two popular models of reference dependence under risk, and we show the incompatibility of loss aversion with prudence. (The standard integral stochastic orders are recalled after this entry.)

Simone Cerreia Vioglio, Fabio Maccheroni, Massimo Marinacci
Keywords: Stochastic dominance, integral stochastic orders, nonexpected utility, risk aversion, multi-utility representation, prospect theory, choice-acclimating personal equilibria
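
For reference, the standard definition of an integral stochastic order (textbook material, not specific to this paper's results): given a test set \mathcal{U} of functions, a distribution F dominates G whenever

    \int u \, dF \ \ge\ \int u \, dG \qquad \text{for all } u \in \mathcal{U}.

Taking \mathcal{U} to be all increasing functions yields first-order stochastic dominance; increasing concave functions yield second-order stochastic dominance. Consistency of a preference with such an order means the decision maker always weakly prefers dominating distributions.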