Working papers results
2015 - n° 561 10/11/2015
We study monotone, continuous, and quasiconcave functionals defined over an M-space. We show that if g is also Clarke-Rockafellar differentiable at x, then the closure of the Greenberg-Pierskalla differentials at x coincides with the closed cone generated by the Clarke-Rockafellar differentials at x. Under the same assumptions, we show that the set of normalized Greenberg-Pierskalla differentials at x coincides with the closure of the set of normalized Clarke-Rockafellar differentials at x. As a corollary, we obtain a differential characterization of quasiconcavity à la Arrow and Enthoven (1961) for Clarke-Rockafellar differentiable functions.
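In symbols, writing \partial^{GP} g(x) and \partial^{CR} g(x) for the Greenberg-Pierskalla and Clarke-Rockafellar differentials (the notation is ours, chosen only to restate the abstract, not necessarily the paper's), the two results read:

\[
\overline{\partial^{GP} g(x)} \;=\; \overline{\operatorname{cone}}\bigl(\partial^{CR} g(x)\bigr),
\qquad
\partial^{GP}_{\mathrm{norm}} g(x) \;=\; \overline{\partial^{CR}_{\mathrm{norm}} g(x)},
\]

where the bar denotes closure, cone the generated cone, and the subscript "norm" the normalized differentials mentioned in the abstract.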
2015 - n° 560 22/10/2015
A well-functioning bureaucracy can promote prosperity, as advocated by Max Weber. But when bureaucracy gets jammed, it causes stagnation, as described by Franz Kafka. We propose a dynamic theory of the interaction between the production of laws and the efficiency of bureaucracy. When bureaucracy is inefficient, the effects of politicians' legislative acts are hard to assess. Therefore, incompetent politicians have strong incentives to pass laws to acquire the reputation of skillful reformers. But too many, often contradictory, reforms can in turn lead to a collapse in bureaucratic efficiency. This interaction leads to the existence of both Weberian and Kafkian steady states. A temporary surge in political instability, a strong pressure for reforms by the public, and the appointment of short-lived technocratic governments can determine a permanent shift towards the Kafkian nightmare steady state. Using micro-data for Italy, we provide evidence consistent with one key prediction of the theory: the relative supply of laws by incompetent politicians increases when legislatures are expected to be short.
2015 - n° 559 30/09/2015
This paper studies how voters optimally allocate costly attention in a model of probabilistic voting. The equilibrium solves a modified social planning problem that reflects voters' choice of attention. Voters are more attentive when their stakes are higher, when their cost of information is lower, and when prior uncertainty is higher. We explore the implications in a variety of applications. In equilibrium, extremist voters are more influential and public goods are under-provided. The analysis also yields predictions about the equilibrium pattern of information and about policy divergence by two opportunistic candidates. Endogenous attention can lead to multiple equilibria, explaining how poor voters in developing countries can be politically empowered by welfare programs.
2015 - n° 558 29/09/2015
This paper proposes and discusses an instrumental variable estimator that can be of particular relevance when many instruments are available and/or the number of instruments is large relative to the total number of observations. Intuition and recent work (see, e.g., Hahn (2002)) suggest that parsimonious devices used in the construction of the final instruments may provide effective estimation strategies. Shrinkage is a well-known approach that promotes parsimony. We consider a new shrinkage 2SLS estimator. We derive a consistency result for this estimator under general conditions, and via Monte Carlo simulation show that this estimator has good potential for inference in small samples. (An illustrative sketch of the shrinkage idea follows below.)
Keywords: Instrumental Variable Estimation, 2SLS, Shrinkage, Bayesian Regression
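As a rough illustration of how shrinkage can enter the first stage of 2SLS, the sketch below uses a ridge penalty as the shrinkage device. It is a generic toy version written for this listing, with our own function name, penalty choice, and simulated data; it is not necessarily the estimator proposed in the paper.

# Minimal sketch, assuming a ridge-penalized first stage as the shrinkage device.
import numpy as np

def ridge_first_stage_2sls(y, X, Z, lam=1.0):
    """2SLS where the first stage is shrunk toward zero with a ridge penalty.

    y : (n,)   outcome
    X : (n, k) endogenous regressors
    Z : (n, m) instruments (possibly m large relative to n)
    lam : ridge penalty controlling the amount of shrinkage
    """
    n, m = Z.shape
    # First stage: ridge regression of each endogenous regressor on Z
    G = np.linalg.solve(Z.T @ Z + lam * np.eye(m), Z.T @ X)   # (m, k)
    X_hat = Z @ G                                             # shrunken fitted values
    # Second stage: use the shrunken fitted values as instruments for X
    beta = np.linalg.solve(X_hat.T @ X, X_hat.T @ y)
    return beta

# Tiny simulated example with many weak instruments (parameters are ours)
rng = np.random.default_rng(0)
n, m = 200, 50
Z = rng.normal(size=(n, m))
v = rng.normal(size=n)
x = Z @ (0.1 * np.ones(m)) + v            # endogenous regressor
y = 1.5 * x + v + rng.normal(size=n)      # structural equation, true beta = 1.5
print(ridge_first_stage_2sls(y, x.reshape(-1, 1), Z, lam=10.0))  # estimate of beta

The ridge penalty damps the contribution of the many, possibly weak, instruments in the first stage, which is the kind of parsimony the abstract alludes to.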
2015 - n° 557 29/09/2015
The question of how economic inequality changed during the centuries leading up to the industrial revolution has been attracting a growing amount of research effort. Nevertheless, a complete picture of the tendencies in economic inequality throughout pre-industrial Europe has remained out of our grasp. This paper begins to resolve this problem by comparing long-term changes in inequality between Central and Northern Italy on the one hand and the Southern and Northern Low Countries on the other hand. Based on new archival material, we reconstruct regional estimates of economic inequality between 1500 and 1800 and analyze them in the light of the Little Divergence debate, assessing the role of economic growth, urbanization, proletarianization, and political institutions. We argue that different explanations should be invoked to understand the early modern growth of inequality throughout Europe, since several factors conspired to make for a society in which it was much easier for inequality to rise than to fall. We also argue that although there was apparently a 'Little Convergence' in inequality, at least some parts of southern and northern Europe diverged in terms of inequality extraction ratios.
Keywords: Economic inequality; early modern period; Sabaudian State; Florentine State; Italy; Low Countries; Belgium; The Netherlands; inequality extraction; wealth concentration; fiscal state; proletarianization
2015 - n° 556 02/09/2015
The triplet-based risk analysis of Kaplan and Garrick (1981) is the keystone of state-of-the-art probabilistic risk assessment in several applied fields. This paper performs a sharp embedding of the elements of this framework into the one of formal decision theory, which is mainly concerned with the methodological and modelling issues of rational decision making. To show the applicability of such an embedding, we also explicitly develop it within a nuclear probabilistic risk assessment, as prescribed by the U.S. NRC. The aim of this exercise is twofold: on the one hand, it gives risk analysis direct access to the rich toolbox that decision theory has developed over the last decades to deal with complex layers of uncertainty; on the other, it exposes decision theory to the challenges of risk analysis, thus providing it with broader scope and new stimuli.
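For readers unfamiliar with the framework: in Kaplan and Garrick (1981), risk is expressed as a set of triplets, each pairing a scenario with its likelihood and its consequence. In a common notation (not necessarily the one used in this paper),

\[
R \;=\; \{\, \langle s_i, \ell_i, x_i \rangle \,\}, \qquad i = 1, \dots, N,
\]

where s_i answers "what can go wrong?", \ell_i is the likelihood of that scenario, and x_i its consequence.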
2015 - n° 555 30/07/2015
Government spending at the zero lower bound (ZLB) is not necessarily welfare-enhancing, even when its output multiplier is large. We illustrate this point in the context of a standard New Keynesian model. In that model, when government spending provides direct utility to the household, its optimal level is at most 0.5-1 percent of GDP for recessions of -4 percent; the numbers are higher for deeper recessions. When spending does not provide direct utility, it is generically welfare-detrimental: it should be kept unchanged at its long-run optimal value. These results are confirmed in a medium-scale DSGE version of the model featuring sticky wages and equilibrium unemployment.
Keywords: Government spending multiplier, zero lower bound, welfare
2015 - n° 554 30/07/2015
This paper tests the commonly-used assumption that people apply a single discount rate to the utility from different sources of consumption. Using survey data from Uganda with both hypothetical and incentivized choices over different goods, we elicit time preferences from about 2,400 subjects. We reject the null of equal discount rates across goods; the average person in our sample is more impatient about sugar, meat and starchy plantains than about money and a list of other goods. We review the assumptions needed to recover discount rates from experimental choices in the case of good-specific discounting. Consistent with the theoretical framework, we find convergence in discount rates across goods for two groups expected to engage in, or think about, arbitraging the rewards: traders and individuals with large quantities of the good at home. As an application, we evaluate empirically the conditions under which good-specific discounting could predict a low-asset poverty trap. (A stylized formulation of good-specific discounting is sketched below.)
Keywords: time preferences, good-specific discounting, narrow bracketing, self-control problems, poverty traps
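As a hedged illustration of what good-specific discounting means here (the functional form and symbols are ours, chosen only to fix ideas, not taken from the paper): a decision maker evaluates a consumption stream over goods g with good-specific discount factors \delta_g,

\[
U \;=\; \sum_{g} \sum_{t=0}^{T} \delta_g^{\,t}\, u_g(c_{g,t}),
\]

so rejecting equal discount rates across goods amounts to rejecting \delta_g = \delta for all g.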
2015 - n° 553 10/07/2015
We study decision problems in which the consequences of the alternative actions depend on states determined by a generative mechanism representing some natural or social phenomenon. Model uncertainty arises because decision makers may not know this mechanism. Two types of uncertainty result: state uncertainty within models and model uncertainty across them. We discuss some two-stage static decision criteria proposed in the literature that address state uncertainty in the first stage and model uncertainty in the second (by considering subjective probabilities over models). We consider two approaches to the Ellsberg-type phenomena that these decision problems feature: a Bayesian approach based on the distinction between subjective attitudes toward the two kinds of uncertainty, and a non-Bayesian one that permits multiple subjective probabilities. Several applications are used to illustrate concepts as they are introduced.
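One well-known example of the kind of two-stage criterion referred to here, written only as an illustration (the paper discusses a family of such criteria), evaluates an act a by first computing its expected utility under each candidate model m and then aggregating across models with a subjective prior \mu:

\[
V(a) \;=\; \int_{M} \phi\!\left( \int_{S} u\bigl(a(s)\bigr)\, dP_m(s) \right) d\mu(m),
\]

where P_m is the state distribution implied by model m, u captures attitudes toward state (within-model) uncertainty, and \phi attitudes toward model uncertainty; a linear \phi gives back the standard Bayesian reduction.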
2015 - n° 552 10/07/2015
We prove that a subtle but substantial bias exists in a standard measure of the conditional dependence of present outcomes on streaks of past outcomes in sequential data. The magnitude of this novel form of selection bias generally decreases as the sequence gets longer, but increases in streak length, and remains substantial for a range of sequence lengths often used in empirical work. The bias has important implications for the literature that investigates incorrect beliefs in sequential decision making, most notably the Hot Hand Fallacy and the Gambler's Fallacy. Upon correcting for the bias, the conclusions of prominent studies in the hot hand fallacy literature are reversed. The bias also provides a novel structural explanation for how belief in the law of small numbers can persist in the face of experience. (A stylized simulation of the bias is sketched below.)
Keywords: Law of Small Numbers; Alternation Bias; Negative Recency Bias; Gambler's Fallacy; Hot Hand Fallacy; Hot Hand Effect; Sequential Decision Making; Sequential Data; Selection Bias; Finite Sample Bias; Small Sample Bias
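A stylized simulation of the selection bias, written for this listing under simple assumptions (fair coin, fixed sequence length and streak length; none of the parameters are taken from the paper): within each simulated sequence we record the share of heads among flips that immediately follow a streak of k heads, then average that share across sequences.

# Minimal sketch, assuming fair coin flips and our own choice of parameters.
import numpy as np

def mean_prop_after_streak(n_seq=10_000, seq_len=100, k=3, p=0.5, seed=0):
    rng = np.random.default_rng(seed)
    props = []
    for _ in range(n_seq):
        flips = rng.random(seq_len) < p           # True = heads
        followers = []
        run = 0                                   # streak of heads ending just before flip t
        for t in range(seq_len):
            if run >= k:                          # flip t immediately follows k consecutive heads
                followers.append(flips[t])
            run = run + 1 if flips[t] else 0
        if followers:                             # drop sequences with no eligible flips
            props.append(np.mean(followers))
    return float(np.mean(props))

print(mean_prop_after_streak())                   # noticeably below 0.5

The average sits visibly below 0.5 even though every flip is fair, because conditioning on streaks within finite sequences selects flips in a non-neutral way; the gap shrinks as sequences get longer and grows with the streak length, as the abstract states.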