Working papers
2015 - n° 564 21/12/2015
This paper explores the potential use of entertainment media programs for achieving development goals. I propose a simple framework for interpreting media effects that hinges on three channels: (i) information provision, (ii) role modeling and preference change, and (iii) time use. I then review the existing evidence on how exposure to commercial television and radio affects outcomes such as fertility preferences, gender norms, education, migration, and social capital. I complement these individual country studies with cross-country evidence from Africa and with a more in-depth analysis for Nigeria, using the Demographic and Health Surveys. I then consider the potential educational role of entertainment media, starting with a discussion of the psychological underpinnings and then reviewing recent rigorous evaluations of edutainment programs. I conclude by highlighting open questions and avenues for future research.
2015 - n° 563 10/11/2015
We empirically identify the lending standards applied by banks to small and medium firms over the cycle. We exploit an institutional feature of the Italian credit market that generates a sharp discontinuity in the allocation of comparable firms into credit risk categories. Using loan-level data, we show that during the expansionary phase of the cycle, banks relax lending standards by narrowing the interest rate spreads between substandard and performing firms. During the contractionary phase of the cycle, the abrupt tightening of lending standards leads to the exclusion of substandard firms from credit. These firms then report significantly lower production, investment, and employment. Finally, we find that the drying up of the interbank market is an important factor determining the change in bank lending standards.
Keywords: Credit Cycles; Financial Contracts; Credit Rationing; Real Activity
2015 - n° 562 10/11/2015
We study the effects of a conventional monetary expansion, of quantitative easing, and of the maturity extension program on corporate bond yields, using impulse response functions to shocks obtained from flexible models with regimes. We construct weekly bond portfolios from TRACE, sorting individual bond trades by rating and maturity. A standard single-state VAR model is inadequate to capture the dynamics of the data. By contrast, under a three-state Markov switching model with time-homogeneous VAR coefficients, we find that unconventional policies may have been generally expected to decrease corporate yields. However, even though the sign of the responses is the one expected by policy-makers, the size of the estimated effects depends on the assumptions regarding the decline in long-term Treasury yields caused by unconventional policies, about which considerable uncertainty remains.
Keywords: Unconventional monetary policy, corporate bonds, term structure of Treasury yields, impulse response function, Markov switching vector autoregression
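A minimal Python sketch of the kind of regime-switching impulse-response exercise this abstract describes. All numbers, the bivariate dimension, and the choice of regime-specific intercepts and volatilities around a common (time-homogeneous) VAR(1) matrix are illustrative assumptions, not the authors' specification; estimation of the regimes (e.g., via the Hamilton filter) is omitted.

    import numpy as np

    rng = np.random.default_rng(0)

    K = 3                                  # number of regimes
    A = np.array([[0.6, 0.1],              # common VAR(1) coefficient matrix
                  [0.0, 0.8]])
    mu = np.array([[0.0, 0.0],             # regime-specific intercepts (assumed)
                   [0.5, -0.2],
                   [-0.5, 0.4]])
    sigma = np.array([0.5, 1.0, 2.0])      # regime-specific shock volatilities
    P = np.array([[0.95, 0.04, 0.01],      # regime transition matrix
                  [0.03, 0.94, 0.03],
                  [0.01, 0.04, 0.95]])

    # Simulate the Markov chain and the switching VAR.
    T = 500
    s = np.zeros(T, dtype=int)
    y = np.zeros((T, 2))
    for t in range(1, T):
        s[t] = rng.choice(K, p=P[s[t - 1]])
        eps = sigma[s[t]] * rng.standard_normal(2)
        y[t] = mu[s[t]] + A @ y[t - 1] + eps

    # Response to a one-standard-deviation shock to the first variable,
    # conditional on remaining in regime k. Because A is regime-invariant
    # here, the IRF shape is common across regimes; only the impact
    # size (the regime's shock volatility) differs.
    def irf(regime, horizon=12, shock=np.array([1.0, 0.0])):
        out = np.zeros((horizon, 2))
        out[0] = sigma[regime] * shock
        for h in range(1, horizon):
            out[h] = A @ out[h - 1]
        return out

    for k in range(K):
        print(f"regime {k}: response of y1 at h=0..3:", irf(k)[:4, 0].round(3))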
2015 - n° 561 10/11/2015
We study monotone, continuous, and quasiconcave functionals defined over an M-space. We show that if g is also Clarke-Rockafellar differentiable at x, then the closure of the Greenberg-Pierskalla differentials at x coincides with the closed cone generated by the Clarke-Rockafellar differentials at x. Under the same assumptions, we show that the set of normalized Greenberg-Pierskalla differentials at x coincides with the closure of the set of normalized Clarke-Rockafellar differentials at x. As a corollary, we obtain a differential characterization of quasiconcavity à la Arrow and Enthoven (1961) for Clarke-Rockafellar differentiable functions.
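In compact notation (our shorthand; the paper's symbols may differ), writing \(\partial^{GP} g(x)\) and \(\partial^{CR} g(x)\) for the Greenberg-Pierskalla and Clarke-Rockafellar differentials of \(g\) at \(x\), the two results read

\[
\operatorname{cl} \partial^{GP} g(x) \;=\; \operatorname{cl} \operatorname{cone} \partial^{CR} g(x),
\qquad
\left\{ \tfrac{p}{\|p\|} : p \in \partial^{GP} g(x),\, p \neq 0 \right\}
\;=\;
\operatorname{cl} \left\{ \tfrac{p}{\|p\|} : p \in \partial^{CR} g(x),\, p \neq 0 \right\}.
\]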
2015 - n° 560 22/10/2015
A well functioning bureaucracy can promote prosperity, as advocated by Max Weber. But when bureaucracy gets jammed, it causes stagnation, as described by Franz Kafka. We propose a dynamic theory of the interaction between the production of laws and the efficiency of bureaucracy. When bureaucracy is inefficient, the effects of politicians' legislative acts are hard to assess. Therefore, incompetent politicians have strong incentives to pass laws to acquire the reputation of skillful reformers. But too many, often contradictory, reforms can in turn lead to a collapse in bureaucratic efficiency. This interaction leads to the existence of both Weberian and Kafkian steady states. A temporary surge in political instability, strong public pressure for reforms, and the appointment of short-lived technocratic governments can trigger a permanent shift towards the Kafkian nightmare steady state. Using micro-data for Italy, we provide evidence consistent with one key prediction of the theory: the relative supply of laws by incompetent politicians increases when legislatures are expected to be short.
2015 - n° 559 30/09/2015
This paper studies how voters optimally allocate costly attention in a model of probabilistic voting. The equilibrium solves a modified social planning problem that reflects voters' choice of attention. Voters are more attentive when their stakes are higher, when their cost of information is lower, and when prior uncertainty is higher. We explore the implications of this in a variety of applications. In equilibrium, extremist voters are more influential and public goods are under-provided. The analysis also yields predictions about the equilibrium pattern of information and about policy divergence by two opportunistic candidates. Endogenous attention can lead to multiple equilibria, explaining how poor voters in developing countries can be politically empowered by welfare programs.
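One standard way to formalize these comparative statics (an illustrative Gaussian parametrization of ours, not necessarily the paper's): a voter with stake \(s\) in an unknown payoff \(\theta \sim N(\mu, \sigma^2)\) buys signal precision \(\tau\) at cost \(c\tau\), so the posterior variance is \((\sigma^{-2} + \tau)^{-1}\). Maximizing \(s\left[\sigma^2 - (\sigma^{-2}+\tau)^{-1}\right] - c\tau\) over \(\tau \ge 0\) gives

\[
\tau^{*} \;=\; \max\left\{ 0,\; \sqrt{s/c} - \sigma^{-2} \right\},
\]

which is increasing in the stake \(s\) and in prior uncertainty \(\sigma^2\), and decreasing in the information cost \(c\), exactly the pattern described above.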
2015 - n° 558 29/09/2015
This paper proposes and discusses an instrumental variable estimator that can be of particular relevance when many instruments are available and/or the number of instruments is large relative to the total number of observations. Intuition and recent work (see, e.g., Hahn (2002)) suggest that parsimonious devices used in the construction of the final instruments may provide effective estimation strategies. Shrinkage is a well known approach that promotes parsimony. We consider a new shrinkage 2SLS estimator. We derive a consistency result for this estimator under general conditions, and via Monte Carlo simulation show that this estimator has good potential for inference in small samples.
Keywords: Instrumental Variable Estimation, 2SLS, Shrinkage, Bayesian Regression
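A self-contained Python sketch of a shrinkage 2SLS estimator, using a ridge-penalized first stage as the shrinkage device; the paper's Bayesian-regression construction may differ, and the simulated design (200 observations, 50 weak instruments, all coefficients invented) is purely illustrative.

    import numpy as np

    rng = np.random.default_rng(1)

    # Many-instruments setting: n observations, L weak instruments.
    n, L = 200, 50
    Z = rng.standard_normal((n, L))          # instruments
    pi = np.full(L, 0.05)                    # weak first-stage coefficients
    u = rng.standard_normal(n)
    v = 0.8 * u + rng.standard_normal(n)     # endogeneity via correlated errors
    x = Z @ pi + v                           # endogenous regressor
    beta_true = 1.0
    y = beta_true * x + u

    def shrinkage_2sls(y, x, Z, lam):
        """2SLS with a ridge-shrunk first stage: x is instrumented by the
        ridge fitted values Z @ pi_hat(lam) instead of the OLS projection."""
        pi_hat = np.linalg.solve(Z.T @ Z + lam * np.eye(Z.shape[1]), Z.T @ x)
        x_hat = Z @ pi_hat
        return (x_hat @ y) / (x_hat @ x)

    print("OLS (biased):      ", (x @ y) / (x @ x))
    print("2SLS, no shrinkage:", shrinkage_2sls(y, x, Z, lam=0.0))
    print("shrinkage 2SLS:    ", shrinkage_2sls(y, x, Z, lam=50.0))

With lam=0 the function reduces to classic 2SLS, which is well known to be biased toward OLS when instruments are many and weak; shrinking the first stage trades a little signal for a large variance reduction.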
2015 - n° 557 29/09/2015
The question of how economic inequality changed during the centuries leading up to the industrial revolution has been attracting a growing amount of research effort. Nevertheless, a complete picture of the tendencies in economic inequality throughout pre-industrial Europe has remained out of our grasp. This paper begins to resolve this problem by comparing long-term changes in inequality between Central and Northern Italy on the one hand and the Southern and Northern Low Countries on the other hand. Based on new archival material, we reconstruct regional estimates of economic inequality between 1500 and 1800 and analyze them in the light of the Little Divergence debate, assessing the role of economic growth, urbanization, proletarianization, and political institutions. We argue that different explanations should be invoked to understand the early modern growth of inequality throughout Europe, since several factors conspired to make for a society in which it was much easier for inequality to rise than to fall. We also argue that although there was apparently a 'Little Convergence' in inequality, at least some parts of southern and northern Europe diverged in terms of inequality extraction ratios.
Keywords: Economic inequality; early modern period; Sabaudian State; Florentine State; Italy; Low Countries; Belgium; The Netherlands; inequality extraction; wealth concentration; fiscal state; proletarianization
2015 - n° 556 02/09/2015
The triplet-based risk analysis of Kaplan and Garrick (1981) is the keystone of state-of-the-art probabilistic risk assessment in several applied fields. This paper performs a sharp embedding of the elements of this framework into that of formal decision theory, which is mainly concerned with the methodological and modelling issues of rational decision making. To show the applicability of such an embedding, we also explicitly develop it within a nuclear probabilistic risk assessment, as prescribed by the U.S. NRC. The aim of this exercise is twofold: on the one hand, it gives risk analysis direct access to the rich toolbox that decision theory has developed over recent decades to deal with complex layers of uncertainty; on the other, it exposes decision theory to the challenges of risk analysis, thus providing it with broader scope and new stimuli.
2015 - n° 555 30/07/2015
Government spending at the zero lower bound (ZLB) is not necessarily welfare enhancing, even when its output multiplier is large. We illustrate this point in the context of a standard New Keynesian model. In that model, when government spending provides direct utility to the household, its optimal level is at most 0.5-1 percent of GDP for recessions of -4 percent; the numbers are higher for deeper recessions. When spending does not provide direct utility, it is generically welfare-detrimental: it should be kept unchanged at its long-run optimal value. These results are confirmed in a medium-scale DSGE version of the model featuring sticky wages and equilibrium unemployment.
Keywords: Government spending multiplier, zero lower bound, welfare
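The distinction the abstract draws turns on whether government spending \(G\) enters the household's period utility. A standard separable specification (our illustration, not necessarily the paper's calibration) is

\[
U_t \;=\; \frac{C_t^{1-\sigma}}{1-\sigma} \;+\; \chi\, \frac{G_t^{1-\sigma_g}}{1-\sigma_g} \;-\; \frac{N_t^{1+\varphi}}{1+\varphi},
\]

with \(\chi > 0\) corresponding to the case in which spending provides direct utility and \(\chi = 0\) to the case in which spending at the ZLB is generically welfare-detrimental.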