We document two new facts about the distributions of answers in famous statistical problems: they are i) multi-modal and ii) unstable with respect to irrelevant changes in the problem. We offer a model in which, when solving a problem, people represent each hypothesis by attending “bottom up” to its salient features while neglecting other, potentially more relevant, ones. Only the statistics associated with salient features are used; the rest are neglected. The model unifies the Gambler’s Fallacy, its variation by sample size, under- and overreaction in inference, and insensitivity to multiple signals, all as byproducts of selective attention. The model also makes new predictions about how controlled changes in the salience of specific features should jointly shape measured attention and biases. We test and confirm these predictions experimentally, including by measuring attention and documenting novel biases predicted by the model. Bottom-up attention to features emerges as a unifying framework for biases conventionally explained using a variety of stable heuristics or distortions of Bayes’ rule.
Author(s): Pedro Bordalo, John Conlon, Nicola Gennaioli, Spencer Kwon, and Andrei Shleifer