## Cosma Shalizi Waterboards the Rev. Dr. Thomas Bayes

Bayesian inference gone horribly wrong. Cosma Shalizi:

Some Bayesian Finger-Puzzle Exercises, or: Often Wrong, Never In Doubt: The theme here is to construct some simple yet pointed examples where Bayesian inference goes wrong, though the data-generating processes are well-behaved, and the priors look harmless enough. In reality, however, there is no such thing as a prior without bias, and in these examples the bias is so strong that Bayesian learning reaches absurd conclusions....

Example 1:... The posterior estimate of the mean [of the generating process] thus wanders from being close to +1 to being close to -1 and back erratically, hardly ever spending time near zero, even though (from the law of large numbers) the sample mean [of the sufficient statistic] converges to [the true mean of the generating process of] zero....
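The excerpt elides Shalizi's exact construction, but the behavior he describes falls out of a minimal setup: a prior supported on just two hypotheses, N(+1, 1) and N(-1, 1), with data actually drawn from N(0, 1), which gets prior weight zero. The log posterior odds then accumulate 2x per observation, an unbiased random walk, so the posterior probability lurches between near-certainty in +1 and near-certainty in -1. A sketch under those assumptions:

```python
import math
import random

random.seed(0)


def sigmoid(t):
    """Numerically safe logistic function (avoids overflow for large |t|)."""
    if t >= 0:
        return 1.0 / (1.0 + math.exp(-t))
    z = math.exp(t)
    return z / (1.0 + z)


n = 10_000
log_odds = 0.0  # log posterior odds for mu = +1 versus mu = -1
trajectory = []
for _ in range(n):
    x = random.gauss(0.0, 1.0)  # true process: N(0, 1), outside the prior's support
    # log N(x; +1, 1) - log N(x; -1, 1) simplifies to 2x
    log_odds += 2.0 * x
    trajectory.append(sigmoid(log_odds))  # P(mu = +1 | data so far)

# Because log_odds is a zero-mean random walk with growing variance, the
# posterior spends almost all its time pinned near 0 or 1, never settling.
near_extreme = sum(p < 0.01 or p > 0.99 for p in trajectory) / n
print(near_extreme)
```

The fraction `near_extreme` comes out large (well over half the run), even though the sample mean is converging to the true value of zero the whole time.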

Example 2:... As we get more and more data, the sample mean converges almost surely to zero (by the law of large numbers), which here drives the mean and variance of the posterior to zero almost surely as well. In other words, the Bayesian becomes dogmatically certain that the data are distributed according to a standard Gaussian with mean 0 and variance 1. This is so even though the sample variance almost surely converges to the true variance, which is 2. This Bayesian, then, is certain that the data are really not that variable, and any time now will start settling down....
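Again the excerpt omits the details, but the dynamics can be reproduced with a standard conjugate setup (my assumption, not necessarily Shalizi's exact one): the model family is N(mu, 1) with a N(0, 1) prior on mu, while the data really come from a distribution with mean 0 and variance 2. The posterior on mu is then N(s/(n+1), 1/(n+1)) with s the sum of the observations, so it collapses onto mu = 0 with vanishing posterior variance, while the model's variance stays fixed at 1:

```python
import random
import statistics

random.seed(1)

# Model (assumed for illustration): x ~ N(mu, 1), prior mu ~ N(0, 1).
# Reality: x ~ N(0, 2) -- mean matches, variance is twice what the model allows.
n = 20_000
xs = [random.gauss(0.0, 2.0 ** 0.5) for _ in range(n)]

# Conjugate normal update with known model variance 1:
# posterior on mu is N(sum(xs) / (n + 1), 1 / (n + 1)).
post_mean = sum(xs) / (n + 1)
post_var = 1.0 / (n + 1)

# Meanwhile the sample variance tracks the truth.
sample_var = statistics.variance(xs)
print(post_mean, post_var, sample_var)
```

The posterior mean sits near 0 and the posterior variance near 1/(n+1), i.e. dogmatic certainty in N(0, 1), while the sample variance sits near 2, exactly the pathology in the quote.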

It is a violation of the Geneva Convention to force a Bayesian statistician to begin analysis with a prior that places a weight of zero on the true underlying generating process, isn't it?