Saturday, December 17, 2016
Independence and Exchangeability
Bayesian statistics differs from frequentist statistics in its treatment of unknown values. It regards probability as an epistemic concept: under this approach, unknown parameters are given a prior probability distribution. This contrasts with the frequentist approach, where parameters are regarded as unknown constants. Indeed, under the epistemic interpretation, the notion of an unknown constant is a contradiction in terms.
In classical frequentist statistics, samples are often assumed to consist of independent and identically distributed (iid) random variables, while in Bayesian statistics they can be treated as iid only conditionally on the parameter value; the unconditional dependence structure is instead captured by the notion of exchangeability. For example, coin tosses are independent given the numerical value of the probability of Heads, p. Without knowledge of the numerical value of p, the trials are merely exchangeable: they become independent only once the value of p is given. This is the essence of Bruno de Finetti's celebrated Representation Theorem from 1937.
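The coin-toss example can be checked by simulation. The sketch below (an illustrative assumption, not from the post: a Beta(2, 2) prior on p) draws p once per pair of tosses and estimates the marginal covariance between the two tosses. Marginally the tosses are positively correlated, with Cov(X1, X2) = Var(p) = 0.05 for this prior, whereas given a fixed p they would be uncorrelated:

```python
import random

random.seed(0)
N = 200_000

# Each pair of tosses shares one draw of p from a Beta(2, 2) prior
# (a hypothetical choice of prior for illustration).
pairs = []
for _ in range(N):
    p = random.betavariate(2, 2)
    x1 = 1 if random.random() < p else 0
    x2 = 1 if random.random() < p else 0
    pairs.append((x1, x2))

mean1 = sum(x for x, _ in pairs) / N
mean2 = sum(y for _, y in pairs) / N
cov = sum(x * y for x, y in pairs) / N - mean1 * mean2

# Marginal covariance is close to Var(p) = 0.05, i.e. clearly nonzero:
# the tosses are exchangeable but not independent when p is unknown.
print(f"marginal Cov(X1, X2) ~ {cov:.3f}")
```

Conditioning on a fixed value of p (i.e. drawing p once outside the loop) would drive the estimated covariance to zero, which is exactly the conditional independence described above.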
This theorem asserts that if the sequence $\mathbf{x}$ is exchangeable, then it can be represented as a Naive Bayes model whose latent parent variable plays the role of a meta-parameter: the $x_i$s are independent given the value of that parameter. In other words, the elements of $\mathbf{x}$ are iid conditional on the meta-parameter indexing the distribution of $\mathbf{x}$. Hence the Representation Theorem shows how statistical models emerge in a Bayesian context: under the hypothesis of exchangeability of the observables $\{X_i\}^{\infty}_{i=1}$, there exists a parameter $\Theta$ such that, given the value of $\Theta$, the observables are conditionally independent and identically distributed. Moreover, de Finetti's strong law shows that our opinion about the unobservable $\Theta$ is our opinion about the limit of $\bar{X}_n$ as $n$ tends to $\infty$.
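The strong-law statement can also be illustrated numerically. In the sketch below (again assuming, for illustration only, a Beta(2, 2) prior on the meta-parameter), $\Theta$ is drawn once and a long exchangeable sequence of tosses is generated given that value; the sample mean $\bar{X}_n$ then settles near the realized $\Theta$, not near the prior mean 0.5:

```python
import random

random.seed(1)

# Draw the meta-parameter Theta once from a Beta(2, 2) prior
# (an illustrative choice of prior, not prescribed by the theorem).
theta = random.betavariate(2, 2)

# Given Theta, the tosses are iid Bernoulli(theta).
n = 100_000
tosses = [1 if random.random() < theta else 0 for _ in range(n)]

xbar = sum(tosses) / n
# xbar converges to the realized theta as n grows: uncertainty about
# Theta is exactly uncertainty about the limit of the sample mean.
print(f"theta = {theta:.4f}, sample mean = {xbar:.4f}")
```

Running this with different seeds gives different realized values of $\Theta$, and in each case $\bar{X}_n$ tracks that realization, which is the sense in which beliefs about $\Theta$ are beliefs about the limiting frequency.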