Bayesian [bey-zee-uhn] probability is the likelihood that something will happen based on all available evidence. The more commonly understood concept of frequency (or frequentist) probability is the chance that something will happen based only on how often it has happened in the past. Rather than interpreting probability as merely the propensity of some phenomenon, Bayesian probability is a quantity assigned to represent a state of knowledge, or a state of belief. This allows probability to be applied to all sorts of propositions, not just ones that come with a reference class (historical data).
‘Prior probability’ is the probability assigned to a hypothesis before the evidence is seen (e.g. a flipped coin has a 50% chance of landing on heads); the updated probability after the evidence is taken into account is the ‘posterior probability’ (e.g. if a coin lands on heads many times in a row, it is probably improperly weighted). The term ‘Bayesian’ refers to the 18th-century mathematician and theologian Thomas Bayes, who defined probability as follows: ‘The probability of any event is the ratio between the value at which an expectation depending on the happening of the event ought to be computed, and the value of the thing expected upon its happening.’ (In modern terms, Bayes’ theorem says the posterior probability is proportional to the prior probability multiplied by the likelihood of the evidence: P(H|E) = P(E|H) × P(H) / P(E).)
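To make the update rule concrete, here is a minimal Python sketch of the weighted-coin example above. The 1% prior and the assumption that a biased coin always lands on heads are illustrative choices, not figures from Bayes’ essay.

```python
# Bayes' theorem: P(H | E) = P(E | H) * P(H) / P(E)
# Hypothesis H: the coin is improperly weighted (here, always lands on heads).
# Evidence E: n_heads heads observed in a row.

def posterior_biased(prior_biased: float, n_heads: int) -> float:
    """Probability the coin is biased after seeing n_heads heads in a row."""
    likelihood_biased = 1.0            # a two-headed coin always shows heads
    likelihood_fair = 0.5 ** n_heads   # a fair coin shows n heads with probability (1/2)^n
    prior_fair = 1.0 - prior_biased
    evidence = likelihood_biased * prior_biased + likelihood_fair * prior_fair
    return likelihood_biased * prior_biased / evidence

# Assume a 1% prior that the coin is biased, then update as the heads pile up.
for n in (1, 5, 10):
    print(f"after {n:2d} heads in a row: P(biased) = {posterior_biased(0.01, n):.3f}")
```

After ten straight heads the posterior climbs above 90%, which is the ‘probably improperly weighted’ intuition made quantitative.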
Broadly speaking, there are two views on Bayesian probability that interpret the concept in different ways. According to the objectivist view, the rules of Bayesian statistics can be justified by requirements of rationality and consistency and interpreted as an extension of logic. For objectivists, probability measures the plausibility of propositions: the probability of a proposition is the degree of belief that anyone (even a ‘robot’) with the same knowledge should hold, in accordance with the rules of Bayesian statistics. In fact, many modern machine learning methods are based on objectivist Bayesian principles. According to the subjectivist view, probability quantifies a ‘personal belief.’ For subjectivists, rationality and coherence constrain the probabilities a subject may have, but allow for substantial variation within those constraints.
The objective and subjective variants of Bayesian probability differ mainly in how they interpret and construct the prior probability. Both approaches use random variables, or, more generally, unknown quantities, to model all sources of uncertainty in statistical models, including uncertainty resulting from lack of information. For the frequentist, a hypothesis is a proposition (which must be either true or false), so the frequentist probability that a hypothesis is true is either one or zero. In Bayesian statistics, a probability anywhere between zero and one can be assigned to a hypothesis.
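To see the last contrast in action, the sketch below assigns an actual number to the hypothesis ‘the coin favors heads’. It assumes SciPy is available, a uniform Beta(1, 1) prior, and made-up data of 8 heads in 10 flips.

```python
# With a Beta(1, 1) (uniform) prior on a coin's heads-probability p, observing
# k heads in n flips gives a Beta(1 + k, 1 + n - k) posterior (conjugate update).
# A Bayesian can then put a probability on the hypothesis "p > 0.5", which a
# frequentist treats as a fixed fact that is simply true or false.
from scipy.stats import beta

k, n = 8, 10                      # illustrative data: 8 heads in 10 flips
post = beta(1 + k, 1 + n - k)     # posterior distribution over p
print(f"P(p > 0.5 | data) = {post.sf(0.5):.3f}")   # sf(x) = 1 - CDF(x)
```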
In the 1980s, there was a dramatic growth in research and applications of Bayesian methods, attributed mostly to the discovery of Markov chain Monte Carlo methods (a class of algorithms for sampling from a probability distribution), which removed many of the computational obstacles to applying Bayesian methods, and to an increasing interest in nonstandard, complex applications.
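As a rough illustration of what an MCMC method does, here is a minimal Metropolis sampler. The standard normal target and the step size are arbitrary choices made for the example; in real applications the target is a posterior distribution that cannot be sampled directly and is known only up to a normalizing constant.

```python
# Minimal Metropolis sampler: draws approximate samples from a density that is
# known only up to a constant factor (here a standard normal, for illustration).
import math
import random

def unnormalized_target(x: float) -> float:
    return math.exp(-0.5 * x * x)   # standard normal shape, constant omitted

def metropolis(n_samples: int, step: float = 1.0) -> list[float]:
    samples, x = [], 0.0
    for _ in range(n_samples):
        proposal = x + random.gauss(0.0, step)   # symmetric random-walk proposal
        accept = min(1.0, unnormalized_target(proposal) / unnormalized_target(x))
        if random.random() < accept:             # accept the move, or stay put
            x = proposal
        samples.append(x)
    return samples

draws = metropolis(50_000)
print(f"sample mean = {sum(draws) / len(draws):.3f} (should be near 0)")
```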
The use of Bayesian probabilities as the basis of Bayesian inference has been supported by several arguments, such as the Dutch book argument proposed by the 20th-century Italian statistician Bruno de Finetti. A Dutch book is made when a clever gambler places a set of bets that guarantee a profit, no matter what the outcome. If a bookmaker follows the rules of the Bayesian calculus in the construction of his odds, a Dutch book cannot be made against him.
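A small numerical sketch of the argument, with made-up decimal odds: if the bookmaker’s implied probabilities for a set of mutually exclusive, exhaustive outcomes sum to less than one (i.e. they violate the probability calculus), a gambler who spreads a stake across every outcome in proportion to those implied probabilities collects more than the stake no matter which outcome occurs.

```python
# Illustrative (not real) decimal odds on three mutually exclusive outcomes.
# Implied probability = 1 / decimal odds; here the implied probabilities sum
# to less than 1, so the odds are incoherent and a Dutch book is possible.
decimal_odds = {"team A wins": 2.2, "draw": 4.0, "team B wins": 4.0}
implied = {outcome: 1.0 / odds for outcome, odds in decimal_odds.items()}
total = sum(implied.values())
print(f"implied probabilities sum to {total:.3f}")

stake = 100.0
bets = {outcome: stake * p / total for outcome, p in implied.items()}
for outcome, odds in decimal_odds.items():
    payout = bets[outcome] * odds     # only the bet on the realized outcome pays
    print(f"if '{outcome}': payout = {payout:.2f} against a total stake of {stake:.2f}")
```

Every branch pays back about 104.76 on a stake of 100: a guaranteed profit that disappears as soon as the implied probabilities sum to exactly one.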