Frequentist vs Bayesian Probability

Frequentist

Basic notion of probability: the long-run relative frequency of an event, # Results / # Attempts.
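
A minimal sketch of this notion in Python; the fair coin and the trial count are assumptions chosen only for illustration:

```python
import random

# Frequentist estimate: relative frequency of an event over repeated trials.
trials = 100_000
heads = sum(random.random() < 0.5 for _ in range(trials))  # simulated fair coin
print(heads / trials)  # ≈ 0.5 for a large number of trials
```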

Bayesian

Probability is not a single number, but a distribution itself: a degree of belief that is updated as evidence arrives.


Random Variable

In probability and statistics, a random variable (also random quantity, aleatory variable, or stochastic variable) is a variable whose value is subject to variation due to chance (i.e. randomness, in a mathematical sense). Like other mathematical variables, a random variable can take on a set of possible different values; unlike them, each value has an associated probability.

Expectation (Expected Value) of a Random Variable:

$$E[X] = \sum_{i} x_i\, p_i$$

where $x_i$ are the possible values of $X$ and $p_i = P(X = x_i)$.

The same, for a continuous variable with density $f(x)$:

$$E[X] = \int_{-\infty}^{\infty} x\, f(x)\, dx$$
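
A quick worked example of the discrete case in Python, assuming a fair six-sided die:

```python
# Expected value of a discrete random variable: a fair six-sided die.
outcomes = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6  # each face equally likely

expectation = sum(x * p for x, p in zip(outcomes, probs))
print(expectation)  # 3.5
```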

Independence

Two events are independent, statistically independent, or stochastically independent if the occurrence of one does not affect the probability of the other.

$$P(A \cap B) = P(A)\, P(B)$$
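
A small Python check of this product rule; the two fair dice and the chosen events are assumptions for the example:

```python
from itertools import product

# Sample space: all ordered pairs from two fair dice.
space = set(product(range(1, 7), repeat=2))

A = {s for s in space if s[0] % 2 == 0}  # first die is even
B = {s for s in space if s[1] == 6}      # second die shows 6

def p(event):
    return len(event) / len(space)

print(p(A & B), p(A) * p(B))  # both ≈ 0.0833, so A and B are independent
```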

Conditional Probability

$$P(A \mid B) = \frac{P(A \cap B)}{P(B)}$$
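
A minimal Python sketch of the same definition, again assuming two fair dice:

```python
from itertools import product

space = set(product(range(1, 7), repeat=2))
A = {s for s in space if s[0] == 3}    # first die shows 3
B = {s for s in space if sum(s) == 7}  # the two dice sum to 7

def p(event):
    return len(event) / len(space)

print(p(A & B) / p(B))  # P(A | B) = (1/36) / (6/36) ≈ 0.1667
```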

Bayes' Theorem (also Bayes' rule or Bayes' law)

Simple form:

$$P(A \mid B) = \frac{P(B \mid A)\, P(A)}{P(B)}$$

With the Law of Total Probability:

$$P(A_j \mid B) = \frac{P(B \mid A_j)\, P(A_j)}{\sum_{i} P(B \mid A_i)\, P(A_i)}$$

where the events $A_i$ partition the sample space.
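
A worked example in Python; the prevalence and test error rates below are made-up numbers, chosen only to illustrate the formula:

```python
# Hypothetical screening test, purely for illustration.
p_disease = 0.01             # prior P(A): prevalence
p_pos_given_disease = 0.95   # likelihood P(B | A): sensitivity
p_pos_given_healthy = 0.05   # P(B | not A): false-positive rate

# Denominator via the law of total probability: P(B) = Σ P(B|A_i) P(A_i)
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

posterior = p_pos_given_disease * p_disease / p_pos
print(posterior)  # ≈ 0.161: a positive result still leaves P(disease) low
```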

Marginalisation

The marginal distribution of a subset of a collection of random variables is the probability distribution of the variables contained in the subset. It gives the probabilities of various values of the variables in the subset without reference to the values of the other variables.

Continuous:

$$p_X(x) = \int p_{X,Y}(x, y)\, dy$$

Discrete:

$$P(X = x) = \sum_{y} P(X = x,\, Y = y)$$
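
A minimal Python sketch of the discrete case; the joint table values are arbitrary numbers that sum to 1:

```python
# Joint distribution P(X, Y) as a table (made-up values).
joint = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

# Marginal P(X = x) = Σ_y P(X = x, Y = y): sum out Y.
marginal_x = {}
for (x, y), p in joint.items():
    marginal_x[x] = marginal_x.get(x, 0.0) + p

print(marginal_x)  # ≈ {0: 0.3, 1: 0.7}
```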

Law of Total Probability

The law of total probability is a fundamental rule relating marginal probabilities to conditional probabilities. It expresses the total probability of an outcome which can be realized via several distinct events, hence the name.

$$P(B) = \sum_{i} P(B \mid A_i)\, P(A_i)$$
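
A small Python illustration, assuming a made-up two-factory scenario as the partition:

```python
# An item comes from exactly one of two factories (a partition of the space).
p_factory = {"A": 0.6, "B": 0.4}         # P(A_i), sums to 1
p_defect_given = {"A": 0.01, "B": 0.03}  # P(B | A_i): defect rate per factory

# P(B) = Σ_i P(B | A_i) P(A_i)
p_defect = sum(p_defect_given[f] * p_factory[f] for f in p_factory)
print(p_defect)  # ≈ 0.018
```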

Chain Rule

The chain rule permits the calculation of any member of the joint distribution of a set of random variables using only conditional probabilities.

Two events

$$P(A \cap B) = P(A \mid B)\, P(B)$$

More than two events

For more than two events $A_1, \ldots, A_n$ the chain rule extends to the formula

$$P(A_n \cap \ldots \cap A_1) = P(A_n \mid A_{n-1} \cap \ldots \cap A_1) \cdot P(A_{n-1} \cap \ldots \cap A_1)$$

which by induction may be turned into

$$P(A_n \cap \ldots \cap A_1) = \prod_{k=1}^{n} P\left(A_k \,\Bigg|\, \bigcap_{j=1}^{k-1} A_j\right)$$

Example

With four events $A_1, A_2, A_3, A_4$, the chain rule is

$$\begin{aligned}
P(A_4 \cap A_3 \cap A_2 \cap A_1) &= P(A_4 \mid A_3 \cap A_2 \cap A_1) \cdot P(A_3 \cap A_2 \cap A_1) \\
&= P(A_4 \mid A_3 \cap A_2 \cap A_1) \cdot P(A_3 \mid A_2 \cap A_1) \cdot P(A_2 \cap A_1) \\
&= P(A_4 \mid A_3 \cap A_2 \cap A_1) \cdot P(A_3 \mid A_2 \cap A_1) \cdot P(A_2 \mid A_1) \cdot P(A_1)
\end{aligned}$$
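
A concrete instance in Python (the card-drawing setup is an assumed example): the probability of drawing four aces in a row from a standard 52-card deck, where each chain-rule factor conditions on the earlier draws:

```python
from math import prod

# P(A4 ∩ A3 ∩ A2 ∩ A1) for "the k-th card drawn is an ace", drawing without
# replacement; each factor is P(A_k | A_{k-1} ∩ ... ∩ A_1), since earlier
# draws remove aces and cards from the deck.
factors = [(4 - k) / (52 - k) for k in range(4)]  # 4/52, 3/51, 2/50, 1/49
print(prod(factors))  # ≈ 3.69e-06
```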

Bayesian Inference

Bayesian inference derives the posterior probability as a consequence of two antecedents: a prior probability and a “likelihood function” derived from a statistical model for the observed data. Bayesian inference computes the posterior probability according to Bayes’ theorem. It can be applied iteratively to update the confidence in a hypothesis as new evidence arrives.

$$P(H \mid E) = \frac{P(E \mid H)\, P(H)}{P(E)}$$

where $H$ is the hypothesis, $E$ the observed evidence, $P(H)$ the prior, $P(E \mid H)$ the likelihood, and $P(H \mid E)$ the posterior.
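
A minimal sketch of such iterative updating in Python; the two hypotheses, their likelihoods, and the observed flips are all assumed for illustration:

```python
# Two competing hypotheses about a coin, updated after each observed flip.
priors = {"fair": 0.5, "biased": 0.5}  # P(H)
likelihood = {                          # P(E | H) for a single flip
    "fair":   {"heads": 0.5, "tails": 0.5},
    "biased": {"heads": 0.9, "tails": 0.1},
}

posterior = dict(priors)
for flip in ["heads", "heads", "tails", "heads"]:
    # P(E) via the law of total probability, then Bayes' theorem per hypothesis;
    # yesterday's posterior serves as today's prior.
    evidence = sum(likelihood[h][flip] * posterior[h] for h in posterior)
    posterior = {h: likelihood[h][flip] * posterior[h] / evidence
                 for h in posterior}

print(posterior)  # the posterior shifts toward "biased" after mostly heads
```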