Frequentist vs Bayesian Probability
Frequentist
Basic notion of probability: probability of an event = (# favourable results) / (# attempts), i.e. its long-run relative frequency over repeated trials.
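Under this view a probability can be approximated by counting outcomes over many repeated trials; a minimal simulation sketch (the fair coin and the number of attempts are arbitrary choices for illustration):

```python
import random

# Frequentist estimate: probability ≈ (# favourable results) / (# attempts).
# Hypothetical example: estimate P(heads) of a fair coin by simulation.
random.seed(0)
attempts = 100_000
heads = sum(random.random() < 0.5 for _ in range(attempts))
estimate = heads / attempts
print(round(estimate, 2))  # close to 0.5
```

As the number of attempts grows, the relative frequency converges toward the true probability (the law of large numbers).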
Bayesian
A probability is not a single fixed number but a degree of belief, itself described by a distribution that is updated as new evidence arrives.
[1] www.behind-the-enemy-lines.com/2008/01/are-you-bayesian-or-frequentist-or.html
Random Variable
In probability and statistics, a random variable (also random quantity, aleatory variable or stochastic variable) is a variable whose value is subject to variation due to chance (i.e. randomness, in a mathematical sense). Unlike an ordinary mathematical variable, it can take on a set of possible different values, each with an associated probability.
Expectation (Expected Value) of a Random Variable:
E[X] = Σ_x x · P(X = x)
Same, for continuous variables:
E[X] = ∫ x · f(x) dx
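The discrete formula can be checked directly; a small sketch using a fair six-sided die as the example:

```python
# Expectation of a discrete random variable: E[X] = sum over x of x * P(X = x).
# Example: a fair six-sided die, each face with probability 1/6.
values = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6

expectation = sum(x * p for x, p in zip(values, probs))
print(round(expectation, 6))  # 3.5
```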
Independence
Two events are independent (also statistically or stochastically independent) if the occurrence of one does not affect the probability of the other, i.e. P(A ∩ B) = P(A) · P(B).
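The product rule for independent events can be verified on a concrete sample space; a sketch using two fair dice (the events A and B below are arbitrary illustrative choices):

```python
from fractions import Fraction

# Two dice rolls: the outcome of one die does not affect the other.
# Check P(A and B) == P(A) * P(B) for
#   A = "first die shows 6", B = "second die is even".
outcomes = [(i, j) for i in range(1, 7) for j in range(1, 7)]
p = Fraction(1, len(outcomes))  # each of the 36 outcomes is equally likely

p_a = sum(p for (i, j) in outcomes if i == 6)
p_b = sum(p for (i, j) in outcomes if j % 2 == 0)
p_ab = sum(p for (i, j) in outcomes if i == 6 and j % 2 == 0)

print(p_ab == p_a * p_b)  # True
```

Exact fractions avoid floating-point noise: P(A) = 1/6, P(B) = 1/2 and P(A ∩ B) = 1/12 match exactly.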
Conditionality
The conditional probability of A given B is P(A | B) = P(A ∩ B) / P(B): the probability of A once B is known to have occurred.
Bayes Theorem (rule, law)
Simple form:
P(A | B) = P(B | A) · P(A) / P(B)
With Law of Total Probability:
P(A | B) = P(B | A) · P(A) / Σ_i P(B | A_i) · P(A_i)
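A numerical sketch of Bayes' theorem with the total-probability denominator; the sensitivity, false-positive rate and prior below are hypothetical numbers chosen for illustration:

```python
# Bayes' theorem with the law of total probability in the denominator:
# P(A|B) = P(B|A) P(A) / (P(B|A) P(A) + P(B|~A) P(~A)).
# Hypothetical numbers: a test with 99% sensitivity, a 5% false-positive
# rate, and a condition with 1% prior probability.
p_a = 0.01              # prior P(A)
p_b_given_a = 0.99      # P(B|A), sensitivity
p_b_given_not_a = 0.05  # P(B|~A), false-positive rate

p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)  # total probability of B
posterior = p_b_given_a * p_a / p_b
print(round(posterior, 3))  # about 0.167
```

Even with a positive result, the posterior stays modest because the prior is small: most positives come from the much larger healthy population.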
Marginalisation
The marginal distribution of a subset of a collection of random variables is the probability distribution of the variables contained in the subset. It gives the probabilities of various values of the variables in the subset without reference to the values of the other variables.
Continuous:
p(x) = ∫ p(x, y) dy
Discrete:
P(X = x) = Σ_y P(X = x, Y = y)
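For the discrete case, marginalising is just summing the joint table over the other variable; a sketch with a hypothetical joint distribution:

```python
# Marginal distribution: sum the joint distribution over the other variable,
# P(X = x) = sum over y of P(X = x, Y = y).
# Hypothetical joint distribution over X in {0, 1} and Y in {0, 1, 2}.
joint = {
    (0, 0): 0.10, (0, 1): 0.20, (0, 2): 0.10,
    (1, 0): 0.15, (1, 1): 0.25, (1, 2): 0.20,
}

marginal_x = {}
for (x, y), p in joint.items():
    marginal_x[x] = marginal_x.get(x, 0.0) + p

print({x: round(p, 2) for x, p in marginal_x.items()})  # {0: 0.4, 1: 0.6}
```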
Law of Total Probability
It is a fundamental rule relating marginal probabilities to conditional probabilities. It expresses the total probability of an outcome which can be realized via several distinct events, hence the name:
P(B) = Σ_i P(B | A_i) · P(A_i), where the events A_i partition the sample space.
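A small sketch of the rule; the factory shares and defect rates below are hypothetical:

```python
# Law of total probability: P(B) = sum over i of P(B | A_i) * P(A_i),
# where the events A_i partition the sample space.
# Hypothetical example: three factories produce 50%, 30% and 20% of all
# items, with defect rates of 1%, 2% and 3% respectively.
p_factory = [0.5, 0.3, 0.2]          # P(A_i)
p_defect_given = [0.01, 0.02, 0.03]  # P(B | A_i)

p_defect = sum(pb * pa for pb, pa in zip(p_defect_given, p_factory))
print(round(p_defect, 4))  # 0.017
```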
Chain Rule
Permits the calculation of any member of the joint distribution of a set of random variables using only conditional probabilities.
Two events
P(A, B) = P(A | B) · P(B)
More than two events
For more than two events A_1, …, A_n the chain rule extends to the formula
P(A_n, …, A_1) = P(A_n | A_{n-1}, …, A_1) · P(A_{n-1}, …, A_1)
which by induction may be turned into
P(A_n, …, A_1) = ∏_{k=1}^{n} P(A_k | A_{k-1}, …, A_1)
Example
With four events (n = 4), the chain rule is
P(A_4, A_3, A_2, A_1) = P(A_4 | A_3, A_2, A_1) · P(A_3 | A_2, A_1) · P(A_2 | A_1) · P(A_1)
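The four-event factorisation can be checked numerically; the joint distribution below is hypothetical, built from arbitrary weights:

```python
import itertools

# Numerical check of the chain rule for four binary events:
# P(A4, A3, A2, A1) = P(A4|A3,A2,A1) * P(A3|A2,A1) * P(A2|A1) * P(A1).
# Hypothetical joint distribution: arbitrary positive weights, normalised.
weights = {bits: 1.0 + sum(bits) for bits in itertools.product([0, 1], repeat=4)}
total = sum(weights.values())
joint = {bits: w / total for bits, w in weights.items()}

def prob(partial):
    """Probability that the first len(partial) events take the given values."""
    return sum(p for bits, p in joint.items() if bits[:len(partial)] == partial)

a = (1, 0, 1, 1)  # one particular assignment of (A1, A2, A3, A4)
p_a1 = prob(a[:1])
p_a2_given = prob(a[:2]) / prob(a[:1])
p_a3_given = prob(a[:3]) / prob(a[:2])
p_a4_given = prob(a) / prob(a[:3])

chain = p_a4_given * p_a3_given * p_a2_given * p_a1
print(abs(chain - joint[a]) < 1e-12)  # True
```

Each factor is a conditional probability obtained as a ratio of marginals, and their product telescopes back to the joint probability.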

Bayesian Inference
Bayesian inference derives the posterior probability from two antecedents: a prior probability and a "likelihood function" derived from a statistical model of the observed data. The posterior is computed according to Bayes' theorem. The procedure can be applied iteratively, so that the posterior after one observation becomes the prior for the next, updating the confidence in our hypothesis as data accumulate.
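A minimal sketch of iterative updating, assuming a hypothetical choice between a fair and a biased coin and an assumed sequence of flips:

```python
# Iterative Bayesian updating: the posterior after one observation becomes
# the prior for the next. Hypothetical example: infer whether a coin is
# fair (P(heads) = 0.5) or biased (P(heads) = 0.8) from a run of flips.
hypotheses = {"fair": 0.5, "biased": 0.8}
prior = {"fair": 0.5, "biased": 0.5}  # initial confidence in each hypothesis

observations = ["H", "H", "T", "H", "H", "H"]  # assumed data
for obs in observations:
    # likelihood of the observation under each hypothesis
    likelihood = {h: (p if obs == "H" else 1 - p) for h, p in hypotheses.items()}
    # unnormalised posterior, then normalise (Bayes' theorem)
    unnorm = {h: likelihood[h] * prior[h] for h in hypotheses}
    evidence = sum(unnorm.values())  # law of total probability
    prior = {h: unnorm[h] / evidence for h in hypotheses}

print({h: round(p, 3) for h, p in prior.items()})
```

With mostly-heads data the confidence shifts toward the biased hypothesis, while the two posteriors always sum to one.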