(c·b|a) = F[(c|b·a), (b|a)]    (Assumption 1)

Consistency with the Boolean algebra of propositions then forces F to satisfy an associativity relation, whose general solution can be written:

C f[(c·b|a)] = f[(c|b·a)] f[(b|a)]
where f is a function of one variable and C is a constant. For the trivial choice of function and constant there is:

p(c,b|a) = p(c|b,a) p(b|a)
which is the conventional rule for conditional probabilities (and c‐and‐b|a is rewritten as p(c,b|a), etc.). The second assumption relates the likelihoods of propositions b and ~b when the proposition a is known to be true:

(~b|a) = S[(b|a)]    (Assumption 2)

for some function S. Consistency with the Boolean algebra of propositions then forces two relations on S:

S[S(x)] = x

x S[S(y)/x] = y S[S(x)/y]
which together can be solved to give:

S(x) = (1 - x^m)^(1/m)

for some positive constant m, which for the conventional choice (m = 1) is the familiar sum rule p(b|a) + p(~b|a) = 1.
The product rule from Assumption 1 and the sum rule from Assumption 2 can now be combined. Writing p(c,b|a) = p(b,c|a) and expanding each side with the product rule gives p(c|b,a) p(b|a) = p(b|c,a) p(c|a), and dividing through by p(b|a):

p(c|b,a) = p(b|c,a) p(c|a) / p(b|a)
to obtain the classic Bayes Theorem.
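
As a concrete check, the short Python/NumPy sketch below (an illustration, not from the text) builds an arbitrary joint distribution over two binary propositions b and c, with the background information a left implicit, and verifies numerically that the product rule, the sum rule, and Bayes' Theorem all hold when every probability is computed from the same joint table:

import numpy as np

rng = np.random.default_rng(0)

# joint[i, j] = p(b = i, c = j | a): an arbitrary normalized 2 x 2 table
joint = rng.random((2, 2))
joint /= joint.sum()

p_b = joint.sum(axis=1)               # p(b|a), marginal over c
p_c = joint.sum(axis=0)               # p(c|a), marginal over b
p_c_given_b = joint / p_b[:, None]    # p(c|b,a)
p_b_given_c = joint.T / p_c[:, None]  # p(b|c,a)

# product rule (Assumption 1, trivial choice): p(c,b|a) = p(c|b,a) p(b|a)
assert np.allclose(joint, p_c_given_b * p_b[:, None])

# sum rule (Assumption 2, trivial choice): p(b|a) + p(~b|a) = 1
assert np.isclose(p_b.sum(), 1.0)

# Bayes' Theorem: p(c|b,a) = p(b|c,a) p(c|a) / p(b|a)
bayes_rhs = p_b_given_c.T * p_c[None, :] / p_b[:, None]
assert np.allclose(p_c_given_b, bayes_rhs)

All three identities hold by construction: once likelihoods are represented by a normalized joint table, the Cox assumptions (in their trivial-choice form) are satisfied automatically.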

2.5.2 Bayes' Rule

Bayes' rule is obtained from the property of conditional probability:

p(xi, yj) = p(xi|yj) p(yj) = p(yj|xi) p(xi)

p(xi|yj) = p(yj|xi) p(xi) / p(yj), where p(yj) = Σk p(yj|xk) p(xk)

Bayes' Rule provides an update rule for probability distributions in response to observed information. Terminology:

p(xi) is referred to as the “prior distribution on X” in this context.

p(xi | yj) is referred to as the “posterior distribution on X given Y.”
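
To make the update concrete, here is a minimal Python sketch of the rule above (the function name and the numbers are hypothetical, not taken from the book): the prior p(xi) is multiplied by the likelihood of the observed outcome yj and renormalized by p(yj) = Σk p(yj|xk) p(xk) to produce the posterior p(xi | yj).

import numpy as np

def bayes_update(prior, likelihood, j):
    # prior:      shape (nx,),    prior[i] = p(x_i)
    # likelihood: shape (nx, ny), likelihood[i, j] = p(y_j | x_i)
    # j:          index of the observed outcome y_j
    unnormalized = likelihood[:, j] * prior      # p(y_j | x_i) p(x_i)
    return unnormalized / unnormalized.sum()     # divide by p(y_j)

# hypothetical example: three hypotheses, two possible observations
prior = np.array([0.5, 0.3, 0.2])                # p(x_i)
likelihood = np.array([[0.9, 0.1],               # p(y_j | x_0)
                       [0.5, 0.5],               # p(y_j | x_1)
                       [0.1, 0.9]])              # p(y_j | x_2)

posterior = bayes_update(prior, likelihood, j=1) # observe y_1
print(posterior)  # probability shifts toward the hypotheses that favor y_1

The posterior from one observation can then serve as the prior for the next observation, which is the sense in which Bayes' Rule is an update rule.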

2.5.3 Estimation Based on Maximal Conditional Probabilities
