Machine Learning (3): Bayesian Learning (1)

Conditional Probability
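In standard notation, for events A and B with P(B) > 0:

$$P(A \mid B) = \frac{P(A \cap B)}{P(B)}$$

Equivalently, P(A ∩ B) = P(A | B) P(B) (the product rule).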


Bayes Theorem
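In the hypothesis/data notation common in machine learning, for hypothesis h and observed data D:

$$P(h \mid D) = \frac{P(D \mid h)\,P(h)}{P(D)}$$

Here P(h) is the prior, P(D | h) the likelihood, and P(h | D) the posterior used to rank hypotheses.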


Conditional Probability under Independence
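If A and B are independent, conditioning on one tells us nothing about the other:

$$P(A \mid B) = P(A), \qquad P(B \mid A) = P(B).$$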

In particular, independence is equivalent to the product form P(A ∩ B) = P(A) P(B).

Conditional Independence
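A is conditionally independent of B given C when, once C is known, B carries no further information about A:

$$P(A \mid B, C) = P(A \mid C),$$

or equivalently P(A, B | C) = P(A | C) P(B | C). This is the assumption naive Bayes (below) applies to the attributes given the class.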


Bayesian Learning

Estimating probabilities is essentially a matter of counting the
occurrences of particular combinations of values in the
training data set.
That is the basic way of estimating probabilities. But when a combination has a low count, its estimated probability is close to (or exactly) zero, so we need a way around this; one standard fix is Laplace (add-one) smoothing, sketched below.
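A minimal sketch of count-based estimation with and without add-one smoothing; the labels and data here are illustrative, not from the slides:

```python
from collections import Counter

# Toy training labels: which class each example belongs to. Illustrative only.
labels = ["yes", "yes", "yes", "no"]

counts = Counter(labels)
n = len(labels)

# Basic estimate: relative frequency. An unseen value gets probability 0.
p_maybe = counts["maybe"] / n          # 0.0 -- the low-count problem

# Laplace (add-one) smoothing over k possible values avoids zeros.
values = ["yes", "no", "maybe"]
k = len(values)
p_smoothed = {v: (counts[v] + 1) / (n + k) for v in values}

print(p_maybe)      # 0.0
print(p_smoothed)   # approx {'yes': 0.571, 'no': 0.286, 'maybe': 0.143}
```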

Complete Bayes Classifiers

But the complete Bayes classifier needs a great many examples to support it. We can estimate how many: it learns P(a_1, ..., a_n | v) separately for every combination of attribute values, and the number of combinations grows exponentially with the number of attributes, so even simple problems demand impractically large training sets. This method is not useful for most situations.
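As a rough count (a hypothetical case, not from the slides): with two classes and n binary attributes the classifier must estimate

$$2 \times 2^{n} \text{ probabilities}, \quad \text{e.g. } 2 \times 2^{10} = 2048 \text{ for } n = 10,$$

and each one needs several training examples of its own before its frequency estimate is reliable.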

Naive Bayes Classifiers

The complete Bayes classifier is impractical because so much
data is required to estimate the conditional probabilities.
Can we get round this problem by finding a much more
economical way to estimate them?
The naive Bayes classifier assumes the attributes are conditionally independent given the class, so that P(a_1, ..., a_n | v) = P(a_1 | v) * ... * P(a_n | v). You can see the gain: each factor is estimated on its own from simple one-attribute counts, so far fewer examples are needed to complete the prediction.
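A minimal sketch of such a classifier over discrete attributes, reusing add-one smoothing; the toy data and attribute layout are illustrative:

```python
from collections import Counter, defaultdict

# Toy data: each example is (attribute values, class label). Illustrative only.
examples = [
    (("sunny", "hot"), "no"),
    (("sunny", "mild"), "yes"),
    (("rain",  "mild"), "yes"),
    (("rain",  "hot"), "no"),
    (("sunny", "mild"), "yes"),
]

classes = Counter(label for _, label in examples)
# counts[i][v][a] = how often attribute i takes value a in class v.
counts = defaultdict(lambda: defaultdict(Counter))
values = defaultdict(set)   # distinct values per attribute, for smoothing
for attrs, label in examples:
    for i, a in enumerate(attrs):
        counts[i][label][a] += 1
        values[i].add(a)

def predict(attrs):
    """Pick the class maximizing P(v) * prod_i P(a_i | v), with add-one smoothing."""
    best, best_score = None, -1.0
    for v, n_v in classes.items():
        score = n_v / len(examples)                     # prior P(v)
        for i, a in enumerate(attrs):
            k = len(values[i])
            score *= (counts[i][v][a] + 1) / (n_v + k)  # smoothed P(a_i | v)
        if score > best_score:
            best, best_score = v, score
    return best

print(predict(("rain", "hot")))   # 'no' on this toy data
```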

Numeric Attributes

Two types of solution:
<1> Discretization: split the numeric range into intervals and treat each interval as one discrete value.
<2> Assume a distribution: e.g. fit a Gaussian to each class's values and use its density in place of the discrete P(a_i | v), as sketched below.
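A minimal sketch of option <2>, fitting a per-class Gaussian to one numeric attribute; the sample values are illustrative:

```python
import math

# Toy numeric attribute values per class (e.g. temperatures). Illustrative only.
samples = {"yes": [21.0, 23.5, 22.0, 24.0], "no": [29.0, 31.5, 30.0]}

def gaussian_density(x, mean, var):
    """Density of N(mean, var) at x, used in place of a discrete P(a_i | v)."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

# Fit mean and variance per class, then compare densities for a new value.
x = 28.0
for v, xs in samples.items():
    mean = sum(xs) / len(xs)
    var = sum((s - mean) ** 2 for s in xs) / (len(xs) - 1)  # sample variance
    print(v, gaussian_density(x, mean, var))
# The 'no' class gives the much higher density at x = 28.0 here.
```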

Bayesian Belief Networks

Build a model that meets two conditions:
<1> It specifies which conditional independence assumptions are valid.
<2> It provides sets of conditional probabilities to specify the joint probability distribution wherever dependencies exist.
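Together, the two conditions let the network write the joint distribution as a product of local conditionals:

$$P(x_1, \ldots, x_n) = \prod_{i=1}^{n} P\left(x_i \mid \mathrm{Parents}(x_i)\right)$$

As a sketch, a hypothetical two-node network Rain -> WetGrass needs only P(Rain) and P(WetGrass | Rain); the numbers below are made up:

```python
# CPTs for a hypothetical network Rain -> WetGrass; probabilities are made up.
p_rain = {True: 0.2, False: 0.8}
p_wet_given_rain = {True: {True: 0.9, False: 0.1},
                    False: {True: 0.2, False: 0.8}}

def joint(rain, wet):
    """P(Rain=rain, WetGrass=wet) = P(Rain) * P(WetGrass | Rain)."""
    return p_rain[rain] * p_wet_given_rain[rain][wet]

print(joint(True, True))   # 0.2 * 0.9 = 0.18
```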