@[toc]
A probabilistic framework for solving classification problems
Conditional probability:
$$P(C \mid A) = \frac{P(A, C)}{P(A)}, \qquad P(A \mid C) = \frac{P(A, C)}{P(C)}$$
Bayes theorem:
$$P(C \mid A) = \frac{P(A \mid C)\,P(C)}{P(A)}$$
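As a quick numeric check of the theorem (the probabilities below are hypothetical, chosen only for illustration):

```python
# Bayes theorem: P(C|A) = P(A|C) * P(C) / P(A)
p_c = 0.01         # prior P(C)          (hypothetical value)
p_a_given_c = 0.9  # likelihood P(A|C)   (hypothetical value)
p_a = 0.1          # evidence P(A)       (hypothetical value)

p_c_given_a = p_a_given_c * p_c / p_a  # posterior P(C|A)
print(p_c_given_a)  # 0.09
```

Note that even with a strong likelihood (0.9), a small prior (0.01) keeps the posterior modest.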
Consider each attribute and class label as random variables.
Given a record with attributes $(A_1, A_2, \ldots, A_n)$:
- Goal is to predict class $C$
- Specifically, we want to find the value of $C$ that maximizes $P(C \mid A_1, A_2, \ldots, A_n)$
Approach:
- Compute the posterior probability $P(C \mid A_1, A_2, \ldots, A_n)$ for all values of $C$ using the Bayes theorem:
$$P(C \mid A_1, A_2, \ldots, A_n) = \frac{P(A_1, A_2, \ldots, A_n \mid C)\,P(C)}{P(A_1, A_2, \ldots, A_n)}$$
- Choose the value of $C$ that maximizes $P(C \mid A_1, A_2, \ldots, A_n)$
- Equivalent to choosing the value of $C$ that maximizes $P(A_1, A_2, \ldots, A_n \mid C)\,P(C)$, since the denominator is the same for all values of $C$
Naive Bayes Classifier
Assume independence among attributes $A_i$ when the class is given:
$$P(A_1, A_2, \ldots, A_n \mid C_j) = \prod_{i=1}^{n} P(A_i \mid C_j)$$
- Can estimate $P(A_i \mid C_j)$ for all $A_i$ and $C_j$.
- A new point is classified to $C_j$ if $P(C_j)\prod_i P(A_i \mid C_j)$ is maximal.
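The classification rule above can be sketched for categorical attributes as follows (a minimal sketch, assuming a small hand-made dataset; function and variable names are illustrative, not from the original):

```python
from collections import Counter, defaultdict

def naive_bayes_classify(train, labels, x):
    """Pick the class maximizing P(C) * prod_i P(A_i | C).
    train: list of attribute tuples; labels: class label per tuple."""
    n = len(labels)
    class_counts = Counter(labels)
    # count occurrences of each attribute value per (position, class)
    cond = defaultdict(Counter)
    for row, c in zip(train, labels):
        for i, v in enumerate(row):
            cond[(i, c)][v] += 1
    best_c, best_score = None, -1.0
    for c, nc in class_counts.items():
        score = nc / n  # prior P(C)
        for i, v in enumerate(x):
            score *= cond[(i, c)][v] / nc  # P(A_i | C)
        if score > best_score:
            best_c, best_score = c, score
    return best_c

# toy usage (hypothetical data)
train = [("sunny", "hot"), ("sunny", "mild"), ("rain", "mild"), ("rain", "cool")]
labels = ["no", "no", "yes", "yes"]
print(naive_bayes_classify(train, labels, ("sunny", "hot")))  # no
```

Note that this raw-count version suffers from the zero-probability problem discussed below: any unseen (attribute value, class) pair zeroes out the whole product.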
How to Estimate Probabilities from Data
For continuous attributes:
- Discretize the range into bins
  - one ordinal attribute per bin
  - violates the independence assumption
- Two-way split: (A < v) or (A > v)
  - choose only one of the two splits as the new attribute
- Probability density estimation
  - Assume the attribute follows a normal distribution
  - Use data to estimate the parameters of the distribution (e.g., mean and standard deviation)
  - Once the probability distribution is known, use it to estimate the conditional probability $P(A_i \mid c_j)$
Normal distribution:
$$P(A_i \mid c_j) = \frac{1}{\sqrt{2\pi\sigma_{ij}^2}} \exp\left(-\frac{(A_i - \mu_{ij})^2}{2\sigma_{ij}^2}\right)$$
- One distribution for each $(A_i, c_j)$ pair
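The density-estimation step can be sketched like this (a minimal sketch; the sample values are hypothetical class-conditional observations, not from the original):

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Normal density N(mu, sigma^2) at x, used as the estimate of P(A_i | c_j)."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / math.sqrt(2 * math.pi * sigma ** 2)

# estimate mu and sigma from the attribute values observed within one class
samples = [110.0, 120.0, 130.0]  # hypothetical continuous attribute values for one class
mu = sum(samples) / len(samples)
var = sum((s - mu) ** 2 for s in samples) / (len(samples) - 1)  # sample variance
sigma = math.sqrt(var)

p = gaussian_pdf(125.0, mu, sigma)  # conditional density for a new record's value
```

Note that `p` is a density, not a probability, but that is fine here: only the relative magnitude across classes matters when picking the maximal product.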
If one of the conditional probabilities is zero, then the entire expression becomes zero.
Probability estimation:
$$\text{Original: } P(A_i \mid C) = \frac{N_{ic}}{N_c}$$
$$\text{Laplace: } P(A_i \mid C) = \frac{N_{ic} + 1}{N_c + c}$$
$$\text{m-estimate: } P(A_i \mid C) = \frac{N_{ic} + m\,p}{N_c + m}$$
where $c$ is the number of classes, $p$ is the prior probability, and $m$ is a parameter.
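The two smoothed estimators can be written directly from the formulas above (a minimal sketch; the counts and parameter values are hypothetical):

```python
def laplace_estimate(n_ic, n_c, num_classes):
    """Laplace: (N_ic + 1) / (N_c + c); never returns zero."""
    return (n_ic + 1) / (n_c + num_classes)

def m_estimate(n_ic, n_c, p, m):
    """m-estimate: (N_ic + m*p) / (N_c + m); pulls the estimate toward prior p."""
    return (n_ic + m * p) / (n_c + m)

# if an attribute value never co-occurs with a class (N_ic = 0),
# the raw estimate N_ic / N_c is 0, but the smoothed estimates are not:
print(laplace_estimate(0, 3, 2))      # 0.2  (= 1/5)
print(m_estimate(0, 3, p=0.5, m=4))   # ~0.2857 (= 2/7)
```

With `n_ic = 0` the raw estimate would zero out the whole product $P(C)\prod_i P(A_i \mid C)$; the smoothed versions keep every factor strictly positive.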
Naive Bayes (Summary)
- Robust to isolated noise points.
- Handles missing values by ignoring the instance during probability estimate calculations.
- Robust to irrelevant attributes.
- Independence assumption may not hold for some attributes
  - Use other techniques such as Bayesian Belief Networks (BBN)