Naive Bayes


Notes and Ideas:

  • Text classification
    • Naive Bayes is easy because we make a very strong (and wrong) assumption: the features are conditionally independent given the class.
  • Ex.
    • Basic spam classifier (a minimal code sketch appears right after this list):
      • simplest baseline: use the proportion of spam in the training set as the prediction, i.e. the prior P(spam)
      • better: estimate P(spam | X), where X is the vector of features extracted from our emails -> this conditional is what logistic regression models directly
      • the prior information comes from the training set
  • Bayes Theorem



    Example: from the prior information we have the true positive rate (sensitivity) P(+ | D) = 0.85, hence P(- | D) = 0.15, where D means the disease is actually present.

    We are also given the false positive rate P(+ | ¬D) = 0.1, hence P(- | ¬D) = 0.9, and the prevalence P(D) = 0.0001.

    Goal: find P(D | +), the probability of actually having the disease given a positive test, using only this prior information.


    Theorem:

    $$P(D \mid +) = \frac{P(+ \mid D)\,P(D)}{P(+)}$$

    Definition: P(D) is the prior, P(+ | D) the likelihood, and P(D | +) the posterior; the evidence P(+) expands by total probability as P(+) = P(+ | D) P(D) + P(+ | ¬D) P(¬D).

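    Plugging in the numbers from the example above (a quick check, assuming the false positive rate P(+ | ¬D) = 0.1 stated earlier):

    $$P(D \mid +) = \frac{0.85 \times 0.0001}{0.85 \times 0.0001 + 0.1 \times 0.9999} = \frac{0.000085}{0.100075} \approx 0.00085$$

    So even after a positive test, the probability of actually having the disease is only about 0.085%, because the disease is so rare to begin with.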
  • x = [x_1, x_2, ..., x_n] is the feature vector for one example (e.g. one email)
  • the naive assumption: P(x | C) = P(x_1 | C) · P(x_2 | C) · ... · P(x_n | C)
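
A minimal sketch of the spam-classifier idea from the example above, assuming scikit-learn is available; the tiny corpus and labels are made-up toy data:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# Made-up toy training emails and labels (1 = spam, 0 = ham)
emails = [
    "win money now",
    "limited offer win a prize",
    "meeting notes attached",
    "are we still on for lunch tomorrow",
]
labels = [1, 1, 0, 0]

# X = bag-of-words counts, the "features from our emails"
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(emails)

# Naive Bayes learns the class prior P(spam) and P(word | class) from the training set
clf = MultinomialNB()
clf.fit(X, labels)

# P(spam | X) for a new email, via Bayes' theorem
new = vectorizer.transform(["win a free prize now"])
print(clf.predict_proba(new))  # columns ordered as clf.classes_, i.e. [P(ham | X), P(spam | X)]
```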

Methods:

Bernoulli method:

  • each feature is modeled as a binary present/absent variable, with a per-class Bernoulli probability P(x_i | C) (see the sketch below)
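
A minimal from-scratch sketch of the Bernoulli idea, assuming Laplace smoothing and made-up toy data (the feature matrix and labels are illustrative only):

```python
import numpy as np

# Toy binary bag-of-words matrix: rows = emails, columns = "word present" flags.
# The data and the 1 = spam / 0 = ham labels are made up for illustration.
X = np.array([
    [1, 1, 0],  # spam
    [1, 0, 1],  # spam
    [0, 0, 1],  # ham
    [0, 1, 0],  # ham
])
y = np.array([1, 1, 0, 0])

classes = np.unique(y)
priors = {c: (y == c).mean() for c in classes}  # P(C) from the training set
# Per-class Bernoulli parameter for each feature, with Laplace (+1 / +2) smoothing
theta = {c: (X[y == c].sum(axis=0) + 1) / ((y == c).sum() + 2) for c in classes}

def log_posterior(x):
    """log P(C) + sum_i log P(x_i | C), using the naive independence assumption."""
    scores = {}
    for c in classes:
        likelihood = np.where(x == 1, theta[c], 1 - theta[c])
        scores[c] = np.log(priors[c]) + np.log(likelihood).sum()
    return scores

print(log_posterior(np.array([1, 0, 0])))  # higher (less negative) score wins
```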

TAGS

#classification