Saturday, January 29, 2005

An example of the Bayes Principle in Action

From the Wired article mentioned below. This sums it up pretty well:

In the language of statistics, Bayes' theorem relates phenomena observed in the present (the evidence) to phenomena known to have occurred in the past (the prior) and ideas about what is going to happen (the model). Mathematically, the formula is represented as


         P(y|t) P(t)
P(t|y) = -----------
            P(y)

For nonmathematicians, understanding the implications of the theorem without recourse to real-world metaphors is difficult. So imagine a short-order cook working in a busy café. In a din of conversation, clattering plates, and street noise, the waiters dash by the kitchen counter, shouting orders. What a customer actually ordered is t in the above equation. The garble the cook hears is y. A Bayesian decision process would allow the beleaguered chef to send out as many correct orders as possible.

He has some prior knowledge: Few customers order the smothered duck feet, while the steak frites is a perennial favorite. The cook can use information like this to calculate the prior probability that a given dish was ordered: P(t), or the probability of t. He also knows that one waiter tends to mispronounce any dish with a French name, that a bus roars past the restaurant every 10 minutes, and that the sizzling sounds from the skillets can obscure the difference between, say, ham and lamb. Taking these factors into account, he creates a complicated model of ways the orders might get confused: P(y|t), the probability of y, given t. Bayes' theorem allows the chef to chain the probabilities of all possible influences on what he hears to calculate the probability, P(t|y), that the order he heard was for the dish the customer chose.
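To make the arithmetic concrete, here's a quick sketch of that calculation in Python. The dishes, priors, and likelihoods are all made-up numbers, not from the Wired article; the point is just the mechanics of multiplying P(y|t) by P(t) and dividing by P(y).

# A toy version of the cafe example: given a garbled order y
# (the cook thinks he heard "lamb?"), compute P(t|y) for each
# dish t using Bayes' theorem. All numbers are invented.

priors = {            # P(t): how often each dish gets ordered
    "steak frites": 0.50,
    "ham": 0.25,
    "lamb": 0.20,
    "smothered duck feet": 0.05,
}

likelihoods = {       # P(y|t): chance dish t is heard as "lamb?"
    "steak frites": 0.01,
    "ham": 0.40,      # ham and lamb are easily confused
    "lamb": 0.55,
    "smothered duck feet": 0.02,
}

# P(y) is the normalizing constant: the sum of P(y|t) P(t) over all dishes
evidence = sum(likelihoods[t] * priors[t] for t in priors)

# Posterior P(t|y) for each dish, straight from the formula above
for t in priors:
    posterior = likelihoods[t] * priors[t] / evidence
    print(f"P({t} | heard 'lamb?') = {posterior:.3f}")

Run it and "lamb" comes out on top, but "ham" isn't far behind, which is exactly the kind of hedged answer you'd expect given how similar the two sound over the skillets.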


It's pretty easy to see how this type of model could be used to build an application that learns what's spam and what's not.
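As a rough sketch of that idea (not any particular filter's actual implementation), here's a tiny naive Bayes classifier in Python. The words in a message play the role of the garbled evidence y, and the label (spam or not) is t. The training messages are invented for illustration; a real filter would train on thousands of them.

from collections import Counter
import math

# Invented training data: word lists from known spam and known ham
spam_docs = [["cheap", "pills", "now"], ["win", "money", "now"]]
ham_docs  = [["lunch", "tomorrow"], ["meeting", "notes", "attached"]]

def train(docs):
    counts = Counter(w for d in docs for w in d)
    return counts, sum(counts.values())

spam_counts, spam_total = train(spam_docs)
ham_counts, ham_total = train(ham_docs)
vocab = set(spam_counts) | set(ham_counts)

def log_posterior(words, counts, total, prior):
    # log P(t) + sum of log P(word|t), with add-one smoothing
    # so an unseen word doesn't zero out the whole product
    score = math.log(prior)
    for w in words:
        score += math.log((counts[w] + 1) / (total + len(vocab)))
    return score

def classify(words):
    p_spam = len(spam_docs) / (len(spam_docs) + len(ham_docs))
    spam_score = log_posterior(words, spam_counts, spam_total, p_spam)
    ham_score = log_posterior(words, ham_counts, ham_total, 1 - p_spam)
    return "spam" if spam_score > ham_score else "not spam"

print(classify(["win", "cheap", "pills"]))   # -> spam
print(classify(["meeting", "tomorrow"]))     # -> not spam

Working in log space is the standard trick here: multiplying dozens of small probabilities underflows quickly, but adding their logs doesn't, and the comparison between the two labels comes out the same.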
