An Intuitive Explanation of Bayes' Theorem By Eliezer Yudkowsky
reference • http://yudkowsky.net/rational/bayes
Questions • 100 out of 10,000 women at age forty who participate in routine screening have breast cancer. 80 of every 100 women with breast cancer will get a positive mammography. 950 out of 9,900 women without breast cancer will also get a positive mammography. If 10,000 women in this age group undergo a routine screening, about what fraction of women with positive mammographies will actually have breast cancer?
Compute • A = women with a positive mammography who actually have breast cancer • B = all women with a positive mammography • result = A/B * 100% • A = 100*(80/100) = 80 • B = women with breast cancer and a positive mammography (A) + women without breast cancer and a positive mammography (C) • B = 100*(80/100) + (10,000-100)*(950/9,900) = 80 + 950 = 1030 • result = A/B * 100% = 80/1030 ≈ 7.8%
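The same count-based arithmetic, as a quick check in Python (a sketch; the variable names are mine, not from the slide):

```python
# Count-based mammography question (numbers from the slide).
with_cancer = 100                       # women with breast cancer per 10,000 screened
without_cancer = 10_000 - with_cancer   # women without breast cancer

a = with_cancer * (80 / 100)            # A: cancer and positive mammography
c = without_cancer * (950 / 9_900)      # C: no cancer, but positive mammography
b = a + c                               # B: all positive mammographies

print(f"{a / b:.1%}")                   # -> 7.8%
```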
Questions – different version • 1% of women at age forty who participate in routine screening have breast cancer. 80% of women with breast cancer will get positive mammographies. 9.6% of women without breast cancer will also get positive mammographies. A woman in this age group had a positive mammography in a routine screening. What is the probability that she actually has breast cancer? • A=1%*80% • B=A+(1-1%)*9.6% • Results=A/B*100%=7.8%
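The probability version of the same calculation, sketched in Python (assuming the slide's numbers):

```python
# Probability version of the same question (numbers from the slide).
p_cancer = 0.01                # 1% prior
p_pos_given_cancer = 0.80      # 80% of women with cancer test positive
p_pos_given_no_cancer = 0.096  # 9.6% of women without cancer test positive

a = p_cancer * p_pos_given_cancer
b = a + (1 - p_cancer) * p_pos_given_no_cancer
print(f"{a / b:.1%}")          # -> 7.8%
```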
Egg problem • Some eggs are painted red and some are painted blue. 40% of the eggs in the bin contain pearls, and 60% contain nothing. 30% of eggs containing pearls are painted blue, and 10% of eggs containing nothing are painted blue. What is the probability that a blue egg contains a pearl? • p(pearl) = 40% • p(blue|pearl) = 30% • p(blue|~pearl) = 10% • p(pearl|blue) = ? • "~" is shorthand for "not", so ~pearl reads "not pearl" • blue|pearl is shorthand for "blue given pearl", i.e. "the probability that an egg is painted blue, given that the egg contains a pearl" • the order of implication is read right-to-left: blue|pearl means "blue <- pearl", the degree to which pearl-ness implies blue-ness • <d|c><c|b><b|a> reads as "the probability that a particle at A goes to B, then to C, ending up at D"
More notation • The item on the right side = what you already know, the premise • The item on the left side = the implication or conclusion • p(blue|pearl) = 30%: if we already know that an egg contains a pearl, then we can conclude there is a 30% chance that the egg is painted blue • p(pearl|blue) is "the chance that a blue egg contains a pearl", or "the probability that an egg contains a pearl, if we know the egg is painted blue" • p(pearl|blue) = p(pearl&blue) / p(blue)
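Plugging the egg numbers into p(pearl|blue) = p(blue|pearl)p(pearl) / p(blue) answers the question posed on the previous slide; this quick check is not part of the original slides:

```python
# Egg problem: p(pearl|blue) = p(blue|pearl) * p(pearl) / p(blue)
p_pearl = 0.40
p_blue_given_pearl = 0.30
p_blue_given_no_pearl = 0.10

p_blue = p_blue_given_pearl * p_pearl + p_blue_given_no_pearl * (1 - p_pearl)
print(f"{p_blue_given_pearl * p_pearl / p_blue:.1%}")  # -> 66.7%
```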
Bayes' Theorem • A=1%*80% • B=A+(1-1%)*9.6% • Results=A/B*100%=7.8% • A=p(cancer)*p(positive|cancer) • B=A+ p(~cancer) *p(positive|~cancer) • A/B= p(cancer|positive)
Bayes' Theorem • What we know: P(A) = 15%, P(E|A) = 10% • What we also know: P(~A) = 1 - 15% = 85%, P(E|~A) = 80% • Why not 1 - 10%? Because P(E|~A) is a separate piece of information; P(E|A) and P(E|~A) need not sum to 1 • What we want to know: the probability of A given the evidence E, i.e. P(A|E) = ? • How? • p(A|E) = p(E|A)p(A) / p(E) = p(E|A)p(A) / [p(E|A)p(A) + p(E|~A)p(~A)]
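With the slide's numbers the formula works out to roughly 2.2%; this quick check assumes P(A) = 15%, P(E|A) = 10%, and P(E|~A) = 80% are the intended values:

```python
# The slide's example: P(A) = 15%, P(E|A) = 10%, P(E|~A) = 80%
p_a, p_e_given_a, p_e_given_not_a = 0.15, 0.10, 0.80

numerator = p_e_given_a * p_a
p_a_given_e = numerator / (numerator + p_e_given_not_a * (1 - p_a))
print(f"{p_a_given_e:.1%}")  # -> 2.2%
```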
Bayes' Theorem • p(Ai|E) = p(E|Ai)p(Ai) / p(E) = p(E|Ai)p(Ai) / Σj p(E|Aj)p(Aj), where {Ai} forms a partition of the event space (diagram: the evidence E overlapping a partition A1…A6) • Based on the definition of conditional probability • p(Ai|E) is the posterior probability given evidence E • p(Ai) is the prior probability • p(E|Ai) is the likelihood of the evidence given Ai • p(E) is the preposterior probability of the evidence
Posterior = likelihood * prior / evidence • p(Ai|E) = p(E|Ai)p(Ai) / p(E)
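The partition form of the theorem can be written once as a small helper; this is a sketch, and the function name `posteriors` and its arguments are my own, not from the slides:

```python
def posteriors(priors, likelihoods):
    """Bayes over a partition: p(Ai|E) = p(E|Ai)p(Ai) / sum_j p(E|Aj)p(Aj)."""
    evidence = sum(p * l for p, l in zip(priors, likelihoods))    # p(E)
    return [p * l / evidence for p, l in zip(priors, likelihoods)]

# Mammography example as a two-event partition {cancer, ~cancer}:
print(posteriors([0.01, 0.99], [0.80, 0.096]))  # -> [0.0776..., 0.9223...]
```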
Example • You go to the doctor's office, where you take a test for a horrible disease • The test is 99% accurate: it is positive 99% of the time if you have the disease, and negative 99% of the time if you don't • The disease itself is rare: it occurs in 1 in 10,000 people • Your test is positive. What is the probability you have the disease?
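Working the example through Bayes' theorem (a check that is not shown on the slide): even with a 99% accurate test, the posterior probability is under 1%, because the disease is so rare.

```python
# Rare disease, 99% accurate test.
p_disease = 1 / 10_000
p_pos_given_disease = 0.99   # sensitivity
p_pos_given_healthy = 0.01   # false-positive rate

posterior = (p_pos_given_disease * p_disease) / (
    p_pos_given_disease * p_disease + p_pos_given_healthy * (1 - p_disease)
)
print(f"{posterior:.2%}")    # -> 0.98%
```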
Why is this useful? • Useful for assessing diagnostic probability from causal probability • P(cause|effect) = P(effect|cause)P(cause) / P(effect) • Let M be meningitis, S be stiff neck: P(m|s) = P(s|m)P(m) / P(s) = 0.8 * 0.0001 / 0.1 = 0.0008 • Note: the posterior probability of meningitis is still very small!
Homework • Write a simulation for Monty Hall problem • Suppose you're on a game show, and you're given the choice of three doors: Behind one door is a car; behind the others, goats. You pick a door, say No. 1, and the host, who knows what's behind the doors, opens another door, say No. 3, which has a goat. He then says to you, "Do you want to pick door No. 2?" Is it to your advantage to switch your choice?
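One possible simulation for the homework, as a minimal sketch (door numbering, trial count, and structure are my own choices, not part of the assignment):

```python
import random

def play(switch: bool) -> bool:
    """One round of Monty Hall; return True if the player wins the car."""
    doors = [0, 1, 2]
    car = random.choice(doors)
    pick = random.choice(doors)
    # The host opens a door that is neither the player's pick nor the car.
    opened = random.choice([d for d in doors if d != pick and d != car])
    if switch:
        pick = next(d for d in doors if d not in (pick, opened))
    return pick == car

trials = 100_000
for switch in (False, True):
    wins = sum(play(switch) for _ in range(trials))
    print(f"switch={switch}: win rate {wins / trials:.3f}")  # ~0.333 vs ~0.667
```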