
Applied Probability Lecture 2


Presentation Transcript


  1. Applied Probability Lecture 2 Rajeev Surati

  2. Agenda • Independence • Bayes Theorem • Introduction to Probability Mass Functions

  3. Independence • Simply put, P(A|B) = P(A) • This implies that P(AB) = P(A|B)P(B) = P(A)P(B) • Interpretation in event space: (Venn diagram of overlapping events A and B, not reproduced here)
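
A minimal sketch of checking the product rule, using a hypothetical two-dice example that is not from the slides: enumerate a small equally likely sample space and compare P(AB) with P(A)P(B).

```python
from fractions import Fraction

# Enumerate the 36 equally likely outcomes of rolling two fair dice.
outcomes = [(d1, d2) for d1 in range(1, 7) for d2 in range(1, 7)]

def prob(event):
    # P(event) = (# favorable outcomes) / 36, since outcomes are equally likely.
    return Fraction(sum(1 for o in outcomes if event(o)), len(outcomes))

A = lambda o: o[0] % 2 == 0       # A: first die is even
B = lambda o: o[0] + o[1] == 7    # B: the two dice sum to 7

p_A, p_B = prob(A), prob(B)
p_AB = prob(lambda o: A(o) and B(o))
print(p_A, p_B, p_AB)             # 1/2 1/6 1/12
print(p_AB == p_A * p_B)          # True: A and B are independent
```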

  4. Bayes Theorem • Sample space interpretation • Generalized form (see the statement below)
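
The formula itself does not survive in this transcript; what follows is the standard statement of Bayes' theorem, together with the generalized form whose denominator expands P(B) over a partition {A_i} of the sample space by total probability.

```latex
P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)},
\qquad
P(A_i \mid B) = \frac{P(B \mid A_i)\,P(A_i)}{\sum_j P(B \mid A_j)\,P(A_j)}
```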

  5. Steroids (quick review) • Manufacturer says the steroid test is 99% accurate(*). If the news reports that an athlete tested positive, can we be so certain that he/she is taking steroids? • (*) 99% accurate if steroids are present, with 15% false positives; finally, assume 10% of all athletes take steroids. (A worked calculation follows below.)
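
Plugging the slide's numbers into Bayes' theorem, a minimal sketch (variable names are mine) shows the posterior is far from certain:

```python
# Bayes' theorem with the numbers given on the slide.
p_steroids = 0.10         # prior: 10% of athletes take steroids
p_pos_given_s = 0.99      # 99% accurate when steroids are present
p_pos_given_clean = 0.15  # 15% false-positive rate

# Total probability of a positive test.
p_pos = p_pos_given_s * p_steroids + p_pos_given_clean * (1 - p_steroids)

# Posterior probability of steroids given a positive test.
print(p_pos_given_s * p_steroids / p_pos)  # ~0.423
```

So even with a "99% accurate" test, a positive result implies only about a 42% chance that the athlete is actually taking steroids.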

  6. Monty Hall • Three doors (A, B, C); behind one is a Krispy Kreme doughnut • Rajeev selects, say, door A. Monty, who knows where the doughnut is, opens, say, door B, which is empty (as he intended), and offers to let Rajeev switch. What should Rajeev do?

  7. Explanations • 1. Probability argument: P(A | he knew) is 1/3, P(B | he knew) is 0; therefore P(C | he knew) = 2/3 (the three must sum to 1), so Rajeev should switch • 2. Bayesian method • 3. Take the experiment to the limit (see the simulation sketch below)
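
A Monte Carlo sketch of taking the experiment to the limit (my own illustration; the function and variable names are assumptions): simulate many games and compare the stay and switch strategies.

```python
import random

def trial(switch: bool) -> bool:
    """Play one Monty Hall game; return True if the contestant wins."""
    doors = ["A", "B", "C"]
    prize = random.choice(doors)
    pick = random.choice(doors)
    # Monty deliberately opens an empty door that is not the contestant's pick.
    opened = random.choice([d for d in doors if d != pick and d != prize])
    if switch:
        pick = next(d for d in doors if d not in (pick, opened))
    return pick == prize

n = 100_000
print("stay:  ", sum(trial(False) for _ in range(n)) / n)  # ~1/3
print("switch:", sum(trial(True) for _ in range(n)) / n)   # ~2/3
```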

  8. Random Variables • Before this we talked about “probabilities” of events and sets of events, where in many cases we hand-selected the set of fine-grained events making up the event whose probability we were seeking. Now we move on to another, more interesting approach: using a function to ascribe a value to every point in a sample space (discrete or continuous) • One example might be the number of heads r in 3 tosses of a coin (sketched below)
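
A small sketch of that example (my own code, not the lecture's): the random variable is literally a function from sample points to numbers.

```python
from itertools import product

# Sample space of 3 coin tosses: 8 equally likely outcomes.
sample_space = list(product("HT", repeat=3))

# The random variable r maps each outcome to its number of heads.
r = {outcome: outcome.count("H") for outcome in sample_space}
for outcome, value in r.items():
    print("".join(outcome), "->", value)  # e.g. HHT -> 2
```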

  9. Probability Mass Function • p_x(x0) = probability that the experimental value of random variable x, obtained on a performance of the experiment, is equal to some particular value x0 • Can be extended to more dimensions, which then allows for conditional pmfs
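
Continuing the hypothetical coin example above, the pmf p_r(r0) is obtained by summing the probabilities of all sample points mapped to each value r0:

```python
from collections import Counter
from fractions import Fraction
from itertools import product

# pmf of r = number of heads in 3 fair tosses, built by counting sample points.
sample_space = list(product("HT", repeat=3))
counts = Counter(outcome.count("H") for outcome in sample_space)
pmf = {r0: Fraction(c, len(sample_space)) for r0, c in sorted(counts.items())}
print(pmf)  # {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}
```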

  10. Expected Values • E(x), given a p.m.f., provides some sense of the center of mass of the pmf • Variance is another measure that provides some sense of the spread of a pmf/pdf around its expected value
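
For a discrete pmf these are E(x) = Σ x0 p_x(x0) and Var(x) = Σ (x0 − E(x))² p_x(x0); a short sketch using the 3-toss pmf above:

```python
from fractions import Fraction

pmf = {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 3: Fraction(1, 8)}

mean = sum(x0 * p for x0, p in pmf.items())               # center of mass
var = sum((x0 - mean) ** 2 * p for x0, p in pmf.items())  # spread about the mean
print(mean, var)  # 3/2 3/4
```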
