Once again about the science-policy interface


Presentation Transcript


  1. Once again about the science-policy interface

  2. Open risk management: overview • [Figure: diagram with the elements Q, R, A]

  3. Human-human interface • There are really interesting new interfaces for transmitting information from person to person: • Facebook: How are you? • Wikipedia: What is thing X? • Opasnet: What should we do? • A universal interface for communication about decisions and decision support is urgently needed. • When that problem is solved, the communication problem between science and policy is solved as well. There is no need for a separate science-policy interface.

  4. Introduction to probability theory Jouni Tuomisto THL

  5. Probability of a red ball • P(x|K) = R/N, where • x = the event that a red ball is picked • R = the number of red balls and N = the total number of balls in the urn • K = your knowledge about the situation

  6. Probability of an event x • Decision 1: draw a ball from an urn with R red balls out of N; a red ball (probability p) wins the prize of 100 €, a white ball (probability 1-p) wins 0 €. • Decision 2: if x happens you win 100 €; if x does not happen you win 0 €. • If you are indifferent between decisions 1 and 2, then your probability of x is p = R/N.
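
  Not from the slides: a minimal Monte Carlo sketch in Python of P(x|K) = R/N, assuming for illustration an urn with R = 40 red balls out of N = 100 (the proportions used in the later urn example). The function name estimate_red_probability is hypothetical.

      import random

      def estimate_red_probability(R=40, N=100, trials=100_000):
          # Estimate P(red) by repeated random draws from an urn with R red
          # balls out of N; the observed frequency converges to R/N.
          urn = ["red"] * R + ["white"] * (N - R)
          hits = sum(1 for _ in range(trials) if random.choice(urn) == "red")
          return hits / trials

      print(estimate_red_probability())  # ≈ 0.4 = R/N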

  7. The meaning of uncertainty • Uncertainty is that which disappears when we become certain. • We become certain of a declarative sentence when (a) truth conditions exist and (b) the conditions for the value ‘true’ hold. • (Bedford and Cooke 2001) • Truth conditions: • It is possible to design a setting where it can be observed whether the truth conditions are met or not.

  8. Different kinds of uncertainty • Aleatory (variability, irreducible) • Epistemic (reducible) • In practice the difference depends only on the quantity’s purpose in a model: the weights of individuals are aleatory if we are interested in each person, but epistemic if we are interested in a random person in the population. • Parameter (in a model): should be observable! • Model uncertainty: several models can be treated as parameters in a meta-model.

  9. Different kinds of uncertainty: not really uncertainty • Ambiguity: not uncertainty but fuzziness of description. • Volitional uncertainty: “The probability that I will clean up the basement next weekend.” • Uncertainties about one’s own actions cannot be measured by probabilities.

  10. What is probability? • 1. Frequentists talk about probabilities only when dealing with experiments that are random and well-defined. The probability of a random event denotes the relative frequency of occurrence of an experiment's outcome, when repeating the experiment. Frequentists consider probability to be the relative frequency "in the long run" of outcomes. • 2. Bayesians, however, assign probabilities to any statement whatsoever, even when no random process is involved. Probability, for a Bayesian, is a way to represent an individual's degree of belief in a statement, or an objective degree of rational belief, given the evidence. • Source: Wikipedia

  11. Significance and confidence • Significance level: the probability of some aspect of the data, given that H is true. • Contrast: your probability of H, given the data. • Confidence: the probability that the interval includes θ (θ is given, the interval is random). • Contrast: the probability that θ is included in the interval (the data, and hence the interval, are given).

  12. Positions of the Bayesian approach • Statistics is the study of uncertainty. • Uncertainty should be measured by probability. • Data uncertainty is so measured, conditional on the parameters. • Parameter uncertainty is similarly measured by probability. • Inference is performed within the probability calculus, mainly by Bayes’ rule.

  13. Probability and conditional probabilities • [Venn diagram: the totality of possible states of the world, with overlapping regions for the events A and B representing P(A) and P(B)]

  14. Probability rules • Rule 1 (convexity): for all A and B, 0 ≤ P(A|B) ≤ 1 and P(A|A) = 1. • Cromwell’s rule: P(A|B) = 1 if and only if A is a logical consequence of B. • Rule 2 (addition): if A and B are exclusive, given C, P(A U B|C) = P(A|C) + P(B|C). • If not exclusive, P(A U B|C) = P(A|C) + P(B|C) – P(A ∩ B|C). • Rule 3 (multiplication): for all A, B, and C, P(AB|C) = P(A|BC) P(B|C). • Rule 4 (conglomerability): if {Bn} is a partition, possibly infinite, of C and P(A|BnC) = k, the same value for all n, then P(A|C) = k.
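
  A toy numerical check of the addition and multiplication rules, not from the slides: the sample space (two fair dice) and the events A, B, C are illustrative choices, and exact fractions are used so the equalities hold exactly.

      from itertools import product
      from fractions import Fraction

      space = list(product(range(1, 7), repeat=2))  # two fair dice

      def P(event, given=None):
          # Conditional probability by counting outcomes in the sample space.
          cond = [w for w in space if given is None or given(w)]
          return Fraction(sum(1 for w in cond if event(w)), len(cond))

      A = lambda w: w[0] == 6      # first die shows 6
      B = lambda w: w[1] == 6      # second die shows 6
      C = lambda w: sum(w) >= 7    # conditioning event

      # Rule 2 (addition, non-exclusive): P(A U B|C) = P(A|C) + P(B|C) - P(A ∩ B|C)
      assert P(lambda w: A(w) or B(w), C) == \
          P(A, C) + P(B, C) - P(lambda w: A(w) and B(w), C)

      # Rule 3 (multiplication): P(AB|C) = P(A|BC) P(B|C)
      assert P(lambda w: A(w) and B(w), C) == \
          P(A, lambda w: B(w) and C(w)) * P(B, C)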

  15. Binomial distribution • You make n trials with success probability p. The number of successful trials k follows the binomial distribution. • Like drawing n balls (with replacement) from an urn and counting the k red ones. • P(k|n,p) = [n! / (k! (n-k)!)] p^k (1-p)^(n-k)

  16. Example • You draw 3 balls at random (with replacement) from an urn with 40 red and 60 white balls. What is the probability distribution for the number of red balls?

  17. Answer • P(k|n,p) = [n! / (k! (n-k)!)] p^k (1-p)^(n-k) • 0 red: 3!/(0! 3!) * 0.4^0 * (1-0.4)^(3-0) = 1*1*0.6^3 = 0.216 • 1 red: 3!/(1! 2!) * 0.4^1 * (1-0.4)^(3-1) = 0.432 • 2 red: 3!/(2! 1!) * 0.4^2 * (1-0.4)^(3-2) = 0.288 • 3 red: 3!/(3! 0!) * 0.4^3 * (1-0.4)^(3-3) = 0.064
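
  The same table can be reproduced in a few lines of Python (a sketch, not part of the original slides; binom_pmf is a hypothetical helper built on math.comb):

      from math import comb

      def binom_pmf(k, n, p):
          # P(k|n,p) = n!/(k!(n-k)!) * p^k * (1-p)^(n-k)
          return comb(n, k) * p**k * (1 - p)**(n - k)

      for k in range(4):
          print(k, "red:", round(binom_pmf(k, 3, 0.4), 3))
      # 0 red: 0.216, 1 red: 0.432, 2 red: 0.288, 3 red: 0.064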

  18. Binomial distribution (n=6) • [Figure: binomial probabilities P(k|p,n=6) as a function of k for several values of p]

  19. Binomial distribution: likelihoods • [Figure: likelihoods P(k|p,n) as a function of p for fixed observed k]

  20. Bayes’ theorem • P(θ|x) = P(x|θ) P(θ) / P(x) • Proof: • P(θ,x) = P(θ|x) P(x) = P(x|θ) P(θ) • P(x) is often difficult to determine, but it is independent of θ and thus a constant over θ. Therefore: • P(θ|x) ∝ P(x|θ) P(θ) (proportionality)

  21. Bayes with words • Likelihood: P(x|θ) • Prior: P(θ) • Posterior: P(θ|x) • Posterior ∝ Prior × Likelihood
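
  A minimal sketch of “posterior ∝ prior × likelihood” on a discrete grid (not from the slides; the grid, the uniform prior, and the data of 2 red balls in 3 draws are illustrative assumptions). Normalizing by the sum of prior × likelihood is exactly what removes the need to know P(x) in advance:

      from math import comb

      def grid_posterior(prior, likelihood):
          # Normalize prior * likelihood over the grid; the normalizing
          # constant Z plays the role of P(x).
          unnorm = [pr * li for pr, li in zip(prior, likelihood)]
          Z = sum(unnorm)
          return [u / Z for u in unnorm]

      thetas = [i / 10 for i in range(11)]            # grid over θ = P(red)
      prior = [1 / len(thetas)] * len(thetas)         # uniform prior P(θ)
      lik = [comb(3, 2) * t**2 * (1 - t) for t in thetas]  # P(x|θ): 2 red in 3 draws
      post = grid_posterior(prior, lik)
      print(max(zip(post, thetas)))                   # posterior mode at θ = 0.7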

  22. Getting rid of nuisance parameters • P(θ,α|x) = P(x|θ,α) P(θ,α) / P(x) • P(θ|x) = ∫ P(θ,α|x) dα
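
  The integral can be mimicked by a sum on a discrete grid. A sketch under illustrative assumptions (the grids, the flat joint prior, and the toy likelihood P(x|θ,α) = θα are all invented for the example):

      thetas = [0.2, 0.4, 0.6, 0.8]    # parameter of interest θ
      alphas = [0.3, 0.5, 0.7]         # nuisance parameter α

      def unnorm_joint(theta, alpha):
          prior = 1.0 / (len(thetas) * len(alphas))  # flat joint prior P(θ,α)
          likelihood = theta * alpha                 # toy model P(x|θ,α)
          return prior * likelihood

      joint = {(t, a): unnorm_joint(t, a) for t in thetas for a in alphas}
      Z = sum(joint.values())          # plays the role of P(x)
      # Marginalize: P(θ|x) = Σ_α P(θ,α|x), the discrete stand-in for ∫ ... dα
      marginal = {t: sum(joint[t, a] for a in alphas) / Z for t in thetas}
      print(marginal)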

  23. Example of Bayes’ rule with balls • You draw 3 balls at random (with replacement) from an urn with 40 red and 60 white balls. What is the probability distribution for the number of red balls? • How do you update your prior if you draw three red balls in a row? • Probability of 3 red given the prior = likelihood = 0.4^3 = 0.064 • Prior: P(red) = 0.4 • Posterior = P(θ|R) = P(R|θ) P(θ) / P(R) = 0.064*0.4/0.064 = 0.4 !!

  24. Why doesn’t the probability change with new data? • Because the prior is not uncertain, although it is a probability. • P(p=0.4) = 1, P(p≠0.4) = 0 • Therefore, it is unaffected by any data, even if you get, say, five red balls in a row (P = 0.4^5 ≈ 0.010). • You talk about your unlikely results with the guy who sold you the urn. He replies: ”Did I say it has fewer red balls? Maybe it was more red balls. I really don’t remember, but the ratio is 40:60 for sure.” • How does this change your model?
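
  The point can be checked numerically. A sketch (not from the slides) with just the two hypotheses the seller’s remark leaves open, p = 0.4 and p = 0.6; note how the degenerate prior never moves:

      def update(prior, likelihood):
          # One Bayes step: prior and likelihood are dicts keyed by hypothesis.
          unnorm = {h: prior[h] * likelihood[h] for h in prior}
          Z = sum(unnorm.values())
          return {h: v / Z for h, v in unnorm.items()}

      lik_red = {"p=0.4": 0.4, "p=0.6": 0.6}   # P(red | hypothesis)

      belief = {"p=0.4": 1.0, "p=0.6": 0.0}    # degenerate prior: P(p=0.4) = 1
      for _ in range(5):                       # five red balls in a row
          belief = update(belief, lik_red)
      print(belief)                            # still {'p=0.4': 1.0, 'p=0.6': 0.0}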

  25. New Bayes model with uncertain prior: a red ball is drawn • Let Y = the urn holds 40 red and 60 white balls (the seller’s remark leaves the alternative: 60 red and 40 white), so P(Y) = 0.5. • P(Y|R) = P(R|Y) P(Y) / P(R) = 0.4*0.5 / (0.4*0.5 + 0.6*0.5) = 0.2/0.5 = 0.4

  26. New Bayes model with uncertain prior: second ball is red • P(Y|R) = P(R|Y) P(Y) / P(R) = 0.4*0.4 / (0.4*0.4 + 0.6*0.6) = 0.16/0.52 = 4/13 ≈ 0.308

  27. New Bayes model with uncertain prior: third ball is red • P(Y|R) = P(R|Y) P(Y) / P(R) = 0.4*(4/13) / ((1.6+5.4)/13) = 1.6/7 = 8/35 ≈ 0.229
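
  The three updates on slides 25–27 can be reproduced exactly by applying Bayes’ rule sequentially (a sketch; exact fractions via the standard-library Fraction type avoid rounding):

      from fractions import Fraction

      # Y = "the urn holds 40 red and 60 white balls"; otherwise 60 red, 40 white.
      p_Y = Fraction(1, 2)                         # prior after the seller's remark
      for draw in range(1, 4):                     # three red balls in a row
          num = Fraction(4, 10) * p_Y              # P(R|Y) P(Y)
          den = num + Fraction(6, 10) * (1 - p_Y)  # P(R)
          p_Y = num / den
          print(draw, p_Y, float(p_Y))
      # 1 -> 2/5 = 0.4, 2 -> 4/13 ≈ 0.308, 3 -> 8/35 ≈ 0.229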

  28. Conclusion from the red ball study • We are not modelling reality directly; we are modelling our understanding of reality. • It might be useful to think of the Bayes rule as a 2×2 table. • The principle is the same even if there are more than 2 rows or columns. • The principle is the same even if there are more than two dimensions in the table.

  29. Bayes’ rule in diagnostics • Imagine there is a clinical test for narcolepsy with 0.99 sensitivity and 0.99 specificity. • A man was worried about his 6-year-old daughter, who had received the swine flu vaccination, so he took her to a private laboratory for the test. • Now he comes to you with his daughter. The test result is positive. • Does the daughter have narcolepsy?

  30. Narcolepsy diagnostics? • What is sensitivity? • N(true positives)/N(diseased) = P(test+ | disease) • What is specificity? • N(true negatives)/N(healthy) = P(test– | healthy)

  31. Narcolepsy diagnostics • Assume the prevalence of narcolepsy is P(N) = 0.001. • P(N|t+) = P(t+|N) P(N) / P(t+) = 0.99*0.001 / 0.01099 ≈ 0.0901

  32. Narcolepsy: the importance of anamnesis (sensitivity = specificity = 0.95) • Using the previous posterior 0.0901 as the new prior: • P(N|t+) = P(t+|N) P(N) / P(t+) = 0.95*0.0901 / 0.1311 ≈ 0.6528

  33. Narcolepsy: the importance of negative anamnesis (sensitivity = specificity = 0.95) • P(N|t-) = P(t-|N) P(N) / P(t-) = 0.05*0.0901 / 0.8689 ≈ 0.0052
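
  The whole diagnostic chain of slides 31–33 in a few lines (a sketch; the prevalence P(N) = 0.001 is taken from slide 31, and the slides round the intermediate posterior to 0.0901 and approximate P(healthy) ≈ 1 in the first denominator, hence small differences in the last decimals):

      def p_disease_given_positive(prior, sens, spec):
          # Bayes' rule for a positive test result.
          return sens * prior / (sens * prior + (1 - spec) * (1 - prior))

      def p_disease_given_negative(prior, sens, spec):
          # Bayes' rule for a negative test result.
          return (1 - sens) * prior / ((1 - sens) * prior + spec * (1 - prior))

      p1 = p_disease_given_positive(0.001, 0.99, 0.99)           # lab test
      print(round(p1, 4))                                        # 0.0902 (slide: 0.0901)
      print(round(p_disease_given_positive(p1, 0.95, 0.95), 4))  # 0.6531 (slide: 0.6528)
      print(round(p_disease_given_negative(p1, 0.95, 0.95), 4))  # 0.0052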
