Presentation Transcript


  1. THE MATHEMATICS OF CAUSE AND EFFECT. Judea Pearl, University of California, Los Angeles

  2. GENETIC MODELS (S. WRIGHT, 1920)

  3. OUTLINE
  Lecture 1. Monday 3:30-5:30
  • Why causal talk? Actions and Counterfactuals
  • Identifying and bounding causal effects; Policy Analysis
  Lecture 2. Tuesday 3:00-5:00
  • Identifying and bounding probabilities of causes; Attribution
  • The Actual Cause; Explanation
  References: http://bayes.cs.ucla.edu/jp_home.html (slides + transcripts); CAUSALITY (forthcoming)

  4. David Hume (1711–1776)

  5. HUME’S LEGACY
  • Analytical vs. empirical claims
  • Causal claims are empirical
  • All empirical claims originate from experience

  6. THE TWO RIDDLES OF CAUSATION
  • What empirical evidence legitimizes a cause-effect connection?
  • What inferences can be drawn from causal information, and how?

  7. The Art of Causal Mentoring
  “Easy, man! That hurts!”

  8. OLD RIDDLES IN NEW DRESS
  • How should a robot acquire causal information from the environment?
  • How should a robot process causal information received from its creator-programmer?

  9. CAUSATION AS A PROGRAMMER'S NIGHTMARE
  Input:
  • “If the grass is wet, then it rained.”
  • “If we break this bottle, the grass will get wet.”
  Output:
  • “If we break this bottle, then it rained.”

  10. CAUSATION AS A PROGRAMMER'S NIGHTMARE (Cont.) (Lin, 1995)
  Input:
  • A suitcase will open iff both locks are open.
  • The right lock is open.
  Query: What if we open the left lock?
  Output: The right lock might get closed.

  11. THE BASIC PRINCIPLES
  Causation = encoding of behavior under interventions
  Interventions = surgeries on mechanisms
  Mechanisms = stable functional relationships = equations + graphs

  12. WHAT'S IN A CAUSAL MODEL?
  Oracle that assigns truth values to causal sentences:
  Action sentences: B if we do A.
  Counterfactuals: ¬B, yet B if it were A.
  Explanation: B occurred because of A.
  Optional: with what probability?

  13. CAUSAL MODELS: WHY THEY ARE NEEDED
  [Diagram: a device mapping INPUT to OUTPUT through internal variables X, Y, Z]

  14. CAUSAL MODELS AT WORK (The impatient firing squad)
  [Diagram: U (Court order) → C (Captain) → A, B (Riflemen) → D (Death)]

  15. CAUSAL MODELS AT WORK (Glossary)
  U: Court orders the execution
  C: Captain gives a signal
  A: Rifleman A shoots
  B: Rifleman B shoots
  D: Prisoner dies
  =: functional equality (new symbol)
  Equations: C = U, A = C, B = C, D = A ∨ B

  16. SENTENCES TO BE EVALUATED
  S1. Prediction: A ⇒ D
  S2. Abduction: ¬D ⇒ ¬C
  S3. Transduction: A ⇒ B
  S4. Action: ¬C ⇒ D_A
  S5. Counterfactual: D ⇒ D_{¬A}
  S6. Explanation: Caused(A, D)

  17. STANDARD MODEL FOR STANDARD QUERIES
  S1 (prediction): If rifleman A shot, the prisoner is dead: A ⇒ D
  S2 (abduction): If the prisoner is alive, then the captain did not signal: ¬D ⇒ ¬C
  S3 (transduction): If rifleman A shot, then B shot as well: A ⇒ B
  [Diagram: U ⇔ C, C ⇔ A, C ⇔ B (iff links); D = A ∨ B (OR)]

  18. WHY CAUSAL MODELS? GUIDE FOR SURGERY
  S4 (action): If the captain gave no signal and rifleman A decides to shoot, the prisoner will die, ¬C ⇒ D_A, and B will not shoot, ¬C ⇒ ¬B_A.
  [Diagram: the firing-squad network, with the contemplated action at node A]

  19. WHY CAUSAL MODELS? GUIDE FOR SURGERY (cont.)
  The surgery deletes the arrow C → A and sets A = TRUE; the mutilated model then yields D = TRUE and B = FALSE, confirming ¬C ⇒ D_A and ¬C ⇒ ¬B_A.
  [Diagram: the mutilated network with A set to TRUE]

  20. MUTILATION IN SYMBOLIC CAUSAL MODELS
  S4 (action): If the captain gave no signal and A decides to shoot, the prisoner will die and B will not shoot: ¬C ⇒ D_A & ¬B_A
  Model M_A (modify the equation for A):
  C = U      (C)
  A = C      (A)
  B = C      (B)
  D = A ∨ B  (D)
  Facts: ¬C
  Conclusions: ?

  21. MUTILATION IN SYMBOLIC CAUSAL MODELS (cont.)
  The equation for A is deleted:
  C = U      (C)
  (equation A = C deleted)
  B = C      (B)
  D = A ∨ B  (D)
  Facts: ¬C
  Conclusions: ?

  22. MUTILATION IN SYMBOLIC CAUSAL MODELS (cont.)
  The deleted equation is replaced by the action's premise A:
  C = U      (C)
  A          (A)
  B = C      (B)
  D = A ∨ B  (D)
  Facts: ¬C
  Conclusions: A, D, ¬B, ¬U, ¬C

  23. 3 STEPS TO COMPUTING COUNTERFACTUALS
  S5: If the prisoner is dead, he would still be dead had A not shot: D ⇒ D_{¬A}
  Step 1 (abduction): from the evidence D = TRUE, infer U = TRUE.
  Step 2 (action): sever A from C and set A = FALSE.
  Step 3 (prediction): with U = TRUE and A = FALSE, B = TRUE and hence D = TRUE.

  24. COMPUTING PROBABILITIES OF COUNTERFACTUALS
  P(S5): The prisoner is dead. How likely is it that he would be dead had A not shot? P(D_{¬A} | D) = ?
  Step 1 (abduction): replace the prior P(u) by the posterior P(u | D).
  Step 2 (action): set A = FALSE.
  Step 3 (prediction): compute P(D) in the mutilated model under P(u | D); this equals P(D_{¬A} | D).

  25. SYMBOLIC EVALUATION OF COUNTERFACTUALS
  Prove: D ⇒ D_{¬A}
  Combined theory (starred variables describe the counterfactual world):
  C* = U,        C = U        (C)
  ¬A*,           A = C        (A)
  B* = C*,       B = C        (B)
  D* = A* ∨ B*,  D = A ∨ B    (D)
  Facts: D
  Conclusions: U, A, B, C, D, ¬A*, C*, B*, D*

  26. PROBABILITY OF COUNTERFACTUALS: THE TWIN NETWORK
  [Diagram: two copies of the network sharing the background variables U, W; in the counterfactual copy A* = FALSE; evidence A = TRUE, D = TRUE]
  P(Alive had A not shot | A shot, Dead)
  = P(¬D) in the model <M_{¬A}, P(u,w | A, D)>
  = P(¬D* | A, D) in the twin network

  27. CAUSAL MODEL (FORMAL)
  M = <U, V, F>, or <U, V, F, P(u)>
  U - background variables
  V - endogenous variables
  F - set of functions {f_i : U ∪ (V \ V_i) → V_i}, v_i = f_i(pa_i, u_i)
  Submodel: M_x = <U, V, F_x>, representing do(x)
  F_x replaces the equation for X with X = x
  Actions and counterfactuals: Y_x(u) = solution of Y in M_x
  P(y | do(x)) ≜ P(Y_x = y)
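
The formal definition above is directly executable. Below is a minimal Python sketch (assumed code, not part of the lecture) that encodes the firing-squad equations of slide 15 as F, implements the submodel M_x by overriding a single equation, and runs the abduction-action-prediction recipe of slides 23-24; the prior P(U) is an invented illustration.

```python
def solve(u, do=None):
    """Solve the structural equations for background value u,
    optionally after the surgery do = {variable: forced value}."""
    do = do or {}
    vals = {"U": u}
    equations = {
        "C": lambda v: v["U"],            # C = U
        "A": lambda v: v["C"],            # A = C
        "B": lambda v: v["C"],            # B = C
        "D": lambda v: v["A"] or v["B"],  # D = A or B
    }
    for var in ("C", "A", "B", "D"):
        vals[var] = do[var] if var in do else equations[var](vals)
    return vals

# P(U): an assumed prior over the background variable (not given in the slides).
p_u = {True: 0.6, False: 0.4}

# Step 1 (abduction): condition P(u) on the evidence D = TRUE.
posterior = {u: p for u, p in p_u.items() if solve(u)["D"]}
norm = sum(posterior.values())
posterior = {u: p / norm for u, p in posterior.items()}

# Steps 2-3 (action + prediction): in M_{not A}, replace A's equation by
# A = False and compute the probability that D still holds.
p_counterfactual = sum(p for u, p in posterior.items()
                       if solve(u, do={"A": False})["D"])
print("P(D_{not A} | D) =", p_counterfactual)
```

With these deterministic equations, the posterior puts all mass on U = TRUE, so the sketch prints 1.0: the prisoner would still be dead had A not shot, because B shoots whenever the captain signals.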

  28. WHY COUNTERFACTUALS?
  Action queries are triggered by (modifiable) observations, demanding an abductive step, i.e., counterfactual processing.
  E.g., troubleshooting:
  Observation: The output is low.
  Action query: Will the output get higher if we replace the transistor?
  Counterfactual query: Would the output be higher had the transistor been replaced?

  29. WHY CAUSALITY? FROM MECHANISMS TO MODALITY
  Causality-free specification: action name → ramifications
  Causal specification: mechanism name → direct effects; do(p) → ramifications
  Prerequisite: one-to-one correspondence between variables and mechanisms

  30. MID-STORY OUTLINE
  • Background: from Hume to robotics
  • Semantics and principles: causal models, surgeries, actions and counterfactuals
  • Applications I: evaluating actions and plans from data and theories
  • Applications II: finding explanations and single-event causation

  31. INTERVENTION AS SURGERY
  Example: policy analysis.
  Model underlying data: Economic conditions → Tax; Economic conditions → Economic consequences; Tax → Economic consequences.
  Model for policy evaluation: the arrow into Tax is severed and Tax is set by decision; all other mechanisms are unaltered.

  32. PREDICTING THE EFFECTS OF POLICIES
  1. Surgeon General (1964): Smoking → Cancer; P(c | do(s)) = P(c | s)
  2. Tobacco industry: Genotype (unobserved) → Smoking, Genotype → Cancer; P(c | do(s)) = P(c)
  3. Combined: P(c | do(s)) = noncomputable

  34. PREDICTING THE EFFECTS OF POLICIES (cont.)
  1. Surgeon General (1964): Smoking → Cancer; P(c | do(s)) = P(c | s)
  2. Tobacco industry: Genotype (unobserved); P(c | do(s)) = P(c)
  3. Combined: P(c | do(s)) = noncomputable
  4. Combined and refined: Smoking → Tar → Cancer; P(c | do(s)) = computable

  35. The Science of Seeing

  36. The Art of Doing

  37. Combining Seeing and Doing

  38. NEEDED: ALGEBRA OF DOING
  Available: algebra of seeing.
  E.g., what is the chance it rained if we see the grass wet?
  P(rain | wet) = ?  { = P(wet | rain) P(rain) / P(wet) }
  Needed: algebra of doing.
  E.g., what is the chance it rained if we make the grass wet?
  P(rain | do(wet)) = ?  { = P(rain) }
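
To make the seeing/doing contrast concrete, here is a small Python sketch with invented numbers (the slide gives none): Bayes' rule answers the seeing query, while the surgery behind do(wet) severs the rain → wet mechanism, leaving P(rain) untouched.

```python
# Invented numbers for the rain -> wet-grass example (not from the lecture).
p_rain = 0.3
p_wet_given_rain, p_wet_given_dry = 0.9, 0.1

# Seeing: Bayes' rule, P(rain | wet) = P(wet | rain) P(rain) / P(wet).
p_wet = p_wet_given_rain * p_rain + p_wet_given_dry * (1 - p_rain)
p_rain_seeing = p_wet_given_rain * p_rain / p_wet

# Doing: do(wet) cuts the rain -> wet arrow, so rain keeps its prior.
p_rain_doing = p_rain

print(f"P(rain | wet)     = {p_rain_seeing:.3f}")   # ~0.794
print(f"P(rain | do(wet)) = {p_rain_doing:.3f}")    # 0.300
```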

  39. RULES OF CAUSAL CALCULUS
  Rule 1 (ignoring observations): P(y | do{x}, z, w) = P(y | do{x}, w), if (Y ⊥ Z | X, W) holds in the graph with all arrows into X deleted.
  Rule 2 (action/observation exchange): P(y | do{x}, do{z}, w) = P(y | do{x}, z, w), if (Y ⊥ Z | X, W) holds in the graph with arrows into X and arrows out of Z deleted.
  Rule 3 (ignoring actions): P(y | do{x}, do{z}, w) = P(y | do{x}, w), if (Y ⊥ Z | X, W) holds in the graph with arrows into X deleted and arrows into Z(W) deleted, where Z(W) is the set of Z-nodes that are not ancestors of any W-node once the arrows into X are removed.

  40. DERIVATION IN CAUSAL CALCULUS
  Genotype (unobserved) confounds Smoking and Cancer; Smoking → Tar → Cancer.
  P(c | do{s})
  = Σ_t P(c | do{s}, t) P(t | do{s})                      [probability axioms]
  = Σ_t P(c | do{s}, do{t}) P(t | do{s})                  [Rule 2]
  = Σ_t P(c | do{s}, do{t}) P(t | s)                      [Rule 2]
  = Σ_t P(c | do{t}) P(t | s)                             [Rule 3]
  = Σ_{s'} Σ_t P(c | do{t}, s') P(s' | do{t}) P(t | s)    [probability axioms]
  = Σ_{s'} Σ_t P(c | t, s') P(s' | do{t}) P(t | s)        [Rule 2]
  = Σ_{s'} Σ_t P(c | t, s') P(s') P(t | s)                [Rule 3]
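
The end result is the front-door formula, and it can be checked numerically. The sketch below, with assumed conditional probability tables (the lecture gives no numbers), enumerates a binary version of the genotype model and confirms that the observational estimand in the last line equals P(c | do(s)) computed by surgery.

```python
from itertools import product

# Assumed binary model (U hidden): U -> S, U -> C, S -> T -> C.
P_u1 = 0.5                                  # P(U = 1)
P_s_given_u = {0: 0.2, 1: 0.8}              # P(S = 1 | u)
P_t_given_s = {0: 0.1, 1: 0.9}              # P(T = 1 | s)
P_c_given_tu = {(0, 0): 0.1, (0, 1): 0.3,   # P(C = 1 | t, u)
                (1, 0): 0.6, (1, 1): 0.8}

def prob(s=None, t=None, c=None, do_s=None):
    """P(S=s, T=t, C=c), optionally under do(S=do_s); None marginalizes."""
    total = 0.0
    for u, sv, tv, cv in product((0, 1), repeat=4):
        if do_s is not None and sv != do_s:
            continue
        p = P_u1 if u else 1 - P_u1
        if do_s is None:                    # S keeps its own mechanism
            p *= P_s_given_u[u] if sv else 1 - P_s_given_u[u]
        p *= P_t_given_s[sv] if tv else 1 - P_t_given_s[sv]
        p *= P_c_given_tu[(tv, u)] if cv else 1 - P_c_given_tu[(tv, u)]
        if (s is None or sv == s) and (t is None or tv == t) and (c is None or cv == c):
            total += p
    return total

# Ground truth by surgery on the model:
truth = prob(c=1, do_s=1)

# Front-door estimand, observational quantities only:
#   sum_t P(t|s) * sum_s' P(c|t,s') P(s')
front_door = sum(
    (prob(s=1, t=t) / prob(s=1)) *
    sum((prob(s=s2, t=t, c=1) / prob(s=s2, t=t)) * prob(s=s2) for s2 in (0, 1))
    for t in (0, 1))

print(truth, front_door)   # both 0.65 with these numbers
```

Because the assumed model satisfies the front-door conditions (no direct Smoking → Cancer arrow, and Genotype reaches Tar only through Smoking), the two printed numbers agree exactly.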

  41. LEARNING TO ACT BY WATCHING OTHER ACTORS
  E.g., process control:
  [Diagram: hidden dials U1, U2; control knobs X1, X2; visible dials Z; output Y]
  Problem: Find the effect of (do(x1), do(x2)) on Y, from data on X1, Z, X2 and Y.

  42. LEARNING TO ACT BY WATCHING OTHER ACTORS (cont.)
  E.g., drug management (Pearl & Robins, 1995):
  [Diagram: patient's history U1; patient's immune status U2; dosages of Bactrim X1, X2; episodes of PCP Z; recovery/death Y]
  Solution: P(y | do(x1), do(x2)) = Σ_z P(y | z, x1, x2) P(z | x1)
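
This identification formula can also be checked by enumeration. In the sketch below (assumed binary CPTs; the lecture gives no numbers), a hidden U plays the role of the patient's immune status confounding Z and Y, the first dose X1 influences Z, and the observed dosage policy X2 reacts to Z.

```python
from itertools import product

# Assumed binary sequential-plan model: U (hidden) -> Z, U -> Y;
# X1 -> Z; Z -> X2 (observed policy reacts to Z); X2 -> Y.
P_u1, P_x1 = 0.5, 0.5
P_z = {(0, 0): 0.2, (0, 1): 0.7, (1, 0): 0.4, (1, 1): 0.9}  # P(Z=1 | x1, u)
P_x2 = {0: 0.3, 1: 0.8}                                      # P(X2=1 | z)
P_y = {(0, 0): 0.2, (0, 1): 0.5, (1, 0): 0.6, (1, 1): 0.9}   # P(Y=1 | x2, u)

def prob(event, dx1=None, dx2=None):
    """P(event) over (x1, z, x2, y), optionally under do(X1=dx1), do(X2=dx2)."""
    total = 0.0
    for u, x1, z, x2, y in product((0, 1), repeat=5):
        if dx1 is not None and x1 != dx1:
            continue
        if dx2 is not None and x2 != dx2:
            continue
        p = P_u1 if u else 1 - P_u1
        if dx1 is None:
            p *= P_x1 if x1 else 1 - P_x1
        p *= P_z[(x1, u)] if z else 1 - P_z[(x1, u)]
        if dx2 is None:
            p *= P_x2[z] if x2 else 1 - P_x2[z]
        p *= P_y[(x2, u)] if y else 1 - P_y[(x2, u)]
        if event(x1, z, x2, y):
            total += p
    return total

# Left side: P(Y=1 | do(X1=1), do(X2=1)) by surgery.
lhs = prob(lambda x1, z, x2, y: y == 1, dx1=1, dx2=1)

# Right side: sum_z P(y | z, x1, x2) P(z | x1), purely observational.
rhs = 0.0
for zv in (0, 1):
    p_y_given_zxx = (prob(lambda x1, z, x2, y: (x1, z, x2, y) == (1, zv, 1, 1)) /
                     prob(lambda x1, z, x2, y: (x1, z, x2) == (1, zv, 1)))
    p_z_given_x1 = (prob(lambda x1, z, x2, y: (x1, z) == (1, zv)) /
                    prob(lambda x1, z, x2, y: x1 == 1))
    rhs += p_y_given_zxx * p_z_given_x1

print(lhs, rhs)   # both 0.7 with these numbers
```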

  43. LEGAL ATTRIBUTION: WHEN IS A DISEASE DUE TO EXPOSURE?
  [Diagram: exposure to radiation X and enabling factors Q combine (AND) with other causes U (OR) to produce Y (leukemia); W: confounding factors, requiring correction]
  BUT-FOR criterion: PN = P(Y_{x'} ≠ y | X = x, Y = y) > 0.5
  Q. When is PN identifiable from P(x, y)?
  A. No confounding + monotonicity:
  PN = [P(y | x) − P(y | x')] / P(y | x)
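
Under no confounding and monotonicity, the PN formula reduces attribution to two observable rates. A toy computation, with made-up rates rather than data from the lecture:

```python
# Made-up illustration of the identifiability result on this slide.
p_y_given_x = 0.003    # P(leukemia | exposed)
p_y_given_nx = 0.001   # P(leukemia | not exposed)

# PN = [P(y|x) - P(y|x')] / P(y|x), valid under no confounding + monotonicity.
pn = (p_y_given_x - p_y_given_nx) / p_y_given_x
print(f"PN = {pn:.2f} -> but-for criterion {'met' if pn > 0.5 else 'not met'}")
```

With these numbers PN ≈ 0.67, so the but-for criterion (PN > 0.5) would be met.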

  44. THE MATHEMATICS OF CAUSE AND EFFECT. Judea Pearl, University of California, Los Angeles

  45. OUTLINE
  Lecture 1. Monday 3:30-5:30
  • Why causal talk? Actions and Counterfactuals
  • Identifying and bounding causal effects; Policy Analysis
  Lecture 2. Tuesday 3:00-5:00
  • Identifying and bounding probabilities of causes; Attribution
  • The Actual Cause; Explanation
  References: http://bayes.cs.ucla.edu/jp_home.html (slides + transcripts); CAUSALITY (forthcoming)

  46. APPLICATIONS-II • Finding explanations for reported events • Generating verbal explanations • Understanding causal talk • Formulating theories of causal thinking

  47. Causal Explanation “She handed me the fruit and I ate”

  48. Causal Explanation “She handed me the fruit and I ate” “The serpent deceived me, and I ate”

  49. ACTUAL CAUSATION AND THE COUNTERFACTUAL TEST
  “We may define a cause to be an object followed by another, ..., where, if the first object had not been, the second never had existed.” (Hume, Enquiry, 1748)
  Lewis (1973): “x CAUSED y” if x and y are true, and y is false in the closest non-x-world.
  Structural interpretation:
  (i) X(u) = x
  (ii) Y(u) = y
  (iii) Y_{x'}(u) ≠ y for some x' ≠ x
