
Causal Cognition 2: reasoning


Presentation Transcript


  1. Causal Cognition 2: reasoning David Lagnado University College London

  2. Causal models in reasoning • How are causal models used? • Probability judgments • Inductive and counterfactual reasoning • Categorization • Evidential and legal reasoning • Decision making • Attributions of responsibility

  3. Probability judgment • People are better at causal than probabilistic reasoning • Use prior causal models to generate probability judgments (via mental simulation) • A neat fit between the causal model and the statistics provided facilitates probabilistic reasoning

  4. Medical diagnosis problem • The probability of breast cancer is 1% for a woman at age forty who participates in routine screening. If a woman has breast cancer, the probability is 80% that she will get a positive mammography. If a woman does not have breast cancer, the probability is 9.6% that she will also get a positive mammography. A woman in this age group had a positive mammography in a routine screening. • What is the probability that she actually has breast cancer? ___ % (correct answer: 7.8%)

  5. Empirical Results • Eddy (1982) • 95% of doctors gave answers around 75%! • Only 5% gave correct answer • Casscells et al. (1978) • Only 18% gave correct answer • Most responses close to 80% • = P(+ve test|cancer) • Replicated numerous times

  6. Correct Bayesian solution • Use Bayes' rule • Notation: D = disease; ¬D = no disease; T+ = positive test result • P(D) = base rate of disease; P(T+|D) = true positive (hit) rate; P(T+|¬D) = false positive rate • Intuition: there are two ways to get a +ve test result, whether D is true or false
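
Written out in the slide's notation (the formula itself is not reproduced in the transcript), Bayes' rule for this problem is:

$$P(D \mid T^+) = \frac{P(T^+ \mid D)\,P(D)}{P(T^+ \mid D)\,P(D) + P(T^+ \mid \neg D)\,P(\neg D)}$$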

  7. Correct Bayesian solution • P(D) = .01; P(¬D) = .99 • P(T+|D) = .8 • P(T+|¬D) = .096
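
Plugging these values into Bayes' rule gives the 7.8% answer quoted with the problem:

$$P(D \mid T^+) = \frac{.8 \times .01}{.8 \times .01 + .096 \times .99} = \frac{.008}{.10304} \approx .078$$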

  8. Standard account: Attribute substitution • Computation of P(cancer|+ve test) is hard • Substitute it with a readily accessible value, P(+ve test|cancer) • Hence the majority respond 80% • More generally, a tendency to confuse P(A|B) with P(B|A) (or to assume that they are equivalent) • Cf. the Prosecutor's fallacy • Confuse P(DNA match | not guilty) with P(not guilty | DNA match) • Ignore the prior probability of guilt

  9. Causal account • Causal framework for judgments (Krynski & Tenenbaum, 2007) • People fail in the standard medical diagnosis problem because they construct a causal model that doesn't readily accommodate the false-positive statistic P(+test|¬cancer) • Step 1. Model construction: Cancer → +ve Test Result

  10. Causal account • Step 2. Parameter assignment • Fits the model (Cancer → +ve Test Result): P(cancer) = 1%; P(+test|cancer) = 80% • Does not fit into the model: P(+test|¬cancer) = 10%

  11. Causal account • Step 3. Bayesian inference on the model (Cancer → +ve Test Result), with P(cancer) = 1% and P(+test|cancer) = 80% • Typical answers neglect the base rate: P(cancer|+test) = 80%, or 1 − P(+test|¬cancer) = 90%

  12. Causal account: Benign cyst scenario • 1% of women had breast cancer • Of those with breast cancer, 80% received a +ve test result • 30% of women had a benign cyst • Of those with a benign cyst, 50% received a +ve test result • All others received a -ve result • Step 1. Model construction: Cancer → +ve Test Result ← Cyst

  13. Causal account: Benign cyst scenario • Step 2. Parameter assignment • P(cancer) = 1%; P(+test|cancer) = 80% • P(cyst) = 30%; P(+test|cyst) = 50% • Both causes fit the model (Cancer → +ve Test Result ← Cyst)

  14. Causal account • Step 3. Bayesian inference • P(cancer|+test) = P(cancer) × P(+test|cancer) / [P(cancer) × P(+test|cancer) + P(cyst) × P(+test|cyst)] • = (1% × 80%) / (1% × 80% + 30% × 50%) ≈ 5%
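
As a supplementary sketch (not part of the slides), the two Bayesian calculations can be run side by side; the parameter values are those given above:

```python
# Exact Bayesian posteriors for the two framings of the mammography problem.
# Parameter values are those given on the slides.

def posterior_false_positive(p_cancer=0.01, p_pos_given_cancer=0.8,
                             p_pos_given_no_cancer=0.096):
    """Standard false-positive framing: P(cancer | +test) via Bayes' rule."""
    num = p_cancer * p_pos_given_cancer
    return num / (num + (1 - p_cancer) * p_pos_given_no_cancer)

def posterior_benign_cyst(p_cancer=0.01, p_pos_given_cancer=0.8,
                          p_cyst=0.30, p_pos_given_cyst=0.5):
    """Benign-cyst framing, following the slide's approximation:
    cancer and cyst are treated as the only causes of a +ve result."""
    num = p_cancer * p_pos_given_cancer
    return num / (num + p_cyst * p_pos_given_cyst)

print(round(posterior_false_positive(), 3))  # 0.078 -> roughly 7.8%
print(round(posterior_benign_cyst(), 3))     # 0.051 -> roughly 5%
```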

  15. Results • Benign cyst vs. false positive scenarios • Correct responses: 43% vs. 16% • Base-rate neglect: 4% vs. 28% • Before making an inference, people construct causal models and need to fit the parameter values to these models • The benign cyst scenario facilitates this, whereas the false positive scenario inhibits it

  16. Extension to other areas of probability judgment? • Asymmetry in inference • Easier to predict effects from causes than vice-versa (in latter case need to consider alternative causes, and use Bayes rule) • General tendency to see evidence for causal mechanisms in random data • Hot-hand fallacy • Gambler’s fallacy (chance as a self-correcting process) • Cascaded inference

  17. Counterfactual reasoning • Close link between causal and counterfactual thinking • Psychological accounts of counterfactual reasoning • Mental logic • Mental models (Johnson-Laird, 2001) • Both suppose that 'X causes Y' is closely tied to 'If X, then Y' as material implication • Mental simulation (Kahneman et al.) • CBNs offer a formal approach to answering counterfactuals (Pearl, 2000)

  18. Firing squad (deterministic case) • Causal model: U (court order) → C (Captain signals) → A fires, B fires → D (Dead) • Suppose that D • Would D still have occurred, if A hadn't fired? • Initially all variables are unknown (shown in blue on the slide)

  19. Firing squad (deterministic case) • Abduction • Update beliefs on the evidence D • D, therefore A or B; therefore C; therefore U, A and B • All variables are now inferred to be true (shown in green on the slide)

  20. Firing squad (deterministic case) • Action: do(not-A) • Set A to false; remove the other links into A (graph surgery) • Re-set all variables to unknown except U

  21. Firing squad (deterministic case) • Inference in the surgically altered model • U is true; therefore C; therefore B; therefore D

  22. Firing squad (deterministic case) • Inference: D is still true • Would D still have occurred, if A hadn't fired? • Experimental study (Sloman & Lagnado, 2005): 80% of subjects say 'yes'
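
A minimal sketch of the three-step counterfactual procedure illustrated on slides 18-22, assuming a deterministic encoding of the firing-squad model (the function and variable names are mine, not from the slides):

```python
# Three-step counterfactual computation for the deterministic firing-squad model:
# U (court order) -> C (captain signals) -> A fires, B fires -> D (dead).

def run(U, do_A=None):
    """Forward pass through the causal model; do_A overrides A's usual cause."""
    C = U                              # the captain signals iff the court orders
    A = C if do_A is None else do_A    # graph surgery: do(A) cuts the link C -> A
    B = C                              # B follows the captain's signal
    D = A or B                         # the prisoner dies if either rifleman fires
    return {"U": U, "C": C, "A": A, "B": B, "D": D}

# Step 1: Abduction -- infer U from the evidence D = True.
# In the deterministic model, D entails A or B, hence C, hence U.
U_inferred = True

# Step 2: Action -- set A to false and remove the links into A: do(not-A).
# Step 3: Prediction -- run the altered model forward from the inferred U.
counterfactual = run(U_inferred, do_A=False)
print(counterfactual["D"])   # True: D still occurs, because B still fires
```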

  23. Causal reasoning • People's counterfactual inferences obey 'undoing' • Especially with causal scenarios • Extended to probabilistic causation • Not explained by other theories of reasoning (mental logic, mental model theory, probabilistic models) • Requires a logic of causality (do-calculus)

  24. Decision making • Importance of causal models in decision making (Sloman & Hagmayer, 2006) • Choose action that maximizes expected utility • Probability of outcome given that you do action A • Construct a causal model of decision situation • Use interventional probabilities

  25. • Recent research has shown that of 100 men who help with the chores, 82 are in good health whereas only 32 of 100 men who do not help with the chores are. Imagine a friend of yours is married and is concerned about his health. He reads about the research and asks for your advice on whether he should start to do chores or not to improve his health. What is your recommendation? • Different possible models can explain the correlation between chores and health • Common cause: men concerned with equality issues are also concerned with health issues (PC man → Chores, PC man → Health) • Direct cause: doing chores is additional exercise each day (Chores → Health) • Subjects were told one model or the other; recommendations to do the chores differed across models: 69% vs. 23%
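
A minimal sketch of why the recommendation depends on the causal model: the 82/100 and 32/100 figures come from the slide, but the base rate of chore-doing is an illustrative assumption of mine:

```python
# Same observed data, different interventional predictions under the two models.

p_health_given_chores = 0.82      # from the survey described on the slide
p_health_given_no_chores = 0.32
p_chores = 0.5                    # assumed base rate of chore-doing (illustrative)

# Direct-cause model (Chores -> Health): intervening on chores inherits the
# observational conditional probability.
p_health_do_chores_direct = p_health_given_chores

# Common-cause model (PC man -> Chores, PC man -> Health): forcing the chores
# leaves the hidden cause untouched, so health stays at its marginal rate.
p_health_do_chores_common = (p_chores * p_health_given_chores
                             + (1 - p_chores) * p_health_given_no_chores)

print(p_health_do_chores_direct)             # 0.82 -> chores look worthwhile
print(round(p_health_do_chores_common, 2))   # 0.57 -> chores not expected to help
```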

  26. Evidential reasoning How do people reason with uncertain evidence? How do they assess and combine different items of evidence? What representations do they use? What inference processes? How do these compare with normative theories?

  27. Reasoning with legal evidence • Legal domain: e.g. juror, judge, investigator, media • Complex bodies of interrelated evidence: forensic evidence; eyewitness testimony; alibis; confessions etc. • Need to integrate a wide variety of evidence to reach conclusions (e.g. guilt of a suspect)

  28. Descriptive models of juror reasoning • Belief adjustment model (Hogarth & Einhorn, 1992) • Story model (Pennington & Hastie, 1986, 1992) • Coherence-based models (Simon, 2007; Simon & Holyoak, 2002; Thagard, 2000)

  29. Belief adjustment model • Online judgments formed by adjusting from a prior anchor • Over-weights later items • Can lead to order effects • Ignores causal relations between items of evidence

  30. Belief adjustment model • Initial opinion (anchor S): based on background knowledge and assumptions, and the judge's instructions on the presumption of innocence • New evidence items: trial events (witnesses, exhibits, arguments) • Belief adjustment process: each item updates S to S* • Decision: compare S* against a decision criterion C and convict if S* > C; C is set by the judge's instructions on the standard of proof ('beyond reasonable doubt'), the severity of the crime, a utility evaluation of the decisions, etc. • Outcome: guilty vs. innocent

  31. Belief adjustment algorithm • Jack accused of murdering Ian • Background: Jack found out that Ian was having an affair with his girlfriend • Start with an initial anchor S0 (based on the background story) • Evidence is encoded as +ve or -ve and weighted according to the credibility of the source • Prosecution witness (ex-girlfriend says Jack is violent): + w1·e1 is added to the anchor, S1 = S0 + w1·e1 • Continue with the next item • Defence witness (sister says Jack is pacifist): S2 = S1 − w2·e2
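
A minimal sketch of the update rule above; the anchor and weights are illustrative numbers, not values from the slides:

```python
# Anchor-and-adjust update from the slide: S_k = S_(k-1) + e_k * w_k, where
# e_k is +1 for incriminating and -1 for exculpating evidence, and w_k
# reflects the credibility of the source.

def belief_adjust(anchor, items):
    """items: list of (e, w) pairs; returns the belief sequence S0, S1, ..."""
    beliefs = [anchor]
    for e, w in items:
        beliefs.append(beliefs[-1] + e * w)
    return beliefs

s0 = 0.5                  # initial anchor from the background story (illustrative)
prosecution = (+1, 0.25)  # ex-girlfriend says Jack is violent
defence = (-1, 0.125)     # sister says Jack is pacifist
print(belief_adjust(s0, [prosecution, defence]))   # [0.5, 0.75, 0.625]
```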

  32. Evidence for BAM (Hogarth & Einhorn, 1992) • Order effects when evidence is processed item-by-item • Recency: over-weight the final item • Jack accused of murdering Ian; background: Jack found out that Ian was having an affair with his girlfriend • Order 1: prosecution witness (ex-girlfriend says Jack is violent), then defence witness (sister says Jack is pacifist) • Order 2: the same two items in reverse order • Jack rated more guilty with order 2

  33. Problems • Does not capture the full extent of human reasoning • Does not address interrelations between evidence items; treats each item as independent • No re-evaluation of earlier items of evidence in the light of new evidence

  34. Story model • Evidence evaluated through story construction • Stories involve a network of causal relations between events • Causal narratives, not arguments: people represent events in the world, not the inference process • Stories are constructed prior to judgment or decision • Stories determine verdicts, and are not post hoc justifications

  35. Three stages • Evidence evaluation through story construction • Representation of possible verdicts • Decision by classifying the best story into one verdict category • Likely to be considerable interplay between these 3 stages

  36. Constructing a story • Jurors impose a narrative structure on trial information • Engage in an active constructive process • 'Sense-making' by organizing information into a compelling story • Heavy use of causality, both physical and mental

  37. Example scenario (Pennington & Hastie, 1988) • 3-hour video-taped re-enactment of a criminal trial • The defendant, Johnson, was charged with stabbing another man, Caldwell, to death in a bar-room fight • Mock jurors provided with a large amount of evidence (over 80 items), e.g.: 1 The first witness is a police officer: Sergeant Robert Harris 2 I was on my usual foot patrol at 9:00 p.m. 3 I heard loud voices from the direction of Gleason's Bar 4 Johnson and Caldwell were outside the bar 5 Johnson laughed at Caldwell 6 Caldwell punched Johnson in the face 7 Johnson raised a knife over his head 8 I yelled, "Johnson, don't do it" 9 Johnson stabbed Caldwell in the chest … • Must decide between verdicts of guilty (of murder) or not guilty (self-defence)

  38. Jurors’ story models elicited via think-aloud protocols

  39. Example story model • Initiating events: J & C argued in bar; C threatened J; J has no weapon; J leaves • Psychological states: J very angry with C • Goals: J intends to confront C; J intends to kill C • Actions: J goes home and gets knife; J returns to bar; C hits J; J stabs C • Consequences: C wounded & dies • NB This story model promotes a first-degree murder verdict; others promote not guilty (e.g. self-defence)

  40. Evaluating a story • Not probabilistic inferences • Acceptance (with a confidence level): certainly true; uncertain; certainly false • Certainty principles: coverage, uniqueness, coherence

  41. Coherence • Consistency: no internal contradictions • Plausibility: fit with the juror's world knowledge etc. • Completeness: no missing parts

  42. Evidence for story model • Verbal protocols: 85% of events causally linked; verdicts covaried with story models • Recognition memory tests: more likely to falsely remember items consistent with the story model • E.g. if a murder-verdict story was constructed, falsely remember 'Johnson was looking for Caldwell'

  43. Story vs. witness order • Vary the order of presentation of evidence to influence the ease of story construction • More likely to convict when prosecution evidence is in story order • More likely to acquit when defence evidence is in story order • (Chart on slide: % of mock jurors choosing a guilty verdict in each condition)

  44. Shortcomings • Not precisely specified: no formal or computational models of causal model construction or inference • But captures the crucial insight that people use causal knowledge to represent and reason about legal evidence

  45. Coherence-based models • Process-level account: the mind strives for coherent representations • Elements cohere or compete • Judgments emerge through an interactive process that maximizes coherence (constraint satisfaction) • Bidirectional reasoning (evidence can be re-evaluated to fit emerging conclusions)

  46. Formal model of evidential reasoning • Bayesian networks to represent relations between bodies of evidence and hypotheses • Captures dependencies between items • Permits inference from evidence to hypotheses (and vice-versa) • Increasingly used in legal contexts

  47. Partial Bayesian net for Sacco and Vanzetti trial

  48. Applicable to human reasoning? • Vast number of variables • Numerous probability estimates required • Complex computations

  49. Applicable to human reasoning? • Fully-fledged BNs are unsuitable as a model of limited-capacity human reasoning • BUT a key aspect is the qualitative relations between variables (what depends on what) • Judgments of relevance & causal dependency are critical in legal analyses, and people seem quite good at these • A DNA match raises the probability of guilt (+); an impartial alibi lowers it (−) • (Diagram on slide: Guilt linked to DNA (+) and to Alibi (−))
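
A minimal sketch of such a small-scale net; only the signs of the influences come from the slide, while the prior and the conditional probabilities are illustrative assumptions:

```python
# Tiny evidence network: guilt raises the chance of a DNA match (+) and lowers
# the chance of an impartial alibi (-). All numbers are illustrative only.

P_GUILT = 0.5
P_DNA_GIVEN = {True: 0.95, False: 0.01}    # P(DNA match | guilt status)
P_ALIBI_GIVEN = {True: 0.10, False: 0.60}  # P(impartial alibi | guilt status)

def posterior_guilt(dna, alibi):
    """P(guilt | evidence) by enumerating the two hypotheses."""
    def joint(guilty):
        p = P_GUILT if guilty else 1 - P_GUILT
        p *= P_DNA_GIVEN[guilty] if dna else 1 - P_DNA_GIVEN[guilty]
        p *= P_ALIBI_GIVEN[guilty] if alibi else 1 - P_ALIBI_GIVEN[guilty]
        return p
    return joint(True) / (joint(True) + joint(False))

print(posterior_guilt(dna=True, alibi=False))   # DNA match, no alibi: guilt rises
print(posterior_guilt(dna=False, alibi=True))   # impartial alibi, no match: guilt falls
```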

  50. More realistic model • People reason using small-scale qualitative models • Limited number of variables (at one time) • Require comparative rather than precise probabilities • Guided by causal knowledge • Captures relevance relations • Enables inferences about hypotheses on the basis of evidence
