
Cognitive Computer Vision



  1. Cognitive Computer Vision Kingsley Sage khs20@sussex.ac.uk and Hilary Buxton hilaryb@sussex.ac.uk Prepared under ECVision Specific Action 8-3 http://www.ecvision.org

  2. Lecture 6 • Inference in Bayesian networks • Predictive inference • Diagnostic inference • Combined inference • Intercausal inference • General approaches for inference • Bayesian inference tools

  3. So why is Bayesian inference relevant to Cognitive CV? • Provides a well-founded methodology for reasoning with uncertainty • These methods are the basis for our model of perception guided by expectation • We can develop well-founded methods of learning rather than just being stuck with hand-coded models

  4. Inference • Inference: calculating a probability over a set of nodes given the values of other nodes • Four modes of inference: • PREDICTIVE (from root to leaf) • DIAGNOSTIC (from leaf to root) • COMBINED (predictive and diagnostic) • INTERCAUSAL

  5. Inference • Inference is also called conditioning or belief updating • We will have some values (evidence nodes) and want to establish others (query nodes) • Don't confuse priors with evidence • Priors are statistical statements of how likely something is to "happen" (the frequentist view) • Evidence means that you know it has happened

  6. A vision example [Network diagram: root nodes A and B feed the object detector O; O has children N and C] • All discrete nodes • A and B are feature detectors for some area in an image (perhaps A is colour based and B is shape based) • O is an object detector that bases its decision solely on A and B • N determines how likely another object is to be found nearby when the object detector finds its object • C represents an action context that is relevant when the object detector finds its object
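
To make the later slides concrete, here is a minimal sketch of this network in Python. The structure (A, B → O → N, C) comes from the slide; every numeric value below is a hypothetical placeholder, since the actual probability tables are not reproduced in this transcript.

```python
# Example network structure: A, B -> O -> {N, C}.
# All numbers are hypothetical placeholders, NOT the lecture's values.

p_a = {True: 0.4, False: 0.6}     # prior for the colour detector A
p_b = {True: 0.3, False: 0.7}     # prior for the shape detector B

p_o_given_ab = {                  # CPT p(o=T | A, B) for the object detector
    (True, True): 0.95,
    (True, False): 0.40,
    (False, True): 0.50,
    (False, False): 0.05,
}

p_n_given_o = {True: 0.80, False: 0.10}   # p(n=T | O): saucer found nearby
p_c_given_o = {True: 0.70, False: 0.05}   # p(c=T | O): cup will be picked up
```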

  7. A vision example • A detects red areas, B detects the cup shape, O detects the cup of tea, the potential nearby object is a saucer, and the action context is someone picking up the tea to drink it!

  8. A vision example [The slide shows the prior tables for A and B and the conditional probability table for O; the numeric values are not reproduced in this transcript] • The priors for A and B are established during a training process • A conditional probability table specifies the performance of the object detector, where T = detected and F = not detected • The context is "will be picked up" if c=T; the saucer object is nearby if n=T

  9. Predictive inference • Let’s see this applied to our example • We use marginalisation to evaluate our queries based on the evidence we have observed (if we have any)
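
Written out for this network (a standard expansion; A and B are independent root nodes, so their joint prior factorises), a query on O with no evidence is:

```latex
p(o) \;=\; \sum_{a \in \{T,F\}} \; \sum_{b \in \{T,F\}} p(o \mid a, b)\, p(a)\, p(b)
```

When a root node is observed, the sum over it disappears and its observed value is used directly.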

  10. Predictive inference • In the absence of any observed evidence, the query p(o=T) is evaluated from the priors alone, marginalising over both A and B

  11. Predictive inference • Let's say we now have evidence that a=T: we then marginalise over B only • And if a=T and b=T, p(o=T) can be read directly from the object detector's conditional probability table (both cases are worked through in the sketch below)
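
A minimal, self-contained sketch of these three predictive queries in Python, reusing the same illustrative (assumed) numbers as before:

```python
# Predictive inference for p(o=T) in the cup-of-tea example.
# Numeric values are illustrative assumptions only.

p_a = {True: 0.4, False: 0.6}
p_b = {True: 0.3, False: 0.7}
p_o_given_ab = {(True, True): 0.95, (True, False): 0.40,
                (False, True): 0.50, (False, False): 0.05}

def p_o_true(a=None, b=None):
    """p(o=T | evidence); unobserved root nodes are marginalised out."""
    a_vals = [a] if a is not None else [True, False]
    b_vals = [b] if b is not None else [True, False]
    total = 0.0
    for av in a_vals:
        for bv in b_vals:
            weight = (1.0 if a is not None else p_a[av]) * \
                     (1.0 if b is not None else p_b[bv])
            total += weight * p_o_given_ab[(av, bv)]
    return total

print(p_o_true())                # no evidence: sum over both A and B
print(p_o_true(a=True))          # evidence a=T: sum over B only
print(p_o_true(a=True, b=True))  # a=T and b=T: read straight from the CPT
```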

  12. Diagnostic inference • Reasoning from leaf nodes upwards to root nodes • Use Bayes' rule
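
For example, reasoning from the saucer detector N back to the object node O gives:

```latex
p(o{=}T \mid n{=}T) \;=\; \frac{p(n{=}T \mid o{=}T)\, p(o{=}T)}{p(n{=}T)},
\qquad
p(n{=}T) \;=\; \sum_{o \in \{T,F\}} p(n{=}T \mid o)\, p(o)
```

where p(o=T) is the predictive probability computed on the previous slides.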

  13. Diagnostic inference [Diagram: O → N, with a second node X also linked into N] • If there had been a link from another node X into N, we would have needed to normalise our expression over the additional node
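
As a sketch of that point, assuming the extra parent X is a root node independent of O, the diagnostic query sums X out of both the numerator and the normalising constant:

```latex
p(o{=}T \mid n{=}T) \;=\;
\frac{\sum_{x} p(n{=}T \mid o{=}T, x)\, p(x)\, p(o{=}T)}
     {\sum_{o'} \sum_{x} p(n{=}T \mid o', x)\, p(x)\, p(o')}
```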

  14. Combined inference [Diagram: the example network with N and B marked as evidence nodes and O as the query node] • Where you have evidence from, say, N and B and form a query on an intermediate node • E.g. use diagnostic inference to carry the evidence on N up to O, and predictive inference to carry the evidence on B down to O • Can compute, for example, p(o=T|n=T,b=T)
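
For this network the combined query factorises neatly: the diagnostic evidence n=T enters through the likelihood p(n|o), the predictive evidence b=T enters through the CPT for O, and the unobserved root A is marginalised out:

```latex
p(o{=}T \mid n{=}T, b{=}T) \;\propto\;
p(n{=}T \mid o{=}T) \, \sum_{a} p(o{=}T \mid a, b{=}T)\, p(a)
```

normalised so that the values for o=T and o=F sum to one.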

  15. Intercausal inference ("explaining away") [Diagram: A → O ← B, with O and B as evidence nodes and A as the query node] • A and B are independent • A is dependent on B given O • If, for example, p(a=T|o=T) > p(a=T|o=T,b=T), then the additional evidence b=T accounts for o=T and makes a=T a less likely cause • We say that o=T has been "explained away" by b=T
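
The two quantities being compared follow directly from Bayes' rule; because A and B are independent root nodes, the conditional with both pieces of evidence is:

```latex
p(a{=}T \mid o{=}T, b{=}T) \;=\;
\frac{p(o{=}T \mid a{=}T, b{=}T)\, p(a{=}T)}
     {\sum_{a'} p(o{=}T \mid a', b{=}T)\, p(a')}
```

and p(a=T|o=T) is obtained the same way with B marginalised out of both numerator and denominator.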

  16. General approach to inference • These methods have their origins in Pearl's work on junction trees ("Probabilistic Reasoning in Intelligent Systems", Pearl 1988) • Efficient schemes exist for global computation of probabilities using local message passing (e.g. Jensen and Lauritzen 1990, and Lauritzen and Spiegelhalter 1988) • Beyond the scope of this course, but …

  17. Bayesian inference tools • There are a number of packages out there to do the work for you! • http://www.cs.ubc.ca/~murphyk/Software : Kevin Murphy's BNT • http://www.csse.monash.edu.au/bai/book/appendix_b.pdf : an excellent summary of various packages and their capabilities

  18. Summary • Bayesian inference allows the values of evidence nodes to be used systematically to update query nodes • We can distinguish four modes of inference: predictive, diagnostic, combined and intercausal ("explaining away") • Large Bayesian networks can be evaluated efficiently using Bayesian inference toolkits available on the Internet

  19. Next time … • Gaussian mixtures • A lot of excellent reference material on Bayesian reasoning can be found at: http://www.csse.monash.edu.au/bai http://www.dcs.qmw.ac.uk/~norman/BBNs/idxlist.htm
