
dolan-oneal



Presentation Transcript


  1. 10/22 — Homework 3 returned; solutions posted. Homework 4 socket opened. Project 3 assigned. Mid-term on Wednesday. (Optional) review session Tuesday.

  2. Conjunctive queries essentially compute joint distributions over sets of query variables. A special case of computing the full joint over the query variables is finding just the single query-variable configuration that is most likely given the evidence. There are two special cases here: MPE (Most Probable Explanation) — the most likely assignment to all non-evidence variables given the evidence; mostly involves max/product operations. MAP (Maximum a Posteriori) — the most likely assignment to some of the non-evidence variables given the evidence; can involve max/product/sum operations.
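A brute-force sketch of MPE on the familiar Cloudy/Sprinkler/Rain/WetGrass network may make this concrete (the CPT numbers are the standard textbook ones, used here only for illustration):

```python
import itertools

# CPTs for the Cloudy/Sprinkler/Rain/WetGrass network
# (standard textbook values; any consistent CPTs would do).
def p_c(c):
    return 0.5

def p_s(s, c):
    p = 0.1 if c else 0.5
    return p if s else 1.0 - p

def p_r(r, c):
    p = 0.8 if c else 0.2
    return p if r else 1.0 - p

def p_w(w, s, r):
    p = {(True, True): 0.99, (True, False): 0.90,
         (False, True): 0.90, (False, False): 0.0}[(s, r)]
    return p if w else 1.0 - p

def joint(c, s, r, w):
    return p_c(c) * p_s(s, c) * p_r(r, c) * p_w(w, s, r)

def mpe(evidence):
    """MPE: the single most likely assignment to ALL non-evidence
    variables given the evidence (max/product only -- no summing)."""
    best, best_p = None, -1.0
    for c, s, r, w in itertools.product([True, False], repeat=4):
        asg = {'C': c, 'S': s, 'R': r, 'W': w}
        if any(asg[v] != val for v, val in evidence.items()):
            continue
        p = joint(c, s, r, w)
        if p > best_p:
            best, best_p = asg, p
    return best, best_p
```

For example, `mpe({'W': True})` returns the full assignment maximizing the joint among those with WetGrass true; MAP over a subset of the variables would additionally sum out the remaining variables before maximizing.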

  3. Overview of BN Inference Algorithms (TONS OF APPROACHES). Exact inference — Complexity: NP-hard (actually #P-complete, since we “count” models); polynomial for “singly connected” networks (at most one path between each pair of nodes). Algorithms: enumeration; variable elimination (avoids the redundant computations of enumeration); [many others, such as “message passing” algorithms, constraint-propagation-based algorithms, etc.]. Approximate inference — Complexity: NP-hard for both absolute and relative approximation. Algorithms based on stochastic simulation: sampling from empty networks, rejection sampling, likelihood weighting, MCMC [and many more].

  4. Network Topology & Complexity of Inference. Multiply-connected networks (e.g., Cloudy → Sprinklers, Cloudy → Rain, Sprinklers → WetGrass, Rain → WetGrass): inference is NP-hard. Such a network can be converted to a singly connected one by merging nodes — e.g., merging Sprinklers and Rain into a single node Sprinklers+Rain that takes 4 values (2x2). Singly connected networks (poly-trees: at most one path between any pair of nodes): inference is polynomial. The catch: the “size” of the merged network can be exponentially larger, so polynomial inference on that network isn’t exactly god’s gift 

  5. Examples of singly connected networks include Markov Chains and Hidden Markov Models

  6. Summing out A (in the burglary–alarm network): fA(a,b,e)·fJ(a)·fM(a) + fA(¬a,b,e)·fJ(¬a)·fM(¬a), leaving a factor over B and E.

  7. Complexity depends on the size of the largest factor, which in turn depends on the order in which the variables are eliminated.
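To make the factor-size point concrete, here is a minimal Python sketch of one elimination step; the factor representation is my own, and the CPT numbers are the standard burglary–alarm textbook values:

```python
import itertools

# Minimal factor representation for variable elimination (illustrative
# sketch; all variables are Boolean). A factor is (variable_list, table),
# where table maps assignment tuples to numbers.
def multiply(f1, f2):
    vars1, t1 = f1
    vars2, t2 = f2
    out_vars = list(dict.fromkeys(vars1 + vars2))  # union, order-preserving
    table = {}
    for asg in itertools.product([True, False], repeat=len(out_vars)):
        env = dict(zip(out_vars, asg))
        table[asg] = (t1[tuple(env[v] for v in vars1)] *
                      t2[tuple(env[v] for v in vars2)])
    return (out_vars, table)

def sum_out(var, f):
    """Eliminate `var` by summing it out of factor `f`."""
    vars_, t = f
    i = vars_.index(var)
    out_vars = vars_[:i] + vars_[i + 1:]
    table = {}
    for asg, p in t.items():
        key = asg[:i] + asg[i + 1:]
        table[key] = table.get(key, 0.0) + p
    return (out_vars, table)

# Factors from the burglary-alarm example (standard textbook CPT values):
# fA(A,B,E) = P(A|B,E), fJ(A) = P(j|A), fM(A) = P(m|A).
pa = {(True, True): 0.95, (True, False): 0.94,
      (False, True): 0.29, (False, False): 0.001}
fA = (['A', 'B', 'E'],
      {(a, b, e): (pa[(b, e)] if a else 1 - pa[(b, e)])
       for a, b, e in itertools.product([True, False], repeat=3)})
fJ = (['A'], {(True,): 0.90, (False,): 0.05})
fM = (['A'], {(True,): 0.70, (False,): 0.01})

# Eliminating A: multiply all factors mentioning A, then sum A out.
# The intermediate factor is over {A,B,E}; the result is over {B,E}.
# The size of these intermediate factors is what the order controls.
f_BE = sum_out('A', multiply(multiply(fA, fJ), fM))
```

The first term of `f_BE` at (b, e) is exactly the expression on slide 6: fA(a,b,e)·fJ(a)·fM(a) + fA(¬a,b,e)·fJ(¬a)·fM(¬a).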

  8. Sufficient Condition 1: In general, any leaf node that is not a query or evidence variable is irrelevant and can be removed (and once it is removed, other nodes may be seen to be irrelevant in turn). We can thus drop irrelevant variables from the network before starting on the query.

  9. Notice that sampling methods could in general be used even when we don’t know the Bayes net (and are just observing the world)! Given that we do know the Bayes net, we should strive to make the sampling more efficient.

  10. Generating a Sample from the Network — e.g., the sample <C, ~S, R, W>. Many such samples drawn from the network together approximate the joint distribution.
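A forward (“prior”) sampler for this network can be sketched as follows; the CPT numbers are the usual textbook ones and stand in for whatever the real CPTs are:

```python
import random

# CPT for WetGrass given (Sprinkler, Rain) in the Cloudy/Sprinkler/
# Rain/WetGrass network (standard textbook values; placeholders).
P_W = {(True, True): 0.99, (True, False): 0.90,
       (False, True): 0.90, (False, False): 0.0}

def prior_sample(rng=random):
    """Draw one sample <C,S,R,W> by sampling each node given its
    already-sampled parents, in topological order."""
    c = rng.random() < 0.5
    s = rng.random() < (0.1 if c else 0.5)
    r = rng.random() < (0.8 if c else 0.2)
    w = rng.random() < P_W[(s, r)]
    return {'C': c, 'S': s, 'R': r, 'W': w}
```

Counting how often an event occurs across many such samples approximates its probability under the joint distribution.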

  11. That is, the rejection sampling method doesn’t really use the Bayes network that much…
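The point shows up clearly in code: rejection sampling consults the network only to draw prior samples, then simply discards any sample that disagrees with the evidence (CPT numbers are again the textbook placeholders):

```python
import random

# Textbook placeholder CPT for WetGrass given (Sprinkler, Rain).
P_W = {(True, True): 0.99, (True, False): 0.90,
       (False, True): 0.90, (False, False): 0.0}

def prior_sample(rng=random):
    """One forward sample from the Cloudy/Sprinkler/Rain/WetGrass net."""
    c = rng.random() < 0.5
    s = rng.random() < (0.1 if c else 0.5)
    r = rng.random() < (0.8 if c else 0.2)
    w = rng.random() < P_W[(s, r)]
    return {'C': c, 'S': s, 'R': r, 'W': w}

def rejection_estimate(query, evidence, n, rng=random):
    """Estimate P(query=True | evidence): draw n prior samples and keep
    only those consistent with the evidence -- the net is never consulted
    about the evidence itself."""
    kept = [s for s in (prior_sample(rng) for _ in range(n))
            if all(s[v] == val for v, val in evidence.items())]
    if not kept:
        return None  # every sample was rejected
    return sum(s[query] for s in kept) / len(kept)
```

When the evidence is unlikely, almost all samples are thrown away, which is exactly the inefficiency the slide is pointing at.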

  12. Notice that to attach the likelihood weight to the evidence, we are using the CPTs of the Bayes net. (Model-free empirical observation, in contrast, either gives you a sample or not; we can’t get fractional samples.)
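A likelihood-weighting sketch for the same network makes the fractional-sample idea explicit; evidence variables are clamped and contribute their CPT probability to the sample’s weight. (CPT numbers are the textbook placeholders, and for brevity this sketch assumes C itself is never an evidence variable.)

```python
import random

# Textbook placeholder CPT for WetGrass given (Sprinkler, Rain).
P_W = {(True, True): 0.99, (True, False): 0.90,
       (False, True): 0.90, (False, False): 0.0}

def weighted_sample(evidence, rng=random):
    """One likelihood-weighted sample: evidence variables are fixed and
    multiply their CPT probability into the weight; the rest are sampled.
    (Sketch assumption: C is never an evidence variable.)"""
    w = 1.0
    c = rng.random() < 0.5
    sample = {'C': c}
    for var, p_true in (('S', 0.1 if c else 0.5),
                        ('R', 0.8 if c else 0.2)):
        if var in evidence:
            sample[var] = evidence[var]
            w *= p_true if evidence[var] else 1.0 - p_true
        else:
            sample[var] = rng.random() < p_true
    p_w = P_W[(sample['S'], sample['R'])]
    if 'W' in evidence:
        sample['W'] = evidence['W']
        w *= p_w if evidence['W'] else 1.0 - p_w
    else:
        sample['W'] = rng.random() < p_w
    return sample, w

def likelihood_weighting(query, evidence, n, rng=random):
    """Estimate P(query=True | evidence) from n weighted samples."""
    num = den = 0.0
    for _ in range(n):
        s, w = weighted_sample(evidence, rng)
        den += w
        if s[query]:
            num += w
    return num / den
```

Every sample counts, but each counts only in proportion to how well it explains the evidence — the fractional samples the slide mentions.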

  13. MCMC not covered

  14. Note that the other parents of Zj are part of the Markov blanket.
