This presentation focuses on enhancing lifted probabilistic inference by introducing partial inversion, a technique that simplifies inference without resorting to grounding. It highlights two main contributions: a general method for partial inversion, extending previous work, and lifted Most Probable Explanation (MPE) inference. Using the First-Order Variable Elimination (FOVE) algorithm, the approach manipulates whole classes of random variables at once, reducing computational overhead and enabling more effective epidemic modeling and probabilistic reasoning across various domains.
MPE and Partial Inversion in Lifted Probabilistic Variable Elimination
Rodrigo de Salvo Braz, University of Illinois at Urbana-Champaign, with Eyal Amir and Dan Roth
Lifted Probabilistic Inference
• We assume probabilistic statements such as ∀ Person, Disease: P(sick(Person, Disease) | epidemics(Disease)) = 0.3
• The typical approach is grounding.
• We seek to do inference at the first-order level, as is done in logic.
• Faster and more intelligible.
• Two contributions:
  • Partial inversion: a more general technique than previous work (IJCAI '05)
  • MPE and lifted assignments
Representing structure
A parfactor has logical variables (e.g. D) and atoms (e.g. epidemic(D), sick(P,D)).
[Figure: grounding the atoms epidemic(D) and sick(P,D) yields the random variables epidemic(measles), epidemic(flu), …, each connected to sick(mary,measles), sick(mary,flu), sick(bob,measles), sick(bob,flu), ….]
Poole (2003) named these parfactors, for “parameterized factors”.
Parfactor
∀ Person, Disease: f(sick(Person, Disease), epidemic(Disease))
[Figure: the atom epidemic(Disease) linked to the atom sick(Person, Disease).]
Parfactor with constraints
∀ Person, Disease: f(sick(Person, Disease), epidemic(Disease)), Person ≠ mary, Disease ≠ flu
[Figure: as before, with the constraints Person ≠ mary and Disease ≠ flu attached.]
Joint Distribution
• As in the propositional case, proportional to the product of all factors.
• But here, “all factors” means all instantiations of all parfactors:
P(…) ∝ ∏_X f1(p(X)) ∏_{X,Y} f2(p(X), q(X,Y))
Inference task - Marginalization
∑_{q(X,Y)} ∏_X f1(p(X)) ∏_{X,Y} f2(p(X), q(X,Y))
Marginal on all random variables in p(X): a summation over all assignments to all instances of q(X,Y).
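The marginalization above can be checked by brute force on a tiny grounding. This is a sketch with made-up domains and potentials (f1, f2, and all numeric values are illustrative, not from the talk):

```python
from itertools import product

# Hypothetical toy grounding: binary random variables p(x) and q(x,y).
Xs, Ys = ["x1", "x2"], ["y1", "y2"]
f1 = {0: 1.0, 1: 2.0}                               # f1(p(X))
f2 = {(0, 0): 1.0, (0, 1): 3.0,
      (1, 0): 2.0, (1, 1): 1.0}                     # f2(p(X), q(X,Y))

def unnormalized(p, q):
    """Product of all instantiations of both parfactors."""
    w = 1.0
    for x in Xs:
        w *= f1[p[x]]
        for y in Ys:
            w *= f2[(p[x], q[(x, y)])]
    return w

def marginal_on_p(p):
    """Marginal on the p(X) class: sum out every instance of q(X,Y)."""
    total = 0.0
    for qvals in product([0, 1], repeat=len(Xs) * len(Ys)):
        q = dict(zip(product(Xs, Ys), qvals))
        total += unnormalized(p, q)
    return total

print(marginal_on_p({"x1": 0, "x2": 1}))            # → 288.0
```

Grounding requires summing over 2^(|X||Y|) assignments to q; the lifted techniques in this talk avoid that blow-up.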
The FOVE Algorithm • First-Order Variable Elimination (FOVE): a generalization of Variable Elimination in propositional graphical models. • Eliminates classes of random variables at once.
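For comparison, propositional variable elimination on a toy chain A - B - C, with binary variables and illustrative potentials (a sketch, not from the talk):

```python
# Illustrative potentials over the chain A - B - C (binary variables).
fAB = {(0, 0): 1.0, (0, 1): 2.0, (1, 0): 3.0, (1, 1): 4.0}
fBC = {(0, 0): 2.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 2.0}

# Eliminate B: multiply the two factors that mention B, then sum B out.
fAC = {(a, c): sum(fAB[a, b] * fBC[b, c] for b in (0, 1))
       for a in (0, 1) for c in (0, 1)}

# Eliminate C to get the (unnormalized) marginal on A.
marg_A = {a: sum(fAC[a, c] for c in (0, 1)) for a in (0, 1)}

# Brute-force check over the full joint.
brute_A = {a: sum(fAB[a, b] * fBC[b, c] for b in (0, 1) for c in (0, 1))
           for a in (0, 1)}
assert marg_A == brute_A
```

FOVE lifts this step so that a whole class of random variables, such as sick(mary, D), is eliminated at once instead of one ground variable at a time.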
FOVE in action: P(hospital(mary)) = ?
[Animation: the network contains epidemic(measles), epidemic(D) with D ≠ measles, sick(mary,measles), sick(mary,D) with D ≠ measles, and hospital(mary). Each step eliminates one class of random variables: first epidemic(measles), then sick(mary,measles), then epidemic(D), then sick(mary,D), until only the query hospital(mary) remains.]
Counting Elimination - A Combinatorial Approach
∑_{e(D)} ∏_{D1 ≠ D2} f(e(D1), e(D2))
= ∑_{e(D)} f(0,0)^{#(0,0) in assignment} f(0,1)^{#(0,1) in assignment} f(1,0)^{#(1,0) in assignment} f(1,1)^{#(1,1) in assignment}
Let i be the number of e(D)’s assigned 1:
= ∑_i (number of assignments with |{D : e(D) = 1}| = i) ∏_{v1,v2} f(v1,v2)^{#(v1,v2) given i}
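A sketch of the identity above, checked numerically for an illustrative potential f and domain size n = 4 (the counts of ordered pairs with D1 ≠ D2 follow from choosing i variables to be 1):

```python
from itertools import product
from math import comb, prod

f = {(0, 0): 1.0, (0, 1): 2.0, (1, 0): 3.0, (1, 1): 0.5}  # illustrative
n = 4                                                      # |domain of D|

# Brute force: 2^n assignments, product over ordered pairs D1 != D2.
brute = 0.0
for e in product([0, 1], repeat=n):
    brute += prod(f[(e[d1], e[d2])]
                  for d1 in range(n) for d2 in range(n) if d1 != d2)

# Counting elimination: only i = number of e(D)'s assigned 1 matters.
counted = 0.0
for i in range(n + 1):
    pair_counts = {(0, 0): (n - i) * (n - i - 1), (0, 1): (n - i) * i,
                   (1, 0): i * (n - i), (1, 1): i * (i - 1)}
    counted += comb(n, i) * prod(f[v] ** pair_counts[v] for v in f)

assert abs(brute - counted) < 1e-9 * brute
```

The sum over 2^n assignments collapses to a sum over n + 1 counts.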
Counting Elimination - Conditions
• It does not work when eliminating the class epidemic from f(epidemic(D1, Region), epidemic(D2, Region), donations).
• In general, counting elimination does not apply when atoms share logical variables.
• Here, Region is shared between the atoms.
Partial Inversion
Provides a way of removing shared logical variables:
∑_{e(D,R)} ∏_{D1 ≠ D2, R} f(e(D1,R), e(D2,R), d)
= ∏_R ∑_{e(D,r)} ∏_{D1 ≠ D2} f(e(D1,r), e(D2,r), d)   (R is now bound, so no longer a variable)
= ∏_R f’(d) = f’(d)^{|R|} = f’’(d)
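A numeric sketch of this factorization, with an illustrative potential f, |D| = 3, |R| = 2, and the extra variable d fixed (all values made up):

```python
from itertools import product
from math import prod

def f(v1, v2, d):                       # illustrative potential
    return 1.0 + 0.5 * v1 + 0.3 * v2 + 0.2 * d

nD, nR, d = 3, 2, 1

# Brute force: sum over all assignments to the full e(D,R) grid.
brute = 0.0
for e in product([0, 1], repeat=nD * nR):
    cols = [e[r * nD:(r + 1) * nD] for r in range(nR)]
    brute += prod(f(cols[r][d1], cols[r][d2], d)
                  for r in range(nR)
                  for d1 in range(nD) for d2 in range(nD) if d1 != d2)

# Partial inversion on R: each region r is an identical subproblem,
# so the answer is that subproblem raised to the power |R|.
sub = sum(prod(f(col[d1], col[d2], d)
               for d1 in range(nD) for d2 in range(nD) if d1 != d2)
          for col in product([0, 1], repeat=nD))
assert abs(brute - sub ** nR) < 1e-9 * brute
```

Each per-r subproblem still has shared D1, D2 over the same atom class, so it is handled by counting elimination, exactly as the next slide shows.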
Partial Inversion, graphically
[Figure: the parfactor over epidemic(D1,R), epidemic(D2,R), and donations splits into one independent subproblem per region r1, …, r10; each instance is a counting elimination problem.]
Another (not so partial) inversion
∑_{q(X,Y)} ∏_{X,Y} f(p(X), q(X,Y))   (expensive)
= ∏_{X,Y} ∑_{q(X,Y)} f(p(X), q(X,Y))   (propositional)
= ∏_{X,Y} f’(p(X)) = ∏_X f’^{|Y|}(p(X)) = ∏_X f’’(p(X))   (marginal on p(X))
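The same identity checked numerically: since each instance q(x,y) appears in exactly one factor, the sum distributes over the product (f and the fixed assignment to p(X) are illustrative):

```python
from itertools import product
from math import prod

f = {(0, 0): 1.0, (0, 1): 2.0, (1, 0): 0.5, (1, 1): 3.0}  # illustrative
p = [0, 1, 1]          # a fixed value of p(x) for each x in the X domain
nY = 2                 # |domain of Y|

# Brute force: sum over all 2^(|X||Y|) assignments to q(X,Y).
brute = 0.0
for q in product([0, 1], repeat=len(p) * nY):
    rows = [q[i * nY:(i + 1) * nY] for i in range(len(p))]
    brute += prod(f[(p[i], rows[i][j])]
                  for i in range(len(p)) for j in range(nY))

# Inversion elimination: push the sum inside, then exponentiate by |Y|.
inverted = prod(sum(f[(p[i], v)] for v in (0, 1)) ** nY
                for i in range(len(p)))
assert abs(brute - inverted) < 1e-9 * brute
```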
Another (not so partial) inversion, graphically
[Figure: the parfactor over p(X) and q(X,Y) splits into one independent propositional elimination problem per instance: p(x1) with q(x1,y1), …, p(xn) with q(xn,yn).]
Partial inversion conditions
f(friends(X,Y), friends(Y,X))
Cannot partially invert on X, Y because friends(bob,mary) appears in more than one instance of the parfactor.
[Figure: friends(X,Y) and friends(Y,X) both ground to the pair friends(bob,mary), friends(mary,bob).]
Summary of Partial Inversion
• More general than previous Inversion Elimination.
• Generates Counting Elimination or propositional sub-problems.
• Cannot be applied to “entangled” parfactors.
• Does not depend on domain size.
Second contribution: Lifted MPE
• In the propositional case, MPE is computed by eliminating variables one at a time, keeping factors that record the MPE values of the eliminated variables.
[Animation: variables C, D, B, and A are eliminated one at a time until none remain.]
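The propositional step can be sketched with max-product elimination plus a traceback table (a toy two-variable model with made-up potentials):

```python
# Illustrative potentials over binary variables A and B.
fAB = {(0, 0): 1.0, (0, 1): 4.0, (1, 0): 2.5, (1, 1): 3.0}
fB = {0: 2.0, 1: 1.0}

# Eliminate B with max instead of sum, recording the argmax so the
# eliminated variable's MPE value can be recovered later.
msg, back = {}, {}
for a in (0, 1):
    scores = {b: fAB[(a, b)] * fB[b] for b in (0, 1)}
    back[a] = max(scores, key=scores.get)
    msg[a] = scores[back[a]]

# Eliminate A, then read the full MPE assignment back.
a_star = max(msg, key=msg.get)
mpe = {"A": a_star, "B": back[a_star]}
print(mpe)                               # → {'A': 1, 'B': 0}
```

The traceback table `back` plays the role of the "factors containing the MPE of eliminated variables" on the slide.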
MPE
• Same idea in the first-order case.
• But factors are quantified, and so are assignments:
∀ X, Y: f(p(X), q(X,Y))
MPE
∀ X, Y: f(p(X), q(X,Y))
After Inversion Elimination of q(X,Y):
∀ X: f’(p(X))   (with lifted assignments)
MPE
∀ X: f’(p(X))
After Inversion Elimination of p(X):
f’’()
MPE
∀ D1, D2: f(e(D1), e(D2))
After Counting Elimination of e:
f’()
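Counting elimination carries over to MPE: all assignments with the same count i of e(D)'s set to 1 share one score, so the maximization is over i, yielding an existentially quantified lifted assignment ("exactly i of the e(D) are 1"). A sketch with an illustrative f and n = 4:

```python
from itertools import product
from math import prod

f = {(0, 0): 1.0, (0, 1): 2.0, (1, 0): 1.5, (1, 1): 0.8}  # illustrative
n = 4

def score(i):
    """Shared score of every assignment with exactly i ones."""
    pair_counts = {(0, 0): (n - i) * (n - i - 1), (0, 1): (n - i) * i,
                   (1, 0): i * (n - i), (1, 1): i * (i - 1)}
    return prod(f[v] ** pair_counts[v] for v in f)

best_i = max(range(n + 1), key=score)

# Brute-force check: the true MPE score over all 2^n assignments.
best = max(prod(f[(e[d1], e[d2])]
                for d1 in range(n) for d2 in range(n) if d1 != d2)
           for e in product([0, 1], repeat=n))
assert abs(score(best_i) - best) < 1e-9
```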
Conclusions
• Partial Inversion: a more general algorithm that subsumes Inversion Elimination.
• Lifted Most Probable Explanation (MPE):
  • the same idea as in propositional VE, but with
  • lifted assignments, which describe sets of basic assignments:
    • the universally quantified part comes from Partial Inversion
    • the existentially quantified part comes from Counting Elimination
• Ultimate goal: to perform lifted probabilistic inference in a way similar to logical inference, without grounding and at a higher level.