Random Finite Sets in Stochastic Filtering
Ba-Ngu Vo, EEE Department, University of Melbourne, Australia
http://www.ee.unimelb.edu.au/staff/bv/
IEEE Victorian Chapter, July 28, 2009
Stochastic Filtering History
• 1940s: Wiener filter. Pioneering work by Wiener and Kolmogorov. N. Wiener (1894-1964), A. N. Kolmogorov (1903-1987), R. E. Kalman (1930-).
• 1950s: Towards the Kalman filter: work by Bode & Shannon, Zadeh & Ragazzini, Levinson, Swerling, Stratonovich, etc.
• 1960s: Publication of the Kalman filter and Kalman-Bucy filter; Schmidt's first implementation (Apollo program); LMS algorithm by Widrow & Hoff.
• 1970s: Aerospace applications: Sorenson & Alspach, Singer, Bar-Shalom, Reid, etc.
Stochastic Filtering: The Present
• Particle filter (1990s onward): computational tools for nonlinear, non-Gaussian filtering. Gordon, Salmond & Smith; Doucet; …
• Random finite set (1990s onward): a unified framework for multi-object filtering & control; Probability Hypothesis Density (PHD) filters, Bernoulli filter. Pioneering work by Mahler.
Outline
• The Bayes (nonlinear) filter
• Practical challenges
• Multi-object filtering
• Random finite set
• PHD/CPHD filters & applications
• Conclusions
The Bayes (Nonlinear) Filter
System model: the state vector x_k evolves on the state space according to the Markov transition density f_{k|k-1}(x_k | x_{k-1}); the observation z_k on the observation space is generated according to the measurement likelihood g_k(z_k | x_k).
Objective: compute the posterior pdf of the state, p_k(x_k | z_{1:k}), given the measurement history z_{1:k} = (z_1, …, z_k).
The Bayes (Nonlinear) Filter
The Bayes filter propagates the posterior p_{k-1}(x_{k-1} | z_{1:k-1}) in two steps:
prediction: p_{k|k-1}(x_k | z_{1:k-1}) = ∫ f_{k|k-1}(x_k | x_{k-1}) p_{k-1}(x_{k-1} | z_{1:k-1}) dx_{k-1}
data-update: p_k(x_k | z_{1:k}) = g_k(z_k | x_k) p_{k|k-1}(x_k | z_{1:k-1}) / ∫ g_k(z_k | x) p_{k|k-1}(x | z_{1:k-1}) dx
The Bayes (Nonlinear) Filter
Two classical implementations of the recursion p_{k-1}(· | z_{1:k-1}) → p_{k|k-1}(· | z_{1:k-1}) → p_k(· | z_{1:k}):
• Kalman filter (linear-Gaussian models): propagates Gaussians N(·; m_{k-1}, P_{k-1}) → N(·; m_{k|k-1}, P_{k|k-1}) → N(·; m_k, P_k).
• Particle filter: propagates weighted samples {w_{k-1}^{(i)}, x_{k-1}^{(i)}}_{i=1}^N → {w_{k|k-1}^{(i)}, x_{k|k-1}^{(i)}}_{i=1}^N → {w_k^{(i)}, x_k^{(i)}}_{i=1}^N.
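As a concrete illustration of the two-step recursion, here is a minimal Kalman filter sketch in NumPy; the model matrices and noise values are made up for the example.

```python
import numpy as np

def kalman_predict(m, P, F, Q):
    """Prediction: propagate N(.; m, P) through x_k = F x_{k-1} + process noise."""
    return F @ m, F @ P @ F.T + Q

def kalman_update(m, P, z, H, R):
    """Data-update: condition N(.; m, P) on z_k = H x_k + measurement noise."""
    S = H @ P @ H.T + R               # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
    return m + K @ (z - H @ m), P - K @ H @ P

# 1-D constant-position example: state = position, directly observed with noise
F = np.array([[1.0]]); Q = np.array([[0.01]])
H = np.array([[1.0]]); R = np.array([[0.1]])
m, P = np.array([0.0]), np.array([[1.0]])   # prior
for z in [0.9, 1.1, 1.0]:                   # measurements near the true position 1.0
    m, P = kalman_predict(m, P, F, Q)
    m, P = kalman_update(m, P, np.array([z]), H, R)
print(m[0], P[0, 0])   # posterior mean pulled toward 1.0, variance shrunk
```

Each pass through the loop is one cycle of the prediction/data-update diagram above, specialised to the Gaussian case.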
Practical Challenges
So far we have assumed exactly one observation at each time step, which holds only for a small number of applications. A practical measuring device:
• may fail to detect the true observation (detection uncertainty), and
• picks up false observations (clutter).
Practical Challenges
Detection uncertainty: the object is either detected or not detected. False observations (clutter): the number of false observations is unknown and random.
Practical Challenges
Observation = detected true measurements + false measurements. There is no information on which measurement is the observation of the state, and the number of observations is a random variable.
Practical Challenges
Summary of practical challenges:
• the number of observations is random & time-varying;
• the true observation may not be present;
• we do not know which observations are true and which are false;
• the ordering of observations is not relevant.
Multi-Object Filtering
The multi-object state X_k (e.g. 3 objects at one time step, 5 at the next) evolves on the state space and produces observations on the observation space.
Objective: jointly estimate the number & states of the objects.
Numerous applications: defence, surveillance, robotics, biomed, …
Challenges:
• random number of objects and measurements;
• detection uncertainty, clutter, association uncertainty.
Multi-Object Filtering
How can we mathematically represent the multi-object state? The usual practice is to stack the individual states into one large vector, but this carries a fundamental inconsistency: with 2 objects, the stacked vectors for the true and estimated multi-object states can differ only in ordering, so the estimate is correct yet the vector estimation error is large. What does that estimation error mean? Remedy: use a finite set.
Multi-Object Filtering
Compare a true multi-object state of 2 objects with an estimate of 2 objects, or a true state of 1 object with an estimate of no objects: what are the estimation errors? A stacked vector cannot even express the error when the numbers of objects disagree.
Multi-Object Filtering
Miss-distance: the error between estimate and true state. It measures how close an estimate is to the true value, is well understood for a single target (Euclidean distance, MSE, etc.), and is fundamental in estimation/filtering & control.
• The vector representation does not admit a multi-object miss-distance.
• The finite-set representation does, e.g. Hausdorff, Wasserstein, OSPA [Schuhmacher et al. 08]. These "distances" are distances between sets, not vectors.
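The OSPA construction can be sketched in a few lines; the brute-force assignment below (cut-off c and order p as in [Schuhmacher et al. 08]) is only practical for small sets, and a real implementation would use an optimal-assignment solver instead.

```python
from itertools import permutations

def ospa(X, Y, c=2.0, p=1):
    """OSPA miss-distance between finite sets of scalar states.

    c is the cut-off (penalty for cardinality mismatch), p the order.
    Brute-force over assignments, so only suitable for small sets."""
    if not X and not Y:
        return 0.0
    if len(X) > len(Y):
        X, Y = Y, X                       # ensure |X| <= |Y|
    m, n = len(X), len(Y)
    # Best sum of cut-off distances over all assignments of X into Y
    best = min(
        sum(min(abs(x - y), c) ** p for x, y in zip(X, perm))
        for perm in permutations(Y, m)
    )
    return ((best + (n - m) * c ** p) / n) ** (1.0 / p)

# Two targets estimated perfectly, but one spurious estimate at 10.0:
print(ospa([1.0, 2.0], [1.0, 2.0, 10.0]))   # 0.666...: pure cardinality penalty
```

With identical sets the distance is 0; a cardinality mismatch contributes exactly c per missing/extra element before averaging, which is what makes OSPA usable as a per-target error.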
Multi-Object Filtering
Reconceptualise as a finite-set-valued filtering problem: the multi-object state X (the set of states) and the multi-object observation Z (the set of observations) are represented by finite sets. The Bayesian framework treats state and observation as random variables, so Bayesian multi-object filtering requires the random finite set (RFS), with recursion
prediction: p_{k-1}(X_{k-1} | Z_{1:k-1}) → p_{k|k-1}(X_k | Z_{1:k-1})
data-update: p_{k|k-1}(X_k | Z_{1:k-1}) → p_k(X_k | Z_{1:k}).
Random Finite Set
What is a random finite set (RFS)? A finite set of points in which the number of points is random and the points themselves are random and have no ordering. An RFS is a finite-set-valued random variable, also known as a point process or random point pattern.
Example 1: Bernoulli RFS on a space E, with existence probability r and density p: sample u ~ uniform[0,1]; if u < r, sample x ~ p(·) and return {x}; otherwise return the empty set.
Example 2: multi-Bernoulli RFS = union of independent Bernoulli RFSs.
Random Finite Set
Example 3: Poisson RFS: sample n ~ Poiss(r); for i = 1:n, sample x_i ~ p(·); return {x_1, …, x_n}.
Example 4: i.i.d. cluster RFS: sample n ~ c(·); for i = 1:n, sample x_i ~ p(·); return {x_1, …, x_n}.
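The sampling recipes above can be written out as a short Python sketch; `sample_x` stands in for the single-point density p(·), and the Poisson draw uses Knuth's product-of-uniforms method (adequate for small rates).

```python
import math
import random

def sample_bernoulli_rfs(r, sample_x):
    """Bernoulli RFS: empty with probability 1 - r, else a singleton {x}, x ~ p(.)."""
    return [sample_x()] if random.random() < r else []

def sample_multi_bernoulli_rfs(rs, sample_x):
    """Multi-Bernoulli RFS: union of independent Bernoulli RFSs."""
    pts = []
    for r in rs:
        pts += sample_bernoulli_rfs(r, sample_x)
    return pts

def sample_poisson_rfs(rate, sample_x):
    """Poisson RFS: cardinality n ~ Poisson(rate), points i.i.d. ~ p(.)."""
    L, n, prod = math.exp(-rate), 0, 1.0   # Knuth's Poisson draw
    while True:
        prod *= random.random()
        if prod <= L:
            break
        n += 1
    return [sample_x() for _ in range(n)]

random.seed(0)
X = sample_poisson_rfs(3.0, lambda: random.gauss(0.0, 1.0))
print(len(X), X)   # a random number of unordered random points
```

Swapping the Poisson draw for a draw from an arbitrary pmf c(·) gives the i.i.d. cluster RFS of Example 4.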
Random Finite Set
Multi-object Bayes filter: p_{k-1}(X_{k-1} | Z_{1:k-1}) → prediction → p_{k|k-1}(X_k | Z_{1:k-1}) → data-update → p_k(X_k | Z_{1:k}). But what do prediction and update mean for set-valued arguments? We need suitable notions of density and integration for finite sets.
Random Finite Set
Let F(E) denote the collection of finite subsets of the state space E, and let S be an RFS on E. Two parallel formulations:
• Probability distribution of S (point process theory, 1950-60s): P_S(T) = P(S ∈ T), T ⊆ F(E), with probability density p_S : F(E) → [0, ∞) satisfying P_S(T) = ∫_T p_S(X) μ(dX) (a conventional integral).
• Belief "distribution" of S (Choquet 1968; Mahler's Finite Set Statistics, 1994): b_S(T) = P(S ⊆ T), T ⊆ E, with belief "density" f_S : F(E) → [0, ∞) satisfying b_S(T) = ∫_T f_S(X) δX (a set integral).
The connection between the two is established in [VSD 05].
The PHD Filter
The multi-object Bayes filter recursion p_{k-1}(X_{k-1}|Z_{1:k-1}) → p_{k|k-1}(X_k|Z_{1:k-1}) → p_k(X_k|Z_{1:k}) is computationally expensive. By analogy:
• Single-object: state of the system is a random vector; the Bayes filter can be approximated by a first-moment filter (e.g. the α-β-γ filter).
• Multi-object: state of the system is a random set; the multi-object Bayes filter can be approximated by a first-moment filter, the "PHD" filter.
The PHD Filter
The PHD (intensity function) v of an RFS: v(x_0) is the density of the expected number of objects at x_0, and ∫_S v(x) dx is the expected number of objects in a region S of the state space.
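A small illustration of this property, assuming a made-up 1-D Gaussian-mixture intensity of the kind the GM-PHD filter propagates: integrating v over a region S gives the expected object count in S, and integrating over the whole space gives the total expected number of objects, i.e. the sum of the mixture weights.

```python
import math

# Intensity v(x) = sum_j w_j N(x; m_j, s_j^2); components are made up.
mixture = [(0.9, -2.0, 0.5), (1.6, 1.0, 0.8)]   # (weight, mean, std)

Phi = lambda u: 0.5 * (1.0 + math.erf(u / math.sqrt(2.0)))  # standard normal cdf

def expected_count(lo, hi):
    """Integral of v(x) dx over S = [lo, hi] = expected number of objects in S."""
    return sum(w * (Phi((hi - m) / s) - Phi((lo - m) / s))
               for w, m, s in mixture)

print(expected_count(-1e9, 1e9))   # whole space: sum of weights = 2.5
print(expected_count(0.0, 3.0))    # expected number of objects in [0, 3]
</```

The total 2.5 is read as "2.5 objects expected on average", which is exactly the cardinality information the PHD carries.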
The PHD Filter
The PHD filter [Mahler 03] replaces the multi-object Bayes recursion on densities, p_{k-1}(X_{k-1}|Z_{1:k-1}) → p_{k|k-1}(X_k|Z_{1:k-1}) → p_k(X_k|Z_{1:k}), with a recursion on intensities over the state space: PHD prediction v_{k-1}(x_{k-1}|Z_{1:k-1}) → v_{k|k-1}(x_k|Z_{1:k-1}) and PHD update → v_k(x_k|Z_{1:k}). It avoids data association!
The PHD Filter: Prediction
The intensity from the previous time step is propagated to the predicted intensity:
v_{k|k-1}(x_k | Z_{1:k-1}) = ∫ f_{k|k-1}(x_k, x_{k-1}) v_{k-1}(x_{k-1} | Z_{1:k-1}) dx_{k-1} + γ_k(x_k)
where f_{k|k-1}(x_k, x_{k-1}) = e_{k|k-1}(x_{k-1}) f_{k|k-1}(x_k | x_{k-1}) + b_{k|k-1}(x_k | x_{k-1}),
e_{k|k-1} is the probability of object survival, f_{k|k-1}(· | ·) the Markov transition density, b_{k|k-1} the term for objects spawned by existing objects, and γ_k the term for spontaneous object births. In operator form, v_{k|k-1} = Φ_{k|k-1} v_{k-1} with (Φ_{k|k-1} a)(x_k) = ∫ f_{k|k-1}(x_k, x) a(x) dx + γ_k(x_k). The predicted expected number of objects is N_{k|k-1} = ∫ v_{k|k-1}(x | Z_{1:k-1}) dx.
The PHD Filter: Update
The predicted intensity is Bayes-updated with the measurement set Z_k:
v_k(x_k | Z_{1:k}) = [1 - p_{D,k}(x_k)] v_{k|k-1}(x_k | Z_{1:k-1}) + Σ_{z ∈ Z_k} p_{D,k}(x_k) g_k(z | x_k) v_{k|k-1}(x_k | Z_{1:k-1}) / (κ_k(z) + D_k(z))
where D_k(z) = ∫ p_{D,k}(x) g_k(z | x) v_{k|k-1}(x | Z_{1:k-1}) dx, p_{D,k} is the probability of detection, g_k the sensor likelihood function, and κ_k the intensity of false alarms. In operator form, v_k = Ψ_k v_{k|k-1}. The expected number of objects is N_k = ∫ v_k(x | Z_{1:k}) dx.
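The prediction and update formulas can be exercised numerically by discretising a 1-D state space on a grid, so the integrals become sums. Everything below (noise levels, birth intensity, clutter rate, no spawning) is made up for illustration; this is a crude sketch, not one of the talk's GM or particle implementations.

```python
import numpy as np

x = np.linspace(-10.0, 10.0, 401)
dx = x[1] - x[0]

def gauss(u, m, s):
    return np.exp(-0.5 * ((u - m) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

pS, pD = 0.95, 0.9           # survival & detection probabilities (assumed constant)
sigma_f, sigma_g = 0.5, 0.3  # motion & measurement noise std (made up)
birth = 0.1 * gauss(x, 0.0, 5.0)   # birth intensity gamma_k: 0.1 births expected
clutter = 0.05 / 20.0              # uniform clutter intensity kappa_k(z)

def phd_predict(v):
    # v_{k|k-1}(x) = pS * integral f(x|x') v(x') dx' + gamma(x)   (no spawning)
    trans = gauss(x[:, None], x[None, :], sigma_f)   # f(x | x') on the grid
    return pS * (trans @ v) * dx + birth

def phd_update(v, Z):
    # v_k(x) = (1 - pD) v(x) + sum_z pD g(z|x) v(x) / (kappa(z) + D(z))
    out = (1.0 - pD) * v
    for z in Z:
        g = gauss(z, x, sigma_g)
        out += pD * g * v / (clutter + np.sum(pD * g * v) * dx)
    return out

v = phd_predict(np.zeros_like(x))   # start from an empty prior: only births
v = phd_update(v, Z=[-1.2, 3.4])
print(np.sum(v) * dx)               # expected number of objects after the update
```

With two measurements and a weak birth prior, the integral of the updated intensity lands between 1 and 2: each measurement contributes a fraction of an expected object, traded off against clutter, with no data association anywhere.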
The PHD Filter
Two implementations of the intensity recursion v_{k-1}(· | Z_{1:k-1}) → v_{k|k-1}(· | Z_{1:k-1}) → v_k(· | Z_{1:k}):
• Gaussian Mixture PHD filter [VM 05, 06]: propagates mixture components {w_{k-1}^{(j)}, m_{k-1}^{(j)}, P_{k-1}^{(j)}}_{j=1}^{J_{k-1}} → {w_{k|k-1}^{(j)}, m_{k|k-1}^{(j)}, P_{k|k-1}^{(j)}}_{j=1}^{J_{k|k-1}} → {w_k^{(j)}, m_k^{(j)}, P_k^{(j)}}_{j=1}^{J_k}.
• Particle PHD filter [VSD 03, 05], [Mahler & Zajic 03], [Sidenbladh 03]: propagates weighted particles {w_{k-1}^{(j)}, x_{k-1}^{(j)}}_{j=1}^{J_{k-1}} → {w_{k|k-1}^{(j)}, x_{k|k-1}^{(j)}}_{j=1}^{J_{k|k-1}} → {w_k^{(j)}, x_k^{(j)}}_{j=1}^{J_k}.
The PHD Filter: developments & applications
• Algorithms: particle-PHD filter [VSD 03, 05]; auxiliary particle PHD filter [Whiteley et al. 07]; GM-PHD filter [VM 05, 06]; extended & unscented Kalman PHD filters [VM 06]; jump Markov PHD filter [Pasha et al. 06]
• Track labelling & association [Panta et al. 07, Lin et al. 06]; track continuity [Clark et al. 06]
• Convergence [VSD 05, Johansen et al. 07, Clark & Bell 06, Clark et al. 07]
• Applications: bistatic radar [Tobias & Lanterman 05]; computer vision [Maggio et al. 07, Wang et al. 08]; traffic intensity estimation [Battistelli et al. 08]; pipeline tracking (British Petroleum, 07); visual tracking [Pham et al. 07]; cell tracking [Juang et al. 09]
The PHD Filter
Video data: tracking football players [Pham et al. 07]. Data courtesy of Czyz et al.
The PHD Filter
Video tracking of people walking (340 frames) [Pham et al. 07]. Data courtesy of K. Smith, IDIAP Research Institute.
The Cardinalised PHD Filter
Drawback of the PHD filter: high variance of the cardinality estimate. The CPHD filter [Mahler 06, 07] relaxes the Poisson assumption, allowing any cardinality distribution, and jointly propagates the intensity function and the cardinality distribution:
cardinality prediction & update: c_{k-1}(n | Z_{1:k-1}) → c_{k|k-1}(n | Z_{1:k-1}) → c_k(n | Z_{1:k})
intensity prediction & update: v_{k-1}(x_{k-1} | Z_{1:k-1}) → v_{k|k-1}(x_k | Z_{1:k-1}) → v_k(x_k | Z_{1:k})
The Gaussian Mixture CPHD filter [VVC 06, 07] has a higher computational cost than the PHD filter, but is still cheaper than state-of-the-art traditional techniques.
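The cardinality-prediction half of the recursion can be sketched as follows: the previous cardinality pmf is thinned by the survival probability (binomial thinning) and convolved with the birth cardinality distribution. For simplicity the sketch assumes a Poisson birth cardinality and constant survival probability; the CPHD filter itself allows an arbitrary birth cardinality distribution, and its update step couples the cardinality pmf to the intensity.

```python
import math
import numpy as np

pS, lam_b, N_MAX = 0.9, 0.5, 20   # survival prob, birth mean, pmf truncation (made up)

def predict_cardinality(c_prev):
    """c_{k|k-1}(n): binomial-thinned survivors convolved with Poisson births."""
    surv = np.zeros(N_MAX + 1)
    for n, cn in enumerate(c_prev):            # thin each cardinality hypothesis
        for j in range(n + 1):
            surv[j] += cn * math.comb(n, j) * pS**j * (1 - pS)**(n - j)
    births = np.array([math.exp(-lam_b) * lam_b**k / math.factorial(k)
                       for k in range(N_MAX + 1)])
    c_pred = np.convolve(surv, births)[:N_MAX + 1]
    return c_pred / c_pred.sum()               # renormalise after truncation

c_prev = np.zeros(N_MAX + 1)
c_prev[2] = 1.0                                # previous posterior: exactly 2 objects
c_pred = predict_cardinality(c_prev)
mean = sum(n * p for n, p in enumerate(c_pred))
print(round(mean, 3))                          # predicted mean = 2*pS + lam_b = 2.3
```

Carrying this full pmf, rather than only its mean as the PHD filter effectively does, is what reduces the variance of the cardinality estimate.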
The Cardinalised PHD Filter: applications
• GM-CPHD filter [VVC 05, 06]
• GMTI radar [Ulmke et al. 07]; tested by FGAN (NATO Bold Avenger exercise), 07
• Acoustic source tracking [Pham et al. 08]; tested on MSTWG and SEABAR datasets [Erdinc et al. 08]
• Comparison with MHT [Svensson et al. 09]
• Convoy tracking and tracking from aerial images [Pollard et al. 09]
• Lockheed Martin (Space Fence), 09
The Cardinalised PHD Filter Sonar images
The Cardinalised PHD Filter
Large-scale multiple-target tracking with a small false alarm rate. Courtesy of Lockheed Martin.
The Cardinalised PHD Filter
Up to 1500 closely spaced targets tracked on a standard laptop! Performance is measured by the OSPA distance (which satisfies all the metric axioms) and combines per-target cardinality & state error. Courtesy of Lockheed Martin.
The PHD Filter in SLAM SLAM (Simultaneous Localisation and Mapping) Objective: Jointly estimate robot pose & map (set of landmarks)
The PHD Filter in SLAM
A (feature) map is a finite set of landmarks, so Bayesian SLAM requires modelling the map by an RFS. RFS-SLAM [Mullane et al. 08] propagates the joint density of the map and the robot pose: the prediction step applies the transition density (driven by the controls) under a set integral over the map, and the update step applies the measurement likelihood of the measurement set, again under a set integral.
The PHD Filter in SLAM
Mapping: the special case of SLAM with known robot poses. PHD approximation: propagate the 1st moment of the map RFS, i.e. the PHD of the posterior map RFS.
The PHD Filter in SLAM
PHD-SLAM (an approximation of the RFS-SLAM recursion):
• augment the landmarks with the vehicle pose;
• represent the set of augmented landmarks as a marked point process;
• propagate the PHD of the marked point process.
Experiment: Nanyang Technological University campus.
The PHD Filter in SLAM
Low clutter: all 3 algorithms can close the loop (ground truth plotted in green). Higher clutter: only PHD-SLAM can close the loop.
Concluding Remarks
• Random finite set filtering: borne out of practical & fundamental necessity; a significant theoretical extension of classical filtering; yields efficient algorithms such as the PHD filters.
• Beyond the PHD filters: multi-Bernoulli and Gauss-Poisson filters; filtering with image data; robustness; stochastic control.
For more info please see http://randomsets.ee.unimelb.edu.au
See also: http://www.ee.unimelb.edu.au/staff/bv/publications.html
Thank you!
Some References
Books
• D. Daley and D. Vere-Jones, An Introduction to the Theory of Point Processes, Springer-Verlag, 1988.
• D. Stoyan, D. Kendall, and J. Mecke, Stochastic Geometry and its Applications, John Wiley & Sons, 1995.
• I. Goodman, R. Mahler, and H. Nguyen, Mathematics of Data Fusion, Kluwer Academic Publishers, 1997.
• R. Mahler, Statistical Multisource-Multitarget Information Fusion, Artech House, 2007.
• M. Mallick, V. Krishnamurthy, and B.-N. Vo (eds), Advanced Topics and Applications in Integrated Tracking, Classification, and Sensor Management, IEEE-Wiley (under review).
Papers
• R. Mahler, "Multi-target Bayes filtering via first-order multi-target moments," IEEE Trans. AES, vol. 39, no. 4, pp. 1152-1178, 2003.
• B.-N. Vo, S. Singh, and A. Doucet, "Sequential Monte Carlo methods for multi-target filtering with random finite sets," IEEE Trans. AES, vol. 41, no. 4, pp. 1224-1245, 2005.
• B.-N. Vo and W. K. Ma, "The Gaussian mixture PHD filter," IEEE Trans. Signal Processing, vol. 54, no. 11, pp. 4091-4104, 2006.
• R. Mahler, "PHD filter of higher order in target number," IEEE Trans. AES, vol. 43, no. 4, pp. 1523-1543, 2007.
• B.-T. Vo, B.-N. Vo, and A. Cantoni, "Analytic implementations of the cardinalized probability hypothesis density filter," IEEE Trans. Signal Processing, vol. 55, no. 7, part 2, pp. 3553-3567, 2007.
• B.-T. Vo, B.-N. Vo, and A. Cantoni, "The cardinality balanced multi-target multi-Bernoulli filter and its implementations," IEEE Trans. Signal Processing, vol. 57, no. 2, pp. 409-423, 2009.
• J. Mullane, B.-N. Vo, M. Adams, and S. Wijesoma, "A random set formulation for Bayesian SLAM," International Conference on Intelligent Robots and Systems, Nice, France, 2008.
Collaborators (in no particular order)
Mahler R., Lockheed Martin; Singh S., Cambridge; Doucet A., U. British Columbia; Ma W.K., Chinese U. Hong Kong; Panta K., BAE Systems; Baddeley A., U. Western Australia; Clark D., Heriot-Watt U.; Vo B.T., U. Western Australia; Cantoni A., U. Western Australia; Pasha A., U. New South Wales; Tuan H.D., U. New South Wales; Zuyev S., U. Strathclyde; Mullane J., Nanyang Technological U.; Adams M., Nanyang Technological U.; Wijesoma S., Nanyang Technological U.; Schuhmacher D., U. Bern; Ristic B., DSTO Australia; Guern J., Lockheed Martin; Pham T., INRIA; Suter D., U. Adelaide