Random Set/Point Process in Multi-Target Tracking


1 Random Set/Point Process in Multi-Target Tracking
Ba-Ngu Vo, EEE Department, University of Melbourne, Australia. Collaborators (in no particular order): Mahler R., Singh S., Doucet A., Ma W.K., Panta K., Clark D., Vo B.T., Cantoni A., Pasha A., Tuan H.D., Baddeley A., Zuyev S., Schuhmacher D. SAMSI, RTP, NC, USA, 8 September 2008

2 Outline
The Bayes (single-target) filter
Multi-target tracking
System representation
Random finite set & Bayesian multi-target filtering
Tractable multi-target filters: Probability Hypothesis Density (PHD) filter, Cardinalized PHD filter, Multi-Bernoulli filter
Conclusions

3 The Bayes (single-target) Filter
The state vector x_k evolves in the state space (target motion x_{k-1} → x_k); the measurement z_k lives in the observation space.
System model: Markov transition density f_{k|k-1}(x_k | x_{k-1}); measurement likelihood g_k(z_k | x_k).
Objective: compute p_k(x_k | z_{1:k}), the posterior (filtering) pdf of the state given the measurement history z_{1:k} = (z_1, ..., z_k).

4 The Bayes (single-target) Filter
Prediction (Chapman-Kolmogorov): p_{k|k-1}(x_k | z_{1:k-1}) = ∫ f_{k|k-1}(x_k | x_{k-1}) p_{k-1}(x_{k-1} | z_{1:k-1}) dx_{k-1}
Data-update (Bayes rule): p_k(x_k | z_{1:k}) = g_k(z_k | x_k) p_{k|k-1}(x_k | z_{1:k-1}) / ∫ g_k(z_k | x) p_{k|k-1}(x | z_{1:k-1}) dx
Bayes filter: p_{k-1}(· | z_{1:k-1}) → prediction → p_{k|k-1}(· | z_{1:k-1}) → data-update → p_k(· | z_{1:k})

5 The Bayes (single-target) Filter
Bayes filter: p_{k-1}(· | z_{1:k-1}) → prediction → p_{k|k-1}(· | z_{1:k-1}) → data-update → p_k(· | z_{1:k})
Kalman filter (linear-Gaussian model): N(·; m_{k-1}, P_{k-1}) → N(·; m_{k|k-1}, P_{k|k-1}) → N(·; m_k, P_k)
Particle filter: {w_{k-1}^{(i)}, x_{k-1}^{(i)}}_{i=1}^N → {w_{k|k-1}^{(i)}, x_{k|k-1}^{(i)}}_{i=1}^N → {w_k^{(i)}, x_k^{(i)}}_{i=1}^N
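As a concrete illustration of the recursion above (a minimal sketch, not part of the original slides): the Kalman filter is the closed-form Bayes filter for the linear-Gaussian model. The matrices and numerical values below are assumed purely for the example.

    import numpy as np

    def kf_predict(m, P, F, Q):
        # Chapman-Kolmogorov step: N(.; m, P) -> N(.; m_pred, P_pred)
        m_pred = F @ m
        P_pred = F @ P @ F.T + Q
        return m_pred, P_pred

    def kf_update(m_pred, P_pred, z, H, R):
        # Bayes step: combine the predicted prior with g_k(z|x) = N(z; Hx, R)
        S = H @ P_pred @ H.T + R                 # innovation covariance
        K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
        m = m_pred + K @ (z - H @ m_pred)
        P = (np.eye(len(m)) - K @ H) @ P_pred
        return m, P

    # illustrative 1-D model (assumed values)
    F = np.array([[1.0]]); Q = np.array([[0.1]])
    H = np.array([[1.0]]); R = np.array([[1.0]])
    m, P = np.array([0.0]), np.array([[10.0]])
    for z in [np.array([1.2]), np.array([0.8]), np.array([1.1])]:
        m, P = kf_predict(m, P, F, Q)
        m, P = kf_update(m, P, z, H, R)

The particle filter replaces the Gaussian parameters with weighted samples and performs the same prediction/update steps by sampling and re-weighting.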

6 Multi-target tracking

7 Multi-target tracking
The multi-target state X_{k-1} → X_k (e.g. 5 targets, then 3 targets) evolves in the state space; the observations produced by the targets lie in the observation space.
Objective: jointly estimate the number and states of the targets.
Challenges: random number of targets and measurements; detection uncertainty, clutter, association uncertainty.

8 System Representation
How can we mathematically represent the multi-target state? Usual practice: stack the individual states into a large vector.
Problem: suppose the true multi-target state contains 2 targets and the estimated multi-target state contains the same 2 targets, but stacked in the opposite order. The estimate is correct, yet the vector estimation error is non-zero.
Remedy: use a finite-set representation of the multi-target state.

9 System Representation
True multi-target state: 1 target; estimated multi-target state: 2 targets.
True multi-target state: no target; estimated multi-target state: 2 targets.
What are the estimation errors in these cases?

10 System Representation
The error between the estimate and the true state (miss-distance) is fundamental in estimation, filtering and control. It is well understood for a single target (Euclidean distance, MSE, etc.); in the multi-target case it depends on the state representation.
For the multi-target state, the vector representation does not admit a multi-target miss-distance, whereas the finite-set representation does: a distance between two finite sets. The resulting "distance" is a metric on sets, not on vectors.

11 System Representation
Multi-target state X_{k-1} → X_k (e.g. 5 targets, then 3 targets); the targets produce observations in the observation space.
The number of measurements and their values are random, and the ordering of the measurements carries no information.
The multi-target measurement is therefore represented by a finite set.

12 RFS & Bayesian Multi-target Filtering
Reconceptualize multi-target tracking as a generalized single-target problem [Mahler 94]: the set of targets is the (set-valued) state X, and the set of observations is the (set-valued) observation Z.
Bayesian approach: model the state and the observation as random finite sets [Mahler 94].
Multi-target Bayes recursion: p_{k-1}(X_{k-1} | Z_{1:k-1}) → prediction → p_{k|k-1}(X_k | Z_{1:k-1}) → data-update → p_k(X_k | Z_{1:k})
This requires suitable notions of density and integration for set-valued random variables.

13 RFS & Bayesian Multi-target Filtering
A random finite set (random point pattern) Σ is a random variable whose values are finite subsets of the state space E.
Equivalently, it can be described as a point process, or random counting measure, N_Σ(S) = |Σ ∩ S|, the number of points of Σ falling in a region S ⊆ E.

14 RFS & Bayesian Multi-target Filtering
F(E): the collection of finite subsets of the state space E.
Probability distribution of Σ: P_Σ(T) = P(Σ ∈ T), T ⊆ F(E), with probability density p_Σ: F(E) → [0,∞) such that P_Σ(T) = ∫_T p_Σ(X) μ(dX) (conventional integral) [point process theory].
Belief "distribution" of Σ: β_Σ(T) = P(Σ ⊆ T), T ⊆ E, with belief "density" f_Σ: F(E) → [0,∞) such that β_Σ(T) = ∫_T f_Σ(X) δX (set integral) [Choquet 1968; Mahler's Finite Set Statistics 1994]. The two descriptions are related [Vo et al. 2005].
Individual-target models are defined on subsets of R^n, and deriving the multi-target probability distribution on the abstract Borel subsets of F(E) is not tractable, whereas the belief distribution on the closed subsets of E can be derived; Mahler's approach therefore offers a more tractable modelling alternative. The aim is for practising engineers to write down the belief distribution from the motion models of the individual targets and take a set derivative to obtain the multi-target transition density, and to write down the belief distribution from the sensor model and take a set derivative to obtain the multi-target likelihood.

15 Multi-target Motion Model
X_k = S_{k|k-1}(X_{k-1}) ∪ B_{k|k-1}(X_{k-1}) ∪ Γ_k
Each element x of the multi-object state X_{k-1} either dies or survives and moves (S_{k|k-1}), may spawn new objects (B_{k|k-1}), and new objects appear spontaneously (Γ_k).
This evolution defines the multi-object transition density f_{k|k-1}(X_k | X_{k-1}).

16 Multi-target Observation Model
Z_k = Θ_k(X_k) ∪ K_k(X_k)
Each element x of the multi-object state X_k is either detected, generating a measurement z in the observation space with likelihood g_k(z | x), or missed; K_k is the clutter (false-alarm) set.
This observation process defines the multi-object likelihood g_k(Z_k | X_k).
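To make the observation model concrete, here is a small hypothetical simulation (all parameter values are assumptions for illustration): each target generates a detection with probability p_D, the detections are corrupted by Gaussian noise, and Poisson-distributed clutter points are appended. The result is an unordered finite set of points.

    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_measurement_set(X, p_D=0.9, noise_std=1.0,
                                 clutter_mean=10, region=100.0):
        """X: list of 2-D target positions. Returns one realization of Z_k."""
        Z = []
        for x in X:                               # Theta_k(X_k): target detections
            if rng.random() < p_D:                # miss-detection with prob 1 - p_D
                Z.append(x + noise_std * rng.standard_normal(2))
        for _ in range(rng.poisson(clutter_mean)):  # K_k: Poisson clutter on [0, region]^2
            Z.append(region * rng.random(2))
        return [Z[i] for i in rng.permutation(len(Z))]  # ordering carries no information

    Z_k = simulate_measurement_set([np.array([20.0, 30.0]), np.array([70.0, 55.0])])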

17 Multi-target Bayes Filter
p_{k-1}(X_{k-1} | Z_{1:k-1}) → prediction → p_{k|k-1}(X_k | Z_{1:k-1}) → data-update → p_k(X_k | Z_{1:k})
Computationally intractable in general; no closed-form solution. Particle (SMC) implementations exist [Vo, Singh & Doucet 03, 05; Sidenbladh 03; Vihola 05; Ma et al. 06], but are restricted to a very small number of targets.

18 Particle Multi-target Bayes Filter
Algorithm:
for i = 1:N, % initialise
  Sample: ... ; Compute: ... ;
end; normalise weights;
for k = 1:kmax,
  for i = 1:N, % update
    Update: ... ;
  end;
  resample; MCMC step;
end;

19 The PHD Filter
Multi-target Bayes filter p_{k-1}(X_{k-1} | Z_{1:k-1}) → prediction → p_{k|k-1}(X_k | Z_{1:k-1}) → data-update → p_k(X_k | Z_{1:k}) is very expensive.
Single-object case: the state of the system is a random vector; single-object Bayes filter; first-moment filter (e.g. the alpha-beta-gamma filter).
Multi-object case: the state of the system is a random set; multi-object Bayes filter; first-moment filter (the "PHD" filter).

20 The Probability Hypothesis Density
v_Σ: the PHD (intensity function) of an RFS Σ.
v_Σ(x_0) = density of the expected number of objects at x_0.
∫_S v_Σ(x) dx = expected number of objects in the region S = E[N_Σ(S)], the mean of the random counting measure on S.
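A small Monte Carlo illustration of this property (assumed setup: a Poisson RFS on the unit square with constant intensity v(x) = 50): the average number of points falling in a region S matches ∫_S v(x) dx.

    import numpy as np

    rng = np.random.default_rng(1)
    intensity = 50.0                    # constant PHD v(x) on the unit square
    lo, hi = 0.2, 0.5                   # region S = [0.2, 0.5] x [0.2, 0.5]

    counts = []
    for _ in range(2000):
        n = rng.poisson(intensity)      # total number of points ~ Poisson(integral of v)
        pts = rng.random((n, 2))        # locations i.i.d. proportional to v (uniform here)
        in_S = np.all((pts >= lo) & (pts <= hi), axis=1)
        counts.append(in_S.sum())       # N_Sigma(S): counting measure of S

    print(np.mean(counts))              # close to 50 * 0.3 * 0.3 = 4.5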

21 The PHD Filter
Multi-object Bayes filter: p_{k-1}(X_{k-1} | Z_{1:k-1}) → prediction → p_{k|k-1}(X_k | Z_{1:k-1}) → update → p_k(X_k | Z_{1:k})
PHD filter: v_{k-1}(x_{k-1} | Z_{1:k-1}) → PHD prediction → v_{k|k-1}(x_k | Z_{1:k-1}) → PHD update → v_k(x_k | Z_{1:k})
Propagating the intensity over the state space instead of the full multi-target posterior avoids data association.

22 PHD Prediction
v_{k|k-1}(x_k | Z_{1:k-1}) = ∫ φ_{k|k-1}(x_k, x_{k-1}) v_{k-1}(x_{k-1} | Z_{1:k-1}) dx_{k-1} + γ_k(x_k)
where φ_{k|k-1}(x_k, x_{k-1}) = e_{k|k-1}(x_{k-1}) f_{k|k-1}(x_k | x_{k-1}) + b_{k|k-1}(x_k | x_{k-1}):
e_{k|k-1} is the probability of object survival, f_{k|k-1} the Markov transition density, b_{k|k-1} the term for objects spawned by existing objects (intensity of B_k(x_{k-1})), γ_k the term for spontaneous object births (intensity of Γ_k), and v_{k-1} the intensity from the previous time step.
Predicted expected number of objects: N_{k|k-1} = ∫ v_{k|k-1}(x | Z_{1:k-1}) dx.
Operator form: v_{k|k-1} = Φ_{k|k-1} v_{k-1}, with (Φ_{k|k-1} α)(x_k) = ∫ φ_{k|k-1}(x_k, x) α(x) dx + γ_k(x_k).
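A small worked illustration (values assumed, not from the slide): with a state-independent survival probability e_{k|k-1} = 0.99, no spawning, N_{k-1} = 5 and ∫ γ_k(x) dx = 0.2 expected births, the predicted expected number of objects is N_{k|k-1} = 0.99 × 5 + 0.2 = 5.15.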

23 PHD Update
v_k(x_k | Z_{1:k}) = [ 1 - p_{D,k}(x_k) + Σ_{z ∈ Z_k} p_{D,k}(x_k) g_k(z | x_k) / ( κ_k(z) + D_k(z) ) ] v_{k|k-1}(x_k | Z_{1:k-1})
where D_k(z) = ∫ p_{D,k}(x) g_k(z | x) v_{k|k-1}(x | Z_{1:k-1}) dx;
p_{D,k} is the probability of detection, g_k(z|x) the sensor likelihood function, κ_k the intensity of false alarms, and v_{k|k-1} the predicted intensity (from the previous step).
Expected number of objects: N_k = ∫ v_k(x | Z_{1:k}) dx.
Operator form: v_k = Ψ_k v_{k|k-1}, with (Ψ_k α)(x) = [ 1 - p_{D,k}(x) + Σ_{z ∈ Z_k} ψ_{k,z}(x) / ( κ_k(z) + <ψ_{k,z}, α> ) ] α(x) and ψ_{k,z}(x) = p_{D,k}(x) g_k(z | x).

24 Particle PHD filter
The PHD (intensity function) v_k is not a probability density, and the PHD propagation equation is not a standard Bayesian recursion; nevertheless a sequential MC implementation of the PHD filter exists [Vo, Singh & Doucet 03, 05], [Sidenbladh 03], [Mahler & Zajic 03]: a particle approximation of v_{k-1} is propagated to a particle approximation of v_k over the state space.
The particles must then be clustered to obtain multi-target state estimates.

25 Particle PHD filter
Algorithm: Initialise; for k = 1:kmax,
  for i = 1:Jk, Sample: ... ; compute: ... ; end;
  for i = Jk+1 : Jk+Lk-1, Sample: ... ; compute: ... ; end;
  for i = 1 : Jk+Lk-1, Update: ... ; end;
  Redistribute the total mass among Lk resampled particles;
end;
Convergence: [Vo, Singh & Doucet 05], [Clark & Bell 06], [Johansen et al. 06]
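A minimal sketch of the update step for a weighted-particle approximation of v_{k|k-1} (not the authors' implementation; it assumes a constant detection probability, a Gaussian likelihood with the state directly observed, and a constant clutter intensity). The expected number of targets is then estimated as the sum of the updated weights.

    import numpy as np
    from scipy.stats import multivariate_normal

    def phd_update(particles, weights, Z, p_D, R, clutter_intensity):
        """particles: (N, d) predicted states, weights: (N,) approximating v_{k|k-1}.
        Z: list of measurements (same dimension as the state). Returns updated weights."""
        # likelihoods g_k(z|x) for every measurement / particle pair, shape (|Z|, N)
        G = np.array([multivariate_normal.pdf(particles, mean=z, cov=R) for z in Z])
        # denominator: kappa_k(z) + integral of p_D g_k(z|x) v_{k|k-1}(x) dx
        denom = clutter_intensity + p_D * (G @ weights)
        new_weights = (1.0 - p_D) * weights                       # missed-detection term
        new_weights += weights * (p_D * G / denom[:, None]).sum(axis=0)
        return new_weights

    # N_k is estimated by new_weights.sum(); states by clustering the particles (e.g. k-means)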

26 Gaussian Mixture PHD filter
A closed-form solution to the PHD recursion exists for the linear-Gaussian multi-target model: a Gaussian-mixture prior intensity yields Gaussian-mixture posterior intensities at all subsequent times.
{w_{k-1}^{(i)}, m_{k-1}^{(i)}, P_{k-1}^{(i)}}_{i=1}^{J_{k-1}} → {w_{k|k-1}^{(i)}, m_{k|k-1}^{(i)}, P_{k|k-1}^{(i)}}_{i=1}^{J_{k|k-1}} → {w_k^{(i)}, m_k^{(i)}, P_k^{(i)}}_{i=1}^{J_k}
Gaussian Mixture (GM) PHD filter [Vo & Ma 05, 06]; Extended and Unscented Kalman PHD filters [Vo & Ma 06]; Jump Markov PHD filter [Pasha et al. 06]; track continuity [Clark et al. 06].

27 Cardinalised PHD Filter
Drawback of the PHD filter: high variance of the cardinality estimate.
The CPHD filter relaxes the Poisson assumption, allowing an arbitrary cardinality distribution, and jointly propagates the intensity function and the cardinality distribution:
cardinality: p_{k-1}(n | Z_{1:k-1}) → prediction → p_{k|k-1}(n | Z_{1:k-1}) → update → p_k(n | Z_{1:k})
intensity: v_{k-1}(x_{k-1} | Z_{1:k-1}) → prediction → v_{k|k-1}(x_k | Z_{1:k-1}) → update → v_k(x_k | Z_{1:k})
CPHD filter [Mahler 06, 07]. The update step is more complex than the PHD update (higher computational cost).

28 Gaussian Mixture CPHD Filter
A closed-form solution to the CPHD recursion exists for the linear-Gaussian multi-target model: a Gaussian-mixture prior intensity yields Gaussian-mixture posterior intensities at all subsequent times [Vo et al. 06, 07].
cardinality: {p_{k-1}(n)}_{n≥0} → prediction → {p_{k|k-1}(n)}_{n≥0} → update → {p_k(n)}_{n≥0}
intensity: {w_{k-1}^{(i)}, x_{k-1}^{(i)}}_{i=1}^{J_{k-1}} → prediction → {w_{k|k-1}^{(i)}, x_{k|k-1}^{(i)}}_{i=1}^{J_{k|k-1}} → update → {w_k^{(i)}, x_k^{(i)}}_{i=1}^{J_k}
The particle PHD filter can also be extended to the CPHD recursion: the particle CPHD filter [Vo 08].

29 CPHD filter Demonstration
GM-PHD filter vs GM-CPHD filter, 1000 MC trial average.
(Video: up to 10 targets are present at any one time; true target positions are shown by blue crosses and the measurements received by the filter by black crosses, so it cannot be distinguished which measurement came from which target; the filter estimates are shown in red, with a histogram of the posterior intensity illustrating the performance of the closed-form Gaussian-mixture implementation.)

30 CPHD filter Demonstration
Comparison with JPDA: linear dynamics, σv = 5, σh = 10, 4 targets, 1000 MC trial average.

31 CPHD filter Demonstration
Sonar images

32 MeMBer Filter
Multi-object Bayes filter: p_{k-1}(X_{k-1} | Z_{1:k-1}) → prediction → p_{k|k-1}(X_k | Z_{1:k-1}) → update → p_k(X_k | Z_{1:k})
Multi-Bernoulli filter: {(r_{k-1}^{(i)}, p_{k-1}^{(i)})}_{i=1}^{M_{k-1}} → prediction → {(r_{k|k-1}^{(i)}, p_{k|k-1}^{(i)})}_{i=1}^{M_{k|k-1}} → update → {(r_k^{(i)}, p_k^{(i)})}_{i=1}^{M_k}
Idea: approximate the predicted and posterior RFSs by multi-Bernoulli RFSs; valid for low clutter rate and high probability of detection.
Multi-target multi-Bernoulli (MeMBer) filter [Mahler 07]: biased in cardinality. Cardinality-Balanced MeMBer filter [Vo et al. 07]: unbiased.

33 Cardinality-Balanced MeMBer Filter
Prediction [Vo et al. 07]: {(r_{k-1}^{(i)}, p_{k-1}^{(i)})}_{i=1}^{M_{k-1}} → {(r_{P,k|k-1}^{(i)}, p_{P,k|k-1}^{(i)})}_{i=1}^{M_{k-1}} ∪ {(r_{Γ,k}^{(i)}, p_{Γ,k}^{(i)})}_{i=1}^{M_{Γ,k}}, where the second set is the term for object births and
r_{P,k|k-1}^{(i)} = r_{k-1}^{(i)} <p_{k-1}^{(i)}, p_{S,k}>
p_{P,k|k-1}^{(i)}(x) = <f_{k|k-1}(x | ·), p_{k-1}^{(i)} p_{S,k}> / <p_{k-1}^{(i)}, p_{S,k}>

34 Cardinality-Balanced MeMBer Filter
Update [Vo et al. 07]: {(r_{k|k-1}^{(i)}, p_{k|k-1}^{(i)})}_{i=1}^{M_{k|k-1}} → {(r_{L,k}^{(i)}, p_{L,k}^{(i)})}_{i=1}^{M_{k|k-1}} ∪ {(r_{U,k}(z), p_{U,k}(·; z))}_{z ∈ Z_k}
Legacy (missed-detection) tracks:
r_{L,k}^{(i)} = r_{k|k-1}^{(i)} (1 - <p_{k|k-1}^{(i)}, p_{D,k}>) / (1 - r_{k|k-1}^{(i)} <p_{k|k-1}^{(i)}, p_{D,k}>)
p_{L,k}^{(i)} = p_{k|k-1}^{(i)} (1 - p_{D,k}) / (1 - <p_{k|k-1}^{(i)}, p_{D,k}>)
Measurement-updated tracks, one per z ∈ Z_k:
r_{U,k}(z) = [ Σ_{i=1}^{M_{k|k-1}} r_{k|k-1}^{(i)} (1 - r_{k|k-1}^{(i)}) <p_{k|k-1}^{(i)}, p_{D,k} g_k(z|·)> / (1 - r_{k|k-1}^{(i)} <p_{k|k-1}^{(i)}, p_{D,k}>)^2 ] / [ κ_k(z) + Σ_{i=1}^{M_{k|k-1}} r_{k|k-1}^{(i)} <p_{k|k-1}^{(i)}, p_{D,k} g_k(z|·)> / (1 - r_{k|k-1}^{(i)} <p_{k|k-1}^{(i)}, p_{D,k}>) ]
p_{U,k}(x; z) ∝ Σ_{i=1}^{M_{k|k-1}} [ r_{k|k-1}^{(i)} / (1 - r_{k|k-1}^{(i)}) ] p_{k|k-1}^{(i)}(x) p_{D,k}(x) g_k(z|x)
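A compact sketch of the existence-probability part of this update, treating the inner products <p^{(i)}, p_D> and <p^{(i)}, p_D g(z|·)> as precomputed scalars (how they are computed depends on whether a Gaussian-mixture or particle representation is used); the function and variable names are my own, not from the slides.

    import numpy as np

    def cb_member_update_existence(r, pd_ip, psi_ip, kappa_z):
        """r: (M,) predicted existence probabilities r_{k|k-1}^{(i)}.
        pd_ip: (M,) inner products <p_{k|k-1}^{(i)}, p_{D,k}>.
        psi_ip: (M, |Z|) inner products <p_{k|k-1}^{(i)}, p_{D,k} g_k(z|.)>.
        kappa_z: (|Z|,) clutter intensity evaluated at each measurement."""
        # legacy (missed-detection) tracks
        r_L = r * (1.0 - pd_ip) / (1.0 - r * pd_ip)
        # measurement-updated tracks, one per z in Z_k
        num = ((r * (1.0 - r) / (1.0 - r * pd_ip) ** 2)[:, None] * psi_ip).sum(axis=0)
        den = kappa_z + ((r / (1.0 - r * pd_ip))[:, None] * psi_ip).sum(axis=0)
        r_U = num / den
        return r_L, r_U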

35 Cardinality-Balanced MeMBer Filter
Closed-form (Gaussian mixture) implementation [Vo et al. 07]: each density p^{(i)} is a Gaussian mixture {w^{(i,j)}, m^{(i,j)}, P^{(i,j)}}_{j=1}^{J^{(i)}} propagated through the prediction and update.
Particle implementation [Vo et al. 07]: each density p^{(i)} is represented by weighted particles {w^{(i,j)}, x^{(i,j)}}_{j=1}^{J^{(i)}}.
More useful than PHD filters in highly non-linear problems.

36 Performance comparison
Example: up to 10 targets on the scene, with births and deaths.
4-D states: x-y position and velocity; observations: x-y position; both linear-Gaussian.
Dynamics: constant-velocity model, σv = 5 m/s², survival probability p_{S,k} = 0.99.
Observations: additive Gaussian noise, σ = 10 m, detection probability p_{D,k} = 0.98, uniform Poisson clutter λ_c = 2.5×10⁻⁶ m⁻².
(Figure: target trajectories with start/end positions.)
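For reference, a sketch of the standard constant-velocity model matrices behind this kind of setup, assuming a sampling period T = 1 s (not stated on the slide) and reading σv = 5 and σ = 10 as the process- and measurement-noise standard deviations.

    import numpy as np

    T = 1.0                          # sampling period (assumed)
    sigma_v, sigma_eps = 5.0, 10.0

    # state x = [px, vx, py, vy]; constant-velocity dynamics per axis
    F1 = np.array([[1.0, T], [0.0, 1.0]])
    Q1 = sigma_v**2 * np.array([[T**4 / 4, T**3 / 2], [T**3 / 2, T**2]])
    F = np.kron(np.eye(2), F1)       # block-diagonal over x and y
    Q = np.kron(np.eye(2), Q1)

    # position-only observations
    H = np.kron(np.eye(2), np.array([[1.0, 0.0]]))
    R = sigma_eps**2 * np.eye(2)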

37 Gaussian implementation
(Plots: Cardinality-Balanced recursion vs Mahler's MeMBer recursion, 1000 MC trial average.)

38 Gaussian implementation
1000 MC trial average: the CPHD filter has better performance.

39 Particle implementation
1000 MC trial average: the CB-MeMBer filter has better performance.

40 Concluding Remarks
Random Finite Set framework: a rigorous formulation of Bayesian multi-target filtering that leads to efficient algorithms.
Future research directions: track-before-detect; performance measures for multi-object systems; numerical techniques for estimation of trajectories.
For more information please see the references below. Thank You!

41 References
D. Stoyan, D. Kendall, and J. Mecke, Stochastic Geometry and its Applications, John Wiley & Sons, 1995.
D. Daley and D. Vere-Jones, An Introduction to the Theory of Point Processes, Springer-Verlag, 1988.
I. Goodman, R. Mahler, and H. Nguyen, Mathematics of Data Fusion, Kluwer Academic Publishers, 1997.
R. Mahler, "An introduction to multisource-multitarget statistics and applications," Lockheed Martin Technical Monograph, 2000.
R. Mahler, "Multi-target Bayes filtering via first-order multi-target moments," IEEE Trans. AES, vol. 39, no. 4, pp. 1152-1178, 2003.
B. Vo, S. Singh, and A. Doucet, "Sequential Monte Carlo methods for multi-target filtering with random finite sets," IEEE Trans. AES, vol. 41, no. 4, pp. 1224-1245, 2005.
B. Vo and W. K. Ma, "The Gaussian mixture PHD filter," IEEE Trans. Signal Processing, vol. 54, no. 11, 2006.
R. Mahler, "A theory of PHD filters of higher order in target number," in I. Kadar (ed.), Signal Processing, Sensor Fusion, and Target Recognition XV, SPIE Defense & Security Symposium, Orlando, April 17-22, 2006.
B. T. Vo, B. Vo, and A. Cantoni, "Analytic implementations of the Cardinalized Probability Hypothesis Density filter," IEEE Trans. Signal Processing, vol. 55, no. 7, part 2, 2007.
D. Clark and J. Bell, "Convergence of the particle-PHD filter," IEEE Trans. Signal Processing, 2006.
A. Johansen, S. Singh, A. Doucet, and B. Vo, "Convergence of the SMC implementation of the PHD filter," Methodology and Computing in Applied Probability, 2006.
A. Pasha, B. Vo, H. D. Tuan, and W. K. Ma, "Closed-form solution to the PHD recursion for jump Markov linear models," FUSION, 2006.
D. Clark, K. Panta, and B. Vo, "Tracking multiple targets with the GM-PHD filter," FUSION, 2006.
B. T. Vo, B. Vo, and A. Cantoni, "On multi-Bernoulli approximation of the multi-target Bayes filter," ICIF, Xi'an, 2007.

42 Representation of Multi-target state
Optimal Subpattern Assignment (OSPA) metric [Schuhmacher et al. 08]: to compare a set X of m points with a set Y of n points (m ≤ n),
fill up X with n - m dummy points located at a distance greater than the cut-off c from any point in Y,
calculate the pth-order Wasserstein distance between the resulting sets (with point-to-point distances capped at c),
which can be computed efficiently using the Hungarian algorithm.
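A sketch of the OSPA distance as described above, using scipy's linear_sum_assignment for the Hungarian-algorithm step; c and p are the cut-off and order parameters, with values chosen only for illustration.

    import numpy as np
    from scipy.optimize import linear_sum_assignment

    def ospa(X, Y, c=100.0, p=1.0):
        """OSPA distance between two finite sets of state vectors (lists of equal-length arrays)."""
        m, n = len(X), len(Y)
        if m == 0 and n == 0:
            return 0.0
        if m == 0 or n == 0:
            return c
        X, Y = np.asarray(X, dtype=float), np.asarray(Y, dtype=float)
        if m > n:                      # ensure |X| <= |Y|
            X, Y, m, n = Y, X, n, m
        # cut-off distance matrix d^(c)(x, y) = min(c, ||x - y||)
        D = np.minimum(c, np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=2))
        row, col = linear_sum_assignment(D ** p)   # optimal (partial) assignment
        cost = (D[row, col] ** p).sum() + (c ** p) * (n - m)  # unassigned points cost c each
        return (cost / n) ** (1.0 / p)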

43 Gaussian Mixture PHD Prediction
Gaussian-mixture posterior intensity at time k-1: v_{k-1}(x) = Σ_{i=1}^{J_{k-1}} w_{k-1}^{(i)} N(x; m_{k-1}^{(i)}, P_{k-1}^{(i)})
Gaussian-mixture predicted intensity at time k:
v_{k|k-1}(x) = Φ_{k|k-1} v_{k-1}(x) = Σ_{i=1}^{J_{k-1}} [ p_{S,k} w_{k-1}^{(i)} N(x; m_{S,k|k-1}^{(i)}, P_{S,k|k-1}^{(i)}) + Σ_{l=1}^{J_{b,k}} w_{k-1}^{(i)} w_{b,k}^{(l)} N(x; m_{b,k|k-1}^{(i,l)}, P_{b,k|k-1}^{(i,l)}) ] + γ_k(x)
Survival components: m_{S,k|k-1}^{(i)} = F_{k-1} m_{k-1}^{(i)},  P_{S,k|k-1}^{(i)} = F_{k-1} P_{k-1}^{(i)} F_{k-1}^T + Q_{k-1}
Spawn components: m_{b,k|k-1}^{(i,l)} = F_{b,k-1}^{(l)} m_{k-1}^{(i)} + d_{b,k-1}^{(l)},  P_{b,k|k-1}^{(i,l)} = F_{b,k-1}^{(l)} P_{k-1}^{(i)} (F_{b,k-1}^{(l)})^T + Q_{b,k-1}^{(l)}
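A sketch of the survival part of this prediction for a list of Gaussian components (spawn components and the birth term γ_k would simply be appended as further components); p_S is taken as state-independent, as in the linear-Gaussian model above.

    import numpy as np

    def gm_phd_predict_survival(weights, means, covs, F, Q, p_S):
        """Propagate each Gaussian component of v_{k-1} through the dynamics."""
        w_pred = [p_S * w for w in weights]
        m_pred = [F @ m for m in means]
        P_pred = [F @ P @ F.T + Q for P in covs]
        return w_pred, m_pred, P_pred

    # birth components {(w_gamma, m_gamma, P_gamma)} are appended afterwards:
    # w_pred += w_gamma; m_pred += m_gamma; P_pred += P_gamma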

44 Gaussian Mixture PHD Update
Gaussian-mixture predicted intensity at time k: v_{k|k-1}(x) = Σ_{i=1}^{J_{k|k-1}} w_{k|k-1}^{(i)} N(x; m_{k|k-1}^{(i)}, P_{k|k-1}^{(i)})
Gaussian-mixture updated intensity at time k:
v_k(x) = Ψ_k v_{k|k-1}(x) = (1 - p_{D,k}) v_{k|k-1}(x) + Σ_{z ∈ Z_k} Σ_{i=1}^{J_{k|k-1}} w_k^{(i)}(z) N(x; m_{k|k}^{(i)}(z), P_{k|k}^{(i)})
w_k^{(i)}(z) = p_{D,k} w_{k|k-1}^{(i)} q_k^{(i)}(z) / ( κ_k(z) + p_{D,k} Σ_{j=1}^{J_{k|k-1}} w_{k|k-1}^{(j)} q_k^{(j)}(z) )
q_k^{(i)}(z) = N(z; H_k m_{k|k-1}^{(i)}, H_k P_{k|k-1}^{(i)} H_k^T + R_k)
m_{k|k}^{(i)}(z) = m_{k|k-1}^{(i)} + K_k^{(i)} (z - H_k m_{k|k-1}^{(i)}),  P_{k|k}^{(i)} = (I - K_k^{(i)} H_k) P_{k|k-1}^{(i)},  K_k^{(i)} = P_{k|k-1}^{(i)} H_k^T (H_k P_{k|k-1}^{(i)} H_k^T + R_k)^{-1}
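A sketch of this update, assuming state-independent p_D and a constant clutter intensity κ: each predicted component contributes one updated component per measurement, plus the scaled missed-detection copy of the predicted mixture. Component pruning/merging, used in practice, is omitted.

    import numpy as np
    from scipy.stats import multivariate_normal

    def gm_phd_update(w, m, P, Z, H, R, p_D, kappa):
        """w, m, P: lists describing v_{k|k-1}. Z: list of measurements."""
        # missed-detection components: (1 - p_D) * v_{k|k-1}
        w_out = [(1.0 - p_D) * wi for wi in w]
        m_out = list(m)
        P_out = list(P)
        for z in Z:
            w_z, comp = [], []
            for wi, mi, Pi in zip(w, m, P):
                S = H @ Pi @ H.T + R
                q = multivariate_normal.pdf(z, mean=H @ mi, cov=S)    # q_k^{(i)}(z)
                K = Pi @ H.T @ np.linalg.inv(S)
                comp.append((mi + K @ (z - H @ mi), (np.eye(len(mi)) - K @ H) @ Pi))
                w_z.append(p_D * wi * q)
            norm = kappa + sum(w_z)              # kappa_k(z) + p_D * sum_j w^{(j)} q^{(j)}(z)
            for wz, (mz, Pz) in zip(w_z, comp):
                w_out.append(wz / norm)
                m_out.append(mz)
                P_out.append(Pz)
        return w_out, m_out, P_out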

45 Cardinalised PHD Prediction
Predicted cardinality distribution:
p_{k|k-1}(n) = Σ_{j=0}^{n} p_{Γ,k}(n - j) Π_{k|k-1}[v_{k-1}, p_{k-1}](j)
where p_{Γ,k}(n - j) is the probability of n - j spontaneous births and the probability of j surviving targets is
Π_{k|k-1}[v, p](j) = Σ_{l=j}^{∞} C(l, j) <p_{S,k}, v>^j <1 - p_{S,k}, v>^{l-j} p(l) / <1, v>^l
Predicted intensity:
v_{k|k-1}(x_k) = ∫ p_{S,k}(x_{k-1}) f_{k|k-1}(x_k | x_{k-1}) v_{k-1}(x_{k-1}) dx_{k-1} + γ_k(x_k)
with p_{S,k} the probability of survival, f_{k|k-1} the Markov transition density, v_{k-1} the intensity from the previous time step, and γ_k the intensity of spontaneous births (intensity of Γ_k).
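A sketch of the cardinality prediction under the simplifying assumption of a constant survival probability p_S (so <p_{S,k}, v>/<1, v> = p_S): binomial thinning of the surviving targets followed by convolution with the birth cardinality distribution. Truncation to a maximum cardinality is an implementation choice.

    import numpy as np
    from scipy.stats import binom

    def cphd_predict_cardinality(p_prev, p_birth, p_S):
        """p_prev[l] = p_{k-1}(l), p_birth[j] = p_{Gamma,k}(j); constant survival probability p_S."""
        n_max = len(p_prev) - 1
        # survivors: Pi_{k|k-1}(j) = sum_l C(l, j) p_S^j (1 - p_S)^(l-j) p_{k-1}(l)
        p_surv = np.zeros(n_max + 1)
        for l, pl in enumerate(p_prev):
            p_surv[: l + 1] += pl * binom.pmf(np.arange(l + 1), l, p_S)
        # independent births: convolution of the two cardinality distributions, truncated
        return np.convolve(p_surv, p_birth)[: n_max + 1]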

46 Cardinalised PHD Update
Updated cardinality distribution:
p_k(n) = Υ_k^0[v_{k|k-1}; Z_k](n) p_{k|k-1}(n) / <Υ_k^0[v_{k|k-1}; Z_k], p_{k|k-1}>
Updated intensity:
v_k(x_k) = v_{k|k-1}(x_k) [ (1 - p_{D,k}(x_k)) <Υ_k^1[v_{k|k-1}; Z_k], p_{k|k-1}> / <Υ_k^0[v_{k|k-1}; Z_k], p_{k|k-1}> + Σ_{z ∈ Z_k} ψ_{k,z}(x_k) <Υ_k^1[v_{k|k-1}; Z_k \ {z}], p_{k|k-1}> / <Υ_k^0[v_{k|k-1}; Z_k], p_{k|k-1}> ]
where
Υ_k^u[v; Z](n) = Σ_{j=0}^{min(|Z|, n)} (|Z| - j)! p_{K,k}(|Z| - j) P^n_{j+u} ( <1 - p_{D,k}, v>^{n-(j+u)} / <1, v>^n ) esf_j({ <v, ψ_{k,z}> : z ∈ Z })
ψ_{k,z}(x_k) = ( <1, κ_k> / κ_k(z) ) g_k(z | x_k) p_{D,k}(x_k)   (likelihood function, probability of detection, clutter intensity)
esf_j(Z) = Σ_{S ⊆ Z, |S| = j} Π_{ζ ∈ S} ζ   (elementary symmetric function)
p_{K,k} is the clutter cardinality distribution and P^n_j = n!/(n - j)! the permutation coefficient.
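The elementary symmetric functions esf_j appearing above can be computed in O(|Z|^2) as the coefficients of the polynomial Π_z (1 + ζ_z x); a small helper of my own, not from the slides.

    import numpy as np

    def esf(values):
        """Elementary symmetric functions e_0, ..., e_n of the given values,
        read off as the coefficients of prod_i (1 + values[i] * x)."""
        coeffs = np.array([1.0])                     # the polynomial "1"
        for v in values:
            coeffs = np.convolve(coeffs, [1.0, v])   # multiply by (1 + v x)
        return coeffs                                # coeffs[j] = esf_j(values)

    # e.g. esf([a, b, c]) -> [1, a+b+c, ab+ac+bc, abc]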

47 Mahler's MeMBer Filter
Multi-object Bayes filter: p_{k-1}(X_{k-1} | Z_{1:k-1}) → prediction → p_{k|k-1}(X_k | Z_{1:k-1}) → update → p_k(X_k | Z_{1:k})
Multi-Bernoulli filter: {(r_{k-1}^{(i)}, p_{k-1}^{(i)})}_{i=1}^{M_{k-1}} → prediction → {(r_{k|k-1}^{(i)}, p_{k|k-1}^{(i)})}_{i=1}^{M_{k|k-1}} → update → {(r_k^{(i)}, p_k^{(i)})}_{i=1}^{M_k}
Multi-target multi-Bernoulli (MeMBer) filter [Mahler 07]: approximate the predicted and posterior RFSs by multi-Bernoulli RFSs; valid for low clutter rate and high probability of detection; biased in cardinality (except when the probability of detection equals 1).

48 Cardinality-Balanced MeMBer Filter
Update [Vo et al. 07], written via auxiliary intensities: {(r_{k|k-1}^{(i)}, p_{k|k-1}^{(i)})}_{i=1}^{M_{k|k-1}} → {(r_{L,k}^{(i)}, p_{L,k}^{(i)})}_{i=1}^{M_{k|k-1}} ∪ {(r_{U,k}(z), p_{U,k}(·; z))}_{z ∈ Z_k}
Legacy tracks:
r_{L,k}^{(i)} = r_{k|k-1}^{(i)} (1 - <p_{k|k-1}^{(i)}, p_{D,k}>) / (1 - r_{k|k-1}^{(i)} <p_{k|k-1}^{(i)}, p_{D,k}>),  p_{L,k}^{(i)} = p_{k|k-1}^{(i)} (1 - p_{D,k}) / (1 - <p_{k|k-1}^{(i)}, p_{D,k}>)
Measurement-updated tracks:
r_{U,k}(z) = <v_{k|k-1}^{(1)}, p_{D,k} g_k(z|·)> / ( κ_k(z) + <ṽ_{k|k-1}, p_{D,k} g_k(z|·)> ),  p_{U,k}(x; z) = ṽ*_{k|k-1}(x) p_{D,k}(x) g_k(z|x) / <ṽ*_{k|k-1}, p_{D,k} g_k(z|·)>
with the auxiliary intensities
v_{k|k-1}^{(1)} = Σ_{i=1}^{M_{k|k-1}} r_{k|k-1}^{(i)} (1 - r_{k|k-1}^{(i)}) p_{k|k-1}^{(i)} / (1 - r_{k|k-1}^{(i)} <p_{k|k-1}^{(i)}, p_{D,k}>)^2
ṽ_{k|k-1} = Σ_{i=1}^{M_{k|k-1}} r_{k|k-1}^{(i)} p_{k|k-1}^{(i)} / (1 - r_{k|k-1}^{(i)} <p_{k|k-1}^{(i)}, p_{D,k}>)
ṽ*_{k|k-1} = Σ_{i=1}^{M_{k|k-1}} r_{k|k-1}^{(i)} p_{k|k-1}^{(i)} / (1 - r_{k|k-1}^{(i)})

49 Extensions of the PHD filter
Linear Jump Markov PHD filter [Pasha et al. 06]

50 Extensions of the PHD filter
Example: 4-D linear jump-Markov target dynamics with 3 models; 4 targets; birth rate = 3×0.05; death probability = 0.01; clutter rate = 40.

51 What is a Random Finite Set (RFS)?
Examples of point patterns: pine saplings in a Finnish forest [Kelomaki & Penttinen]; childhood leukaemia and lymphoma in North Humberside [Cuzick & Edwards].
The number of points is random, and the points themselves are random and have no ordering.
Loosely, an RFS is a finite-set-valued random variable, also known as a (simple finite) point process or random point pattern.

