1 Visual Tracking by Cluster Analysis Arthur Pece Department of Computer Science University of Copenhagen aecp@diku.dk

2 Tracking by image differencing The problem: detecting and tracking moving objects, with no prior knowledge about the objects or the camera, except that the camera is fixed.

3 Image differencing [see movie] Non-zero grey-level differences appear all over the image; moving targets generate clusters that are not compact; and there is no well-defined threshold to discriminate between targets and background.
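
A minimal sketch of the differencing step, assuming 8-bit greyscale frames held as NumPy arrays; the array contents and the threshold of 20 are illustrative, not taken from the talk:

```python
import numpy as np

def grey_level_difference(frame: np.ndarray, reference: np.ndarray) -> np.ndarray:
    # Signed difference; int16 avoids the wrap-around of uint8 subtraction.
    return frame.astype(np.int16) - reference.astype(np.int16)

# Toy 8-bit frames standing in for a real sequence: even with a static scene,
# camera noise makes the difference non-zero all over the image.
rng = np.random.default_rng(0)
reference = rng.integers(0, 256, (240, 320), dtype=np.uint8)
frame = np.clip(reference.astype(np.int16) + rng.integers(-5, 6, (240, 320)),
                0, 255).astype(np.uint8)

diff = grey_level_difference(frame, reference)
# Any fixed threshold both keeps noise pixels and drops target pixels,
# which is the problem described above.
candidate = np.abs(diff) > 20
```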

4 Usual strategy A sequence of algorithms is applied to the difference image: thresholding; morphological operators; connected-component analysis; feature detection; and matching from frame to frame or Kalman filtering.
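
A sketch of that conventional pipeline, assuming OpenCV (`cv2`) and an absolute-difference image; all thresholds and kernel sizes are arbitrary placeholders:

```python
import numpy as np
import cv2

# Toy absolute-difference image standing in for |current frame - reference|.
abs_diff = np.random.randint(0, 40, (240, 320)).astype(np.uint8)

# 1. Thresholding.
_, mask = cv2.threshold(abs_diff, 20, 255, cv2.THRESH_BINARY)

# 2. Morphological operators: opening removes isolated pixels, closing fills holes.
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)

# 3. Connected-component analysis: each component is a candidate target,
#    to be matched from frame to frame or fed to a Kalman filter.
n_labels, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
```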

5 An alternative: cluster tracking Find clusters of pixels significantly different from the reference image; optimize the cluster parameters with the EM clustering algorithm; remove, merge or split clusters on the basis of the cluster parameters. None of the operations on the previous slide is required (except, possibly, Kalman filtering).

6 Generative model The camera is looking at an unknown number of moving objects, each of which generates a cluster of pixels in the image. Moving objects generate clusters with bivariate Gaussian distributions in image coordinates. There is no correlation between the colors of the moving objects and the colors of the background, so grey-level differences in the moving clusters have a uniform distribution.

7 Background model For consistency, the background is also considered a generator of pixels. The pixels originating from the background have a uniform distribution over the image. Grey-level differences in the background are likely (but not certain) to be close to zero: a double-exponential distribution gives a good fit.

8 Generative model (re-phrased) The image is generated by a density mixture, in which each pixel is a data point in 3D (2 image coordinates plus grey level). The background is a cluster with a uniform distribution in image coordinates and a Laplacian (2-sided exponential) distribution in grey-level differences. Moving objects are clusters with bivariate Gaussian distributions in image coordinates and a uniform distribution in grey-level differences.
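
One way to write this mixture down, in my own notation (not the talk's): x for image coordinates, d for the grey-level difference, A for the image area, R for the range of grey-level differences, beta for the exponential parameter, mu_k and Sigma_k for the mean and covariance of moving cluster k, and pi_k for the mixing proportions of the background (k = 0) and the K moving clusters:

```latex
p(\mathbf{x}, d) =
\underbrace{\pi_0 \, \tfrac{1}{A} \, \tfrac{\beta}{2} \, e^{-\beta |d|}}_{\text{background}}
\;+\;
\underbrace{\sum_{k=1}^{K} \pi_k \, \mathcal{N}(\mathbf{x};\, \boldsymbol{\mu}_k, \boldsymbol{\Sigma}_k) \, \tfrac{1}{R}}_{\text{moving clusters}}
```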

9 A 2-D example

10 Background adaptation The background image, subtracted from the current image, is not constant. If it were known which pixels belong to the background, it would be possible to update their grey levels by low-pass filtering. Since we only have probabilities of pixels belonging to the background, we use these probabilities.

11 Background adaptation Each reference pixel value is a low-pass version of the corresponding pixel in the image sequence. The pixel is updated in proportion to the probability that it is a background pixel, i.e. the time constant of adaptation is inversely proportional to the probability that the pixel originates from the background.
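
A minimal sketch of such a probability-weighted update, assuming a base adaptation rate `alpha` and a per-pixel background probability `p_background` produced by the E-step; all names and values are illustrative:

```python
import numpy as np

def update_background(reference: np.ndarray,
                      frame: np.ndarray,
                      p_background: np.ndarray,
                      alpha: float = 0.05) -> np.ndarray:
    """Low-pass update of the reference image, weighted per pixel by the
    probability that the pixel originates from the background; pixels that
    almost certainly belong to a moving cluster are left nearly unchanged."""
    rate = alpha * p_background      # effective time constant ~ 1 / (alpha * p_background)
    return (1.0 - rate) * reference + rate * frame

# Toy data (illustrative only).
reference = np.full((240, 320), 128.0)
frame = reference + np.random.randn(240, 320)
p_background = np.random.uniform(0.0, 1.0, (240, 320))
reference = update_background(reference, frame, p_background)
```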

12 Cluster merging Two clusters are merged if the expected decrease of the log-likelihood after merging is less than a threshold. For computational convenience, the decrease of log-likelihood is assumed to be equal to the decrease of the EM functional.

13 Cluster merging (continued) The expected change of the EM functional after merging depends on: the number of pixels in the 2 clusters; the similarity in densities between the 2 clusters; the distance between the 2 clusters; the similarity in shape of the 2 clusters.
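
A rough sketch of a merge test in this spirit, assuming the expected decrease is approximated by the drop in Gaussian log-likelihood when the two clusters' pixels are re-explained by a single moment-matched Gaussian; the approximation and the helper names are mine, not taken from the talk:

```python
import numpy as np
from scipy.stats import multivariate_normal

def merged_moments(n1, mu1, cov1, n2, mu2, cov2):
    """Moment-matched Gaussian for the union of two weighted clusters."""
    n = n1 + n2
    mu = (n1 * mu1 + n2 * mu2) / n
    cov = (n1 * (cov1 + np.outer(mu1 - mu, mu1 - mu)) +
           n2 * (cov2 + np.outer(mu2 - mu, mu2 - mu))) / n
    return n, mu, cov

def should_merge(pix1: np.ndarray, pix2: np.ndarray, threshold: float) -> bool:
    """Merge if replacing the two clusters by one moment-matched cluster
    lowers the total log-likelihood by less than `threshold`."""
    mu1, cov1 = pix1.mean(0), np.cov(pix1.T)
    mu2, cov2 = pix2.mean(0), np.cov(pix2.T)
    _, mu, cov = merged_moments(len(pix1), mu1, cov1, len(pix2), mu2, cov2)
    ll_separate = (multivariate_normal.logpdf(pix1, mu1, cov1).sum() +
                   multivariate_normal.logpdf(pix2, mu2, cov2).sum())
    ll_merged = multivariate_normal.logpdf(np.vstack([pix1, pix2]), mu, cov).sum()
    # The drop grows with cluster size, separation and shape mismatch,
    # the factors listed on the slide.
    return ll_separate - ll_merged < threshold
```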

14 Example: too far for merging

15 Example: close enough for merging

16 The 2 clusters do not fit into a single ellipse

17 The 2 clusters fit into a single ellipse

18 Cluster removal A moving cluster is removed if the expected decrease of the EM functional after merging it into the background is less than a threshold. The expected change of the EM functional depends on the parameters of the moving cluster: the number of pixels originating from the cluster; the density of pixels in the Gaussian ellipse; the average grey-level difference.
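
A sketch of the removal test under a similar approximation: compare the pixels' log-likelihood under the moving cluster (Gaussian in position, uniform in grey-level difference) with their log-likelihood under the background (uniform in position, Laplacian in grey-level difference). The constants and the use of plain log-likelihood in place of the EM functional are my own simplifications:

```python
import numpy as np
from scipy.stats import multivariate_normal

def should_remove(pixels_xy, pixels_d, mu, cov, image_area,
                  grey_range=511.0, beta=0.1, threshold=0.0):
    """Remove the cluster if merging it into the background costs less than
    `threshold` in log-likelihood (a stand-in for the EM-functional criterion)."""
    # Moving cluster: Gaussian in image coordinates, uniform over grey-level differences.
    ll_cluster = (multivariate_normal.logpdf(pixels_xy, mu, cov)
                  - np.log(grey_range)).sum()
    # Background: uniform over the image, Laplacian in grey-level differences.
    ll_background = (-np.log(image_area)
                     + np.log(beta / 2.0) - beta * np.abs(pixels_d)).sum()
    return ll_cluster - ll_background < threshold
```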

19 Cluster generation If there are many pixels close together that are unlikely to be generated by any of the existing clusters (including the background cluster), a new cluster is generated.
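
A sketch of one way to seed such clusters, assuming a boolean mask of "unlikely" pixels has already been computed from the mixture densities; the blob-size threshold and the use of connected components are my own choices:

```python
import numpy as np
from scipy import ndimage

def seed_new_clusters(unlikely_mask: np.ndarray, min_pixels: int = 50):
    """Seed a new Gaussian cluster from each sufficiently large blob of pixels
    that none of the existing clusters (background included) explains well."""
    labels, n_blobs = ndimage.label(unlikely_mask)
    seeds = []
    for k in range(1, n_blobs + 1):
        ys, xs = np.nonzero(labels == k)
        if len(xs) < min_pixels:
            continue
        pts = np.stack([xs, ys], axis=1).astype(float)
        seeds.append((pts.mean(0), np.cov(pts.T)))   # initial mean and covariance
    return seeds
```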

20 Cluster splitting A cluster is split if the distribution of pixels belonging to the cluster differs significantly from the expected distribution in any of its sections.

21 Iterating over an image sequence For each frame: all clusters are initialized with the same parameters as in the previous frame; next, the image is inspected for generation of new clusters; clusters are tested for splitting; the EM clustering algorithm is applied to update the cluster parameters; clusters are tested for merging or removal; finally, the background is updated.
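
A skeleton of that per-frame loop. Every helper below is a no-op placeholder standing for the corresponding step described above, and the cluster list carried over from the previous frame provides the initialization mentioned first:

```python
import numpy as np

# Placeholder steps (hypothetical names, trivial bodies).
def generate_new_clusters(diff, clusters, params): return []
def split_clusters(clusters, diff, params): return clusters
def run_em(clusters, diff, params): return clusters, np.ones(diff.shape)
def merge_or_remove_clusters(clusters, diff, params): return clusters
def update_background(reference, frame, p_bg, alpha=0.05):
    return (1.0 - alpha * p_bg) * reference + alpha * p_bg * frame

def process_frame(frame, reference, clusters, params=None):
    """One iteration of the tracker, following the order of steps on the slide."""
    diff = frame.astype(float) - reference                      # grey-level differences
    clusters = clusters + generate_new_clusters(diff, clusters, params)
    clusters = split_clusters(clusters, diff, params)
    clusters, p_background = run_em(clusters, diff, params)     # EM parameter update
    clusters = merge_or_remove_clusters(clusters, diff, params)
    reference = update_background(reference, frame, p_background)
    return reference, clusters
```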

22 Cluster parameters For the target clusters: center and covariance of the Gaussian ellipsoid. For the background cluster: background image and exponential parameter of the distribution of grey-level differences. In addition: the number of clusters and the number of pixels for each cluster.
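
This state could be captured in a couple of small records; the layout below only mirrors the list above and is not a data structure from the talk:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class TargetCluster:
    mean: np.ndarray        # centre of the Gaussian ellipse, shape (2,)
    covariance: np.ndarray  # covariance in image coordinates, shape (2, 2)
    n_pixels: float         # expected number of pixels assigned to this cluster

@dataclass
class BackgroundCluster:
    reference: np.ndarray   # background (reference) image
    beta: float             # exponential parameter of the grey-level-difference law
    n_pixels: float         # expected number of background pixels
```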

23 Parameters of the method Thresholds for the generation, splitting and removal/merging of clusters; initial estimate of the covariance of a new cluster; time constant of local background adaptation; convergence criterion for EM.
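
These could be collected in one configuration record; the field names and default values below are placeholders, not values used in the talk:

```python
from dataclasses import dataclass

@dataclass
class TrackerParams:
    generation_threshold: float = 50.0      # new-cluster generation
    split_threshold: float = 10.0           # cluster splitting
    merge_removal_threshold: float = 10.0   # merging / removal
    initial_covariance: float = 100.0       # isotropic covariance of a new cluster
    background_alpha: float = 0.05          # time constant of background adaptation
    em_tolerance: float = 1e-3              # convergence criterion for EM
```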

24 PETS 2000 test sequence

25 PETS 2001 test sequence 3:1

26 Conclusions The method is limited by occlusion. Merging of clusters can be reduced by learning the grey-level distribution for each target cluster, but not by much (PETS 2001). Deciding which pixel belongs to which cluster is never necessary.

