Kernel Similarity Modeling of Texture Pattern Flow for Motion Detection in Complex Background IEEE Transactions on Circuits and Systems for Video Technology (CSVT), 2011 Baochang Zhang, Yongsheng Gao, Sanqiang Zhao, Bineng Zhong
Outline TPF Operator, Kernel Similarity Modeling, Experimental Results, Conclusion
TPF Operator-Spatial
TPF Operator-Temporal The temporal derivative is defined by a match test against the background distributions: a pixel value lying within 2.5 standard deviations of a distribution is defined as a match
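The 2.5-standard-deviation match test above can be sketched as follows (a minimal sketch; the function name and per-pixel Gaussian parameters are illustrative assumptions):

```python
def temporal_match(pixel, mu, sigma, k=2.5):
    """Return True if the pixel value lies within k standard deviations
    of the background Gaussian distribution (mean mu, std sigma)."""
    return abs(float(pixel) - mu) <= k * sigma
```

A matching pixel is treated as consistent with the background model; a non-match signals temporal change at that pixel.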
TPF Operator By integrating both spatial and temporal information, the TPF is defined over the derivative responses in both domains. TPF reveals the relationship between derivative directions in the spatial and temporal domains
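One way to picture the integration of spatial and temporal derivative directions is a small binary code per pixel, concatenating the signs of the x, y, and t derivatives. This is an illustrative sketch of the idea, not the paper's exact TPF formula:

```python
def tpf_bits(frame, prev_frame, y, x):
    """Illustrative TPF-style code for one pixel: binarise the signs of
    the horizontal, vertical, and temporal derivatives and pack them
    into a 3-bit pattern in [0, 7]."""
    dx = int(frame[y][x + 1]) - int(frame[y][x])       # horizontal derivative
    dy = int(frame[y + 1][x]) - int(frame[y][x])       # vertical derivative
    dt = int(frame[y][x]) - int(prev_frame[y][x])      # temporal derivative
    bits = (dx >= 0, dy >= 0, dt >= 0)                 # derivative directions
    return sum(int(b) << i for i, b in enumerate(bits))
```

Each code value then serves as a bin index when histograms of TPF patterns are accumulated over a local region.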
Flowchart for one pixel
Integral Histogram
Integral Histogram of TPF Using a neighborhood region provides a degree of robustness against noise. However, when the local region is too large, more local detail is lost
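An integral histogram lets the histogram of any rectangular neighborhood be read off in constant time per bin, which is what makes per-pixel local histograms affordable. A minimal sketch (function names assumed):

```python
import numpy as np

def integral_histogram(codes, n_bins):
    """Integral histogram over a 2-D map of discrete codes:
    ih[y, x, b] counts pixels with code b in the rectangle codes[:y, :x]."""
    h, w = codes.shape
    ih = np.zeros((h + 1, w + 1, n_bins), dtype=np.int32)
    onehot = np.eye(n_bins, dtype=np.int32)[codes]      # (h, w, n_bins)
    ih[1:, 1:] = onehot.cumsum(axis=0).cumsum(axis=1)
    return ih

def region_histogram(ih, y0, x0, y1, x1):
    """Histogram of rectangle [y0, y1) x [x0, x1) via four lookups."""
    return ih[y1, x1] - ih[y0, x1] - ih[y1, x0] + ih[y0, x0]
```

With this table, sliding the neighborhood window over the image costs four array lookups per position instead of re-scanning the region.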
Building Background Model Use a GMM to model the background. If a match is found for the pixel, update the mean and variance of the matched Gaussian distribution. If none of the K Gaussian distributions matches the current pixel value, the least probable distribution is replaced with a new distribution whose mean is the current pixel value
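The update rule on this slide follows the standard Stauffer-Grimson scheme; a simplified per-pixel sketch (learning rate, initial variance, and initial weight are assumed values):

```python
def update_gmm(x, means, variances, weights, alpha=0.01, match_k=2.5):
    """Simplified Stauffer-Grimson-style update for one pixel's
    K-component Gaussian mixture. Mutates the lists in place."""
    matched = None
    for i, (mu, var) in enumerate(zip(means, variances)):
        if abs(x - mu) <= match_k * var ** 0.5:   # 2.5-sigma match test
            matched = i
            break
    if matched is None:
        # No match: replace the least probable (lowest-weight) component
        matched = min(range(len(weights)), key=lambda j: weights[j])
        means[matched], variances[matched] = float(x), 900.0  # wide new Gaussian
    else:
        rho = alpha  # learning rate for the matched component
        means[matched] += rho * (x - means[matched])
        variances[matched] += rho * ((x - means[matched]) ** 2 - variances[matched])
    # Decay all weights, reinforce the matched one, renormalise
    weights[:] = [(1 - alpha) * w for w in weights]
    weights[matched] += alpha
    s = sum(weights)
    weights[:] = [w / s for w in weights]
```

In the full model each pixel keeps K such components, sorted by weight, and the most probable ones form the background.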
Kernel Similarity Measurement We use k to denote the kernel similarity value. From the kernel similarity, an adaptive threshold is derived to classify the input pixel as background or foreground
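A sketch of the idea, assuming a Gaussian kernel over histogram distance and a running-mean form for the adaptive threshold (both the kernel choice and the adaptation scheme are illustrative assumptions, not the paper's exact formulation):

```python
import math

def kernel_similarity(h1, h2, sigma=0.5):
    """Gaussian-kernel similarity between two normalised histograms;
    equals 1.0 for identical histograms and decays with distance."""
    d2 = sum((a - b) ** 2 for a, b in zip(h1, h2))
    return math.exp(-d2 / (2 * sigma ** 2))

def classify_pixel(h_in, bg_histograms, running_mean, beta=0.9, margin=0.1):
    """Label the pixel using its best similarity k to any background
    histogram, compared against an adaptive threshold (here a running
    mean of past similarities minus a margin)."""
    k = max(kernel_similarity(h_in, h) for h in bg_histograms)
    threshold = running_mean - margin
    label = "background" if k >= threshold else "foreground"
    running_mean = beta * running_mean + (1 - beta) * k  # adapt for next frame
    return label, k, running_mean
```

Because the threshold tracks recent similarity values, gradual background variation raises or lowers it automatically instead of requiring a fixed global cutoff.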
Update the Background Model If the pixel is labeled as background, the background-model histogram with the highest similarity value is updated with the new data
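The update step amounts to blending the best-matching model histogram toward the newly observed one; a one-line sketch (the learning rate alpha is an assumed value):

```python
def update_histogram_model(best_hist, new_hist, alpha=0.05):
    """Blend the best-matching background histogram toward the newly
    observed histogram with learning rate alpha."""
    return [(1 - alpha) * m + alpha * n for m, n in zip(best_hist, new_hist)]
```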
Experimental Results
Experiment 1
Experiment 2 Wallflower video: (a) GMM (b) CMU (c) LBP (d) TPF (e) KSM-TPF
Experiment 2 Comparison across GMM, CMU, LBP, TPF, and KSM-TPF
Conclusion KSM-TPF is much more robust to significant background variations. However, it is less computationally efficient than the GMM or LBP methods