
1 Vibhav Vineet, Jonathan Warrell, Paul Sturgess, Philip H.S. Torr. Improved Initialisation and Gaussian Mixture Pairwise Terms for Dense Random Fields with Mean-field Inference. http://cms.brookes.ac.uk/research/visiongroup/

2 Labelling Problem: assign a label to each image pixel. Example tasks: stereo, object detection, object segmentation.

3 Problem Formulation: find a labelling that maximises the conditional probability, or equivalently minimises the energy function.
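
The slide's formulas did not survive the transcript; in the standard CRF notation the talk builds on, the formulation would read as follows (a reconstruction, not the slide's exact symbols):

```latex
P(\mathbf{x} \mid \mathbf{I}) = \frac{1}{Z(\mathbf{I})}\,\exp\big(-E(\mathbf{x})\big),
\qquad
E(\mathbf{x}) = \sum_i \psi_u(x_i) + \sum_{i<j} \psi_p(x_i, x_j)
```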

4 Problem Formulation: grid CRF construction and inference. A grid CRF leads to over-smoothing around object boundaries.

5 Problem Formulation: a grid CRF leads to over-smoothing around boundaries; a dense CRF is able to recover fine boundaries. [Figure: grid CRF construction vs. dense CRF construction, with inference results]

6 Inference in Dense CRF: very high time complexity. Alpha-expansion takes almost 1200 seconds per image with a neighbourhood size of 15 on the PascalVOC segmentation dataset, so graph-cut based methods are not feasible.

7 Inference in Dense CRF: the filter-based mean-field inference method takes 0.2 seconds*. Efficient inference rests on two assumptions: a mean-field approximation to the CRF, and pairwise weights that take a Gaussian form. (*Krahenbuhl et al., Efficient Inference in Fully Connected CRFs with Gaussian Edge Potentials, NIPS 2011)

8 Efficient inference in dense CRF: inference with the true distribution P is intractable, so we approximate it with a distribution from a tractable family, i.e. mean-field methods (Jordan et al., 1999).

9 Naïve mean field: assume all variables are independent.
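
In standard mean-field notation, the factorisation this slide assumes, and the objective it optimises, are (a reconstruction, since the slide's formula is not in the transcript):

```latex
Q(\mathbf{x}) = \prod_i Q_i(x_i),
\qquad
Q^{*} = \arg\min_{Q} \, \mathrm{KL}\big(Q \,\|\, P\big)
```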

10 Efficient inference in dense CRF: assume the pairwise weights are a mixture of Gaussian kernels, with a bilateral (appearance) component and a spatial (smoothness) component.
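
The kernel form referred to here is the one from Krahenbuhl et al. (NIPS 2011): a bilateral Gaussian over positions p and colours I, plus a spatial Gaussian over positions only:

```latex
k(\mathbf{f}_i, \mathbf{f}_j) =
w^{(1)} \exp\!\left(-\frac{\lVert p_i - p_j \rVert^2}{2\theta_\alpha^2}
                    -\frac{\lVert I_i - I_j \rVert^2}{2\theta_\beta^2}\right)
+ w^{(2)} \exp\!\left(-\frac{\lVert p_i - p_j \rVert^2}{2\theta_\gamma^2}\right)
```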

11 Marginal update: the marginal update involves an expectation of the cost over the distribution Q, given that x_i takes label l. The expensive message-passing step is solved with a highly efficient permutohedral-lattice-based filtering approach. The final labelling is the maximum posterior marginal (MPM) under the approximate distribution.
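
The update rule itself is lost with the slide graphics. As a rough illustration, here is a minimal NumPy sketch of the filter-based mean-field loop, assuming a Potts compatibility and only the spatial kernel (a plain Gaussian blur stands in for the permutohedral lattice, and the bilateral kernel is omitted for clarity):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def mean_field_potts(unary, sigma=3.0, w=1.0, n_iters=10):
    # unary: (H, W, L) array of unary costs psi_u(x_i = l).
    Q = np.exp(-unary)
    Q /= Q.sum(axis=-1, keepdims=True)         # initialise Q from the unaries
    for _ in range(n_iters):
        # Message passing: with a Gaussian spatial kernel, the sum over all
        # other pixels reduces to a Gaussian blur of each label plane.
        blurred = np.stack([gaussian_filter(Q[..., l], sigma)
                            for l in range(Q.shape[-1])], axis=-1)
        # Potts compatibility transform: the cost of taking label l is the
        # filtered mass assigned to every other label.
        pairwise = w * (blurred.sum(axis=-1, keepdims=True) - blurred)
        Q = np.exp(-unary - pairwise)          # local update ...
        Q /= Q.sum(axis=-1, keepdims=True)     # ... and per-pixel normalisation
    return Q.argmax(axis=-1)                   # MPM labelling
```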

12 Q distribution: [Figure: Q distribution for different classes on the CamVid dataset, iteration 0; colour scale 0 to 1]

13 Q distribution: [Figure: as above, iteration 1]

14 Q distribution: [Figure: as above, iteration 2]

15 Q distribution: [Figure: as above, iteration 10]

16 Q distribution: [Figure: iterations 0, 1, 2 and 10 side by side; Q distribution for different classes on the CamVid dataset]

17 Two issues are associated with the method: it is sensitive to initialisation, and the Gaussian pairwise weights are restrictive.

18 Our Contributions: we resolve the two issues associated with the method. For the sensitivity to initialisation, we propose a SIFT-flow based initialisation method; for the restrictive Gaussian pairwise weights, an expectation-maximisation (EM) based strategy to learn a more general Gaussian mixture model.

19 Sensitivity to initialisation: experiment on the PascalVOC-10 segmentation dataset. A good initialisation can lead to a better solution, so we propose a SIFT-flow based initialisation method.

Initialisation        Mean-field   Alpha-expansion
Unary potential       28.52%       27.88%
Ground truth label    41%          27.88%

We observe an improvement of almost 13% in I/U score when initialising mean-field inference with the ground-truth labelling.

20 SIFT-flow based correspondence: given a test image, we first retrieve a set of nearest neighbours from the training set using GIST features. [Figure: test image and nearest neighbours retrieved from the training set]
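
A minimal sketch of this retrieval step, assuming GIST descriptors have already been extracted as fixed-length vectors (the names `train_gists` and the choice k=7 are illustrative, not from the talk):

```python
import numpy as np

def retrieve_neighbours(test_gist, train_gists, k=7):
    # test_gist: (D,) GIST descriptor of the test image;
    # train_gists: (N, D) descriptors of the training images.
    dists = np.linalg.norm(train_gists - test_gist, axis=1)
    return np.argsort(dists)[:k]   # indices of the k most similar images
```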

21 SIFT-flow based correspondence: the K nearest neighbours are warped to the test image. [Figure: test image, warped nearest neighbours and their flow energies, e.g. 13.31, 14.31, 18.38, 22, 23.31, 27.2, 30.87]

22 SIFT-flow based correspondence: pick the best nearest neighbour based on the flow value (here the neighbour with flow 13.31). [Figure: test image, nearest neighbour, warped image]

23 Label transfer: warp the ground truth according to the correspondence, transferring labels from the top-1 neighbour using the flow. [Figure: ground truth of the best nearest neighbour, flow, warped ground truth, ground truth of the test image]
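
A minimal sketch of the warping step, assuming the SIFT-flow field is given as per-pixel offsets (the convention flow[y, x] = (dy, dx) is an assumption for illustration; nearest-neighbour sampling keeps the labels discrete). The best neighbour is simply the one with the lowest flow energy, e.g. `best = int(np.argmin(flow_energies))`.

```python
import numpy as np

def transfer_labels(gt_labels, flow):
    # gt_labels: (H, W) ground-truth labels of the best nearest neighbour;
    # flow: (H, W, 2) offsets mapping each test pixel to its correspondence.
    H, W = gt_labels.shape
    ys, xs = np.mgrid[0:H, 0:W]
    src_y = np.clip(np.round(ys + flow[..., 0]).astype(int), 0, H - 1)
    src_x = np.clip(np.round(xs + flow[..., 1]).astype(int), 0, W - 1)
    return gt_labels[src_y, src_x]   # transferred labelling for the test image
```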

24 SIFT-flow based initialisation: rescore the unary potential. A factor s rescores the unary potential of a variable based on the label observed after the label-transfer stage; s is set through cross-validation. [Figure: test image, ground truth, after rescoring, without rescoring] Qualitative improvement in accuracy after using the rescored unary potential.
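
The exact rescoring function is not in the transcript; one plausible reading is to discount the unary cost of the transferred label by the cross-validated factor s, roughly:

```python
import numpy as np

def rescore_unary(unary, transferred, s=0.5):
    # unary: (H, W, L) unary costs; transferred: (H, W) labels from label
    # transfer; s < 1 discounts (favours) the transferred label. The paper's
    # actual rescoring function may differ; this is an illustrative guess.
    rescored = unary.copy()
    ys, xs = np.mgrid[0:unary.shape[0], 0:unary.shape[1]]
    rescored[ys, xs, transferred] *= s
    return rescored
```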

25 SIFT-flow based initialisation: initialise the mean-field solution with the transferred labelling. [Figure: test image, ground truth, with initialisation, without initialisation] Qualitative improvement in accuracy after initialising the mean-field.

26 Gaussian pairwise weights: experiment on the PascalVOC-10 segmentation dataset. We plotted the distribution of class-class interactions by selecting random pairs of points (i, j). [Figure: feature-difference distributions for aeroplane-aeroplane, car-person and horse-person pairs]

27 Gaussian pairwise weights: such complex structure in the data cannot be captured by a zero-mean Gaussian; the distributions may be spread horizontally or vertically, and are not centred around zero. We propose an EM-based learning strategy to incorporate a more general class of Gaussian mixture models.

28 Our model: our energy function takes a mixture-of-Gaussians pairwise form. We use separate weights for each label pair, but the Gaussian components are shared. We follow a piecewise learning strategy to learn the parameters of the energy function.
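
The slide's equation is lost in the transcript; a plausible reconstruction consistent with the description (per-label-pair mixture weights w, shared non-zero-mean Gaussian components over feature differences) would be:

```latex
E(\mathbf{x}) = \sum_i \psi_u(x_i)
+ \sum_{i<j} \sum_{m=1}^{M} w^{(m)}_{x_i, x_j}\,
  \mathcal{N}\!\big(\mathbf{f}_i - \mathbf{f}_j \,\big|\, \boldsymbol{\mu}_m, \Sigma_m\big)
```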

29 Learning mixture model: learn the parameters similarly to the model of Krahenbuhl et al.* (*Efficient Inference in Fully Connected CRFs with Gaussian Edge Potentials, NIPS 2011)

30 Learning mixture model: learn the parameters of the Gaussian mixture (the means, standard deviations and mixing coefficients).

31 Learning mixture model: lambda is set through cross-validation.

32 Our model: we follow a generative training model, maximising the joint likelihood of label pairs and features; a latent variable encodes the cluster assignment. We use an expectation-maximisation (EM) based method to maximise the likelihood function.
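
A rough sketch of this fitting step under stated assumptions: feature differences of randomly sampled pixel pairs are modelled with a shared Gaussian mixture fitted by EM (the names `n_pairs` and `M`, and the use of scikit-learn's EM loop in place of the paper's own derivation, are illustrative):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_shared_mixture(features, n_pairs=10000, M=5, seed=0):
    # features: (N, D) per-pixel feature vectors pooled from training images.
    rng = np.random.default_rng(seed)
    i = rng.integers(0, len(features), n_pairs)
    j = rng.integers(0, len(features), n_pairs)
    diffs = features[i] - features[j]        # f_i - f_j for sampled pairs
    gmm = GaussianMixture(n_components=M, covariance_type="diag",
                          random_state=seed).fit(diffs)
    # gmm.means_, gmm.covariances_ and gmm.weights_ give the shared component
    # parameters; per-label-pair mixing weights would then be re-estimated
    # with these components held fixed.
    return gmm
```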

33 Learning mixture model: our model is able to capture the true distribution of class-class interactions. [Figure: learnt mixtures for aeroplane-aeroplane, car-person and horse-person pairs]

34 Inference with the mixture model: this involves evaluating M extra Gaussian terms, performed as blurring on mean-shifted points, which increases the time complexity.
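
Concretely, the message passing for component m becomes a Gaussian blur evaluated on points shifted by the component mean, which is why each of the M components adds a filtering pass (a sketch with assumed notation, not the slide's exact formula):

```latex
\tilde{Q}^{(m)}_i(l) = \sum_{j \neq i}
\exp\!\Big(-\tfrac{1}{2}\,(\mathbf{f}_i - \mathbf{f}_j - \boldsymbol{\mu}_m)^{\top}
\Sigma_m^{-1}\,(\mathbf{f}_i - \mathbf{f}_j - \boldsymbol{\mu}_m)\Big)\, Q_j(l)
```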

35 Experiments on CamVid: [Figure: Q distribution for the building class on the CamVid dataset, iteration 0; ground truth, without initialisation, with initialisation; colour scale 0 to 1] Confidence of building pixels increases with initialisation.

36 Experiments on CamVid: [Figure: as above, iteration 1] Confidence of building pixels increases with initialisation.

37 Experiments on CamVid: [Figure: as above, iteration 2] Confidence of building pixels increases with initialisation.

38 Experiments on CamVid: [Figure: as above, iteration 10] Confidence of building pixels increases with initialisation.

39 Experiments on CamVid: [Figure: image, ground truth, without initialisation, with initialisation] The building is properly recovered with our initialisation strategy.

40 Experiments on CamVid: quantitative results on the CamVid dataset.

Algorithm       Time (s)   Overall (%-corr)   Av. Recall   Av. I/U
Alpha-exp       0.96       78.84              58.64        43.89
APST (U+P+H)    1.6        85.18              60.06        50.62
Dense CRF       0.2        79.96              59.29        45.18
Ours (U+P+I)    0.35       85.31              59.75        50.56

Our model with only unary and pairwise terms achieves better accuracy than other, more complex models, and generally achieves very high efficiency compared to other methods.

41 Experiments on CamVid: qualitative results on the CamVid dataset. [Figure: image, ground truth, alpha-expansion, ours] Our method is able to recover the building and tree properly.

42 Experiments on PascalVOC-10: qualitative results of the SIFT-flow method. [Figure: image, ground truth, warped nearest ground-truth image, output without SIFT-flow, output with SIFT-flow] Our method is able to recover missing body parts.

43 Experiments on PascalVOC-10: quantitative results on the PascalVOC-10 segmentation dataset.

Algorithm           Time (s)   Overall (%-corr)   Av. Recall   Av. I/U
Alpha-exp           3.0        79.52              36.08        27.88
AHCRF+Cooc          36         81.43              38.01        30.9
Dense CRF           0.67       71.63              34.53        28.4
Ours1 (U+P+GM)      26.7       80.23              36.41        28.73
Ours2 (U+P+I)       0.90       79.65              41.84        30.95
Ours3 (U+P+I+GM)    26.7       78.96              44.05        31.48

Our model with unary and pairwise terms achieves better accuracy than other, more complex models, and generally achieves very high efficiency compared to other methods.

44 Experiments on PascalVOC-10: qualitative results on the PascalVOC-10 segmentation dataset. [Figure: image, ground truth, alpha-expansion, Dense CRF, ours] Our method is able to recover missing objects and body parts.

45 Conclusion: filter-based mean-field inference promises high efficiency and accuracy. We proposed two methods to robustify the basic mean-field method: a SIFT-flow based method for better initialisation, and an EM-based algorithm for learning a more general Gaussian mixture model. More complex higher-order models can also be incorporated into the pairwise model.

46 Thank you

