# Mean-Field Theory and Its Applications in Computer Vision

Presentation transcript.

Global Co-occurrence Terms. Encourages global consistency and co-occurrence of objects. (Figure: segmentation results without and with the co-occurrence term.)

Global Co-occurrence Terms. The cost is defined on subsets of labels, associating a cost with each possible subset of labels present in the image.

Properties of cost function. Non-decreasing: adding a label to the subset never decreases the cost. (Figure: example costs 0.2, 3.0, 5.0.)

Properties of cost function. We represent our cost as a second-order cost function defined on a binary vector:
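The equation itself did not survive extraction. A plausible reconstruction, writing $Y_l \in \{0,1\}$ for the indicator that label $l$ appears anywhere in the image, is

$$C(\mathbf{Y}) \;=\; \sum_{l} c_l\, Y_l \;+\; \sum_{l < l'} c_{l l'}\, Y_l\, Y_{l'},$$

where the coefficients $c_l$ and $c_{ll'}$ (hypothetical names here) encode the per-label and pairwise co-occurrence costs.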

Complexity. Incorporating this cost directly has complexity O(NL²). We consider two relaxations (approximations) of this form, reducing the complexity to O(NL + L²).

Our model. We represent the second-order cost by binary latent variables, with a unary cost per latent variable. (Figure: label-level latent variable nodes taking values 0/1.)

Our model. We represent the second-order cost by binary latent variables, with a pairwise cost between latent variables.

Global Co-occurrence Cost. Two approximations for including this cost in the fully connected CRF.

Global Co-occurrence Terms. First model.

Global Co-occurrence Terms. Model (figure).

Global Co-occurrence Terms. Constraints (consider one set of connections): if the latent variable is on, at least one image variable takes that label; if the latent variable is off, no image variable takes that label.
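In symbols (a reconstruction; the slide's formulas are missing from the transcript), writing $x_i$ for the label of image variable $i$ and $Y_l$ for the latent indicator of label $l$:

$$Y_l = 1 \;\Rightarrow\; \exists\, i : x_i = l, \qquad Y_l = 0 \;\Rightarrow\; \forall\, i : x_i \neq l.$$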

Global Co-occurrence Terms. Pay a cost K for violating the first constraint.

Global Co-occurrence Terms. Pay a cost K for violating the second constraint.

Global Co-occurrence Terms. Cost for the first model:
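The cost expression is missing from the transcript. A plausible reconstruction, consistent with the preceding slides (pay $K$ for violating either constraint), writing $C(\mathbf{Y})$ for the co-occurrence cost over the latent indicators and $[\cdot]$ for the Iverson bracket:

$$E(\mathbf{x}, \mathbf{Y}) \;=\; C(\mathbf{Y}) \;+\; K \sum_{l} \Big( Y_l \prod_i [x_i \neq l] \;+\; (1 - Y_l)\, \max_i\, [x_i = l] \Big).$$

The first term inside the sum fires when $Y_l$ is on but no image variable takes label $l$; the second fires when $Y_l$ is off but some image variable does.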

Global Co-occurrence Terms. Second model: each latent node is connected to each variable node.

Global Co-occurrence Terms. Constraints (consider one set of connections): if the latent variable is on, at least one image variable takes that label; if the latent variable is off, no image variable takes that label.

Global Co-occurrence Terms. Pay a cost K for violating the constraint.

Global Co-occurrence Terms. Cost for the second model:
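This equation is also missing from the transcript. Consistent with the later slide ("pay a cost K if a variable takes label l while the corresponding latent variable takes value 0"), a plausible per-connection form, with $C(\mathbf{Y})$ the co-occurrence cost over the latent indicators, is

$$E(\mathbf{x}, \mathbf{Y}) \;=\; C(\mathbf{Y}) \;+\; K \sum_{l} \sum_{i} (1 - Y_l)\,[x_i = l],$$

i.e. the second model decomposes the penalty into independent pairwise terms between each image variable and each latent variable, which is consistent with the O(NL + L²) complexity stated earlier.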

Global Co-occurrence Terms. Expectation evaluation for latent variable Y_l. Case 1: Y_l takes value 0.

Global Co-occurrence Terms. Expectation evaluation for latent variable Y_l. Case 2: Y_l takes value 1.

Global Co-occurrence Terms. Expectation evaluation for latent variable Y_l.
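The per-case expectations did not survive extraction. Under the second (pairwise) model, writing $Q_i(l)$ for the current mean-field marginal that variable $i$ takes label $l$, and $q(Y_{l'})$ for the latent marginals, a plausible reconstruction is

$$\mathbb{E}[\text{cost} \mid Y_l = 0] \;=\; K \sum_i Q_i(l), \qquad \mathbb{E}[\text{cost} \mid Y_l = 1] \;=\; c_l + \sum_{l' \neq l} c_{l l'}\, q(Y_{l'} = 1),$$

with $c_l$, $c_{ll'}$ the unary and pairwise latent costs (hypothetical names).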

Global Co-occurrence Terms. Latent variable updates:
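The update formulas are missing from the transcript. A minimal sketch of one latent-variable update, assuming the second (pairwise) model and hypothetical parameter names `c_unary` (per-label cost) and `c_pair` (pairwise latent cost):

```python
import math

def update_latent(Q, K, c_unary, c_pair, q_latent, l):
    """Mean-field update for the binary latent variable Y_l (a sketch,
    assuming the second model: cost K per image variable that takes
    label l while Y_l = 0)."""
    N = len(Q)
    # Expected penalty if Y_l = 0: K times the expected number of
    # image variables taking label l under the current marginals Q.
    e_off = K * sum(Q[i][l] for i in range(N))
    # Expected cost if Y_l = 1: unary latent cost plus the expected
    # pairwise cost with the other latent variables.
    e_on = c_unary[l] + sum(c_pair[l][m] * q_latent[m]
                            for m in range(len(c_unary)) if m != l)
    # Normalised mean-field marginal q(Y_l = 1).
    z = math.exp(-e_on) + math.exp(-e_off)
    return math.exp(-e_on) / z
```

The sum over pixels costs O(N) per label, so one sweep over all L latent variables is O(NL + L²), matching the complexity claimed on the earlier slide.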

Global Co-occurrence Terms. Pay a cost K if an image variable takes label l while the corresponding latent variable takes value 0.

Complexity. Expectation updates for latent variable Y_l.

Complexity. Expectation updates for latent variable Y_l: the overall complexity is O(NL + L²), so the latent updates do not increase the original complexity.

PascalVOC-10 dataset. Qualitative analysis: we observe an improvement over the other methods compared.

PascalVOC-10 dataset. Quantitative results:

| Algorithm | Time (s) | Overall | Av. Recall | Av. I/U |
| --- | --- | --- | --- | --- |
| AHCRF+Cooc | 36 | 81.43 | 38.01 | 30.09 |
| Dense CRF | 0.67 | 71.43 | 34.53 | 28.40 |
| Dense + Potts | 4.35 | 79.87 | 40.71 | 30.18 |
| Dense + Potts + Cooc | 4.4 | 80.44 | 43.08 | 32.35 |

We observe an improvement of almost 2.3% in Av. I/U, and the method is almost 8-9 times faster than the alpha-expansion based method.

Mean-field vs. Graph-cuts. We measure the I/U score on PascalVOC-10 segmentation, increasing the standard deviation for mean-field and the window size for the graph-cuts method. Both achieve similar accuracy.

Window sizes. Comparison on matched energy, showing the impact of adding more complex costs and increasing window size:

| Algorithm | Model | Time (s) | Av. I/U |
| --- | --- | --- | --- |
| Alpha-exp (n=10) | Pairwise | 326.17 | 28.59 |
| Mean-field | Pairwise | 0.67 | 28.64 |
| Alpha-exp (n=3) | Pairwise + Potts | 56.8 | 29.6 |
| Mean-field | Pairwise + Potts | 4.35 | 30.11 |
| Alpha-exp (n=1) | Pairwise + Potts + Cooc | 103.94 | 30.45 |
| Mean-field | Pairwise + Potts + Cooc | 4.4 | 32.17 |

PascalVOC-10 dataset. Per-class quantitative results:

| Algorithm | bkg | plane | cycle | bird | boat | bottle | bus | car | cat | chair |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| AHCRF+Cooc | 82.5 | 43.2 | 4.9 | 17.4 | 27.1 | 31.3 | 49.4 | 51.0 | 29.3 | 7.1 |
| Dense + Potts + Cooc | 82.9 | 44.6 | 15.8 | 18.9 | 26.3 | 31.7 | 48.9 | 55.2 | 33.3 | 7.9 |

PascalVOC-10 dataset. Per-class quantitative results (continued):

| Algorithm | cow | dtable | dog | horse | mbike | person | plant | sheep | sofa | train | TV | Av. |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| AHCRF+Cooc | 26.7 | 8.3 | 17.0 | 24.0 | 27.1 | 41.9 | 21.8 | 25.2 | 16.4 | 43.8 | 43.4 | 30.9 |
| Dense + Potts + Cooc | 27.0 | 16.1 | 16.8 | 23.4 | 43.8 | 38.4 | 21.1 | 30.9 | 15.5 | 44.0 | 36.8 | 32.35 |

Mean-field vs. Graph-cuts. We measure the I/U score on PascalVOC-10 segmentation, increasing the standard deviation for mean-field and the window size for the graph-cuts method. The time complexity of graph-cuts is very high, making it infeasible to work with large neighbourhood systems.