1
A Graphical Model for Simultaneous Partitioning and Labeling. Philip Cowans & Martin Szummer. AISTATS, Jan 2005, Cambridge.

2
Motivation – Interpreting Ink. [Figure: a hand-drawn diagram and its machine interpretation]

3
Graph Construction. Vertices are grouped into parts; each part is assigned a label. [Figure: graph G with vertices V and edges E]

4
Labeled Partitions. We assume that parts are contiguous and that the graph is triangulated. We're interested in probability distributions over labeled partitions, conditioned on observed data.

5
Conditional Random Fields. CRFs (Lafferty et al.) provide a joint labeling of graph vertices. Idea: define parts to be contiguous regions with the same label. But a large number of labels is needed, and there are symmetry problems / bias.

6
A Better Approach… Extend the CRF framework to work directly with labeled partitions. Complexity is improved: we don't need to deal with so many labels. No symmetry problem: we're working directly with the representation in which the problem is posed.

7
Consistency. Let G, H ⊆ V. Y(G) and Y(H) are consistent if and only if: for any vertex in G ∩ H, Y(G) and Y(H) agree on its label; and for any pair of vertices in G ∩ H, Y(G) and Y(H) agree on their part membership. Denoted Y(G) ⊑ Y(H).
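As a concrete illustration, the two consistency conditions can be checked directly. The following sketch is not from the paper; the dict-of-(part_id, label) representation of a labeled partition is an assumption of this example.

```python
def consistent(yG, yH):
    """Check whether two labeled partitions agree on their overlap.
    Each labeled partition is represented (an assumption of this sketch)
    as a dict mapping vertex -> (part_id, label); part ids are only
    meaningful within one partition."""
    overlap = set(yG) & set(yH)
    # Condition 1: labels agree on every shared vertex.
    if any(yG[v][1] != yH[v][1] for v in overlap):
        return False
    # Condition 2: part membership agrees for every pair of shared vertices.
    for u in overlap:
        for v in overlap:
            if (yG[u][0] == yG[v][0]) != (yH[u][0] == yH[v][0]):
                return False
    return True

yG = {1: (0, '+'), 2: (0, '+'), 3: (1, '-')}
yH = {2: (5, '+'), 3: (6, '-'), 4: (6, '-')}
ok = consistent(yG, yH)   # labels and groupings agree on the overlap {2, 3}
bad = consistent(yG, {2: (5, '+'), 3: (5, '-')})  # 2, 3 grouped in only one
```

Note that only the same-part relation matters, never the numeric part ids, which is why the two partitions above can use disjoint id ranges.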

8
Projection. Projection maps labeled partitions onto smaller subgraphs. If G ⊆ V, the projection of Y onto G is the unique labeled partition of G which is consistent with Y.
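Under the same hypothetical dict representation as before, projection is just restriction of the vertex set, with part ids renumbered so they stay dense. A minimal sketch:

```python
def project(y, G):
    """Projection of labeled partition y onto the vertex subset G: restrict
    y to G and renumber part ids densely. The result is the unique labeled
    partition of G that is consistent with y. (The dict-of-(part_id, label)
    representation is an assumption of this sketch.)"""
    new_ids = {}
    out = {}
    for v in sorted(G):
        part, label = y[v]
        out[v] = (new_ids.setdefault(part, len(new_ids)), label)
    return out

y = {1: (0, '+'), 2: (0, '+'), 3: (1, '-'), 4: (2, '+')}
y_23 = project(y, {2, 3})  # vertices 2 and 3 keep their labels and grouping
```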

9
Notation:
Y – labeled partition of G.
Y(A) – labeled partition of the induced subgraph of A ⊆ V.
Y_A – projection of Y onto A ⊆ V.
Y_i – projection of Y onto vertex i.
Y_ij – projection of Y onto vertices i and j.

10
Potentials

11
The Model. Unary potentials φ(Y_i, x) and pairwise potentials ψ(Y_ij, x), combined as P(Y | x) ∝ ∏_i φ(Y_i, x) · ∏_{(i,j)∈E} ψ(Y_ij, x).

12
The Model

13
Training. Train by finding MAP weights on example data with a Gaussian prior (BFGS). We require the value and gradient of the log posterior: normalization (for the value) and marginalization (for the gradient).
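To make the training objective concrete, here is a toy sketch on a two-vertex graph: its six labeled partitions are enumerated explicitly, and MAP weights are found by gradient ascent on the log posterior (the paper uses BFGS; plain gradient ascent keeps the sketch short). The feature functions and the σ² = 1 prior are invented for illustration.

```python
import math

# The six labeled partitions of a two-vertex graph, written as
# (same_part, label_of_vertex_1, label_of_vertex_2).
CONFIGS = [(True, s, s) for s in '+-'] + \
          [(False, a, b) for a in '+-' for b in '+-']

def feats(y):
    """Hypothetical features: number of '+' vertices, and same-part flag."""
    same, l1, l2 = y
    return [float(l1 == '+') + float(l2 == '+'), float(same)]

def train(y_star, sigma2=1.0, lr=0.1, steps=500):
    """MAP weights via gradient ascent on the log posterior with a
    Gaussian prior: grad = f(y*) - E[f] - w / sigma^2."""
    w = [0.0, 0.0]
    for _ in range(steps):
        scores = [math.exp(sum(wi * fi for wi, fi in zip(w, feats(y))))
                  for y in CONFIGS]
        Z = sum(scores)                       # normalization
        expect = [sum(s * feats(y)[k] for s, y in zip(scores, CONFIGS)) / Z
                  for k in range(2)]          # marginalization -> E[f]
        grad = [feats(y_star)[k] - expect[k] - w[k] / sigma2
                for k in range(2)]
        w = [wk + lr * gk for wk, gk in zip(w, grad)]
    return w

weights = train((True, '+', '+'))  # fit to a single training example
```

Here Z and E[f] are computed by brute force; on real graphs these are exactly the quantities the message-passing scheme below supplies.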

14
Prediction. New data is processed by finding the most probable labeled partition. This is the same computation as normalization, with the summation replaced by a maximization.

15
Inference. These operations require summation or maximization over all possible labeled partitions. The number of terms grows super-exponentially with the size of G. Efficient computation is possible using message passing, because the distribution factors. The proof is based on Shenoy & Shafer (1990).

16
Factorization. A distribution factors if it can be written as a product of potentials for cliques on the graph. This is the case for the (un-normalized) model, and it allows efficient computation using message passing.
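The factored form is easy to state in code: the un-normalized score of a labeled partition is the product of clique potentials applied to its projections. The potentials and the dict representation here are assumptions of this sketch, not the paper's definitions.

```python
def score(y, cliques, psis):
    """Un-normalized factorized model: the product over cliques C of
    psi_C applied to the restriction of y to C. Here y maps each
    vertex to a (part_id, label) pair."""
    total = 1.0
    for C, psi in zip(cliques, psis):
        total *= psi({v: y[v] for v in C})
    return total

def same_part_psi(yC):
    """Toy pairwise potential: 2.0 if the two vertices share a part."""
    u, v = sorted(yC)
    return 2.0 if yC[u][0] == yC[v][0] else 1.0

y = {1: (0, '+'), 2: (0, '+'), 3: (1, '-')}
s = score(y, [(1, 2), (2, 3)], [same_part_psi, same_part_psi])
```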

17
Message Passing. [Figure: example graph with vertices 1–9]

18
Junction tree constructed from cliques on the original graph: {1,2,3,4}, {2,3,4,5}, {4,5,6}, {1,7,8}, {2,9}. 'Upstream': a message summarizes the contribution from 'upstream' to the sum for a given configuration of the separator.

19
Message Passing. Example message over the separator {2,3,4}:

Partition   Labels   Value
(2)(3)(4)   +,+,-    0.012
(2)(3)(4)   +,-,-    0.043
(2,3)(4)    +,+      0.134
(2,3,4)     -        0.235
…           …        …

Single-vertex messages, e.g. over {2}: (2),+ : 0.43; (2),- : 0.72. Over {1}: (1),+ : 0.23; (1),- : 0.57.

20
Message Update Rule. Update messages (for summation) according to the update equation. Marginals are found using the incoming messages. Z can be found explicitly.
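What the messages ultimately compute can be verified by brute force on a tiny graph: enumerate every labeled partition, multiply the pairwise potentials, and accumulate Z and a marginal. The toy potential below is an assumption of this sketch; the point is that message passing obtains the same values without enumerating the super-exponential state space.

```python
from itertools import product

def labeled_partitions(n, labels='+-'):
    """Enumerate all labeled partitions of vertices 0..n-1. Restricted
    growth strings yield each set partition exactly once; each of the
    k parts then receives one label."""
    def rgs(prefix, nparts):
        if len(prefix) == n:
            yield prefix, nparts
        else:
            for p in range(nparts + 1):
                yield from rgs(prefix + [p], max(nparts, p + 1))
    for assignment, k in rgs([], 0):
        for lab in product(labels, repeat=k):
            yield [(p, lab[p]) for p in assignment]

def psi(yu, yv):
    """Toy pairwise potential (an assumption of this sketch): same part
    scores 2.0, different parts with equal labels 1.5, otherwise 1.0."""
    (pu, lu), (pv, lv) = yu, yv
    if pu == pv:
        return 2.0
    return 1.5 if lu == lv else 1.0

def brute_force(n, edges):
    """Normalization Z and the marginal p(vertex 0 is labeled '+'),
    computed by explicit enumeration over all labeled partitions."""
    Z = plus = 0.0
    for y in labeled_partitions(n):
        w = 1.0
        for u, v in edges:
            w *= psi(y[u], y[v])
        Z += w
        if y[0][1] == '+':
            plus += w
    return Z, plus / Z
```

On a 3-vertex chain the marginal is 0.5 by label symmetry, a useful sanity check for any message-passing implementation.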

21
Complexity. Number of configurations per clique:

Clique size          2     3      4       5
CRF                  16    216    4096    1.0 × 10^5
Labeled partitions   6     22     94      454
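The 'Labeled partitions' row can be reproduced by summing, over all set partitions of an n-vertex clique, the 2^k label assignments of its k parts; the CRF row appears to correspond to (2n)^n, i.e. 2n labels per vertex (that reading of the table is an assumption of this sketch).

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def stirling2(n, k):
    """S(n, k): set partitions of n items into k non-empty parts."""
    if n == k:
        return 1
    if k == 0 or k > n:
        return 0
    return k * stirling2(n - 1, k) + stirling2(n - 1, k - 1)

def labeled_partition_count(n, labels=2):
    """Configurations of an n-vertex clique: each of the k parts gets
    one of `labels` labels independently."""
    return sum(stirling2(n, k) * labels ** k for k in range(1, n + 1))

def crf_count(n, labels=2):
    """Assumed CRF encoding behind the table: labels * n vertex states
    (one per part index / label pair), assigned independently per vertex."""
    return (labels * n) ** n
```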

22
Experimental Results. We tested the algorithm on hand-drawn ink collected using a Tablet PC. The task is to partition the ink fragments into perceptual objects, and to label them as containers or connectors. The training data set was 40 diagrams from 17 subjects, with a total of 2157 fragments, using 3 random splits (20 training and 20 test examples).

23
Example 1

25
Example 2

27
Example 3

29
Labeling Results.

Model                  Labeling error   Grouping error
Independent labeling   8.5%             –
Joint labeling         4.5%             –
Labeled partitions     2.6%             8.5%

Labeling error: fraction of fragments labeled incorrectly. Grouping error: fraction of edges locally incorrect.

30
Conclusions. We have presented a conditional model defined over labeled partitions of an undirected graph. Efficient exact inference is possible in our model using message passing. Labeling and grouping simultaneously can improve labeling performance. Our model performs well when applied to the task of parsing hand-drawn ink diagrams.

31
Acknowledgements. Thanks to Thomas Minka, Yuan Qi and Michel Gagnet for useful discussions and for providing software, and to Hannah Pepper for collecting our ink database.
