Slide 1: An Information-Theoretic Framework for Flow Visualization
Lijie Xu, Teng-Yok Lee, and Han-Wei Shen, The Ohio State University

Slide 2: Introduction: Visualization vs. Communication
A visualization algorithm can be viewed as a communication channel: the data is the input and the visualization is the output. The effectiveness of a visualization algorithm depends on the amount of information that can be transmitted.

Slide 3: Introduction: Information-Theoretic Visualization Framework
Compare the information in the data with the information in the visualization. If more information can be shown, run the visualization algorithm again; otherwise stop. Visualization should be information-dependent: in flow visualization, for example, the flow near the salient flow features shouldn't be missed.

Slide 4: Outline
1. Realization of this visualization framework for static flow
   a. Measurement of information in the data (vector field)
   b. Measurement of information in the visualization (streamlines)
   c. Information comparison between data and visualization
2. An information-aware streamline placement algorithm
   - Place streamlines to highlight salient flow features
   - Automatically stop when no more information can be shown

Slide 5: Review: Information and Entropy
Information theory quantitatively measures the amount of information contained in a data source. Model the source as a random variable X with outcomes x_i, where p(x_i) is the probability that X = x_i. The entropy of X is H(X) = -∑_i p(x_i) log2 p(x_i). Entropy is minimal when only one outcome is possible (less information) and maximal when all outcomes are equally probable (more information).
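The entropy formula above can be sketched in a few lines of NumPy (a minimal illustration, not the authors' code):

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(X) = -sum_i p_i log2 p_i, with 0*log(0) taken as 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # drop zero-probability outcomes
    return float(-np.sum(p * np.log2(p)))

# One certain outcome: minimal entropy (0 bits).
h_min = entropy([1.0, 0.0, 0.0, 0.0])
# Four equally likely outcomes: maximal entropy, log2(4) = 2 bits.
h_max = entropy([0.25, 0.25, 0.25, 0.25])
```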

Slide 6: Information in Vector Fields
Concept: treat the vector field as a data source that generates vector orientations as outcomes. The more diverse the vector orientations, the more information the vector field contains.
Measurement: estimate the distribution of the vector orientations (e.g., with a polar histogram) and compute the entropy of this distribution as the measurement.
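This measurement can be sketched as follows for a 2D field; the bin count is an assumption, not the paper's exact choice:

```python
import numpy as np

def orientation_entropy(u, v, n_bins=60):
    """Entropy of the vector-orientation distribution of a 2D vector field.

    u, v: arrays of vector components; n_bins: number of polar-histogram bins
    (60 is an illustrative choice).
    """
    angles = np.arctan2(v, u)                               # orientation in [-pi, pi]
    hist, _ = np.histogram(angles, bins=n_bins, range=(-np.pi, np.pi))
    p = hist[hist > 0] / hist.sum()                         # normalized histogram
    return float(-np.sum(p * np.log2(p)))

# A uniform field (all vectors parallel) yields 0 bits; diverse orientations
# push the entropy toward log2(n_bins).
```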

Slide 7: Information in Vector Fields (cont'd)

Slide 8: Entropy Field and Seeding
Measure the entropy in each point's local neighborhood to obtain an entropy field: a higher value means more information in the corresponding region. Entropy-based seeding places streamlines in the regions with high entropy.
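A brute-force sketch of such an entropy field on a 2D grid; the neighborhood radius and bin count here are assumptions, not values from the paper:

```python
import numpy as np

def entropy_field(u, v, radius=4, n_bins=36):
    """Local orientation entropy around each grid point of a 2D vector field."""
    h, w = u.shape
    angles = np.arctan2(v, u)
    out = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            # Clip the square neighborhood at the grid boundary.
            window = angles[max(i - radius, 0):i + radius + 1,
                            max(j - radius, 0):j + radius + 1]
            hist, _ = np.histogram(window, bins=n_bins, range=(-np.pi, np.pi))
            p = hist[hist > 0] / hist.sum()
            out[i, j] = -np.sum(p * np.log2(p))
    return out

# A constant field has zero entropy everywhere; seeds would then go to
# wherever the field value is high instead.
```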

Slide 9: Information in Streamlines
What kind of information is represented by the streamlines? Given information: the vectors along the streamlines, estimated from their trajectories. Reconstructed information: vectors synthesized for the empty regions. Together they form a synthesized vector field.

Slide 10: Information Comparison between Data and Visualization
The conditional entropy H(X|Y) measures the information in the vector field X that is not represented by the streamlines Y. An effective visualization should represent most of the information in the data, i.e., H(X|Y) should be small.

Slide 11: Conditional Entropy and Joint Entropy
H(X|Y) = H(X, Y) − H(Y). Here X is the input vector field and Y is the vector field synthesized from the streamlines. H(X, Y) is the joint entropy, i.e., the entropy of the joint distribution of both vectors X and Y over all points, and H(Y) is the entropy of the distribution of the vectors Y alone over all points.
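A minimal sketch of this computation from per-point orientation bins; the discretization of vectors into integer bin labels is an assumption made here for brevity:

```python
import numpy as np

def conditional_entropy(x_bins, y_bins, n_bins=36):
    """H(X|Y) = H(X, Y) - H(Y) from per-point orientation bin indices.

    x_bins, y_bins: integer bin labels of the original (X) and synthesized (Y)
    vector orientations at each point.
    """
    joint = np.zeros((n_bins, n_bins))
    for xb, yb in zip(x_bins, y_bins):
        joint[xb, yb] += 1                 # accumulate the joint histogram
    p_xy = joint / joint.sum()
    p_y = p_xy.sum(axis=0)                 # marginal distribution of Y

    def H(p):
        p = p[p > 0]
        return float(-np.sum(p * np.log2(p)))

    return H(p_xy.ravel()) - H(p_y)

# When Y reproduces X exactly, H(X|Y) = 0: the streamlines represent
# all the information in the field.
```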

Slide 12: Conditional Entropy Field and Seeding
Measure the under-represented information in each region to obtain a conditional entropy field. Conditional-entropy-based seeding places more seeds in the regions with higher under-represented information.

Slide 13: Removal of Redundant Streamlines
Remove the streamlines that cannot reduce the conditional entropy. Because computing the conditional entropy per streamline is slow, approximate it: discard a streamline if all its points are too close to existing streamlines, using a smaller distance threshold in regions with higher entropy. The figure contrasts the initially placed seeds with the result after the removal of redundant streamlines.
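The distance-based approximation can be sketched as below; the exact thresholding scheme (here, a threshold shrinking with local entropy) is an assumption, not the paper's formula:

```python
import numpy as np

def is_redundant(candidate, existing_pts, entropy_at, base_thresh=2.0):
    """Discard a candidate streamline if every one of its points lies too
    close to the already-placed streamlines.

    candidate: (n, 2) streamline points; existing_pts: (m, 2) points of
    existing streamlines; entropy_at: callable giving the local entropy.
    """
    for p in candidate:
        d = np.min(np.linalg.norm(existing_pts - p, axis=1))
        # Tighter threshold where entropy is high, so dense detail survives.
        thresh = base_thresh / (1.0 + entropy_at(p))
        if d >= thresh:
            return False   # this point is far enough away; keep the streamline
    return True            # every point is redundant; drop the streamline
```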

Slide 14: Information-Theoretic Streamline Placement Algorithm
The framework is instantiated with the original vector field X as the data and the streamline-synthesized vector field Y as the visualization. Iteration 1 uses entropy-based seeding on the entropy field; iterations after the first use conditional-entropy-based seeding. After each iteration, ask whether H(X|Y) can be further reduced: if yes, place more streamlines; if no, stop.
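The iteration above can be written as a small driver loop; the seeding and entropy callbacks here are hypothetical stand-ins for the paper's components:

```python
def place_streamlines(seed_fn, reseed_fn, cond_entropy_fn, eps=1e-3, max_iter=50):
    """Skeleton of the iterative placement loop.

    seed_fn: entropy-based seeding for iteration 1;
    reseed_fn: conditional-entropy-based seeding for later iterations;
    cond_entropy_fn: computes H(X|Y) for the current streamlines.
    """
    lines = seed_fn()                      # iteration 1: entropy-based seeding
    prev = float("inf")
    for _ in range(max_iter):
        h = cond_entropy_fn(lines)         # H(X|Y) for the current visualization
        if prev - h < eps:                 # no further reduction possible: stop
            break
        prev = h
        lines += reseed_fn(lines)          # iteration > 1: cond.-entropy seeding
    return lines
```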

Slide 15: Result: 2D Vector Fields
Streamlines after the 1st iteration (entropy-based seeding), after the 2nd iteration (conditional-entropy-based seeding), and when the conditional entropy converges.

Slide 16: Comparison with Distance-Based Methods
Compared with evenly-spaced streamline placement (Jobard & Lefer, 1997) [1] and farthest-point streamline placement (Mebarki et al., 2005) [2], the information-theoretic streamline placement emphasizes regions with more information.
1. B. Jobard and W. Lefer. Creating evenly-spaced streamlines of arbitrary density. In Eurographics Workshop on Visualization in Scientific Computing, 1997.
2. A. Mebarki et al. Farthest point seeding for efficient placement of streamlines. In IEEE Visualization 2005.

Slide 17: Comparison with Distance-Based Methods (cont'd)
The first 54 streamlines placed by each method: entropy-based placement, evenly-spaced placement (Jobard & Lefer, 1997) [1], and farthest-point placement (Mebarki et al., 2005) [2]. The distance-based methods need more streamlines to capture the salient flow features.

Slide 18: Result: 3D Vector Fields
The entropy field, the streamlines after the 1st iteration (entropy-based seeding), and the streamlines after the 2nd iteration (conditional-entropy-based seeding), together with the conditional entropy.

Slide 19: Entropy-Based Occlusion Reduction
Reduce the occlusion of the salient features in a 3D flow field by using a transfer function that maps the entropy of each streamline fragment to its opacity.
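Such a transfer function could be as simple as a clamped linear ramp; the linear shape and the entropy range are assumptions, since the slide only states that a transfer function is used:

```python
def entropy_to_opacity(h, h_min=0.0, h_max=6.0):
    """Map a streamline fragment's entropy to an opacity in [0, 1].

    Fragments in low-entropy regions become nearly transparent, so they
    no longer occlude the salient (high-entropy) features.
    """
    t = (h - h_min) / (h_max - h_min)     # normalize to the assumed range
    return min(1.0, max(0.0, t))          # clamp to a valid opacity
```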

Slide 20: Conclusion
Contributions: an information-theoretic visualization framework, information measurement for flow field visualization, and an information-aware streamline placement algorithm.
Future work: time-varying flow fields, and other types of visualization such as isosurfaces and direct volume rendering.

Slide 21: Acknowledgements
The authors would like to thank Torsten Möller, Carrie Stein, and the anonymous reviewers for their comments. This work was supported in part by NSF ITR Grant ACI-0325934, NSF RI Grant CNS-0403342, NSF CAREER Award CCF-0346883, and DOE SciDAC Grant DE-FC02-06ER25779. Questions?
