Comments
We consider in this topic a large class of related problems that deal with proximity of points in the plane. We will:
1. Define some proximity problems and see how they are related.
2. Study a classic algorithm for one of the problems.
3. Introduce triangulations.
4. Examine a data structure that seems to represent nearly everything we might like to know about proximity in a set of points in the plane.

Overview of the topic
  Subtopic                                        Source
  Collection of proximity problems                P5.1 - P5.3
  Closest pair, divide-and-conquer algorithm      P5.4
  Triangulations                                  P6.2
    Problem definitions                           "
    Greedy algorithm                              "
    Constrained triangulations                    "
    Triangulating polygons                        O1
  Voronoi diagrams                                P5.5
    Definition                                    "
    Properties                                    "
    Construction                                  "
  Delaunay triangulations                         O5.3
  Proximity problems and Voronoi diagrams         P5.6

Closest pair
CLOSEST PAIR
INSTANCE: Set S = {p_1, p_2, ..., p_N} of N points in the plane.
QUESTION: Determine the two points of S whose mutual distance is smallest.

Distance is the usual Euclidean distance:
  distance(p_i, p_j) = sqrt((x_i - x_j)^2 + (y_i - y_j)^2).
This problem is described as "one of the fundamental questions of computational geometry" in Preparata.

Brute force
A brute-force solution computes the distance for every pair of points and saves the smallest; this requires O(d N^2) time, where the factor d is the number of dimensions, i.e., the number of coordinates involved in each distance computation.

For d = 1, we can do better than O(N^2), as follows:
1. Sort the N points (which are simply numbers). O(N log N)
2. Scan the sorted sequence (x_1, x_2, ..., x_N), computing x_{i+1} - x_i for i = 1, 2, ..., N-1, and save the smallest difference. O(N)

This is optimal for d = 1. For d = 2, can we do better than O(2N^2) = O(N^2)?
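The two approaches above translate directly into code; a minimal sketch in Python (function names are illustrative, not from Preparata):

from math import dist, inf

def closest_pair_brute_force(points):
    """Brute force: O(N^2) pairs, one O(d) distance computation per pair."""
    best, best_pair = inf, None
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            d = dist(points[i], points[j])          # Euclidean distance
            if d < best:
                best, best_pair = d, (points[i], points[j])
    return best_pair

def closest_pair_1d(xs):
    """d = 1: sort in O(N log N), then one O(N) scan of adjacent differences."""
    xs = sorted(xs)
    i = min(range(len(xs) - 1), key=lambda k: xs[k + 1] - xs[k])
    return xs[i], xs[i + 1]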

All nearest neighbors
ALL NEAREST NEIGHBORS
INSTANCE: Set S = {p_1, p_2, ..., p_N} of N points in the plane.
QUESTION: Determine the "nearest neighbor" (point of minimum distance) for each point in S.

Nearest neighbor relation, 1
"Nearest neighbor" is a relation on a set S, as follows: point b is a nearest neighbor of point a, denoted a → b, if
  distance(a, b) = min { distance(a, c) : c ∈ S - {a} }    (a, b, c ∈ S).
The "nearest neighbor" relation is not symmetric, i.e., a → b does not imply b → a (though it could be true).
Preparata, p. 186 says: "Note also that a point is not the nearest neighbor of a unique point (i.e., '→' is not a function)."
Perhaps slightly clearer: "Note that a point is not necessarily the nearest neighbor of a unique point, i.e., '→' is not one-to-one, nor is it onto, as a point may have more than one nearest neighbor."

Nearest neighbor relation, 2
Footnote 2 on Preparata, p. 186: "Although a point can be the nearest neighbor of every other point, a point can have at most six nearest neighbors in two dimensions..."
I think that is backwards, and should be: "Although a point can have every other point as a nearest neighbor, a point can be the nearest neighbor of at most six other points in two dimensions..."
[Figure: point c has every other point as its nearest neighbor; at most six points can have point c as their nearest neighbor.]

Euclidean minimum spanning tree
EUCLIDEAN MINIMUM SPANNING TREE
INSTANCE: Set S = {p_1, p_2, ..., p_N} of N points in the plane.
QUESTION: Construct a tree of minimum total length whose vertices are the points in S.

A solution to this problem will be the N-1 pairs of points in S that comprise the edges of the tree.
The (more general) Minimum Spanning Tree (MST) problem is usually formulated as a problem in graph theory: given a graph G with N nodes and E weighted edges, find the subtree of G that includes every vertex and has minimum total edge weight.
In the Euclidean Minimum Spanning Tree (EMST) problem, the equivalent graph formulation has a complete graph G (i.e., every pair of vertices is joined by an edge), with the edge weights just the distances between the vertices.
Any algorithm that attacks EMST as a graph problem must take Ω(N^2) time, because an MST of a graph must contain a shortest edge, and finding the shortest edge of G with a graph approach requires examining all Θ(N^2) edges.
We seek a geometric algorithm for EMST that requires less than O(N^2) time.
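For concreteness, the quadratic graph-style approach the slide refers to can be sketched as Prim's algorithm run on the implicit complete graph; a minimal Python sketch (illustrative only; this is the baseline to beat, not the sought geometric algorithm):

from math import dist, inf

def emst_prim(points):
    """O(N^2) Prim's algorithm on the implicit complete graph: grow the tree
    one vertex at a time, tracking each outside vertex's cheapest connection."""
    n = len(points)
    in_tree = [False] * n
    cost = [inf] * n        # cheapest known edge from vertex i to the tree
    parent = [None] * n
    cost[0] = 0
    edges = []
    for _ in range(n):
        u = min((i for i in range(n) if not in_tree[i]), key=lambda i: cost[i])
        in_tree[u] = True
        if parent[u] is not None:
            edges.append((parent[u], u))
        for v in range(n):
            if not in_tree[v]:
                d = dist(points[u], points[v])
                if d < cost[v]:
                    cost[v], parent[v] = d, u
    return edges            # N-1 index pairs forming the EMST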

Triangulation
TRIANGULATION
INSTANCE: Set S = {p_1, p_2, ..., p_N} of N points in the plane.
QUESTION: Join the points in S with nonintersecting straight line segments so that every region internal to the convex hull of S is a triangle.

A triangulation for a set S is not necessarily unique. As a planar graph, a triangulation on N vertices has at most 3N - 6 edges.

Single-shot vs. search
The previous problems (CLOSEST PAIR, ALL NEAREST NEIGHBORS, EUCLIDEAN MINIMUM SPANNING TREE, and TRIANGULATION) have been single-shot. We now define two search-type proximity problems. Because these are search problems, repetitive mode is assumed, and thus preprocessing is allowed.

Nearest neighbor search
NEAREST NEIGHBOR SEARCH
INSTANCE: Set S = {p_1, p_2, ..., p_N} of N points in the plane.
QUESTION: Given a query point q, which point p ∈ S is a nearest neighbor of q?

k nearest neighbors
k-NEAREST NEIGHBORS
INSTANCE: Set S = {p_1, p_2, ..., p_N} of N points in the plane.
QUESTION: Given a query point q, determine the k points of S nearest to q.

This problem is equivalent to the previous one for k = 1. The figure gives the solution for k = 3.

Element uniqueness
Preparata defines a computational prototype as an archetypal problem, one which can act as a fundamental representative for a class of problems: for example, SATISFIABILITY for NP-complete problems, or SORTING for many problems in computational geometry. Another such problem is ELEMENT UNIQUENESS.

ELEMENT UNIQUENESS
INSTANCE: Set S = {x_1, x_2, ..., x_N} of N real numbers.
QUESTION: Are any two numbers x_i, x_j in S equal?

It is shown in the text (Preparata, p. 192), using the algebraic decision-tree model, that a lower bound on time for ELEMENT UNIQUENESS is in Ω(N log N).
Three problems (SORTING, ELEMENT UNIQUENESS, and EXTREME POINTS (Preparata, p. 99)) all have lower bounds on time in Ω(N log N). However, they are not easily transformable ("reducible") to each other. Preparata asks: do they have a common "ancestor" problem that can be transformed into all of them?
[Diagram: a hypothetical common ancestor "?" transformable in O(N) time into each of SORTING, ELEMENT UNIQUENESS, and EXTREME POINTS.]

Lower bounds
The proximity problems we have defined can be transformed into each other as follows (an arrow A --O(f(N))--> B means "transformable in time O(f(N)) in a way that proves the lower bound of B"; the individual transformations are given on the following slides):

  ELEMENT UNIQUENESS --O(N)--> CLOSEST PAIR --O(N)--> ALL NEAREST NEIGHBORS       [Ω(N log N)]
  SORTING --O(N)--> EUCLIDEAN MINIMUM SPANNING TREE                               [Ω(N log N)]
  SORTING --O(N)--> TRIANGULATION                                                 [Ω(N log N)]
  BINARY SEARCH --O(1)--> NEAREST NEIGHBOR SEARCH --O(1)--> k-NEAREST NEIGHBORS   [Ω(log N)]

Search problems
  BINARY SEARCH --O(1)--> NEAREST NEIGHBOR SEARCH
  BINARY SEARCH --O(1)--> k-NEAREST NEIGHBORS
Preparata defines BINARY SEARCH in a slightly unusual way, apparently to simplify the lower-bound proofs.

BINARY SEARCH
INSTANCE: Set S = {x_1, x_2, ..., x_N} of N real numbers and a query real number q. Assume that for 1 ≤ i, j ≤ N, i < j implies x_i < x_j (preprocessing).
QUESTION (usual): Find x_i such that x_i ≤ q < x_{i+1}.
QUESTION (Preparata): Find the x_i closest to q.
[Figure: a query q between x_{i-1}, x_i, x_{i+1} on a number line, showing that the usual answer and Preparata's answer may differ.]
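The two query variants sketch as follows (Python; bisect is the standard-library binary search, and the closest-element wrapper for Preparata's variant is illustrative):

from bisect import bisect_right

def binary_search_usual(xs, q):
    """Usual form: return i such that xs[i] <= q < xs[i+1] (xs sorted)."""
    return bisect_right(xs, q) - 1

def binary_search_preparata(xs, q):
    """Preparata's form: return the element of xs closest to q."""
    i = bisect_right(xs, q)
    candidates = xs[max(i - 1, 0):i + 1]    # the elements bracketing q
    return min(candidates, key=lambda x: abs(x - q))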

Search problems, 1
BINARY SEARCH has a lower bound in Ω(log N). Transform BINARY SEARCH to NEAREST NEIGHBOR SEARCH as follows:
1. Transform the instance of BINARY SEARCH, S = {x_1, x_2, ..., x_N} and q, to an instance of NEAREST NEIGHBOR SEARCH: S' = {(x_1, 0), (x_2, 0), ..., (x_N, 0)} and q' = (q, 0). O(N) time.
2. Solve NEAREST NEIGHBOR SEARCH for S' and q'; let (x_i, 0) be the solution.
3. Transform the solution point (x_i, 0) to the real number x_i, which is the solution to BINARY SEARCH. O(1) time.
⟹ NEAREST NEIGHBOR SEARCH has a lower bound in Ω(log N). Or does it?
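A minimal sketch of this reduction (Python; nearest_neighbor_search stands in for any solver of the 2-dimensional problem and is an assumed parameter, not a library routine):

def binary_search_via_nns(xs, q, nearest_neighbor_search):
    """Reduce BINARY SEARCH (Preparata's form) to NEAREST NEIGHBOR SEARCH."""
    points = [(x, 0.0) for x in xs]                      # step 1: lift onto the x-axis, O(N)
    px, _ = nearest_neighbor_search(points, (q, 0.0))    # step 2: solve the 2-D problem
    return px                                            # step 3: read off the answer, O(1)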

Search problems, 2
But there seems to be a problem with this proof, as given in Preparata, p. 193: the O(N) transformation dominates the Ω(log N) lower bound, voiding the result. We can get around that by considering the 1-dimensional version of NEAREST NEIGHBOR SEARCH, which has an instance identical to the instance of BINARY SEARCH. This resolution still has two difficulties:
1. It assumes that 2-dimensional NEAREST NEIGHBOR SEARCH has the same lower bound as the 1-dimensional version (this is probably easily proven).
2. The argument is tantamount to a tautology, as 1-dimensional NEAREST NEIGHBOR SEARCH is the same problem as Preparata's BINARY SEARCH.
Can the proof be modified to start from the usual form of the BINARY SEARCH problem?
We get a lower bound of Ω(log N) for k-NEAREST NEIGHBORS directly by letting k = 1, in which case the problem is the same as NEAREST NEIGHBOR SEARCH, and thus gets a lower bound in Ω(log N) by the same transformation.

Closest pair
ELEMENT UNIQUENESS has a lower bound in Ω(N log N). Transform ELEMENT UNIQUENESS to CLOSEST PAIR as follows:
1. Transform the instance of ELEMENT UNIQUENESS, S = {x_1, x_2, ..., x_N}, to an instance of CLOSEST PAIR: S' = {(x_1, 0), (x_2, 0), ..., (x_N, 0)}. O(N) time.
2. Solve CLOSEST PAIR for S'; let (x_i, 0) and (x_j, 0) be the solution (the two closest points).
3. Transform this into a solution to ELEMENT UNIQUENESS: if x_i = x_j, return TRUE, else return FALSE. O(1) time.
⟹ CLOSEST PAIR has a lower bound in Ω(N log N).
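A minimal sketch of the reduction (Python; closest_pair stands in for any CLOSEST PAIR solver and is an assumed parameter):

def element_uniqueness_via_closest_pair(xs, closest_pair):
    """ELEMENT UNIQUENESS reduces to CLOSEST PAIR on points along the x-axis."""
    points = [(x, 0.0) for x in xs]            # step 1: O(N)
    (xi, _), (xj, _) = closest_pair(points)    # step 2: the two closest points
    return xi == xj                            # step 3: duplicates iff distance is 0, O(1)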

All nearest neighbors
CLOSEST PAIR has a lower bound in Ω(N log N). Transform CLOSEST PAIR to ALL NEAREST NEIGHBORS as follows:
1. An instance of CLOSEST PAIR, S = {p_1, p_2, ..., p_N}, is already an instance of ALL NEAREST NEIGHBORS. No transformation needed.
2. Solve ALL NEAREST NEIGHBORS for S; let A = {(p_1, q_1), (p_2, q_2), ..., (p_N, q_N)} be the solution (q_i ∈ S, a nearest neighbor for each point in S).
3. Transform this into a solution to CLOSEST PAIR: for each pair (p_i, q_i) ∈ A, compute distance(p_i, q_i) and save the smallest. O(N) time.
⟹ ALL NEAREST NEIGHBORS has a lower bound in Ω(N log N).
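A minimal sketch (Python; all_nearest_neighbors stands in for any ALL NEAREST NEIGHBORS solver and is an assumed parameter):

from math import dist

def closest_pair_via_all_nn(points, all_nearest_neighbors):
    """CLOSEST PAIR via ALL NEAREST NEIGHBORS: the answer is the
    minimum-distance (point, nearest-neighbor) pair."""
    pairs = all_nearest_neighbors(points)                    # step 2: [(p_i, q_i), ...]
    return min(pairs, key=lambda pq: dist(pq[0], pq[1]))     # step 3: O(N) scan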

Euclidean minimum spanning tree
SORTING has a lower bound in Ω(N log N). Transform SORTING to EUCLIDEAN MINIMUM SPANNING TREE (EMST) as follows:
1. Transform the instance of SORTING, S = {x_1, x_2, ..., x_N}, to an instance of EMST: S' = {(x_1, 0), (x_2, 0), ..., (x_N, 0)}. O(N) time.
2. Solve EMST for S'. A set of points along the x-axis has a unique EMST, in which there is an edge ((x_i, 0), (x_j, 0)) iff x_i and x_j are consecutive in sorted order. Let T be the solution, the N-1 edges of the EMST in no particular order.
3. Transform this into a solution to SORTING: initialize an array A of N entries with A.first = TRUE and A.next = 0. For each edge (x_i, x_j) in T, oriented so that x_i < x_j, set A.next[i] = j and A.first[j] = FALSE. Scan A to find the single entry with A.first = TRUE; from there, follow the A.next indices to read off the sorted order. O(N) time.
⟹ EMST has a lower bound in Ω(N log N).
Step 3 is not given in Preparata; it is simply described as "a simple exercise".
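Since Preparata leaves step 3 as an exercise, here is a minimal sketch of it (Python; the EMST edges are assumed to be reported as index pairs into the original input):

def sorted_order_from_emst_edges(xs, edges):
    """Recover the sorted order of xs from the N-1 EMST edges in O(N) time.
    edges: pairs of indices (i, j) into xs, in no particular order."""
    n = len(xs)
    nxt = [None] * n
    first = [True] * n
    for i, j in edges:
        if xs[i] > xs[j]:            # orient each edge left-to-right, O(1) per edge
            i, j = j, i
        nxt[i] = j
        first[j] = False
    cur = next(k for k in range(n) if first[k])   # the smallest element
    order = []
    while cur is not None:
        order.append(xs[cur])
        cur = nxt[cur]
    return order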

Triangulation
SORTING has a lower bound in Ω(N log N). Transform SORTING to TRIANGULATION as follows:
1. Transform the instance of SORTING, S = {x_1, x_2, ..., x_N}, to an instance of TRIANGULATION: S' = {(x_1, 0), (x_2, 0), ..., (x_N, 0)} ∪ {(0, -1)}. O(N) time.
2. Solve TRIANGULATION for S'. This set of points has a unique triangulation (shown in the figure): each point on the x-axis is joined to its neighbors on the axis and to the apex (0, -1). Let T be the solution, the edges of the triangulation in no particular order.
3. Transform this into a solution to SORTING in a manner similar to the procedure for EMST, ignoring edges incident to the point (0, -1). O(N) time.
⟹ TRIANGULATION has a lower bound in Ω(N log N).
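A minimal sketch of step 3 (Python; it mirrors the EMST procedure, assuming the triangulation edges are reported as index pairs and that the apex (0, -1) carries a known index):

def sorted_order_from_triangulation(xs, edges, apex_index):
    """Recover the sorted order of xs from the unique triangulation's edges, O(N).
    Edges incident to the apex are discarded; the remaining edges are exactly
    the consecutive-pair edges along the x-axis, as in the EMST reduction."""
    n = len(xs)
    nxt = [None] * n
    first = [True] * n
    for i, j in edges:
        if i == apex_index or j == apex_index:   # ignore apex edges
            continue
        if xs[i] > xs[j]:                        # orient left-to-right
            i, j = j, i
        nxt[i] = j
        first[j] = False
    cur = next(k for k in range(n) if first[k])
    order = []
    while cur is not None:
        order.append(xs[cur])
        cur = nxt[cur]
    return order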