CSE 421 Algorithms Richard Anderson Lecture 4. What does it mean for an algorithm to be efficient?


CSE 421 Algorithms Richard Anderson Lecture 4

What does it mean for an algorithm to be efficient?

Definitions of efficiency
–Fast in practice
–Qualitatively better worst case performance than a brute force algorithm

Polynomial time efficiency
An algorithm is efficient if it has a polynomial run time
Run time as a function of problem size
–Run time: count the number of instructions executed on an underlying model of computation
–T(n): maximum run time over all problems of size at most n

Polynomial Time Algorithms with polynomial run time have the property that increasing the problem size by a constant factor increases the run time by at most a constant factor (depending on the algorithm)

Why Polynomial Time?
Generally, polynomial time seems to capture the algorithms that are efficient in practice
The class of polynomial time algorithms has many good mathematical properties

Polynomial vs. Exponential Complexity
Suppose you have an algorithm which takes n! steps on a problem of size n
If the algorithm takes one second for a problem of size 10, estimate the run time for the following problem sizes:
10: 1 second
12: 2 minutes
14: 6 hours
16: 2 months
18: 50 years
20: 20K years
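The table's estimates can be reproduced with a short calculation. This is a sketch: the only assumption, taken from the slide, is that the algorithm performs 10! steps in one second.

```python
import math

# Slide's assumption: 10! steps take 1 second, so the machine runs
# at 10! steps per second.
rate = math.factorial(10)  # steps per second

def estimated_seconds(n):
    # time for a problem of size n = n! steps / (steps per second)
    return math.factorial(n) / rate

for n in [10, 12, 14, 16, 18, 20]:
    print(n, estimated_seconds(n))
```

Running this gives 132 seconds (about 2 minutes) for n = 12, about 24,000 seconds (6+ hours) for n = 14, and so on, matching the slide's rounded figures.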

Ignoring constant factors
Express run time as O(f(n))
Emphasize algorithms with slower growth rates
Fundamental idea in the study of algorithms
Basis of the Tarjan/Hopcroft Turing Award

Why ignore constant factors?
Constant factors are arbitrary
–Depend on the implementation
–Depend on the details of the model
Determining the constant factors is tedious and provides little insight

Why emphasize growth rates?
The algorithm with the lower growth rate will be faster for all but a finite number of cases
Performance is most important for larger problem sizes
As memory prices continue to fall, bigger problem sizes become feasible
Improving the growth rate often requires new techniques

Formalizing growth rates
T(n) is O(f(n))  [T : Z+ → R+]
–If n is sufficiently large, T(n) is bounded by a constant multiple of f(n)
–There exist c, n0 such that for n > n0, T(n) < c f(n)
T(n) is O(f(n)) will be written as T(n) = O(f(n))
–Be careful with this notation

Prove 3n^2 + 5n + 20 is O(n^2)
T(n) is O(f(n)) if there exist c, n0 such that for n > n0, T(n) < c f(n)
Choose c = 6, n0 = 5
For n > 5 we have 5n < n^2 and 20 < n^2, so 3n^2 + 5n + 20 < 5n^2 < 6n^2
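The chosen constants can also be checked numerically. This is a quick sanity check over a finite range, not a substitute for the proof:

```python
# T(n) = 3n^2 + 5n + 20; the slide's witnesses are c = 6, n0 = 5.
def T(n):
    return 3 * n * n + 5 * n + 20

c, n0 = 6, 5

# Verify T(n) <= c * n^2 for every n in a large range above n0.
assert all(T(n) <= c * n * n for n in range(n0 + 1, 10_000))
print("bound holds on the tested range")
```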

Order the following functions in increasing order by their growth rate
a) n log^4 n
b) 2^n n
c) 2^(n/100)
d) 1000n + log^8 n
e) n^100
f) 3^n
g) 1000 log^10 n
h) n^(1/2)

Lower bounds
T(n) is Ω(f(n))
–T(n) is at least a constant multiple of f(n)
–There exist n0 and ε > 0 such that T(n) > ε f(n) for all n > n0
Warning: definitions of Ω vary
T(n) is Θ(f(n)) if T(n) is O(f(n)) and T(n) is Ω(f(n))

Useful Theorems
If lim (f(n) / g(n)) = c for c > 0, then f(n) = Θ(g(n))
If f(n) is O(g(n)) and g(n) is O(h(n)), then f(n) is O(h(n))
If f(n) is O(h(n)) and g(n) is O(h(n)), then f(n) + g(n) is O(h(n))

Ordering growth rates
For b > 1 and x > 0
–log_b n is O(n^x)
For r > 1 and d > 0
–n^d is O(r^n)

Graph Theory
G = (V, E)
–V: vertices
–E: edges
Undirected graphs
–Edges are sets of two vertices {u, v}
Directed graphs
–Edges are ordered pairs (u, v)
Many other flavors
–Edge / vertex weights
–Parallel edges
–Self loops

Definitions
Path: v1, v2, …, vk, with (vi, vi+1) in E
–Simple Path
–Cycle
–Simple Cycle
Distance
Connectivity
–Undirected
–Directed (strong connectivity)
Trees
–Rooted
–Unrooted

Graph search
Find a path from s to t

S = {s}
While there exists (u, v) in E with u in S and v not in S
    Pred[v] = u
    Add v to S
    if (v = t) then path found
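The generic search above can be sketched in Python. This version explores in BFS order; the adjacency-list dict representation and the name `find_path` are assumptions for illustration, not from the slides.

```python
from collections import deque

def find_path(graph, s, t):
    # pred plays the role of both S and Pred[]: a vertex is "in S"
    # exactly when it has a pred entry.
    pred = {s: None}
    queue = deque([s])
    while queue:
        u = queue.popleft()
        if u == t:
            # path found: walk the Pred links back from t to s
            path = []
            while u is not None:
                path.append(u)
                u = pred[u]
            return path[::-1]
        for v in graph.get(u, []):
            if v not in pred:   # edge (u, v) with u in S, v not in S
                pred[v] = u
                queue.append(v)
    return None                 # t is not reachable from s

g = {'s': ['a', 'b'], 'a': ['t'], 'b': ['a'], 't': []}
print(find_path(g, 's', 't'))  # ['s', 'a', 't']
```

Because the frontier is a queue, this particular instantiation returns a shortest path; a stack would give a DFS-flavored search instead.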

Breadth first search
Explore vertices in layers
–s in layer 1
–Neighbors of s in layer 2
–Neighbors of layer 2 in layer 3...

Key observation All edges go between vertices on the same layer or adjacent layers

Bipartite
A graph G = (V, E) is bipartite if V can be partitioned into V1, V2 such that all edges go between V1 and V2
A graph is bipartite if it can be two colored
Two color this graph

Testing Bipartiteness If a graph contains an odd cycle, it is not bipartite

Algorithm
Run BFS
Color odd layers red, even layers blue
If there are no edges between vertices in the same layer, the graph is bipartite
If there is an edge between two vertices of the same layer, then there is an odd cycle, and the graph is not bipartite
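The layer-coloring test can be sketched as follows. Assumptions for illustration: an undirected adjacency-list dict where every vertex appears as a key, and the hypothetical name `is_bipartite`; layer parity stands in for the red/blue coloring.

```python
from collections import deque

def is_bipartite(graph):
    layer = {}
    for s in graph:                      # restart BFS per component
        if s in layer:
            continue
        layer[s] = 0
        queue = deque([s])
        while queue:
            u = queue.popleft()
            for v in graph[u]:
                if v not in layer:
                    layer[v] = layer[u] + 1
                    queue.append(v)
                elif layer[v] == layer[u]:
                    return False         # intra-layer edge => odd cycle
    return True

even_cycle = {1: [2, 4], 2: [1, 3], 3: [2, 4], 4: [3, 1]}
odd_cycle = {1: [2, 3], 2: [1, 3], 3: [1, 2]}
print(is_bipartite(even_cycle), is_bipartite(odd_cycle))  # True False
```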

Bipartite
A graph is bipartite if its vertices can be partitioned into two sets V1 and V2 such that all edges go between V1 and V2
A graph is bipartite if it can be two colored

Theorem: A graph is bipartite if and only if it has no odd cycles

Lemma 1 If a graph contains an odd cycle, it is not bipartite

Lemma 2
If a BFS tree has an intra-level edge, then the graph has an odd length cycle
Intra-level edge: both endpoints are in the same level

Lemma 3 If a graph has no odd length cycles, then it is bipartite

Connected Components Undirected Graphs

Computing Connected Components in O(n+m) time
A search algorithm from a vertex v can find all vertices in v's component
While there is an unvisited vertex v, search from v to find a new component
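A sketch of this component-finding loop, using DFS as the search (the slides allow any search; the adjacency-list dict and the function name are assumptions for illustration):

```python
def connected_components(graph):
    visited = set()
    components = []
    for v in graph:               # "while there is an unvisited vertex v"
        if v in visited:
            continue
        comp, stack = [], [v]     # search from v to collect its component
        visited.add(v)
        while stack:
            u = stack.pop()
            comp.append(u)
            for w in graph[u]:
                if w not in visited:
                    visited.add(w)
                    stack.append(w)
        components.append(sorted(comp))
    return components

g = {1: [2], 2: [1], 3: [4], 4: [3], 5: []}
print(connected_components(g))  # [[1, 2], [3, 4], [5]]
```

Each vertex and edge is touched a constant number of times, giving the O(n+m) bound from the slide.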

Directed Graphs
A Strongly Connected Component is a maximal subset of the vertices with paths between every pair of vertices in the subset.

Identify the Strongly Connected Components

Strongly connected components can be found in O(n+m) time
–But it's tricky!
–Simpler problem: given a vertex v, compute the vertices in v's SCC in O(n+m) time
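One standard solution to the simpler problem (the slide does not fix a method, so this is one approach): v's SCC is the set of vertices reachable from v that also reach v, i.e. the intersection of a forward search and a search in the reversed graph. Two searches, hence O(n+m). A sketch, with the adjacency-list dict representation assumed:

```python
def reachable(graph, s):
    # plain DFS: all vertices reachable from s
    seen, stack = {s}, [s]
    while stack:
        u = stack.pop()
        for w in graph.get(u, []):
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return seen

def scc_of(graph, v):
    # build the reversed graph, then intersect the two reachable sets
    reverse = {}
    for u, nbrs in graph.items():
        for w in nbrs:
            reverse.setdefault(w, []).append(u)
    return reachable(graph, v) & reachable(reverse, v)

g = {1: [2], 2: [3], 3: [1, 4], 4: []}
print(sorted(scc_of(g, 1)))  # [1, 2, 3]
```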

Topological Sort Given a set of tasks with precedence constraints, find a linear order of the tasks

Find a topological order for the following graph
(graph on vertices A–L)

If a graph has a cycle, there is no topological sort
–Consider the first vertex on the cycle in the topological order
–It must have an incoming edge from a later vertex on the cycle, a contradiction

Lemma: If a graph is acyclic, it has a vertex with in-degree 0
Proof:
–Pick a vertex v1; if it has in-degree 0 then done
–If not, let (v2, v1) be an edge; if v2 has in-degree 0 then done
–If not, let (v3, v2) be an edge...
–If this process continues for more than n steps, we have a repeated vertex, so we have a cycle

Topological Sort Algorithm
While there exists a vertex v with in-degree 0
    Output vertex v
    Delete the vertex v and all outgoing edges

Details for O(n+m) implementation
–Maintain a list of vertices of in-degree 0
–Each vertex keeps track of its in-degree
–Update in-degrees and the list when edges are removed
–m edge removals at O(1) cost each
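The bookkeeping above is exactly Kahn's algorithm; edge "deletion" becomes an O(1) in-degree decrement. A sketch, assuming an adjacency-list dict in which every vertex appears as a key:

```python
from collections import deque

def topological_sort(graph):
    # compute initial in-degrees
    indeg = {v: 0 for v in graph}
    for u in graph:
        for w in graph[u]:
            indeg[w] += 1
    # the list of in-degree-0 vertices
    ready = deque(v for v in graph if indeg[v] == 0)
    order = []
    while ready:
        u = ready.popleft()
        order.append(u)          # output vertex u
        for w in graph[u]:       # "delete" u's outgoing edges
            indeg[w] -= 1
            if indeg[w] == 0:
                ready.append(w)
    if len(order) < len(graph):  # a cycle blocks the remaining vertices
        return None
    return order

g = {'A': ['B', 'C'], 'B': ['D'], 'C': ['D'], 'D': []}
print(topological_sort(g))  # ['A', 'B', 'C', 'D']
```

Returning None when some vertices are never output doubles as a cycle detector, matching the earlier slide: a cyclic graph has no topological sort.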