CS 46101 Section 600 CS 56101 Section 002 Dr. Angela Guercio Spring 2010.

Use a recurrence to characterize the running time of a divide-and-conquer algorithm. Solving the recurrence gives us the asymptotic running time. A recurrence is a function that is defined in terms of one or more base cases, and itself, with smaller arguments.

Examples
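The examples on this slide are not in the transcript; two standard recurrences of this kind, consistent with the rest of the lecture, are:

    T(n) = 2T(n/2) + Θ(n)   (merge sort, and the maximum-subarray algorithm below), with solution T(n) = Θ(n lg n)
    T(n) = T(n/2) + Θ(1)    (binary search), with solution T(n) = Θ(lg n)

In each case the base case is T(n) = Θ(1) for sufficiently small n.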

Many technical issues: floors and ceilings (omitted), boundary conditions (omitted), and exact vs. asymptotic functions. In algorithm analysis, we usually express both the recurrence and its solution using asymptotic notation. Example: T(n) = 2T(n/2) + Θ(n) with solution T(n) = Θ(n lg n).

The boundary conditions are usually expressed as T(n) = Θ(1) for sufficiently small n. When we desire an exact, rather than an asymptotic, solution, we need to deal with boundary conditions. In practice, we just use asymptotic notation most of the time, and we ignore the boundary conditions.

Input: An array A[1..n] of numbers. Output: Indices i and j such that A[i..j] has the greatest sum of any nonempty, contiguous subarray of A, along with the sum of the values in A[i..j]. Scenario: You have the prices that a stock traded at over a period of n consecutive days. When should you have bought the stock? When should you have sold it? Even though it's in retrospect, you can yell at your stockbroker for not recommending these buy and sell dates.

To convert to a maximum-subarray problem, let A[i] = (price after day i) - (price after day i - 1). Then the nonempty, contiguous subarray with the greatest sum brackets the days that you should have held the stock. If the maximum subarray is A[i..j], then you should have bought just before day i (i.e., just after day i - 1) and sold just after day j. Why do we need to find the maximum subarray? Why not just buy low, sell high?

The lowest price might occur after the highest price. But wouldn't the optimal strategy involve buying at the lowest price or selling at the highest price? Not necessarily: the maximum profit is $3 per share, from buying after day 2 and selling after day 3, yet the lowest price occurs after day 4 and the highest occurs after day 1.
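The slide's price chart is not reproduced in the transcript; one price sequence consistent with the numbers above (purely illustrative) is:

    Day:          0    1    2    3    4
    Price:       10   11    7   10    6
    Change A[i]:  -    1   -4    3   -4

Here the maximum subarray is A[3..3] with sum 3: buy just after day 2 and sell just after day 3 for a $3 profit per share, even though the lowest price ($6) comes after day 4 and the highest ($11) after day 1.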

Can solve by brute force: check all Θ(n²) subarrays. Can organize the computation so that each subarray A[i..j] takes O(1) time, given that you've computed the sum of A[i..j - 1], so that the brute-force solution takes Θ(n²) time. Solving by divide-and-conquer: use divide-and-conquer to solve in O(n lg n) time. Subproblem: find a maximum subarray of A[low..high]. In the original call, low = 1 and high = n.
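A minimal brute-force sketch in Python (0-indexed; the function and variable names are mine, not from the slides), showing how a running sum makes each subarray cost O(1) beyond the previous one:

    def max_subarray_brute_force(A):
        # Check all Theta(n^2) subarrays; extend a running sum so the
        # sum of A[i..j] costs O(1) given the sum of A[i..j-1].
        n = len(A)
        best_i, best_j, best_sum = 0, 0, A[0]
        for i in range(n):
            running = 0
            for j in range(i, n):
                running += A[j]  # sum of A[i..j], computed incrementally
                if running > best_sum:
                    best_i, best_j, best_sum = i, j, running
        return best_i, best_j, best_sum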

Divide the subarray into two subarrays of as equal size as possible: find the midpoint mid of the subarray, and consider the subarrays A[low..mid] and A[mid + 1..high]. Conquer by finding maximum subarrays of A[low..mid] and A[mid + 1..high]. Combine by finding a maximum subarray that crosses the midpoint, and using the best solution out of the three (the subarray crossing the midpoint and the two solutions found in the conquer step). This strategy works because any subarray must either lie entirely on one side of the midpoint or cross the midpoint.

Finding the maximum subarray that crosses the midpoint: this is not a smaller instance of the original problem, because it has the added restriction that the subarray must cross the midpoint. Again, we could use brute force: if the size of A[low..high] is n, we would have n/2 choices for the left endpoint and n/2 choices for the right endpoint, so we would have Θ(n²) combinations altogether. But we can solve this subproblem in linear time.

Any subarray crossing the midpoint A[mid] is made of two subarrays A[i..mid] and A[mid + 1..j], where low ≤ i ≤ mid and mid < j ≤ high. Find maximum subarrays of the form A[i..mid] and A[mid + 1..j] and then combine them. Procedure to take array A and indices low, mid, high and return a tuple giving the indices of a maximum subarray that crosses the midpoint, along with the sum in this maximum subarray:
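The procedure's pseudocode is not in the transcript; a sketch in Python of the linear-time approach just described (0-indexed, inclusive bounds, names are mine):

    def find_max_crossing_subarray(A, low, mid, high):
        # Best subarray of the form A[i..mid]: scan leftward from mid.
        left_sum, total, max_left = float('-inf'), 0, mid
        for i in range(mid, low - 1, -1):
            total += A[i]
            if total > left_sum:
                left_sum, max_left = total, i
        # Best subarray of the form A[mid+1..j]: scan rightward from mid+1.
        right_sum, total, max_right = float('-inf'), 0, mid + 1
        for j in range(mid + 1, high + 1):
            total += A[j]
            if total > right_sum:
                right_sum, max_right = total, j
        # The crossing subarray A[max_left..max_right] joins the two halves.
        return max_left, max_right, left_sum + right_sum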

Divide by computing mid. Conquer by the two recursive calls to FIND-MAXIMUM-SUBARRAY. Combine by calling FIND-MAX-CROSSING-SUBARRAY and then determining which of the three results gives the maximum sum. The base case is when the subarray has only 1 element.
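Again, the slide's pseudocode is missing from the transcript; a corresponding sketch, reusing the crossing-subarray helper above:

    def find_maximum_subarray(A, low, high):
        if high == low:
            # Base case: the subarray has only one element.
            return low, high, A[low]
        mid = (low + high) // 2                                # Divide
        left = find_maximum_subarray(A, low, mid)              # Conquer
        right = find_maximum_subarray(A, mid + 1, high)        # Conquer
        cross = find_max_crossing_subarray(A, low, mid, high)  # Combine
        # Return whichever of the three candidates has the largest sum.
        return max(left, right, cross, key=lambda t: t[2])

The top-level call is find_maximum_subarray(A, 0, len(A) - 1).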

Simplifying assumption: the original problem size is a power of 2, so that all subproblem sizes are integers. Let T(n) denote the running time of FIND-MAXIMUM-SUBARRAY on a subarray of n elements. Base case: occurs when high equals low, so that n = 1. The procedure just returns, so T(n) = Θ(1).

Recursive case: occurs when n > 1. Dividing takes Θ(1) time. Conquering solves two subproblems, each on a subarray of n/2 elements, and takes T(n/2) time per subproblem, so 2T(n/2) time for conquering. Combining consists of calling FIND-MAX-CROSSING-SUBARRAY, which takes Θ(n) time, and a constant number of constant-time tests, so Θ(n) + Θ(1) time for combining.
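Putting the pieces together (spelled out, using the costs above):

    T(n) = Θ(1)                                if n = 1,
    T(n) = 2T(n/2) + Θ(1) + Θ(n) + Θ(1)
         = 2T(n/2) + Θ(n)                      if n > 1,

the same recurrence as merge sort, with solution T(n) = Θ(n lg n).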

Substitution method: guess the solution, then use induction to find the constants and show that the solution works. Example:

Guess: T(n) = n lg n + n. Induction: Basis: n = 1. Then n lg n + n = 1 · lg 1 + 1 = 1 = T(1). Inductive step: the inductive hypothesis is that T(k) = k lg k + k for all k < n. We'll use this inductive hypothesis for T(n/2).
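The algebra of the inductive step is not shown in the transcript; assuming the example recurrence is T(n) = 2T(n/2) + n with T(1) = 1 (which is consistent with the guess and the basis above), it goes:

    T(n) = 2T(n/2) + n
         = 2((n/2) lg(n/2) + n/2) + n    (by the inductive hypothesis)
         = n lg(n/2) + n + n
         = n lg n - n + n + n
         = n lg n + n,

which matches the guess.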

Generally, we use asymptotic notation: we would write T(n) = 2T(n/2) + Θ(n). We assume T(n) = O(1) for sufficiently small n. We express the solution by asymptotic notation: T(n) = Θ(n lg n). We don't worry about boundary cases, nor do we show base cases in the substitution proof. T(n) is always constant for any constant n. Since we are ultimately interested in an asymptotic solution to a recurrence, it will always be possible to choose base cases that work. When we want an asymptotic solution to a recurrence, we don't worry about the base cases in our proofs. When we want an exact solution, then we have to deal with base cases.

For the substitution method: name the constant in the additive term. Show the upper (O) and lower (Ω) bounds separately. Might need to use different constants for each. Example: T(n) = 2T(n/2) + Θ(n). If we want to show an upper bound for T(n) = 2T(n/2) + Θ(n), we write T(n) ≤ 2T(n/2) + cn for some positive constant c.

Upper bound: Guess: T(n) ≤ dn lg n for some positive constant d. We are given c in the recurrence, and we get to choose d as any positive constant. It's OK for d to depend on c.
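The substitution itself is omitted in the transcript; using the inductive hypothesis T(n/2) ≤ d(n/2) lg(n/2), it works out as:

    T(n) ≤ 2T(n/2) + cn
         ≤ 2(d(n/2) lg(n/2)) + cn
         = dn lg(n/2) + cn
         = dn lg n - dn + cn
         ≤ dn lg n,    as long as d ≥ c (so that -dn + cn ≤ 0),

which gives T(n) = O(n lg n).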

Lower bound: write T(n) ≥ 2T(n/2) + cn for some positive constant c. Guess: T(n) ≥ dn lg n for some positive constant d. Therefore, T(n) = Θ(n lg n).
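The algebra mirrors the upper bound (again omitted on the slide):

    T(n) ≥ 2T(n/2) + cn
         ≥ 2(d(n/2) lg(n/2)) + cn
         = dn lg n - dn + cn
         ≥ dn lg n,    as long as d ≤ c,

so T(n) = Ω(n lg n), which together with the upper bound justifies T(n) = Θ(n lg n).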

More divide and conquer: Strassen's algorithm for matrix multiplication. Reading: read Chapter 4.