CS 206 Introduction to Computer Science II 01 / 30 / 2009 Instructor: Michael Eckmann

Michael Eckmann - Skidmore College - CS 206 - Spring 2009 Today's Topics – Questions/comments – Program 1 is due by 11:59 p.m. on Tuesday, February 3rd – Start algorithm analysis

Michael Eckmann - Skidmore College - CS 206 - Spring 2009 HW: Read the handout on Algorithm Analysis and Chapter 5.

Michael Eckmann - Skidmore College - CS 206 - Spring 2009 Algorithm Analysis An algorithm is a specific set of instructions for solving a problem. The amount of time an algorithm takes to finish is often proportional to the size of the input –sorting 1 million items vs. sorting 10 items –searching a list of 2 billion items vs. searching a list of 3 items Problem vs. algorithm --- a problem is not the same as an algorithm. Example: Sorting is a problem. An algorithm is a specific recipe for solving a problem. –bubbleSort, insertionSort, etc. are different algorithms for sorting. So, when we're analyzing running time, we're analyzing the running time of an algorithm, not of a problem.

Michael Eckmann - Skidmore College - CS 206 - Spring 2009 Algorithm Analysis Algorithms are often analyzed for –the amount of time they take to run and/or –the amount of space used while running

Michael Eckmann - Skidmore College - CS 206 - Spring 2009 Algorithm Analysis Some common functions (in increasing order of growth) used in analysis are –constant functions (e.g. f(n) = 10) –logarithmic functions (e.g. f(n) = log(20n)) –log squared (e.g. f(n) = log²(7n)) –linear functions (e.g. f(n) = 3n – 9) –N log N (e.g. f(n) = 2n log n) –quadratic functions (e.g. f(n) = 5n² + 3n) –cubic functions (e.g. f(n) = 3n³ + n² + (4/7)n) –exponential functions (e.g. f(n) = 5ⁿ) –factorial functions (e.g. f(n) = n!)
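
To make a few of these growth rates concrete, here is a small illustrative sketch, written in Java since that is the course language (my own illustration, not from the lecture), showing loop shapes whose running times are roughly constant, linear, quadratic, and logarithmic in n:

    // O(1): constant work, independent of n
    public static int constantWork(int n) {
        return 2 * n + 1;
    }

    // O(n): one pass over the input size
    public static int linearWork(int n) {
        int sum = 0;
        for (int i = 0; i < n; i++) sum += i;
        return sum;
    }

    // O(n^2): two nested loops, each running n times
    public static int quadraticWork(int n) {
        int count = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                count++;
        return count;
    }

    // O(log n): the remaining problem size is halved each iteration
    public static int logarithmicWork(int n) {
        int steps = 0;
        for (int i = n; i > 1; i /= 2) steps++;
        return steps;
    }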

Michael Eckmann - Skidmore College - CS 206 - Spring 2009 Algorithm Analysis The dominant term is what gives a function its name among –cubic, quadratic, logarithmic, etc. It's more complex than this, but the dominant term can generally be picked out like this: –if you determine the running time of an algorithm to be, say, f(n) = log² n + 4n³, its dominant term is 4n³, so ignoring the constant multiplier, we have n³ –we say that f(n) is O(n³) (pronounced big-Oh en cubed) An example where it's a bit harder to determine: –f(n) = 3n log(n!) + (n² + 3) log n is O(n² log n)
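
As a quick worked check of that last, harder example (my own derivation, not from the slides), use the fact that log(n!) is at most n log n:

    3n log(n!) + (n² + 3) log n
      ≤ 3n(n log n) + (n² + 3) log n
      = 3n² log n + n² log n + 3 log n

which is O(n² log n), since the 3n² log n term dominates the others.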

Michael Eckmann - Skidmore College - CS 206 - Spring 2009 Algorithm Analysis Graphs of functions: to get a more intuitive feel for how these functions grow, take a look at the handout with graphs.

Michael Eckmann - Skidmore College - CS 206 - Spring 2009 Algorithm Analysis Let's look at the tables with examples of actual times for certain running times given large inputs. They show that the time complexity of an algorithm matters much more than processor speed (for large enough inputs), even though processor speeds have been increasing exponentially.

Michael Eckmann - Skidmore College - CS 206 - Spring 2009 Algorithm Analysis The growth rate of a function is different from simply saying that one function is less than another –e.g. x is greater than x³ for many initial values, but once x increases above some value, x³ will always be bigger The constant multiplying the dominant term is generally ignored (except for small amounts of input) Big O notation ignores the constant multipliers of the dominant term, and we say a phrase like: –linear search is big-Oh en –when we mean that the linear search algorithm's time complexity grows linearly (based on n, the number of items in the search space).
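
For instance, here is a standard linear search written as a Java sketch (the method name is mine, not from the lecture or textbook); counting the element comparisons shows why its running time grows linearly with n:

    // Linear search: in the worst case every one of the n elements is examined,
    // so the number of comparisons grows linearly with n, i.e. it is O(n).
    public static int linearSearch(int[] a, int target) {
        for (int i = 0; i < a.length; i++) {
            if (a[i] == target) {
                return i;    // best case: found at the front after 1 comparison
            }
        }
        return -1;           // worst case: n comparisons, target not present
    }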

Michael Eckmann - Skidmore College - CS 206 - Spring 2009 Algorithm Analysis When examining an algorithm, we usually count how many times a certain operation (or group of operations) is performed. See the handout for reasonable choices of what operations we would count in different problems. This will lead us to determining the time complexity of the algorithm. We can consider best-case, worst-case and average-case scenarios.

Michael Eckmann - Skidmore College - CS 206 - Spring 2009 Algorithm Analysis Let's consider one problem and three ways to solve it (using three different algorithms), and we'll analyze the running time of each. The maximum contiguous subsequence problem: –Given an integer sequence A1, A2, ..., AN, find (and identify the subsequence corresponding to) the maximum value of the sum Ai + Ai+1 + ... + Aj over all pairs i ≤ j. The maximum contiguous subsequence sum is zero if all the values are negative. Therefore, an empty sequence may be maximum. Example input: { -2, 11, -4, 13, -5, 2 } the answer is 20 and the sequence is { 11, -4, 13 } Another: { 1, -3, 4, -2, -1, 6 } the answer is 7, the sequence is { 4, -2, -1, 6 }

Michael Eckmann - Skidmore College - CS 206 - Spring 2009 Algorithm Analysis The simplest is an exhaustive search (a brute-force algorithm). –that is, simply consider every possible subsequence and compute its sum; keep doing this and save the greatest –so, we set maxSum = 0 (because it is at least this big), start at the first element, consider every subsequence that begins with the first element and sum them up... if any has a sum larger than maxSum, save it... –then start at the second element and do the same... and so on, until we start at the last element Advantages of this: not complex, easy to understand Disadvantages of this: slow Let's examine the algorithm. –decide what is a good thing to count –count that operation (in terms of the input size)

Michael Eckmann - Skidmore College - CS 206 - Spring 2009 Algorithm Analysis Let's consider this inefficient maximum contiguous subsequence algorithm from page 171 in our text.
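
The textbook's listing isn't reproduced in these slides, but a brute-force version along the lines just described might look like the following Java sketch (my own reconstruction, with a made-up method name, not the code on page 171). The three nested loops over n elements are what make it cubic, O(n^3):

    // Brute force: try every (i, j) pair and recompute each sum from scratch.
    public static int maxSubSumCubic(int[] a) {
        int maxSum = 0;                        // the empty subsequence counts as 0
        for (int i = 0; i < a.length; i++) {
            for (int j = i; j < a.length; j++) {
                int thisSum = 0;
                for (int k = i; k <= j; k++) {
                    thisSum += a[k];           // recompute the sum of A_i..A_j
                }
                if (thisSum > maxSum) {
                    maxSum = thisSum;
                }
            }
        }
        return maxSum;
    }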

Michael Eckmann - Skidmore College - CS 206 - Spring 2009 Algorithm Analysis The exhaustive search does many unnecessary computations. Notice that Ai + ... + Aj = Aj + (Ai + ... + Aj-1). That is, if we already know the sum of Ai through Aj-1, then the sum of Ai through Aj is found just by adding in the jth element. Knowing that, the problem can be solved more efficiently by the algorithm that we are about to analyze. –We won't need to keep adding up each subsequence from scratch

Michael Eckmann - Skidmore College - CS 206 - Spring 2009 Algorithm Analysis The second algorithm that we'll analyze uses the improvement just mentioned, and the running time improves (goes down).
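
As a hedged sketch of what such an improved version might look like (again my own reconstruction, not the textbook's exact listing): for each starting index i the running sum is extended one element at a time instead of being recomputed, leaving two nested loops and a quadratic, O(n^2), running time:

    // Improved version: extend the running sum by one element at a time.
    public static int maxSubSumQuadratic(int[] a) {
        int maxSum = 0;
        for (int i = 0; i < a.length; i++) {
            int thisSum = 0;
            for (int j = i; j < a.length; j++) {
                thisSum += a[j];               // sum of A_i..A_j, built incrementally
                if (thisSum > maxSum) {
                    maxSum = thisSum;
                }
            }
        }
        return maxSum;
    }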

Michael Eckmann - Skidmore College - CS 206 - Spring 2009 Algorithm Analysis A further improvement can come if we realize that if a subsequence has a negative sum, it will not be the first part of the maximum subsequence. –Why? Also, all contiguous subsequences bordering a maximum contiguous subsequence must have negative or 0 sums. –Why?

Michael Eckmann - Skidmore College - CS 206 - Spring 2009 Algorithm Analysis A further improvement can come if we realize that if a subsequence has a negative sum, it will not be the first part of the maximum subsequence. –Why? –A prefix with a negative sum would only bring the total down (see Theorem 5.2 in the textbook) Also, all contiguous subsequences bordering a maximum contiguous subsequence must have negative or 0 sums. –Why? –If they had sums > 0, they would be attached to the maximum subsequence (thereby giving it a larger sum). Those two observations aren't too hard to see, but by themselves they won't allow us to reduce the running time below quadratic. Let's go a bit further.

Michael Eckmann - Skidmore College - CS 206 - Spring 2009 Algorithm Analysis There are some observations in our textbook which are then proved on page 175 in Theorem 5.3. They allow us to process the sequence by going through it just once from start to finish (and hence get linear-time performance). Specifically, while computing the sum of a subsequence, if at any time the sum becomes negative, we only consider subsequences starting at the next element from then on. This is less intuitive than the other observations we made, but in my opinion it is not worth taking class time to cover in detail, so I leave it to you, if you wish, to go over the observations in our textbook. But let's look at the linear algorithm and see if it is obvious that it is linear.
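
For reference, a linear-time version in the spirit just described might look like this Java sketch (my own reconstruction, not the listing in the textbook): a single pass over the array, resetting the running sum whenever it becomes negative:

    // Linear algorithm: one pass over the array, so O(n). When the running sum goes
    // negative, it is reset to 0, i.e. we only consider subsequences starting at the
    // next element.
    public static int maxSubSumLinear(int[] a) {
        int maxSum = 0;
        int thisSum = 0;
        for (int j = 0; j < a.length; j++) {
            thisSum += a[j];
            if (thisSum > maxSum) {
                maxSum = thisSum;
            } else if (thisSum < 0) {
                thisSum = 0;               // a negative prefix can never start the best subsequence
            }
        }
        return maxSum;
    }

On the earlier example input { -2, 11, -4, 13, -5, 2 } this returns 20, matching the answer given on the problem slide.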

Michael Eckmann - Skidmore College - CS 206 - Spring 2009 Algorithm Analysis What's the point of that exercise? –1) to get a feel for how to count how much work is being done in an algorithm –2) it is sometimes possible to bring the running time of an algorithm down by exploiting facts about the problem –3) it is good to think about such things in a course that mainly deals with data structures. Any guesses as to why I say this? –4) it is sometimes difficult to exploit facts about the problem to make a more efficient algorithm