Matters of Time & Space


1 Matters of Time & Space
Today: Time & Space Complexity
Wednesday: Robot Competitions
(Slight (and hopefully last)) Modification to Schedule
(Minor) Additions & Clarification of USAR Project
Rescue Team Order
The goal of the Smart Dust Project is to build a self-contained, millimeter-scale sensing and communication platform for a massively distributed sensor network. This device will be around the size of a grain of sand and will contain sensors, computational ability, bi-directional wireless communications, and a power supply.

2 Computational Resources
Time: How much time does it take for the program to run on a particular problem?
Space: How much memory space does it take for the program to run on a particular problem?
Processors are getting faster and memory is getting cheaper, so why worry?
Limited resources in embedded computing, like cars, cell phones, sensor networks, and robots
Real-time constraints

3 Example: Robomote
Robomote is a macro mote designed to act as part of a mobile sensor network made up of hundreds or thousands of identical robots that would monitor the environment.

4 How do you compare programs?
Benchmarking: a standard program or set of programs which can be run on different computers to give an inaccurate measure of their performance.
"In the computer industry, there are three kinds of lies: lies, damn lies, and benchmarks."
Why?

5 Algorithm Analysis
Compare running times and space requirements independent of programming language, compiler, hardware, processor speed, …
Algorithm analysis: measure the efficiency of an algorithm or program as the size of the input becomes large.
Provides a gross comparison of algorithms.

6 Time & Space Complexity
Time complexity: the way in which the number of steps required by an algorithm varies with the size of the problem it is solving.
Space complexity: the way in which the amount of storage space required by an algorithm varies with the size of the problem it is solving.
What is meant by "steps" and "size"?

7 Steps
Basic operation: an algorithm step or piece of program code whose time to complete does not depend on the particular values of its operands.
totalError = totalError + currentError;
for (i = 0; i < roomLength; i++)
  for (j = 0; j < roomWidth; j++) {
    map(room[i][j]);
  }

8 Size
Number of inputs processed: n
int sumIntArray(int arr[], int sizeOfarr) {
  int i, total = 0;
  for (i = 0; i < sizeOfarr; i++)
    total = total + arr[i];
  return total;
}

9 Running Time
Let us say that c is the amount of time it takes to access and add one integer. Then we can say that sumIntArray has a "running time" of T(n) = cn, where n = sizeOfarr.
int sumIntArray(int arr[], int sizeOfarr) {
  int i, total = 0;
  for (i = 0; i < sizeOfarr; i++)
    total = total + arr[i];
  return total;
}

10 Running Time
What about summing a 2D array? Then we can say that sumIntArray has a "running time" of T(n) = cn^2, where n = sizeOfarr and both dimensions are equal.
for (i = 0; i < sizeOfarr; i++)
  for (j = 0; j < sizeOfarr; j++)
    total = total + arr[i][j];

11 Growth Rate
Linear growth rate
Quadratic growth rate
Exponential growth rate
Sample step counts for a few values of n are sketched below.
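To make the difference concrete, here is a minimal sketch (not from the slides) that tabulates how the step count grows with input size n under each growth rate:

#include <cmath>
#include <cstdio>

int main() {
    // Tabulate step counts for linear, quadratic, and exponential growth rates.
    for (int n = 5; n <= 20; n += 5) {
        std::printf("n = %2d   linear: %2d   quadratic: %4d   exponential: %8.0f\n",
                    n, n, n * n, std::pow(2.0, n));
    }
    return 0;
}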

12 Best, Average, Worst
Best case analysis
The least amount of running time possible
The most optimistic case
Rarely of interest
Average (or typical or expected) case analysis
What the running time will be on average
Requires understanding of how the universe of input data is distributed
Worst case analysis
The greatest amount of running time possible
The most pessimistic case
Provides a clear basis for comparison
Can be very important for real-time applications
A linear search (sketched below) makes the contrast concrete.
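A minimal linear-search sketch (the function and names are illustrative, not from the course code) shows how the cases differ:

// Linear search: returns the index of target in arr, or -1 if absent.
// Best case: target is arr[0], so only 1 comparison is made.
// Worst case: target is absent (or last), so all n elements are examined.
int linearSearch(const int arr[], int n, int target) {
    for (int i = 0; i < n; i++)
        if (arr[i] == target)
            return i;       // found: return the index
    return -1;              // not found
}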

13 Limits & Bounds
Asymptotic algorithm analysis
Interested in the resource requirements as the input size "gets big" or reaches a limit
Ignore constants
Upper bounds
The highest growth rate that an algorithm can have
Not the same as worst case, but the upper bound for the growth rate expressed as an equation: "this algorithm has an upper bound to its growth rate of n^2 in the worst case"

14 Big-O: "order of"
Function T(n) is said to be O(f(n)) if there are positive constants c and n0 such that T(n) ≤ c·f(n) for any n ≥ n0 (i.e., T(n) is ultimately bounded above by c·f(n)).
Example: n^3 + 3n^2 + 6n + 5 is O(n^3). (Use c = 15 and n0 = 1.)
Example: n^2 + n·log n is O(n^2). (Use c = 2 and n0 = 1.)
[Figure: two growth curves g(n) and r(n).] r(n) is O(g(n)), since 1·g(n) exceeds r(n) for all n-values past n_g; g(n) is O(r(n)), since 3·r(n) exceeds g(n) for all n-values past n_r.
Thanks to Dr. White for the slide
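One way to get a feel for the definition is to spot-check the constants; a minimal sketch (illustrative only) that verifies the first example numerically:

#include <cstdio>

int main() {
    // Spot-check: with c = 15 and n0 = 1, T(n) = n^3 + 3n^2 + 6n + 5
    // should never exceed c*f(n) = 15*n^3 for n >= 1.
    bool violated = false;
    for (long long n = 1; n <= 1000; n++) {
        long long T = n*n*n + 3*n*n + 6*n + 5;
        long long bound = 15 * n*n*n;
        if (T > bound) {
            std::printf("bound violated at n = %lld\n", n);
            violated = true;
        }
    }
    if (!violated)
        std::printf("T(n) <= 15*n^3 held for every n checked\n");
    return 0;
}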

15 Big-O Represents An Upper Bound
If T(n) is O(f(n)), then f(n) is basically a cap on how bad T(n) will behave when n gets big.
[Figure: pairs of growth curves g(n) and r(n), v(n) and y(n), b(n) and p(n).]
Is g(n) O(r(n))? YES!   Is r(n) O(g(n))? YES!
Is v(n) O(y(n))? YES!   Is y(n) O(v(n))? YES!
Is b(n) O(p(n))? YES!   Is p(n) O(b(n))? NO!
Thanks to Dr. White for the slide

16 Computational Model For Algorithm Analysis
To formally analyze the performance of algorithms, we will use a computational model with a few simplifying assumptions:
Each simple instruction (assignment, comparison, addition, multiplication, memory access, etc.) is assumed to execute in a single time unit.
Memory is assumed to be limitless, so there is always room to store whatever data is needed.
The size of the input, n, will normally be used as our main variable, and we'll primarily be interested in "worst case" scenarios.
Thanks to Dr. White for the slide

17 General Rules For Running Time Calculation
Rule One: Loops
The running time of a loop is at most the running time of the statements inside the loop, multiplied by the number of iterations.
Example:
for (i = 0; i < n; i++)              // n iterations
  A[i] = (1-t)*X[i] + t*Y[i];        // 12 time units per iteration
(Retrieving X[i] requires one addition and one memory access, as does retrieving Y[i]; the calculation involves a subtraction, two multiplications, and an addition; assigning A[i] the resulting value requires one addition and one memory access; and each loop iteration requires a comparison and either an assignment or an increment. This totals twelve primitive operations.)
Thus, the total running time is 12n time units, i.e., this part of the program is O(n).
Thanks to Dr. White for the slide

18 Rule Two: Nested Loops
The running time of a nested loop is at most the running time of the statements inside the innermost loop, multiplied by the product of the number of iterations of all of the loops.
Example:
for (i = 0; i < n; i++)              // n iterations, 2 ops each
  for (j = 0; j < n; j++)            // n iterations, 2 ops each
    C[i][j] = j*A[i] + i*B[j];       // 10 time units per iteration
(2 for retrieving A[i], 2 for retrieving B[j], 3 for the RHS arithmetic, 3 for assigning C[i][j].)
Total running time: ((10+2)n + 2)n = 12n^2 + 2n time units, which is O(n^2).
Thanks to Dr. White for the slide

19 Rule Three: Consecutive Statements
The running time of a sequence of statements is merely the sum of the running times of the individual statements.
Example:
for (i = 0; i < n; i++) {            // 22n time units for this entire loop
  A[i] = (1-t)*X[i] + t*Y[i];
  B[i] = (1-s)*X[i] + s*Y[i];
}
for (i = 0; i < n; i++)              // (12n+2)n time units for
  for (j = 0; j < n; j++)            // this nested loop
    C[i][j] = j*A[i] + i*B[j];
Total running time: 12n^2 + 24n time units, i.e., this code is O(n^2).
Thanks to Dr. White for the slide

20 Rule Four: Conditional Statements
The running time of an if-else statement is at most the running time of the conditional test, added to the maximum of the running times of the if and else blocks of statements.
Example:
if (amt > cost + tax)                        // 2 time units
{
  count = 0;                                 // 1 time unit
  while ((count < n) && (amt > cost + tax))  // 4 time units per iteration
  {                                          // at most n iterations
    amt -= (cost + tax);                     // 3 time units
    count++;                                 // 2 time units
  }
  cout << "CAPACITY:" << count;              // 2 time units
}
else
  cout << "INSUFFICIENT FUNDS";              // 1 time unit
Total running time: 2 + max(1 + (4+3+2)n + 2, 1) = 9n + 5 time units, i.e., this code is O(n).
Thanks to Dr. White for the slide

21 Analysis of Breadth First Search
create root node;
put root node in list;
while (solution not found) and (list is not empty) do
  take first node off of list;
  if node = solution
    set solution found to true;
    return node;
  else
    for each possible action
      generate child node
      put child node on the end of list
return null
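One way this pseudocode might look as real code is sketched below (a minimal C++ sketch: the goal test, the child generator, and the integer node representation are assumptions made for illustration, not the course's code). The list acts as a FIFO queue because children are appended to the end:

#include <cstdio>
#include <deque>
#include <vector>

// Stand-ins for a real search problem (assumed for this sketch only).
static bool isSolution(int node) { return node == 13; }
static std::vector<int> children(int node) {      // branching factor of 2
    return { 2 * node, 2 * node + 1 };
}

int breadthFirstSearch(int root, int maxNode) {
    std::deque<int> list;
    list.push_back(root);                 // put root node in list
    while (!list.empty()) {
        int node = list.front();          // take first node off of list
        list.pop_front();
        if (isSolution(node))
            return node;                  // solution found
        for (int child : children(node))
            if (child <= maxNode)         // bound the implicit tree
                list.push_back(child);    // put child on the END of the list (FIFO)
    }
    return -1;                            // list is empty: no solution
}

int main() {
    std::printf("BFS found node %d\n", breadthFirstSearch(1, 100));
    return 0;
}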

22 Analysis of Breadth First Search
Assume the branching factor is 2 (branching factor: the number of children per parent).
Number of nodes expanded: 1 + 2 + 4 + … + 2^n, which is O(2^n).
What would be the Big-O of a breadth first search with any number of children? O(b^n)
What is the space complexity of breadth first search? If the solution path is needed: O(b^n)

23 Analysis of Depth First Search
create root node;
put root node in list;
while (solution not found) and (list is not empty) do
  take first node off of list;
  if node = solution
    set solution found to true;
    return node;
  else
    for each possible action
      generate child node
      put child node on the start of list
return null
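Compared with the breadth-first sketch earlier, only the insertion point changes (again a sketch that reuses the same assumed isSolution and children helpers and std::deque):

// Depth-first variant: children go on the FRONT of the list,
// so the list behaves as a LIFO stack instead of a FIFO queue.
int depthFirstSearch(int root, int maxNode) {
    std::deque<int> list;
    list.push_back(root);                 // put root node in list
    while (!list.empty()) {
        int node = list.front();          // take first node off of list
        list.pop_front();
        if (isSolution(node))
            return node;                  // solution found
        for (int child : children(node))
            if (child <= maxNode)
                list.push_front(child);   // put child on the START of the list (LIFO)
    }
    return -1;                            // list is empty: no solution
}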

24 Analysis of Depth First Search

25 Analysis of Depth First Search
Assume the branching factor is 2: O(2^n)
What would be the Big-O of a depth first search with any number of children? O(b^n)
What is the space complexity of depth first search? If the solution path is needed: O(bn), i.e., storage proportional to b nodes at each of the n levels along the current path.

26 Algorithm Analysis Questions
What is the Big-O of wave front path planning?
What is the Big-O of thresholding an image to find a specific color blob?
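As a hint for the second question, a straightforward threshold pass touches every pixel exactly once; a minimal sketch (the image layout, function name, and threshold values are assumed for illustration):

// Thresholding an RGB image to find pixels of a specific color:
// one pass over width x height pixels with constant work per pixel,
// so the running time is O(width * height), i.e., linear in the number
// of pixels. The "red" thresholds below are arbitrary example values.
void thresholdRed(const unsigned char* rgb, unsigned char* mask,
                  int width, int height) {
    for (int y = 0; y < height; y++)
        for (int x = 0; x < width; x++) {
            int i = (y * width + x) * 3;            // 3 bytes per pixel (R, G, B)
            bool isRed = rgb[i] > 150 && rgb[i + 1] < 80 && rgb[i + 2] < 80;
            mask[y * width + x] = isRed ? 255 : 0;  // 255 = pixel belongs to the blob
        }
}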

