CSE 373: Data Structures and Algorithms


CSE 373: Data Structures and Algorithms Lecture 3: Math Review/Asymptotic Analysis

Announcements
Programming Project #1: getting help
  General questions → message board. Feel free to answer/respond yourselves; please no code/specifics – general ideas only.
  Specific/implementation questions → office hours (on course website) or email cse373-staff AT cs DOT washington DOT edu (read by myself and the three TAs).
No turnin yet; using sox.
Want to add CSE 373? See me after class.

Motivation
So much data!!
Human genome: 3.2 × 10^9 base pairs.
If there are 6.8 × 10^9 people on the planet, how many base pairs of human DNA?
Earth surface area: 1.49 × 10^8 km^2.
How many photos if taking a photo of each m^2? For every day of the year (3.65 × 10^2 days)?
But aren't computers getting faster and faster?

Why algorithm analysis? As problem sizes get bigger, analysis is becoming more important. The difference between good and bad algorithms is getting bigger. Being able to analyze algorithms will help us identify good ones without having to program them and test them first.

Measuring Performance: Empirical Approach
Implement it, run it, time it (averaging trials).
Pros? Cons?

Measuring Performance: Empirical Approach
Implement it, run it, time it (averaging trials).
Pros?
  Find out how the system affects performance
  Stress testing: how does it perform in a dynamic environment?
  No math!
Cons?
  Need to implement the code
  Can be hard to estimate performance
  When comparing two algorithms, all other factors need to be held constant (e.g., same computer, OS, processor, load)
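A minimal Java sketch of the empirical approach, for illustration only: the array size, trial count, and the use of Arrays.sort as the algorithm under test are arbitrary choices, not part of the slides.

// Empirical timing sketch: run the algorithm several times on the same-sized input
// and average the measured times.
public class TimingDemo {
    public static void main(String[] args) {
        final int TRIALS = 5;                        // number of trials to average (arbitrary)
        long total = 0;
        for (int t = 0; t < TRIALS; t++) {
            int[] data = makeRandomArray(100_000);   // same input size each trial
            long start = System.nanoTime();
            java.util.Arrays.sort(data);             // the algorithm being measured (placeholder)
            total += System.nanoTime() - start;
        }
        System.out.println("average time (ms): " + total / TRIALS / 1_000_000.0);
    }

    private static int[] makeRandomArray(int n) {
        java.util.Random rand = new java.util.Random(42);   // fixed seed for repeatability
        int[] a = new int[n];
        for (int i = 0; i < n; i++) {
            a[i] = rand.nextInt();
        }
        return a;
    }
}

Note that everything the slide lists as a "con" shows up here: the code must exist, and the numbers depend on the machine, the OS, and the current load.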

Measuring Performance: Analytical Approach
Use a simple model for basic operation costs.
Computational model:
  has all the basic operations: +, -, *, /, =, comparisons
  fixed-size integers (e.g., 32-bit)
  infinite memory
  all basic operations take exactly one time unit (one CPU instruction) to execute
Analysis = determining run-time efficiency.
The model is an estimate; it is not meant to represent everything in the real world.

Measuring Performance: Analytical Approach
Analyze the steps of the algorithm, estimating the amount of work each step takes.
Pros?
  Independent of system-specific configuration
  Good for estimating
  Don't need to implement the code
Cons?
  Won't give you exact runtimes
  Ignores optimizations made by the architecture (e.g., caching)
  Only gives useful information for large problem sizes
  In real life, not all operations take exactly the same time, and machines have memory limitations

Analyzing Performance
General "rules" to help measure how long it takes to do things:
  Basic operations: constant time
  Consecutive statements: sum of times
  Conditionals: time of the test, plus the larger branch's cost
  Loops: sum over all iterations
  Function calls: cost of the function body
  Recursive functions: solve a recurrence relation…
Okay, now that we have some algorithms to serve as examples, let's analyze them! Here are some hints before we begin…

Efficiency examples
statement1;
statement2;
statement3;                           ?

for (int i = 1; i <= N; i++) {
    statement4;                       ?
}

for (int i = 1; i <= N; i++) {
    statement5;
    statement6;
    statement7;                       ?
}

Efficiency examples
statement1;
statement2;
statement3;                           3

for (int i = 1; i <= N; i++) {
    statement4;                       N
}

for (int i = 1; i <= N; i++) {
    statement5;
    statement6;
    statement7;                       3N
}

Efficiency examples 2
for (int i = 1; i <= N; i++) {
    for (int j = 1; j <= N; j++) {
        statement1;
    }
}                                     ?

for (int i = 1; i <= N; i++) {
    statement2;
    statement3;
    statement4;
    statement5;
}                                     ?

total:                                ?

Efficiency examples 2
for (int i = 1; i <= N; i++) {
    for (int j = 1; j <= N; j++) {
        statement1;
    }
}                                     N^2

for (int i = 1; i <= N; i++) {
    statement2;
    statement3;
    statement4;
    statement5;
}                                     4N

total:                                N^2 + 4N
How many statements will execute if N = 10? If N = 1000?
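To check the question on the slide, here is a small Java sketch (my own, not from the slides) that counts executed statements directly; the variable count stands in for statement1 through statement5.

// Count statement executions for the example above and compare with N^2 + 4N.
public class CountDemo {
    public static void main(String[] args) {
        for (int n : new int[] {10, 1000}) {
            long count = 0;
            for (int i = 1; i <= n; i++) {
                for (int j = 1; j <= n; j++) {
                    count++;          // statement1 runs N * N times
                }
                count += 4;           // statement2..statement5 each run N times
            }
            System.out.println("N = " + n + ": " + count + " statements");
            // prints 140 for N = 10 and 1004000 for N = 1000, matching N^2 + 4N
        }
    }
}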

Relative rates of growth
Most algorithms' runtime can be expressed as a function of the input size N.
Rate of growth: a measure of how quickly the graph of a function rises.
Goal: distinguish between fast- and slow-growing functions.
  We only care about very large input sizes (for small sizes, almost any algorithm is fast enough).
  This helps us discover which algorithms will run more quickly or slowly for large input sizes.
Most of the time we are interested in worst-case performance; sometimes we look at best or average performance.
Motivation: we usually care about algorithm performance only when there is a large number of inputs. We usually don't care about small changes in run-time performance (the inaccuracy of our estimates makes small changes less relevant). We consider algorithms with slow growth rates better than those with fast growth rates.

Growth rate example
Consider these graphs of functions. Perhaps each one represents an algorithm:
  n^3 + 2n^2
  100n^2 + 1000
Which grows faster?
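As a quick numeric check (my own arithmetic, not from the slides), the cubic overtakes the quadratic between n = 98 and n = 99:

% n^3 + 2n^2 exceeds 100n^2 + 1000 once n^3 > 98n^2 + 1000.
\begin{align*}
n = 98: &\quad 98^3 + 2\cdot 98^2 = 960{,}400 \;<\; 100\cdot 98^2 + 1000 = 961{,}400\\
n = 99: &\quad 99^3 + 2\cdot 99^2 = 989{,}901 \;>\; 100\cdot 99^2 + 1000 = 981{,}100
\end{align*}

So for small inputs the quadratic is larger, but for all larger n the cubic dominates.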

Growth rate example How about now?

Big-Oh notation
Defn: T(N) = O(f(N)) if there exist positive constants c, n₀ such that T(N) ≤ c · f(N) for all N ≥ n₀.
Idea: we are concerned with how the function grows when N is large. We are not picky about constant factors: coarse distinctions among functions.
Lingo: "T(N) grows no faster than f(N)."
Important (not in Lewis & Chase book). Write on board (for next slide).

Big-Oh example problems
  n = O(2n)?
  2n = O(n)?
  n = O(n^2)?
  n^2 = O(n)?
  n = O(1)?
  100 = O(n)?
  214n + 34 = O(2n^2 + 8n)?

Preferred big-Oh usage
Pick the tightest bound. If f(N) = 5N, then:
  f(N) = O(N^5)
  f(N) = O(N^3)
  f(N) = O(N log N)
  f(N) = O(N)  ← preferred
Ignore constant factors and low-order terms:
  T(N) = O(N), not T(N) = O(5N)
  T(N) = O(N^3), not T(N) = O(N^3 + N^2 + N log N)
Wrong: f(N) ≤ O(g(N))
Wrong: f(N) ≥ O(g(N))
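As a worked instance of the definition (my own, not on the slide), dropping the constant factor in f(N) = 5N is justified because the definition of O(N) is satisfied directly:

% f(N) = 5N is O(N): take c = 5 and n_0 = 1 in the definition.
f(N) = 5N \le 5 \cdot N \quad \text{for all } N \ge 1, \qquad \text{so } f(N) = O(N).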

Show f(n) = O(g(n))
Claim: n^2 + 100n = O(n^2)
Proof: must find c, n₀ such that for all n > n₀, n^2 + 100n ≤ c · n^2.
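One way to finish the proof (the constants below are my choice; many others work):

% Take c = 2 and n_0 = 100.
\text{For } n > 100:\quad 100n < n \cdot n = n^2,
\quad\text{so}\quad n^2 + 100n < n^2 + n^2 = 2n^2.
\qquad\text{Hence } n^2 + 100n = O(n^2).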

Efficiency examples 3
sum = 0;
for (int i = 1; i <= N * N; i++) {            ?
    for (int j = 1; j <= N * N * N; j++) {    ?
        sum++;
    }
}
total statements: ?

Efficiency examples 3
sum = 0;
for (int i = 1; i <= N * N; i++) {            N^2 iterations
    for (int j = 1; j <= N * N * N; j++) {    N^3 iterations
        sum++;
    }
}
total statements: N^5 + 1
So what is the Big-Oh?
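Putting the counts together (my own working, answering the slide's question):

% The innermost statement runs N^2 \cdot N^3 times; the "+1" and constants drop out.
N^2 \cdot N^3 = N^5, \qquad\text{so } N^5 + 1 = O(N^5).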

Math background: Exponents
X^Y, or "X to the Yth power": X multiplied by itself Y times.
Some useful identities:
  X^A · X^B = X^(A+B)
  X^A / X^B = X^(A−B)
  (X^A)^B = X^(AB)
  X^N + X^N = 2·X^N
  2^N + 2^N = 2^(N+1)
Exponents grow really fast: use the doubling salary example: $1000 initial salary with a yearly $1000 raise, vs. $1 initial salary that doubles every year. Which is better after 20 years? Logs grow really slowly; logs are the inverses of exponents.
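Working out the salary example (my own arithmetic, assuming the raise and the doubling each happen once per year):

% Salary in year 20 under each scheme.
\text{linear: } \$1000 + 20 \cdot \$1000 = \$21{,}000
\qquad\text{vs.}\qquad
\text{doubling: } \$1 \cdot 2^{20} = \$1{,}048{,}576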

Efficiency examples 4
sum = 0;
for (int i = 1; i <= N; i += c) {       ?
    sum++;
}
total statements: ?

Efficiency examples 4
sum = 0;
for (int i = 1; i <= N; i += c) {       N/c iterations
    sum++;
}
total statements: N/c + 1
What is the Big-Oh?
Intuition: adding a constant to the loop counter means the loop's runtime grows linearly with its maximum value N.

Efficiency examples 5
sum = 0;
for (int i = 1; i <= N; i *= c) {       ?
    sum++;
}
total statements: ?
Intuition: multiplying the loop counter means the maximum value N must grow exponentially for the loop's runtime to increase linearly.

Efficiency examples 5
sum = 0;
for (int i = 1; i <= N; i *= c) {       log_c N iterations
    sum++;
}
total statements: log_c N + 1
What is the Big-Oh?
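A small Java sketch (my own, not from the slides) that counts this loop's iterations and compares them with log_c N; the choice of c = 2 and the N values are arbitrary.

// Count iterations of the multiplicative loop and compare with log base c of N.
public class LogLoopDemo {
    public static void main(String[] args) {
        int c = 2;                                        // arbitrary constant factor
        for (int n : new int[] {16, 1000, 1_000_000}) {
            int iterations = 0;
            for (int i = 1; i <= n; i *= c) {
                iterations++;                             // the loop body from the slide (sum++)
            }
            double logCofN = Math.log(n) / Math.log(c);   // log_c(N) via change of base
            System.out.println("N = " + n + ": " + iterations
                    + " iterations, log_c(N) = " + logCofN);
            // iterations comes out to floor(log_c(N)) + 1, i.e. O(log N)
        }
    }
}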

Math background: Logarithms
Definition: X^A = B if and only if log_X B = A.
Intuition: log_X B means "the power X must be raised to, to get B".
In this course, a logarithm with no base implies base 2: log B means log₂ B.
Examples:
  log₂ 16 = 4 (because 2^4 = 16)
  log₁₀ 1000 = 3 (because 10^3 = 1000)

Logarithm identities
Identities for logs with addition, multiplication, and powers:
  log (AB) = log A + log B
  log (A/B) = log A − log B
  log (A^B) = B log A
Identity for converting bases of a logarithm: log_b A = (log_c A) / (log_c b)
  example: log₄ 32 = (log₂ 32) / (log₂ 4) = 5/2
Proof (of the first identity): Let X = log A, Y = log B, and Z = log(AB). Then 2^X = A, 2^Y = B, and 2^Z = AB. So 2^X · 2^Y = AB = 2^Z. Therefore, X + Y = Z.
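The change-of-base identity can be derived the same way (my own derivation, following the style of the slide's proof):

% Change of base: let x = log_b A, so b^x = A. Take log_c of both sides.
x \log_c b = \log_c A
\quad\Longrightarrow\quad
\log_b A = \frac{\log_c A}{\log_c b}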

Logarithm problem solving
When presented with an expression of the form log_a X = Y and trying to solve for X, raise both sides to the a-th power: X = a^Y.
When presented with log_a X = log_b Y and trying to solve for X, find a common base between the logarithms using the identity on the last slide: log_a X = (log_a Y) / (log_a b).
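An illustrative worked example (my own numbers, not from the slides) using both rules:

% Solve log_2 x = log_4 36 for x.
\log_2 x = \log_4 36 = \frac{\log_2 36}{\log_2 4} = \frac{\log_2 36}{2} = \log_2 6
\quad\Longrightarrow\quad x = 2^{\log_2 6} = 6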

Logarithm practice problems
Determine the value of x in each of the following equations:
  log₇ x + log₇ 13 = 3
  log₈ 4 − log₈ x = log₈ 5 + log₁₆ 6