Faculty Name: Ruhi Fatima. Topics Covered: Theta Notation, Oh Notation, Omega Notation, Standard Notations and Common Functions. Asymptotic notations are primarily used to describe the running times of algorithms. The running time of an algorithm is defined as the time needed by the algorithm to deliver its output when presented with legal input. In asymptotic notation, an algorithm is treated as a function. Let us consider asymptotic notations that are well suited to characterizing running/

of an Algorithm – Fundamentals of Algorithmic Problem Solving – Important Problem Types – Fundamentals of the Analysis of Algorithm Efficiency – Analysis Framework – Asymptotic Notations and their properties – Mathematical analysis of Recursive and Non-recursive algorithms. 3/15/2016 DAA - Unit - I Presentation Slides 8. What is an algorithm? An algorithm is a list of steps (sequence of/ AN ALGORITHM (Contd…) Step 7: Controls: It has three types. (i) Sequential Logic: It is executed by means of numbered /

1974. Father of the analysis of algorithms; popularized asymptotic notation. 2009 Fall Semester, Data Structures and Algorithms (I), slides 41–46. Part 1: Using asymptotic notation in sentences. Examples. More examples. Meaning. 2009 Fall/

Asymptotic Notations. Relation between O, Ω and Θ notations: Θ(g(n)) = O(g(n)) ∩ Ω(g(n)). 58 Review of Algorithm Analysis. Fundamentals of Algorithms. Important Problems: Sorting, Searching, String processing, Graph problems. Analysis Framework: Input size, Running Time, Order of Growth. M. G. Abbas Malik - FCIT, UoJ 59 Review of Algorithm Analysis. Asymptotic Notation: Big O notation (big O), Big Ω notation (big omega), Big θ notation/ example, we construct a B-tree of order 5. This means that (other than the root node/

+ 5x² – 9 equals the function O(x³)”. Which actually means “3x³ + 5x² – 9 is dominated by x³”. Read as: “3x³ + 5x² – 9 is big-Oh of x³”. Intuitive Notion of Big-O: asymptotic notation captures the behavior of functions for large values of x. E.g., the dominant term of 3x³ + 5x² – 9 is x³. As x becomes larger/
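The dominance claim above can be checked numerically. A minimal sketch: with the witness constants c = 4 and x₀ = 5 (one valid choice among many; function names are illustrative), 3x³ + 5x² − 9 ≤ 4x³ holds from x₀ onward.

```c
#include <assert.h>

/* f(x) = 3x^3 + 5x^2 - 9 and the candidate bound 4x^3. */
static long long f(long long x)     { return 3*x*x*x + 5*x*x - 9; }
static long long bound(long long x) { return 4*x*x*x; }

/* Returns 1 if f(x) <= 4x^3 for every x in [x0, x_max], else 0. */
int dominated_from(long long x0, long long x_max)
{
    for (long long x = x0; x <= x_max; x++)
        if (f(x) > bound(x))
            return 0;   /* the bound is violated at this x */
    return 1;           /* f is dominated on the whole checked range */
}
```

At x = 4 the bound still fails (263 > 256), but from x = 5 on it holds, matching the "for large values of x" intuition.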

are familiar with: –programming and basic data structures: arrays, lists, stacks, queues –asymptotic notation “big-oh” –sets, graphs –proofs, especially induction proofs –exposure to NP-completeness. April 1, 2014 CS38 Lecture 16. Motivation/Overview: Algorithms, Systems and Software, Design and Implementation, Computability and Complexity Theory. At the heart of programs lie algorithms. In this course “algorithms” means: –abstracting problems from across application domains –worst case analysis/

stuff … Algorithms, Problems, Course Objectives, Administrative stuff … Analysis of Algorithms. Algorithm Analysis Overview: RAM model of computation, concept of input size, measuring complexity, best-case, average-case, worst-case, asymptotic analysis, asymptotic notation. The RAM Model: the RAM model represents a “generic” implementation of the algorithm. Each “simple/ [running-time table residue elided] Example Problems: 1. What does it mean if f(n) ∈ O(g(n)) and g(n) ∈ O(f(n))? 2. Is 2n+1 = O(/

are dominated by the effects of the input size itself. Asymptotic Notation. The notation we use to describe the asymptotic running time of an algorithm is defined in terms of functions whose domains are the set of natural numbers. O-notation: for a given function g(n), we denote by O(g(n)) the set of functions. We use O-notation to give an asymptotic upper bound on a function, to within a constant factor. It means that there exists some/

which means that the maximum is the one that counts). IF/ELSE: For the fragment "if ( Condition ) S1; else S2;" the running time is never more than the running time of the test plus the larger of the running times of S1 and S2. 14/26 §2 Asymptotic Notation / 1 = O( N log N ). Also true for N = 2^k. The program can be found on p.21. 18/26 §3 Compare the Algorithms. Algorithm 4 (on-line algorithm): int MaxSubsequenceSum( const int A[ ], int N ) { int ThisSum, MaxSum, j; /* 1*/ ThisSum = MaxSum = 0; /* 2*/ for ( j/
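The excerpt of Algorithm 4 above is cut off; a completed sketch of this on-line (single-pass, O(N)) maximum-subsequence-sum algorithm, keeping the excerpt's names (ThisSum, MaxSum, j) and filling in the loop body as commonly presented:

```c
/* On-line maximum subsequence sum, O(N): one pass, constant extra space.
   Completed from the truncated fragment above; the loop body is the
   standard formulation, not copied from the source deck. */
int MaxSubsequenceSum(const int A[], int N)
{
    int ThisSum = 0, MaxSum = 0, j;
    for (j = 0; j < N; j++) {
        ThisSum += A[j];          /* extend the current subsequence */
        if (ThisSum > MaxSum)
            MaxSum = ThisSum;     /* record a new best sum */
        else if (ThisSum < 0)
            ThisSum = 0;          /* a negative prefix can never help; discard it */
    }
    return MaxSum;                /* 0 if every entry is negative */
}
```

For the classic input {4, −3, 5, −2, −1, 2, 6, −2} this returns 11 (the subsequence 4, −3, 5, −2, −1, 2, 6).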

Asymptotic Analysis, Briefly. Silicon Downs and the SD Cheat Sheet. Asymptotic Analysis, Proofs and Programs. Examples and Exercises. 16 A Task to Solve and Analyze: find a student’s name in a class given her student ID. 17 Analysis of Algorithms. Analysis of an algorithm/ There’s also a notion of asymptotic “dominance”, which means one function as a fraction of another (asymptotically dominant) function goes to/ 5: Simplify T(n) and convert to order notation. (Also, which order notation: O, Ω, Θ?) 58 Analyzing Code // /

expression containing a variable approaches a limit, usually infinity. Sep-15 CSC201 Analysis and Design of Algorithms, 17-Sep-15, 16. Asymptotic Performance: in mathematics, computer science, and related fields, big-O notation (along with the closely related big-Omega notation, big-Theta notation, and little-o notation) describes the limiting behavior of a function when the argument tends towards a particular value or infinity, usually in terms/

the comparison among algorithms. 8 Outline: Motivation, Analysis of algorithms, Types of asymptotic notations, Big-Oh: Asymptotic Upper Bound, Big-Omega: Asymptotic Lower Bound, Big-Theta: Asymptotic Tight Bound, Examples, Practice questions. 9 Types of asymptotic notations. Three major types of asymptotic notations: Big-Oh (asymptotic upper bound), Big-Omega (asymptotic lower bound), Big-Theta (asymptotic tight bound). They measure the growth rate; a faster growth rate does not mean the algorithm always performs/

, most algorithms will do. When the input size is very large, things change. Asymptotic Performance: in this course, we care most about asymptotic performance. How does the algorithm behave as the problem size gets very large? Running time, memory/storage requirements, bandwidth/power requirements/logic gates/etc. Asymptotic Notation: by now you should have an intuitive feel for asymptotic (big-O) notation. What does O(n) running time mean/


Program verification, Computability. Classification of Algorithms: by methods (techniques), by characteristics, by running environments (architectures). Classified by methods (techniques): Divide and Conquer, Dynamic Programming, Greedy, Network Flow, Linear/Integer Programming, Backtracking, Branch and Bound. Classified by characteristics: Heuristic, Approximation, Randomized (Probabilistic), On-Line, Genetic. Classified by running environments: Sequential, Parallel, Distributed, Systolic. Asymptotic Notations: suppose f and g/

: how the running time of an algorithm increases with the size of the input in the limit. Asymptotically more efficient algorithms are best for all but small inputs. September 17, 2001. Asymptotic Notation: the “big-Oh” O-notation, an asymptotic upper bound: f(n) /of Big-Omega. September 17, 2001. Asymptotic Notation (6): analogy with real numbers: f(n) = O(g(n)) ≈ f ≤ g; f(n) = Ω(g(n)) ≈ f ≥ g; f(n) = Θ(g(n)) ≈ f = g; f(n) = o(g(n)) ≈ f < g; f(n) = ω(g(n)) ≈ f > g. Abuse of notation: f(n) = O(g(n)) actually means/

Analysis of Algorithms 1. O-notation (upper bound). Asymptotic running times of algorithms are usually defined by functions whose domain is N = {0, 1, 2, …} (the natural numbers). Formal definition of O-notation: f(n) = O(g(n)) if there exist positive constants c, n₀ such that 0 ≤ f(n) ≤ / n ≥ n₀ }. 2n² = O(n³) means that 2n² ∈ O(n³). Analysis of Algorithms 3. O-notation (upper bound). O-notation is an upper-bound notation; it makes no sense to say “the running time of an algorithm is at least O(n²)”. Let the running time be T(n)/

should be analyzed. There may be various ways to design a particular algorithm – certain algorithms take very little computer time to execute, others take a considerable amount of time. 15 C++ Programming: Program Design Including Data Structures, Sixth Edition. Asymptotic Notation: Big-O Notation (cont’d.), slides 16–17/

Lecture 2, COSC3101A, 5. Review: Asymptotic Notations (1). 5/11/2004 Lecture 2, COSC3101A, 6. Review: Asymptotic Notations (2): if and only if. 5/11/2004 Lecture 2, COSC3101A, 7. Review: Asymptotic Notations (3): a way to describe the behavior of functions in the limit – how we indicate running times of algorithms – describe the running time of an algorithm as n grows to ∞. O notation: asymptotic “less than”: f(n) “≤” g(n). Ω notation: asymptotic “greater than”: f(n) “≥” g(n). Θ notation: asymptotic “equality”: f/

of an algorithm. Asymptotic Notation: Method A takes 10n² − 5 milliseconds to process n elements; Method B takes 100n + 200 milliseconds. Asymptotic Notation: the differences for small values of n are relatively insignificant – what really concerns us is the asymptotic behavior of the running-time functions: what happens as n becomes very large? Asymptotic Notation/ Worst-case analysis – Contains() – this means assuming that the target is not in the ArrayList, giving a running time of Θ(n). Average-case analysis – requires /
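The Method A vs. Method B comparison above has a concrete crossover point: solving 10n² − 5 > 100n + 200 gives n ≥ 12. A small sketch that finds it by direct search (the function name is illustrative):

```c
/* T_A(n) = 10n^2 - 5 ms (Method A), T_B(n) = 100n + 200 ms (Method B).
   Returns the first n at which the quadratic method A becomes strictly
   slower than the linear method B. */
long long crossover(void)
{
    for (long long n = 1; ; n++)
        if (10*n*n - 5 > 100*n + 200)
            return n;   /* from here on, B's linear growth wins */
}
```

At n = 11, A still wins (1205 ms vs 1300 ms); at n = 12 the order flips (1435 ms vs 1400 ms), and the gap only widens as n grows.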

all swap operations will be executed. For both inputs the solution requires time. Asymptotic Notation: considering two algorithms, A and B, where the running time of each algorithm for a problem of size n is T_A(n) and T_B(n) respectively /Asymptotic tight bound: cg(n) ≤ f(n) ≤ dg(n). Example: Asymptotic Notation: when we use the term f = O(n) we mean that the function f ∈ O(n). When we write the sum, we mean that, aside from the function, the sum includes an additional function from O(n) which we have no interest of/

Asymptotic Analysis, Briefly. Silicon Downs and the SD Cheat Sheet. Asymptotic Analysis, Proofs and Programs. Examples and Exercises. 16 A Task to Solve and Analyze: find a student’s name in a class given her student ID. 17 Analysis of Algorithms. Analysis of an algorithm/ faster means smaller, not larger! a. Left b. Right c. Tied d. It depends e. I am opposed to algorithm /i return -1. Step 5: Simplify T(n) and convert to order notation. (Also, which order notation: O, o, Ω, ω, Θ?) 43 Analyzing Code // Linear search /

of Functions. The exact running time of an algorithm is usually hard to compute, and it’s unnecessary: for large enough inputs, the lower-order terms of an exact running time are dominated by the high-order terms. f(n) = n² + 5n + 234; n² >> 5n + 234 when n is large enough. Asymptotic Notation/ f(n) = O(1). Asymptotic Notation: Big Oh (O). Example 5 [loose bounds]: f(n) = 3n + 3. For n ≥ 10, 3n + 3 ≤ 3n²; therefore f(n) = O(n²). Usually we mean a tight upper bound when using big-oh notation. Example 6 [incorrect bounds]: 3n/

of Algorithms. Lecture 2: Asymptotic Notations. 1/6/2016, 2. Outline: Review of last lecture, Order of growth, Asymptotic notations – Big O, big Ω, Θ. 1/6/2016, 3. How to express algorithms? Natural language (e.g. English), pseudocode, real programming languages – in order of increasing precision and decreasing ease of expression. Describe the ideas of an algorithm in natural language; use pseudocode to clarify sufficiently tricky details of the algorithm/ Abuse of notation (for convenience): f(n) = Θ(g(n)) actually means f(n) ∈ Θ(g(n)). Θ(1) means /

Software University http://softuni.bg. Data Structures, Algorithms and Complexity: Analyzing Algorithm Complexity; Asymptotic Notation. Table of Contents: 1. Data Structures: Linear Structures, Trees, Hash Tables, Others. 2. Algorithms: Sorting and Searching, Combinatorics, Dynamic Programming, Graphs, Others. 3. Complexity of Algorithms: Time and Space Complexity; Mean, Average and Worst Case; Asymptotic Notation O(g). 2 Data Structures Overview. 4 Examples of data structures: Person structure (first name/

(∃ c, C, x₀ > 0)(∀ x ≥ x₀): 0 ≤ cg(x) ≤ f(x) ≤ Cg(x) means that f asymptotically equals g. Algorithms and Data Structures I, 42–43, Analysis of algorithms. [figure: f(x) bounded between cg(x) and Cg(x) for x ≥ x₀] What does the asymptotic notation show us? We have seen: T(n) = Θ(n) for the procedure Minimum(A), where n = A/

at list[min]; Interchange list[i] and list[min]; } Sort = find the smallest integer + interchange it with list[i]. Algorithm in pseudo-code. 2/15 §1 What to Analyze: machine- & compiler-dependent run times; time & space complexities: machine- & /which means that the maximum is the one that counts). IF/ELSE: For the fragment "if ( Condition ) S1; else S2;" the running time is never more than the running time of the test plus the larger of the running times of S1 and S2. 14/15 §2 Asymptotic Notation /
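The "find the smallest + interchange" pseudo-code above is selection sort; a minimal C sketch (function name is mine, array names follow the excerpt):

```c
#include <stddef.h>

/* Selection sort: for each position i, find the smallest element in
   list[i..n-1] and interchange it with list[i]. Theta(n^2) comparisons
   in every case, since both loops always run to completion. */
void selection_sort(int list[], size_t n)
{
    for (size_t i = 0; i + 1 < n; i++) {
        size_t min = i;
        for (size_t j = i + 1; j < n; j++)  /* find the smallest remaining */
            if (list[j] < list[min])
                min = j;
        int tmp = list[i];                  /* interchange list[i], list[min] */
        list[i] = list[min];
        list[min] = tmp;
    }
}
```

Note the comparison count is the same for sorted and reverse-sorted input, which is why selection sort's best and worst cases coincide.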

why/why not? n² = Ω(n)? why/why not? Theta: an asymptotic tight bound on the growth rate of an algorithm. Insertion sort is Θ(n²) in the worst and average cases; this means that in the worst and average cases insertion sort performs cn² operations. Binary/ faster than polylogarithmic functions. More properties: the following slides show two examples of pairs of functions that are not comparable in terms of asymptotic notation, and how asymptotic notation can be used in equations; that Theta, Big Oh, and Omega /

(2001-2002 SemA) City Univ of HK / Dept of CS / Helena Wong. 2. Analysis of Algorithms - 1. http://www.cs.cityu.edu.hk/~helena Analysis of Algorithms, CS3381 Des & Anal of Alg. 2. Analysis of Algorithms - 2. Coming up: asymptotic performance, Insertion Sort; a formal introduction to asymptotic notation (Chap 2.1-2.2/

Say “3n + 5 is O(n)” instead of “3n + 5 is O(3n)”. PSU CS 311 – Algorithm Design and Analysis, Dr. Mohamed Tounsi, 25. Asymptotic Algorithm Analysis: the asymptotic analysis of an algorithm determines the running time in big-Oh notation. To perform the asymptotic analysis we find the worst-case number of primitive operations executed as a function of the input size, and we express this function/

Analysis of Algorithms, Lecture 2. Algorithm: Input → Output. Analysis of Algorithms 2. Outline: Running time, Pseudo-code, Counting primitive operations, Asymptotic notation, Asymptotic analysis. Analysis of Algorithms 3. How good is Insertion-Sort? How can you answer such questions? What is “goodness”? 1. Measure 2. Count 3. Estimate. Analysis of Algorithms 4. How can we quantify it? 1. Correctness 2. Minimum use of “time” + “space”. Analysis of Algorithms 5. 1) Measure it – do an experiment! Write a/

Kinds of analyses. Worst-case (usually): T(n) = maximum time of the algorithm on any input of size n. Average-case (sometimes): T(n) = expected time of the algorithm over all inputs of size n; needs an assumption about the statistical distribution of inputs. Best-case (bogus): cheat with a slow algorithm that /6n = O(n), 6n = O(n²). Computational time O(n²) means the time in the worst case is O(n²). Ω-notation: f(n) = Ω(g(n)); g(n) is an asymptotic lower bound for f(n). Ω(g(n)) = {f(n) | there are positive constants /
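The worst/average/best-case distinction above is easiest to see on linear search, where the comparison count depends on where (or whether) the target occurs. A sketch with an explicit comparison counter (function name and counter parameter are mine):

```c
#include <stddef.h>

/* Linear search, instrumented to count comparisons:
   best case  - target at index 0: 1 comparison;
   worst case - target absent: n comparisons. */
ptrdiff_t linear_search(const int a[], size_t n, int target, size_t *comparisons)
{
    *comparisons = 0;
    for (size_t i = 0; i < n; i++) {
        (*comparisons)++;          /* one comparison per element examined */
        if (a[i] == target)
            return (ptrdiff_t)i;   /* found: early exit */
    }
    return -1;                     /* not found: the worst case, T(n) = n */
}
```

Under a uniform distribution of the target over the array, the average case examines about n/2 elements, which is still Θ(n).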

n₀ = 1. 5n² is Θ(n²). Asymptotic Analysis: Review. What does it mean to say that an algorithm has runtime O(n log n)? n: problem size. Big-O: upper bound over all inputs of size n. “Ignore constant factors” (why?) “as n grows large”. O: like ≤ for functions (asymptotically speaking); Ω: like ≥; Θ: like =. Asymptotic notation: examples. Asymptotic runtime, in terms of O, Ω, Θ? Suppose the runtime for a function/

of this class either notation is acceptable. 9 Ω-notation. Intuitively: the set of all functions whose rate of growth is the same as or higher than that of g/ ⌊n/b⌋ or ⌈n/b⌉. T(n) can be bounded asymptotically in three cases: 1. If f(n) = O(n^(log_b a−/ algorithm transforms an input data structure with a small (constant) amount of extra storage space – in the context of sorting, this means that the input array is overwritten by the output as the algorithm executes instead of introducing a new array – advantages of/

should be executable within a definite period of time on the target machine. Output: it must produce the desired number of outputs. Algorithm Design: develop the algorithm; refine the algorithm; usage of control statements (sequence, selection, iteration). Analysis of Algorithms: time & space complexity; asymptotic notations: Big-Oh (O), Omega (Ω), Theta (Θ), small-oh (o). How to develop the algorithm: understand the problem; identify the output of the problem; identify the inputs required by the problem and/

o Theoretical analysis, pseudo-code, RAM: Random Access Machine, 7 important functions, asymptotic notations: O(), Ω(), Θ(), asymptotic running-time analysis of algorithms. Last Update: Aug 21, 2014, EECS2011: Analysis of Algorithms, slides 32–34. Part 2: Correctness. Outline: Iterative Algorithms: Assertions and Proofs of Correctness; Binary Search: A Case Study. EECS2011: Analysis/

“3n + 5 is O(n)” instead of “3n + 5 is O(3n)”. Analysis of Algorithms 41. Asymptotic Algorithm Analysis: the asymptotic analysis of an algorithm determines the running time in big-Oh notation. To perform the asymptotic analysis we find the worst-case number of primitive operations executed as a function of the input size, and we express this function with big-Oh notation. Example: we determine that algorithm arrayMax executes at most 7n − 1/

1 TRUE. Analyzing Algorithms 21. Example 3: Show that … Let c = 2 and n₀ = 5. Analyzing Algorithms 22. Looking at Algorithms: asymptotic notation gives us a language to talk about the run time of algorithms – not just for one case, but how an algorithm performs as the size of the input, n, grows. Tools: series sums, recurrence relations. Analyzing Algorithms 23. Running Time Examples (1). Example 1: a = b/
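The "series sums" tool mentioned above in action: the exact operation count of a triangular double loop is the arithmetic series 1 + 2 + … + n = n(n+1)/2, which is Θ(n²). A small sketch that counts the inner-loop executions directly (function name is illustrative):

```c
/* Counts how many times the innermost statement of a triangular
   double loop runs. The closed form is the arithmetic series
   n(n+1)/2, so the loop is Theta(n^2). */
long long triangular_loop_count(long long n)
{
    long long count = 0;
    for (long long i = 1; i <= n; i++)
        for (long long j = 1; j <= i; j++)
            count++;    /* one "basic operation" per inner iteration */
    return count;
}
```

For n = 100 the count is 100·101/2 = 5050, matching the closed form.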

CSCI 240 Analysis of Algorithms, Dale Roberts. Characteristics of Algorithms: algorithms are precise – each step has a clearly defined meaning (“deterministic”); algorithms are effective – the task is always done as required (“correct”); algorithms have a finite number of steps; algorithms must terminate. /Big-Oh (O) Notation, Dale Roberts. 3 major notations: O(g(n)), big-Oh of g of n, the asymptotic upper bound; Θ(g(n)), Theta of g of n, the asymptotic tight bound; and Ω(g(n)), Omega of g of n, the asymptotic lower bound./

c₁n² + c₂n + c₃ could be a correct step count for the program. Because of the inexactness of what a step count stands for, the exact step count is not very useful for comparing algorithms. Asymptotic efficiency: asymptotic efficiency means studying an algorithm's efficiency for large inputs. To compare two algorithms with running times f(n) and g(n), we need a rough measure that characterizes how/

Algorithms, Lecture #05, Uzair Ishtiaq. Asymptotic Notation. Asymptotic Notation - Example. Θ Notation: the theta notation bounds a function from above and below, so it defines exact asymptotic behavior. A simple way to get the Theta notation of an expression is to drop low-order terms and ignore leading constants. For example, consider the following expression: 3n³ + 6n² + 6000 = Θ(n³). Notation: for a given function g(n), we denote (g(n/
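The Θ claim above can be made concrete with explicit two-sided witnesses. A sketch: c₁ = 3, c₂ = 4, n₀ = 21 is one valid choice (not the only one), giving 3n³ ≤ 3n³ + 6n² + 6000 ≤ 4n³ for all n ≥ 21.

```c
/* Checks the Theta(n^3) sandwich for f(n) = 3n^3 + 6n^2 + 6000:
   3n^3 <= f(n) <= 4n^3 for every n in [n0, n_max].
   Returns 1 if both bounds hold on the range, else 0. */
int theta_holds(long long n0, long long n_max)
{
    for (long long n = n0; n <= n_max; n++) {
        long long f = 3*n*n*n + 6*n*n + 6000;
        if (!(3*n*n*n <= f && f <= 4*n*n*n))
            return 0;   /* one of the bounds failed at this n */
    }
    return 1;
}
```

The lower bound holds for all n ≥ 1 (the dropped terms are positive); the upper bound first holds at n = 21, since 4n³ − 3n³ = n³ must absorb 6n² + 6000.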

http://www.cse.unsw.edu.au/~cs9024 2 Outline: big-oh notation, big-theta notation, big-omega notation, asymptotic algorithm analysis. 3 Analysis of Algorithms. Algorithm: Input → Output. An algorithm is a step-by-step procedure for solving a problem in a finite amount of time. 4 Running Time: most algorithms transform input objects into output objects; the running time of an algorithm typically grows with the input size. Average-case time is/

before termination. Order of growth – the leading term of a formula – expresses the behavior of a function toward infinity. CS 477/677 - Lecture 23. Asymptotic Notations: a way to describe the behavior of functions in the limit – how we indicate running times of algorithms – describe the running time of an algorithm as n grows to ∞. O notation: asymptotic “less than”: f(n) “≤” g(n). Ω notation: asymptotic “greater than”: f(n) “≥” g(n). Θ notation: asymptotic “equality”: f(n/

Ω(n). n² = Ω(n log n). 2n + 1 = Ω(n). Definition of "big Theta": to measure the complexity of a particular algorithm means to find the upper and lower bounds. A new notation is used in this case: we say that T(n) = Θ(g(n)) if/ best-case: ? 1. Total worst-case complexity: ? Total best-case complexity: ? 1. Importance of Asymptotic Analysis – Worst- & Average-Case: asymptotic analysis tells us whether a technique/algorithm will be practical in all cases (worst-case analysis) or in the average case (av.-case /

n time”. Can say Counting Sort runs in... – “time linear in n and k”, or – O(n+k) time... “oh of n plus k time”. 9 Here k means “max range of items”. Asymptotic Notation (“Big-O”). Claim: if f(n) ∈ O(g(n)) and g(n) ∈ O(h(n)) then f(n) ∈ O(h(n)). Proof: We know/ and therefore 0 ≤ f(n) ≤ k₁k₂h(n) ∀ n ≥ max{n₁, n₂}. 10 Big-O Notation Used Everywhere. 11 DOCUMENTATION, COMP. SCI. COURSES, ALGORITHMS RESEARCH. More Examples of Big-O Membership. 12 50, n+50, 6n²+50, n²+10n+50, n³−10n, 2 lg n+50, n+lg n+50, 2n lg n+n+lg n/
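The transitivity proof above combines the two witness pairs into k₁k₂ and max{n₁, n₂}. A numeric spot-check of that combination, with the illustrative instances f(n) = 50n + 50 ∈ O(n²) (k₁ = 100, n₁ = 1) and g(n) = n² ∈ O(n³) (k₂ = 1, n₂ = 1); the constants are my choices, not from the source:

```c
/* Transitivity of Big-O: if f(n) <= k1*g(n) for n >= n1 and
   g(n) <= k2*h(n) for n >= n2, then f(n) <= k1*k2*h(n) for
   n >= max(n1, n2). Spot-checks all three inequalities on a range. */
int transitivity_holds(long long n_max)
{
    const long long k1 = 100, k2 = 1, n0 = 1;   /* n0 = max(n1, n2) */
    for (long long n = n0; n <= n_max; n++) {
        long long f = 50*n + 50, g = n*n, h = n*n*n;
        if (f > k1*g || g > k2*h || f > k1*k2*h)
            return 0;   /* a premise or the conclusion failed */
    }
    return 1;
}
```

The check cannot replace the proof (it covers a finite range), but it makes the role of k₁k₂ and max{n₁, n₂} concrete.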

of Inputs of Different Functions. 23 Asymptotic Analysis: Big-oh (O()). Definition: for T(n) a non-negatively valued function, T(n) is in the set O(f(n)) if there exist two positive constants c and n₀ such that T(n) ≤ cf(n) for all n > n₀. Usage: the algorithm is in O(n²) in the [best, average, worst] case. Meaning/ c₂n is in Ω(n²). 28 Asymptotic Analysis: Big Theta (Θ()). When O() and Ω() meet, we indicate this by using Θ() (big-Theta) notation. Definition: an algorithm is said to be Θ(h(n)) if it is/
