DATA STRUCTURES Introduction: Basic Concepts and Notations


DATA STRUCTURES Introduction: Basic Concepts and Notations. Complexity analysis: time, space, and trade-offs. Algorithmic notations: Big O notation, with an introduction to omega, theta, and little o notation.

Basic Concepts and Notations. Algorithm: an outline, the essence of a computational procedure; step-by-step instructions. Program: an implementation of an algorithm in some programming language. Data Structure: the organization of data needed to solve the problem.

Algorithmic Problem: a specification of the input, and a specification of the output as a function of the input. There are infinitely many input instances satisfying the specification. For example: a sorted, non-decreasing sequence of natural numbers of non-zero, finite length: 1, 20, 908, 909, 100000, 1000000000.

Algorithmic Solution: an algorithm describes actions on the input instance, transforming the specified input into the specified output. There are infinitely many correct algorithms for the same algorithmic problem.

What is a Good Algorithm? An efficient one: efficient in running time and in space used. Efficiency is measured as a function of input size: the number of bits in an input number, or the number of data elements (numbers, points).

Complexity of Algorithms: Time and space are the two measures of an algorithm's efficiency. Time is measured by counting the number of key operations. Space is measured by counting the maximum amount of memory needed by the algorithm.

Time Complexity. Worst-case: an upper bound on the running time for any input of a given size. Average-case: assume all inputs of a given size are equally likely. Best-case: a lower bound on the running time.

Time Complexity – Example: sequential search in a list of size n. Worst-case: n comparisons. Best-case: 1 comparison. Average-case: n/2 comparisons.

Example 1 //linear for(int i = 0; i < n; i++) { cout << i << endl; }

Ans: O(n)

Example 2 //quadratic for(int i = 0; i < n; i++) { for(int j = 0; j < n; j++){ //do swap stuff, constant time } }

Ans: O(n^2)

Example 3 for(int i = 0; i < 2*n; i++) { cout << i << endl; }

At first you might say that the upper bound is O(2n); however, we drop constants, so it becomes O(n).

Example 4 //linear for(int i = 0; i < n; i++) { cout << i << endl; }   //quadratic for(int j = 0; j < n; j++){ for(int k = 0; k < n; k++){ //do constant time stuff } }

Ans: In this case we add each loop's Big O, giving n + n^2. O(n^2 + n) is not an acceptable answer, since we must drop the lower-order term. The upper bound is O(n^2). Why? Because it has the largest growth rate.

Example 5 for(int i = 0; i < n; i++) { for(int j = 0; j < 2; j++){ //do stuff } }

Ans: The outer loop runs n times, the inner loop 2 times; thus we have 2n, and dropping the constant gives O(n).

Example 6 for(int i = 1; i < n; i *= 2) { cout << i << endl; }

Instead of simply incrementing, 'i' is doubled on each pass, so the loop runs about log2(n) times rather than n times. Thus the loop is O(log n).

Example 7 for(int i = 0; i < n; i++) { //linear for(int j = 1; j < n; j *= 2){ // log(n) //do constant time stuff } }

Ans: O(n*log(n))

Question for(int i = 1; i <= n; i++) { for(int j = 1; j < i; j++) { //do constant time stuff } }

Time Complexity (not limited to key operations). Ex: Average(a, N): 1. sum = 0 2. repeat step 3 for i = 0 to N-1 by 1 3. sum = sum + a[i] [End of step 2 loop] 4. avg = sum/N 5. Write avg 6. Return

Equivalent C++ function: void Average(int a[], int n) { int sum = 0; for(int i = 0; i < n; i++) sum = sum + a[i]; float avg = (float)sum / n; cout << avg; return; }

Asymptotic notations: Algorithm complexity is a rough estimate of the number of steps performed by a given computation, depending on the size of the input data. It is measured through asymptotic notation O(g), where g is a function of the input data size. Examples: Linear complexity O(n) – all elements are processed once (or a constant number of times). Quadratic complexity O(n^2) – each of the elements is processed n times.

O-notation (asymptotic upper bound): f(n) = O(g(n)) (read as "f(n) is big oh of g(n)") means that f(n) is asymptotically smaller than or equal to g(n). Therefore, in an asymptotic sense, g(n) is an upper bound for f(n). Formally, f(n) = O(g(n)) if there are positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0. g(n) is said to be an upper bound of f(n).

Example: "The running time is O(n^2)" means there is a function f(n) that is O(n^2) such that for any value of n, no matter what particular input of size n is chosen, the running time on that input is bounded from above by the value f(n). 3n^2 + n/2 + 12 ∈ O(n^2). 4n·log2(3n+1) + 2n − 1 ∈ O(n log n)

Ω-notation (asymptotic lower bound): f(n) = Ω(g(n)) (read as "f(n) is omega of g(n)") means that f(n) is asymptotically bigger than or equal to g(n). Therefore, in an asymptotic sense, g(n) is a lower bound for f(n). Formally, f(n) = Ω(g(n)) if there are positive constants c and n0 such that f(n) ≥ c·g(n) ≥ 0 for all n ≥ n0. g(n) is said to be a lower bound of f(n).

The definition of asymptotically smaller implies the following order: 1 < log n < n < n log n < n^2 < n^3 < 2^n < n!

Θ-notation (asymptotically tight bound): f(n) = Θ(g(n)) (read as "f(n) is theta of g(n)") means that f(n) is asymptotically equal to g(n). Formally, f(n) = Θ(g(n)) if there exist positive constants n0, c1 and c2 such that c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0. g(n) is said to be a tight bound of f(n).

Little o notation: describes a strict upper bound (an upper bound that is not asymptotically tight) on the asymptotic growth rate of the function f. f(n) is little oh of g(n) iff f(n) is asymptotically smaller than g(n). Formally, f(n) = o(g(n)) if for any positive constant c > 0, there exists a constant n0 > 0 such that 0 ≤ f(n) < c·g(n) for all n ≥ n0. Equivalently, f(n) = o(g(n)) (read as "f of n is little oh of g of n") iff f(n) = O(g(n)) and f(n) ≠ Ω(g(n)).

ω-notation: the little omega notation describes a strict lower bound (a lower bound that is not asymptotically tight) on the asymptotic growth rate of the function f. f(n) is little omega of g(n) iff f(n) is asymptotically greater than g(n). Formally, f(n) = ω(g(n)) if for any positive constant c > 0, there exists a constant n0 > 0 such that f(n) > c·g(n) ≥ 0 for all n ≥ n0. Equivalently, f(n) = ω(g(n)) (read as "f of n is little omega of g of n") iff f(n) = Ω(g(n)) and f(n) ≠ O(g(n)).

Notes on o-notation: O-notation may or may not be an asymptotically tight upper bound. 2n^2 = O(n^2) is tight, but 2n = O(n^2) is not tight. o-notation is used to denote an upper bound that is not tight: 2n = o(n^2), but 2n^2 ≠ o(n^2). The difference: the bound need only hold for some positive constant c in O-notation, but for all positive constants c in o-notation. In o-notation, f(n) becomes insignificant relative to g(n) as n approaches infinity. Caution: beware of very large constant factors. An algorithm with running time 1,000,000·n is still O(n) but might be less efficient than one running in time 2n^2, which is O(n^2).

Intuition for Asymptotic Notation. Big-Oh: f(n) is O(g(n)) if f(n) is asymptotically less than or equal to g(n). Big-Omega: f(n) is Ω(g(n)) if f(n) is asymptotically greater than or equal to g(n). Big-Theta: f(n) is Θ(g(n)) if f(n) is asymptotically equal to g(n).

Relations Between Θ, O, Ω (Comp 122)

Time-space tradeoff: The best algorithm for solving a problem would be one that requires less time to complete and takes less space in memory. But in practice it is not always possible to achieve both of these objectives, so we have to sacrifice one at the cost of the other. A time-space tradeoff is a situation where memory use can be reduced at the cost of slower program execution (and, conversely, computation time can be reduced at the cost of increased memory use).

Classification of Data Structures. Primitive Data Structures: Integer, Real, Character, Boolean. Non-Primitive Data Structures: Linear Data Structures (Array, Stack, Queue, Linked List) and Non-Linear Data Structures (Tree, Graph).

Data Structure Operations. Data structures are processed by using certain operations. Traversing: accessing each record exactly once so that certain items in the record may be processed. Searching: finding the location of the record with a given key value, or finding the locations of all the records that satisfy one or more conditions. Inserting: adding a new record to the structure. Deleting: removing a record from the structure.

Special Data Structure Operations. Sorting: arranging the records in some logical order (alphabetical or numerical order). Merging: combining the records in two different sorted files into a single sorted file.

Thank You