Designed and Presented by Dr. Ayman Elshenawy Elsefy, Dept. of Systems & Computer Eng., Al-Azhar University. Email: email@example.com@yaho. Lectures on Data Structure
Pre-requisites: Students using this book should have knowledge of either an object-oriented or a procedural programming language. Knowledge of basic features is assumed, including primitive data types, operators, control structures, functions (methods), and input and output.
Algorithms and Data Structures: A data structure is a systematic way of organizing and accessing data. An algorithm is a step-by-step procedure for performing some task in a finite amount of time. Our concern is the design of "good" data structures and algorithms, so we must analyze data structures and algorithms in order to classify them as "good".
Algorithmic Performance: There are two aspects of algorithmic performance. Time: instructions take time. How fast does the algorithm perform? What affects its runtime? Space: data structures take space. What kinds of data structures can be used? How does the choice of data structure affect the runtime? We will focus on time: how to estimate the time required for an algorithm, and how to reduce the time required.
Relative costs of algorithms and data structures: Perform analysis of the algorithm when the input sets are very large (asymptotic analysis): 1, 10, 100, 1,000, or 1 million items. If an algorithm takes 5 seconds to process 1,000 items, how long will it take for 1,000,000 items? 5 seconds, or 5 years? You must know before your customer knows.
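The extrapolation question above can be sketched as a small calculation. This is a hypothetical helper, not from the slides: the class and method names are illustrative, and the growth exponent is an assumption about which complexity class the algorithm belongs to (1 for linear, 2 for quadratic, and so on).

```java
public class Extrapolate {
    // Predict the running time for a new input size n1, given a measured
    // time for input size n0, assuming the cost grows as n^exponent.
    static double predictSeconds(double measuredSeconds, long n0, long n1, double exponent) {
        return measuredSeconds * Math.pow((double) n1 / n0, exponent);
    }
}
```

Under a linear assumption, 5 seconds for 1,000 items extrapolates to 5,000 seconds for 1,000,000 items; under a quadratic assumption, the same measurement extrapolates to 5,000,000 seconds, which is why knowing the growth rate matters before the customer finds out.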
Rate of Growth: How does an algorithm's complexity change as the input size grows? Big-O notation uses a capital O ("order") and a formula that expresses the complexity of the algorithm. The formula may have a variable, n, which represents the size of the input.
Common order functions: Constant O(1): an algorithm whose complexity is constant regardless of how large the input is. The 1 does not mean that there is only one operation or that the operation takes a small amount of time. It might take 1 microsecond or it might take 1 hour. The point is that the size of the input does not influence the time the operation takes.
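A minimal sketch of a constant-time operation; the class and method names are illustrative, not from the slides:

```java
public class ConstantTime {
    // Accessing an element by index is O(1): the cost is the same
    // whether the array holds ten elements or ten million.
    static int first(int[] a) {
        return a[0];
    }
}
```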
Common order functions: Linear O(n): an algorithm whose complexity grows linearly with the size of the input. If an input of size 1 takes 5 ms, an input with one thousand items will take 5 seconds. Example: a looping mechanism that accesses each member.
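A sketch of the looping mechanism the slide describes; the names are illustrative:

```java
public class LinearTime {
    // Summing all elements touches each member exactly once,
    // so the running time grows linearly with the array length: O(n).
    static long sum(int[] a) {
        long total = 0;
        for (int x : a) {
            total += x;
        }
        return total;
    }
}
```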
Common order functions: Logarithmic O(log n): an algorithm whose complexity is logarithmic in its input size. Many divide-and-conquer algorithms fall in this class. The binary search tree Contains method implements an O(log n) algorithm.
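As a sketch of the O(log n) idea, here is a binary search over a sorted array (an array-based analogue of the Contains method mentioned on the slide; the class name is hypothetical):

```java
public class LogTime {
    // Binary search halves the candidate range on every step,
    // so a sorted array of n elements needs at most about log2(n) comparisons.
    static boolean contains(int[] sorted, int key) {
        int lo = 0, hi = sorted.length - 1;
        while (lo <= hi) {
            int mid = (lo + hi) >>> 1;       // unsigned shift avoids overflow
            if (sorted[mid] == key) return true;
            if (sorted[mid] < key) lo = mid + 1;
            else hi = mid - 1;
        }
        return false;
    }
}
```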
Common order functions: Linearithmic O(n log n): a linearithmic, or log-linear, algorithm is one that has a complexity of O(n log n). Some divide-and-conquer algorithms fall into this bucket, e.g. merge sort and quick sort.
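A compact merge sort sketch, one of the O(n log n) divide-and-conquer examples the slide names (the class name is illustrative): the array is split in half log n times, and each level of splitting does O(n) merge work.

```java
import java.util.Arrays;

public class MergeSortDemo {
    // Recursively split the array, then merge the two sorted halves: O(n log n).
    static int[] sort(int[] a) {
        if (a.length <= 1) return a;
        int mid = a.length / 2;
        int[] left = sort(Arrays.copyOfRange(a, 0, mid));
        int[] right = sort(Arrays.copyOfRange(a, mid, a.length));
        int[] out = new int[a.length];
        int i = 0, j = 0, k = 0;
        while (i < left.length && j < right.length)
            out[k++] = (left[i] <= right[j]) ? left[i++] : right[j++];
        while (i < left.length) out[k++] = left[i++];
        while (j < right.length) out[k++] = right[j++];
        return out;
    }
}
```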
Common order functions: Quadratic O(n²): an algorithm whose complexity is quadratic in its input size. Such algorithms do not scale well as the input grows. Example: an array with 1,000 integers requires 1,000,000 operations to complete; an input with 1,000,000 items would take one trillion (1,000,000,000,000) operations. If each operation takes 1 ms to complete, an O(n²) algorithm with an input of 1,000,000 items will take nearly 32 years to complete. Making that algorithm 100 times faster would still take about 116 days. (Bubble sort is an example.)
Common order functions: Quadratic O(n²): bubble sort is an example, built from nested loops.
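The nested loops the slide refers to can be sketched as a bubble sort (the class name is illustrative): the outer loop runs about n times and the inner loop up to n times per pass, giving the quadratic cost.

```java
public class BubbleSortDemo {
    // Classic bubble sort: repeatedly swap adjacent out-of-order pairs.
    // Outer loop: n - 1 passes; inner loop: up to n - 1 comparisons per pass,
    // so roughly n * n operations in total: O(n^2).
    static void bubbleSort(int[] a) {
        for (int i = 0; i < a.length - 1; i++) {
            for (int j = 0; j < a.length - 1 - i; j++) {
                if (a[j] > a[j + 1]) {
                    int t = a[j];
                    a[j] = a[j + 1];
                    a[j + 1] = t;
                }
            }
        }
    }
}
```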
What we are measuring: The amount of time an operation takes to complete (operational complexity: time). The amount of resources (memory) an algorithm uses (resource complexity: space). An algorithm that runs ten times faster but uses ten times as much memory might be perfectly acceptable in a server environment with vast amounts of available memory, but may not be appropriate in an embedded environment where available memory is severely limited.
What we are measuring: Common operations: comparison operations (greater than, less than, equal to); assignments and data swapping; memory allocations. Searching involves comparisons only (a read-only operation; no assignment is done). Sorting involves comparisons, assignments, and allocations.
Comparing Growth Rates
Tools of measuring: Experimental Studies. Execute the algorithm on various test inputs and record the actual time spent in each execution, e.g. using System.currentTimeMillis(). Perform several experiments on many different test inputs of various sizes. Plot the performance of each run: input size, n, versus running time, t. Apply statistical analysis that seeks to fit the best function of the input size to the experimental data.
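A minimal timing harness along these lines, using System.currentTimeMillis() as the slide suggests; the TimingDemo class and its helper method are hypothetical names for illustration:

```java
public class TimingDemo {
    // Record wall-clock time around a task: the basic measurement
    // behind an experimental study of running time.
    static long timeMillis(Runnable task) {
        long start = System.currentTimeMillis();
        task.run();
        return System.currentTimeMillis() - start;
    }
}
```

In practice each input size would be timed several times and the results averaged, since wall-clock measurements on a shared machine are noisy.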
Tools of measuring: Limitations of Experimental Studies. Experiments can be done only on a limited set of test inputs. It is difficult to compare two algorithms run in two different environments. We have to fully implement and execute an algorithm in order to study its running time experimentally.
Tools of measuring: We therefore develop a general way of analyzing the running times of algorithms that takes into account all possible inputs, evaluates the relative efficiency of any two algorithms independently of their environments, and can be performed by studying a high-level description of the algorithm without actually implementing it or running experiments on it.
Counting Primitive Operations: A primitive operation is a low-level instruction that takes constant time. Counting the primitive operations can be used as a measure of algorithm performance.
Counting Primitive Operations: We can perform this analysis directly on high-level pseudo-code instead. We define a set of primitive operations such as the following: assigning a value to a variable; calling a method; performing an arithmetic operation (for example, adding two numbers); comparing two numbers; indexing into an array; following an object reference; returning from a method.
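As a sketch of the counting idea, the loop below finds the maximum of an array while tallying the comparisons it performs; the OpCount name and the decision to count only comparisons are illustrative assumptions, since a full count would also include the assignments, index operations, and the return.

```java
public class OpCount {
    // Finding the max of n elements costs 1 initial assignment,
    // n - 1 comparisons, and up to n - 1 further assignments: O(n) in total.
    // Here we instrument just the comparison count.
    static long countComparisons(int[] a) {
        long comparisons = 0;
        int max = a[0];                      // 1 assignment
        for (int i = 1; i < a.length; i++) {
            comparisons++;                   // one a[i] > max comparison per iteration
            if (a[i] > max) {
                max = a[i];                  // at most n - 1 assignments
            }
        }
        return comparisons;
    }
}
```

For an array of n elements the method reports exactly n - 1 comparisons, matching the hand count above.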