Algorithms - some definitions


1 Algorithms - some definitions
Algorithms are simply a list of steps required to solve some particular problem. Or: an algorithm is a sequence of unambiguous instructions for solving a computational problem. They are designed as sequences of processes carried out by computer programs.

2 Examples of Algorithms
Examples include:
- Sorting
- Determining if a student qualifies for financial aid
- Determining the steps to set up a dating service
- How to search through a maze

3 Algorithms
In some cases we have only one algorithm for a problem, e.g. finding the largest number in an (unsorted) list of numbers. Or the problem is so easy to solve that no analysis is needed, e.g. printing out the guests at a party.
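A minimal sketch of the find-the-largest example in Java (the method name is illustrative, not from the slides; assumes a non-empty array):

// Returns the largest value in an unsorted array; assumes numbers.length > 0.
static int findLargest(int[] numbers) {
    int largest = numbers[0];
    for (int i = 1; i < numbers.length; i++) {
        if (numbers[i] > largest) {
            largest = numbers[i];      // remember the biggest value seen so far
        }
    }
    return largest;
}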

4 Algorithms
Some other problems have many known algorithms; we obviously want to choose the "best" one. And some problems have no known feasible algorithm.

5 What is the "best" Algorithm?
Traditionally we focused on two questions:
1. How fast does it run? (In the early days this was measured by timing an implementation of the algorithm, usually a sorting algorithm.)
2. How much memory does it require?

6 Analysis of Algorithms
BUT - programs depend on the operating system, the machine, the compiler/interpreter used, etc. Different operating systems run at different speeds, and some compilers can arrange the code so it will run faster. So we need to find a way to measure the efficiency of the algorithm THAT IS INDEPENDENT OF THE MACHINE it is run on.

7 Analysis of algorithms
So, analysis of algorithms compares algorithms, not programs. It is based on the premise that: 1. the longer the algorithm takes, the longer its implementation will run; 2. sorting 1 million items ought to take longer than sorting 1,000.

8 Algorithms Measuring Performance
But if we compare algorithms (not yet implemented), how can we express their performance? How can we "measure" the performance of an algorithm?

9 Analysis of Algorithms
We want an expression that works on any computer. This is only possible by stating the efficiency in terms of some critical operations. These operations depend on the problem.

10 Measuring Performance of Algorithms
We could, for instance, say that in sorting algorithms the critical operation is the number of times two elements are compared. What is/are the critical operation(s) in your checkers program?
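As a minimal sketch (selection sort is used here only as a familiar example; the counter and method name are illustrative additions), the critical operation can be counted directly:

// Selection sort, instrumented to count its critical operation:
// comparisons between two elements.
static long sortCountingComparisons(int[] a) {
    long comparisons = 0;
    for (int i = 0; i < a.length - 1; i++) {
        int min = i;
        for (int j = i + 1; j < a.length; j++) {
            comparisons++;                 // one element-to-element comparison
            if (a[j] < a[min]) min = j;
        }
        int tmp = a[i]; a[i] = a[min]; a[min] = tmp;   // swap into place
    }
    return comparisons;                    // always n(n-1)/2 for n elements
}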

11 Analysis of Algorithms
In general we do analysis of algorithms using the RAM model (Random Access Machine): instructions are executed one after the other; there is no concurrency (multiple operations running at the same time); and each algorithm operation is measured one at a time, e.g. getting a random number for placing the checkers.

12 Measuring the efficiency of an algorithm
Basic operations take the same time (constant time), i.e. we normally say that each line (step) in the algorithm takes time 1 (one).

13 Analysis of Algorithms
But, if each line of code takes constant time, then the whole algorithm (any algorithm) must take constant time. Wrong! Some algorithms do take constant time, e.g. printing the numbers 1 to some fixed limit, but most do not.

14 Varying number of steps in an algorithm
The majority of algorithms vary their number of steps based on the size of the problem we're trying to solve. Let's look at an example.

15 Number of steps
Most algorithms vary their number of steps. The code below has three lines, but those three lines will execute N times; if N = 100, they will execute 100 times:

for i = 1 to N {
    a = a + 2
    i = i + 1
}

In the second version, i doubles on every pass, so the body executes only about log2(N) times:

for i = 1 to N {
    a = a + 2
    i = i * 2
}

So we must also consider the number of steps (lines of code) it will take to process the N items. A sketch contrasting the two loops follows.
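As a minimal sketch in Java (the method and variable names are illustrative, not from the slides), counting the passes makes the difference concrete:

// Contrasts the step counts of the two loops above.
static void compareStepCounts(int n) {
    int linearSteps = 0;
    for (int i = 1; i <= n; i = i + 1) {
        linearSteps++;                 // executes n times
    }
    int logSteps = 0;
    for (int i = 1; i <= n; i = i * 2) {
        logSteps++;                    // executes about log2(n) times
    }
    System.out.println(linearSteps + " vs " + logSteps);   // for n = 100: 100 vs 7
}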

16 Analysis of Algorithms
Therefore the efficiency of an algorithm is always stated as a function of the problem's size. We generally use the variable n to represent the problem size, i.e. the number of items the algorithm processes, not the number of lines of code.

17 N = the size of the problem
We could find out that, IMPLEMENTING on a Windows machine, QuickSort takes a number of seconds given by some formula in n, e.g. 0.6n^2 + n. Plug in a value for n and you have how long it takes. Here n represents the size of the problem, e.g. the number of items to sort or the number of times a loop will execute.

18 Asymptotic Analysis
Another term: the asymptotic analysis of an algorithm describes the relative efficiency of the algorithm as n gets very large. Take for instance the value n = 20,000,000 (twenty million). Then n^2 (4 x 10^14 steps) takes far longer than just n (2 x 10^7 steps).

19 Comparison of Algorithms
If you're writing small programs, algorithm analysis is not important; when you're dealing with small input sizes, most algorithms will do. When the input size is very large, things change.

20 A simple comparison
Let's assume that you have three algorithms to sort a list:
f(n) = n log2 n    // f(n) runs in n log n time
g(n) = n^2         // g(n) runs in n^2 time
h(n) = n^3         // h(n) runs in n^3 time
Let's also assume that each step takes 1 microsecond (10^-6 seconds). For n = 100,000 that works out to about 1.7 seconds for f, about 2.8 hours for g, and about 31.7 years for h.
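As a minimal sketch (the method name and the one-microsecond step cost are assumptions for illustration), the comparison can be reproduced numerically:

// Estimates running times in seconds, assuming one microsecond per step.
static void compareRunningTimes(double n) {
    double stepSeconds = 1e-6;                                 // assumed cost of one step
    double f = n * (Math.log(n) / Math.log(2)) * stepSeconds;  // n log2 n
    double g = n * n * stepSeconds;                            // n^2
    double h = n * n * n * stepSeconds;                        // n^3
    System.out.printf("n log n: %.1f s   n^2: %.1f s   n^3: %.1f s%n", f, g, h);
}

Calling compareRunningTimes(100000) prints roughly 1.7 s, 10,000 s, and 1,000,000,000 s, matching the figures above.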

21 Higher order term
When we compute the efficiency of algorithms, we will do it in terms of common functions: polynomials such as n^2 (a nested loop), logarithms such as log2 N, exponentials such as 2^n, and products of these functions. Analyzing the table given earlier, we can see that we are interested in the terms with the highest order.

22 If we have a function f(n) that we are measuring and we say that:
f(n) = n^3 + n^2, then for the case when n is equal to 100,000 (again at one step per microsecond) the running time is about 31.7 years for the n^3 term but only a couple of hours (about 2.8) for the n^2 term. If the program is to run for 31.7 years because of n^3, a couple of hours more from n^2 does not make much difference!

23 Higher order term
In the case above we say that f(n) is O(n^3), meaning that f(n) is of the order n^3. This is called big-O notation. We discard the lower order term and we choose the highest order term.

24 Higher order term - Big O
Big O also disregards any constant multiplying the term of highest order, as well as any term of smaller order: f(n) = 10,000,000n^3 is O(n^3). The large constant, 10,000,000, is not important compared to n^3 for large values of n.

25 Example of Big O(1) - a constant

// Only tests the first element in the array of strings - no loops involved.
boolean isFirstElementNull(String[] strings) {
    if (strings[0] == null)      // note: this method has no loops
        return true;
    return false;
}

// A loop can also be Big O(1) if it runs a constant number of times,
// independent of the input size:
for (int i = 1; i < 50; i++)     // will never execute more than 50 times
    System.out.println(" " + i);
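For contrast, a sketch (not from the slides; the method name is illustrative) of a loop that is not O(1), because its step count grows with the input size n:

// O(n): the body executes n times, so the cost grows linearly with n.
static void printAll(int n) {
    for (int i = 1; i <= n; i++) {
        System.out.println(" " + i);   // runs once per value of i
    }
}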

26 Common Functions - Exponential
Exponential algorithms are those that take time k^n where k is a constant, e.g. 2^n or 3^n. Algorithms that grow at this rate are suitable only for small problems. O(2^n) denotes an algorithm whose running time doubles with each additional element in the input data set; factorial time, n!, grows faster still.

27 Common Functions - Exponential
Recursive computation of Fibonacci numbers is also a good example of an O(2^n) algorithm:

public int fib(int n) {
    if (n <= 1) return n;
    else return fib(n - 2) + fib(n - 1);
}

Each new number is the sum of the two numbers before it, fib(n - 2) + fib(n - 1), so each call spawns two further calls and the number of calls grows exponentially.
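A sketch (the counter and method name are illustrative additions) that makes the blow-up visible by counting calls:

// The same recursion, instrumented to count how many calls are made.
static long calls = 0;

static int countedFib(int n) {
    calls++;                               // one node in the call tree
    if (n <= 1) return n;
    return countedFib(n - 2) + countedFib(n - 1);
}

For n = 30 this makes 2,692,537 calls, and increasing n by one roughly doubles the count.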

28 Exponential problems
Much of the work on developing algorithms today is focused on these problems. There is a large variation in the size of the various exponential functions, for example 2^n versus 3^n, but for large n the functions all become huge.

29 Comparison of Algorithms
Big-O notation: a notation that expresses computing time (complexity) as the term in a function that increases most rapidly relative to the size of a problem. If f(N) = N^4 + N^2 + 10N + 50, then f(N) is O(N^4): the largest exponent is most important. N represents the size of the problem.

30 Worst Case and Best Case
Inputs vary in the way they are organized, and this can influence the number of critical operations performed. Suppose that we are searching for an element in an ordered list. If the target key is the first in the list, our function takes constant time, Big O(1), e.g. insertLast() in an ArrayList. If the target key is not in the list, our function takes O(n), where n is the size of the list, because we have to search the whole list to determine it is not there.
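A minimal sketch of such a search (the method name is illustrative):

// Linear search: best case O(1) if the target is at index 0,
// worst case O(n) if the target is absent and all n elements are checked.
static int search(int[] list, int target) {
    for (int i = 0; i < list.length; i++) {
        if (list[i] == target)
            return i;      // found: the best case stops at i == 0
    }
    return -1;             // not found: the worst case examined every element
}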

31 Worst Case and Best Case
The examples above are referred to as best case analysis and worst case analysis. Which is the really relevant case? Worst case is more important because it gives us a bound on how long the function might have to run. Your boss will want to know the WORST CASE!

32 Average Case
In some situations neither the best case nor the worst case analysis expresses the performance of an algorithm well. Average case analysis can be used if necessary. Still, average case is tricky: it may be cumbersome to do an average case analysis of non-trivial algorithms, and in most cases the "order" of the average case is the same as that of the worst case.

33 Comparison of Rates of Growth of algorithms
N     log2N   N log2N   N^2      N^3        2^N
1     0       0         1        1          2
2     1       2         4        8          4
4     2       8         16       64         16
8     3       24        64       512        256
16    4       64        256      4,096      65,536
32    5       160       1,024    32,768     4,294,967,296
64    6       384       4,096    262,144    about 1.8 x 10^19

34 Comparison of Linear and Binary Searches
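As a hedged sketch of the two searches being compared (assuming a sorted array for the binary case; the method name is illustrative): binary search halves the remaining range on each comparison, so it needs at most about log2(N) steps, versus up to N for a linear search.

// Binary search on a sorted array: at most ~log2(n) comparisons.
static int binarySearch(int[] sorted, int target) {
    int lo = 0, hi = sorted.length - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;            // midpoint, written to avoid overflow
        if (sorted[mid] == target) return mid;
        if (sorted[mid] < target) lo = mid + 1;  // discard the lower half
        else hi = mid - 1;                       // discard the upper half
    }
    return -1;                                   // not found
}

For an array of 1,000,000 elements that is at most about 20 comparisons, against up to 1,000,000 for linear search.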

35 Big-O Comparison of Array Operations
Operation | Unsorted List | Sorted List
[Table values lost; searching a sorted list (binary search) is O(log N), versus O(N) for an unsorted list.]

36 Review Questions
1. What problems arise when we "measure" the performance of an algorithm? What problems arise if we time it?
2. What is a "critical operation"?
3. How then do we measure the efficiency of an algorithm?
   1. The efficiency of an algorithm is stated as a function of the problem size; we generally use the variable N to represent the problem size.
   2. We must also consider the number of steps it will take to process the N items.
4. What is big-O notation? What are the common functions? Give an example of each: constant O(1), logarithmic O(log2 N), linear O(n), poly-logarithmic O(n log n), polynomial O(n^2), exponential O(2^n).
5. What is worst case analysis?

37 What is Big O - Evaluation
Because big-O notation discards constants on the running time and ignores efficiency for low input sizes, it does not always reveal the fastest algorithm in practice or for practically sized data. But the approach is still very effective for comparing the scalability of various algorithms as input sizes become large.

38 Nano Computers – Quantum version
Quantum nanocomputers hold each bit of data as a quantum state of the computer. They use molecule-sized transistors and logic gates, and by means of quantum mechanics, waves store the state of each nanoscale component.

39 Quantum Computers
With the correct setup, constructive interference of the waves would emphasize the wave patterns that hold the right answer, while destructive interference would prevent any wrong answers.

40 Recent Advances
Field-Effect Transistors (FETs) have been built from a single molecule. "FETs are the powerhouse of modern electronics."

41 Quantum Computing
USC's new quantum computing center houses the first commercial, operational quantum computer from D-WAVE, a Canadian firm. D-Wave's revolutionary quantum computer was recently purchased by Lockheed Martin. USC and Lockheed Martin have recently completed successful testing of various algorithms.

42 The D-Wave quantum computer
The D-Wave quantum computer has 128 quantum bits (called “qubits”), which can encode the two digits of one and zero at the same time – as opposed to traditional bits, which can encode distinctly either a one or a zero. 

43 Superposition - double processing
This property, called “superposition,” along with the ability of quantum states to "tunnel" through energy barriers, allows the present D-WAVE device to perform optimization calculations much faster - exponentially faster than traditional computers.

44 Absolute zero storage necessary
The facility keeps the D-Wave hardware at near absolute zero temperatures and contains powerful shielding to block out electromagnetic interference. "It's one of the coldest and most magnetically shielded places on earth." Absolute zero is the temperature at which entropy reaches its minimum, eliminating thermal energy. It is defined as 0 kelvin, or -273.15° Celsius.

45 Entropy
Entropy is the measure of the level of disorder in a closed but changing system. The higher the entropy, the higher the disorder and the lower the availability of the system's energy to do useful work. Entropy stops changing when a system's temperature reaches absolute zero.

46 Quantum Computers
Recently, the Lockheed team ran programs whose algorithm performances ranged from n^2 to exponential; all completed execution at virtually the same time. In the near future, you will not have to study Big O: the speed with which processing is done with quantum computing will make it irrelevant.

