# Logarithms and Exponents


Comp3050 Logarithms and Exponents

A quick revision: we know that log_b a = c exactly when a = b^c.
We also know that if b = 2, we may omit writing the base of the logarithm, i.e. log 1024 = 10.

Some rules for logarithms
If a, b, and c are positive real numbers, then:
- log_b (ac) = log_b a + log_b c
- log_b (a/c) = log_b a − log_b c
- log_b (a^c) = c · log_b a
- b^(log_c a) = a^(log_c b)
- log_b a = log_c a / log_c b (change of base)
- log 2 = 1 (base 2)
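These rules can be checked numerically. A quick sanity check in Python (the sample values a = 8, b = 2, c = 4 are arbitrary, not from the slides):

```python
import math

# Spot-check each logarithm rule with arbitrary sample values.
a, b, c = 8.0, 2.0, 4.0

assert math.isclose(math.log(a * c, b), math.log(a, b) + math.log(c, b))  # product
assert math.isclose(math.log(a / c, b), math.log(a, b) - math.log(c, b))  # quotient
assert math.isclose(math.log(a ** c, b), c * math.log(a, b))              # power
assert math.isclose(b ** math.log(a, c), a ** math.log(b, c))             # base swap
assert math.isclose(math.log(a, b), math.log(a, c) / math.log(b, c))      # change of base
assert math.log(2, 2) == 1.0                                              # log_2 2 = 1
print("all rules hold")
```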

Exponent rules: we also know that
- (b^a)^c = b^(ac)
- b^a · b^c = b^(a+c)
- b^a / b^c = b^(a−c)

Based on these rules: 2^(log n) = n^(log 2) = n (log taken base 2).

More examples: 4^n = (2^2)^n = 2^(2n) (exponent rule 1); log 2^n = n (log rule 3).
What will log 4^n be?
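These identities can be confirmed numerically, and the same check answers the exercise: log 4^n = log (2^2)^n = log 2^(2n) = 2n. A sketch (the sample values are arbitrary):

```python
import math

b, a, c = 2.0, 3.0, 5.0
assert math.isclose((b ** a) ** c, b ** (a * c))    # (b^a)^c = b^(ac)
assert math.isclose(b ** a * b ** c, b ** (a + c))  # b^a * b^c = b^(a+c)
assert math.isclose(b ** a / b ** c, b ** (a - c))  # b^a / b^c = b^(a-c)

# 2^(log n) = n^(log 2) = n, with log taken base 2
n = 1024
assert math.isclose(2 ** math.log2(n), n)
assert math.isclose(n ** math.log2(2), n)

# Exercise: log 4^n = log (2^2)^n = log 2^(2n) = 2n
for n in range(1, 20):
    assert math.log2(4 ** n) == 2 * n
```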

More examples: Algorithm A uses 10 n log n operations; Algorithm B uses n² operations. Do you think Algorithm A will always be better than Algorithm B? At what value of n does the situation change (i.e. what is n₀)? What if Algorithm B were n^n?

Explanation: tabulating both operation counts (log base 10, matching the table's 6.02 entry for n = 2):

| n  | 10 n log n | n²  |
|----|------------|-----|
| 1  | 0          | 1   |
| 2  | 6.02       | 4   |
| 3  | 14.31      | 9   |
| 4  | 24.08      | 16  |
| 5  | 34.95      | 25  |
| 6  | 46.69      | 36  |
| 7  | 59.16      | 49  |
| 8  | 72.25      | 64  |
| 9  | 85.88      | 81  |
| 10 | 100        | 100 |
| 11 | 114.55     | 121 |
| 12 | 129.50     | 144 |
| 13 | 144.81     | 169 |
| 14 | 160.46     | 196 |
| 15 | 176.41     | 225 |
| 16 | 192.66     | 256 |
| 17 | 209.18     | 289 |

We observe that up to n = 10, Algo A uses more operations; after n = 10, Algo B uses more. So it is not always the case that n log n is less than n².
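The crossover point can also be found programmatically. A small sketch, using the base-10 logarithm that the table's 6.02 entry implies:

```python
import math

def algo_a(n):  # operation count for Algorithm A
    return 10 * n * math.log10(n)

def algo_b(n):  # operation count for Algorithm B
    return n * n

# Find the first n at which A stops being more expensive than B.
n0 = next(n for n in range(2, 1000) if algo_a(n) <= algo_b(n))
print(n0)  # -> 10
```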

Three types of bounds There are three bounds:
Big O, big Ω (Omega), and big Θ (Theta). We saw two functions, 10 n log n and n², and we saw that for different n's they differ. So if we plot them:

Graphs plotted

Big O We may informally call Big O the WORST CASE bound, since it is an asymptotic upper bound on the running time.

For this example: f(n) = 10 n log n and g(n) = n². Since 10 n log n ≤ 1 · n² for all n ≥ 10, f(n) is O(n²).

Big Ω We may informally call Big Ω the BEST CASE bound, since it is an asymptotic lower bound.

Making big Omega f(n) = 10 n log n, g(n) = n². So g(n) is Ω(n log n):
n² ≥ c · n log n for c = 10 when n ≥ n₀ (= 10).
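The claimed constants can be verified directly. A sketch checking that n² ≥ c · n log n holds with c = 10 once n ≥ n₀ = 10 (base-10 log, as in the earlier table):

```python
import math

c, n0 = 10, 10
# n^2 dominates c * n * log n from n0 onwards, witnessing n^2 = Omega(n log n).
assert all(n * n >= c * n * math.log10(n) for n in range(n0, 100_000))

# The bound genuinely needs n0: it fails just below it.
assert 9 * 9 < c * 9 * math.log10(9)
print("Omega witness holds for n >=", n0)
```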

Big Θ A function f(n) is Θ(g(n)) when it is both O(g(n)) and Ω(g(n)), i.e. bounded above and below by g(n) up to constant factors.

Analysis of Algorithms
Intuition for Asymptotic Notation:
- Big-Oh: f(n) is O(g(n)) if f(n) is asymptotically less than or equal to g(n)
- big-Omega: f(n) is Ω(g(n)) if f(n) is asymptotically greater than or equal to g(n)
- big-Theta: f(n) is Θ(g(n)) if f(n) is asymptotically equal to g(n)
- little-oh: f(n) is o(g(n)) if f(n) is asymptotically strictly less than g(n)
- little-omega: f(n) is ω(g(n)) if f(n) is asymptotically strictly greater than g(n)

General rule Little-o is called a strict upper bound.
f(n) = 12n² + 6n is o(n³). The little-omega strict lower bound for it is ω(n).

Question Bill has an algorithm, find2D, to find an element x in an n × n array A. The algorithm find2D iterates over the rows of A and calls the algorithm arrayFind on each row, until x is found or it has searched all rows of A.
- What is the worst-case running time of find2D in terms of n?
- What is the worst-case running time of find2D in terms of N, where N is the total size of A?
- Would it be correct to say that find2D is a linear-time algorithm? Why or why not?

Solution: arrayFind on a row of length n is O(n) in the worst case, and find2D may call it on all n rows, so find2D runs in O(n²) time in terms of n. Since N = n², that is O(N) in terms of the total size of A. So calling find2D "linear-time" is correct only if input size is measured as N; measured in n it is quadratic.
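A sketch of the two algorithms as the question describes them (function names mirror the slide; the implementation details are illustrative):

```python
def array_find(x, row):
    """Linear scan of one row: O(n) worst case."""
    for j, value in enumerate(row):
        if value == x:
            return j
    return -1

def find2d(x, A):
    """Try array_find on each of the n rows until x is found:
    n rows * O(n) per row = O(n^2) worst case in n, i.e. O(N) for N = n^2."""
    for i, row in enumerate(A):
        j = array_find(x, row)
        if j != -1:
            return (i, j)
    return None

A = [[1, 2, 3],
     [4, 5, 6],
     [7, 8, 9]]
print(find2d(8, A))   # -> (2, 1)
print(find2d(42, A))  # -> None
```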

More examples There are two prefixAverages algorithms.
Each computes an array B in which B[i] is the average of the elements of A up to the ith location, i.e. of A[0..i].

Algorithm 1
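The transcript does not reproduce the slide's pseudocode; a sketch of the quadratic version, recomputing each prefix sum from scratch, consistent with the result stated below:

```python
def prefix_averages1(A):
    """B[i] = average of A[0..i], recomputing the sum for every i.
    The inner loop does 1 + 2 + ... + n = n(n+1)/2 additions: O(n^2)."""
    n = len(A)
    B = [0.0] * n
    for i in range(n):
        s = 0.0
        for j in range(i + 1):  # re-sum A[0..i] from scratch
            s += A[j]
        B[i] = s / (i + 1)
    return B

print(prefix_averages1([2, 4, 6, 8]))  # -> [2.0, 3.0, 4.0, 5.0]
```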

Algorithm 2
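Again the pseudocode is missing from the transcript; a sketch of the linear version, which carries a running sum instead of re-summing each prefix:

```python
def prefix_averages2(A):
    """B[i] = average of A[0..i], carrying a running sum: one addition
    per element, a single O(n) pass."""
    n = len(A)
    B = [0.0] * n
    s = 0.0
    for i in range(n):
        s += A[i]           # running sum of A[0..i]
        B[i] = s / (i + 1)
    return B

print(prefix_averages2([2, 4, 6, 8]))  # -> [2.0, 3.0, 4.0, 5.0]
```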

Result The running time of algorithm prefixAverages1 is O(n) for the first and second steps; from the third step onwards it is O(n²). So the overall running time is O(n²).

Result The running time of algorithm prefixAverages2 is O(1) for the first and second steps; from the third step onwards it is O(n). So the overall running time is O(n).

Example Calculate the running time of each of Loop1 through Loop5 (Loop5 is O(n⁴)).
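The five loops themselves are not reproduced in this transcript. As a hypothetical illustration of how nesting depth drives the bound (e.g. why four fully nested loops over n give O(n⁴)):

```python
def count_ops(n, depth):
    """Iterations performed by `depth` fully nested loops, each running
    n times: n ** depth in total, i.e. O(n^depth)."""
    if depth == 0:
        return 1
    return sum(count_ops(n, depth - 1) for _ in range(n))

n = 5
print(count_ops(n, 1))  # -> 5    one loop: O(n)
print(count_ops(n, 2))  # -> 25   two nested loops: O(n^2)
print(count_ops(n, 4))  # -> 625  four nested loops: O(n^4)
```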

Example

Solution
