Week 12 - Friday. What did we talk about last time? Finished hunters and prey; class variables; constants; class constants; started Big Oh notation.


Week 12 - Friday

What did we talk about last time?
- Finished hunters and prey
- Class variables
- Constants
- Class constants
- Started Big Oh notation

Here is a table of several different complexity measures, in ascending order, with their functions evaluated at n = 100:

Description  | Big Oh     | f(100)
------------ | ---------- | --------------
Constant     | O(1)       | 1
Logarithmic  | O(log n)   | 6.64
Linear       | O(n)       | 100
Linearithmic | O(n log n) | 664
Quadratic    | O(n^2)     | 10,000
Cubic        | O(n^3)     | 1,000,000
Exponential  | O(2^n)     | 1.27 x 10^30
Factorial    | O(n!)      | 9.33 x 10^157
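As a sanity check, the f(100) column can be recomputed with a short Java program (a sketch; the class name and printed labels are just for illustration):

```java
import java.math.BigInteger;

public class GrowthAt100 {
    public static void main(String[] args) {
        int n = 100;
        double log2n = Math.log(n) / Math.log(2);      // log base 2 of 100
        System.out.println("log n   = " + log2n);      // about 6.64
        System.out.println("n log n = " + n * log2n);  // about 664
        System.out.println("n^2     = " + (long) Math.pow(n, 2)); // 10000
        System.out.println("n^3     = " + (long) Math.pow(n, 3)); // 1000000
        System.out.println("2^n     = " + Math.pow(2, n));        // about 1.27E30
        // 100! overflows even a long, so use BigInteger
        BigInteger factorial = BigInteger.ONE;
        for (int i = 2; i <= n; i++)
            factorial = factorial.multiply(BigInteger.valueOf(i));
        // 100! is about 9.33 x 10^157, i.e., 158 digits
        System.out.println("n! has " + factorial.toString().length() + " digits");
    }
}
```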

Computers get faster, but not in unlimited ways. If computers get 10 times faster, here is how much a problem from each class could grow and still be solvable:

Description  | Big Oh     | Increase in size
------------ | ---------- | ----------------
Constant     | O(1)       | Unlimited
Logarithmic  | O(log n)   | 1000
Linear       | O(n)       | 10
Linearithmic | O(n log n) | 10
Quadratic    | O(n^2)     | 3-4
Cubic        | O(n^3)     | 2-3
Exponential  | O(2^n)     | Hardly changes
Factorial    | O(n!)      | Hardly changes

- There is nothing better than constant time
- Logarithmic time means that the problem can become much larger and only take a little longer
- Linear time means that time grows with the problem
- Linearithmic time is just a little worse than linear
- Quadratic time means that expanding the problem size significantly could make it impractical
- Cubic time is about the reasonable maximum if we expect the problem to grow
- Exponential and factorial time mean that we cannot solve anything but the most trivial problem instances

- Memory usage can be a problem
- If you run out of memory, your program can crash
- Memory usage can have serious performance consequences too

- Remember, there are multiple levels of memory on a computer
- Each next level is on the order of 500 times larger and 500 times slower
  - Cache: actually on the CPU; fast and expensive
  - RAM: primary memory for a desktop computer; pretty fast and relatively expensive
  - Hard drive: secondary memory for a desktop computer; slow and cheap

- If you can do a lot of number crunching without leaving cache, that will be very fast
- If you have to fetch data from RAM, that will slow things down
- If you have to read and write data to the hard drive (unavoidable with large pieces of data like digital movies), you will slow things down a lot

- Memory can be easier to estimate than running time
- Depending on your input, you will allocate a certain number of objects, arrays, and primitive data types
- It is possible to count the storage for each item allocated
- Remember that a reference to an object or an array costs an additional 4 bytes on top of the size of the object

Here are the sizes of various types in Java. Note that N refers to the number of elements in the array or String:

Type    | Bytes
------- | -----
boolean | 1
char    | 2
int     | 4
double  | 8

Type      | Bytes
--------- | ------
boolean[] | 16 + N
char[]    | 16 + 2N
int[]     | 16 + 4N
double[]  | 16 + 8N

Type             | Bytes
---------------- | ----------------------------
reference        | 4
String           | 40 + 2N
object           | 8 + size of members
array of objects | 16 + (4 + size of members)N
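Under this accounting model, estimating an array's footprint is simple arithmetic. Here is a small sketch; the helper name and the constants are taken from the tables above, not from any Java API:

```java
public class MemoryEstimate {
    // Per the table: every array carries 16 bytes of overhead
    static final int ARRAY_OVERHEAD = 16;

    // Estimated bytes for a primitive array of n elements,
    // given the per-element size from the primitive-type table
    static long arrayBytes(int n, int bytesPerElement) {
        return ARRAY_OVERHEAD + (long) n * bytesPerElement;
    }

    public static void main(String[] args) {
        System.out.println(arrayBytes(1000, 4)); // int[1000]:    4016 bytes
        System.out.println(arrayBytes(1000, 8)); // double[1000]: 8016 bytes
    }
}
```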

- Let's say that I give you a list of numbers, and I ask you, "Is 37 on this list?"
- As a human, you have no problem answering this question, as long as the list is reasonably short
- What if the list is an array, and I want you to write a Java program to find some number?

Easy! We just look through every element in the array until we find it or run out. If we find it, we return the index; otherwise we return -1.

    public static int find(int[] array, int number) {
        for (int i = 0; i < array.length; i++)
            if (array[i] == number)
                return i;
        return -1;
    }

- Unfortunately for you, we know about Big Oh notation
- Now we have some way to measure how long this algorithm takes
- How long, if n is the length of the array?
- O(n) time, because we have to look through every element in the array in the worst case

- Is there any way to go smaller than O(n)?
- What complexity classes even exist that are smaller than O(n)?
  - O(1)
  - O(log n)
- Well, on average, we only need to check half the numbers, but that's ½n, which is still O(n)
- Darn…

- We can do better with more information
- For example, if the list is sorted, then we can use that information somehow
- How?
- We can play a High-Low game

- Repeatedly divide the search space in half
- Say we're looking for 37: check the middle element; if it's too high, keep only the lower half; if it's too low, keep only the upper half; repeat until checking the middle finds 37
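This high-low strategy on a sorted array is binary search. Here is a sketch in the same style as the find method above (binaryFind and the demo class are our names, not standard ones):

```java
public class BinarySearchDemo {
    // Returns the index of number in a sorted array, or -1 if absent
    public static int binaryFind(int[] array, int number) {
        int left = 0;
        int right = array.length - 1;
        while (left <= right) {
            int middle = left + (right - left) / 2;  // check the middle
            if (array[middle] == number)
                return middle;                       // found it!
            else if (array[middle] < number)
                left = middle + 1;                   // too low: keep the upper half
            else
                right = middle - 1;                  // too high: keep the lower half
        }
        return -1;  // narrowed down to nothing
    }

    public static void main(String[] args) {
        int[] sorted = {2, 5, 11, 23, 37, 41, 59, 83};
        System.out.println(binaryFind(sorted, 37)); // prints 4
        System.out.println(binaryFind(sorted, 7));  // prints -1
    }
}
```

Each pass through the loop halves the remaining range, so at most about log n elements are examined.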

- How long can it take?
- What if you never find what you're looking for?
- Well, then, you've narrowed it down to a single spot in the array that doesn't have what you want
- And what's the maximum amount of time that could have taken?

- We can apply this idea to a guessing game
- First we tell the computer that we are going to pick a number between 1 and n
- We pick, and it tries to narrow down the number
- It should only take log n tries
- Remember, log2(1,000,000) is only about 20
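A quick way to convince yourself of the 20-guess figure: count how many halvings it takes to shrink the range 1..1,000,000 to a single candidate (a sketch; guessesNeeded is a made-up helper):

```java
public class GuessCount {
    // Worst-case guesses to pin down a number in 1..n by halving:
    // each guess discards about half of the remaining candidates
    static int guessesNeeded(int n) {
        int guesses = 0;
        while (n > 1) {
            n = (n + 1) / 2;  // round up: the worst case keeps the bigger half
            guesses++;
        }
        return guesses;
    }

    public static void main(String[] args) {
        System.out.println(guessesNeeded(1_000_000)); // prints 20
        System.out.println(guessesNeeded(1024));      // prints 10
    }
}
```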

- This is a classic interview question asked by Microsoft, Amazon, and similar companies
- Imagine that you have 9 red balls
- One of them is just slightly heavier than the others, but so slightly that you can't feel it
- You have a very accurate two-pan balance you can use to compare balls
- Find the heaviest ball in the smallest number of weighings

- It's got to be 8 or fewer
- We could easily test one ball against every other ball
- There must be some cleverer way to divide them up
- Something that is related somehow to binary search

- We can divide the balls in half each time
- If those all balance, it must be the one we left out to begin with

- How?
- The key is that you can actually cut the number of balls into three parts each time
- We weigh 3 against 3; if they balance, then we know the 3 left out have the heavy ball
- When it's down to 3, weigh 1 against 1, again knowing that it's the one left out that's heavy if they balance

- The cool thing is…
- Yes, this is "cool" in the CS sense, not in the real sense
- Anyway, the cool thing is that we are trisecting the search space each time
- This means that it takes log3(n) weighings to find the heaviest ball
- We could do 27 balls in 3 weighings, 81 balls in 4 weighings, etc.
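The trisection strategy can be simulated directly. In this sketch, the two-pan balance is modeled by summing group weights; findHeavy and the weighing counter are illustrative names, and the number of balls is assumed to be a power of 3:

```java
public class HeavyBall {
    static int weighings = 0;  // how many times we used the balance

    // Total weight of w[from..from+len-1] (one pan of the balance)
    static int sum(int[] w, int from, int len) {
        int total = 0;
        for (int i = from; i < from + len; i++)
            total += w[i];
        return total;
    }

    // Find the index of the heavy ball among w[lo..lo+size-1] by
    // weighing the first third against the second third each round
    static int findHeavy(int[] w, int lo, int size) {
        if (size == 1)
            return lo;
        int third = size / 3;  // assumes size is a power of 3
        weighings++;
        int leftPan = sum(w, lo, third);
        int rightPan = sum(w, lo + third, third);
        if (leftPan > rightPan)
            return findHeavy(w, lo, third);          // heavy ball is in the first group
        if (rightPan > leftPan)
            return findHeavy(w, lo + third, third);  // heavy ball is in the second group
        return findHeavy(w, lo + 2 * third, third);  // balanced: it's in the left-out group
    }

    public static void main(String[] args) {
        int[] balls = {10, 10, 10, 10, 10, 10, 10, 11, 10}; // ball 7 is heavy
        System.out.println(findHeavy(balls, 0, balls.length)); // prints 7
        System.out.println(weighings);                         // prints 2
    }
}
```

With 9 balls this uses exactly 2 weighings, matching log3(9) = 2.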

- Sorting

- Finish Project 4
- Due tonight before midnight!