Chapter VI What should I know about the sizes and speeds of computers?

How important is the material in this chapter to understanding how a computer works? Maybe 5 on a scale of 1-10.

How important is this material to understanding how to use a computer? On a scale of 1-10, maybe 4.

True/False Quiz: The capacity of computer chips has doubled about every 18 months since 1965. The size of disk memory has grown at a rate of about 50% every year or so. The capacities of CPUs within a computer have a long record of doubling about every 18 months. All of these statements are TRUE.

True/False Quiz: With the capabilities of computers increasing as indicated by the previous questions, and prices continuing to decrease, it should soon be true that computers can solve almost any reasonable problem. This statement is FALSE. Examining this seeming contradiction is the subject of this chapter.

How Fast Does Technology Really Evolve? Moore’s Law: a new generation of processors is developed about every three years, and the number of transistors in such a processor typically quadruples. Moore’s Law is an observation made in 1965 by Gordon Moore, co-founder of Intel, but it has held true for the past 40 years.
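As a rough arithmetic sketch (not from the chapter), the short Python snippet below works out what doubling every 18 months compounds to over 40 years:

    # Doubling every 18 months means 2 ** (years / 1.5) overall growth.
    years = 40
    doublings = years / 1.5
    print(f"{doublings:.1f} doublings in {years} years, "
          f"a factor of roughly {2 ** doublings:,.0f}")
    # About 26.7 doublings, i.e. a factor of roughly 100 million.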

Paul’s fable Discuss

How long does it take to search through a list of items to find a particular one? It depends on how fast the computer is. It depends on how many items are on the list. It depends on what method is used to search the list.

Searching (continued) Let’s ignore the speed of the computer because if you get a computer that’s twice as fast as your old computer then it should be able to solve the problem twice as fast. Clearly the more items that are on the list, the longer it will take to search it. But … how much longer?

Searching (continued) One way to find an item in a list is to start at the beginning and look through the list until you find what you’re looking for OR you run out of items to look through and you conclude that what you wanted is not there. This method is called a linear search. If you double the number of items to search then you double the time (on average) that it takes to search.
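A minimal Python sketch of a linear search, matching the description above (an illustration, not code from the text):

    def linear_search(items, target):
        """Scan from the front until the target turns up or the list runs out."""
        for index, item in enumerate(items):
            if item == target:
                return index        # found it at this position
        return -1                   # ran out of items: target is not in the list

    # Doubling the length of the list roughly doubles the average search time.
    print(linear_search([4, 8, 15, 16, 23, 42], 16))    # prints 3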

Searching (continued) But if the list of items is in order, you can improve the search using a method called the binary search. Look at the middle of the list. If the item is after this item, then you can throw away the first half and only search the second half. If the item is before this item, then you can throw away the last half.

Searching (continued) If the item is the one you’re looking for, then you have found it. To summarize, each time you look at an item you throw away half of the remaining list, so doubling the size of the list to be searched adds only one more comparison. This type of search is called logarithmic.
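A corresponding Python sketch of binary search on a sorted list (again an illustration, not code from the text):

    def binary_search(sorted_items, target):
        """Repeatedly halve the portion of a sorted list that could hold the target."""
        low, high = 0, len(sorted_items) - 1
        while low <= high:
            middle = (low + high) // 2
            if sorted_items[middle] == target:
                return middle               # found it
            elif sorted_items[middle] < target:
                low = middle + 1            # throw away the first half
            else:
                high = middle - 1           # throw away the last half
        return -1                           # not in the list

    # Doubling the length of the list adds only one more comparison.
    print(binary_search([4, 8, 15, 16, 23, 42], 23))    # prints 4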

How Does Paul’s Fable Relate To Computers? An algorithm is a finite set of unambiguous steps that solves a problem. Some algorithms are very complex. Both of the searching algorithms were simple: one of complexity n (linear) and one of complexity log n (logarithmic).

Traveling salesman problem A salesman has to call on some clients in different cities and would like to arrange the order in which he calls on his clients so as to optimize his travel (for example, he might want to make the mileage as small as possible, the air fare as cheap as possible, or the travel time as short as possible).

Traveling Salesman (cont) One algorithm that can be used to solve the problem is exhaustive search: list all the possible solutions and then compute which one is best. If there are 3 clients then there are only 6 possible routes: HABCH, HACBH, HBACH, HBCAH, HCABH, HCBAH, where H represents home and A, B, and C represent each of the clients.

Traveling Salesman (cont) For 3 clients, the number of routes is 3! (3*2*1). For 5 clients, the number is 5! (5*4*3*2*1). For 10 clients, the number is 10!. So one way to solve the problem is to have the computer list all the possibilities, compute the distance for each possibility, and then pick the smallest one.
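A small sketch of this exhaustive search in Python; the distance table below is made up purely for illustration (the chapter does not give actual distances):

    from itertools import permutations

    # Hypothetical symmetric distances (e.g. miles) between Home and clients A, B, C.
    distance = {('H', 'A'): 10, ('H', 'B'): 20, ('H', 'C'): 15,
                ('A', 'B'): 12, ('A', 'C'): 25, ('B', 'C'): 18}

    def leg(a, b):
        return distance[(a, b)] if (a, b) in distance else distance[(b, a)]

    def tour_length(order):
        stops = ('H',) + order + ('H',)          # Home -> clients in order -> Home
        return sum(leg(stops[i], stops[i + 1]) for i in range(len(stops) - 1))

    # List all n! orderings and keep the shortest: fine for 3 clients,
    # hopeless for 20, because 20! routes is astronomically many.
    best = min(permutations(('A', 'B', 'C')), key=tour_length)
    print(best, tour_length(best))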

Traveling Salesman (cont) 3! = 6, 5! = 120, 10! = 3,628,800, 15! = 1,307,674,368,000, 20! = 2,432,902,008,176,640,000.

Traveling Salesman (cont) If the computer could compute one billion routes per second (which is faster than any current computer by a substantial amount) then: – 3 clients = much less than 1 second – 5 clients = much less than 1 second – 10 clients = much less than 1 second – 15 clients = about 21 minutes 48 seconds – 20 clients = approximately 77 years
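Those figures are easy to check with a quick sketch in Python (assuming the slide’s rate of one billion routes per second):

    import math

    ROUTES_PER_SECOND = 1_000_000_000      # the slide's assumed rate

    for clients in (3, 5, 10, 15, 20):
        seconds = math.factorial(clients) / ROUTES_PER_SECOND
        print(f"{clients:>2} clients: {math.factorial(clients):>22,} routes, "
              f"about {seconds:,.4f} seconds")
    # 15 clients -> about 1,307.7 seconds (21 min 48 s);
    # 20 clients -> about 2.4 billion seconds (roughly 77 years).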

Traveling Salesman (cont) The exhaustive search method for solving this problem doesn’t scale. It’s easy for a small amount of data but intractable for larger numbers. The complexity of this algorithm is n! (n factorial)

Chess Playing Computers typically play chess in the following way: – They use an opening book (a list of moves that have been used successfully in the past). These books typically have thousands of games listed. – When the opponent makes a move that is not in the book, the computer analyzes the game, basically evaluating every possible response to the opponent’s move. It then analyzes every response the opponent might make to its move, and every response it might make to that move. – Each time a player moves a piece, that move is referred to as a ply.

Chess Playing (continued) So if a chess-playing program is set to evaluate to a depth of 4 plies, then it will evaluate: – its move, – the opponent’s response, – its response to that response, – and the opponent’s response to that. Then whichever move is seen as best is the move the machine will make.

Chess Playing (continued) At each moment there are about 20 or so moves that could come into consideration. Thus for a depth of 4 plies the number of possible move sequences is about 20^4 = 160,000. This is an example of exponential growth in an algorithm.
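A quick sketch of that growth (assuming roughly 20 candidate moves per ply, as the slide does):

    BRANCHING_FACTOR = 20      # roughly 20 candidate moves at each ply

    for plies in (2, 4, 6, 8):
        print(f"{plies} plies: {BRANCHING_FACTOR ** plies:,} move sequences")
    # 4 plies -> 160,000 sequences; 8 plies -> 25,600,000,000: exponential growth.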

When Do Solutions Scale Up? Problems scale up when their level of complexity is polynomial, such as linear (n), quadratic (n^2), or a higher power. Problems don’t scale up when their complexity is exponential (2^n), factorial (n!), or “worse”.
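The contrast shows up clearly when the growth rates are printed side by side; a small illustrative sketch:

    import math

    print(f"{'n':>3} {'n^2':>6} {'2^n':>12} {'n!':>22}")
    for n in (5, 10, 15, 20):
        print(f"{n:>3} {n**2:>6} {2**n:>12,} {math.factorial(n):>22,}")
    # The polynomial column grows gently; the exponential and factorial columns explode.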

Class P The class of all problems that can be solved by an algorithm whose complexity is a polynomial. Neither the Traveling Salesman problem nor the chess-playing problem is known to be in this class.

Class NP Suppose we tried to solve the problem using many computers simultaneously (having them communicate with each other to report the progress of their work). If a problem can be solved in polynomial time under this assumption, it is said to be in class NP. The Traveling Salesman Problem is in this class.

What Are Some Difficulties Logical Complexity Causes in Computers? Single-user computers do not need as much security, since only one person will be using them. If the user will only be performing one task at a time (single-tasking), the O/S can be much simpler, since scheduling will not be part of the problem.

When Does the Cost of More Outweigh the Benefits? More features in software Higher speed in hardware Higher graphics capabilities Greater Accuracy

More Features in Software Extra features add complexity to the software. – Consider the case of a Web address. Sometimes fixing errors may lead to other errors in the software.

Higher Speed in Hardware Speed must be co-ordinated – a faster processor may not help unless other components are also faster. Often speeding up the machine does not speed up the work being done, for example Web browsing or word processing.

Higher Graphics Capabilities Higher graphics capabilities in general increase the quality of images, BUT they also increase the space needed to store the images, the time needed to move the images, and the time needed to manipulate the images.

Greater Accuracy Increasing accuracy necessitates storing numbers in more complex forms. This requires, in general, more space, and therefore more time for the computations. Some computations need high accuracy; others do not. Some computations involve numbers that are measured, and those measurements may be limited in precision.
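A small Python illustration of the trade-off (a sketch, not from the chapter): ordinary floating-point numbers carry limited precision, and asking for more digits costs more space and time per operation.

    # Standard double-precision floats hold roughly 15-16 significant digits,
    # so some decimal values are only stored approximately.
    print(0.1 + 0.2)                  # 0.30000000000000004, not exactly 0.3

    # Greater accuracy is possible, but each number takes more space and
    # each arithmetic operation takes more time.
    from decimal import Decimal, getcontext
    getcontext().prec = 50            # ask for 50 significant digits
    print(Decimal(1) / Decimal(3))    # 0.33333... carried to 50 digits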

Summary Computers are increasing in speed, but due to the laws of physics there is a limit to how much further speeds can increase. Some solutions to problems are so complex that increases in computer speed will not be enough to realistically solve these problems. Increasing the capabilities of computers, in general, requires them to become more complex and thus more prone to errors.

Terminology Class P, Class NP, Combinatorial Explosion, Computational Complexity, Exhaustive Listing, Exhaustive Search, Linear Search, Logical Complexity, Moore’s Law, Ply