Algorithms + L. Grewe.

Presentation transcript:

Algorithms + L. Grewe

Algorithms and Programs Algorithm: a method or a process followed to solve a problem. A recipe. An algorithm takes the input to a problem (function) and transforms it to the output. A mapping of input to output. A problem can have many algorithms.

Algorithm Properties An algorithm should (ideally) possess the following properties: It must be correct. It must be composed of a series of concrete steps. There can be no ambiguity as to which step will be performed next. It must be composed of a finite number of steps. It must terminate. A computer program is an instance, or concrete representation, of an algorithm in some programming language. "Correct" means it computes the proper function. "Concrete steps" are steps executable by the machine in question. We frequently use "algorithm" and "program" interchangeably, though they are actually different concepts.

Algorithm Efficiency There are often many approaches (algorithms) to solve a problem. How do we choose between them? At the heart of computer program design are two (sometimes conflicting) goals: To design an algorithm that is easy to understand, code, and debug. To design an algorithm that makes efficient use of the computer's resources.

Algorithms Any problem can have a large number of algorithms that could be used to solve it. …but there are some algorithms that are effective in solving many problems.

Common Algorithms / Algorithm Methods Greedy Algorithms Divide and Conquer Dynamic Programming Backtracking Branch and Bound

Other (common) Algorithms / Algorithm Methods Linear Programming Integer Programming Neural Networks Genetic Algorithms Simulated Annealing Typically these are covered in application specific courses that use them (e.g. Artificial Intelligence)

Algorithms specific to a data structure Also algorithms can be specifically designed for common operations on a particular data structure. Example - Graph Algorithms Graph matching Find shortest path(s)
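As a small illustration of a graph algorithm for the "find shortest path" operation mentioned above, here is a minimal Python sketch (not from the slides) that finds the fewest-edge path in an unweighted graph using breadth-first search; the graph, node names, and function name are hypothetical.

```python
from collections import deque

def bfs_shortest_path(graph, start, goal):
    """Shortest path (fewest edges) in an unweighted graph via BFS."""
    queue = deque([[start]])   # each queue entry is a whole path
    visited = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            return path
        for neighbor in graph.get(node, []):
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(path + [neighbor])
    return None  # no path exists

# hypothetical adjacency-list graph
graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": ["E"]}
print(bfs_shortest_path(graph, "A", "E"))  # ['A', 'B', 'D', 'E']
```

For weighted graphs the same operation would instead use an algorithm such as Dijkstra's.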

Defining the Problem Before even trying to design or reuse an existing algorithm, you must define your problem. …many problems can be defined as an optimization problem.

Optimization Problem Problem = Function to X + Constraints. This is where you describe the problem as a formula/function. For example, "find the shortest path" can be stated as "Sum(distances) is minimum." Here the X is "minimize" and the Function is "Sum(distances)." These functions are sometimes called "COST functions." Constraints restrict which solutions are valid; in our shortest path problem one constraint might be to never visit the same node in the path twice.

One Solution to any Problem…The Brute Force Algorithm Try every solution! This takes exponential time, because there are exponentially many candidate solutions. This is WHY we discuss algorithms!
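To make the brute-force idea concrete, here is a minimal Python sketch (not from the slides) for the shortest-route cost function above: it enumerates every ordering of a set of cities and keeps the one with minimum total distance. The distance table and names are hypothetical.

```python
from itertools import permutations

def brute_force_shortest_route(dist, cities):
    """Try every ordering of cities; keep the one with minimum total distance."""
    best_route, best_cost = None, float("inf")
    for route in permutations(cities):          # n! candidate solutions
        cost = sum(dist[route[i]][route[i + 1]] for i in range(len(route) - 1))
        if cost < best_cost:
            best_route, best_cost = route, cost
    return best_route, best_cost

# hypothetical symmetric distance table
dist = {
    "A": {"A": 0, "B": 1, "C": 4},
    "B": {"A": 1, "B": 0, "C": 2},
    "C": {"A": 4, "B": 2, "C": 0},
}
print(brute_force_shortest_route(dist, ["A", "B", "C"]))  # (('A', 'B', 'C'), 3)
```

With n cities there are n! orderings, which is why brute force quickly becomes infeasible and better design methods are needed.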

Optimization Problem Elements Instances: The possible inputs to the problem. Solutions for Instance: Each instance has an exponentially large set of solutions. Cost of Solution: Each solution has an easy-to-compute cost or value. Specification <preCond>: The input is one instance. <postCond>: A valid solution with optimal cost (minimum or maximum).

Greedy Algorithms Class of algorithms that solve Optimization Problems in a "greedy way." Some greedy algorithms will generate an optimal solution, others only a "good (enough)" solution. Greedy Method = at each point in the algorithm a decision is made that is best at that point. Decisions made are not changed at a later point. Greedy Criterion = criterion used to make the greedy decision at each point.
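The greedy method above can be sketched with the classic coin-change example; this minimal Python sketch (not from the slides) uses "take the largest coin that still fits" as the greedy criterion, and also shows a hypothetical denomination set where greedy gives only a "good enough" answer.

```python
def greedy_change(amount, denominations):
    """Greedy criterion: at each step take the largest coin that still fits.
    Each decision is final and is never revised later."""
    coins = []
    for d in sorted(denominations, reverse=True):
        while amount >= d:
            amount -= d
            coins.append(d)
    return coins

print(greedy_change(63, [25, 10, 5, 1]))  # [25, 25, 10, 1, 1, 1]
# Greedy is optimal for US-style coins, but not for arbitrary denominations:
# for coins [4, 3, 1] and amount 6, greedy picks [4, 1, 1] (3 coins)
# while the optimal answer is [3, 3] (2 coins).
print(greedy_change(6, [4, 3, 1]))        # [4, 1, 1]
```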

Divide and Conquer Problem = Set of several independent (smaller) sub-problems. Divide the problem into several independent sub-problems and solve each sub-problem. Combine the solutions to derive the final solution. Can work well on parallel computers. Often the sub-problems are instances of the same problem, so you only need to develop one algorithm to solve them, plus an algorithm to combine the results.

Divide and Conquer Example: You are given a bag with 16 coins and told one may be counterfeit and lighter than the others. Problem = determine if the bag contains the counterfeit coin. You have a machine that compares the weight of two sets of coins and tells you which is lighter, or if they are the same. Starting Thoughts: You could start by comparing coin 1 and 2. If one is lighter you are done. You can then compare coin 3 and 4, and so on. If you have N coins this can take N/2 comparisons. The Divide and Conquer Way: If you have N coins, divide them into two groups of N/2. Weigh them. If one group is lighter, we are done and the bag does contain a counterfeit coin. If they are the same, there is no counterfeit coin. This takes ONLY 1 operation. What happens if you want to find the counterfeit coin?
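Answering the closing question: to locate the counterfeit coin, keep dividing the lighter half until one coin remains, which takes log2(N) weighings. A minimal Python sketch (not from the slides) of this divide-and-conquer search, with hypothetical coin weights:

```python
def find_light_coin(weights, lo, hi):
    """Recursively locate the index of the lighter (counterfeit) coin.
    Assumes exactly one coin in weights[lo:hi] is lighter and the
    group size (hi - lo) is a power of 2."""
    if hi - lo == 1:
        return lo                         # one coin left: the counterfeit
    mid = (lo + hi) // 2
    left, right = sum(weights[lo:mid]), sum(weights[mid:hi])
    if left < right:                      # one weighing: compare the halves
        return find_light_coin(weights, lo, mid)
    return find_light_coin(weights, mid, hi)

coins = [10] * 16
coins[11] = 9                             # coin 11 is the lighter counterfeit
print(find_light_coin(coins, 0, 16))      # 11
```

For 16 coins this uses only 4 weighings instead of up to 8 pairwise comparisons.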

Dynamic Programming Dynamic programming is a method of solving complex problems by breaking them down into simpler steps. Bottom-up dynamic programming simply means storing the results of certain calculations, which are then re-used later because the same calculation is a sub-problem in a larger calculation. Top-down dynamic programming involves formulating a complex calculation as a recursive series of simpler calculations.
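Both variants described above can be sketched with the Fibonacci numbers; this minimal Python example (not from the slides) shows top-down (recursive with memoization) and bottom-up (table-filling) side by side.

```python
from functools import lru_cache

# Top-down: a recursive formulation, with memoization so each
# sub-problem is computed only once.
@lru_cache(maxsize=None)
def fib_top_down(n):
    if n < 2:
        return n
    return fib_top_down(n - 1) + fib_top_down(n - 2)

# Bottom-up: store results of the smallest sub-problems first and
# re-use them to build up larger ones.
def fib_bottom_up(n):
    table = [0, 1]
    for i in range(2, n + 1):
        table.append(table[i - 1] + table[i - 2])
    return table[n]

print(fib_top_down(40), fib_bottom_up(40))  # 102334155 102334155
```

Without the stored sub-problem results, the plain recursion would take exponential time; with them, both versions run in linear time.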

More on Dynamic Programming Some programming languages can automatically memoize the result of a function call with a particular set of arguments. Some languages make it possible portably (e.g. Scheme, Common Lisp, or Perl), some need special extensions (e.g. C++, see [2]), and some have automatic memoization built in. In any case, this is only possible for a referentially transparent function.

Backtracking Backtracking is a general algorithm for finding all (or some) solutions to a computational problem. It incrementally builds candidates to the solutions and abandons each partial candidate c ("backtracks") as soon as it determines that c cannot possibly be completed to a valid solution. It applies to problems which admit the concept of a "partial candidate solution" and a relatively quick test of whether the partial candidate can possibly be completed to a valid solution.
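The standard illustration of backtracking is the N-queens puzzle; here is a minimal Python sketch (not from the slides) where the partial candidate is the list of columns chosen for the first few rows, and the quick test checks the newest queen against the earlier ones.

```python
def solve_n_queens(n):
    """Find all ways to place n queens on an n x n board, one per row."""
    solutions = []

    def safe(cols, col):
        # Quick test: does a queen in the next row at `col` conflict
        # with any queen already placed (same column or diagonal)?
        row = len(cols)
        return all(c != col and abs(c - col) != row - r
                   for r, c in enumerate(cols))

    def place(cols):
        if len(cols) == n:
            solutions.append(tuple(cols))   # complete valid candidate
            return
        for col in range(n):
            if safe(cols, col):
                cols.append(col)            # extend the partial candidate
                place(cols)
                cols.pop()                  # backtrack: undo, try next column
            # unsafe columns are abandoned immediately, pruning the subtree

    place([])
    return solutions

print(len(solve_n_queens(6)))  # 4
```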

Branch and Bound Divides a problem to be solved into a number of sub-problems, similar to the backtracking strategy. Systematically enumerates all candidate solutions, but large subsets of fruitless candidates are discarded en masse by using estimated upper and lower bounds on the quantity being optimized. The efficiency of the method depends strongly on the node-splitting procedure and on the upper and lower bound estimators. All other things being equal, it is best to choose a splitting method that produces non-overlapping subsets.
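Branch and bound can be sketched on the 0/1 knapsack problem; in this minimal Python example (not from the slides) each node splits on "take item i" vs. "skip item i" (non-overlapping subsets), and an optimistic fractional-fill bound discards subtrees that cannot beat the best value found so far. Item values and weights are hypothetical.

```python
def knapsack_branch_and_bound(items, capacity):
    """0/1 knapsack via branch and bound.
    items: list of (value, weight) pairs."""
    items = sorted(items, key=lambda it: it[0] / it[1], reverse=True)
    best = 0

    def bound(i, value, room):
        # Optimistic upper bound: fill the remaining room fractionally
        # with the best value-per-weight items still available.
        for v, w in items[i:]:
            if w <= room:
                value, room = value + v, room - w
            else:
                return value + v * room / w
        return value

    def branch(i, value, room):
        nonlocal best
        best = max(best, value)
        if i == len(items) or bound(i, value, room) <= best:
            return                               # prune: can't beat incumbent
        v, w = items[i]
        if w <= room:
            branch(i + 1, value + v, room - w)   # subtree: take item i
        branch(i + 1, value, room)               # subtree: skip item i

    branch(0, 0, capacity)
    return best

items = [(60, 10), (100, 20), (120, 30)]  # (value, weight)
print(knapsack_branch_and_bound(items, 50))  # 220
```

The same search without the bound test is plain backtracking over all 2^n take/skip choices; the bound is what lets whole subtrees be discarded en masse.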