Algorithms
- Al-Khwarizmi, Arab mathematician, 9th century; the word "algorithm" comes from the Latinized form of his name
- Wrote a book, al-Kitab…, from which the word "algebra" comes
- Oldest algorithm: the Euclidean algorithm, for finding the greatest common divisor of two numbers

Ten Big Ideas in Algorithms
1. Orders of Magnitude and Big Oh Notation
2. Complexity: Polynomial versus Exponential
3. Iteration versus Recursion
4. Recurrence Relations
5. Divide-and-Conquer
6. Graph Traversal: Depth-First, Breadth-First
7. Greedy Algorithm
8. Dynamic Programming
9. Lower Bounds
10. NP-Completeness

1. Orders of Magnitude
- How do we compare algorithm A with algorithm B in an implementation-independent manner?
- Asymptotic analysis: compare them as n goes to infinity (becomes very large).
  [An aside: massive data sets; the Internet and search engines; bioinformatics]
- Let a(n) = running time of algorithm A and b(n) = running time of algorithm B.
- A is better than B if lim a(n)/b(n) = 0 as n goes to infinity (b "grows faster" than a).
- Examples in Mathematica
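In place of the Mathematica examples, a minimal Python sketch of the same ratio test; the two running-time functions below are made up purely for illustration.

```python
import math

def a(n):          # hypothetical running time of algorithm A
    return 50 * n * math.log2(n)

def b(n):          # hypothetical running time of algorithm B
    return n ** 2

# If a(n)/b(n) -> 0 as n grows, A is asymptotically better than B.
for n in [10, 100, 10_000, 1_000_000]:
    print(n, a(n) / b(n))   # the ratio shrinks toward 0 for large n, so A wins
```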

Big Oh Notation
- Running times involve too many functions, and not all terms are relevant; Big Oh keeps the big picture.
- Example: a(n) = 3n^2 and b(n) = 5n^2 + n grow roughly at the same rate.
- lim a(n)/b(n) = constant, so a(n) = O(n^2) and b(n) = O(n^2).

2. Complexity: Polynomial versus Exponential
- Polynomial: order of growth O(n), O(n^2), O(n^3), etc. is tractable.
- Exponential: order of growth O(2^n), O(n^n) or larger is intractable and leads to impractical algorithms.
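A tiny numeric illustration of the gap (illustrative values only):

```python
# Polynomial growth stays manageable; exponential growth quickly becomes astronomical.
for n in [10, 20, 50, 100]:
    print(n, n ** 3, 2 ** n)
```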

3. Iteration versus Recursion
- Find Min of n elements: iterate over the array, doing a comparison at each step. Time = n
- Sort n elements by Selection Sort: iterate over the array; at each iteration, Find Min and put it in the right position. Time = n + (n-1) + (n-2) + ...
- Find the closed form for the summation: Time = n(n+1)/2 = O(n^2)
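A short Python sketch of the two routines just described (illustrative code, not from the slides):

```python
def find_min(a):
    """Scan the array once; about n comparisons."""
    best = a[0]
    for x in a[1:]:
        if x < best:
            best = x
    return best

def selection_sort(a):
    """On pass i, find the minimum of a[i:] and swap it into position i.
    Total comparisons: (n-1) + (n-2) + ... + 1 = O(n^2)."""
    n = len(a)
    for i in range(n - 1):
        m = i
        for j in range(i + 1, n):
            if a[j] < a[m]:
                m = j
        a[i], a[m] = a[m], a[i]
    return a
```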

Recursion
- Binary Search: split the list in two and search in the half where the element must lie.
- Merge Sort: divide the array into two equal parts; sort each part recursively; merge them.
- Quick Sort: use the first element as a pivot to partition the array into two parts, the first holding the smaller elements and the second the larger ones; sort each part recursively.
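A sketch of the Binary Search recursion, assuming a sorted Python list; Merge Sort is sketched under Divide and Conquer below.

```python
def binary_search(a, target, lo=0, hi=None):
    """Return an index of target in the sorted list a, or -1 if absent."""
    if hi is None:
        hi = len(a) - 1
    if lo > hi:                      # empty range: not found
        return -1
    mid = (lo + hi) // 2
    if a[mid] == target:
        return mid
    elif a[mid] < target:            # target can only be in the right half
        return binary_search(a, target, mid + 1, hi)
    else:                            # target can only be in the left half
        return binary_search(a, target, lo, mid - 1)
```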

4. Recurrence Relations
- Time for Merge Sort: F(n) = 2 F(n/2) + n
- Solving the recurrence relation gives F(n) = n log n
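A quick numerical check of this recurrence (a sketch; with the assumed base case F(1) = 0, the closed form F(n) = n log2 n is exact for powers of two):

```python
import math

def F(n):
    """Merge-sort-style recurrence: F(n) = 2*F(n/2) + n, with F(1) = 0."""
    if n == 1:
        return 0
    return 2 * F(n // 2) + n

for n in [2, 8, 64, 1024]:
    print(n, F(n), n * math.log2(n))   # the two columns agree for powers of two
```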

5. Divide and Conquer
- Paradigm for algorithm design: split the instance into two or more parts, solve each recursively, combine the solutions.
- Binary Search, Merge Sort and Quick Sort illustrate it.
- Leads to improved algorithms for many problems:
  - Median finding
  - Matrix multiplication
  - Computational geometry algorithms (convex hulls, etc.)
- Leads to divide-and-conquer recurrence relations.
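A Merge Sort sketch showing the split / recurse / combine structure described above:

```python
def merge_sort(a):
    """Divide the list into two halves, sort each recursively, then merge."""
    if len(a) <= 1:                      # base case: already sorted
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])
    right = merge_sort(a[mid:])
    # Combine: merge two sorted lists in linear time.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]
```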

6. Graph Traversal: Depth-First and Breadth-First
- Techniques for designing very efficient (linear-time) algorithms.
- Apply to a variety of graph problems:
  - Connectivity
  - Finding cycles in graphs
  - Strongly connected components in directed graphs
  - Biconnected components in graphs
  - Planarity testing
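Sketches of both traversals on an adjacency-list graph (here a plain dict mapping each vertex to a list of neighbours; illustrative code, not from the slides):

```python
from collections import deque

def dfs(graph, start):
    """Depth-first traversal; visits each vertex and edge once: O(V + E)."""
    visited, stack, order = set(), [start], []
    while stack:
        v = stack.pop()
        if v not in visited:
            visited.add(v)
            order.append(v)
            stack.extend(graph[v])       # explore neighbours, deepest first
    return order

def bfs(graph, start):
    """Breadth-first traversal; explores vertices level by level."""
    visited, queue, order = {start}, deque([start]), []
    while queue:
        v = queue.popleft()
        order.append(v)
        for w in graph[v]:
            if w not in visited:
                visited.add(w)
                queue.append(w)
    return order
```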

7. Greedy Algorithm
- Proceed iteratively; at each step, choose the element that maximizes some "profit" or minimizes some "cost".
- Minimum spanning tree in a graph (minimum-cost connecting network): at each step, pick the least expensive edge that does not create a cycle.
- Shortest paths in a graph from a given vertex a: at each step, choose a new vertex that is closest to the already constructed set of vertices.
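A sketch of the shortest-path example (Dijkstra's algorithm) using a priority queue; the graph format, a dict of {vertex: [(neighbour, cost), ...]}, is an assumption made for illustration.

```python
import heapq

def dijkstra(graph, a):
    """Greedy shortest paths from vertex a: repeatedly settle the unvisited
    vertex closest to the already-constructed set."""
    dist = {a: 0}
    heap = [(0, a)]
    while heap:
        d, v = heapq.heappop(heap)
        if d > dist[v]:                  # stale queue entry; v already settled closer
            continue
        for w, cost in graph[v]:
            nd = d + cost
            if nd < dist.get(w, float("inf")):
                dist[w] = nd
                heapq.heappush(heap, (nd, w))
    return dist
```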

8. Dynamic Programming
- A scheme for combining the results for several smaller subproblems to obtain the result for the current one.
- Often leads to polynomial-time algorithms.
- All-pairs shortest paths: compute the shortest path between every pair of nodes in a graph.
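One standard dynamic-programming solution to the all-pairs problem is the Floyd-Warshall algorithm; a sketch assuming a square weight matrix with float('inf') where there is no edge:

```python
def floyd_warshall(w):
    """All-pairs shortest paths.  After round k, dist[i][j] is the shortest
    path from i to j that uses only intermediate vertices 0..k."""
    n = len(w)
    dist = [row[:] for row in w]         # start from the direct edge weights
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist
```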

9. Lower Bounds
- When do you stop improving an algorithm? How do you prove that what you found is the best possible algorithm?
- Lower bounds: techniques for proving that an algorithm is the best possible, under certain assumptions about the model of computation.
- Classical bounds: finding the minimum, sorting.
- Proof techniques: information-theoretic lower bounds, adversary arguments.
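As a concrete instance of the information-theoretic argument: a comparison-based sort must distinguish between n! possible orderings, so it needs at least log2(n!) comparisons, which is Ω(n log n). A quick numerical sketch:

```python
import math

for n in [10, 50, 100]:
    lower = math.log2(math.factorial(n))            # log2(n!) comparisons are unavoidable
    print(n, round(lower), round(n * math.log2(n)))  # same order of growth as n*log2(n)
```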

10. NP-Completeness
- Despite all efforts, hundreds of very practical problems have no known efficient (polynomial-time) algorithm (which does not mean that one cannot be found).
- Nobody has been able to prove that no polynomial-time algorithm exists for any of these problems.
- But one can show that if a polynomial-time algorithm exists for one of them, then one exists for all of them.
- A problem is NP-complete if it is as hard as any problem in this class (NP-hard), and if a solution to it can be verified in polynomial time (it is in NP).
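To make "verified in polynomial time" concrete, here is a hypothetical certificate checker for the Hamiltonian Cycle problem (listed on the next slide), assuming an adjacency-list graph:

```python
def verify_hamiltonian_cycle(graph, cycle):
    """Check a proposed certificate: 'cycle' must visit every vertex exactly
    once and return to the start, using only edges of the graph.  Runs in
    polynomial time, which is what places Hamiltonian Cycle in NP."""
    vertices = set(graph)
    if len(cycle) != len(vertices) or set(cycle) != vertices:
        return False
    for u, v in zip(cycle, cycle[1:] + cycle[:1]):   # consecutive pairs, wrapping around
        if v not in graph[u]:
            return False
    return True
```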

Some NP-complete Problems
- Traveling Salesman Problem
- Hamiltonian Cycle in a graph
- Maximum Clique in a graph
- Satisfiability for Boolean formulas and circuits
- The game of Push-Push

Overview of the Course
- We cover the ten big ideas, with lots of examples of algorithms.
- Start with some algebra and calculus; data structures and recursion are needed.
- Use the LEDA software for demos (all algorithms are implemented and visualized).
- Abstract, high-level class: you need to be comfortable with abstract thinking.
- A little bit of programming; mostly theory.