Engineering Principles in Software Engineering

Here are some key ideas you will learn about:
1. divide-and-conquer
2. recursion
3. greedy algorithms, tradeoffs
4. caching, dynamic programming
5. abstraction and reuse

1. Divide-and-conquer
a major strategy for solving problems: break them into smaller sub-problems
example: mergesort
– given a list of numbers in random order
– split the list into 2 halves
– sort each half independently
– merge the two sub-lists (interleave)
[diagram: a list to sort is divided into 2 sublists, each is sorted separately, then they are merged back together]
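
To make the divide-and-conquer idea concrete, here is a minimal mergesort sketch in Python (my own illustrative code, not the slides'); the recursive calls it makes are also the point of the next slide on recursion:

    def mergesort(A):
        # base case: a list of 0 or 1 elements is already sorted
        if len(A) <= 1:
            return A
        # divide: split the list into two halves
        mid = len(A) // 2
        left = mergesort(A[:mid])     # sort each half independently (recursively)
        right = mergesort(A[mid:])
        # merge: interleave the two sorted sub-lists
        merged = []
        i = j = 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i])
                i += 1
            else:
                merged.append(right[j])
                j += 1
        merged.extend(left[i:])
        merged.extend(right[j:])
        return merged

    # example: mergesort([5, 2, 9, 1, 7]) returns [1, 2, 5, 7, 9]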

2. Recursion
a form of divide-and-conquer: write functions that call themselves
example: factorial, n! = 1 x 2 x 3 x ... x n

    def fact(n):
        if n > 1:
            return n * fact(n-1)
        return 1   # base case: fact(1) = 1

call trace: fact(3) => fact(2) => fact(1); the returned results unwind as 1 <= 2*1=2 <= 3*2=6
example: mergesort
– when you divide the list into 2 halves, how do you sort each half? by calling mergesort, of course!

3. Greedy algorithms
most implementations involve making tradeoffs
– we know NP-complete problems are hard and probably cannot be solved in polynomial time
– use a heuristic/shortcut – might get a pretty good solution (but not optimal) in faster time
greedy methods do not guarantee an optimal solution
– however, in many cases, a near-optimal solution can be good enough
– it is important to know when a heuristic will NOT produce an optimal solution, and to know how sub-optimal it is (i.e. an "error bound")

Examples of greedy algorithms
– navigation, packet routing, shortest path in a graph, robot motion planning: choose the "closest" neighbor in the direction of the destination
– document comparison (e.g. diff): start by aligning the longest matching substrings
– knapsack packing: choose the item with the highest value/weight ratio first
– scheduling: schedule the longest job first, or the one with the most constraints
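
As an illustration of the knapsack example above, here is a minimal greedy sketch in Python (my own illustrative code and names, not from the slides):

    def greedy_knapsack(items, capacity):
        # items: list of (value, weight) pairs; capacity: max total weight allowed
        # greedy heuristic: consider items in decreasing order of value/weight ratio
        ordered = sorted(items, key=lambda vw: vw[0] / vw[1], reverse=True)
        total_value = 0
        chosen = []
        for value, weight in ordered:
            if weight <= capacity:       # take the item if it still fits
                chosen.append((value, weight))
                total_value += value
                capacity -= weight
        return total_value, chosen

    # example: greedy_knapsack([(60, 10), (100, 20), (120, 30)], 50)
    # picks (60,10) and (100,20) for a total value of 160, while the optimal
    # choice is (100,20)+(120,30) = 220 -- fast but only near-optimal, which is
    # exactly the tradeoff discussed on the previous slide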

4. Caching
One way to improve the efficiency of many programs is to use caching – saving intermediate results in memory that will get used multiple times
– why calculate the same thing multiple times?
– might require designing a special data structure (e.g. a hash table) to store/retrieve these results efficiently
– amortization: the cost of calculating something gets divided over all the times it is used

    from math import log

    def calcProb(A, n):
        s = 0
        for i in range(n):
            s += log(A[i])              # recomputes log(.) even for repeated values
        return s

    def calcProb2(A, n):
        s = 0
        C = [-1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1]   # cache: C[a] will hold log(a)
        for i in range(n):
            if C[A[i]] == -1:           # not computed yet
                C[A[i]] = log(A[i])
            s += C[A[i]]
        return s

    > calcProb([3,6,3,5,2,6,3,3,2,5], 10)    calls log(.) 10 times
    > calcProb2([3,6,3,5,2,6,3,3,2,5], 10)   calls log(.) only 4 times (once per distinct value a; the cache stores log(a) at index a)

Caching also applies to hardware design
– memory hierarchy
– for constants or global variables that get used frequently, put them in a register or L1 cache – analogous to a "staging area"
– variables used infrequently can stay in RAM
– very large datasets can be swapped out to disk

                         typical size    R/W access time
    CPU registers        10              1 cycle
    L1 cache (on chip)   1kb-1Mb         10 cycles
    main memory (RAM)    10Gb            100 cycles
    disk drives          100Gb-10Tb      10,000 cycles

An important example of caching is Dynamic Programming
– Suppose our goal is to compute the min. travel distance between A and E
– build up a table of smaller results for a subgraph

          A    B    C    D
     A    0
     B   18    0
     C   20   22    0
     D        47   25    0

    [figures: the subgraph on nodes A, B, C, D, and the same graph with node E added]

extend the table for larger results
– add a row/column for E
– E connects to the network at B and C
– compute the distance X→E based on X→B and X→C

          A    B    C    D    E
     A    0
     B   18    0
     C   20   22    0
     D        47   25    0
     E   37   19   26   51    0

    [figure: the graph on nodes A, B, C, D, E]

    d(D,E) = min[ d(D,B)+19, d(D,C)+26 ]
           = min(47+19, 25+26)
           = min(66, 51) = 51

    d(A,E) = min[ d(A,B)+19, d(A,C)+26 ]
           = min(18+19, 20+26)
           = min(37, 47) = 37
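
A minimal Python sketch of the table-building idea above (the function name and dict-of-dicts representation are my own, not from the slides): distances to the new node are computed from the cached sub-results rather than by re-exploring the graph.

    def extend_table(dist, new_node, edges):
        # dist: cached table, dist[x][y] = min distance between x and y
        # edges: direct connections of new_node, e.g. {'B': 19, 'C': 26}
        nodes = list(dist.keys())
        for x in nodes:
            # the best route from x to new_node passes through one of its direct
            # neighbors, so only already-cached distances d(x, neighbor) are needed
            dist[x][new_node] = min(dist[x][via] + w for via, w in edges.items())
        dist[new_node] = {x: dist[x][new_node] for x in nodes}
        dist[new_node][new_node] = 0
        return dist

    # example with the slide's numbers (only the table entries the extension needs are filled in):
    dist = {
        'A': {'A': 0,  'B': 18, 'C': 20},
        'B': {'A': 18, 'B': 0,  'C': 22},
        'C': {'A': 20, 'B': 22, 'C': 0},
        'D': {'B': 47, 'C': 25, 'D': 0},
    }
    extend_table(dist, 'E', {'B': 19, 'C': 26})
    # dist['A']['E'] -> 37 and dist['D']['E'] -> 51, matching the slide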

5. Abstraction and Reuse
Abstraction is the key to becoming a good programmer
– don't reinvent the wheel
– more importantly, reuse things that have already been debugged
This is the basis of Object-Oriented Programming
Many large software projects are built by plugging components together
– write a small amount of code that makes things work together, like making a browser out of: a) an HTML parser, b) a display engine, c) network URL query/retrieval functions, and d) plug-ins

Kinds of Abstraction
– making a function out of things you do repeatedly
  parameterizing it so it can be applied to a wider range of inputs
– object-oriented classes
  encapsulation – define the internal representation of data
  interface – define methods, services
  good design – make the external operations independent of the internal representation (helps decouple code)
  example: a Complex number is a thing that can be added/subtracted, multiplied (by another Complex or a scalar), conjugated, viewed as a+bi or re^(iθ)
– templates in C++: if you can sort a list of integers, why not generalize it to sort lists with any data type that can be pairwise-compared (total order)?
– API design – Application-Programmer Interface
  a coherent, complete, logical system of functions and data formats
  example: OCR (optical character recognition)
  you don't want to have to implement feature-based character recognition that is font- and scale-independent yourself (probably)
  define input: scanned TIFF images? output: ASCII strings?
  interface: String* OCRscan(TiffImage* input_image)
  are you going to indicate the coordinates where each word was found on the page? is the user able to load different character sets (alphabets)?
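
To illustrate the encapsulation point with the Complex-number example, here is a small Python sketch (my own illustrative class, not code from the slides): callers use the add/multiply/conjugate interface and the two views without depending on how the number is stored internally.

    import math

    class Complex:
        def __init__(self, re, im):
            # internal representation: rectangular form a+bi
            # (could be switched to polar form without changing the interface below)
            self._re = re
            self._im = im

        def add(self, other):
            return Complex(self._re + other._re, self._im + other._im)

        def mul(self, other):
            if isinstance(other, Complex):
                return Complex(self._re * other._re - self._im * other._im,
                               self._re * other._im + self._im * other._re)
            return Complex(self._re * other, self._im * other)   # multiply by a scalar

        def conjugate(self):
            return Complex(self._re, -self._im)

        def as_rectangular(self):   # view as a+bi
            return (self._re, self._im)

        def as_polar(self):         # view as r*e^(i*theta)
            return (math.hypot(self._re, self._im), math.atan2(self._im, self._re))

    # example: (1+2i) * (3+4i) = -5+10i
    z = Complex(1, 2).mul(Complex(3, 4))
    # z.as_rectangular() -> (-5, 10); z.as_polar() gives the same number in (r, theta) form

Because the interface never exposes _re and _im directly, the internal representation could later be changed (e.g. to polar form) without breaking callers – the decoupling the slide describes.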