DROPPING LOWEST GRADES
MATC Math Club, January 26, 2006
Jonathan Kane, Professor, University of Wisconsin-Whitewater


What is dropping lowest grades? Two possible goals:
- Obtaining the largest raw score
- Obtaining the largest mean score

DROP TWO GRADES: Largest raw score
Get the largest raw score by dropping Quizzes 1 and 4, giving a raw score of 36. NOTE: this drops the grade with the largest percentage.
[TABLE 1: Alan's quiz scores, with rows Quiz (1-5), Score, Possible, and Percentage; the numeric entries were lost in extraction.]

K = {1, 2, 3, …, k}. For j ∈ K, quiz j has an integer maximum possible score m_j, and the student earns an integer score s_j. To drop r scores, find a subset D ⊆ K of size k – r so that the ratio (Σ_{j∈D} s_j) / (Σ_{j∈D} m_j) is maximized.

DROP ONE GRADE
DROP #1: (20 + 1) / (100 + 20) = 17.5%
DROP #2: (80 + 1) / (100 + 20) = 67.5%
DROP #3: (80 + 20) / (100 + 100) = 50%
The best grade to drop is quiz 2. NOTE: this retains quiz 3, which has both the lowest percentage and the lowest raw score.

TABLE 2: Beth's Quiz Scores
Quiz:        1    2    3
Score:       80   20   1
Possible:    100  100  20
Percentage:  80%  20%  5%
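The three drop options can be checked with a short script (the function name is mine; the data is Beth's from Table 2):

```python
# Beth's quizzes from Table 2, as (earned, possible) pairs.
beth = [(80, 100), (20, 100), (1, 20)]

def single_drop_average(quizzes, dropped):
    """Average that remains after dropping the quiz at index `dropped`."""
    kept = [q for i, q in enumerate(quizzes) if i != dropped]
    return sum(s for s, m in kept) / sum(m for s, m in kept)

for i in range(len(beth)):
    print(f"Drop quiz #{i + 1}: {100 * single_drop_average(beth, i):.1f}%")
```

This prints 17.5%, 67.5%, and 50.0% in turn, matching the slide: dropping quiz 2 is best.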

BEST ONE GRADE TO DROP: Quiz 4, for an average of 63.4%.
BEST TWO GRADES TO DROP: Quizzes 2 and 3, for an average of 74.6%.
NOTE: the best single grade to drop is not one of the best two grades to drop.
[TABLE 3: Carl's quiz scores, with rows Quiz (1-4), Score, Possible, and Percentage; the numeric entries were lost in extraction.]

TABLE 4: Dale's Quiz Scores
Quiz:   0      1        2        3        4        5
Score:  20+c   21-b_1   22-b_2   23-b_3   24-b_4   25-b_5
Quiz:   6        7        8        9        10
Score:  26-b_6   27-b_7   28-b_8   29-b_9   30-b_10
[The Possible and Percentage rows were lost in extraction.]

Average of the grades of quiz 0 and the quizzes in the kept set D. [The formula on this slide was lost in extraction.]

DALE'S QUIZ GRADES: DROP FIVE GRADES
Each b_j = 1.
If c = 4, it is best to drop quizzes 1, 2, 3, 4, 5.
If c = 6, it is best to drop quizzes 6, 7, 8, 9, 10.
NOTE: A very small change in the problem completely changes the outcome.

DALE'S QUIZ GRADES: c = 11 and each b_j = 2
DROP FOUR GRADES: it is best to drop quizzes 1, 2, 3, 4.
DROP FIVE GRADES: it is best to drop quizzes 6, 7, 8, 9, 10.
NOTE: The best five grades to drop do not contain any of the best four grades to drop.

COMMON APPROACHES TO FINDING ALGORITHMS
- Brute Force
- Greedy Algorithm
- Dynamic Programming

FINDING THE BEST r OF k GRADES TO DROP BY BRUTE FORCE
Consider all the subsets of K of size k – r. To drop 10 out of 100 grades, you would need to consider C(100, 10) = 17,310,309,456,440 subsets. At 1 million subsets per second, it would take about 200 days to consider them all.
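For modest k the enumeration is easy to write down; a sketch using Python's itertools (the function name is mine, the data is Beth's from Table 2):

```python
from itertools import combinations

def best_average_brute_force(quizzes, r):
    """Try every size-(k - r) subset of quizzes and keep the best average.
    quizzes is a list of (earned, possible) pairs; r is how many to drop."""
    k = len(quizzes)
    best = None
    for kept in combinations(quizzes, k - r):
        ratio = sum(s for s, m in kept) / sum(m for s, m in kept)
        if best is None or ratio > best:
            best = ratio
    return best

# Beth's quizzes: dropping one grade gives the 67.5% from the earlier slide.
print(best_average_brute_force([(80, 100), (20, 100), (1, 20)], r=1))
```

This is exact but, as the slide notes, hopeless at scale: the number of subsets grows as C(k, r).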

GREEDY ALGORITHM Take what looks like the best first step. Then take what looks like the best next step. Continue until a solution is found. Maybe it will be the optimal solution.
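For the dropping-grades problem, one natural greedy strategy (my illustration, not from the slides) repeatedly drops the quiz whose removal most improves the current average; Carl's example shows such a strategy need not be optimal:

```python
def greedy_drop(quizzes, r):
    """Greedy heuristic: r times, drop the quiz whose removal most
    improves the current average. Not guaranteed optimal."""
    kept = list(quizzes)
    for _ in range(r):
        def avg_without(i):
            rest = kept[:i] + kept[i + 1:]
            return sum(s for s, m in rest) / sum(m for s, m in rest)
        # Drop the quiz whose removal yields the best remaining average.
        kept.pop(max(range(len(kept)), key=avg_without))
    return sum(s for s, m in kept) / sum(m for s, m in kept)

# On Beth's quizzes the greedy choice (drop quiz 2) happens to be optimal.
print(greedy_drop([(80, 100), (20, 100), (1, 20)], r=1))
```

On data like Carl's, where the best two grades to drop do not contain the best single grade, the first greedy step already locks in a suboptimal answer.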

GREEDY ALGORITHM: SHORTEST PATH
[The accompanying graph figure was lost in extraction.]

DYNAMIC PROGRAMMING Find the best solution for a small problem. Increase the size of the problem, and find the best solution for the new size. Continue increasing the size of the problem in increments until the final size is reached.

DYNAMIC PROGRAMMING KNAPSACK PROBLEM Your knapsack can hold a weight of 50. You can take as many of each item as you can fit. You want to maximize the sum of the benefits.

TABLE: Knapsack items (weights and benefits reconstructed from the walkthrough on the following slides; item 5's benefit was lost in extraction)
Item:     1    2    3    4    5
Weight:   10   20   30   40   50
Benefit:  5    12   20   24   ?

BEST SOLUTION IF YOU CAN CARRY ONLY A WEIGHT OF 10
1 item #1, for a benefit of 5.
BEST SOLUTION IF YOU CAN CARRY ONLY A WEIGHT OF 20
2 items #1 for a benefit of 10, or 1 item #2 for a benefit of 12.
BEST SOLUTION IF YOU CAN CARRY ONLY A WEIGHT OF 30
1 item #1 plus the best weight-20 solution for a benefit of 5 + 12 = 17, or 1 item #3 for a benefit of 20.

BEST SOLUTION IF YOU CAN CARRY ONLY A WEIGHT OF 40
1 item #1 plus the best weight-30 solution for a benefit of 5 + 20 = 25, 1 item #2 plus the best weight-20 solution for a benefit of 12 + 12 = 24, or 1 item #4 for a benefit of 24.
BEST SOLUTION IF YOU CAN CARRY ONLY A WEIGHT OF 50
1 item #5 (benefit lost in extraction), 1 item #4 plus the best weight-10 solution for a benefit of 24 + 5 = 29, or 1 item #3 plus the best weight-20 solution for a benefit of 20 + 12 = 32.
Best solution: 1 item #2 and 1 item #3, for a benefit of 32.
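The weight-by-weight reasoning above is the unbounded-knapsack recurrence best[w] = max over items of (benefit + best[w - weight]). A sketch (item 5 is omitted because its benefit did not survive extraction; any benefit below 32 would leave the answer unchanged):

```python
def unbounded_knapsack(capacity, items):
    """items: list of (weight, benefit) pairs; unlimited copies allowed.
    best[w] = largest total benefit achievable with weight limit w."""
    best = [0] * (capacity + 1)
    for w in range(1, capacity + 1):
        for weight, benefit in items:
            if weight <= w:
                best[w] = max(best[w], best[w - weight] + benefit)
    return best[capacity]

# Items 1-4 reconstructed from the slide.
items = [(10, 5), (20, 12), (30, 20), (40, 24)]
for cap in (10, 20, 30, 40, 50):
    print(cap, unbounded_knapsack(cap, items))
```

The loop reproduces the slide's table of sub-solutions: 5, 12, 20, 25, and finally 32 at capacity 50.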

Greedy algorithms and dynamic programming algorithms will not work for the dropping-lowest-grades problem, because the best set of n – 1 grades to drop is unrelated to the best set of n grades to drop.

THE OPTIMAL DROP FUNCTION
Want to maximize q = (Σ_{j∈D} s_j) / (Σ_{j∈D} m_j), where s_j is the score earned on quiz j and m_j is its maximum possible score.
Rewrite as: Σ_{j∈D} (s_j - q·m_j) = 0.
Define, for each j, f_j(q) = s_j - q·m_j.
Want to find D of size k – r and q so that Σ_{j∈D} f_j(q) = 0, where q is as large as possible.

Define the Optimal Drop Function: F(q) = the maximum of Σ_{j∈D} f_j(q) over all subsets D of size k – r.
F(q) is easy and efficient to calculate: for a given q, just calculate the k values f_j(q) and throw away the r smallest.
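Evaluating F(q) this way takes only a sort. A minimal sketch (the function name is mine; Beth's quizzes are reused for illustration, with f_j(q) = s_j - q·m_j):

```python
def F(q, quizzes, r):
    """Optimal drop function: the largest possible sum of f_j(q) = s_j - q*m_j
    over any subset that keeps k - r of the quizzes."""
    values = sorted(s - q * m for s, m in quizzes)
    return sum(values[r:])  # discard the r smallest values

# Beth's quizzes with r = 1: F crosses zero exactly at the optimal average.
beth = [(80, 100), (20, 100), (1, 20)]
print(F(0.675, beth, 1))
```

F is positive for q below the optimal average and negative above it, which is what the root-finding slides that follow exploit.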

[Figure: the lines f_j(q) for dropping 2 of Carl's 4 quiz grades.]

FIND ALL INTERSECTIONS
There are k lines, so there are at most k(k – 1)/2 pairwise intersection points. Calculate F(q) at the q-value of each intersection point.

BISECTION METHOD
Start with x_1 and x_2 such that the function F crosses the axis between x_1 and x_2. Find x_3 halfway in between. Determine on which side of x_3 the function F crosses the axis, and keep that half. Continue until you are satisfied with the precision.
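A sketch of bisection applied to F, using Beth's quizzes with r = 1 (function names are mine; F is decreasing in q, so its root is the optimal average):

```python
def bisect_root(f, lo, hi, tol=1e-9):
    """Bisection for a decreasing function with f(lo) > 0 > f(hi):
    shrink the bracket [lo, hi] until it is smaller than tol."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(mid) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

beth = [(80, 100), (20, 100), (1, 20)]

def F_beth(q):
    """F(q) for Beth's quizzes with r = 1 dropped grade."""
    values = sorted(s - q * m for s, m in beth)
    return sum(values[1:])

print(bisect_root(F_beth, 0.0, 1.0))  # converges to 0.675
```

The bracket [0, 1] always works here, since F(0) sums positive scores and F(1) sums values s_j - m_j ≤ 0.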

NEWTON'S METHOD??
Start with any q_1 and find F(q_1). Follow the active line segment of F down to the point q_2 where it crosses the axis. Repeat until the best q is found.
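The Newton-like step above has a convenient closed form: the active line segment at q_n is Σ_{j∈D_n} f_j(q), which crosses the axis at the plain average of the kept set D_n. A sketch (the function name is mine; the scheme coincides with what optimization texts call Dinkelbach's method for fractional programming):

```python
def optimal_average(quizzes, r, max_iter=50):
    """Newton-like iteration: at each q, keep the k - r quizzes with the
    largest s - q*m, then move q to the average of that kept set."""
    q = 0.0
    for _ in range(max_iter):
        kept = sorted(quizzes, key=lambda sm: sm[0] - q * sm[1])[r:]
        new_q = sum(s for s, m in kept) / sum(m for s, m in kept)
        if abs(new_q - q) < 1e-12:
            break  # the kept set (and hence q) has stabilized
        q = new_q
    return q

print(optimal_average([(80, 100), (20, 100), (1, 20)], r=1))  # 0.675
```

On Beth's data the iteration settles after two steps (q = 0.5, then 0.675), far faster than checking every intersection.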