Introduction to Mathematical Programming

1 Introduction to Mathematical Programming
Dr. Saeed Shiry Amirkabir University of Technology Computer Engineering & Information Technology Department

2 Introduction Mathematical programming considers the problem of allocating limited resources among competing activities. The resources could be people, capital, or equipment, and the competing activities might be products, services, investments, marketing media, or transportation routes. The objective of mathematical programming is to select the best, or optimal, solution from the set of solutions that satisfy all of the restrictions on resources; these are called the feasible solutions.

3 Application Examples An operations manager wishes to determine the most profitable mix of products or services that meets restrictions on labor, material, and equipment while meeting forecasted demand. A call center manager needs to decide how many technicians must be scheduled during each shift so that the forecasted call volume during the day can be met. A marketing manager must decide how to allocate the advertising budget to different media depending on cost, effectiveness, and mix constraints. A transportation manager wishes to determine the shortest routes for the delivery vehicles while serving all of the customers.

4 Components of a mathematical program
Major components of an MP:
Decision variables: the factors that are controlled by the decision maker. Examples: how many products to make, the number of people to schedule, the amount of money to allocate.
Objective function: a performance measure, such as maximizing profit, minimizing cost, or minimizing delivery time.
Constraints: restrictions that limit the availability of resources and the manner in which they can be used to achieve the objective. Examples: a limit on labor, or on the time available to process a procedure.
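The three components above can be sketched as plain Python data. This is only an illustration: the two-product mix, its profit coefficients, and its resource limits are hypothetical, not a model from these slides.

```python
# Hypothetical two-product mix, stored as plain Python data.
# Decision variables: x[0] = units of product A, x[1] = units of product B.

# Objective function coefficients: profit per unit of each product.
profit = [5.0, 4.0]

# Constraints: each row is (coefficients, right-hand side),
# read as coeffs[0]*x[0] + coeffs[1]*x[1] <= rhs.
constraints = [
    ([6.0, 4.0], 24.0),   # labor hours available
    ([1.0, 2.0], 6.0),    # machine hours available
]

def objective(x):
    """Profit earned by the production plan x."""
    return sum(c * xi for c, xi in zip(profit, x))

def is_feasible(x):
    """True if x satisfies every constraint and non-negativity."""
    if any(xi < 0 for xi in x):
        return False
    return all(sum(a * xi for a, xi in zip(coeffs, x)) <= rhs
               for coeffs, rhs in constraints)
```

A plan is feasible when it respects every constraint; for example `is_feasible([1, 1])` holds, while `[10, 0]` violates the labor limit.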

5 Major Classes of Mathematical Programming
Linear programming (LP) makes 4 assumptions: linearity, divisibility, certainty, and non-negativity. Integer programming (IP) assumes that the decision variables must take on integer values. Nonlinear programming (NLP) assumes that the relationships in the objective function and/or the constraints may be nonlinear.

6 Modeling Process The modeling process requires three activities:
Formulation: defining the decision variables, the objective function, and the constraints.
Solution: determining the optimal values of the decision variables and the objective function, for example with computer software.
Interpretation: interpreting the results in the context of the original problem.

7 Linear programming LP makes 4 assumptions: linearity, divisibility, certainty, and non-negativity. Linearity implies that the objective function and all of the constraints are linear relationships; proportionality and additivity are consequences of the linearity assumption. Divisibility means that the optimal values of the decision variables may be fractional, depending on the application; where fractions are not meaningful (for example, a number of professional employees), integer restrictions are needed. Certainty requires that the parameters of the LP model are known or can be accurately estimated. Non-negativity simply means that all decision variables must take non-negative (zero or positive) values.

8 Linear Programming Example
Assume that a firm produces alphas and omegas using labor, machine time, and finishing time. The profit for each alpha is 2.1 USD and for each omega is 3.5 USD. Each alpha requires 10 labor hours, 2 hours of machine time, and 3 hours of finishing. Each omega requires 14 labor hours, 20 hours of machine time, and no finishing. The firm has 70 hours of labor, 70 hours of machine time, and 12 hours of finishing time available each day. How many alphas and omegas should it produce to maximize daily profit?

9 LP formulation Model:
MAX = 2.1X1 + 3.5X2
subject to:
10X1 + 14X2 <= 70   (labor)
2X1 + 20X2 <= 70    (machine)
3X1 <= 12           (finishing)
X1, X2 >= 0
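As noted in the modeling-process slide, the solution step is usually done with software. A minimal sketch using SciPy's `linprog`, assuming SciPy is installed (`linprog` minimizes, so the profit coefficients are negated):

```python
# Sketch of solving the alpha/omega LP with SciPy, assuming SciPy
# is available. linprog minimizes, so we negate the profits to maximize.
from scipy.optimize import linprog

c = [-2.1, -3.5]           # negated profits for alpha (X1) and omega (X2)
A_ub = [[10, 14],          # labor hours
        [2, 20],           # machine hours
        [3, 0]]            # finishing hours
b_ub = [70, 70, 12]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x)       # optimal (X1, X2), about (2.44, 3.26)
print(-res.fun)    # maximum daily profit, about 16.52
```

The sign flip at the end recovers the maximized profit from the minimized value.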

10 Graphical Solution
1. Graph the constraints and identify the feasible region.
2. Determine the coordinates of the corner points of the feasible region.
3. Compute the objective function value at each corner point and determine the optimal solution.

11 Graph the constraints and identify the feasible region
The LP graph is restricted to the first quadrant of a standard two-dimensional plot (all variables are non-negative). Because each constraint is a "less than or equal to" relation, its graph is a region, not just a line. We first draw the boundary line of each constraint and then determine which side of it is feasible.

12 Graph of feasible Region
Model: MAX = 2.1X1 + 3.5X2, subject to 10X1 + 14X2 <= 70, 2X1 + 20X2 <= 70, 3X1 <= 12, X1, X2 >= 0.
Corner point (X1, X2)    Objective function
(0, 0)                   0
(0, 3.5)                 12.25
(4, 0)                   8.4
(4, 2.14)                15.9
(2.44, 3.26)             16.52  <- optimal solution

13 Determine the coordinates of the corner points
The fundamental theorem of linear programming states that if an optimal solution exists, at least one occurs at a corner point of the feasible region. Corner points are where two or more constraint boundary lines intersect. There are 5 corner points in this problem.

14 Calculate the objective function
The last step is to plug the coordinates of each corner point into the objective function MAX = 2.1X1 + 3.5X2 and compute the total profit at each point:
(X1, X2)        Objective function
(0, 0)          0
(0, 3.5)        12.25
(4, 0)          8.4
(4, 2.14)       15.9
(2.44, 3.26)    16.52  <- optimal solution
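The corner-point procedure can be carried out in plain Python: intersect every pair of constraint boundary lines (including the two axes), keep the intersections that are feasible, and evaluate the profit at each. A sketch for this model:

```python
from itertools import combinations

# Boundary lines a*X1 + b*X2 = c for each constraint, plus the two axes.
lines = [
    (10, 14, 70),  # labor
    (2, 20, 70),   # machine
    (3, 0, 12),    # finishing
    (1, 0, 0),     # X1 = 0 axis
    (0, 1, 0),     # X2 = 0 axis
]

def feasible(x1, x2, tol=1e-9):
    """True if (x1, x2) satisfies all constraints and non-negativity."""
    return (x1 >= -tol and x2 >= -tol and
            10*x1 + 14*x2 <= 70 + tol and
            2*x1 + 20*x2 <= 70 + tol and
            3*x1 <= 12 + tol)

corners = []
for (a1, b1, c1), (a2, b2, c2) in combinations(lines, 2):
    det = a1*b2 - a2*b1
    if abs(det) < 1e-12:            # parallel lines do not intersect
        continue
    x1 = (c1*b2 - c2*b1) / det      # Cramer's rule for the 2x2 system
    x2 = (a1*c2 - a2*c1) / det
    if feasible(x1, x2):
        corners.append((x1, x2))

best = max(corners, key=lambda p: 2.1*p[0] + 3.5*p[1])
print(len(corners))                  # the 5 corner points of the region
print(best)                          # about (2.44, 3.26)
print(2.1*best[0] + 3.5*best[1])     # about 16.52
```

This recovers the same 5 corner points as the table and the same optimal corner.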

15 Iso-Profit Approach Another approach is to graph iso-profit lines.
Set the objective function equal to an arbitrary value of profit, for example 2.1X1 + 3.5X2 = 14. To maximize the profit, move the line upward across the feasible region. The last point of the feasible region that the line touches is the optimal solution.

16 Simplex Method Concept
Many computer software packages use the simplex method for solving linear programs. The method systematically moves from corner point to corner point until the value of the objective function can no longer be improved. Example: in this problem the method first moves along the X2 axis (because X2 has the higher profit coefficient) until it reaches the corner (0, 3.5). To reach the next corner point we must increase X1, which increases the profit, so the algorithm moves to (2.44, 3.26). To reach the corner after that we would have to increase X1 further, which would decrease the profit, so the algorithm stops. The algorithm thus examines only 3 corner points instead of 5. For larger problems with many variables and constraints, this leads to a large saving in search effort.

17 Integer Programming IP is a variation of LP in which one or more of the decision variables must take on integer values. Example: restrict X1 and X2 in the previous example to integer values. LP relaxation: the LP solution without the integer restriction is called the relaxation. For a maximization problem, the optimal objective function value of the LP relaxation is an upper bound on the IP optimum. (Figure: the feasible region with the lower bound LB = 14.7 and the upper bound UB = 16.52 marked.)

18 Integer Programming The LP relaxation solution for this IP problem is X1 = 2.44 and X2 = 3.26, with an objective function value of 16.52, which is an upper bound. A simple heuristic: when all constraints are "less than or equal to", we can find a lower bound by taking the LP relaxation solution and rounding the variables down to the nearest integers: X1 = 2 and X2 = 3, which gives an objective function value of 14.7. We can then search the gap between the lower and upper bounds using the iso-profit lines that pass through them. In this case we find X1 = 4 and X2 = 2, with an objective function value of 15.4. This is the optimal IP solution.
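For a problem this small, the IP optimum can also be checked by brute force: the constraints bound both variables (10X1 <= 70 and 20X2 <= 70), so the integer grid can be enumerated completely. A sketch:

```python
# Brute-force check of the IP: enumerate every integer point inside
# the variable bounds and keep the most profitable feasible one.
best, best_profit = None, float("-inf")
for x1 in range(0, 8):          # 10*X1 <= 70  =>  X1 <= 7
    for x2 in range(0, 4):      # 20*X2 <= 70  =>  X2 <= 3
        if (10*x1 + 14*x2 <= 70 and
                2*x1 + 20*x2 <= 70 and
                3*x1 <= 12):
            profit = 2.1*x1 + 3.5*x2
            if profit > best_profit:
                best, best_profit = (x1, x2), profit

print(best, best_profit)   # the optimal integer plan (4, 2), profit 15.4
```

Enumeration confirms the slide's result; for larger problems this grid grows exponentially, which is why branch-and-bound methods are used instead.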

19 Nonlinear Programming
A nonlinear program is a mathematical program in which at least one of the constraints or the objective function is a nonlinear expression. A linear approximation might work, but we do not live in a linear world: doubling the dose of a medicine does not double its effectiveness, and assigning three times as many employees to a job does not finish it three times faster. The portfolio selection problem is an important nonlinear problem.

20 Nonlinear Objective Function
Consider an NLP problem with a nonlinear objective function and linear constraints. In this case we are not guaranteed that the optimal solution lies at a corner point of the feasible region; in an NLP problem the optimal solution can occur at any point, so a solution method must in principle consider all feasible points.
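A tiny illustration of that point, using a made-up one-variable problem rather than the slides' model: a concave objective maximized over an interval peaks in the interior, not at either endpoint. A coarse grid search finds it:

```python
# Illustrative only: maximize the concave function
# f(x) = -(x - 2)**2 + 9 over the feasible interval 0 <= x <= 5.
# The maximum sits at the interior point x = 2, not at a corner.
def f(x):
    return -(x - 2)**2 + 9

# Coarse grid search over the feasible interval.
grid = [i * 0.01 for i in range(0, 501)]   # 0.00, 0.01, ..., 5.00
best_x = max(grid, key=f)

print(best_x, f(best_x))   # 2.0 9.0
```

At the endpoints f(0) = 5 and f(5) = 0, both worse than the interior peak, so corner-point enumeration alone would miss the optimum here.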

21 Local Optimal Solution
A local optimal solution is the best solution only with respect to feasible solutions close to that point. A global optimal solution is the best solution to the complete problem. Global Optima Local Optima

22 Example Model:
Max = 100X2 - 341X1 + 128X1^2 - 13X1^3
5X1 + 3X2 >= 30
X2 <= 8.5
-X1 + X2 >= 3
We can graph the feasible region and use the iso-profit approach to identify the local optimal solutions. (Figure: the feasible region with iso-profit curves; a point of tangency marks the global optimum at profit 722, a local optimum occurs at profit 400, and an intermediate iso-profit level of 636 is also shown.)

23 Gradient Search Many of the computational methods for solving NLP problems make use of the vector of partial derivatives of the objective function, known as the gradient. Such a method moves in the direction of steepest ascent, and the search continues until a peak is reached: no further improvement in the objective function can be made. This method can become trapped at a local optimum, and there is no guarantee that it will find the global optimum.
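A minimal gradient-ascent sketch, again on an illustrative one-variable function rather than anything from the slides: repeatedly step in the direction of the derivative until the improvement stalls.

```python
# Gradient ascent on the illustrative function f(x) = -(x - 2)**2 + 9,
# whose derivative is f'(x) = -2*(x - 2). The peak is at x = 2.
def grad(x):
    return -2.0 * (x - 2.0)

x = 5.0                      # arbitrary starting point
learning_rate = 0.1
for _ in range(200):         # fixed iteration budget
    step = learning_rate * grad(x)
    x += step
    if abs(step) < 1e-10:    # stop when no meaningful improvement remains
        break

print(x)   # converges to the peak at x = 2
```

On this single-peaked function the loop reliably reaches the maximum; with a multimodal objective the same loop can stall at whichever local peak is uphill from the starting point, which is exactly the trap described above.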

24 Home Work 5 A) Consider the following integer programming problem:
Maximize: 5X1 + 8X2
Such that:
6X1 + 5X2 <= 30
9X1 + 4X2 <= 36
X1 + 2X2 <= 10
Provide a graph of the model and the optimal solution.

25 Home Work 5 B) Summarize one of the following papers:
"Real-Coded Genetic Algorithm for Solving Generalized Polynomial Programming Problems", Journal of Advanced Computational Intelligence and Intelligent Informatics, Vol. 11, No. 4, 2007.
"Solving Integer Programming Problems Using Genetic Algorithms", ai-depot, May 25, 2003.
