Warehouse Lending Optimization, Paul Parker (2016)


Problem Background Warehouse Operations Process: as many as 50,000 loans come into the pipeline at once, each with different characteristics such as credit rating, loan-to-value (LTV), zip code, type of structure, etc. The lender has 11 credit lines for warehousing, and each line has different loan-level (eligibility) and pool-level (sublimit) rules.

Problem Background (cont’d) Loan allocation is typically done by hand using Excel spreadsheets. Any allocation must meet eligibility (loan-level) requirements and will incur costs and penalties if sublimit (pool-level) requirements are not met. Cost optimization is not typically done.

Sample Loan Attributes

Sample Lender Constraints

Problem Background (cont’d) Problem: create an algorithm to automate the allocation of N loans to n liquidity sources such that the cost function C(f) is minimized and both eligibility and sublimit requirements are met. This is a combinatorial optimization problem; it is also NP-complete, so the computation time to find an exact solution grows with N as t(N) = exp(const * N).
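To make the size of the search space concrete, a candidate allocation can be represented as a list of length N in which position i holds the credit line assigned to loan i; with n lines there are n^N raw allocations before eligibility filtering. A minimal sketch (the function name and the loan counts are ours, not from the presentation):

```python
# A candidate solution assigns each of N loans to one of n credit lines.
# With n = 11 lines, the raw search space has 11**N points before any
# eligibility filtering, which is why exact enumeration quickly becomes
# infeasible as the number of loans grows.

def search_space_size(num_loans: int, num_lines: int = 11) -> int:
    """Number of possible allocations before eligibility filtering."""
    return num_lines ** num_loans

for n_loans in (5, 10, 20, 50):
    print(n_loans, "loans ->", search_space_size(n_loans), "possible allocations")
```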

Solution Options There are many approaches that can be used to solve the problem, which can be grouped into the following classes: grid search (exhaustive search), random or Monte Carlo type search, gradient (calculus-based) methods, and global optimization methods.

Grid Search A grid search, or exhaustive search, finds an optimal solution by brute force, trying every possible solution and suggesting the one that gives the best result. It is impractical for all but cases where the number of loans is very small.
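A minimal sketch of what grid search means here, assuming a stand-in cost function and made-up loan amounts and line rates (none of these figures come from the presentation):

```python
import itertools

# Exhaustive (grid) search over every possible allocation of loans to lines.
# Feasible only for a handful of loans; shown purely for illustration.
# The cost function is a stand-in, not the production cost model.

def toy_cost(allocation, loan_amounts, line_rates):
    """Hypothetical cost: loan amount times the assigned line's carrying rate."""
    return sum(amount * line_rates[line]
               for amount, line in zip(loan_amounts, allocation))

loan_amounts = [200_000, 350_000, 125_000]   # three loans
line_rates = [0.020, 0.018, 0.025]           # three credit lines

best = min(itertools.product(range(len(line_rates)), repeat=len(loan_amounts)),
           key=lambda alloc: toy_cost(alloc, loan_amounts, line_rates))
print("best allocation:", best, "cost:", toy_cost(best, loan_amounts, line_rates))
```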

Combinations of Solutions for 10 Warehouse Lines (grid search) [Table: Number of Loans vs. Possible Solutions vs. Computation Time. With 10 eligible lines there are 10^N possible allocations of N loans, and the estimated time for exhaustive search grows from seconds through minutes and days to years as N increases.]

Random Search A random search (also called a Monte Carlo search) samples the solution space stochastically. It is unintelligent: information from earlier search results is not used to refine the search. Because the solution space in this problem is so large and complex, it is unlikely that a good solution can be found randomly.
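For comparison with the grid-search sketch, a minimal Monte Carlo search under the same stand-in cost model; every sample is drawn independently, so nothing is learned from earlier draws:

```python
import random

# Unguided Monte Carlo search: sample allocations at random and keep the best.
# Cost function and data are illustrative stand-ins, not the real cost model.

def toy_cost(allocation, loan_amounts, line_rates):
    return sum(a * line_rates[l] for a, l in zip(loan_amounts, allocation))

loan_amounts = [200_000, 350_000, 125_000, 90_000]
line_rates = [0.020, 0.018, 0.025, 0.022]

best_alloc, best_cost = None, float("inf")
for _ in range(10_000):
    alloc = [random.randrange(len(line_rates)) for _ in loan_amounts]
    cost = toy_cost(alloc, loan_amounts, line_rates)
    if cost < best_cost:
        best_alloc, best_cost = alloc, cost

print("best random allocation:", best_alloc, "cost:", round(best_cost, 2))
```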

Gradient Methods Methods such as conjugate gradient and steepest descent rely on the derivative of a function to find an optimal solution. They are very efficient on well-behaved functions but perform poorly on complex and/or multimodal functions.

Multimodal function in 3 dimensions

Gradient Methods (cont’d) Gradient methods are analogous to putting a ball somewhere on the previous surface and hoping it will roll down into the lowest point. Naturally, the ball will go to a minimum, but not necessarily the global minimum. Gradient methods have no reliable way to “bounce out” of a local minimum.
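A small illustration of this behavior, using an arbitrary multimodal one-dimensional function (not the loan cost surface): plain gradient descent settles into whichever local minimum the starting point happens to sit above.

```python
import math

# "Ball rolling downhill": gradient descent on a multimodal function converges
# to a local minimum that depends entirely on the starting point. The function
# and step size are arbitrary choices for illustration.

def f(x):          # multimodal: global minimum near x = -1.3, local near x = 3.8
    return x * x + 10 * math.sin(x)

def df(x):         # its derivative
    return 2 * x + 10 * math.cos(x)

def gradient_descent(x, step=0.01, iters=2000):
    for _ in range(iters):
        x -= step * df(x)
    return x

for start in (-6.0, 0.0, 6.0):
    x = gradient_descent(start)
    print(f"start {start:+.1f} -> x = {x:+.3f}, f(x) = {f(x):.3f}")
```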

Global Optimization Methods These methods occupy a middle ground between traditional gradient-based methods and global enumerative/random schemes. Popular methods include genetic algorithms, simulated annealing, neural networks, and simulated ant and bee colonies. These methods mimic nature to solve complex nonlinear problems.

Genetic Algorithms Genetic algorithms mimic natural selection by randomly producing a population of solutions, testing each solution, and combining the traits of the best-performing ones. They also introduce a certain amount of randomness into the search by performing occasional mutations on the solutions.

Steps for Implementing the Genetic Algorithm
1. Coding the problem and incorporating constraints
2. Generating an initial (parent) population of random solutions
3. Evaluating the fitness of each solution
4. Subjecting the solutions to a selection process to find pairs of high fitness
5. Applying crossover to the pairs to create a new (child) population
6. Applying mutation operators to the new population
7. Overwriting the old population with the new one
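A toy end-to-end sketch of these seven steps, applied to a made-up problem of assigning 8 loans to 3 lines; the helper functions, cost table, and parameter values are illustrative placeholders, not the presentation's production code:

```python
import random

# Toy GA following the seven steps above: assign each of 8 "loans" to one of
# 3 "lines" so that a stand-in cost table is minimized.

LINES = 3
LOANS = 8
COST = [[random.uniform(1.0, 5.0) for _ in range(LINES)] for _ in range(LOANS)]

def fitness(solution):                  # step 3: lower cost means fitter
    return sum(COST[i][line] for i, line in enumerate(solution))

def random_solution():                  # used by step 2
    return [random.randrange(LINES) for _ in range(LOANS)]

def select(population):                 # step 4: simple 3-way tournament
    return min(random.sample(population, 3), key=fitness)

def crossover(a, b):                    # step 5: single-point crossover
    point = random.randrange(1, LOANS)
    return a[:point] + b[point:]

def mutate(solution, rate=0.05):        # step 6: occasional random reassignment
    return [random.randrange(LINES) if random.random() < rate else g
            for g in solution]

population = [random_solution() for _ in range(100)]          # step 2
for _ in range(200):                                          # repeat steps 3-7
    population = [mutate(crossover(select(population), select(population)))
                  for _ in range(len(population))]            # step 7: overwrite

best = min(population, key=fitness)
print("best allocation:", best, "cost:", round(fitness(best), 2))
```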

Genetic Algorithm

Incorporating Constraints Warehouse lenders impose constraints on eligibility both at the loan level and the pool level. A potential problem for global cost optimization is wasting time computing the cost of solutions that violate loan-level constraints (invalid solutions). One way to deal with this is to penalize invalid solutions in the objective function. A far better way is to minimize the global cost by searching only the subspace of valid solutions.

Incorporating Constraints The initial (parent) population of potential solutions can be coded as a list of loans, with each loan assigned randomly to an eligible lender. To do this, we compute a list of the eligible lenders for each loan and draw randomly from that list. This ensures that time is not wasted evaluating invalid solutions. A reasonably large (> 100) initial population of solutions must be generated. To generate new (child) solutions, we combine two solutions by crossing them over at a random point on the loan list.
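A minimal sketch of this seeding step; the eligibility table below is invented for illustration, and a chromosome is represented here as a dict from loan ID to line index:

```python
import random

# Build an initial population in which every loan is assigned only to a line
# it is eligible for, so no time is spent evaluating invalid solutions.
# The eligibility table is a made-up example, not real lender rules.

eligible_lines = {
    "loan_001": [0, 2, 5],     # line indices this loan qualifies for
    "loan_002": [1, 2],
    "loan_003": [0, 1, 3, 7],
}

def random_valid_solution(eligible):
    """One chromosome: each loan mapped to a randomly chosen eligible line."""
    return {loan: random.choice(lines) for loan, lines in eligible.items()}

def initial_population(eligible, size=100):
    return [random_valid_solution(eligible) for _ in range(size)]

population = initial_population(eligible_lines, size=100)
print(population[0])
```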

Generating an Initial (Parent) Population

Evaluating the Fitness of Each Solution For the warehouse operations process, this is essentially what is done by hand: loan parameters are compared to loan constraints in an Excel spreadsheet to see if they meet the qualifications for the bin. For this problem, the fitness function is the same as the cost function, with "fitter" solutions giving a smaller fitness value. Rapid calculation is essential, as the fitness function will be evaluated hundreds of thousands of times in each run.
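A sketch of what such a fitness (cost) evaluation could look like, combining per-line carrying costs with a flat penalty per violated sublimit; the rates, sublimits, and penalty amount are assumptions, not figures from the presentation:

```python
# Fitness evaluation for one candidate allocation. Per-line rates, sublimits,
# and the penalty are illustrative; the real cost model would come from the
# lender agreements.

LINE_RATE = [0.020, 0.018, 0.025]             # carrying cost per dollar, per line
LINE_SUBLIMIT = [500_000, 400_000, 300_000]   # max pool balance before penalty
PENALTY = 10_000                              # flat penalty per violated sublimit

def fitness(allocation, loan_amounts):
    """Lower is better: carrying cost plus sublimit (pool-level) penalties."""
    line_balance = [0.0] * len(LINE_RATE)
    cost = 0.0
    for amount, line in zip(loan_amounts, allocation):
        cost += amount * LINE_RATE[line]
        line_balance[line] += amount
    cost += sum(PENALTY for bal, limit in zip(line_balance, LINE_SUBLIMIT)
                if bal > limit)
    return cost

print(fitness([0, 1, 1, 2], [200_000, 350_000, 125_000, 90_000]))
```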

Evaluating the Fitness of Each Solution Factory Pattern for creation of Loan Attributes

Applying Crossover to the Pairs to Create a New (Child) Population Here we assume that the best-performing solutions contain bits and pieces of the optimal solution, so to reach the optimal solution we mix them by crossing the solutions over. We must make sure that the offspring are valid solutions (in terms of eligibility requirements). This is easy to guarantee because the solutions are coded as strings of valid lines, and the act of crossing over does not disrupt the validity of each solution.
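A sketch of single-point crossover over two valid parents; because each gene already holds an eligible line for its loan, swapping tails cannot create an ineligible assignment. The loan IDs and line indices below are made up:

```python
import random

# Single-point crossover over two valid parent solutions (dicts mapping loan
# ID to line index). Cutting and recombining never changes which line a given
# loan can receive, so validity is preserved.

def crossover(parent_a, parent_b):
    """Return two children produced by cutting both parents at one random point."""
    loans = list(parent_a)                      # same loan order in both parents
    point = random.randrange(1, len(loans))     # never cut at position 0
    child_a = {loan: (parent_a if i < point else parent_b)[loan]
               for i, loan in enumerate(loans)}
    child_b = {loan: (parent_b if i < point else parent_a)[loan]
               for i, loan in enumerate(loans)}
    return child_a, child_b

pa = {"loan_001": 0, "loan_002": 2, "loan_003": 1, "loan_004": 7}
pb = {"loan_001": 5, "loan_002": 1, "loan_003": 3, "loan_004": 0}
print(crossover(pa, pb))
```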

Applying Crossover to the Pairs to Create a New (Child) Population

Applying Mutation Operators to the New Population This keeps diversity in the gene pool and guarantees that no "traits" are ever completely lost due to crossover. Solutions are mutated by the occasional random change of position for a loan (a different eligible line is selected).
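A sketch of a validity-preserving mutation operator, reusing the invented eligibility table from the earlier example:

```python
import random

# Mutation: with a small probability, reassign a loan to a different line drawn
# from that loan's own eligibility list, so mutated solutions stay valid.
# The eligibility table is a made-up example.

eligible_lines = {
    "loan_001": [0, 2, 5],
    "loan_002": [1, 2],
    "loan_003": [0, 1, 3, 7],
}

def mutate(solution, eligible, rate=0.01):
    mutated = dict(solution)
    for loan, lines in eligible.items():
        if len(lines) > 1 and random.random() < rate:
            # pick a different eligible line for this loan
            mutated[loan] = random.choice([l for l in lines if l != solution[loan]])
    return mutated

print(mutate({"loan_001": 0, "loan_002": 1, "loan_003": 3}, eligible_lines, rate=0.5))
```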

Overwriting the Old Generation In this step the old (parent) array is overwritten with the new (child) array, and a generation is completed. The new population can be checked to see if the best model from the previous generation has been reproduced; if not, it is copied into a random slot. This ensures that there is no regression in solution quality.
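A sketch of the elitism check described here; fitness is whatever cost function is in use (sum is used below purely as a stand-in):

```python
import random

# Elitism check after the parent population has been overwritten: if nothing in
# the new generation matches or beats the previous best, copy the previous best
# into a random slot so the best cost never regresses between generations.

def apply_elitism(new_population, previous_best, fitness):
    if not any(fitness(sol) <= fitness(previous_best) for sol in new_population):
        new_population[random.randrange(len(new_population))] = previous_best
    return new_population

# Stand-in demo: solutions are lists of per-loan costs and sum() is the cost.
population = [[1, 2, 2], [2, 2, 2], [2, 1, 2]]
print(apply_elitism(population, previous_best=[0, 0, 1], fitness=sum))
```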

Results vs. Monte Carlo