Axially Variable Strength Control Rods for The Power Maneuvering of PWRs KIM, UNG-SOO Dept. of Nuclear and Quantum Engineering.



2 Table of Contents  Introduction  Simulation Optimization  Optimization of AVSCRs  Further Study

3 Introduction  Optimization of the worth shape of the AVSCRs  Minimizing AO variation and power deviation during power maneuvering  Simulation optimization  Using response surface methodology  Objective function for optimization of the AVSCRs  Relationship between AO variation (or power deviation) and worth shape  An analytic objective function does not exist.  The response to an input can only be evaluated by computer simulation.

4 Simulation Optimization  What is simulation optimization?  The process of finding the best input variable values from among all possibilities without explicitly evaluating each possibility  An optimization problem in which the objective function and some constraints are responses that can only be evaluated by computer simulation  An analytic expression of the objective function or the constraints does not exist.  This rules out differentiation or exact calculation of local gradients.  Precise evaluation of the objective function is computationally very costly, which makes the efficiency of the optimization algorithm crucial.

5 Simulation Optimization  Advantages of using simulation in optimization  The complexity of the system being modeled does not significantly affect the performance of the optimization process.  For stochastic systems, the variance of the response is controllable by various output analysis techniques.  A general simulation model  n input variables and m output variables  Simulation optimization entails finding the settings of the input variables that optimize the output variables.

6 Simulation Optimization  A simulation optimization model  The output of the simulation model is used by an optimization strategy to provide feedback on the progress of the search for the optimal solution.

7 Simulation Optimization  Simulation optimization method  The six major categories  Gradient based search method  Stochastic optimization  Response surface methodology  Heuristic methods  A-teams  Statistical methods

8 Simulation Optimization  Gradient based search methods  Estimate the response function gradient to assess the shape of the objective function  Finite differences  To estimate the gradient at a specific value of x, at least n+1 configurations of the simulation model must be run.  The crudest method of estimating the gradient  Likelihood ratios  Estimate the gradient of the expected value of an output variable with respect to an input variable.  Suitable for transient and regenerative simulation optimization problems
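The finite-difference idea above can be sketched in a few lines. This is a minimal illustration using a hypothetical deterministic quadratic in place of a real simulation model; the function names and test point are invented for the example.

```python
import random

def simulate(x, noise=0.0):
    # Toy stand-in for a simulation run: a quadratic bowl with
    # optional Gaussian noise. The true gradient at x is
    # (2*(x[0] - 1), 2*(x[1] + 2)).
    base = (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2
    return base + random.gauss(0.0, noise)

def fd_gradient(f, x, h=1e-3):
    """Forward-difference gradient estimate: requires n+1 runs of the
    simulation (one at x, plus one per perturbed coordinate)."""
    fx = f(x)
    grad = []
    for i in range(len(x)):
        xp = list(x)
        xp[i] += h
        grad.append((f(xp) - fx) / h)
    return grad

# Deterministic case: the estimate is close to the true gradient (-2, 4).
g = fd_gradient(lambda x: simulate(x, noise=0.0), [0.0, 0.0])
```

With a noisy response (`noise > 0`), the same estimator becomes much less reliable, which is why the slide calls it the crudest method.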

9 Simulation Optimization  Perturbation analysis  If an input variable is perturbed by an infinitesimal amount, the sensitivity of the output variable to the parameter can be estimated by tracing its pattern of propagation.  All partial gradients of an objective function are estimated from a single simulation run.  Frequency domain method  Selected input parameters are oscillated sinusoidally at different frequencies during one long simulation run.  If the output variable is sensitive to an input parameter, the sinusoidal oscillation of that parameter should induce corresponding oscillations in the response.

10 Simulation Optimization  Stochastic optimization  The problem of finding a local optimum for an objective function whose values are not known analytically but can be estimated or measured  Mimics the gradient search method in a rigorous statistical manner that takes into account the stochastic nature of the system model  Response surface methodology  A procedure for fitting a series of regression models to the output variable of a simulation model and optimizing the resulting regression function  Starts with a first-order regression function and the steepest ascent/descent method.  Higher-degree regression functions are employed after reaching the vicinity of the optimum.
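One classical stochastic optimization scheme of the kind described is Kiefer–Wolfowitz stochastic approximation, sketched below on a hypothetical noisy one-dimensional response (not the AVSCR model); the gain sequences are standard textbook choices.

```python
import random

def noisy_response(x):
    # Hypothetical measured response: true minimum at x = 3, plus noise.
    return (x - 3.0) ** 2 + random.gauss(0.0, 0.1)

def kiefer_wolfowitz(f, x0, iters=2000):
    """Kiefer-Wolfowitz stochastic approximation: a finite-difference
    gradient step with decreasing gains a_k and difference widths c_k,
    so the noise is averaged out as the iteration proceeds."""
    x = x0
    for k in range(1, iters + 1):
        a_k = 1.0 / k           # step-size sequence
        c_k = 1.0 / k ** 0.25   # difference-width sequence
        ghat = (f(x + c_k) - f(x - c_k)) / (2.0 * c_k)
        x -= a_k * ghat
    return x

random.seed(0)
xstar = kiefer_wolfowitz(noisy_response, x0=0.0)
```

The decreasing step sizes are what make this "rigorous in a statistical manner": they guarantee convergence in spite of the noise, unlike a fixed-step gradient search.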

11 Simulation Optimization  Heuristic methods  Genetic algorithms  A search strategy that employs random choice to guide a highly exploitative search, striking a balance between exploration of the feasible domain and exploitation of “good” solutions  Selection, reproduction, crossover, and mutation  Evolutionary strategies  Algorithms that imitate the principles of natural evolution  Two-membered ES  Multi-membered ES  Extended multi-membered ES
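The four genetic operators named above can be sketched in a minimal real-coded GA. The objective function, population size, and mutation rate are all hypothetical choices for illustration.

```python
import random

def fitness(x):
    # Hypothetical objective to minimize; optimum at (2, -1).
    return (x[0] - 2.0) ** 2 + (x[1] + 1.0) ** 2

def genetic_algorithm(pop_size=40, genes=2, gens=100):
    random.seed(1)
    pop = [[random.uniform(-5, 5) for _ in range(genes)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)              # selection: rank by objective
        parents = pop[: pop_size // 2]     # keep the better half (elitism)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)   # reproduction
            cut = random.randrange(1, genes + 1)
            child = a[:cut] + b[cut:]          # one-point crossover
            if random.random() < 0.2:          # mutation
                i = random.randrange(genes)
                child[i] += random.gauss(0.0, 0.3)
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

best = genetic_algorithm()
```

Selection and elitism provide the exploitation; crossover and mutation provide the exploration that keeps the search from stalling.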

12 Simulation Optimization  Simulated annealing  A stochastic search method analogous to the physical annealing process, in which an alloy is cooled gradually so that a minimal energy state is achieved  Tabu search  Optimizes an objective function with a special feature designed to avoid being trapped in local minima  A fixed-length list of tabu moves that are not allowed at the present iteration is maintained.  Simplex search  Starts with a simplex of p+1 vertices in the feasible region.  Proceeds by repeatedly dropping the worst point in the simplex and adding a new point determined by reflection through the centroid of the remaining vertices.
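The simulated annealing scheme can be sketched as follows, on a hypothetical one-dimensional multimodal function (the cooling schedule and proposal width are illustrative choices).

```python
import math
import random

def anneal(f, x0, t0=1.0, cooling=0.995, steps=3000):
    """Simulated annealing: worse moves are accepted with probability
    exp(-delta/T), so the search can climb out of local minima while
    the temperature T is still high."""
    random.seed(2)
    x, fx = x0, f(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(steps):
        cand = x + random.gauss(0.0, 0.5)      # random neighbor move
        fc = f(cand)
        delta = fc - fx
        if delta <= 0 or random.random() < math.exp(-delta / t):
            x, fx = cand, fc                   # accept the move
        if fx < fbest:
            best, fbest = x, fx
        t *= cooling                           # gradual "cooling"
    return best, fbest

# Hypothetical multimodal objective with many local minima;
# the global minimum is at x = 0.
best, fbest = anneal(lambda x: x * x + 3.0 * math.sin(5.0 * x) ** 2, 10.0)
```

As the temperature decays the acceptance rule hardens into pure descent, mirroring the gradual cooling of the physical process.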

13 Simulation Optimization  A-teams  A process that combines various problem solving strategies so that they can interact synergistically  Inherently suitable for multi-criteria simulation optimization problems  Statistical methods  Importance sampling  Achieves significant speed-ups in simulations involving rare events  Simulates the system under a different probability measure so as to increase the probability of typical sample paths involving the rare event of interest  Ranking and selection  Can treat the optimization problem as a multi-criteria decision problem.  Multiple comparisons with the best  The problem of selecting the best of a finite number of system designs
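The importance sampling idea can be illustrated with a textbook rare-event example, estimating P(X > 4) for a standard normal X (true value about 3.2e-5); the shifted sampling distribution is a standard mean-shift choice, not anything specific to this work.

```python
import math
import random

def importance_sampling(n, threshold=4.0):
    """Sample from N(threshold, 1) so the rare event x > threshold is
    common, then reweight each sample by the likelihood ratio of the
    original density to the sampling density."""
    random.seed(3)
    total = 0.0
    for _ in range(n):
        x = random.gauss(threshold, 1.0)
        if x > threshold:
            # Likelihood ratio phi(x) / phi(x - threshold)
            total += math.exp(threshold ** 2 / 2.0 - threshold * x)
    return total / n

est = importance_sampling(20000)
```

A crude Monte Carlo estimate with the same budget would see only a handful of hits (or none), whereas the reweighted estimator is accurate to a few percent.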

14 Optimization of AVSCRs  Optimization of AVSCRs through simulation optimization  RSM as the optimization strategy  Automation of the sequential process of RSM  Response surface methodology  A procedure for fitting a series of regression models to the output variable of a simulation model and optimizing the resulting regression function  Sequential nature of RSM  Starts with a first-order regression function and the steepest ascent/descent method.  Higher-degree regression functions are employed after reaching the vicinity of the optimum.

15 Optimization of AVSCRs  First stage

16 Optimization of AVSCRs  Approximate the response surface function locally by a first-order model  Usually a fractional two-level factorial design of resolution III is used.  The number of design points is small compared to other types of two-level factorial designs.
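For a coded two-level factorial design the first-order fit reduces to simple averages, because the design columns are orthogonal. The sketch below uses a full 2^2 design (rather than a fraction, for only two factors there is little to save) with hypothetical responses.

```python
def first_order_fit(design, responses):
    """Least-squares fit of y = b0 + sum(bi * xi) on a coded (+/-1)
    two-level factorial design; orthogonality of the design columns
    reduces the normal equations to simple averages."""
    n = len(design)
    k = len(design[0])
    b0 = sum(responses) / n
    coeffs = [sum(design[r][i] * responses[r] for r in range(n)) / n
              for i in range(k)]
    return b0, coeffs

# Full 2^2 factorial in coded units around the current center point.
design = [(-1, -1), (+1, -1), (-1, +1), (+1, +1)]
# Hypothetical simulated responses at the four design points.
responses = [10.0, 6.0, 14.0, 10.0]
b0, (b1, b2) = first_order_fit(design, responses)
# Here b1 < 0 and b2 > 0, so steepest descent moves toward
# larger x1 and smaller x2.
```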

17 Optimization of AVSCRs  Test the first-order model for adequacy  Lack-of-fit test with the analysis of variance (ANOVA) table  Perform a line search in the steepest descent direction  A line search is performed from the center point of the current region of interest to find a point of improved response.  The steepest descent direction is the negative of the vector of estimated first-order coefficients, -(b1, ..., bn).  To end this type of search, a stopping rule is applied.  The most straightforward rule ends the line search when an observed value of the simulation response function is higher than the preceding observation.
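The line search with the straightforward stopping rule can be sketched directly; the response function, center point, and step size here are hypothetical.

```python
def line_search(f, center, direction, step=0.25, max_steps=40):
    """Walk from the center point along the steepest-descent direction,
    stopping as soon as an observed response is higher than the
    preceding observation (the simplest stopping rule)."""
    x = list(center)
    best = f(x)
    for _ in range(max_steps):
        cand = [xi + step * di for xi, di in zip(x, direction)]
        y = f(cand)
        if y > best:        # stopping rule: response got worse
            break
        x, best = cand, y
    return x, best

# Hypothetical response with minimum at (3, -2); the gradient at the
# origin is (-6, 4), so the steepest-descent direction is (6, -4).
f = lambda x: (x[0] - 3.0) ** 2 + (x[1] + 2.0) ** 2
x, y = line_search(f, center=[0.0, 0.0], direction=[6.0, -4.0])
```

The end point of the search becomes the center of the next region of interest, where a fresh first-order model is fitted.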

18 Optimization of AVSCRs  Resolve the inadequacy of the first-order model  Approximate the simulation response function in the region of interest by a second-order polynomial (the usual remedy)  Reduce the size of the region of interest by decreasing the step size (alternative)  Increase the simulation size used to evaluate a design point, or increase the number of replicated observations at the design points (alternative)

19 Optimization of AVSCRs  Second stage

20 Optimization of AVSCRs  Approximate the objective function in the current region of interest by a second-order model  The central composite design (CCD) is the most popular.  It can be made orthogonal by choosing a specific number of replicated observations at the center point.  Resolve the inadequacy of the second-order model  Reduce the size of the region of interest  Increase the simulation size used in evaluating a design point  In RSM it is not customary to fit a polynomial of order higher than two.
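The structure of a CCD (factorial points, axial "star" points, and replicated center points) can be generated in a few lines. The axial distance alpha = (2^k)^(1/4) shown here is the standard rotatable choice; the number of center replicates is illustrative.

```python
def central_composite_design(k, n_center=3):
    """Central composite design for k factors: 2^k factorial points,
    2k axial (star) points at distance alpha from the center, plus
    replicated center points. alpha = (2^k)**0.25 gives a rotatable
    design."""
    alpha = (2 ** k) ** 0.25
    factorial = [[(1 if (i >> j) & 1 else -1) for j in range(k)]
                 for i in range(2 ** k)]
    axial = []
    for j in range(k):
        for s in (-alpha, alpha):
            pt = [0.0] * k
            pt[j] = s
            axial.append(pt)
    center = [[0.0] * k for _ in range(n_center)]
    return factorial + axial + center

# For k = 2: 4 factorial + 4 axial + 3 center = 11 design points,
# with alpha = sqrt(2).
ccd = central_composite_design(2)
```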

21 Optimization of AVSCRs  Perform canonical analysis  Determines the location and the nature of the stationary point of the second-order model  If the stationary point is a minimum, it is accepted as the center of the next region of interest.  Otherwise the stationary point is rejected.  Perform ridge analysis  A search for a new stationary point on a given radius R such that the second-order model has a minimum at this stationary point  Accepting the stationary point depends on the result of the canonical analysis  A minimum is found: a new second-order approximation  A maximum or saddle point: a new first-order approximation
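For two factors the canonical analysis can be written out explicitly: the stationary point solves the gradient equation of the fitted quadratic, and its nature follows from the eigenvalues of the quadratic coefficient matrix. The fitted coefficients below are hypothetical.

```python
def canonical_analysis(b, B):
    """Canonical analysis of a fitted second-order model in two factors:
    y = b0 + b1*x1 + b2*x2 + B11*x1^2 + B22*x2^2 + 2*B12*x1*x2.
    The stationary point solves 2*B*x = -b; its nature follows from the
    eigenvalues of the symmetric matrix B."""
    b1, b2 = b
    (B11, B12), (_, B22) = B
    det = B11 * B22 - B12 * B12
    # x* = -0.5 * inv(B) @ b, with the 2x2 inverse written out.
    x1 = -0.5 * (B22 * b1 - B12 * b2) / det
    x2 = -0.5 * (B11 * b2 - B12 * b1) / det
    # Eigenvalues of B from the characteristic polynomial.
    tr = B11 + B22
    disc = ((B11 - B22) ** 2 + 4 * B12 * B12) ** 0.5
    lam = ((tr - disc) / 2.0, (tr + disc) / 2.0)
    if lam[0] > 0:
        nature = "minimum"        # both eigenvalues positive
    elif lam[1] < 0:
        nature = "maximum"        # both eigenvalues negative
    else:
        nature = "saddle point"   # mixed signs
    return (x1, x2), nature

# Hypothetical fitted model: y = 5 - 4*x1 + 2*x2 + x1^2 + x2^2
point, nature = canonical_analysis(b=(-4.0, 2.0), B=((1.0, 0.0), (0.0, 1.0)))
```

Per the slide's rule, a "minimum" result would make this stationary point the center of the next region of interest; a maximum or saddle point would send the procedure back to a first-order approximation.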

22 Optimization of AVSCRs  Results of first stage  Size of interest region=1.0,

23 Further Study  Obtain the optimum worth shape of the axially variable strength control rods  Enhance the operation strategy for the AVSCRs  To extract optimal performance from the AVSCRs to be developed  Applying the T_avg signal