1 STRUCTURAL AND MULTIDISCIPLINARY OPTIMIZATION GROUP
INCLUDING UNCERTAINTY MODELS FOR MORE EFFICIENT SURROGATE-BASED DESIGN OPTIMIZATION
The EGO algorithm
Raphael T. Haftka
Felipe A. C. Viana

2 BACKGROUND: SURROGATE MODELING
Surrogates replace expensive simulations in design optimization:
- Kriging (KRG)
- Polynomial response surface (PRS)
- Support vector regression (SVR)
- Radial basis neural networks (RBNN)
Example: the surrogate prediction ŷ(x) is an estimate of the true response y(x).
Forrester AIJ and Keane AJ, "Recent advances in surrogate-based optimization," Progress in Aerospace Sciences, Vol. 45, No. 1-3, 2009.
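As a minimal sketch of how such surrogates are built (scikit-learn is assumed here; the slides do not name a specific toolbox, and RBNN is omitted for brevity):

```python
# Minimal sketch: fitting several surrogate types to the same sample data
# (scikit-learn assumed; RBNN omitted for brevity).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(20, 1))                  # design of experiments
y = np.sin(6.0 * X).ravel()                              # stand-in for an expensive simulation

krg = GaussianProcessRegressor(kernel=RBF(length_scale=0.2)).fit(X, y)           # KRG
prs = make_pipeline(PolynomialFeatures(degree=3), LinearRegression()).fit(X, y)  # PRS
svr = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X, y)                          # SVR

x_new = np.array([[0.8]])
print(krg.predict(x_new), prs.predict(x_new), svr.predict(x_new))  # each is an estimate of y(x)
```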

3 BACKGROUND: UNCERTAINTY
Some surrogates also provide an uncertainty estimate: the standard error, s(x). Examples: kriging and polynomial response surface. These uncertainty estimates are what EGO uses.
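A brief sketch of this uncertainty estimate, assuming scikit-learn's GaussianProcessRegressor as the kriging implementation:

```python
# Kriging returns both a prediction y_hat(x) and a standard error s(x).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

X = np.array([[0.0], [0.25], [0.5], [0.75], [1.0]])      # sampled points
y = np.sin(6.0 * X).ravel()

krg = GaussianProcessRegressor(kernel=RBF(length_scale=0.2)).fit(X, y)
y_hat, s = krg.predict(np.array([[0.8]]), return_std=True)
print(y_hat, s)   # s(x) is near zero at the samples and grows away from them
```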

4 Efficient Global Optimization (EGO)
Outline:
- Efficient Global Optimization (EGO)
- Importing uncertainty models in order to use EGO with surrogates that lack uncertainty models
- Simple multiple-surrogate EGO to benefit from parallel computation

5 KRIGING FIT AND PRESENT BEST SOLUTION
First we sample the function and fit a kriging model. We note the present best sample (PBS).
Jones DR, Schonlau M, and Welch W, "Efficient global optimization of expensive black-box functions," Journal of Global Optimization, Vol. 13, No. 4, 1998.
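A sketch of this initialization step, with a toy one-dimensional function standing in for the expensive simulation (the function and sample sizes are illustrative, not from the slides):

```python
# Initialization: sample the function, fit kriging, record the present best sample (PBS).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def f(x):                                                # toy stand-in for the simulation
    return (6.0 * x - 2.0) ** 2 * np.sin(12.0 * x - 4.0)

X = np.linspace(0.0, 1.0, 6).reshape(-1, 1)              # initial design of experiments
y = f(X).ravel()

krg = GaussianProcessRegressor(kernel=RBF(length_scale=0.2)).fit(X, y)
y_pbs = y.min()                                          # present best sample (minimization)
print("PBS =", y_pbs)
```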

6 THE EXPECTED IMPROVEMENT QUESTION
Then we ask: of all the points where we will improve, where are we most likely to improve significantly upon the present best sample?

7 WHAT IS EXPECTED IMPROVEMENT?
Consider the point x = 0.8 and the random variable Y, which represents the possible values of the function there. Its mean is the kriging prediction, which is near zero.
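For minimization, the expected improvement has the standard closed form EI(x) = (y_PBS - yhat(x)) * Phi(u) + s(x) * phi(u), with u = (y_PBS - yhat(x)) / s(x). A small sketch with hypothetical numbers (the values below are illustrative, not from the slides):

```python
# Closed-form expected improvement for minimization:
# EI(x) = (y_pbs - y_hat) * Phi(u) + s * phi(u),  u = (y_pbs - y_hat) / s
import numpy as np
from scipy.stats import norm

def expected_improvement(y_hat, s, y_pbs):
    s = np.maximum(s, 1e-12)                 # guard against a zero standard error
    u = (y_pbs - y_hat) / s
    return (y_pbs - y_hat) * norm.cdf(u) + s * norm.pdf(u)

# Hypothetical values at x = 0.8: prediction near zero, some uncertainty, PBS below zero.
print(expected_improvement(y_hat=0.0, s=0.3, y_pbs=-0.5))
```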

8 EXPLORATION AND EXPLOITATION
EGO maximizes the expected improvement, E[I(x)], to find the next point to be sampled. Expected improvement balances exploration and exploitation because it can be high either because of high uncertainty or because of a low surrogate prediction. When can we say that the next point is "exploration"?
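A sketch of the basic EGO loop implied here, maximizing E[I(x)] over a dense candidate grid (the grid, toy function, and kernel settings are illustrative assumptions):

```python
# Basic EGO loop sketch: fit kriging, maximize EI over candidates, evaluate, repeat.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def f(x):                                                # toy expensive function
    return (6.0 * x - 2.0) ** 2 * np.sin(12.0 * x - 4.0)

def expected_improvement(y_hat, s, y_pbs):
    s = np.maximum(s, 1e-12)
    u = (y_pbs - y_hat) / s
    return (y_pbs - y_hat) * norm.cdf(u) + s * norm.pdf(u)

X = np.linspace(0.0, 1.0, 6).reshape(-1, 1)
y = f(X).ravel()
candidates = np.linspace(0.0, 1.0, 1001).reshape(-1, 1)  # stands in for a global EI search

for _ in range(10):                                      # EGO cycles
    krg = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), alpha=1e-6).fit(X, y)
    y_hat, s = krg.predict(candidates, return_std=True)
    ei = expected_improvement(y_hat, s, y.min())
    x_next = candidates[np.argmax(ei)].reshape(1, -1)    # point of maximum E[I(x)]
    X = np.vstack([X, x_next])
    y = np.append(y, f(x_next).ravel())                  # run the "simulation" at x_next

print("best sample after EGO:", y.min())
```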

9 THE BASIC EGO WORKS WITH KRIGING
Considering the root mean square error of (a) kriging and (b) support vector regression: why not run EGO with the most accurate surrogate?
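A sketch of that accuracy comparison, computing the root mean square error of kriging and SVR on a set of test points (toy function and settings assumed):

```python
# Accuracy comparison: root mean square error of kriging and SVR on test points.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF
from sklearn.svm import SVR

def f(x):
    return (6.0 * x - 2.0) ** 2 * np.sin(12.0 * x - 4.0)

X = np.linspace(0.0, 1.0, 8).reshape(-1, 1)
y = f(X).ravel()
X_test = np.random.default_rng(1).uniform(0.0, 1.0, (200, 1))
y_test = f(X_test).ravel()

for name, model in [("KRG", GaussianProcessRegressor(kernel=RBF(0.2))),
                    ("SVR", SVR(kernel="rbf", C=100.0, epsilon=0.01))]:
    model.fit(X, y)
    e_rms = np.sqrt(np.mean((model.predict(X_test) - y_test) ** 2))
    print(name, "eRMS:", e_rms)
```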

10 BUT SVR DOES NOT HAVE UNCERTAINTY
Say we have two surrogates:
- Kriging (KRG): provides an uncertainty estimate
- Support vector regression (SVR): provides NO uncertainty estimate
But we want to use SVR in EGO.
Viana FAC and Haftka RT, "Importing Uncertainty Estimates from One Surrogate to Another," 50th AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics, and Materials Conference, Palm Springs, USA, May 4-7, 2009, AIAA.

11 IMPORTATION AT A GLANCE
First, we generate the standard error, s(x), from kriging.

12 IMPORTATION AT A GLANCE
Then, we combine the prediction from support vector regression with the standard error from kriging.
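A sketch of the importation idea: expected improvement evaluated with the SVR prediction as the mean and the kriging standard error as the uncertainty (the function imported_ei and all settings are illustrative, not from the authors' code):

```python
# Importing uncertainty: EI built from the SVR prediction and the kriging standard error.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF
from sklearn.svm import SVR

def imported_ei(x, svr, krg, y_pbs):
    y_hat = svr.predict(x)                     # prediction from SVR
    _, s = krg.predict(x, return_std=True)     # standard error imported from kriging
    s = np.maximum(s, 1e-12)
    u = (y_pbs - y_hat) / s
    return (y_pbs - y_hat) * norm.cdf(u) + s * norm.pdf(u)

# Both surrogates are fit to the same data; EI then uses the mixed (prediction, s) pair.
X = np.linspace(0.0, 1.0, 8).reshape(-1, 1)
y = ((6.0 * X - 2.0) ** 2 * np.sin(12.0 * X - 4.0)).ravel()
svr = SVR(kernel="rbf", C=100.0, epsilon=0.01).fit(X, y)
krg = GaussianProcessRegressor(kernel=RBF(0.2)).fit(X, y)

candidates = np.linspace(0.0, 1.0, 1001).reshape(-1, 1)
x_next = candidates[np.argmax(imported_ei(candidates, svr, krg, y.min()))]
print("next point to sample:", x_next)
```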

13 BENEFITS OF IMPORTATION OF UNCERTAINTY
Run EGO with non-kriging models! Hartman3 function (initially fitted with 20 points): after 20 iterations (i.e., a total of 40 points), the improvement (I) over the initial best sample is compared for the different surrogates.

14 TWO OTHER DESIGNS OF EXPERIMENTS
(Figures: results for a first and a second additional design of experiments.)

15 SUMMARY OF THE HARTMAN3 EXERCISE
Box plot of the difference between the improvement offered by different surrogates (out of 100 DOEs). In 34 of the 100 DOEs, KRG outperforms RBNN (in those cases, the difference between the improvements has a mean of only 0.8%).

16 EGO WITH MULTIPLE SURROGATES
Traditional EGO uses kriging to generate one point at a time. We use multiple surrogates to obtain multiple points per cycle.
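A sketch of one multiple-surrogate cycle: each surrogate's prediction, paired with the imported kriging standard error, yields its own EI-maximizing point, so several points can be evaluated in parallel per cycle (names and settings are illustrative):

```python
# One multiple-surrogate cycle: each surrogate proposes its own EI-maximizing point.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF
from sklearn.svm import SVR

def ei(y_hat, s, y_pbs):
    s = np.maximum(s, 1e-12)
    u = (y_pbs - y_hat) / s
    return (y_pbs - y_hat) * norm.cdf(u) + s * norm.pdf(u)

X = np.linspace(0.0, 1.0, 8).reshape(-1, 1)
y = ((6.0 * X - 2.0) ** 2 * np.sin(12.0 * X - 4.0)).ravel()
candidates = np.linspace(0.0, 1.0, 1001).reshape(-1, 1)

krg = GaussianProcessRegressor(kernel=RBF(0.2)).fit(X, y)
svr = SVR(kernel="rbf", C=100.0, epsilon=0.01).fit(X, y)
_, s = krg.predict(candidates, return_std=True)          # kriging standard error, shared

points = [candidates[np.argmax(ei(m.predict(candidates), s, y.min()))] for m in (krg, svr)]
print("points to evaluate in parallel this cycle:", points)
```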

17 POTENTIAL OF EGO WITH MULTIPLE SURROGATES
Hartman3 function (100 DOEs with 20 points). Overall, the surrogates are comparable in performance.

18 POTENTIAL OF EGO WITH MULTIPLE SURROGATES
"krg" runs EGO for 20 iterations, adding one point at a time. "krg-svr" and "krg-rbnn" run 10 iterations, adding two points per iteration. Multiple surrogates offer good results in half the time!

