
Slide 1: Uncertainty in petroleum reservoirs (Classification: Internal)

Slide 2: Finding the reservoir I

Slide 3: Finding the reservoir II

The subsurface is packed with density gradients; interpreting these is a far cry from hard science. Top and base of the reservoir (I think …).

Slide 4: Geological properties

Exploration well: we try to infer properties on a km scale from point measurements.

Slide 5: Porosity and permeability

[Figure: panels contrasting high vs. low porosity and high vs. low permeability.]

Slide 6: OK – what is inside this reservoir?

Internal barriers? Interface depth?

Slide 7: Fluid properties

[Figure: water-wet reservoir vs. oil-wet reservoir.]

Slide 8: Uncertain factors

– The geometry of the reservoir, including internal compartmentalization.
– The spatial distribution of porosity and permeability.
– Depth of fluid interfaces.
– Fluid and fluid–reservoir properties.
– …

Slide 9: What to do with it?

1. Deterministic models: attempts at modelling and quantifying uncertainty are certainly made, but mainly in the form of variable (stochastic) input, not stochastic dynamics.
2. Before production: a range of input values is tried out, and the future production is simulated. These simulations are an important basis for investment decisions.
3. After production start: once the field is producing, we have measured values of e.g. produced rates of oil, gas and water. These can be compared with the simulated predictions, a misfit can be evaluated, and the models updated.

Slide 10: History matching (or revisionism)

1. Select a set of true observations you want to reproduce in your simulations.
2. Select a (limited) set of parameters to update.
3. Update your parameters as best you can.
4. Simulate your model and compare the simulated results with the observations.
5. Is the discrepancy below tolerance? If not, return to step 3.
6. You have an updated model.
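The loop above can be sketched in code. This is a minimal toy, not a real workflow: `simulate` is a hypothetical quadratic stand-in for the reservoir simulator, and the "update as best you can" step is a naive greedy random search.

```python
import random

def simulate(m):
    # Hypothetical stand-in for the reservoir simulator:
    # a toy model whose "observations" depend quadratically on m.
    return [m[0] ** 2, m[0] * m[1], m[1] ** 2]

def misfit(simulated, observed):
    # Sum of squared discrepancies between simulation and observations.
    return sum((s - d) ** 2 for s, d in zip(simulated, observed))

def history_match(observed, m, tol=1e-3, max_iter=10000, seed=0):
    rng = random.Random(seed)
    best = misfit(simulate(m), observed)
    for _ in range(max_iter):
        if best < tol:                       # step 5: discrepancy below tolerance?
            break                            # step 6: yes -> updated model
        # step 3: update the parameters (here: a random perturbation)
        trial = [mi + rng.gauss(0.0, 0.1) for mi in m]
        # step 4: simulate and compare with the observations
        trial_misfit = misfit(simulate(trial), observed)
        if trial_misfit < best:              # keep only improvements
            m, best = trial, trial_misfit
    return m, best

observed = simulate([1.0, 2.0])              # synthetic "true observations" (step 1)
m, best = history_match(observed, [0.5, 1.5])
```

A real history-matching loop replaces `simulate` with a full (expensive) reservoir simulation and the random search with a far more careful update strategy.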

Slide 11: History matching – it is just plain stupid

Traditionally, history matching is perceived as an optimization problem – a very problematic approach:
– The problem is highly nonlinear and severely underdetermined.
– The observations we are comparing with can be highly uncertain.
– The choice of parameterization is somewhat arbitrary – we will optimize in the wrong space anyway.

Slide 12: A probabilistic problem – Bayesian setting

{m}: model parameters, {d}: observed data. Bayes' rule relates them:

    P(m|d) ∝ P(d|m) P(m)    (posterior ∝ likelihood × prior)

Slide 13: The objective function

Gaussian likelihood:

    P(d|m) ∝ exp( -(S(m) - d)^T C^{-1} (S(m) - d) )

where S(m) is the result from the simulator, d is the observed data, and C is the covariance of the measurement errors. Evaluating S(m) requires running the simulator and is very costly.
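The quadratic form in the exponent is cheap to evaluate once the simulator output is in hand. A minimal sketch, with made-up toy numbers standing in for real rates (in practice, computing `S_m` is the expensive part):

```python
import numpy as np

def log_misfit(S_m, d, C):
    """Quadratic misfit (S(m) - d)^T C^{-1} (S(m) - d).

    S_m : simulated responses S(m) (here a precomputed vector;
          in practice this is one full reservoir-simulator run)
    d   : observed data
    C   : covariance matrix of the measurement errors
    """
    r = S_m - d
    return r @ np.linalg.solve(C, r)

# Toy numbers: three observed rates with independent measurement errors.
d = np.array([100.0, 55.0, 20.0])       # observed data
S_m = np.array([103.0, 50.0, 22.0])     # simulator output for some m
C = np.diag([4.0, 25.0, 1.0])           # measurement-error covariance

O = log_misfit(S_m, d, C)               # objective value: 9/4 + 25/25 + 4/1 = 7.25
```

The likelihood is then `exp(-O)`; large misfits make the model exponentially less probable.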

Slide 14: How to find the posterior?

– EnKF: a data assimilation technique based on resampling of a finite ensemble in a Gaussian approximation. Gives good results when the Gaussian approximation applies, and fails spectacularly when it does not.
– BASRA (MCMC with proxy functions): a flexible and fully general approach. Guaranteed to converge to the correct posterior, but the convergence rate can be slow.

Slide 15: Kalman filter

Kalman filter: a technique for sequential state estimation based on combining measurements with a linear equation of motion. A very simple scalar example (identity observation):

    State estimate:        x_u = x_f + K (z - x_f)
    (Co)variance estimate: P_u = (1 - K) P_f,  with gain K = P_f / (P_f + R)

where x_f is the forecast, z the measurement, P_f the forecast error variance and R the measurement error variance.
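With toy numbers, the standard scalar update (forecast and measurement weighted by their error variances) looks like this:

```python
def kalman_update(x_f, P_f, z, R):
    """One scalar Kalman update with an identity observation operator.

    x_f : forecast state       P_f : forecast error variance
    z   : measurement          R   : measurement error variance
    """
    K = P_f / (P_f + R)           # Kalman gain: weight given to the measurement
    x_u = x_f + K * (z - x_f)     # updated state estimate
    P_u = (1.0 - K) * P_f         # updated (co)variance estimate
    return x_u, P_u

# Forecast 10.0 with variance 4.0; measurement 12.0 with variance 1.0.
# Gain K = 4/5 = 0.8, so the (more accurate) measurement dominates:
x_u, P_u = kalman_update(10.0, 4.0, 12.0, 1.0)
# x_u = 11.6, P_u = 0.8
```

Note that the updated variance (0.8) is smaller than both input variances: combining the two sources of information always reduces uncertainty.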

Slide 16: EnKF

When the equation of motion is nonlinear, predicting the state covariance becomes difficult. The EnKF approach is to let an ensemble (i.e. a sample) evolve with the equation of motion, and to use the sample covariance as a plug-in estimator for the state covariance. This assumes:
– A Gaussian likelihood.
– A Gaussian prior.
– A combined parameter and state estimation problem.
– The updated state is a linear combination of the prior states.
Computationally efficient – but limiting.

Slide 17: EnKF – linear combination

[Figure: an ensemble of realizations, each holding Permeability, Porosity, Relperm and MULTFLT parameters, is integrated forward in time to the observation.]

EnKF update: the analysed ensemble is a linear combination of the forecast ensemble,

    A^a = A^f X
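A minimal numerical sketch of one EnKF analysis step, using the standard stochastic (perturbed-observations) formulation on a made-up two-variable state; this is a generic textbook version, not the implementation behind the talk:

```python
import numpy as np

rng = np.random.default_rng(42)

def enkf_update(A, d, H, R_std):
    """Stochastic EnKF analysis step (perturbed observations).

    A     : (n_state, n_ens) forecast ensemble matrix A^f
    d     : (n_obs,) observed data
    H     : (n_obs, n_state) linear observation operator
    R_std : std of the (independent) measurement errors
    """
    n_obs, n_ens = len(d), A.shape[1]
    # Perturb the observations: one noisy copy per ensemble member.
    D = d[:, None] + R_std * rng.standard_normal((n_obs, n_ens))
    HA = H @ A
    # Sample covariances from ensemble anomalies (the plug-in estimator).
    Ap = A - A.mean(axis=1, keepdims=True)
    HAp = HA - HA.mean(axis=1, keepdims=True)
    P_dd = HAp @ HAp.T / (n_ens - 1) + R_std**2 * np.eye(n_obs)
    P_ad = Ap @ HAp.T / (n_ens - 1)
    # Kalman-gain update; each analysed member ends up as a linear
    # combination of the forecast members (A^a = A^f X).
    return A + P_ad @ np.linalg.solve(P_dd, D - HA)

# Toy problem: 2 state variables, 50 members, we observe the first variable.
A = rng.standard_normal((2, 50)) + np.array([[5.0], [1.0]])
H = np.array([[1.0, 0.0]])
A_a = enkf_update(A, np.array([6.0]), H, R_std=0.5)
# The ensemble mean of the observed variable is pulled from ~5 toward 6.
```

Because the update only adds combinations of the ensemble anomalies, the analysed members stay in the span of the forecast members, which is exactly the "linear combination" limitation noted on slide 16.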

Slide 18: EnKF update: sequential

The EnKF method updates the models every time data become available. When new data arrive, we can continue without going back.

[Plot: WOPR vs. time, showing the last historical data point and the future prediction.]

Slide 19: BASRA workflow

1. Select a limited set (≲ 50) of parameters {m} to update, with an accompanying prior.
2. Perturb the parameter set, {m} → {m} + δ{m}, and evaluate a new misfit O({m}).
3. Accept the new state with probability P = min{1, exp(-δO({m}))}.
4. When this has converged, we have one realization {m} from the posterior, which can be used for uncertainty studies; repeat to get an ensemble of realizations.

The evaluation of the misfit is prohibitively expensive, and advanced proxy modelling is essential.
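The accept/reject steps above are a Metropolis sampler. A generic sketch of that loop (not the BASRA implementation itself): here `O` is a cheap toy quadratic standing in for the proxy-assisted simulator misfit, which in reality dominates the cost.

```python
import math
import random

rng = random.Random(1)

def O(m):
    # Toy misfit; in BASRA this would be the (proxy-assisted) simulator
    # misfit, far too expensive to call this freely. Its implied posterior
    # exp(-O) is Gaussian around m = (1.0, -0.5).
    return (m[0] - 1.0) ** 2 + (m[1] + 0.5) ** 2

def metropolis(m, n_steps=20000, step=0.3):
    samples = []
    o = O(m)
    for _ in range(n_steps):
        # Step 2: perturb, {m} -> {m} + delta{m}
        trial = [mi + rng.gauss(0.0, step) for mi in m]
        o_trial = O(trial)
        # Step 3: accept with probability min{1, exp(-(O_trial - O))}
        if o_trial < o or rng.random() < math.exp(o - o_trial):
            m, o = trial, o_trial
        samples.append(list(m))
    return samples

samples = metropolis([0.0, 0.0])
burn = samples[len(samples) // 2:]          # discard the first half as burn-in
mean0 = sum(s[0] for s in burn) / len(burn)
# mean0 ≈ 1.0: the chain recovers the posterior mean of the first parameter.
```

After convergence, the retained states are draws from the posterior exp(-O({m})); each one is a realization usable for the uncertainty studies in step 4.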

Slide 20: BASRA results

[Figures: convergence of the proxies; marginal posteriors (prior vs. posterior); the posterior ensemble.]

Slide 21: Current trends

– Reservoir modelling usually involves a chain of weakly coupled models and applications – strive hard to update parameters early in the chain.
– Update of slightly more exotic variables, like surface shapes and the direction of channels.
– The choice of parameterization is somewhat arbitrary – we will optimize in the wrong space anyway. A more systematic approach to choosing the parameterization would be very valuable.
