Ensemble Kalman Filtering in Distance-Kernel Space


Kwangwon Park

Abstract

Current geostatistical simulation methods allow generating a realization that honors available data, such as hard and secondary data, under a given geological scenario. However, it is difficult to simulate large models that honor highly nonlinear response functions. The objective of this study is to generate multiple realizations, all of which honor all available data. First, we generate a large ensemble of possible realizations describing the spatial uncertainty for given hard data. Second, using multidimensional scaling, we map these models into a low-dimensional space by defining a proper distance between the prior models. Next, the kernel Karhunen-Loève expansion is applied to the realizations mapped in this metric space to parameterize them into short standard normal random vectors. We apply ensemble Kalman filtering to these parameterizations to update multiple realizations matching the same data. A back-transformation (the pre-image problem) allows the generation of multiple geostatistical models that match all data, hard and nonlinear response. The proposed framework has been successfully applied to generate multiple Gaussian models which honor hard data and dynamic response data. A dissimilarity distance that is highly correlated with the difference in dynamic data, combined with multidimensional scaling, provides a powerful tool for analyzing the path of the optimization and the probability density of the inverse problem. Additionally, EnKF is sped up by reducing the ensemble size through kernel k-means clustering without any significant loss of ensemble statistics.

1 Introduction

Conditioning

Many geoscientists and petroleum engineers have attempted to build geological models for predicting reservoir performance and building development plans. One of the main issues when modeling a reservoir is conditioning.
A model should be consistent with all currently available data: geologic processes, outcrop, log, core, seismic surveys, production history, etc. Most of these data are static, except for production history and 4-D seismic.

Presently, geostatistical algorithms are widely used to generate a model conditional to static data. Both two-point and multiple-point algorithms can handle hard and soft data under a given geological scenario. Other methods, such as object-based or process-based methods, still have limitations in conditioning to those data.

Optimization

Conditioning to dynamic data (usually called history matching) using geostatistical simulations, however, is difficult because of the severe nonlinearity of the problem. Hence, we often regard it as an optimization problem which minimizes the difference between the observed data and the calculated output, the so-called objective function. Compared with conventional optimization problems, this problem poses several tough challenges: ill-posedness, sparsity of data, a large number of model parameters, time-consuming forward simulation, and so forth. Various approaches to overcome these limitations have been proposed; they fall into two main categories: gradient-based and stochastic methods. Gradient-based optimization methods, usually relying on sensitivity coefficients, are widely used because of their fast convergence. Although these approaches are well defined, the fluctuating nature of the objective function leads to non-optimal solutions, such as a local minimum. In the presence of local minima, stochastic optimization techniques, such as simulated annealing (SA) and genetic algorithms, are better suited to reach the optimal solution. Although those methods can find a solution conditioned to dynamic data, they may not preserve the static data conditioning imposed by geostatistical algorithms. The Probability Perturbation Method (PPM; Caers, 2003) and the Gradual Deformation Method (GDM; Hu, 2000) make it possible to update a model stochastically to condition to dynamic data as well as static data and geologic constraints.
Both methods update a current model by combining it with a new possible model through a one-parameter optimization. The difference is that PPM deals with the probabilities and GDM with the random numbers used in geostatistical simulations (Caers, 2007). While PPM and GDM can provide a model conditional to static and dynamic data with geologic constraints, they need a relatively large number of forward simulations, each of which usually takes several hours to days. Moreover, they yield only one inverse solution per optimization; hence a large number of optimizations with different initial realizations is required to obtain multiple conditional realizations. Due to the uncertainty and nonuniqueness of the problem, it is necessary to generate multiple conditional realizations.

Ensemble Kalman Filtering (EnKF)

In order to obtain multiple models, Ensemble Kalman Filtering (EnKF) has recently been applied and researched very actively. The Kalman Filter (KF) is a technique to obtain the specification of a linear dynamic system which accomplishes the prediction, separation, or detection of a random signal (Kalman, 1960). While KF deals with a linear stochastic difference equation, the Extended Kalman Filter (EKF) can be applied when the relationship between the process and the measurements is nonlinear (Welch and Bishop, 2004). However, EKF is not applicable to highly nonlinear systems. Evensen (1994) developed a modified Kalman filter, EnKF, for highly nonlinear problems. Due to its high performance and broad applicability, EnKF has rapidly spread and has been effectively applied to a variety of fields, such as ocean dynamics, meteorology, and hydrogeology (Houtekamer and Mitchell, 1998; Reichle et al., 2002; Margulis et al., 2002; Evensen, 2003; Evensen, 2004). Nævdal and Vefring (2002) brought EnKF into the history matching problem. In their earlier studies, EnKF was applied to characterize a well-bore reservoir. Since then, EnKF has been utilized to identify the detailed permeability distribution of an entire reservoir (Nævdal et al., 2003). Gu and Oliver (2004) updated the permeability and porosity of a full 3-dimensional (3-D) reservoir simultaneously in the PUNQ-S3 reservoir, a realistic synthetic reservoir used to verify the performance of history matching and uncertainty analysis. Gao et al. (2005) compared the randomized maximum likelihood method with EnKF. Liu and Oliver (2005) carried out EnKF with consideration of geologic facies. Park et al. (2005) demonstrated the superior performance of EnKF in aquifer parameter identification compared to SA and GDM. Park et al. (2006) also verified the applicability of EnKF to a waterflooded reservoir with the methods of regeneration and selective use of the measurements.
As history matching through EnKF showed good performance and numerous advantages, it has been actively researched in both academia and the petroleum industry (Zhang et al., 2005; Zafari and Reynolds, 2005; Skjervheim et al., 2005; Lorentzen et al., 2005). The strong points of EnKF are: (1) It makes it possible to update a model efficiently and elaborately, since EnKF needs exactly one forward simulation per ensemble member, as opposed to the hundreds to thousands of forward simulations required by conventional methods. (2) Since EnKF provides optimal output from noisy input, we can avoid biasing the solution with measurement errors. (3) It handles any kind of reservoir property. (4) It utilizes all available kinds of measurement data simultaneously. (5) It is easily coupled to various forward simulators without additional calculation or modification of the equations. However, the limitations of EnKF are: (1) It does not preserve geologic information, since it deals with Gaussian models only. Therefore, EnKF cannot handle facies models, such as channel reservoirs, which are often generated by multiple-point geostatistics (MPS). (2) It often provides physically unreasonable values, such as pressure and saturation inconsistent with the permeability, or saturation values greater than 1.0 or smaller than 0.0. (3) It requires an ensemble size large enough to represent the uncertainty, and a large ensemble needs a large number of forward simulations.

KL expansion

Recently, Sarma (2006) developed a novel method to parameterize a geologic realization. In this method, a realization is parameterized by a relatively short Gaussian random vector through the Karhunen-Loève (KL) expansion of the empirical covariance of an ensemble of geologic realizations. The KL expansion makes it possible to generate new realizations which share the covariance of the ensemble. In addition, in order to maintain higher-order moments as well as the covariance, Sarma introduced a kernel into the parameterization. The parameterization is therefore done in a very high-dimensional kernel feature space, and a realization is obtained by solving a pre-image problem. The pre-image problem is yet another optimization problem, converting a point in feature space back to realization space. Consequently, a new realization is obtained by a nonlinear combination of an ensemble of realizations. Sarma optimized these relatively short parameters and found a solution conditioned to dynamic data by gradient-based optimization algorithms. A kernel can be understood as a similarity measure, because a kernel is a dot product of feature vectors, and similar feature vectors yield a large dot product (Schölkopf and Smola, 2002). If the similarity from a kernel function can represent the similarity of dynamic data, optimization becomes easier and more efficient. However, the polynomial kernels used in Sarma (2007) are not well correlated with the difference in dynamic data, as will be shown later. Better and more relevant similarity measures are required for reservoir applications.

Distance

While a kernel is a similarity measure, a distance can be regarded as a dissimilarity measure. As a dissimilarity measure, Suzuki and Caers (2006) introduced the Hausdorff distance.
They showed how a (static) distance between any two realizations that correlates with their difference in flow response can be used to search for history-matched models by means of efficient search algorithms, such as the neighborhood algorithm and tree search. This method was successfully applied to structurally complex reservoirs (Suzuki et al., 2006). In order for this method to work, the dissimilarity distance between any two realizations should be reasonably correlated with the difference in dynamic data. To this end, Park and Caers (2007) proposed a connectivity distance which correlates well with the difference in dynamic data between two realizations. Scheidt and Caers (2007) showed how a distance that is well correlated with dynamic data can be applied to uncertainty analysis in kernel feature space. Although we only know the distances between realizations in a distance space, if we use a radial basis function (RBF) kernel of the Euclidean distances calculated by multidimensional scaling of the distance measure, we can calculate a kernel function. Scheidt (2008) also shows that the RBF kernel based on a similarity distance can be applied to the parameterization based on the kernel KL expansion.

The objective of this research is to overcome the limitations of conventional EnKF by implementing it in distance-based kernel space. Further, this research shows how EnKF in distance-based kernel space generates multiple geologic realizations, all of which honor all available data. In the following chapter, the theoretical methodology is explained. Then a simple example is presented and discussed. This report focuses on ensemble Kalman filtering in distance-based kernel space. For details about the other procedures (distance, parameterization, kernel, and pre-image problem), refer to the Appendix or the report of Caers (2008).

2 Methodology

2.1 Kalman Filter

Prior to explaining EnKF, the Kalman filter is briefly introduced. KF consists of a set of mathematical equations that provides an efficient computational (recursive) means to estimate the state of a process, in a way that minimizes the mean of the squared estimation error (Welch and Bishop, 2004). KF considers the uncertainties in the model and simultaneously provides optimal model estimates from noisy measurements. Each time measurements are taken during the prediction process, KF carries out a correction. Through repetition of the prediction and correction processes, KF makes the state converge to a value near the truth. KF addresses the problem of estimating the state of a linear stochastic difference equation (Equation 1) with a measurement (Equation 2). In this system, we obtain measurements through a linear combination of the state vector, corrupted by measurement noise.

$$z_t = A z_{t-1} + B u_{t-1} + w_{t-1} \quad (1)$$
$$d_t = H_t z_t + v_t \quad (2)$$

where $z$ represents the state vector, i.e.
the state of the system, which cannot be measured directly (for example, gridblock pressure and saturation), $u$ the control input (for example, the boundary conditions of the system), and $w$ the process noise (for example, the error of the forward simulator, which is $A$ in KF). The subscript $t$ represents the time step. $d$ denotes the measurement vector (for instance, bottom-hole pressure), and $v$ the measurement noise (for instance, the noise when measuring the bottom-hole pressure). Although we may measure the same property several times, the measurements are not exactly identical because of this noise. $H$ is called the measurement matrix operator (for example, if we measure permeability at some gridblocks, $H$ is a matrix of 0s and 1s such that the permeability values at those gridblocks are extracted from the state vector by $d = Hz$).
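As a small illustration of the measurement operator described above, the sketch below builds a 0/1 selection matrix $H$ and applies $d = Hz$; the state values and measured gridblocks are made-up toy numbers, not data from this report.

```python
import numpy as np

# Hypothetical 4-gridblock state vector of permeabilities (illustrative values).
z = np.array([120.0, 150.0, 90.0, 150.0])

# Suppose permeability is measured at gridblocks 1 and 3 (0-based indices):
# H is then a 0/1 selection matrix with one row per measurement.
H = np.array([[0, 1, 0, 0],
              [0, 0, 0, 1]], dtype=float)

d = H @ z  # noise-free measurements: picks out z[1] and z[3]
print(d)   # [150. 150.]
```

In practice a noise term $v$ would be added to $d$, as in Equation 2.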

The estimation error is defined by Equations 3 and 4:

$$e^- = z - \hat{z}^- \quad (3)$$
$$e = z - \hat{z} \quad (4)$$

where $e$ is the estimation error. The superscript $-$ indicates an a priori quantity, and no superscript denotes an a posteriori quantity. A priori here means before assimilating the measurements, and a posteriori after assimilating them. The hat denotes an estimated state and no hat the true state. Four error covariances are defined by Equations 5 to 8:

$$Q = E[w w^T] \quad (5)$$
$$R = E[v v^T] \quad (6)$$
$$P^- = E[e^- e^{-T}] \quad (7)$$
$$P = E[e e^T] \quad (8)$$

where $Q$ is the model error covariance, $R$ the measurement noise covariance, and $P$ the estimation error covariance. $E[\cdot]$ represents the expectation operator. In deriving the KF equations, we begin with the goal of finding an equation that computes an a posteriori state estimate as a linear combination of the a priori estimate and a weighted difference between the actual measurement and the prediction, as shown in Equation 9, the assimilation equation:

$$\hat{z}_t = \hat{z}_t^- + G_t \left( d_t - H \hat{z}_t^- \right) \quad (9)$$

where $G$ is the Kalman gain, which is determined so as to minimize the estimation error covariance. Substituting Equations 4 and 9 into Equation 8 gives Equation 10:

$$P_t = E\left[ \left\{ z_t - \hat{z}_t^- - G_t \left( d_t - H \hat{z}_t^- \right) \right\} \left\{ z_t - \hat{z}_t^- - G_t \left( d_t - H \hat{z}_t^- \right) \right\}^T \right] \quad (10)$$

Expanding Equation 10 and assuming that the measurement errors are independent of the estimation error (Equation 11), we obtain Equation 12:

$$E[v_t e_t^{-T}] = E[e_t^- v_t^T] = 0 \quad (11)$$
$$P_t = (I - G_t H) P_t^- (I - G_t H)^T + G_t R G_t^T \quad (12)$$

Differentiating Equation 12 and solving for the Kalman gain that minimizes the estimation error covariance, we finally deduce the Kalman gain equation, Equation 13. Additionally, the a posteriori estimation error covariance is calculated directly from the a priori estimation error covariance (Equation 14) and vice versa (Equation 15):

$$G_t = P_t^- H^T \left( H P_t^- H^T + R \right)^{-1} \quad (13)$$
$$P_t = (I - G_t H) P_t^- \quad (14)$$
$$P_t^- = A P_{t-1} A^T + Q \quad (15)$$
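One predict/correct cycle of the linear Kalman filter (Equations 1, 9, and 13 to 15) can be sketched in a few lines. This is a generic sketch with illustrative matrices, not the reservoir system of this report.

```python
import numpy as np

def kalman_step(z_hat, P, A, B, u, Q, H, R, d):
    """One predict/correct cycle of the linear Kalman filter.
    A generic sketch; all matrices are supplied by the caller."""
    # Forecast (time update): mean propagation (Eq. 1, noise-free part)
    # and a priori covariance (Eq. 15).
    z_prior = A @ z_hat + B @ u
    P_prior = A @ P @ A.T + Q
    # Assimilation (measurement update): Kalman gain (Eq. 13),
    # state correction (Eq. 9), and a posteriori covariance (Eq. 14).
    G = P_prior @ H.T @ np.linalg.inv(H @ P_prior @ H.T + R)
    z_post = z_prior + G @ (d - H @ z_prior)
    P_post = (np.eye(len(z_hat)) - G @ H) @ P_prior
    return z_post, P_post
```

For a 2-state random walk with one noisy observation of the first state, the posterior variance of the observed component drops below its prior value, while the unobserved component keeps its prior variance.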

Equations 1 and 15 represent the forecast (prediction) step, or time update, and Equations 13 and 14 represent the assimilation (correction) step, or measurement update. After each time and measurement update, the process is repeated, with the previous a posteriori estimate used to predict the new a priori estimate. This recursive nature is one of the very appealing features of the KF (Welch and Bishop, 2004).

2.2 Ensemble Kalman Filter

The EnKF method consists of two steps: the forecast step and the assimilation step. In history matching, the forecast step is the reservoir simulation step from the current state to the next measurement time. The assimilation step is a correction step to honor the measurements at that time. EnKF is a recursive data-processing algorithm that updates all the variables simultaneously through repetition of the forecast and assimilation steps. A nonlinear difference equation that calculates the state at time step $t$ from that at time step $t-1$ is represented by Equation 16:

$$z_t = f(z_{t-1}, u_{t-1}) \quad (16)$$

where the operator $f(\cdot)$ denotes the nonlinear difference equation. In this case, the state vector consists of the parameterized permeability, pressure, and water saturation at each gridblock, together with the dynamic data, as in Equation 17:

$$z = \begin{bmatrix} y_k \\ y_p \\ y_{S_w} \\ d \end{bmatrix} \quad (17)$$

where $y_k$, $y_p$, and $y_{S_w}$ represent the parameterized permeability, pressure, and water saturation vectors of each gridblock in kernel feature space. In the simplest parameterization, each gridblock's permeability, pressure, and water saturation are themselves the parameters. Equation 18 shows the measurement at time step $t$, which contains measurement noise. The measurement noise is assumed to be white noise, and the measurement matrix operator is composed of 0s and 1s. The measurement noise is generated by Equation 19:

$$d_t = H z_t + v_t \quad (18)$$
$$v_t = \bar{v}_t + \delta_t \quad (19)$$

where the bar denotes the mean of the measurement noise and $\delta$ represents the white noise.
The measurement error covariance is calculated by Equation 20. If we assume that the measurement error of one property is independent of the error in measuring another property at the same location (for example, the porosity and the permeability at the same location), the measurement error covariance is a block-diagonal matrix. If we further assume that the measurement error at one location is independent of that at any other location, even for the same property, the measurement error covariance becomes a diagonal matrix; in other words, it reduces to the measurement error variances.

$$R = E[v v^T] = \overline{v v^T} \quad (20)$$

The aim of the assimilation step is to minimize the estimation error covariance. The estimation error and the estimation error covariance are defined by Equations 21 to 24:

$$e^- = z - \hat{z}^- \quad (21)$$
$$e = z - \hat{z} \quad (22)$$
$$P^- = E[e^- e^{-T}] = \frac{1}{N_R} \sum_{j=1}^{N_R} e_j^- e_j^{-T} \quad (23)$$
$$P = E[e e^T] = \frac{1}{N_R} \sum_{j=1}^{N_R} e_j e_j^T \quad (24)$$

where $N_R$ is the ensemble size. That is to say, the assimilation step updates an a priori state to an a posteriori state such that the estimation error covariance is minimized. In EnKF, the true state is approximated by the mean of the ensemble members. Note that the ensemble should be large enough to represent the underlying uncertainty. The state vector that minimizes the estimation error covariance is obtained from Equations 25 and 26:

$$\hat{z}_t = \hat{z}_t^- + G_t \left( d_t - H \hat{z}_t^- \right) \quad (25)$$
$$G_t = P_t^- H^T \left( H P_t^- H^T + R \right)^{-1} \quad (26)$$

Once we obtain an a priori estimate through forward simulation, we can acquire the a posteriori estimate after some basic matrix calculations. Assimilation can be conducted whenever measurements are available. Summarizing the EnKF process (Figure 1): first, we generate an initial ensemble based on the initial measurements at time $t_0$. Second, we conduct the time update through a reservoir simulator until the next production data are acquired at time $t_1$: this is the prediction step. When the next measurements become available, we carry out the measurement update by calculating the Kalman gain: this is the assimilation step.
From the corrected state, we conduct the prediction step again until the next measurements are obtained at time $t_2$. Likewise, whenever we get measurements, we execute a correction. The update proceeds through these iterative prediction and correction steps.
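A minimal EnKF assimilation step, with the covariances estimated from ensemble anomalies as in Equations 23 to 26, can be sketched as below. This is a generic sketch, not the report's implementation: the $1/N_R$ normalization follows Equations 23 and 24 (the $1/(N_R-1)$ sample estimate is also common), and the perturbed-observation scheme is one standard implementation choice. In this report, the columns of `Z` would hold the kernel-space parameterizations $y_k$, $y_p$, $y_{S_w}$ and the dynamic data $d$.

```python
import numpy as np

def enkf_assimilate(Z, H, d_obs, R, rng):
    """EnKF measurement update on an ensemble Z of shape (n_state, N_R).
    Sketch: anomaly-based covariance estimates (Eqs. 23-24, 1/N_R form),
    Kalman gain (Eq. 26), perturbed-observation update (Eq. 25)."""
    n_state, N_R = Z.shape
    An = Z - Z.mean(axis=1, keepdims=True)     # anomaly columns e_j
    HA = H @ An
    P_HT = An @ HA.T / N_R                     # P^- H^T without forming P^-
    S = HA @ HA.T / N_R + R                    # H P^- H^T + R
    G = P_HT @ np.linalg.inv(S)                # Kalman gain, Eq. 26
    # Perturb the observation for each member so the posterior keeps spread.
    D = d_obs[:, None] + rng.multivariate_normal(np.zeros(len(d_obs)), R, N_R).T
    return Z + G @ (D - H @ Z)                 # Eq. 25, applied member-wise
```

Forming $P^- H^T$ and $H P^- H^T$ directly from the anomalies avoids ever building the full $n_{state} \times n_{state}$ covariance matrix, which matters when the state is large.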

3 The proposed workflow

Based on the theory stated above, the proposed procedure for conditioning an ensemble to dynamic data under realistic geologic scenarios is as follows (Figure 2):

1. Generate the initial ensemble (realization space). First we generate an initial ensemble. The initial ensemble should include realizations that honor the geologic information and are conditioned to all available static data, that is, hard and soft data. To do this, we can choose a proper geostatistical algorithm, such as SGSIM, SISIM, DSSIM, or SNESIM and FILTERSIM if using a training image. In generating the ensemble, we may have to consider the uncertainty in the static data. For example, if our geologic information is uncertain, we can use multiple training images or variogram models.

2. Calculate the dissimilarity distances (distance space to metric space). From the initial ensemble, we calculate the dissimilarity distances and construct a distance matrix. At this step, it is important for the distances to be correlated with the dynamic data that we want to condition to. If needed, we can apply multidimensional scaling to lower the dimension and obtain Euclidean distances, which make it possible to use RBF kernels. The dissimilarity distance employed in this research is explained in the Appendix.

3. Calculate the kernel matrix (to feature space). Based on the Euclidean distances, we calculate the kernel matrix. An RBF kernel matrix is easily calculated, but a proper kernel should be chosen cautiously.

4. Parameterize the initial ensemble (to parameterization space). After obtaining the eigenvalues and eigenvectors of the kernel matrix, each realization of the initial ensemble is parameterized into relatively short Gaussian random variables. The parameterization is in fact obtained directly from the eigenvectors of the kernel matrix (see the Appendix).

5. Ensemble Kalman filtering (in parameterization space). The ensemble Kalman filtering is done on the parameterization of the initial ensemble. Since the parameterizations are low-dimensional Gaussian random vectors, various optimization methods could be applied, such as gradient-based methods using sensitivity coefficients, the probability perturbation method, the gradual deformation method, or the ensemble Kalman filter. Since we already have an ensemble, EnKF can be applied effectively and provides multiple realizations which show the same dynamic data response.

6. Solve the pre-image problems (to realization space). Now, the optimized parameterizations are converted back into realization space. Using a proper minimization algorithm, such as a fixed-point iteration, we solve the pre-image problem for each optimized parameterization. The pre-image problem is discussed in another report (Caers, 2008).

7. Analyze the multiple realizations. Finally, we obtain multiple realizations which satisfy all available data and geologic scenarios. We can use these multiple realizations for a variety of purposes. Since we generate an initial ensemble reflecting the uncertainty after conditioning to the static data acquired so far, these final multiple realizations represent the a posteriori uncertainty after conditioning to static and dynamic data.

4 Example

4.1 Given information

Geometry and geology. The target reservoir is 310 ft × 310 ft × 10 ft. The permeability distribution is modeled with an anisotropic spherical variogram (NE50 direction; correlation lengths of 200 ft in the major direction and 50 ft in the minor direction). It is log-normally distributed, and the mean and standard deviation of log-permeability are 4.0 and 1.0, respectively.

Hard data. Two wells are available in a quarter-five-spot pattern. We are given the permeability values at the two wells as hard data. The injector and producer are located at (45 ft, 45 ft) and (275 ft, 275 ft), respectively. The permeability at both wells is 150 md.

Dynamic data. For 3 years, watercut is measured at the producer every two months. The measured watercut values contain Gaussian noise (noise level: 10%). Figure 3 shows the watercut data acquired.

4.2 Ensemble Generation

Based on the given geometry, the reservoir is discretized into 961 (31 × 31) gridblocks (10 ft × 10 ft × 10 ft). Based on the given geology and hard data, 1,000 realizations are generated by SGSIM with conditioning to the hard data (permeability values at the two well locations). Figure 4 shows the log-permeability of 6 out of the 1,000 realizations. Figure 5 represents the mean and conditional variance of the log-permeability of the 1,000 initial realizations (initial ensemble). Figure 6 depicts the simulated watercut curves for the 1,000 initial realizations together with the measured watercut values. Conditioning only to hard data provides significantly biased realizations in terms of flow response: the measured watercut values lie at almost the p90 of the initial realizations. The watercut curves of the initial realizations are the a priori flow responses, which will be useful later for comparison with the a posteriori flow responses.

4.3 Distance calculation and multidimensional scaling

Dissimilarity distances between all possible pairs among the 1,000 initial realizations have been calculated. In this example, the connectivity distance has been used (see Appendix). Figure 7 shows the scatter plot of the dissimilarity distance versus the difference in actual dynamic data. The high linear correlation means that the distance can represent the difference in dynamic data without conducting any time-consuming forward simulation: computing all one million (1,000 × 1,000) distances is one to 100 times faster than a single forward simulation. Figure 8 displays the 1,000 initial realizations in 2D MDS space. Two groups of realizations (a narrow line on the left and a wide plume on the right) are easily seen in MDS space. Figure 9 demonstrates the nature of the two groups: in the first group (right-side wide plume), the injector is connected to the producer by a high-permeability region (hot spot); in the second group (left-side narrow line), the two wells are disconnected.
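The mapping from a dissimilarity-distance matrix to Euclidean MDS coordinates, and from those coordinates to an RBF kernel matrix (workflow steps 2 and 3), can be sketched with classical MDS via double-centering. This is a generic sketch, not the report's code; the distance matrix here would come from the connectivity distance.

```python
import numpy as np

def classical_mds(D, n_dims=2):
    """Map a symmetric dissimilarity matrix D to n_dims Euclidean coordinates
    (classical MDS): double-center the squared distances, then keep the
    eigenvectors of the largest eigenvalues."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J                 # double-centered Gram matrix
    w, V = np.linalg.eigh(B)                    # ascending eigenvalues
    idx = np.argsort(w)[::-1][:n_dims]          # largest n_dims eigenvalues
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

def rbf_kernel(X, sigma):
    """RBF kernel matrix from the Euclidean MDS coordinates."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2.0 * sigma ** 2))
```

When the input distances are exactly Euclidean, the pairwise distances of the recovered coordinates reproduce the input matrix; for a merely "Euclidean-like" dissimilarity, the leading dimensions give the best low-dimensional approximation.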
Furthermore, the realizations located in the far-left part of the narrow-line region are more disconnected than those located near the center (near the intersection of the two regions); the analogous statement holds for the right wide-plume region. Figure 10 exhibits the difference in dynamic data in 2D MDS space. Since we plot the difference in dynamic data relative to one reference realization (marked in the figure), this difference can be regarded as the objective function with that realization as the reference. Figure 10 shows that the objective function in 2D MDS space varies smoothly and does not have any local minimum, which is a favorable situation for fast optimization.

4.4 Ensemble Kalman Filtering

EnKF has been implemented with 300 (randomly chosen) out of the 1,000 initial realizations. After the eigenvalue decomposition of the Gaussian kernel matrix based on the connectivity distance, standard Gaussian parameterizations are assigned to the 300 realizations. Details of the parameterization can be found in the Appendix and in Caers (2008).
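The step from the kernel matrix to short coordinate vectors can be sketched with a kernel-PCA-style eigen-decomposition. This is only a stand-in for the report's parameterization, whose exact construction (including the mapping to standard Gaussian variables) is given in the Appendix: here each realization $j$ simply receives the coordinates $\Lambda^{1/2} V^T e_j$ from $K = V \Lambda V^T$.

```python
import numpy as np

def kernel_parameterization(K):
    """Map each ensemble member to a short coordinate vector from the
    eigen-decomposition of the kernel matrix K (kernel-PCA-style sketch;
    the report's exact construction differs and is in its Appendix)."""
    w, V = np.linalg.eigh(K)            # ascending eigenvalues
    keep = w > 1e-10 * w.max()          # drop numerically null directions
    w, V = w[keep], V[:, keep]
    # Column j holds the coordinates of realization j; by construction
    # the dot products of these columns reproduce K.
    return (V * np.sqrt(w)).T
```

A quick sanity check of the construction is that the Gram matrix of the returned coordinates reproduces the kernel matrix.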

At 260 days, a first correction is executed based on the measurement. Reservoir simulations (prediction) from 0 days to 260 days are run for the 300 initial realizations. Figure 11 shows the watercut curves calculated by the reservoir simulations and the measured watercut at 260 days. None of the realizations has shown breakthrough yet, so all the watercut values simulated up to 260 days are zero. The measured watercut is also zero, which means there is no need to correct the state vectors. Figure 12 displays the update at 260 days in 2D MDS space. As expected, none of the realizations moves in the 2D MDS map. At 520 days, a second correction is performed based on the measurement. Reservoir simulations (the prediction step) from 260 days to 520 days are run for the 300 realizations, starting from the pressure and water saturation updated at 260 days. Figure 13 shows the watercut curves calculated by the reservoir simulations and the measured watercut at 520 days. The simulated watercut values vary from 0.0 to 0.2, while the measured watercut is 0.03, which means some realizations show too-early breakthrough. It can be shown that the simulations displaying early breakthrough have their injectors connected to the producers by a high-permeability zone. Therefore, the realizations that are more connected have to be corrected. Figure 14 displays the update at 520 days in 2D MDS space. Recall that the realizations located in the left narrow-line region are highly connected ones and the realizations located in the right wide-plume region are disconnected ones. As a result, almost all the connected realizations are corrected to disconnected ones. The realizations in the right wide-plume region also move toward the reference point (marked in Figure 10).
A dissimilarity distance that is highly correlated with the difference in dynamic data, together with multidimensional scaling, provides a powerful tool for analyzing the path of the optimization and the quality of the EnKF methodology. Figure 15 displays the updates after 520 days. As the updates proceed, all the realizations move toward the reference point and the watercut curves start to honor the measurements. Figure 16 lists the updates of log-permeability and the a priori and a posteriori water saturation of one realization. Within three updates, the solution converges, and the update of permeability is consistent with the water saturation. The pre-image problem (Caers, 2008) on the dissimilarity-distance-based kernel makes these consistent updates possible. Figure 17 exhibits the final mean and conditional variance after EnKF. While the initial mean (Figure 5) shows a disconnected high-permeability zone between injector and producer, the final mean shows a more connected high-permeability zone. Figure 18 shows the reference field, which is assumed to be the true realization. The mean of the final realizations looks very similar to the reference. Figure 19 represents the predicted watercut of the 300 final realizations from 0 days to 1,095 days. The mean of the 300 watercut curves matches the measurements very well, as can be seen by comparing it with Figure 6.

5 A priori and a posteriori probability density

Figures 20 to 22 depict the probability density, in the 2D MDS map, of the unconditional realizations, of the realizations conditioned to hard data only, and of the realizations conditioned to hard and dynamic data. Figure 20 shows the a priori probability density when we have geologic information only; in this example, the geologic information is the spherical variogram and the log-normal distribution of permeability. Figure 21 shows the a posteriori probability density after conditioning to hard data. Compared with the a priori distribution, the proportion of connected realizations (in the right wide-plume region) has increased, because the given hard data are high permeabilities at the well locations. Figure 22 represents the a posteriori probability density after conditioning to hard and dynamic data, which indicates that EnKF can provide multiple solutions, all of which honor the dynamic data and together represent the a posteriori probability distribution. The dissimilarity distance and multidimensional scaling make it possible to analyze the a priori and a posteriori probability densities. This would be impossible unless the dissimilarity distance were highly correlated with the difference in dynamic data and MDS mapped the realizations into a low-dimensional space.

6 Ensemble size reduction

Scheidt (2008) shows an effective selection method using kernel k-means clustering in distance-kernel space. First, perform kernel k-means clustering of the ensemble members in distance-kernel space. Second, for each cluster, select the realization nearest to the centroid of that cluster. The selected realizations then reproduce the statistics of the flow responses of the whole ensemble. This method is applied here to reduce the ensemble size in EnKF. First, the initial 300 realizations (Figure 8) are clustered into 30 clusters through kernel k-means clustering (Figure 24). Second, the 30 realizations nearest to the centroids of the corresponding clusters are selected (Figure 25).
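The cluster-then-pick-nearest selection just described can be sketched as below. Plain k-means on the MDS coordinates is used here as a simple stand-in for the report's kernel k-means (with an RBF kernel built on those same coordinates, the two operate on closely related geometry); the returned indices are the medoid-like representatives of each cluster.

```python
import numpy as np

def select_representatives(X, k, n_iter=50, seed=0):
    """Cluster the rows of X (MDS coordinates, one row per realization) with
    k-means and return, for each cluster, the index of the member nearest the
    centroid. A sketch standing in for kernel k-means selection."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)].copy()
    for _ in range(n_iter):
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d2.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):               # guard empty clusters
                centers[j] = X[labels == j].mean(axis=0)
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return [int(d2[:, j].argmin()) for j in range(k)]
```

The selected members can then be run through the same EnKF procedure, requiring only k forward simulations per prediction step instead of the full ensemble size.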
Figures 26 and 27 display the watercut curves of the selected 30 realizations and of the 300 realizations, respectively, together with their p10, p50, and p90 (10th percentile, median, and 90th percentile). Figure 28 compares the p10, p50, and p90 of the selected 30 realizations with those of the whole initial 300 realizations: only 30 realizations reproduce the statistics of the flow responses of the 300 realizations. Third, the same EnKF procedure is applied to the 30 realizations only; in this case, only 30 reservoir simulations are needed. Figure 29 represents the movements of the 30 realizations at the 520-day update. As in the case of 300 realizations, all 30 realizations move toward the reference point. Figure 30 depicts the flow predictions of the final 30 realizations; all of them are conditioned to the measurements within the measurement-error level (10%). Figures 31 and 32 display the watercut curves of the final 30 realizations and of the final 300 realizations, respectively, together with their p10, p50, and p90. Figure 33 compares the p10, p50, and p90 of the final 30 realizations with those of the whole final 300 realizations: EnKF with only 30 selected realizations reproduces the statistics of the flow responses of the final 300 realizations. In summary, the ensemble size can be reduced significantly by kernel k-means clustering, and the reduced ensemble reproduces results (final ensemble statistics, or uncertainty) similar to those of the full-ensemble case.

7 Conclusion

EnKF has been successfully implemented in distance-based kernel space.

1. EnKF can be applied in distance-kernel space to any type of geologic model, including Gaussian models.
2. Standard normal parameterization through the kernel Karhunen-Loeve expansion fits the state vector of EnKF well.
3. The proposed framework makes it possible to update the permeability, pressure, and water saturation consistently.
4. Dissimilarity distance, which is highly correlated with the difference in dynamic data, combined with multidimensional scaling provides a powerful tool for analyzing the path of optimization and the probability density of the inverse problem.
5. EnKF is accelerated by reducing the ensemble size through kernel k-means clustering without any significant loss of ensemble statistics.

References

[1] Caers, J.: Distance-based stochastic modeling: theory and applications, 21st SCRF Annual Meeting Report (2008).
[2] Caers, J.: Comparison of the Gradual Deformation with the Probability Perturbation Method for Solving Inverse Problems, Mathematical Geology (2007) 39, 1.
[3] Caers, J.: History Matching under a Training Image-based Geological Model Constraints, SPEJ (2003).
[4] Datta-Gupta, A. and King, M.J.: A Semianalytic Approach to Tracer Flow Modeling in Heterogeneous Permeable Media, Advances in Water Resources (1995) 18, 9.
[5] Deutsch, C.V. and Journel, A.G.: Geostatistical Software Library and User's Guide, Oxford University Press, NY (1998).
[6] Evensen, G.: Sampling Strategies and Square Root Analysis Schemes for the EnKF, Ocean Dynamics (2004) 54, 539.
[7] Evensen, G.: The Ensemble Kalman Filter: Theoretical Formulation and Practical Implementation, Ocean Dynamics (2003) 53, 343.
[8] Evensen, G.: Sequential Data Assimilation with a Nonlinear Quasi-Geostrophic Model Using Monte Carlo Methods to Forecast Error Statistics, J. of Geophysical Research (1994) 99.
[9] Gao, G., Zafari, M., and Reynolds, A.C.: Quantifying Uncertainty for the PUNQ-S3 Problem in a Bayesian Setting with RML and EnKF, paper SPE93324 presented at the 2005 SPE Reservoir Simulation Symposium, TX.
[10] Gu, Y. and Oliver, D.S.: History Matching of the PUNQ-S3 Reservoir Model Using the Ensemble Kalman Filter, paper SPE89942 presented at the 2004 SPE ATCE, TX.
[11] Houtekamer, P.L. and Mitchell, H.L.: Data Assimilation Using an Ensemble Kalman Filter Technique, Monthly Weather Review (1998) 126, 796.
[12] Hu, L.Y.: Extended Probability Perturbation Method for Calibrating Stochastic Reservoir Models, Mathematical Geology (2008), under review.
[13] Hu, L.Y., Blanc, G., and Noetinger, B.: Gradual Deformation and Iterative Calibration of Sequential Stochastic Simulations, Mathematical Geology (2001) 33, 4.
[14] Hu, L.Y.: Gradual Deformation and Iterative Calibration of Gaussian-Related Stochastic Models, Mathematical Geology (2000) 32, 1.
[15] Hu, L.Y. and Blanc, G.: Constraining a Reservoir Facies Model to Dynamic Data Using a Gradual Deformation Method, proceedings of the 1998 ECMOR, UK.
[16] Journel, A.: Combining knowledge from diverse information sources: an alternative to Bayesian analysis, Mathematical Geology (2002) 34, 5.
[17] Kalman, R.E.: A New Approach to Linear Filtering and Prediction Problems, J. of Basic Engineering (1960) 82, 35.
[18] Lorentzen, R.J., Nævdal, G., Valles, B., Berg, A.M., and Grimstad, A.-A.: Analysis of the Ensemble Kalman Filter for Estimation of Permeability and Porosity in Reservoir Models, paper SPE96375 presented at the 2005 SPE ATCE, TX.
[19] Margulis, S.A., McLaughlin, D., Entekhabi, D., and Dunne, S.: Land Data Assimilation and Estimation of Soil Moisture Using Measurements from the Southern Great Plains 1997 Field Experiment, Water Resources Research (2002) 38, 1.
[20] Nævdal, G., Johnsen, L.M., Aanonsen, S.I., and Vefring, E.H.: Reservoir Monitoring and Continuous Model Updating Using Ensemble Kalman Filter, paper SPE84372 presented at the 2003 SPE ATCE, CO.
[21] Nævdal, G. and Vefring, E.H.: Near-Well Reservoir Monitoring through Ensemble Kalman Filter, paper SPE75235 presented at the 2002 SPE/DOE Improved Oil Recovery Symposium, OK.
[22] Park, K. and Caers, J.: History Matching in Low-Dimensional Connectivity Vector Space, proceedings of the EAGE Petroleum Geostatistics 2007 Conference, Cascais, Portugal.
[23] Park, K., Choe, J., and Shin, Y.: Real-time Reservoir Characterization Using Ensemble Kalman Filter During Waterflooding, J. of the Korean Society for Geosystem Engineering (2006) 43, 143.
[24] Park, K., Choe, J., and Ki, S.: Real-Time Aquifer Characterization Using Ensemble Kalman Filter, proceedings of GIS and Spatial Analysis 2005 Annual Conference of the IAMG, Toronto, Canada.
[25] Reichle, R.H., McLaughlin, D.B., and Entekhabi, D.: Hydrologic Data Assimilation with the Ensemble Kalman Filter, Monthly Weather Review (2002) 130, 103.
[26] Sarma, P.: Efficient Closed-Loop Optimal Control of Petroleum Reservoirs under Uncertainty, Ph.D. Dissertation, Stanford University (2006).
[27] Schölkopf, B. and Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond, The MIT Press, Cambridge, MA (2002).
[28] Schaaf, T., Chavent, G., and Mezghani, M.: Refinement Indicators for Optimal Selection of Geostatistical Realizations Using the Gradual Deformation Method, Mathematical Geology (2004) 36, 425.
[29] Scheidt, C.: Quantification of uncertainty on spatial and non-spatial reservoir parameters - comparison between the experimental design and the distance-kernel method, 21st SCRF Annual Meeting Report (2008).
[30] Scheidt, C. and Caers, J.: Using Distances and Kernels to Parameterize Spatial Uncertainty for Flow Applications, proceedings of the EAGE Petroleum Geostatistics 2007 Conference, Cascais, Portugal.
[31] Skjervheim, J.-A., Evensen, G., Aanonsen, S.I., Ruud, B.O., and Johansen, T.A.: Incorporating 4D Seismic Data in Reservoir Simulation Models Using Ensemble Kalman Filter, paper SPE95789 presented at the 2005 SPE ATCE, TX.
[32] Suzuki, S. and Caers, J.: History Matching with an Uncertain Geological Scenario, paper SPE presented at the 2006 SPE ATCE, TX.
[33] Suzuki, S., Caers, J., and Caumon, G.: History Matching of Structurally Complex Reservoirs Using Discrete Space Optimization Method, proceedings of the GOCAD meeting.
[34] Thiele, M.R., Batycky, R.P., and Blunt, M.J.: Simulating Flow in Heterogeneous Media Using Streamtubes and Streamlines, SPERE (1996) 10, 5.
[35] Welch, G. and Bishop, G.: An Introduction to the Kalman Filter, Department of Computer Science, University of North Carolina at Chapel Hill, NC (2004).
[36] Zafari, M. and Reynolds, A.C.: Assessing the Uncertainty in Reservoir Description and Performance Predictions with the Ensemble Kalman Filter, paper SPE95750 presented at the 2005 SPE ATCE, TX.
[37] Zhang, D., Lu, Z., and Chen, Y.: Dynamic Reservoir Data Assimilation with an Efficient, Dimension-Reduced Kalman Filter, paper SPE95277 presented at the 2005 SPE ATCE, TX.
[38] Zuur, A.F., Ieno, E.N., and Smith, G.M.: Analysing Ecological Data, Springer, NY (2007).

A Dissimilarity Distance

A distance is a measure of dissimilarity between two realizations, and for our purposes the distances between realizations carry more useful information than the realizations themselves. What we are interested in determines the appropriate type of distance: if we are interested in the flow responses of the realizations, the dissimilarity distance should be correlated with the flow response; if we are interested in their geometry, the distance should represent the difference in geometry. The simplest choice is the Minkowski distance between realizations $x_a$ and $x_b$ (discretized into $N_{gb}$ gridblocks):

d(x_a, x_b) = \left[ \sum_{i=1}^{N_{gb}} |(x_a)_i - (x_b)_i|^p \right]^{1/p}    (27)

where $(x)_i$ represents the $i$-th element of the vector $x$, and $p$ ($\geq 1$) defines the distance space: Euclidean space ($p = 2$), city-block (Manhattan) space ($p = 1$), or dominance space ($p = \infty$). Although the Minkowski distance is easy to calculate, it may not be well correlated with dynamic data, because the forward-simulated dynamic data may change dramatically when only a small portion of the realization is perturbed. Figure 23 depicts the correlation between the Euclidean distance and the dissimilarity between dynamic data (the difference between watercut curves) for 1,000 Gaussian realizations (the same initial realizations as in the example above); it turns out that the Euclidean distance is not correlated with the dynamic data. In order to optimize an inverse solution efficiently in the distance space, it is necessary that the dynamic data be correlated with distance in that space. For this, various distances may be utilized.
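Equation 27 can be implemented directly; a minimal sketch:

```python
import numpy as np

def minkowski_distance(xa, xb, p=2.0):
    """Minkowski dissimilarity (Equation 27) between two realizations
    flattened to vectors of N_gb gridblock values."""
    xa = np.asarray(xa, float).ravel()
    xb = np.asarray(xb, float).ravel()
    if np.isinf(p):  # dominance space: the largest elementwise difference
        return float(np.max(np.abs(xa - xb)))
    return float(np.sum(np.abs(xa - xb) ** p) ** (1.0 / p))

a = [1.0, 2.0, 3.0]
b = [2.0, 2.0, 5.0]
d1 = minkowski_distance(a, b, p=1)            # city-block: 3.0
d2 = minkowski_distance(a, b, p=2)            # Euclidean: sqrt(5)
dinf = minkowski_distance(a, b, p=float("inf"))  # dominance: 2.0
```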
For instance, Suzuki and Caers (2006) proposed the Hausdorff distance, a measure of dissimilarity between two binary images; Scheidt and Caers (2007) a streamline-assisted, solution-variables-based distance; and Park and Caers (2007) a connectivity-based distance, which is discussed below. Note that we do not have to identify the coordinates of the space: knowing only the distances among the realizations is sufficient to define the distance space, and any distance can be used as long as it is well correlated with the dynamic data.

The connectivity distance is discussed here as an example of a dissimilarity distance. In order for the connectivity measure to have a high correlation with the production history, a TOF (time of flight; Datta-Gupta and King, 1995) based distance is proposed, which shows satisfactory correlation. TOF from an injector to a producer is calculated by streamline simulation (Thiele et al., 1996) under steady-state conditions; typically, a steady-state simulation requires only one hundredth to one thousandth of the usual reservoir simulation time. Equation 28 gives the TOF-based injector-to-producer calculation; a percentile among the TOFs of the streamlines that arrive at a producer can be chosen.

\tau_k^{ji} = \int_{w_j^I}^{w_i^P} \frac{d\zeta_k}{v(\zeta_k)}    (28)

where $\tau_k^{ji}$ represents the TOF of the $k$-th streamline from injector $w_j^I$ to producer $w_i^P$, $\zeta_k$ is the coordinate along the $k$-th streamline, and $v(\zeta_k)$ is the interstitial velocity along the streamline. The analytical water saturation at producer $w_i^P$ is then calculated by

\bar{S}_w(t; w_i^P) = \frac{1}{N_{wp} N_{sl}} \sum_{j,k} \frac{1}{M-1} \left( \sqrt{\frac{M t}{\tau_k^{ji}}} - 1 \right)    (29)

where $N_{wp}$ and $N_{sl}$ represent the number of producers and streamlines, respectively, $M$ is the end-point mobility ratio, and $t$ is time. The analytical fractional flow at producer $w_i^P$ is obtained by

f_w(t; w_i^P) = \frac{q_w}{q_w + q_o} = \frac{k_{rw}(\bar{S}_w)/\mu_w}{k_{rw}(\bar{S}_w)/\mu_w + k_{ro}(\bar{S}_w)/\mu_o}    (30)

where $q_w$ and $q_o$ are the water and oil flow rates, $k_{rw}$ and $k_{ro}$ the water and oil relative permeabilities, and $\mu_w$ and $\mu_o$ the water and oil viscosities, respectively. Finally, the connectivity distance between realizations $m_a$ and $m_b$ is calculated from the difference between their fractional-flow curves:

d(m_a, m_b) = \int_0^{t_1} \left( f_w^a(t) - f_w^b(t) \right)^2 dt    (31)

where the time $t_1$ can be set equal to the simulation time.

B Parameterization

Prior to parameterizing the geological model space, we start from an ensemble of realizations $x_j$ ($j = 1, \ldots, N_R$, if we generate $N_R$ realizations). Here $x$ could represent a facies, porosity, or permeability realization, or any combination of these. The initial ensemble can be generated by various geostatistical algorithms honoring the geologic information and conditioned to the static data (hard and soft data). For simplicity, define the ensemble matrix $X$ by

[X]_{:,j} = x_j    (32)

where $[X]_{i,j}$ is the $(i,j)$ element of matrix $X$ and $[X]_{:,j}$ is its $j$-th column. The covariance of the ensemble is calculated numerically by

C = \frac{1}{N_R} \sum_{j=1}^{N_R} x_j x_j^T = \frac{1}{N_R} X X^T    (33)
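A numerical sketch of Equations 32 and 33, continued through the eigendecomposition and Karhunen-Loeve sampling of Equations 34 and 35, on a toy zero-mean ensemble (the dimensions and ensemble size are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy zero-mean ensemble: N_R = 200 realizations of dimension n = 10,
# stored as the columns of X (Eq. 32).
N_R, n = 200, 10
X = rng.normal(size=(n, N_R))

# Ensemble covariance (Eq. 33).
C = X @ X.T / N_R

# Eigenvalue decomposition CV = VL (Eq. 34); eigh returns eigenvalues
# in ascending order, so reverse to put the largest first.
l_vals, V = np.linalg.eigh(C)
l_vals, V = l_vals[::-1], V[:, ::-1]

# Karhunen-Loeve sampling (Eq. 35), retaining only the k largest
# eigenvalues: a short standard-normal vector y parameterizes a new
# realization consistent with the ensemble covariance.
k = 5
y_new = rng.normal(size=k)
x_new = V[:, :k] @ (np.sqrt(l_vals[:k]) * y_new)
```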

When we perform an eigenvalue decomposition of the covariance,

C V = V L    (34)

a new realization can be obtained by the Karhunen-Loeve expansion

x_{new} = V L^{1/2} y_{new}    (35)

This algorithm is analogous to LUSIM (lower/upper decomposition simulation; Deutsch and Journel, 1998). Here $V$ is the matrix whose columns are the eigenvectors of the covariance, $L$ is the diagonal matrix of the corresponding eigenvalues, and $y_{new}$ is the parameter vector for the realization $x_{new}$. The parameter $y$ is a standard Gaussian random vector whose size is determined by how many eigenvalues are retained: we do not have to use all $N_R$ nonzero eigenvalues, and typically only a few large eigenvalues are kept. With Equation 35 we can generate many realizations based on the same covariance.

In order to account for higher-order moments or spatial correlation beyond the point-by-point covariance, feature expansions of the realizations can be introduced. Let $\phi$ be the feature map from realization space $R$ to feature space $F$:

\phi : R \to F    (36)
x \mapsto \phi(x)    (37)

where $\phi(x)$ is the feature expansion of realization $x$. With the feature expansions of the ensemble, defined by

[\Phi]_{:,j} = \phi(x_j)    (38)

the covariance of the feature expansions is calculated by

C = \frac{1}{N_R} \sum_{j=1}^{N_R} \phi(x_j) \phi(x_j)^T = \frac{1}{N_R} \Phi \Phi^T    (39)

and a new feature expansion can be generated in the same manner as above:

\phi(x_{new}) = V L^{1/2} y_{new}    (40)

However, since the feature expansion is often very high-dimensional, and sometimes infinite-dimensional, the eigenvalue decomposition of this covariance matrix is practically impossible. The kernel trick makes it possible to obtain a solution exactly equivalent to the eigenvalue decomposition of the covariance. If we define a kernel function as the dot product of two feature expansions,

k(x_i, x_j) := \langle \phi(x_i), \phi(x_j) \rangle    (41)

the kernel function can be evaluated without representing the high-dimensional feature expansions explicitly, and the kernel matrix

K := \Phi^T \Phi    (42)

can be calculated efficiently, where $[K]_{i,j} = k(x_i, x_j)$ and $\langle \cdot, \cdot \rangle$ denotes the dot product. The main idea of the kernel trick is to assume that the new feature expansion is a linear combination of the feature expansions of the ensemble and to express all the elements of the equations as dot products of feature expansions. Consider the eigenvalue decomposition of the kernel matrix:

K E = E \Lambda    (43)

where $E$ is the matrix whose columns are the eigenvectors of the kernel matrix and $\Lambda$ is the diagonal matrix of its eigenvalues. Then the eigenvectors and eigenvalues of the covariance are obtained directly from those of the kernel matrix, which takes much less time:

L = \frac{1}{N_R} \Lambda, \qquad V = \Phi E \Lambda^{-1/2}    (44)

For the parameterization of the given ensemble, we have to find an ensemble of parameterizations $Y$ such that

\Phi = V L^{1/2} Y    (45)

Multiplying both sides by $\Phi^T$ and substituting Equation 44,

K = \Phi^T V L^{1/2} Y = \Phi^T \Phi E \Lambda^{-1/2} \frac{1}{\sqrt{N_R}} \Lambda^{1/2} Y    (46)

= \frac{1}{\sqrt{N_R}} K E Y    (47)

= \frac{1}{\sqrt{N_R}} E \Lambda Y    (48)

and Equation 49 gives the parameterization:

Y = \sqrt{N_R}\, \Lambda^{-1} E^T K = \sqrt{N_R}\, \Lambda^{-1} E^T E \Lambda E^T = \sqrt{N_R}\, E^T    (49)
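The kernel-space computation of Equations 42 to 49 reduces to an eigendecomposition of the kernel matrix. A sketch on toy vectors with a Gaussian (RBF) kernel standing in for the feature map, ending with the parameterization $Y = \sqrt{N_R}\, E^T$ and a numerical check of the identity in Equations 47-48:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy ensemble of N_R = 50 realizations (rows), stand-ins for models
# mapped into a low-dimensional MDS space.
N_R = 50
X = rng.normal(size=(N_R, 3))

# Gaussian kernel matrix: [K]_ij = k(x_i, x_j)  (Eqs. 41-42).
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=2)
K = np.exp(-sq / 2.0)

# Eigendecomposition K E = E Lambda (Eq. 43), largest eigenvalues first.
lam, E = np.linalg.eigh(K)
lam, E = lam[::-1], E[:, ::-1]

# Parameterization of the ensemble itself (Eq. 49): Y = sqrt(N_R) E^T.
Y = np.sqrt(N_R) * E.T

# Consistency check of Eqs. 47-48: K = (1/sqrt(N_R)) E Lambda Y.
K_rec = (E * lam) @ Y / np.sqrt(N_R)
```

Because $E$ is orthonormal, the rows of $Y$ are mutually orthogonal with squared norm $N_R$, which is what makes $Y$ a convenient standard-normal-scaled state vector for EnKF.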

C Kernels

We can use various types of kernels, but the kernel matrix must be positive semi-definite (Mercer's theorem). Some widely used kernels are:

Polynomial: k(x, z) = (\langle x, z \rangle + c)^d
Gaussian: k(x, z) = \exp\left( -\frac{\|x - z\|^2}{2\sigma^2} \right)
Sigmoid: k(x, z) = \tanh(\kappa \langle x, z \rangle + \vartheta)

Like the Gaussian kernel, a kernel based on the Euclidean distance is called an RBF kernel, and it can be evaluated even when only the Euclidean distances are known. Although the dissimilarity distance is not a Euclidean distance, we can map the ensemble into a metric space by multidimensional scaling; provided the Euclidean distance in that metric space is well correlated with the dissimilarity distance, we can evaluate the kernel function by replacing the distance with the Euclidean distance in the metric space.

D Pre-image problem

Once a new feature expansion $\phi_{new}$ is acquired, the new realization must be recovered from it ($x_{new} = \phi^{-1}(\phi_{new})$). Since $\phi^{-1}$ often cannot be calculated explicitly, we instead seek the model

x_{new} = \arg\min_{x_{new}} \|\phi(x_{new}) - \Phi b\|^2 = \arg\min_{x_{new}} \left\{ \phi(x_{new})^T \phi(x_{new}) - 2 \phi(x_{new})^T \Phi b + b^T K b \right\}    (50)

This optimization problem is called the pre-image problem, and it can be solved by the fixed-point iteration method (Schölkopf and Smola, 2002). We find $x_{new}$ such that

\nabla_{x_{new}} \left\{ \phi(x_{new})^T \phi(x_{new}) - 2 \phi(x_{new})^T \Phi b + b^T K b \right\} = 0    (51)

by the iterations

x_{new} = \frac{\sum_{j=1}^{N_R} [b]_j\, k'(x_j, x_{new})\, x_j}{\sum_{j=1}^{N_R} [b]_j\, k'(x_j, x_{new})}    (52)

where $k'$ denotes the derivative of the kernel function $k$. Since we have the kernel functions rather than the explicit feature expansions, these iterations can be carried out efficiently. In conclusion, the new realization is obtained as a nonlinear combination of the ensemble members, and note that the nonlinear weights sum to unity. This pre-image problem is discussed in detail in Caers (2008).
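For the Gaussian kernel, $k'(x_j, x)$ is proportional to $k(x_j, x)$ itself, so the fixed-point update of Equation 52 reduces to a normalized kernel-weighted average of the ensemble members. A sketch under that assumption (the ensemble, the weights $b$, and the kernel width are illustrative):

```python
import numpy as np

def preimage_rbf(Xens, b, sigma=1.0, n_iter=50):
    """Fixed-point pre-image iteration (Equation 52) for the Gaussian
    kernel: each step is a kernel-weighted, normalized combination of
    the ensemble members (rows of `Xens`)."""
    x = Xens.mean(axis=0)                      # neutral starting point
    for _ in range(n_iter):
        sq = ((Xens - x) ** 2).sum(axis=1)     # squared distances to x
        w = b * np.exp(-sq / (2 * sigma ** 2)) # weights b_j k(x_j, x)
        x = (w @ Xens) / w.sum()               # normalized update
    return x

rng = np.random.default_rng(0)
Xens = rng.normal(size=(20, 4))    # 20 toy ensemble members
b = np.full(20, 1 / 20)            # equal weights summing to unity
x_new = preimage_rbf(Xens, b)
```

Because the weights are positive and normalized, each iterate stays inside the coordinate-wise range of the ensemble, consistent with the observation that the pre-image is a nonlinear combination of the ensemble members.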

Figure 1: Flow diagram of EnKF applied in this study.

Figure 2: The proposed workflow: Realization space → Distance space → MDS space → Feature space → Parameterization space → Optimization → Feature space → Realization space.

Figure 3: Watercut data measured every two months. Only the red circles (noisy data) are available to the algorithm.

Figure 4: Log-permeability of 6 out of 1,000 realizations generated by SGSIM. All the realizations are conditioned to hard data: 150 md at (45 ft, 45 ft) and (275 ft, 275 ft).

Figure 5: The mean (left) and conditional variance (right) of log-permeability of the 1,000 initial realizations generated by SGSIM. It can be verified that all the realizations are conditioned to hard data: in the map of the mean (left), the well locations, i.e. the hard-data locations at (45 ft, 45 ft) and (275 ft, 275 ft), are easily identified.

Figure 6: Watercut curves simulated with all 1,000 initial realizations and the measured watercut data. Red circles are the measured data; the green line is the mean of the watercut curves; grey lines show the 1,000 watercut curves.

Figure 7: Scatterplot between dissimilarity distance and the difference in dynamic data for the 1,000 initial realizations generated by SGSIM. The correlation coefficient is almost 0.94, which means the connectivity distance is highly correlated with the difference in dynamic data.

Figure 8: 2D MDS map of the 1,000 initial realizations based on their dissimilarity distances. Each point represents one realization.

Figure 9: 2D MDS map of the 1,000 initial realizations based on their dissimilarity distances. All the realizations turn out to be well sorted by their connectivity between injector and producer.

Figure 10: 2D MDS map of the 1,000 initial realizations based on their dissimilarity distances. Color represents the difference in dynamic data from the indicated realization. Since the connectivity distance is highly correlated with the difference in dynamic data, the realizations, although mapped based on the connectivity distance, are well sorted by the difference in dynamic data.

Figure 11: Watercut curves calculated by the reservoir simulations and the measured watercut at 260 days.

Figure 12: Update at 260 days in the 2D MDS space. Symbols distinguish the a priori realizations (before correction) from the a posteriori realizations (after correction).

Figure 13: Watercut curves calculated by the reservoir simulations and the measured watercut at 520 days.

Figure 14: Update at 520 days in the 2D MDS space. Symbols distinguish the a priori realizations (before correction) from the a posteriori realizations (after correction); grey lines show the path of the update.

(a) 580 days (b) 640 days (c) 710 days (d) 770 days

(e) 840 days (f) 900 days (g) 970 days (h) 1030 days

(i) 1095 days

Figure 15: LEFT: Watercut curves calculated by the reservoir simulations and the measured watercut from 580 days to 1,095 days. RIGHT: Updates from 580 days to 1,095 days in the 2D MDS space. Symbols distinguish the a priori realizations (before correction) from the a posteriori realizations (after correction); grey lines show the path of the update.

(a) 260 days (LEFT: ln k; CENTER: a priori S_w; RIGHT: a posteriori S_w)
(b) 520 days (LEFT: ln k; CENTER: a priori S_w; RIGHT: a posteriori S_w)
(c) 580 days (LEFT: ln k; CENTER: a priori S_w; RIGHT: a posteriori S_w)
(d) 640 days (LEFT: ln k; CENTER: a priori S_w; RIGHT: a posteriori S_w)
(e) 710 days (LEFT: ln k; CENTER: a priori S_w; RIGHT: a posteriori S_w)
(f) 770 days (LEFT: ln k; CENTER: a priori S_w; RIGHT: a posteriori S_w)
(g) 840 days (LEFT: ln k; CENTER: a priori S_w; RIGHT: a posteriori S_w)
(h) 900 days (LEFT: ln k; CENTER: a priori S_w; RIGHT: a posteriori S_w)
(i) 970 days (LEFT: ln k; CENTER: a priori S_w; RIGHT: a posteriori S_w)
(j) 1030 days (LEFT: ln k; CENTER: a priori S_w; RIGHT: a posteriori S_w)

(k) 1095 days (LEFT: ln k; CENTER: a priori S_w; RIGHT: a posteriori S_w)

Figure 16: Updates of log-permeability (LEFT) and of the a priori (CENTER) and a posteriori (RIGHT) water saturation of one realization at each assimilation step from 260 days to 1,095 days.

Figure 17: The mean (left) and conditional variance (right) of log-permeability of the 300 final realizations after EnKF.

Figure 18: Log-permeability of the reference realization.

Figure 19: Watercut curves predicted by reservoir simulations of the 300 final realizations from 0 days to 1,095 days.

Figure 20: Probability density of unconditional realizations in the 2D MDS map (unconditional SGSIM).

Figure 21: Probability density of realizations conditioned to hard data only in the 2D MDS map (conditional SGSIM).

Figure 22: Probability density of realizations conditioned to hard and dynamic data in the 2D MDS map (EnKF).

Figure 23: The distance and the dissimilarity of dynamic data (watercut). On the y-axis is the difference in watercut between any two realizations; on the x-axis, the distance between any two realizations.

Figure 24: The initial 300 realizations clustered into 30 clusters through kernel k-means clustering.


Reservoir Monitoring and Continuous Model Updating Using Ensemble Kalman Filter SPE 84372 Reservoir Monitoring and Continuous Model Updating Using Ensemble Kalman Filter Geir Nævdal, RF-Rogaland Research; Liv Merethe Johnsen, SPE, Norsk Hydro; Sigurd Ivar Aanonsen, SPE, Norsk Hydro;

More information

Truncated Conjugate Gradient Method for History Matching in Reservoir Simulation

Truncated Conjugate Gradient Method for History Matching in Reservoir Simulation Trabalho apresentado no CNAC, Gramado - RS, 2016. Proceeding Series of the Brazilian Society of Computational and Applied athematics Truncated Conjugate Gradient ethod for History atching in Reservoir

More information

Entropy of Gaussian Random Functions and Consequences in Geostatistics

Entropy of Gaussian Random Functions and Consequences in Geostatistics Entropy of Gaussian Random Functions and Consequences in Geostatistics Paula Larrondo (larrondo@ualberta.ca) Department of Civil & Environmental Engineering University of Alberta Abstract Sequential Gaussian

More information

EnKF Applications to Thermal Petroleum Reservoir Characterization

EnKF Applications to Thermal Petroleum Reservoir Characterization EnKF Applications to Thermal Petroleum Reservoir Characterization Yevgeniy Zagayevskiy and Clayton V. Deutsch This paper presents a summary of the application of the ensemble Kalman filter (EnKF) to petroleum

More information

B005 A NEW FAST FOURIER TRANSFORM ALGORITHM FOR FLUID FLOW SIMULATION

B005 A NEW FAST FOURIER TRANSFORM ALGORITHM FOR FLUID FLOW SIMULATION 1 B5 A NEW FAST FOURIER TRANSFORM ALGORITHM FOR FLUID FLOW SIMULATION LUDOVIC RICARD, MICAËLE LE RAVALEC-DUPIN, BENOÎT NOETINGER AND YVES GUÉGUEN Institut Français du Pétrole, 1& 4 avenue Bois Préau, 92852

More information

STOCHASTIC AND DETERMINISTIC INVERSION METHODS FOR HISTORY MATCHING OF PRODUCTION AND TIME-LAPSE SEISMIC DATA. A Dissertation SHINGO WATANABE

STOCHASTIC AND DETERMINISTIC INVERSION METHODS FOR HISTORY MATCHING OF PRODUCTION AND TIME-LAPSE SEISMIC DATA. A Dissertation SHINGO WATANABE STOCHASTIC AND DETERMINISTIC INVERSION METHODS FOR HISTORY MATCHING OF PRODUCTION AND TIME-LAPSE SEISMIC DATA A Dissertation by SHINGO WATANABE Submitted to the Office of Graduate and Professional Studies

More information

We LHR3 04 Realistic Uncertainty Quantification in Geostatistical Seismic Reservoir Characterization

We LHR3 04 Realistic Uncertainty Quantification in Geostatistical Seismic Reservoir Characterization We LHR3 04 Realistic Uncertainty Quantification in Geostatistical Seismic Reservoir Characterization A. Moradi Tehrani* (CGG), A. Stallone (Roma Tre University), R. Bornard (CGG) & S. Boudon (CGG) SUMMARY

More information

Direct forecasting without full model inversion Jef Caers

Direct forecasting without full model inversion Jef Caers Direct forecasting without full model inversion Jef Caers Professor of Geological Sciences Stanford University, USA Why moving to Geological Sciences? To attract a greater diversity of students To be able

More information

Connection of Local Linear Embedding, ISOMAP, and Kernel Principal Component Analysis

Connection of Local Linear Embedding, ISOMAP, and Kernel Principal Component Analysis Connection of Local Linear Embedding, ISOMAP, and Kernel Principal Component Analysis Alvina Goh Vision Reading Group 13 October 2005 Connection of Local Linear Embedding, ISOMAP, and Kernel Principal

More information

A NEW APPROACH FOR QUANTIFYING THE IMPACT OF GEOSTATISTICAL UNCERTAINTY ON PRODUCTION FORECASTS: THE JOINT MODELING METHOD

A NEW APPROACH FOR QUANTIFYING THE IMPACT OF GEOSTATISTICAL UNCERTAINTY ON PRODUCTION FORECASTS: THE JOINT MODELING METHOD A NEW APPROACH FOR QUANTIFYING THE IMPACT OF GEOSTATISTICAL UNCERTAINTY ON PRODUCTION FORECASTS: THE JOINT MODELING METHOD IAMG, Cancun, September 6-1, 001 Isabelle Zabalza-Mezghani, IFP Emmanuel Manceau,

More information

A Study of Covariances within Basic and Extended Kalman Filters

A Study of Covariances within Basic and Extended Kalman Filters A Study of Covariances within Basic and Extended Kalman Filters David Wheeler Kyle Ingersoll December 2, 2013 Abstract This paper explores the role of covariance in the context of Kalman filters. The underlying

More information

Lagrangian Data Assimilation and Manifold Detection for a Point-Vortex Model. David Darmon, AMSC Kayo Ide, AOSC, IPST, CSCAMM, ESSIC

Lagrangian Data Assimilation and Manifold Detection for a Point-Vortex Model. David Darmon, AMSC Kayo Ide, AOSC, IPST, CSCAMM, ESSIC Lagrangian Data Assimilation and Manifold Detection for a Point-Vortex Model David Darmon, AMSC Kayo Ide, AOSC, IPST, CSCAMM, ESSIC Background Data Assimilation Iterative process Forecast Analysis Background

More information

COMS 4721: Machine Learning for Data Science Lecture 10, 2/21/2017

COMS 4721: Machine Learning for Data Science Lecture 10, 2/21/2017 COMS 4721: Machine Learning for Data Science Lecture 10, 2/21/2017 Prof. John Paisley Department of Electrical Engineering & Data Science Institute Columbia University FEATURE EXPANSIONS FEATURE EXPANSIONS

More information

Reservoir Uncertainty Calculation by Large Scale Modeling

Reservoir Uncertainty Calculation by Large Scale Modeling Reservoir Uncertainty Calculation by Large Scale Modeling Naeem Alshehri and Clayton V. Deutsch It is important to have a good estimate of the amount of oil or gas in a reservoir. The uncertainty in reserve

More information

Advances in Locally Varying Anisotropy With MDS

Advances in Locally Varying Anisotropy With MDS Paper 102, CCG Annual Report 11, 2009 ( 2009) Advances in Locally Varying Anisotropy With MDS J.B. Boisvert and C. V. Deutsch Often, geology displays non-linear features such as veins, channels or folds/faults

More information

Dynamic System Identification using HDMR-Bayesian Technique

Dynamic System Identification using HDMR-Bayesian Technique Dynamic System Identification using HDMR-Bayesian Technique *Shereena O A 1) and Dr. B N Rao 2) 1), 2) Department of Civil Engineering, IIT Madras, Chennai 600036, Tamil Nadu, India 1) ce14d020@smail.iitm.ac.in

More information

Integration of seismic and fluid-flow data: a two-way road linked by rock physics

Integration of seismic and fluid-flow data: a two-way road linked by rock physics Integration of seismic and fluid-flow data: a two-way road linked by rock physics Abstract Yunyue (Elita) Li, Yi Shen, and Peter K. Kang Geologic model building of the subsurface is a complicated and lengthy

More information

Geostatistics for Seismic Data Integration in Earth Models

Geostatistics for Seismic Data Integration in Earth Models 2003 Distinguished Instructor Short Course Distinguished Instructor Series, No. 6 sponsored by the Society of Exploration Geophysicists European Association of Geoscientists & Engineers SUB Gottingen 7

More information

Statistical Rock Physics

Statistical Rock Physics Statistical - Introduction Book review 3.1-3.3 Min Sun March. 13, 2009 Outline. What is Statistical. Why we need Statistical. How Statistical works Statistical Rock physics Information theory Statistics

More information

23855 Rock Physics Constraints on Seismic Inversion

23855 Rock Physics Constraints on Seismic Inversion 23855 Rock Physics Constraints on Seismic Inversion M. Sams* (Ikon Science Ltd) & D. Saussus (Ikon Science) SUMMARY Seismic data are bandlimited, offset limited and noisy. Consequently interpretation of

More information

University of Alberta

University of Alberta University of Alberta Library Release Form Name of Author: Linan Zhang Title of Thesis: Production Data Integration in Geostatistical Reservoir Modeling Degree: Master of Science Year this Degree Granted:

More information

Ensemble square-root filters

Ensemble square-root filters Ensemble square-root filters MICHAEL K. TIPPETT International Research Institute for climate prediction, Palisades, New Yor JEFFREY L. ANDERSON GFDL, Princeton, New Jersy CRAIG H. BISHOP Naval Research

More information

Assessing the Value of Information from Inverse Modelling for Optimising Long-Term Oil Reservoir Performance

Assessing the Value of Information from Inverse Modelling for Optimising Long-Term Oil Reservoir Performance Assessing the Value of Information from Inverse Modelling for Optimising Long-Term Oil Reservoir Performance Eduardo Barros, TU Delft Paul Van den Hof, TU Eindhoven Jan Dirk Jansen, TU Delft 1 Oil & gas

More information

Determination of Locally Varying Directions through Mass Moment of Inertia Tensor

Determination of Locally Varying Directions through Mass Moment of Inertia Tensor Determination of Locally Varying Directions through Mass Moment of Inertia Tensor R. M. Hassanpour and C.V. Deutsch Centre for Computational Geostatistics Department of Civil and Environmental Engineering

More information

New Fast Kalman filter method

New Fast Kalman filter method New Fast Kalman filter method Hojat Ghorbanidehno, Hee Sun Lee 1. Introduction Data assimilation methods combine dynamical models of a system with typically noisy observations to obtain estimates of the

More information

A Spectral Approach to Linear Bayesian Updating

A Spectral Approach to Linear Bayesian Updating A Spectral Approach to Linear Bayesian Updating Oliver Pajonk 1,2, Bojana V. Rosic 1, Alexander Litvinenko 1, and Hermann G. Matthies 1 1 Institute of Scientific Computing, TU Braunschweig, Germany 2 SPT

More information

A MultiGaussian Approach to Assess Block Grade Uncertainty

A MultiGaussian Approach to Assess Block Grade Uncertainty A MultiGaussian Approach to Assess Block Grade Uncertainty Julián M. Ortiz 1, Oy Leuangthong 2, and Clayton V. Deutsch 2 1 Department of Mining Engineering, University of Chile 2 Department of Civil &

More information

Advanced Reservoir Management Workflow Using an EnKF Based Assisted History Matching Method

Advanced Reservoir Management Workflow Using an EnKF Based Assisted History Matching Method SPE 896 Advanced Reservoir Management Workflow Using an EnKF Based Assisted History Matching Method A. Seiler, G. Evensen, J.-A. Skjervheim, J. Hove, J.G. Vabø, StatoilHydro ASA Copyright 29, Society of

More information

Estimation of Relative Permeability Parameters in Reservoir Engineering Applications

Estimation of Relative Permeability Parameters in Reservoir Engineering Applications Delft University of Technology Faculty of Electrical Engineering, Mathematics and Computer Science Delft Institute of Applied Mathematics Master of Science Thesis Estimation of Relative Permeability Parameters

More information

Applications of Randomized Methods for Decomposing and Simulating from Large Covariance Matrices

Applications of Randomized Methods for Decomposing and Simulating from Large Covariance Matrices Applications of Randomized Methods for Decomposing and Simulating from Large Covariance Matrices Vahid Dehdari and Clayton V. Deutsch Geostatistical modeling involves many variables and many locations.

More information

DATA ASSIMILATION FOR COMPLEX SUBSURFACE FLOW FIELDS

DATA ASSIMILATION FOR COMPLEX SUBSURFACE FLOW FIELDS POLITECNICO DI MILANO Department of Civil and Environmental Engineering Doctoral Programme in Environmental and Infrastructure Engineering XXVI Cycle DATA ASSIMILATION FOR COMPLEX SUBSURFACE FLOW FIELDS

More information

Gaussian Process Approximations of Stochastic Differential Equations

Gaussian Process Approximations of Stochastic Differential Equations Gaussian Process Approximations of Stochastic Differential Equations Cédric Archambeau Dan Cawford Manfred Opper John Shawe-Taylor May, 2006 1 Introduction Some of the most complex models routinely run

More information

A010 MULTISCALE RESERVOIR CHARACTERIZATION USING

A010 MULTISCALE RESERVOIR CHARACTERIZATION USING 1 A010 MULTISCALE RESERVOIR CHARACTERIZATION USING RODUCTION AND TIME LASE SEISMIC DATA Mokhles MEZGHANI, Alexandre FORNEL, Valérie LANGLAIS, Nathalie LUCET IF, 1 & 4 av de Bois réau, 92852 RUEIL-MALMAISON

More information

Reservoir management in the age of big data

Reservoir management in the age of big data Reservoir management in the age of big data Since the 1980s, Norwegian companies have been pioneers in advanced oil and gas technologies, helping the industry boost recoverable volumes in fields worldwide.

More information

Reducing Uncertainty in Modelling Fluvial Reservoirs by using Intelligent Geological Priors

Reducing Uncertainty in Modelling Fluvial Reservoirs by using Intelligent Geological Priors Reducing Uncertainty in Modelling Fluvial Reservoirs by using Intelligent Geological Priors Temístocles Rojas 1, Vasily Demyanov 2, Mike Christie 3 & Dan Arnold 4 Abstract Automatic history matching reservoir

More information

Large Scale Modeling by Bayesian Updating Techniques

Large Scale Modeling by Bayesian Updating Techniques Large Scale Modeling by Bayesian Updating Techniques Weishan Ren Centre for Computational Geostatistics Department of Civil and Environmental Engineering University of Alberta Large scale models are useful

More information

Reservoir connectivity uncertainty from stochastic seismic inversion Rémi Moyen* and Philippe M. Doyen (CGGVeritas)

Reservoir connectivity uncertainty from stochastic seismic inversion Rémi Moyen* and Philippe M. Doyen (CGGVeritas) Rémi Moyen* and Philippe M. Doyen (CGGVeritas) Summary Static reservoir connectivity analysis is sometimes based on 3D facies or geobody models defined by combining well data and inverted seismic impedances.

More information

Smoothers: Types and Benchmarks

Smoothers: Types and Benchmarks Smoothers: Types and Benchmarks Patrick N. Raanes Oxford University, NERSC 8th International EnKF Workshop May 27, 2013 Chris Farmer, Irene Moroz Laurent Bertino NERSC Geir Evensen Abstract Talk builds

More information

NEW DEMANDS FOR APPLICATION OF NUMERICAL SIMULATION TO IMPROVE RESERVOIR STUDIES IN CHINA

NEW DEMANDS FOR APPLICATION OF NUMERICAL SIMULATION TO IMPROVE RESERVOIR STUDIES IN CHINA INTERNATIONAL JOURNAL OF NUMERICAL ANALYSIS AND MODELING Volume 2, Supp, Pages 148 152 c 2005 Institute for Scientific Computing and Information NEW DEMANDS FOR APPLICATION OF NUMERICAL SIMULATION TO IMPROVE

More information

DATA ASSIMILATION FOR FLOOD FORECASTING

DATA ASSIMILATION FOR FLOOD FORECASTING DATA ASSIMILATION FOR FLOOD FORECASTING Arnold Heemin Delft University of Technology 09/16/14 1 Data assimilation is the incorporation of measurement into a numerical model to improve the model results

More information

Parameter Estimation in Reservoir Engineering Models via Data Assimilation Techniques

Parameter Estimation in Reservoir Engineering Models via Data Assimilation Techniques Parameter Estimation in Reservoir Engineering Models via Data Assimilation Techniques Mariya V. Krymskaya TU Delft July 6, 2007 Ensemble Kalman Filter (EnKF) Iterative Ensemble Kalman Filter (IEnKF) State

More information

Ergodicity in data assimilation methods

Ergodicity in data assimilation methods Ergodicity in data assimilation methods David Kelly Andy Majda Xin Tong Courant Institute New York University New York NY www.dtbkelly.com April 15, 2016 ETH Zurich David Kelly (CIMS) Data assimilation

More information

Hydrocarbon Reservoir Parameter Estimation Using Production Data and Time-Lapse Seismic

Hydrocarbon Reservoir Parameter Estimation Using Production Data and Time-Lapse Seismic Hydrocarbon Reservoir Parameter Estimation Using Production Data and Time-Lapse Seismic Hydrocarbon Reservoir Parameter Estimation Using Production Data and Time-Lapse Seismic PROEFSCHRIFT ter verkrijging

More information

A Note on the Particle Filter with Posterior Gaussian Resampling

A Note on the Particle Filter with Posterior Gaussian Resampling Tellus (6), 8A, 46 46 Copyright C Blackwell Munksgaard, 6 Printed in Singapore. All rights reserved TELLUS A Note on the Particle Filter with Posterior Gaussian Resampling By X. XIONG 1,I.M.NAVON 1,2 and

More information

Vasil Khalidov & Miles Hansard. C.M. Bishop s PRML: Chapter 5; Neural Networks

Vasil Khalidov & Miles Hansard. C.M. Bishop s PRML: Chapter 5; Neural Networks C.M. Bishop s PRML: Chapter 5; Neural Networks Introduction The aim is, as before, to find useful decompositions of the target variable; t(x) = y(x, w) + ɛ(x) (3.7) t(x n ) and x n are the observations,

More information

Dynamic Data Driven Simulations in Stochastic Environments

Dynamic Data Driven Simulations in Stochastic Environments Computing 77, 321 333 (26) Digital Object Identifier (DOI) 1.17/s67-6-165-3 Dynamic Data Driven Simulations in Stochastic Environments C. Douglas, Lexington, Y. Efendiev, R. Ewing, College Station, V.

More information

Performance Prediction of a Reservoir Under Gas. Injection, Using Output Error Model

Performance Prediction of a Reservoir Under Gas. Injection, Using Output Error Model Contemporary Engineering Sciences, Vol. 9, 2016, no. 30, 1479-1489 HIKARI Ltd, www.m-hikari.com http://dx.doi.org/10.12988/ces.2016.66116 Performance Prediction of a Reservoir Under Gas Injection, Using

More information

DELFT UNIVERSITY OF TECHNOLOGY

DELFT UNIVERSITY OF TECHNOLOGY DELFT UNIVERSITY OF TECHNOLOGY REPORT 8-6 Comparison of the Ensemble Kalman Filter and a modified Representer Method for sensitivity to prior data J.R. Rommelse, J.D. Jansen, A.W. Heemink, F. Wilschut

More information

Development of Stochastic Artificial Neural Networks for Hydrological Prediction

Development of Stochastic Artificial Neural Networks for Hydrological Prediction Development of Stochastic Artificial Neural Networks for Hydrological Prediction G. B. Kingston, M. F. Lambert and H. R. Maier Centre for Applied Modelling in Water Engineering, School of Civil and Environmental

More information

Building Blocks for Direct Sequential Simulation on Unstructured Grids

Building Blocks for Direct Sequential Simulation on Unstructured Grids Building Blocks for Direct Sequential Simulation on Unstructured Grids Abstract M. J. Pyrcz (mpyrcz@ualberta.ca) and C. V. Deutsch (cdeutsch@ualberta.ca) University of Alberta, Edmonton, Alberta, CANADA

More information

Maximum Likelihood Ensemble Filter Applied to Multisensor Systems

Maximum Likelihood Ensemble Filter Applied to Multisensor Systems Maximum Likelihood Ensemble Filter Applied to Multisensor Systems Arif R. Albayrak a, Milija Zupanski b and Dusanka Zupanski c abc Colorado State University (CIRA), 137 Campus Delivery Fort Collins, CO

More information

Combining geological surface data and geostatistical model for Enhanced Subsurface geological model

Combining geological surface data and geostatistical model for Enhanced Subsurface geological model Combining geological surface data and geostatistical model for Enhanced Subsurface geological model M. Kurniawan Alfadli, Nanda Natasia, Iyan Haryanto Faculty of Geological Engineering Jalan Raya Bandung

More information

The Ensemble Kalman Filter:

The Ensemble Kalman Filter: p.1 The Ensemble Kalman Filter: Theoretical formulation and practical implementation Geir Evensen Norsk Hydro Research Centre, Bergen, Norway Based on Evensen 23, Ocean Dynamics, Vol 53, No 4 p.2 The Ensemble

More information

Advanced analysis and modelling tools for spatial environmental data. Case study: indoor radon data in Switzerland

Advanced analysis and modelling tools for spatial environmental data. Case study: indoor radon data in Switzerland EnviroInfo 2004 (Geneva) Sh@ring EnviroInfo 2004 Advanced analysis and modelling tools for spatial environmental data. Case study: indoor radon data in Switzerland Mikhail Kanevski 1, Michel Maignan 1

More information

Geostatistical Determination of Production Uncertainty: Application to Firebag Project

Geostatistical Determination of Production Uncertainty: Application to Firebag Project Geostatistical Determination of Production Uncertainty: Application to Firebag Project Abstract C. V. Deutsch, University of Alberta (cdeutsch@civil.ualberta.ca) E. Dembicki and K.C. Yeung, Suncor Energy

More information

arxiv: v1 [physics.ao-ph] 23 Jan 2009

arxiv: v1 [physics.ao-ph] 23 Jan 2009 A Brief Tutorial on the Ensemble Kalman Filter Jan Mandel arxiv:0901.3725v1 [physics.ao-ph] 23 Jan 2009 February 2007, updated January 2009 Abstract The ensemble Kalman filter EnKF) is a recursive filter

More information

Lagrangian data assimilation for point vortex systems

Lagrangian data assimilation for point vortex systems JOT J OURNAL OF TURBULENCE http://jot.iop.org/ Lagrangian data assimilation for point vortex systems Kayo Ide 1, Leonid Kuznetsov 2 and Christopher KRTJones 2 1 Department of Atmospheric Sciences and Institute

More information

Earth models for early exploration stages

Earth models for early exploration stages ANNUAL MEETING MASTER OF PETROLEUM ENGINEERING Earth models for early exploration stages Ângela Pereira PhD student angela.pereira@tecnico.ulisboa.pt 3/May/2016 Instituto Superior Técnico 1 Outline Motivation

More information

Learning Gaussian Process Models from Uncertain Data

Learning Gaussian Process Models from Uncertain Data Learning Gaussian Process Models from Uncertain Data Patrick Dallaire, Camille Besse, and Brahim Chaib-draa DAMAS Laboratory, Computer Science & Software Engineering Department, Laval University, Canada

More information

Immediate Reward Reinforcement Learning for Projective Kernel Methods

Immediate Reward Reinforcement Learning for Projective Kernel Methods ESANN'27 proceedings - European Symposium on Artificial Neural Networks Bruges (Belgium), 25-27 April 27, d-side publi., ISBN 2-9337-7-2. Immediate Reward Reinforcement Learning for Projective Kernel Methods

More information

Addressing the nonlinear problem of low order clustering in deterministic filters by using mean-preserving non-symmetric solutions of the ETKF

Addressing the nonlinear problem of low order clustering in deterministic filters by using mean-preserving non-symmetric solutions of the ETKF Addressing the nonlinear problem of low order clustering in deterministic filters by using mean-preserving non-symmetric solutions of the ETKF Javier Amezcua, Dr. Kayo Ide, Dr. Eugenia Kalnay 1 Outline

More information

A new Hierarchical Bayes approach to ensemble-variational data assimilation

A new Hierarchical Bayes approach to ensemble-variational data assimilation A new Hierarchical Bayes approach to ensemble-variational data assimilation Michael Tsyrulnikov and Alexander Rakitko HydroMetCenter of Russia College Park, 20 Oct 2014 Michael Tsyrulnikov and Alexander

More information

Sequential Simulations of Mixed Discrete-Continuous Properties: Sequential Gaussian Mixture Simulation

Sequential Simulations of Mixed Discrete-Continuous Properties: Sequential Gaussian Mixture Simulation Sequential Simulations of Mixed Discrete-Continuous Properties: Sequential Gaussian Mixture Simulation Dario Grana, Tapan Mukerji, Laura Dovera, and Ernesto Della Rossa Abstract We present here a method

More information

Relevance Vector Machines for Earthquake Response Spectra

Relevance Vector Machines for Earthquake Response Spectra 2012 2011 American American Transactions Transactions on on Engineering Engineering & Applied Applied Sciences Sciences. American Transactions on Engineering & Applied Sciences http://tuengr.com/ateas

More information

Data assimilation using Bayesian filters and B- spline geological models

Data assimilation using Bayesian filters and B- spline geological models Journal of Physics: Conference Series Data assimilation using Bayesian filters and B- spline geological models To cite this article: Lian Duan et al 211 J. Phys.: Conf. Ser. 29 124 View the article online

More information

Abstract. 1 Introduction. Cointerpretation of Flow Rate-Pressure-Temperature Data from Permanent Downhole Gauges. Deconvolution. Breakpoint detection

Abstract. 1 Introduction. Cointerpretation of Flow Rate-Pressure-Temperature Data from Permanent Downhole Gauges. Deconvolution. Breakpoint detection Cointerpretation of Flow Rate-Pressure-Temperature Data from Permanent Downhole Gauges CS 229 Course Final Report Chuan Tian chuant@stanford.edu Yue Li yuel@stanford.edu Abstract This report documents

More information

Relative Merits of 4D-Var and Ensemble Kalman Filter

Relative Merits of 4D-Var and Ensemble Kalman Filter Relative Merits of 4D-Var and Ensemble Kalman Filter Andrew Lorenc Met Office, Exeter International summer school on Atmospheric and Oceanic Sciences (ISSAOS) "Atmospheric Data Assimilation". August 29

More information

Asymptotics, streamlines, and reservoir modeling: A pathway to production tomography

Asymptotics, streamlines, and reservoir modeling: A pathway to production tomography Asymptotics, streamlines, and reservoir modeling: A pathway to production tomography D. W. VASCO, University of California Berkeley, U.S. AKHIL DATTA-GUPTA, Texas A&M University, College Station, U.S.

More information

A013 HISTORY MATCHING WITH RESPECT TO RESERVOIR STRUCTURE

A013 HISTORY MATCHING WITH RESPECT TO RESERVOIR STRUCTURE A3 HISTORY MATCHING WITH RESPECT TO RESERVOIR STRUCTURE SIGURD IVAR AANONSEN ; ODDVAR LIA ; AND OLE JAKOB ARNTZEN Centre for Integrated Research, University of Bergen, Allégt. 4, N-7 Bergen, Norway Statoil

More information

Multiple realizations using standard inversion techniques a

Multiple realizations using standard inversion techniques a Multiple realizations using standard inversion techniques a a Published in SEP report, 105, 67-78, (2000) Robert G Clapp 1 INTRODUCTION When solving a missing data problem, geophysicists and geostatisticians

More information

COMS 4721: Machine Learning for Data Science Lecture 19, 4/6/2017

COMS 4721: Machine Learning for Data Science Lecture 19, 4/6/2017 COMS 4721: Machine Learning for Data Science Lecture 19, 4/6/2017 Prof. John Paisley Department of Electrical Engineering & Data Science Institute Columbia University PRINCIPAL COMPONENT ANALYSIS DIMENSIONALITY

More information

Uncertainty analysis for the integration of seismic and CSEM data Myoung Jae Kwon & Roel Snieder, Center for Wave Phenomena, Colorado School of Mines

Uncertainty analysis for the integration of seismic and CSEM data Myoung Jae Kwon & Roel Snieder, Center for Wave Phenomena, Colorado School of Mines Myoung Jae Kwon & Roel Snieder, Center for Wave Phenomena, Colorado School of Mines Summary Geophysical inverse problems consist of three stages: the forward problem, optimization, and appraisal. We study

More information

Updating of Uncertainty in Fractured Reservoirs driven by Geological Scenarios

Updating of Uncertainty in Fractured Reservoirs driven by Geological Scenarios Stanford Center for Reservoir Forecasting, 26th annual meeting, 8 May 2013 Updating of Uncertainty in Fractured Reservoirs driven by Geological Scenarios Andre Jung, Jef Caers, Stanford University Darryl

More information

Assessing uncertainty on Net-to-gross at the Appraisal Stage: Application to a West Africa Deep-Water Reservoir

Assessing uncertainty on Net-to-gross at the Appraisal Stage: Application to a West Africa Deep-Water Reservoir Assessing uncertainty on Net-to-gross at the Appraisal Stage: Application to a West Africa Deep-Water Reservoir Amisha Maharaja April 25, 2006 Abstract A large data set is available from a deep-water reservoir

More information