Similar Documents
Found 20 similar documents.
1.
We present a methodology that allows conditioning the spatial distribution of geological and petrophysical properties of reservoir model realizations on available production data. The approach is fully consistent with modern concepts depicting natural reservoirs as composite media, where the distributions of both lithological units (or facies) and associated attributes are modeled as stochastic processes of space. We represent the uncertain spatial distribution of the facies through a Markov mesh (MM) model, which allows describing complex and detailed facies geometries in a rigorous Bayesian framework. The latter is then embedded within a history matching workflow based on an iterative form of the ensemble Kalman filter (EnKF). We test the proposed methodology by way of a synthetic study characterized by the presence of two distinct facies. We analyze the accuracy and computational efficiency of our algorithm and its ability, relative to the standard EnKF, to properly estimate model parameters and assess future reservoir production. We show the feasibility of integrating MM in a data assimilation scheme. Our methodology is conducive to a set of updated model realizations characterized by a realistic spatial distribution of facies and their log permeabilities. Model realizations updated through our proposed algorithm correctly capture the production dynamics.
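The EnKF analysis step at the core of such workflows can be sketched as follows. This is a minimal sketch assuming a linear observation operator; in the paper the predicted data come from a flow simulator and the update is applied iteratively, so the function and variable names here are illustrative only.

```python
import numpy as np

def enkf_update(ensemble, d_obs, H, R, rng):
    """One stochastic EnKF analysis step on an ensemble of realizations.

    ensemble : (n_param, n_ens) prior realizations (columns are members)
    d_obs    : (n_obs,) observed data
    H        : (n_obs, n_param) linear observation operator (simplification)
    R        : (n_obs, n_obs) observation-error covariance
    """
    n_ens = ensemble.shape[1]
    A = ensemble - ensemble.mean(axis=1, keepdims=True)   # state anomalies
    HX = H @ ensemble
    HA = HX - HX.mean(axis=1, keepdims=True)              # predicted-data anomalies
    C_dd = HA @ HA.T / (n_ens - 1) + R                    # data covariance
    C_md = A @ HA.T / (n_ens - 1)                         # state-data covariance
    K = np.linalg.solve(C_dd, C_md.T).T                   # Kalman gain
    # Perturb the observations once per member (stochastic EnKF)
    D = d_obs[:, None] + rng.multivariate_normal(
        np.zeros_like(d_obs), R, n_ens).T
    return ensemble + K @ (D - HX)
```

Each member is pulled toward its own perturbed copy of the data, so the updated ensemble spread approximates the posterior uncertainty.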

2.
3.
Seismic inverse modeling, which transforms appropriately processed geophysical data into the physical properties of the Earth, is an essential process for reservoir characterization. This paper proposes a workflow based on a Markov chain Monte Carlo method consistent with geology, well logs, seismic data, and rock-physics information. It uses direct sampling as a multiple-point geostatistical method for generating realizations from the prior distribution, and Metropolis sampling with adaptive spatial resampling to perform an approximate sampling from the posterior distribution, conditioned to the geophysical data. Because it can assess important uncertainties, sampling is a more general approach than just finding the most likely model. However, since rejection sampling requires a large number of evaluations to generate the posterior distribution, it is inefficient and not suitable for reservoir modeling. Metropolis sampling can perform an equivalent sampling by forming a Markov chain. The iterative spatial resampling algorithm perturbs realizations of a spatially dependent variable while preserving its spatial structure, by conditioning to subset points. However, in most practical applications, when the subset conditioning points are selected at random, the chain can get stuck for a very long time in a non-optimal local minimum. In this paper it is demonstrated that adaptive subset sampling improves the efficiency of iterative spatial resampling. Depending on the acceptance/rejection criteria, it is possible to obtain a chain of geostatistical realizations aimed at characterizing the posterior distribution with Metropolis sampling. The validity and applicability of the proposed method are illustrated by results for seismic lithofacies inversion on the Stanford VI synthetic test sets.
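The Metropolis step with resampling proposals can be sketched in a toy form. Here the prior is assumed i.i.d. standard normal, so "conditioning to subset points" reduces to keeping a random subset fixed and redrawing the rest from the prior; with a prior proposal the Metropolis-Hastings ratio reduces to the likelihood ratio. The uniform (non-adaptive) subset selection and all names are illustrative, not the paper's geostatistical machinery.

```python
import numpy as np

def isr_metropolis(log_like, x0, n_iter, keep_frac, rng):
    """Metropolis chain whose proposal keeps a random subset of the current
    realization fixed and resamples the remaining sites from the prior.
    With a prior proposal, the acceptance ratio is the likelihood ratio."""
    x = x0.copy()
    chain = np.empty((n_iter + 1, x.size))
    chain[0] = x
    for it in range(1, n_iter + 1):
        keep = rng.random(x.size) < keep_frac        # subset conditioning points
        prop = np.where(keep, x, rng.standard_normal(x.size))
        # Accept/reject on the likelihood ratio (prior terms cancel)
        if np.log(rng.random()) < log_like(prop) - log_like(x):
            x = prop
        chain[it] = x
    return chain
```

Adaptive variants choose the retained subset based on the local data misfit rather than uniformly at random, which is what the paper shows improves efficiency.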

4.
In history matching of a lithofacies reservoir model, we attempt to find multiple realizations of the lithofacies configuration that are conditional to dynamic data and representative of the model uncertainty space. This problem can be formalized in the Bayesian framework. Given a truncated Gaussian model as a prior and the dynamic data with their associated measurement error, we want to sample from the conditional distribution of the facies given the data. A relevant way to generate conditioned realizations is to use Markov chain Monte Carlo (MCMC). However, the dimensions of the model and the computational cost of each iteration are two important pitfalls for the use of MCMC. Furthermore, classical MCMC algorithms mix slowly; that is, they will not explore the whole support of the posterior in the time of the simulation. In this paper, we extend the methodology already described in a previous work to the problem of history matching of a Gaussian-related lithofacies reservoir model. We first show how to drastically reduce the dimension of the problem by using a truncated Karhunen-Loève expansion of the Gaussian random field underlying the lithofacies model. Moreover, we propose an innovative criterion for choosing the number of components, based on the connexity function. Then, we show how we improve the mixing properties of a classical single MCMC chain, without increasing the global computational cost, by using parallel interacting Markov chains. Applying the dimension reduction and this innovative sampling method drastically lowers the number of iterations needed to sample efficiently from the posterior. We show the encouraging results obtained when applying the methodology to a synthetic history-matching case.
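The truncated Karhunen-Loève reduction can be sketched from a covariance matrix on a grid: keep the leading eigenpairs and parameterize realizations by a handful of i.i.d. standard-normal coefficients. The exponential covariance below is an assumed example, and the paper's connexity-based criterion for choosing the number of terms is not reproduced.

```python
import numpy as np

def kl_basis(cov, n_terms):
    """Leading n_terms of the Karhunen-Loeve expansion of a zero-mean
    Gaussian field with covariance matrix `cov`."""
    vals, vecs = np.linalg.eigh(cov)
    order = np.argsort(vals)[::-1][:n_terms]
    return np.sqrt(np.maximum(vals[order], 0.0)), vecs[:, order]

def kl_realization(sd, basis, xi):
    """Field realization from n_terms i.i.d. N(0,1) coefficients xi."""
    return basis @ (sd * xi)

# Example: exponential covariance on a 1-D grid (illustrative choice)
grid = np.arange(100)
cov = np.exp(-np.abs(grid[:, None] - grid[None, :]) / 15.0)
sd, basis = kl_basis(cov, 10)
```

History matching then operates on the 10 coefficients instead of the 100 grid values, which is the dimension reduction the abstract describes.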

5.
Uncertainty quantification for subsurface flow problems is typically accomplished through model-based inversion procedures in which multiple posterior (history-matched) geological models are generated and used for flow predictions. These procedures can be demanding computationally, however, and it is not always straightforward to maintain geological realism in the resulting history-matched models. In some applications, it is the flow predictions themselves (and the uncertainty associated with these predictions), rather than the posterior geological models, that are of primary interest. This is the motivation for the data-space inversion (DSI) procedure developed in this paper. In the DSI approach, an ensemble of prior model realizations, honoring prior geostatistical information and hard data at wells, is generated and then (flow) simulated. The resulting production data are assembled into data vectors that represent prior 'realizations' in the data space. Pattern-based mapping operations and principal component analysis are applied to transform non-Gaussian data variables into lower-dimensional variables that are closer to multivariate Gaussian. The data-space inversion is posed within a Bayesian framework, and a data-space randomized maximum likelihood method is introduced to sample the conditional distribution of data variables given observed data. Extensive numerical results are presented for two example cases involving oil–water flow in a bimodal channelized system and oil–water–gas flow in a Gaussian permeability system. For both cases, DSI results for uncertainty quantification (e.g., P10, P50, P90 posterior predictions) are compared with those obtained from a strict rejection sampling (RS) procedure. Close agreement between the DSI and RS results is consistently achieved, even when the (synthetic) true data to be matched fall near the edge of the prior distribution. Computational savings using DSI are very substantial in that RS requires O(10^5) to O(10^6) flow simulations, in contrast to 500 for DSI, for the cases considered.
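The data-space dimension reduction step can be sketched with plain PCA on the prior data vectors. This is a minimal sketch: the paper additionally applies pattern-based mapping before PCA, and the function names are illustrative.

```python
import numpy as np

def pca_fit(data, n_comp):
    """PCA of prior data 'realizations': rows of `data` are simulated
    production-data vectors; returns the mean and leading components."""
    mean = data.mean(axis=0)
    _, _, Vt = np.linalg.svd(data - mean, full_matrices=False)
    return mean, Vt[:n_comp]

def pca_project(data, mean, comps):
    """Map full data vectors to low-dimensional data variables."""
    return (data - mean) @ comps.T

def pca_reconstruct(scores, mean, comps):
    """Map low-dimensional data variables back to full data vectors."""
    return mean + scores @ comps
```

The randomized maximum likelihood sampling of the abstract then operates on the low-dimensional scores rather than on the raw production time series.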

6.
Bayesian lithology/fluid inversion—comparison of two algorithms
Algorithms for inversion of seismic prestack AVO data into lithology-fluid classes in a vertical profile are evaluated. The inversion is defined in a Bayesian setting where the prior model for the lithology-fluid classes is a Markov chain, and the likelihood model relates seismic data and elastic material properties to these classes. The likelihood model is approximated such that the posterior model can be calculated recursively using the extremely efficient forward–backward algorithm. The impact of the approximation in the likelihood model is evaluated empirically by comparing results from the approximate approach with results generated from the exact posterior model. The exact posterior is assessed by sampling using a sophisticated Markov chain Monte Carlo simulation algorithm. The simulation algorithm is iterative, and it requires considerable computer resources. Seven realistic evaluation models are defined, from which synthetic seismic data are generated. Using identical seismic data, the approximate marginal posterior is calculated and the exact marginal posterior is assessed. It is concluded that the approximate likelihood model preserves 50% to 90% of the information content in the exact likelihood model.
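The recursive posterior computation rests on the standard forward-backward algorithm for a hidden Markov chain. A minimal sketch follows; the class labels, transition matrix, and likelihood table are illustrative stand-ins for the paper's calibrated lithology-fluid models.

```python
import numpy as np

def forward_backward(trans, prior0, like):
    """Marginal posteriors P(class_t | all data) for a Markov-chain prior.

    trans  : (K, K) transition matrix (rows sum to 1)
    prior0 : (K,) initial class probabilities
    like   : (T, K) per-depth likelihoods p(d_t | class = k)
    """
    T, K = like.shape
    alpha = np.zeros((T, K))
    beta = np.ones((T, K))
    alpha[0] = prior0 * like[0]
    alpha[0] /= alpha[0].sum()
    for t in range(1, T):                      # forward pass
        alpha[t] = (alpha[t - 1] @ trans) * like[t]
        alpha[t] /= alpha[t].sum()
    for t in range(T - 2, -1, -1):             # backward pass
        beta[t] = trans @ (beta[t + 1] * like[t + 1])
        beta[t] /= beta[t].sum()
    post = alpha * beta
    return post / post.sum(axis=1, keepdims=True)
```

The two passes cost O(TK^2), which is why this recursion is so much cheaper than the MCMC assessment of the exact posterior.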

7.
Histograms of observations from spatial phenomena are often found to be more heavy-tailed than Gaussian distributions, which makes the Gaussian random field model unsuited. A T-distributed random field model with heavy-tailed marginal probability density functions is defined. The model is a generalization of the familiar Student-T distribution, and it may be given a Bayesian interpretation. The increased variability appears across realizations rather than within realizations, since all realizations are Gaussian-like, with variance varying between realizations. The T-distributed random field model is analytically tractable, and the conditional model is developed, which provides algorithms for conditional simulation and prediction, so-called T-kriging. The model compares favourably with most previously defined random field models. The Gaussian random field model appears as a special, limiting case of the T-distributed random field model. The model is particularly useful whenever multiple, sparsely sampled realizations of the random field are available, and it is clearly preferable to the Gaussian model in this case. The properties of the T-distributed random field model are demonstrated on well log observations from the Gullfaks field in the North Sea. The predictions correspond to traditional kriging predictions, while the associated prediction variances are more representative, as they are layer specific and include the uncertainty caused by using variance estimates.
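The cross-realization behaviour described above can be sketched as a Gaussian scale mixture: each realization is a Gaussian field multiplied by one random scale drawn per realization, which gives Student-t marginals. The covariance and degrees of freedom below are illustrative assumptions, not the model fitted in the paper.

```python
import numpy as np

def t_field_realizations(cov, dof, n_real, rng):
    """Realizations of a t-distributed random field as a scale mixture:
    Gaussian within each realization, with variance varying between
    realizations via an inverse-chi-square mixing variable."""
    L = np.linalg.cholesky(cov)
    n = cov.shape[0]
    fields = np.empty((n_real, n))
    for k in range(n_real):
        scale2 = dof / rng.chisquare(dof)    # one scale per realization
        fields[k] = np.sqrt(scale2) * (L @ rng.standard_normal(n))
    return fields
```

As dof grows, the mixing variable concentrates at 1 and the Gaussian random field is recovered as the limiting case, exactly as the abstract states.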

8.
The spatial continuity of facies is one of the key factors controlling flow in reservoir models. Traditional pixel-based methods such as truncated Gaussian random fields and indicator simulation are based on only two-point statistics, which is insufficient to capture complex facies structures. Current methods for multi-point statistics either lack a consistent statistical model specification or are too computer intensive to be applicable. We propose a Markov mesh model based on generalized linear models for geological facies modeling. The approach defines a consistent statistical model that is facilitated by efficient estimation of model parameters and generation of realizations. Our presentation includes a formulation of the general framework, model specifications in two and three dimensions, and details on how the parameters can be estimated from a training image. We illustrate the method using multiple training images, including binary and trinary images and simulations in two and three dimensions. We also provide a thorough comparison with the snesim approach. We find that the current model formulation is applicable for multiple training images and compares favorably to the snesim approach in our test examples. The method is highly memory efficient.
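A unilateral (Markov mesh) simulation with a logistic GLM can be sketched as follows: each cell is drawn sequentially given its already-visited neighbours. The two-neighbour template and the coefficient values are illustrative assumptions, not parameters estimated from a training image as in the paper.

```python
import numpy as np

def simulate_markov_mesh(nrow, ncol, beta, rng):
    """Unilateral (Markov mesh) binary facies simulation: each cell depends
    on its left and upper neighbours through a logistic GLM.

    beta = (intercept, b_left, b_up) are illustrative coefficients."""
    x = np.zeros((nrow, ncol), dtype=int)
    for i in range(nrow):
        for j in range(ncol):
            left = x[i, j - 1] if j > 0 else 0
            up = x[i - 1, j] if i > 0 else 0
            eta = beta[0] + beta[1] * left + beta[2] * up
            p = 1.0 / (1.0 + np.exp(-eta))   # logistic link
            x[i, j] = rng.random() < p
    return x
```

Because each cell is visited once in raster order, realizations are generated in a single sweep, which is what makes the approach fast and memory efficient.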

9.
10.

Minimization of a stochastic cost function is commonly used for approximate sampling in high-dimensional Bayesian inverse problems with Gaussian prior distributions and multimodal posterior distributions. The density of the samples generated by minimization is not the desired target density, unless the observation operator is linear, but the distribution of samples is useful as a proposal density for importance sampling or for Markov chain Monte Carlo methods. In this paper, we focus on applications to sampling from multimodal posterior distributions in high dimensions. We first show that sampling from multimodal distributions is improved by computing all critical points instead of only minimizers of the objective function. For applications to high-dimensional geoscience inverse problems, we demonstrate an efficient approximate weighting that uses a low-rank Gauss-Newton approximation of the determinant of the Jacobian. The method is applied to two toy problems with known posterior distributions and a Darcy flow problem with multiple modes in the posterior.
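For a linear observation operator the minimizer of the stochastic cost function has a closed form and its samples are exact posterior samples; a minimal sketch follows (names illustrative). The paper's point is precisely that for nonlinear operators these minimizers only define a proposal density that must then be weighted.

```python
import numpy as np

def minimize_stochastic_cost(G, d_obs, m_pr, C_m, C_d, rng):
    """Minimize the stochastic cost
        ||m - m'||^2_{C_m} + ||G m - d'||^2_{C_d}
    for a perturbed prior mean m' and perturbed data d' (randomized
    maximum likelihood). For linear G the minimizer is closed form."""
    m_pert = rng.multivariate_normal(m_pr, C_m)
    d_pert = rng.multivariate_normal(d_obs, C_d)
    K = C_m @ G.T @ np.linalg.inv(G @ C_m @ G.T + C_d)
    return m_pert + K @ (d_pert - G @ m_pert)
```

In the nonlinear, multimodal setting the abstract describes, one would instead locate all critical points of the cost and weight the resulting samples, e.g. with the low-rank Gauss-Newton determinant approximation.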


11.
Building models in the Earth Sciences often requires the solution of an inverse problem: some unknown model parameters need to be calibrated with actual measurements. In most cases, the set of measurements cannot completely and uniquely determine the model parameters; hence multiple models can describe the same data set. Bayesian inverse theory provides a framework for solving this problem. Bayesian methods rely on the fact that the conditional probability of the model parameters given the data (the posterior) is proportional to the likelihood of observing the data times a prior belief expressed as a prior distribution of the model parameters. When the prior distribution is not Gaussian and the relation between data and parameters (the forward model) is strongly non-linear, one has to resort to iterative samplers, often Markov chain Monte Carlo methods, to generate samples that fit the data likelihood and reflect the prior model statistics. While theoretically sound, such methods can be slow to converge and are often impractical when the forward model is CPU-demanding. In this paper, we propose a new sampling method that allows sampling from a variety of priors and conditioning model parameters to a variety of data types. The method does not rely on the traditional Bayesian decomposition of the posterior into likelihood and prior; instead it uses so-called pre-posterior distributions, i.e., the probability of the model parameters given some subset of the data. The use of pre-posteriors allows decomposing the data into so-called "easy data" (or linear data) and "difficult data" (or non-linear data). The method relies on fast non-iterative sequential simulation to generate model realizations. The difficult data are matched by perturbing an initial realization using a perturbation mechanism termed "probability perturbation." The probability perturbation method moves the initial guess closer to matching the difficult data, while maintaining the prior model statistics and the conditioning to the linear data. Several examples are used to illustrate the properties of this method.

12.
The chemical zoning profile in metamorphic minerals is often used to deduce the pressure–temperature (PT) history of a rock. However, it remains difficult to restore detailed paths from zoned minerals because thermobarometric evaluation of metamorphic conditions involves several uncertainties, including measurement errors and geological noise. We propose a new stochastic framework for estimating precise PT paths from a chemical zoning structure using the Markov random field (MRF) model, a Bayesian stochastic method that is often applied to image analysis. The continuity of pressure and temperature during mineral growth is incorporated through Gaussian Markov chains as prior probabilities in order to apply the MRF model to the PT path inversion. The most probable PT path can be obtained by maximizing the posterior probability of the sequential set of P and T given the observed compositions of the zoned minerals. Synthetic PT inversion tests were conducted in order to investigate the effectiveness and validity of the proposed model, using zoned Mg–Fe–Ca garnet in the divariant KNCFMASH system. In the present study, the steepest descent method was implemented in order to maximize the posterior probability, using the Markov chain Monte Carlo algorithm. The proposed method successfully reproduced the detailed shape of the synthetic PT path, appropriately eliminating the statistical compositional noise without operator subjectivity or prior knowledge. It was also used to simultaneously evaluate the uncertainty of pressure, temperature, and mineral compositions for all measurement points. Owing to its Bayesian approach and flexible formalism, the MRF method has the potential to deal with several geological uncertainties that cause cumbersome systematic errors, so it constitutes a potentially powerful tool for various inverse problems in petrology.
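The Gaussian Markov-chain (smoothness) prior used for each of P and T can be sketched in one dimension: under i.i.d. Gaussian noise, the MAP path is a penalized least-squares solve. The penalty weight and the direct linear solve, used here in place of the paper's steepest-descent/MCMC scheme, are illustrative simplifications.

```python
import numpy as np

def map_smooth_path(y, lam):
    """MAP estimate of a sequence under a Gaussian random-walk prior:
    minimize ||x - y||^2 + lam * sum_t (x_{t+1} - x_t)^2."""
    n = len(y)
    D = np.diff(np.eye(n), axis=0)                  # first-difference operator
    return np.linalg.solve(np.eye(n) + lam * D.T @ D, y)
```

Larger lam encodes a stronger belief that P and T evolve continuously during mineral growth, trading fidelity to the noisy thermobarometric estimates for path smoothness.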

13.
Most approaches in statistical spatial prediction assume that the spatial data are realizations of a Gaussian random field. However, this assumption is hard to justify for most applications. When the distribution of the data is skewed but otherwise has properties similar to those of the normal distribution, a closed skew normal distribution can be used to model the skewness. The closed skew normal distribution is an extension of the multivariate skew normal distribution and has the advantage of being closed under marginalization and conditioning. In this paper, we generalize Bayesian prediction methods using closed skew normal distributions. A simulation study is performed to check the validity of the model and the performance of the Bayesian spatial predictor. Finally, our prediction method is applied to Bayesian spatial prediction of the strain data near Semnan, Iran. On these data, the closed skew Gaussian model improves the cross-validation mean-square error.

14.
Based on the algorithm for gradual deformation of Gaussian stochastic models, we propose, in this paper, an extension of this method to gradually deform realizations generated by sequential, not necessarily Gaussian, simulation. As in the Gaussian case, gradual deformation of a sequential simulation preserves the spatial variability of the stochastic model and in general yields a regular objective function that can be minimized by an efficient optimization algorithm (e.g., a gradient-based algorithm). Furthermore, we discuss local gradual deformation and gradual deformation with respect to the structural parameters (mean, variance, variogram range, etc.) of realizations generated by sequential simulation. Local gradual deformation may significantly improve calibration speed in the case where observations are scattered in different zones of a field. Gradual deformation with respect to structural parameters is necessary when these parameters cannot be inferred a priori and need to be determined using an inverse procedure. A synthetic example inspired by a real oil field is presented to illustrate different aspects of this approach. Results from this case study demonstrate the efficiency of the gradual deformation approach for constraining facies models generated by sequential indicator simulation. They also show the potential applicability of the proposed approach to complex real cases.
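The core Gaussian combination behind gradual deformation is two lines of code: any cosine/sine combination of two independent standard-Gaussian realizations is again standard Gaussian, so the deformation parameter t can be optimized without degrading spatial statistics. This sketch covers only the Gaussian case; the paper's sequential, non-Gaussian extension is not reproduced here.

```python
import numpy as np

def gradual_deformation(u1, u2, t):
    """Combine two independent N(0,1) realizations; for every t the result
    is again N(0,1), so the model statistics are preserved while t sweeps
    a continuous chain of realizations for calibration."""
    return np.cos(np.pi * t) * u1 + np.sin(np.pi * t) * u2
```

Calibration then becomes a one-dimensional search over t per iteration, with a fresh independent realization u2 drawn each time the search stalls.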

15.
An adequate representation of the detailed spatial variation of subsurface parameters for underground flow and mass transport simulation entails heterogeneous models. Uncertainty characterization generally calls for a Monte Carlo analysis of many equally likely realizations that honor both direct information (e.g., conductivity data) and information about the state of the system (e.g., piezometric head or concentration data). Thus, the problem faced is how to generate multiple realizations conditioned to parameter data and inverse-conditioned to dependent state data. We propose using a Markov chain Monte Carlo (MCMC) approach with block updating, combined with upscaling, to achieve this purpose. Our proposal presents an alternative block updating scheme that permits the application of MCMC to inverse stochastic simulation of heterogeneous fields and incorporates upscaling in a multi-grid approach to speed up the generation of the realizations. The main advantage of MCMC, compared to other methods capable of generating inverse-conditioned realizations (such as the self-calibrating or the pilot point methods), is that it does not require the solution of a complex optimization inverse problem, although it requires the solution of the direct problem many times.

16.
In this paper we present an extension of the ensemble Kalman filter (EnKF) specifically designed for multimodal systems. The EnKF data assimilation scheme is less accurate when used to approximate systems with multimodal distributions, such as reservoir facies models. The algorithm is based on the assumption that both the prior and posterior distributions can be approximated by Gaussian mixtures, and it is validated by introducing the concept of finite ensemble representation. The effectiveness of the approach is shown with two applications. The first example is based on the Lorenz model. In the second example, the proposed methodology, combined with a localization technique, is used to update a 2D reservoir facies model. Both applications give evidence of an improved performance of the proposed method with respect to the EnKF.

17.
18.
Application of EM algorithms for seismic facies classification
Identification of the geological facies and their distribution from seismic and other available geological information is important during the early stage of reservoir development (e.g., deciding on initial well locations). Traditionally, this is done by manually inspecting the signatures of the seismic attribute maps, which is very time-consuming. This paper proposes an application of the Expectation-Maximization (EM) algorithm to automatically identify geological facies from seismic data. While the properties within a certain geological facies are relatively homogeneous, the properties between geological facies can be rather different. Assuming that the noisy seismic data of a geological facies, which reflect rock properties, can be approximated with a Gaussian distribution, the seismic data of a reservoir composed of several geological facies are samples from a Gaussian mixture model. The mean of each Gaussian component represents the average value of the seismic data within a facies, while the variance gives the variation of the seismic data within that facies. The proportions in the Gaussian mixture model represent the relative volumes of the different facies in the reservoir. In this setting, the facies classification problem becomes one of estimating the parameters defining the Gaussian mixture model. The EM algorithm has long been used to estimate Gaussian mixture model parameters. As the standard EM algorithm does not consider spatial relationships among data, it can generate spatially scattered seismic facies, which is physically unrealistic. We improve the standard EM algorithm by adding a spatial constraint to enhance the spatial continuity of the estimated geological facies. By applying the EM algorithms to acoustic impedance and Poisson's ratio data for two synthetic examples, we are able to identify the facies distribution.
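The estimation step can be sketched with plain EM for a one-dimensional Gaussian mixture, without the spatial constraint the paper adds. The quantile initialization and all names are illustrative choices for this sketch.

```python
import numpy as np

def em_gmm(x, n_comp, n_iter):
    """Plain EM for a 1-D Gaussian mixture (no spatial constraint; the
    paper adds one so the estimated facies stay spatially contiguous)."""
    mu = np.quantile(x, np.linspace(0.1, 0.9, n_comp))  # spread-out init
    var = np.full(n_comp, x.var())
    w = np.full(n_comp, 1.0 / n_comp)
    for _ in range(n_iter):
        # E-step: responsibilities r[i, k] = P(component k | x_i)
        logp = (np.log(w) - 0.5 * np.log(2 * np.pi * var)
                - 0.5 * (x[:, None] - mu) ** 2 / var)
        r = np.exp(logp - logp.max(axis=1, keepdims=True))
        r /= r.sum(axis=1, keepdims=True)
        # M-step: update proportions, means, variances
        nk = r.sum(axis=0)
        w = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var
```

In the paper's setting, x would be a seismic attribute such as acoustic impedance, and the spatial constraint would smooth the responsibilities of neighbouring traces before the M-step.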

19.
On Modelling Discrete Geological Structures as Markov Random Fields
The purpose of this paper is to extend the locally based prediction methodology of BayMar to a global one by modelling discrete spatial structures as Markov random fields. BayMar uses one-dimensional Markov properties for estimating spatial correlation and Bayesian updating for locally integrating prior and additional information. The methodology of this paper introduces a new estimator of the field parameters based on the maximum likelihood technique for one-dimensional Markov chains. This makes the estimator straightforward to calculate even when there is a large amount of missing observations, which is often the case in geological applications. We make simulations (both unconditional and conditional on the observed data) and maximum a posteriori predictions (restorations) of the non-observed data using Markov chain Monte Carlo methods, in the restoration case by employing simulated annealing. The described method gives satisfactory predictions, while more work is needed on simulation, since the method appears to have a tendency to overestimate strong spatial dependence. It provides an important development compared to the BayMar methodology by facilitating global predictions and improved use of sparse data.

20.
The Bayesian framework is the standard approach for data assimilation in reservoir modeling. This framework involves characterizing the posterior distribution of geological parameters in terms of a given prior distribution and data from the reservoir dynamics, together with a forward model connecting the space of geological parameters to the data space. Since the posterior distribution quantifies the uncertainty in the geologic parameters of the reservoir, the characterization of the posterior is fundamental for the optimal management of reservoirs. Unfortunately, due to the large-scale, highly nonlinear properties of standard reservoir models, characterizing the posterior is computationally prohibitive. Instead, more affordable ad hoc techniques, based on Gaussian approximations, are often used for characterizing the posterior distribution. Evaluating the performance of those Gaussian approximations is typically conducted by assessing their ability to reproduce the truth within the confidence interval provided by the ad hoc technique under consideration. This has the disadvantage of mixing up the approximation properties of the history matching algorithm employed with the information content of the particular observations used, making it hard to evaluate the effect of the ad hoc approximations alone. In this paper, we avoid this disadvantage by comparing the ad hoc techniques with a fully resolved, state-of-the-art probing of the Bayesian posterior distribution. The ad hoc techniques whose performance we assess are based on (1) linearization around the maximum a posteriori estimate, (2) randomized maximum likelihood, and (3) ensemble Kalman filter-type methods. In order to fully resolve the posterior distribution, we implement a state-of-the-art Markov chain Monte Carlo (MCMC) method that scales well with respect to the dimension of the parameter space, enabling us to study realistic forward models, in two space dimensions, at a high level of grid refinement. Our implementation of the MCMC method provides the gold standard against which the aforementioned Gaussian approximations are assessed. We present numerical synthetic experiments in which we quantify the capability of each ad hoc Gaussian approximation in reproducing the mean and the variance of the posterior distribution (characterized via MCMC) associated with a data assimilation problem. Both single-phase and two-phase (oil–water) reservoir models are considered so that fundamental differences in the resulting forward operators are highlighted. The main objective of our controlled experiments is to exhibit the substantial discrepancies in the approximation properties of standard ad hoc Gaussian approximations. Numerical investigations of the type we present here will lead to a greater understanding of the cost-efficient, but ad hoc, Bayesian techniques used for data assimilation in petroleum reservoirs and hence ultimately to improved techniques with more accurate uncertainty quantification.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)