Similar Literature
10 similar documents found (search time: 130 ms)
1.
Stochastic spatial simulation allows the generation of multiple realizations of spatial variables. Due to the computational time required for evaluating the transfer function, uncertainty quantification over these multiple realizations often requires selecting a small subset of realizations. However, by selecting only a few realizations, one risks biasing the P10, P50, and P90 estimates relative to the original set of realizations. The objective of this study is to develop a methodology for quantifying confidence intervals on the estimated P10, P50, and P90 quantiles when only a few models are retained for response evaluation. We use the parametric bootstrap technique, which evaluates the variability of the statistics obtained from uncertainty quantification and constructs confidence intervals. Using this technique, we compare the confidence intervals produced by two selection methods: the traditional ranking technique and the distance-based kernel clustering technique (DKM). The DKM has been developed recently and has been shown to be effective in quantifying uncertainty. The methodology is demonstrated using two examples. The first is a synthetic example using bi-normal variables, which serves to illustrate the technique. The second is from an oil field in West Africa, where the uncertain variable is the cumulative oil production from 20 wells. The results show that, for the same number of transfer function evaluations, the DKM method yields equal or smaller errors and confidence intervals than ranking.
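As a rough sketch of the parametric bootstrap step, the snippet below fits a normal model to a handful of retained responses, resamples from the fitted model, and builds confidence intervals for the P10/P50/P90 quantiles. The data, subset size, and normality assumption are illustrative, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)

def parametric_bootstrap_ci(sample, q=(0.10, 0.50, 0.90), n_boot=2000, alpha=0.10):
    """Fit a normal model to the retained responses, resample from it, and
    bootstrap (1 - alpha) confidence intervals for the requested quantiles."""
    mu, sigma = sample.mean(), sample.std(ddof=1)
    boot = rng.normal(mu, sigma, size=(n_boot, sample.size))
    boot_q = np.quantile(boot, q, axis=1)            # shape (3, n_boot)
    lo = np.quantile(boot_q, alpha / 2, axis=1)
    hi = np.quantile(boot_q, 1 - alpha / 2, axis=1)
    return np.quantile(sample, q), lo, hi

# hypothetical responses (e.g. cumulative production) from 10 retained realizations
selected = rng.normal(100.0, 15.0, size=10)
est, lo, hi = parametric_bootstrap_ci(selected)
```

The width of `hi - lo` is what the paper compares between the ranking and DKM selection methods.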

2.
Seismic inverse modeling, which transforms appropriately processed geophysical data into the physical properties of the Earth, is an essential process for reservoir characterization. This paper proposes a work flow based on a Markov chain Monte Carlo method consistent with geology, well-logs, seismic data, and rock-physics information. It uses direct sampling as a multiple-point geostatistical method for generating realizations from the prior distribution, and Metropolis sampling with adaptive spatial resampling to perform an approximate sampling from the posterior distribution, conditioned to the geophysical data. Because it can assess important uncertainties, sampling is a more general approach than just finding the most likely model. However, since rejection sampling requires a large number of evaluations for generating the posterior distribution, it is inefficient and not suitable for reservoir modeling. Metropolis sampling is able to perform an equivalent sampling by forming a Markov chain. The iterative spatial resampling algorithm perturbs realizations of a spatially dependent variable, while preserving its spatial structure by conditioning to subset points. However, in most practical applications, when the subset conditioning points are selected at random, it can get stuck for a very long time in a non-optimal local minimum. In this paper it is demonstrated that adaptive subset sampling improves the efficiency of iterative spatial resampling. Depending on the acceptance/rejection criteria, it is possible to obtain a chain of geostatistical realizations aimed at characterizing the posterior distribution with Metropolis sampling. The validity and applicability of the proposed method are illustrated by results for seismic lithofacies inversion on the Stanford VI synthetic test sets.
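A minimal Metropolis chain with subset-resampling proposals can be sketched as follows. For simplicity the prior here is an independent standard normal (a real iterative spatial resampling step would redraw from a geostatistical prior that preserves spatial structure), and the data, noise level, and subset fraction are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 50
d_obs = np.full(n, 0.5)       # hypothetical "geophysical" observations
sigma_err = 1.0               # assumed observation error

def log_like(x):
    """Gaussian data-mismatch log-likelihood."""
    return -0.5 * np.sum((x - d_obs) ** 2) / sigma_err**2

def resample_proposal(x, frac_keep=0.9):
    """Iterative spatial resampling: keep a random subset of points as
    conditioning data and redraw the rest from the prior (independent
    standard normal here, for simplicity)."""
    keep = rng.random(n) < frac_keep
    prop = rng.standard_normal(n)
    prop[keep] = x[keep]
    return prop

x = rng.standard_normal(n)    # initial realization from the prior
ll = log_like(x)
accepted = 0
for _ in range(2000):
    xp = resample_proposal(x)
    llp = log_like(xp)
    if np.log(rng.random()) < llp - ll:   # Metropolis acceptance rule
        x, ll, accepted = xp, llp, accepted + 1
```

Keeping most points fixed (`frac_keep=0.9`) is what lets the chain move at all; resampling large subsets at random drives the acceptance rate toward zero, which is the stagnation problem adaptive subset selection addresses.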

3.
Having a large number of geostatistical simulations of a mineral or petroleum deposit provides a better idea of its upside potential and downside risk; however, large numbers of simulated realizations of a deposit may pose computational difficulties in subsequent decision-making phases. Hence, depending on the specific case, there can be a need to select a representative subset of conditionally simulated deposit realizations. This paper examines and extends an approach developed by the stochastic optimization community based on stochastic mathematical programming with recourse; it is discussed here in the context of mineral deposits, although it may also suit other earth science applications. The approach is based on measuring the "distance" between simulations: the distance measure introduced here between simulated realizations of a mineral deposit is based on the metal above a given set of cutoff grades when a pre-existing mine design is available. The approach is tested on 100 simulations of the Walker Lake data with promising results.
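The cutoff-based distance can be sketched as below; the grades, grid size, and cutoff values are invented for illustration, and the tonnage factors of a real mine design are omitted:

```python
import numpy as np

rng = np.random.default_rng(1)

# hypothetical grades (g/t) on a common grid for 100 simulated realizations
realizations = rng.lognormal(mean=0.0, sigma=0.5, size=(100, 500))
cutoffs = np.array([0.5, 1.0, 1.5, 2.0])

def metal_above_cutoffs(grades, cutoffs):
    """Total metal content above each cutoff grade (tonnage factor omitted)."""
    return np.array([grades[grades >= c].sum() for c in cutoffs])

curves = np.array([metal_above_cutoffs(r, cutoffs) for r in realizations])

# pairwise "distance" between realizations: Euclidean distance between
# their metal-above-cutoff curves
D = np.linalg.norm(curves[:, None, :] - curves[None, :, :], axis=-1)
```

A representative subset can then be chosen so that its members are mutually far apart in `D`, e.g. by clustering on the distance matrix.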

4.
In earth and environmental sciences applications, uncertainty analysis regarding the outputs of models whose parameters are spatially varying (or spatially distributed) is often performed in a Monte Carlo framework. In this context, alternative realizations of the spatial distribution of model inputs, typically conditioned to reproduce attribute values at locations where measurements are obtained, are generated via geostatistical simulation using simple random (SR) sampling. The environmental model under consideration is then evaluated using each of these realizations as a plausible input, in order to construct a distribution of plausible model outputs for uncertainty analysis purposes. In hydrogeological investigations, for example, conditional simulations of saturated hydraulic conductivity are used as input to physically-based simulators of flow and transport to evaluate the associated uncertainty in the spatial distribution of solute concentration. Realistic uncertainty analysis via SR sampling, however, requires a large number of simulated attribute realizations for the model inputs in order to yield a representative distribution of model outputs; this often hinders the application of uncertainty analysis due to the computational expense of evaluating complex environmental models. Stratified sampling methods, including variants of Latin hypercube sampling, constitute more efficient sampling alternatives, often resulting in a more representative distribution of model outputs (e.g., solute concentration) with fewer model input realizations (e.g., hydraulic conductivity), thus reducing the computational cost of uncertainty analysis. The application of stratified and Latin hypercube sampling in a geostatistical simulation context, however, is not widespread, and, apart from a few exceptions, has been limited to the unconditional simulation case.
This paper proposes methodological modifications for adapting existing stratified sampling methods (including Latin hypercube sampling), employed to date in an unconditional geostatistical simulation context, to the efficient conditional simulation of Gaussian random fields. The proposed conditional simulation methods are compared to traditional geostatistical simulation based on SR sampling in the context of a hydrogeological flow and transport model via a synthetic case study. The results indicate that stratified sampling methods (including Latin hypercube sampling) are more efficient than SR, reproducing the statistics of the conductivity (and subsequently concentration) fields to a similar extent, yet with smaller sampling variability. These findings suggest that the proposed efficient conditional sampling methods could contribute to the wider application of uncertainty analysis in spatially distributed environmental models using geostatistical simulation.
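A minimal Latin hypercube sampler, the stratified-sampling building block discussed above, might look like this; the sample size and dimension are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(7)

def latin_hypercube(n, d):
    """Latin hypercube sample of n points in d dimensions: each of the n
    equal-probability strata of every marginal contains exactly one point."""
    u = rng.random((n, d))                            # position within each stratum
    strata = np.array([rng.permutation(n) for _ in range(d)]).T
    return (strata + u) / n

n, d = 20, 3
lhs = latin_hypercube(n, d)
# stratum index hit by each point in each marginal
counts = np.floor(lhs * n).astype(int)
```

In contrast, simple random sampling can leave whole strata of a marginal empty, which is why it needs more realizations for a representative output distribution.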

5.
Soil erosion is one of the most widespread processes of degradation. The erodibility of a soil is a measure of its susceptibility to erosion and depends on many soil properties. The soil erodibility factor varies greatly over space and is commonly estimated using the revised universal soil loss equation. Neglecting information about estimation uncertainty may lead to improper decision-making. One geostatistical approach to spatial analysis is sequential Gaussian simulation, which draws alternative, equally probable, joint realizations of a regionalised variable. Differences between the realizations provide a measure of spatial uncertainty and allow an error analysis to be carried out. The objective of this paper was to assess the model output error of soil erodibility resulting from uncertainties in the input attributes (texture and organic matter). The study area covers about 30 km2 (Calabria, southern Italy). Topsoil samples were collected at 175 locations within the study area in 2006, and the main chemical and physical soil properties were determined. As soil textural size fractions are compositional data, the additive log-ratio (alr) transformation was used to remove the non-negativity and constant-sum constraints on the compositional variables. A Monte Carlo analysis was performed, which consisted of drawing a large number (500) of identically distributed input attributes from the multivariable joint probability distribution function. Because the model inputs were spatially correlated, spatial cross-correlation information was incorporated through joint sequential Gaussian simulation. The erodibility model was then estimated for each of the 500 joint realisations of the input variables, and the ensemble of model outputs was used to infer the erodibility probability distribution function.
This approach also allowed the areas characterised by greater uncertainty to be delineated, suggesting efficient supplementary sampling strategies for further improving the precision of the K value predictions.
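The additive log-ratio transform and its back-transform can be sketched as below; the texture fractions are hypothetical:

```python
import numpy as np

def alr(comp):
    """Additive log-ratio transform of compositional data (rows sum to 1),
    using the last component as the denominator."""
    comp = np.asarray(comp, dtype=float)
    return np.log(comp[..., :-1] / comp[..., -1:])

def alr_inv(y):
    """Back-transform: recover a composition on the simplex."""
    e = np.exp(np.concatenate([y, np.zeros(y.shape[:-1] + (1,))], axis=-1))
    return e / e.sum(axis=-1, keepdims=True)

# sand / silt / clay fractions for two hypothetical topsoil samples
texture = np.array([[0.40, 0.35, 0.25],
                    [0.55, 0.30, 0.15]])
y = alr(texture)      # unconstrained values, suitable for Gaussian simulation
back = alr_inv(y)     # back on the simplex after simulation
```

Simulation is carried out on the unconstrained `y` values; back-transforming each realization guarantees non-negative fractions that sum to one.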

6.
Reservoir characterization requires the integration of various data through history matching, especially dynamic information such as production or 4D seismic data. Although reservoir heterogeneities are commonly generated using geostatistical models, random realizations generally cannot match observed dynamic data. To constrain model realizations to reproduce measured dynamic data, an optimization procedure may be applied to minimize an objective function quantifying the mismatch between real and simulated data. Such assisted history matching methods require a parameterization of the geostatistical model to allow the updating of an initial model realization. However, few parameterization methods are available that update geostatistical models in a way consistent with the underlying geostatistical properties. This paper presents a local domain parameterization technique that updates geostatistical realizations using assisted history matching. The technique allows model realizations to be changed locally through the variation of geometrical domains whose geometry and size can be easily controlled and parameterized. This approach provides a new way to parameterize geostatistical realizations in order to improve history matching efficiency.
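A toy version of a local domain update might look like the following, where a circular domain's center and radius play the role of the history-matching parameters. This hard replacement inside the domain is a simplification; the actual technique also controls the domain geometry more flexibly and the continuity of the realization across the domain boundary:

```python
import numpy as np

rng = np.random.default_rng(2)

nx = ny = 50
z1 = rng.standard_normal((ny, nx))   # current model realization
z2 = rng.standard_normal((ny, nx))   # alternative realization, same geostatistics

def local_update(z_cur, z_new, cx, cy, r):
    """Replace the current realization by the alternative one inside a
    circular domain; (cx, cy, r) are the optimization parameters
    controlling the domain's position and size."""
    yy, xx = np.mgrid[0:ny, 0:nx]
    inside = (xx - cx) ** 2 + (yy - cy) ** 2 <= r ** 2
    out = z_cur.copy()
    out[inside] = z_new[inside]
    return out, inside

z_upd, inside = local_update(z1, z2, cx=25, cy=25, r=8)
```

The optimizer then tunes `(cx, cy, r)` so that the perturbed realization reduces the data mismatch, leaving the rest of the model untouched.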

7.
The likelihood of Gaussian realizations, as generated by the Cholesky simulation method, is analyzed in terms of Mahalanobis distances and fluctuations in the variogram reproduction. For random sampling, the probability of observing a Gaussian realization vector can be expressed as a function of its Mahalanobis distance, and the maximum likelihood depends only on the vector size. The Mahalanobis distances themselves follow a chi-square distribution and can be used to describe the likelihood of Gaussian realizations. Their expected value and variance are determined solely by the size of the vector of independent random normal scores used to generate the realizations. When the vector size is small, the distribution of Mahalanobis distances is highly skewed and most realizations are close to the vector mean, in agreement with the multi-Gaussian density model. As the vector size increases, the realizations sample a region increasingly far out on the tail of the multi-Gaussian distribution, because the large increase in the size of the uncertainty space more than compensates for the low probability density. For a large vector size, realizations close to the vector mean are no longer observed. Instead, Gaussian vectors with a Mahalanobis distance in the neighborhood of the expected Mahalanobis distance have the maximum probability of being observed. The distribution of Mahalanobis distances becomes Gaussian shaped and the bulk of realizations appear more equiprobable. However, the ratio of their probabilities indicates that they remain far from equiprobable. On the other hand, equiprobable realizations still display important fluctuations in their variogram reproduction. The variance level expected in the variogram reproduction, as well as the variance of the variogram fluctuations, depends on the Mahalanobis distance.
Realizations with smaller Mahalanobis distances are, on average, smoother than realizations with larger Mahalanobis distances. Poor ergodic conditions tend to generate higher proportions of variograms that are flat relative to the variogram model. Only equiprobable realizations with a Mahalanobis distance equal to the expected Mahalanobis distance have an expected variogram matching the variogram model. For large vector sizes, Cholesky-simulated Gaussian vectors cannot be used to explore uncertainty in the neighborhood of the vector mean. Instead, uncertainty is explored around the n-dimensional elliptical envelope corresponding to the expected Mahalanobis distance.
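The chi-square behavior of the squared Mahalanobis distances is easy to check numerically. Because a Cholesky-simulated vector is z = Ly with y ~ N(0, I), its Mahalanobis distance with respect to the covariance C = LL^T is just the Euclidean norm of y, so the covariance model can be ignored in this sketch; the vector size and number of realizations are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(3)

n = 400    # vector size (number of grid nodes)
m = 2000   # number of realizations

# Squared Mahalanobis distances of Cholesky-simulated vectors reduce to
# squared norms of the underlying normal scores, hence chi-square with
# n degrees of freedom regardless of the covariance model.
y = rng.standard_normal((m, n))
d2 = (y ** 2).sum(axis=1)
```

Consistent with the abstract, the sample mean of `d2` sits near n and its variance near 2n; for large n the distances concentrate far from zero, so no realization falls near the vector mean.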

8.
In the analysis of petroleum reservoirs, one of the most challenging problems is to use inverse theory in the search for an optimal parameterization of the reservoir. Generally, scientists approach this problem by computing a sensitivity matrix and then performing a singular value decomposition to determine the number of degrees of freedom, i.e., the number of independent parameters necessary to specify the configuration of the system. Here we propose a complementary approach: it uses the concept of refinement indicators to select those degrees of freedom that have the greatest sensitivity to an objective function quantifying the mismatch between measured and simulated data. We apply this approach to the problem of data integration for petrophysical reservoir characterization, where geoscientists currently work with multimillion-cell geological models. Data integration may be performed by gradually deforming (by a linear combination) a set of these multimillion-grid geostatistical realizations during the optimization process. The inversion parameters are then reduced to the coefficients of this linear combination. However, there are infinitely many geostatistical realizations to choose from, and an arbitrary choice may not be efficient given operational constraints. Following our new approach, we are able, through a single objective function evaluation, to compute refinement indicators that show which realizations might improve the iterative geological model in a significant way. This computation is extremely fast, as it involves a single gradient computation through the adjoint state approach and dot products. Using only the most sensitive realizations from a given set, we can solve the optimization problem more quickly. We applied this methodology to the integration of interference test data into 3D geostatistical models.
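The refinement-indicator idea, one gradient computation followed by a dot product per candidate realization, can be sketched with a quadratic mismatch standing in for the flow simulator; all sizes and data below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(5)

n = 1000
candidates = rng.standard_normal((8, n))     # pool of geostatistical realizations
z0 = rng.standard_normal(n)                  # current model realization
d_obs = rng.standard_normal(n) * 0.1 + 1.0   # hypothetical measured data

def objective(z):
    """Data mismatch (a real study would run a flow simulator here)."""
    return 0.5 * np.sum((z - d_obs) ** 2)

# gradient of the objective w.r.t. the model; for a real simulator this is
# what the adjoint state approach provides at the cost of one extra solve
grad = z0 - d_obs

# refinement indicator for candidate k: sensitivity of the objective to
# giving realization k a small weight in the linear combination --
# a single dot product per candidate
indicators = candidates @ grad
best = np.argmax(np.abs(indicators))
```

Candidates with large |indicator| are the ones worth adding to the linear combination; the rest can be discarded without ever running the simulator on them.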

9.
Conditioning realizations of stationary Gaussian random fields to a set of data is traditionally based on simple kriging. In practice, this approach may be demanding as it does not account for the uncertainty in the spatial average of the random field. In this paper, an alternative model is presented, in which the Gaussian field is decomposed into a random mean, constant over space but variable over the realizations, and an independent residual. It is shown that, when the prior variance of the random mean is infinitely large (reflecting prior ignorance on the actual spatial average), the realizations of the Gaussian random field are made conditional by substituting ordinary kriging for simple kriging. The proposed approach can be extended to models with random drifts that are polynomials in the spatial coordinates, by using universal or intrinsic kriging for conditioning the realizations, and also to multivariate situations by using cokriging instead of kriging.
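Conditioning by ordinary kriging can be sketched in 1-D. The covariance model, data, and grid below are illustrative; the key steps are solving the ordinary kriging system with its unbiasedness constraint and correcting an unconditional realization by the kriged residual:

```python
import numpy as np

rng = np.random.default_rng(11)

def cov(h, a=10.0):
    """Exponential covariance model, unit sill."""
    return np.exp(-np.abs(h) / a)

x_data = np.array([5.0, 20.0, 35.0])   # conditioning locations
z_data = np.array([1.2, -0.4, 0.8])    # measured values
x_grid = np.linspace(0.0, 40.0, 81)
idx_data = [10, 40, 70]                # grid nodes coinciding with the data

# ordinary kriging weights: augment the covariance system with the
# unbiasedness constraint (weights sum to one) via a Lagrange multiplier
C = cov(x_data[:, None] - x_data[None, :])
A = np.block([[C, np.ones((3, 1))], [np.ones((1, 3)), np.zeros((1, 1))]])
c0 = cov(x_grid[:, None] - x_data[None, :])
B = np.hstack([c0, np.ones((len(x_grid), 1))])
W = np.linalg.solve(A, B.T).T          # weights (and multiplier) per grid node
ok_estimate = W[:, :3] @ z_data        # ordinary kriging estimate on the grid

# unconditional realization via Cholesky, then conditioning by kriging:
# add the kriged residual between the data and the simulated values
Cg = cov(x_grid[:, None] - x_grid[None, :])
L = np.linalg.cholesky(Cg + 1e-10 * np.eye(len(x_grid)))
z_s = L @ rng.standard_normal(len(x_grid))
z_cond = z_s + W[:, :3] @ (z_data - z_s[idx_data])
```

Using these ordinary kriging weights (rather than simple kriging weights) in the conditioning step is exactly the substitution the paper justifies for an unknown spatial mean.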

10.
Spatial inverse problems in the Earth Sciences are often ill-posed, requiring the specification of a prior model to constrain the nature of the inverse solutions; otherwise, inverted model realizations lack geological realism. In spatial modeling, such a prior model determines the spatial variability of the inverse solution, for example as constrained by a variogram, a Boolean model, or a training image-based model. In many cases, particularly in subsurface modeling, one lacks the amount of data needed to fully determine the nature of the spatial variability. For example, many different training images could be proposed for a given study area. Such alternative training images or scenarios correspond to different possible geological concepts, each exhibiting a distinctive geological architecture. Many inverse methods rely on priors that represent a single, subjectively chosen geological concept (a single variogram within a multi-Gaussian model, or a single training image). This paper proposes a novel and practical parameterization of the prior model that allows several discrete choices of geological architectures within the prior. The method does not attempt to parameterize the possibly complex architectures by a set of model parameters. Instead, a large set of prior model realizations is provided in advance by means of Monte Carlo simulation, in which the training image is randomized. The parameterization is achieved by defining a metric space that accommodates this large set of model realizations. This metric space is equipped with a "similarity distance", a distance function that measures the similarity of geometry between any two model realizations relevant to the problem at hand. Through examples, it is shown that inverse solutions can be efficiently found in this metric space using a simple stochastic search method.
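Classical multidimensional scaling is one common way to turn such a pairwise distance matrix into a metric-space embedding (the paper does not prescribe this particular construction); the distance matrix below is built from invented 2-D summary features so that the embedding is exact:

```python
import numpy as np

rng = np.random.default_rng(9)

# "distance" matrix between 30 hypothetical prior model realizations, here
# derived from random 2-D summary features so the metric space is exactly 2-D
feat = rng.standard_normal((30, 2))
D = np.linalg.norm(feat[:, None] - feat[None, :], axis=-1)

# classical multidimensional scaling: embed the realizations in a
# low-dimensional Euclidean space that preserves the distances
n = D.shape[0]
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ (D ** 2) @ J                # double-centred squared distances
w, V = np.linalg.eigh(B)
idx = np.argsort(w)[::-1][:2]              # two leading eigenpairs
coords = V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))
```

A stochastic search for inverse solutions then operates on `coords` instead of on the full grid models, which is what makes the metric-space formulation efficient.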

