Similar Articles
20 similar articles found.
1.
Ensemble methods present a practical framework for parameter estimation, performance prediction, and uncertainty quantification in subsurface flow and transport modeling. In particular, the ensemble Kalman filter (EnKF) has received significant attention for its promising performance in calibrating heterogeneous subsurface flow models. Since an ensemble of model realizations is used to compute the statistical moments needed to perform the EnKF updates, large ensemble sizes are needed to provide accurate updates and uncertainty assessment. However, for realistic problems that involve large-scale models with computationally demanding flow simulation runs, the EnKF implementation is limited to small-sized ensembles. As a result, spurious numerical correlations can develop and lead to inaccurate EnKF updates, which tend to underestimate or even eliminate the ensemble spread. Ad hoc practical remedies, such as localization, local analysis, and covariance inflation schemes, have been developed and applied to reduce the effect of sampling errors due to small ensemble sizes. In this paper, a fast linear approximate forecast method is proposed as an alternative approach to enable the use of large ensemble sizes in operational settings to obtain improved sample statistics and EnKF updates. The proposed method first clusters a large number of initial geologic model realizations into a small number of groups. A representative member from each group is used to run a full forward flow simulation. The flow predictions for the remaining realizations in each group are approximated by a linearization around the full simulation results of the representative model (centroid) of the respective cluster. The linearization can be performed using either adjoint-based or ensemble-based gradients. Results from several numerical experiments with two-phase and three-phase flow systems in this paper suggest that the proposed method can be applied to improve the EnKF performance in large-scale problems where the number of full simulations is constrained.
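The cluster-then-linearize idea lends itself to a compact numerical illustration. Below is a minimal numpy/scipy sketch of the general scheme, not the authors' implementation: `full_simulator` is a hypothetical stand-in for the expensive flow model, and a single global ensemble-based sensitivity is used for brevity.

```python
import numpy as np
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(0)

def full_simulator(m):
    """Hypothetical expensive forward model: maps model parameters to predicted data."""
    return np.tanh(m[:5]) + 0.1 * m[:5] ** 2

Ne, Nm, Nd, K = 500, 50, 5, 10           # ensemble size, parameters, data, clusters
M = rng.normal(size=(Ne, Nm))            # large prior ensemble of model realizations

# 1) cluster realizations; run the full simulator only at the K centroids
centroids, labels = kmeans2(M, K, minit='++')
D_cent = np.array([full_simulator(c) for c in centroids])

# 2) linearized forecast: an ensemble-based sensitivity estimated from centroid
#    perturbations (one global least-squares G here, for brevity)
dM = centroids - centroids.mean(axis=0)
dD = D_cent - D_cent.mean(axis=0)
G = np.linalg.lstsq(dM, dD, rcond=None)[0]         # (Nm, Nd) sensitivity
D = D_cent[labels] + (M - centroids[labels]) @ G   # approximate forecasts for everyone

# 3) standard EnKF analysis update using the (cheap) large ensemble
d_obs = full_simulator(rng.normal(size=Nm))
R = 0.01 * np.eye(Nd)
A, Dd = M - M.mean(0), D - D.mean(0)
C_md = A.T @ Dd / (Ne - 1)
C_dd = Dd.T @ Dd / (Ne - 1)
K_gain = C_md @ np.linalg.inv(C_dd + R)
M_upd = M + (d_obs + rng.multivariate_normal(np.zeros(Nd), R, Ne) - D) @ K_gain.T
print("ensemble spread before/after:", A.std(), (M_upd - M_upd.mean(0)).std())
```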

2.
Uncertainty quantification for geomechanical and reservoir predictions is in general a computationally intensive problem, especially if a direct Monte Carlo approach with large numbers of full-physics simulations is used. A common solution to this problem, well known for fluid flow simulations, is the adoption of surrogate modeling approximating the physical behavior with respect to variations in uncertain parameters. The objective of this work is the quantification of such uncertainty within both geomechanical and fluid-flow predictions using a specific surrogate modeling technique, which is based on a functional approach. The methodology realizes an approximation of full-physics simulated outputs that vary in time and space as the uncertain parameters are changed, which is particularly important for the prediction of uncertainty in vertical displacement resulting from geomechanical modeling. The developed methodology has been applied both to a subsidence uncertainty quantification example and to a real reservoir forecast risk assessment. The surrogate quality obtained with these applications confirms that the proposed method makes it possible to perform reliable time–space-dependent risk assessment with a low computational cost, provided the uncertainty space is low-dimensional.
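One common way to build a functional surrogate of a time–space output (not necessarily the formulation used here) is to compress the simulated fields with a truncated SVD (POD) and regress the retained coefficients on the uncertain parameters. The sketch below assumes a hypothetical `run_model` placeholder for the full-physics simulator and a quadratic polynomial basis.

```python
import numpy as np

rng = np.random.default_rng(1)

def run_model(theta, nt=40, nx=30):
    """Hypothetical full-physics run: returns a flattened time-space field (e.g., subsidence)."""
    t = np.linspace(0, 1, nt)[:, None]
    x = np.linspace(0, 1, nx)[None, :]
    return (theta[0] * t ** 2 * np.exp(-theta[1] * x)).ravel()

# training designs in the (low-dimensional) uncertainty space
Theta = rng.uniform([0.5, 1.0], [2.0, 5.0], size=(60, 2))
Y = np.array([run_model(th) for th in Theta])           # (n_runs, nt*nx)

# functional compression: truncated SVD / POD of the centered outputs
Y_mean = Y.mean(axis=0)
U, s, Vt = np.linalg.svd(Y - Y_mean, full_matrices=False)
r = 5                                                   # retained modes
coeffs = U[:, :r] * s[:r]                               # POD coefficients per run

# regress each coefficient on a quadratic polynomial basis of the parameters
def basis(th):
    a, b = th
    return np.array([1.0, a, b, a * b, a ** 2, b ** 2])

B = np.array([basis(th) for th in Theta])
W = np.linalg.lstsq(B, coeffs, rcond=None)[0]           # (n_basis, r)

def surrogate(theta):
    """Cheap prediction of the full time-space field for new parameter values."""
    return Y_mean + (basis(theta) @ W) @ Vt[:r]

theta_new = np.array([1.2, 3.0])
err = np.linalg.norm(surrogate(theta_new) - run_model(theta_new))
print("surrogate L2 error on a new sample:", err)
```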

3.
Distance-based stochastic techniques have recently emerged in the context of ensemble modeling, in particular for history matching, model selection and uncertainty quantification. Starting with an initial ensemble of realizations, a distance between any two models is defined. This distance is defined such that the objective of the study is incorporated into the geological modeling process, thereby potentially enhancing the efficacy of the overall workflow. If the intent is to create new models that are constrained to dynamic data (history matching), the calculation of the distance requires flow simulation for each model in the initial ensemble. This can be very time consuming, especially for high-resolution models. In this paper, we present a multi-resolution framework for ensemble modeling. A distance-based procedure is employed, with emphasis on the rapid construction of multiple models that have improved dynamic data conditioning. Our intent is to construct new high-resolution models constrained to dynamic data, while performing most of the flow simulations only on upscaled models. An error modeling procedure is introduced into the distance calculations to account for potential errors in the upscaling. Based on a few fine-scale flow simulations, the upscaling error is estimated for each model using a clustering technique. We demonstrate the efficiency of the method on two examples, one where the upscaling error is small, and another where the upscaling error is significant. Results show that the error modeling procedure can accurately capture the error in upscaling, and can thus reproduce the fine-scale flow behavior from coarse-scale simulations with sufficient accuracy (in terms of uncertainty predictions). As a consequence, an ensemble of high-resolution models, which are constrained to dynamic data, can be obtained, but with a minimum of flow simulations at the fine scale.  相似文献   
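A small sketch of the distance/error-modeling step described above, under simplifying assumptions: `coarse_sim` and `fine_sim` are hypothetical stand-ins for upscaled and fine-scale flow responses, models are clustered in response space, and one fine-scale run per cluster supplies an additive upscaling-error correction before distances to the data are computed.

```python
import numpy as np
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(2)
N = 200                                      # ensemble of geologic realizations

def coarse_sim(i):    # hypothetical cheap upscaled-model response (e.g., a water-cut curve)
    return np.linspace(0, 1, 20) ** (1 + 0.3 * np.sin(i))

def fine_sim(i):      # hypothetical expensive fine-scale response (run for a few models only)
    return coarse_sim(i) + 0.05 * np.cos(i) * np.linspace(0, 1, 20)

D_coarse = np.array([coarse_sim(i) for i in range(N)])

# cluster models in the space of coarse-scale responses
k = 5
cent, labels = kmeans2(D_coarse, k, minit='++')

# estimate the upscaling error per cluster from one fine-scale run each
err = np.zeros((k, D_coarse.shape[1]))
for c in range(k):
    members = np.where(labels == c)[0]
    if members.size == 0:
        continue
    i_rep = int(members[0])                  # representative member of the cluster
    err[c] = fine_sim(i_rep) - coarse_sim(i_rep)

# error-corrected responses drive the distances used for selection / history matching
D_corr = D_coarse + err[labels]
d_obs = fine_sim(42)
dist = np.linalg.norm(D_corr - d_obs, axis=1)
best = np.argsort(dist)[:10]
print("closest corrected coarse models to the data:", best)
```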

4.
We present a parallel framework for history matching and uncertainty characterization based on the Kalman filter update equation for application to reservoir simulation. The main advantages of ensemble-based data assimilation methods are that they can handle large-scale numerical models with a high degree of nonlinearity and large amounts of data, making them well suited for coupling with a reservoir simulator. However, the sequential implementation is computationally expensive, as the methods require a relatively high number of reservoir simulation runs. Therefore, the main focus of this work is to develop a parallel data assimilation framework with minimal changes to the reservoir simulator source code. In this framework, multiple concurrent realizations are computed on several partitions of a parallel machine. These realizations are further subdivided among different processors, and communication is performed at data assimilation times. Although this parallel framework is general and can be used for different ensemble techniques, we discuss the methodology and compare results of two algorithms, the ensemble Kalman filter (EnKF) and the ensemble smoother (ES). Computational results show that the absolute runtime is greatly reduced using a parallel implementation versus a serial one. In particular, a parallel efficiency of about 35 % is obtained for the EnKF, and an efficiency of more than 50 % is obtained for the ES.
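The general pattern of running forecasts concurrently and communicating only at assimilation time can be illustrated with a short Python sketch; this uses `multiprocessing` as a generic stand-in, not the paper's parallel implementation, and `forward` is a hypothetical reservoir forecast.

```python
import numpy as np
from multiprocessing import Pool

def forward(model):
    """Hypothetical reservoir forecast for one realization (runs in a worker process)."""
    return np.tanh(model[:3])

def enkf_update(M, D, d_obs, R, rng):
    """Standard stochastic EnKF analysis step."""
    A, Dd = M - M.mean(0), D - D.mean(0)
    K = (A.T @ Dd) @ np.linalg.inv(Dd.T @ Dd + (M.shape[0] - 1) * R)
    noise = rng.multivariate_normal(np.zeros(len(d_obs)), R, M.shape[0])
    return M + (d_obs + noise - D) @ K.T

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    M = rng.normal(size=(100, 20))                 # ensemble of realizations
    d_obs, R = np.array([0.2, -0.1, 0.4]), 0.01 * np.eye(3)

    # forecasts run concurrently; communication happens only at assimilation times
    with Pool(processes=4) as pool:
        for step in range(3):                      # sequential assimilation steps
            D = np.array(pool.map(forward, list(M)))
            M = enkf_update(M, D, d_obs, R, rng)
            print(f"step {step}: data mismatch {np.abs(D.mean(0) - d_obs).mean():.3f}")
```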

5.
6.
Ensemble-based methods are becoming popular assisted history matching techniques, with a growing number of field applications. These methods use an ensemble of model realizations, typically constructed by means of geostatistics, to represent the prior uncertainty. The performance of the history matching depends strongly on the quality of the initial ensemble. However, there is a significant level of uncertainty in the parameters used to define the geostatistical model. From a Bayesian viewpoint, the uncertainty in the geostatistical modeling can be represented by a hyper-prior in a hierarchical formulation. This paper presents the first steps towards a general parametrization to address the problem of uncertainty in the prior modeling. The proposed parametrization is inspired by Gaussian mixtures, where the uncertainty in the prior mean and prior covariance is accounted for by defining weights for combining multiple Gaussian ensembles, and these weights are estimated during the data assimilation. The parametrization was successfully tested in a simple reservoir problem where the orientation of the major anisotropy direction of the permeability field was unknown.
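A hedged sketch of the core idea, not the paper's exact parametrization: maintain ensembles drawn from several candidate geostatistical priors (here, two components differing in their prior mean) and update the mixture weights from each component's average data likelihood, as a stand-in for estimating the weights during assimilation.

```python
import numpy as np

rng = np.random.default_rng(4)
nx, Ne = 30, 100
x = np.arange(nx)
C = np.exp(-np.abs(x[:, None] - x[None, :]) / 8.0)    # common correlation model

def gaussian_ensemble(mean, n):
    """Hypothetical prior component: Gaussian fields with an uncertain prior mean."""
    return mean + rng.multivariate_normal(np.zeros(nx), C, n)

# two candidate prior components, equal hyper-prior weights
priors = {"low": gaussian_ensemble(0.0, Ne), "high": gaussian_ensemble(1.5, Ne)}
weights = {"low": 0.5, "high": 0.5}

# synthetic observations at a few "well" cells (truth drawn from the "low" component)
obs_idx = np.array([5, 15, 25])
d_obs = gaussian_ensemble(0.0, 1)[0][obs_idx] + rng.normal(0, 0.1, obs_idx.size)

# update the mixture weights with each component's average data likelihood
sigma2 = 0.5 ** 2
for key, ens in priors.items():
    misfit = ((ens[:, obs_idx] - d_obs) ** 2).sum(axis=1)
    weights[key] *= np.exp(-0.5 * misfit / sigma2).mean()
total = sum(weights.values())
weights = {k: w / total for k, w in weights.items()}
print("updated mixture weights:", weights)   # mass shifts toward the better-matching prior
```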

7.
8.
Regulatory geologists are concerned with predicting the performance of sites proposed for waste disposal or for remediation of existing pollution problems. Geologic modeling of these sites requires large-scale expansion of knowledge obtained from very limited sampling. This expansion induces considerable uncertainty into the geologic models of rock properties that are required for modeling the predicted performance of the site. One method for assessing this uncertainty is through nonparametric geostatistical simulation. Simulation can produce a series of equiprobable models of a rock property of interest. Each model honors measured values at sampled locations, and each can be constructed to emulate both the univariate histogram and the spatial covariance structure of the measured data. Computing a performance model for a number of geologic simulations allows evaluation of the effects of geologic uncertainty. A site may be judged acceptable if the number of failures to meet a particular performance criterion produced by these computations is sufficiently low. A site that produces too many failures may be either unacceptable or simply inadequately described. The simulation approach to addressing geologic uncertainty is being applied to the potential high-level nuclear waste repository site at Yucca Mountain, Nevada, U.S.A. Preliminary geologic models of unsaturated permeability have been created that reproduce observed statistical properties reasonably well. A spread of unsaturated groundwater travel times has been computed that reflects the variability of those geologic models. Regions within the simulated models exhibiting the greatest variability among multiple runs are candidates for obtaining the greatest reduction in uncertainty through additional site characterization.  相似文献
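As a simpler parametric analogue of the simulation workflow described above, the sketch below generates equiprobable Gaussian realizations that honor a target covariance and the measured values at sampled locations (conditioning by a kriging correction), then evaluates a crude, hypothetical performance proxy across the realizations.

```python
import numpy as np

rng = np.random.default_rng(5)
nx = 60
x = np.arange(nx, dtype=float)
C = np.exp(-np.abs(x[:, None] - x[None, :]) / 10.0)    # target spatial covariance
L = np.linalg.cholesky(C + 1e-10 * np.eye(nx))

# sparse "measurements" of a rock property (e.g., log-permeability) at sampled locations
obs_idx = np.array([5, 25, 45])
z_obs = np.array([0.8, -0.5, 1.2])

C_oo = C[np.ix_(obs_idx, obs_idx)]
C_xo = C[:, obs_idx]
W = C_xo @ np.linalg.inv(C_oo)            # simple-kriging weights

def conditional_realization():
    """Unconditional realization honoring C, corrected to honor the data exactly."""
    z_uc = L @ rng.normal(size=nx)
    return z_uc + W @ (z_obs - z_uc[obs_idx])

# equiprobable models -> distribution of a hypothetical performance measure
reals = np.array([conditional_realization() for _ in range(500)])
assert np.allclose(reals[:, obs_idx], z_obs)            # data are honored
travel_time = np.sum(np.exp(-reals), axis=1)            # crude travel-time proxy
print("performance spread across realizations (P10, P90):",
      np.percentile(travel_time, [10, 90]))
```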

9.
In earth and environmental sciences applications, uncertainty analysis regarding the outputs of models whose parameters are spatially varying (or spatially distributed) is often performed in a Monte Carlo framework. In this context, alternative realizations of the spatial distribution of model inputs, typically conditioned to reproduce attribute values at locations where measurements are obtained, are generated via geostatistical simulation using simple random (SR) sampling. The environmental model under consideration is then evaluated using each of these realizations as a plausible input, in order to construct a distribution of plausible model outputs for uncertainty analysis purposes. In hydrogeological investigations, for example, conditional simulations of saturated hydraulic conductivity are used as input to physically-based simulators of flow and transport to evaluate the associated uncertainty in the spatial distribution of solute concentration. Realistic uncertainty analysis via SR sampling, however, requires a large number of simulated attribute realizations for the model inputs in order to yield a representative distribution of model outputs; this often hinders the application of uncertainty analysis due to the computational expense of evaluating complex environmental models. Stratified sampling methods, including variants of Latin hypercube sampling, constitute more efficient sampling alternatives, often resulting in a more representative distribution of model outputs (e.g., solute concentration) with fewer model input realizations (e.g., hydraulic conductivity), thus reducing the computational cost of uncertainty analysis. The application of stratified and Latin hypercube sampling in a geostatistical simulation context, however, is not widespread, and, apart from a few exceptions, has been limited to the unconditional simulation case. This paper proposes methodological modifications for adopting existing methods for stratified sampling (including Latin hypercube sampling), employed to date in an unconditional geostatistical simulation context, for the purpose of efficient conditional simulation of Gaussian random fields. The proposed conditional simulation methods are compared to traditional geostatistical simulation, based on SR sampling, in the context of a hydrogeological flow and transport model via a synthetic case study. The results indicate that stratified sampling methods (including Latin hypercube sampling) are more efficient than SR, overall reproducing to a similar extent statistics of the conductivity (and subsequently concentration) fields, yet with smaller sampling variability. These findings suggest that the proposed efficient conditional sampling methods could contribute to the wider application of uncertainty analysis in spatially distributed environmental models using geostatistical simulation.
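The efficiency argument can be demonstrated with a stripped-down example: Latin hypercube stratification of the standard normal scores that drive each realization, compared with simple random sampling, for a hypothetical nonlinear model output. The flow model and the conditioning step are omitted; `model_output` is a placeholder.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(6)

def model_output(z):
    """Hypothetical environmental model: nonlinear function of normal-score inputs
    (a stand-in for a flow/transport run driven by the realization scores z)."""
    return np.exp(0.8 * z[..., 0]) + 0.5 * z[..., 1] ** 2

def simple_random(n, d):
    return rng.normal(size=(n, d))

def latin_hypercube(n, d):
    """LHS in normal scores: one stratum per realization in each input dimension."""
    strata = np.column_stack([rng.permutation(n) for _ in range(d)])
    u = (strata + rng.uniform(size=(n, d))) / n
    return norm.ppf(u)

# sampling variability of the estimated mean output over repeated designs
n, d, trials = 25, 2, 200
sr = [model_output(simple_random(n, d)).mean() for _ in range(trials)]
lh = [model_output(latin_hypercube(n, d)).mean() for _ in range(trials)]
print("std of the estimate, SR vs LHS:", np.std(sr), np.std(lh))  # LHS is typically smaller
```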

10.
Uncertainty in surfactant–polymer flooding is an important challenge to the wide-scale implementation of this process. Any successful design of this enhanced oil recovery process will necessitate a good understanding of uncertainty. Thus, it is essential to have the ability to quantify this uncertainty in an efficient manner. Monte Carlo simulation is the traditional uncertainty quantification approach that is used for quantifying parametric uncertainty. However, the convergence of Monte Carlo simulation is relatively slow, requiring a large number of realizations to converge. This study proposes the use of the probabilistic collocation method in parametric uncertainty quantification for surfactant–polymer flooding using four synthetic reservoir models. Four sources of uncertainty were considered: the chemical flood residual oil saturation, surfactant adsorption, polymer adsorption, and the polymer viscosity multiplier. The output parameter approximated is the recovery factor. The output metrics were the input–output model response relationship, the probability density function, and the first two moments. These were compared with the results obtained from Monte Carlo simulation over a large number of realizations. Two methods for solving for the coefficients of the output parameter polynomial chaos expansion are compared: Gaussian quadrature and linear regression. The linear regression approach used two types of sampling: full-tensor product nodes and Chebyshev-derived nodes. In general, the probabilistic collocation method was applied successfully to quantify the uncertainty in the recovery factor. Applying the method using Gaussian quadrature produced more accurate results compared with using linear regression with full-tensor product nodes. Applying the method using linear regression with Chebyshev-derived sampling also performed relatively well. Possible enhancements to improve the performance of the probabilistic collocation method were discussed. These enhancements include improved sparse sampling, approximation order-independent sampling, and using arbitrary random input distributions that could be more representative of reality.
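A minimal one-dimensional sketch of the probabilistic collocation idea: a Hermite polynomial chaos expansion whose coefficients are computed by Gauss–Hermite quadrature, with the first two moments recovered from the coefficients. The `recovery_factor` model is a hypothetical placeholder, and the single standard-normal input stands in for one of the uncertain parameters above.

```python
import numpy as np
from numpy.polynomial import hermite_e as He
from math import factorial, sqrt, pi

def recovery_factor(xi):
    """Hypothetical reservoir response as a function of one standard-normal input."""
    return 0.35 + 0.05 * np.tanh(xi) + 0.01 * xi ** 2

# Gauss-Hermite (probabilists') quadrature nodes: the collocation points
order, nq = 4, 8
xq, w = He.hermegauss(nq)                  # weight exp(-x^2/2), sum(w) = sqrt(2*pi)

# PCE coefficients c_k = E[f(Z) He_k(Z)] / k!, evaluated by quadrature
f = recovery_factor(xq)
c = np.array([np.sum(w * f * He.hermeval(xq, np.eye(order + 1)[k]))
              / (sqrt(2 * pi) * factorial(k)) for k in range(order + 1)])

# the first two moments follow directly from the coefficients
mean = c[0]
var = np.sum(c[1:] ** 2 * np.array([factorial(k) for k in range(1, order + 1)]))

# cross-check against plain Monte Carlo
z = np.random.default_rng(7).normal(size=200_000)
print("PCE mean/var:", mean, var)
print("MC  mean/var:", recovery_factor(z).mean(), recovery_factor(z).var())
```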

11.
In recent years, many applications of history-matching methods in general, and of the ensemble Kalman filter in particular, have been proposed, especially in order to estimate fields that introduce uncertainty into the stochastic process defined by the dynamical system of hydrocarbon recovery. Such fields can be permeability or porosity fields, but can also be fields defined by the rock type (facies fields). In several papers, the boundaries of the geologic facies have been estimated with the ensemble Kalman filter (EnKF) with the aid of Gaussian random fields, which were truncated using various schemes and introduced into a history-matching process. In this paper, we estimate, within the frame of the EnKF process, the locations of three facies types that occur in a reservoir domain, with the property that any two of them can be in contact. The geological simulation model is a form of the general truncated plurigaussian method. The difference from other approaches consists in how the truncation scheme is introduced and in the observation operator of the facies types at the well locations. The projection from the continuous space of the Gaussian fields into the discrete space of the facies fields is realized through an intermediary space (a space of probabilities). This space connects the observation operator of the facies types at the well locations with the geological simulation model. We test the model using a 2D reservoir coupled with the EnKF method as a data assimilation technique. We use different geostatistical properties for the Gaussian fields and different levels of uncertainty introduced in the model parameters and in the construction of the Gaussian fields.
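A minimal sketch of the truncated plurigaussian mapping itself: two smoothed Gaussian fields are thresholded into three facies such that any two facies can be in contact. The EnKF coupling and the probability-space observation operator are not reproduced here; the field generator is a simple illustrative smoother, not a variogram-exact simulator.

```python
import numpy as np

rng = np.random.default_rng(8)
nx, ny = 50, 50

def gaussian_field(length):
    """Hypothetical stationary Gaussian field via separable Gaussian smoothing (illustrative only)."""
    z = rng.normal(size=(ny, nx))
    k = np.exp(-0.5 * (np.arange(-2 * length, 2 * length + 1) / length) ** 2)
    k /= np.sqrt(np.sum(k ** 2))
    z = np.apply_along_axis(lambda r: np.convolve(r, k, mode='same'), 1, z)
    z = np.apply_along_axis(lambda c: np.convolve(c, k, mode='same'), 0, z)
    return (z - z.mean()) / z.std()

g1, g2 = gaussian_field(5), gaussian_field(8)

# truncation scheme: thresholds on (g1, g2) define three facies, any two can touch
facies = np.where(g1 < 0.0, 0,                 # facies 0 where g1 is low
                  np.where(g2 < 0.3, 1, 2))    # facies 1 / 2 split by g2 elsewhere
props = [np.mean(facies == f) for f in range(3)]
print("facies proportions:", np.round(props, 2))
```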

12.
A hierarchical scale-up framework is formulated to study the scaling characteristics of reservoir attributes and input dispersivities at the transport modeling scale, where the heterogeneity distribution exhibits both non-stationarity (trend) and sub-scale variability. The proposed method is flexible enough to handle heterogeneities occurring at multiple scales, without any explicit assumption regarding the multivariate distribution of the heterogeneity. This paper extends our previous work by incorporating the effects of non-stationarity into the modeling workflow. The rock property at a given location is modeled as a random variable, which is decomposed into the sum of a trend (available at the same resolution as the transport modeling scale) and a residual component (defined at a much smaller scale). First, to scale up the residual component to the transport modeling scale, the corresponding volume variance is computed; by sampling numerous sets of “conditioning data” via bootstrapping and constructing multiple realizations of the residual components at the transport modeling scale, the uncertainty due to this scale-up process is captured. Next, to compute the input dispersivity at the transport modeling scale, a flow-based technique is adopted: multiple geostatistical realizations of the same physical size as the transport modeling scale are generated to describe the spatial heterogeneity below the modeling scale. Each realization is subjected to particle-tracking simulation. Effective longitudinal and transverse dispersivities are estimated by minimizing the difference between the effluent history of each realization and that of an equivalent average medium. Probability distributions of effective dispersivities are established by aggregating results from all realizations. The results demonstrate that large-scale non-stationarity and sub-scale variability both contribute to anomalous non-Fickian behavior. In comparison with our previous work, which ignored large-scale non-stationarity, the non-Fickian characteristics observed in this study are dramatically more pronounced.
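The volume-variance step can be illustrated with a few lines: for an assumed point-scale residual covariance model, the variance of the block-averaged residual equals the average covariance within the block, and it decreases as the block (transport-modeling) support grows. The exponential covariance below is an assumed example, not the paper's fitted model.

```python
import numpy as np

# assumed point-scale (residual) covariance model: exponential with unit sill
def cov(h, a=5.0):
    return np.exp(-np.abs(h) / a)

def volume_variance(block_len, dx=1.0):
    """Variance of the block-averaged residual = mean of the covariance over the block,
    computed by discretizing the block into cells."""
    n = int(block_len / dx)
    x = (np.arange(n) + 0.5) * dx
    H = np.abs(x[:, None] - x[None, :])
    return cov(H).mean()            # Var(block mean) = average covariance within the block

for L in [1, 5, 10, 20, 40]:
    print(f"block length {L:>3}: variance of block average = {volume_variance(L):.3f}")
# variance decreases with support size: the classical volume-variance relationship
```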

13.
Traditionally within the mining industry, single models for both grade and geology of orebodies are created upon which all mine development decisions are based. These models provide a single interpretation of the extent and continuity of the mineralization envelope based on solids and sections interpreted from relatively widely spaced drilling. The inherent variable behavior of grade and geology cannot be understood from a single estimated resource model. To account for uncertainty in the geology and mineralization envelope, Newmont Mining Corporation uses multiple-point statistics (MPS), an emerging spatial simulation framework, which can be employed to generate multiple, geologically realistic, realizations of data representing attributes of mineral deposits that display complex non-linear features. MPS uses a conceptual model of the geology, termed a training image, to infer these high-order spatial relationships. A detailed application of the MPS algorithm at the structurally controlled Apensu gold deposit, Ghana, demonstrates the practical intricacies of the MPS framework and documents efficiency and effectiveness. Multiple realizations of the Apensu deposit allow for an assessment of the geologic and volumetric uncertainty, which is further combined with grade simulations to generate a more complete picture of the true uncertainty of the deposit.  相似文献   
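The essence of MPS can be shown with a toy frequency-based sketch: multiple-point patterns are scanned from a categorical training image and then used to draw values cell by cell. This is a deliberately minimal, single-grid illustration of the concept, not the production algorithm applied at Apensu.

```python
import numpy as np
from collections import Counter, defaultdict

rng = np.random.default_rng(9)

# toy categorical training image with diagonal "mineralized" bands (1) in host rock (0)
ti = np.fromfunction(lambda i, j: ((i + j) // 4) % 2, (40, 40)).astype(int)

# scan 2x2 multiple-point patterns: conditional frequencies of the lower-right cell
# given its three already-known neighbors
stats = defaultdict(Counter)
for i in range(ti.shape[0] - 1):
    for j in range(ti.shape[1] - 1):
        key = (ti[i, j], ti[i, j + 1], ti[i + 1, j])
        stats[key][ti[i + 1, j + 1]] += 1

# simulate a new realization cell by cell using the scanned conditional frequencies
sim = np.zeros((40, 40), dtype=int)
sim[0, :] = ti[0]                 # seed the first row/column from the TI for simplicity
sim[:, 0] = ti[:, 0]
for i in range(1, 40):
    for j in range(1, 40):
        cnt = stats.get((sim[i - 1, j - 1], sim[i - 1, j], sim[i, j - 1]))
        if cnt:
            cats, freq = zip(*cnt.items())
            sim[i, j] = rng.choice(cats, p=np.array(freq) / sum(freq))
        else:                     # unseen pattern: fall back to the marginal proportion
            sim[i, j] = rng.random() < ti.mean()
print("TI vs simulated proportion of facies 1:", ti.mean(), sim.mean())
```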

14.

Conditioning complex subsurface flow models on nonlinear data is complicated by the need to preserve the expected geological connectivity patterns to maintain solution plausibility. Generative adversarial networks (GANs) have recently been proposed as a promising approach for low-dimensional representation of complex high-dimensional images. The method has also been adopted for low-rank parameterization of complex geologic models to facilitate uncertainty quantification workflows. A difficulty in adopting these methods for subsurface flow modeling is the complexity associated with nonlinear flow data conditioning. While conditional GAN (CGAN) can condition simulated images on labels, application to subsurface problems requires efficient conditioning workflows for nonlinear data, which is far more complex. We present two approaches for generating flow-conditioned models with complex spatial patterns using GAN. The first method is through conditional GAN, whereby a production response label is used as an auxiliary input during the training stage of GAN. The production label is derived from clustering of the flow responses of the prior model realizations (i.e., training data). The underlying assumption of this approach is that GAN can learn the association between the spatial features corresponding to the production responses within each cluster. An alternative method is to use a subset of samples from the training data that are within a certain distance from the observed flow responses and use them as training data within GAN to generate new model realizations. In this case, GAN is not required to learn the nonlinear relation between production responses and spatial patterns. Instead, it is tasked to learn the patterns in the selected realizations that provide a close match to the observed data. The conditional low-dimensional parameterization for complex geologic models with diverse spatial features (i.e., when multiple geologic scenarios are plausible) performed by GAN allows for exploring the spatial variability in the conditional realizations, which can be critical for decision-making. We present and discuss the important properties of GAN for data conditioning using several examples with increasing complexity.
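The second, simpler strategy described above (selecting prior realizations whose responses lie close to the observed data and retraining on that subset) can be sketched directly; the GAN itself is summarized by a placeholder `train_gan`, and all model/response arrays are hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(10)

# prior realizations and their simulated flow responses (both hypothetical stand-ins)
n_prior, n_cells, n_data = 2000, 64 * 64, 12
models = rng.normal(size=(n_prior, n_cells))
responses = rng.normal(size=(n_prior, n_data))          # e.g., production rates per well/time
d_obs = responses[7] + rng.normal(0, 0.05, n_data)      # synthetic observation

# keep only realizations whose responses are within a distance threshold of the data
dist = np.linalg.norm(responses - d_obs, axis=1)
threshold = np.quantile(dist, 0.02)                     # e.g., the closest 2 percent
subset = models[dist <= threshold]
print(f"selected {subset.shape[0]} of {n_prior} realizations for GAN training")

def train_gan(training_models):
    """Placeholder: train a generative model on the selected subset and return a sampler
    of new flow-conditioned realizations (any standard GAN implementation could go here)."""
    mean, std = training_models.mean(0), training_models.std(0) + 1e-6
    return lambda n: mean + std * rng.normal(size=(n, training_models.shape[1]))

sample_conditioned = train_gan(subset)
new_models = sample_conditioned(10)                     # new realizations close to the data
```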


15.
A stochastic channel embedded in a background facies is conditioned to data observed at wells. The background facies is a fixed rectangular box. The model parameters consist of geometric parameters that describe the shape, size, and location of the channel, and permeability and porosity in the channel and nonchannel facies. We extend methodology previously developed to condition a stochastic channel to well-test pressure data, and well observations of the channel thickness and the depth of the top of the channel. The main objective of this work is to characterize the reduction in uncertainty in channel model parameters and predicted reservoir performance that can be achieved by conditioning to well-test pressure data at one or more wells. Multiple conditional realizations of the geometric parameters and rock properties are generated to evaluate the uncertainty in model parameters. The ensemble of predictions of reservoir performance generated from the suite of realizations provides a Monte Carlo estimate of the uncertainty in future performance predictions. In addition, we provide some insight on how prior variances, data measurement errors, and sensitivity coefficients interact to determine the reduction in model parameters obtained by conditioning to pressure data and examine the value of active and observation well data in resolving model parameters.  相似文献   

16.

One main problem in the modeling of mineral deposits is to design a block model that divides the deposit into homogeneous subdomains. The spatial uncertainty in the geological boundaries becomes a critical factor prior to the modeling of the ore properties. For this reason, reducing the uncertainty of geological models leads to an improved mineral resource evaluation. This research work addresses the problem of updating geological models by using actual online-sensor measurement data. A novel algorithm is provided, which integrates the discrete wavelet transform into the ensemble Kalman filter for assimilating online-sensor production data into geological models. The geological realizations in each time step are transformed to wavelet coefficients and, after each assimilation step, the updated realizations are back-transformed to the original categorical distribution. Furthermore, a reconciliation process is performed to compare the online-sensor data derived from the production blocks with the updated realizations in each time step. The algorithm is illustrated through an application to the Golgohar iron deposit located southwest of Sirjan, Iran, and proves to reproduce the statistical parameters and connectivity values of the primary geological realizations.
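A minimal 1-D sketch of such an assimilation loop, assuming a hand-rolled single-level Haar transform to stay self-contained: each realization is moved to wavelet coefficients, a stochastic EnKF update is applied there against a hypothetical sensor-derived ore fraction, and the result is back-transformed and re-thresholded to a categorical field. This illustrates the general workflow, not the published algorithm.

```python
import numpy as np

rng = np.random.default_rng(11)

def haar(x):                       # single-level Haar DWT (x has even length)
    a, d = (x[0::2] + x[1::2]) / np.sqrt(2), (x[0::2] - x[1::2]) / np.sqrt(2)
    return np.concatenate([a, d])

def ihaar(c):                      # inverse single-level Haar DWT
    n = len(c) // 2
    a, d = c[:n], c[n:]
    x = np.empty(2 * n)
    x[0::2], x[1::2] = (a + d) / np.sqrt(2), (a - d) / np.sqrt(2)
    return x

# ensemble of categorical (0/1 "ore"/"waste") realizations along a 1-D block sequence
n_blocks, Ne = 64, 80
M = (rng.normal(size=(Ne, n_blocks)).cumsum(axis=1) > 0).astype(float)

# online-sensor data: observed ore fraction over the first few mined blocks
mined = slice(0, 8)
d_obs = np.array([0.75])
R = 0.01 * np.eye(1)

# EnKF update performed on the wavelet coefficients of each realization
W = np.array([haar(m) for m in M])
D = M[:, mined].mean(axis=1, keepdims=True)            # predicted ore fraction
A, Dd = W - W.mean(0), D - D.mean(0)
K = (A.T @ Dd) @ np.linalg.inv(Dd.T @ Dd + (Ne - 1) * R)
W_upd = W + (d_obs + rng.normal(0, 0.1, (Ne, 1)) - D) @ K.T

# back-transform and re-threshold to recover categorical realizations
M_upd = (np.array([ihaar(w) for w in W_upd]) > 0.5).astype(float)
print("mined-zone ore fraction, prior vs updated:",
      M[:, mined].mean(), M_upd[:, mined].mean())
```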


17.
Development of subsurface energy and environmental resources can be improved by tuning important decision variables, such as well locations and operating rates, to optimize a desired performance metric. Optimal well locations in a discretized reservoir model are typically identified by solving an integer programming problem, while identification of optimal well settings (controls) is formulated as a continuous optimization problem. In general, however, the decision variables in field development optimization can include many design parameters, such as the number, type, location, short-term and long-term operational settings (controls), and drilling schedule of the wells. In addition to the large number of decision variables, field optimization problems are further complicated by existing technical and physical constraints as well as by the uncertainty in describing the heterogeneous properties of geologic formations. In this paper, we consider simultaneous optimization of well locations and dynamic rate allocations under geologic uncertainty using a variant of simultaneous perturbation stochastic approximation (SPSA). In addition, by taking advantage of the robustness of SPSA against errors in calculating the cost function, we develop an efficient field development optimization under geologic uncertainty, where an ensemble of models is used to describe important flow and transport reservoir properties (e.g., permeability and porosity). We use several numerical experiments, including a channel layer of the SPE10 model and the three-dimensional PUNQ-S3 reservoir, to illustrate the performance improvement that can be achieved by solving a combined well placement and control optimization problem with the SPSA algorithm under known and uncertain reservoir model assumptions.
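A minimal SPSA sketch for a joint well-placement/rate problem: the expected objective over an ensemble of geologic models is approximated by a hypothetical NPV proxy, the gradient is estimated from two objective evaluations per iteration, and integer well locations are handled by simple rounding (just one of several possible treatments).

```python
import numpy as np

rng = np.random.default_rng(12)

def npv(u, model):
    """Hypothetical objective: NPV proxy for well location u[:2] (continuous, later rounded)
    and rate allocations u[2:], evaluated on one geologic realization."""
    loc, rates = u[:2], u[2:]
    return -np.sum((loc - model[:2]) ** 2) - np.sum((rates - model[2:]) ** 2)

def robust_objective(u, ensemble):
    return np.mean([npv(u, m) for m in ensemble])      # expectation over geologic uncertainty

ensemble = rng.normal(size=(20, 5))                    # 20 realizations of uncertain properties
u = np.zeros(5)                                        # [well_x, well_y, rate1, rate2, rate3]

# SPSA: two objective evaluations per iteration, regardless of problem dimension
a, c, alpha, gamma = 0.2, 0.3, 0.602, 0.101
for k in range(1, 201):
    ak, ck = a / k ** alpha, c / k ** gamma
    delta = rng.choice([-1.0, 1.0], size=u.size)       # simultaneous Bernoulli perturbation
    g_hat = (robust_objective(u + ck * delta, ensemble)
             - robust_objective(u - ck * delta, ensemble)) / (2 * ck * delta)
    u = u + ak * g_hat                                 # ascent on the expected objective

u_final = np.concatenate([np.round(u[:2]), u[2:]])     # snap well location to grid indices
print("optimized decision variables:", np.round(u_final, 2))
```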

18.
19.
Two methods for generating representative realizations from Gaussian and lognormal random field models are studied in this paper, with the term representative implying realizations that efficiently span the range of possible attribute values corresponding to the multivariate (log)normal probability distribution. The first method, already established in the geostatistical literature, is multivariate Latin hypercube sampling, a form of stratified random sampling aiming at marginal stratification of the simulated values for each variable involved, under the constraint of reproducing a known covariance matrix. The second method, scarcely known in the geostatistical literature, is stratified likelihood sampling, in which representative realizations are generated by exploring in a systematic way the structure of the multivariate distribution function itself. The two sampling methods are employed for generating unconditional realizations of saturated hydraulic conductivity in a hydrogeological context via a synthetic case study involving physically-based simulation of flow and transport in a heterogeneous porous medium; their performance is evaluated for different sample sizes (number of realizations) in terms of the reproduction of ensemble statistics of hydraulic conductivity and solute concentration computed from a very large ensemble set generated via simple random sampling. The results show that both Latin hypercube and stratified likelihood sampling are more efficient than simple random sampling, in that overall they can reproduce to a similar extent statistics of the conductivity and concentration fields, yet with smaller sampling variability than simple random sampling.
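Collapsing the field to the single normal score that drives each realization, the reduced sampling variability of ensemble statistics under stratification can be demonstrated in a few lines; the lognormal conductivity model and the P90 statistic below are illustrative assumptions, and the full multivariate (covariance-reproducing) schemes are not reproduced here.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(13)

def lognormal_K(z, mu=-4.0, sigma=1.0):
    """Saturated hydraulic conductivity as a lognormal transform of normal scores."""
    return np.exp(mu + sigma * z)

def sr_scores(n):                      # simple random sampling of the normal scores
    return rng.normal(size=n)

def lhs_scores(n):                     # one realization per equal-probability stratum
    u = (rng.permutation(n) + rng.uniform(size=n)) / n
    return norm.ppf(u)

# sampling variability of an ensemble statistic (the P90 of K) across repeated designs
n, trials = 20, 500
p90_sr = [np.percentile(lognormal_K(sr_scores(n)), 90) for _ in range(trials)]
p90_lhs = [np.percentile(lognormal_K(lhs_scores(n)), 90) for _ in range(trials)]
print("std of the P90 estimate, SR vs LHS:", np.std(p90_sr), np.std(p90_lhs))
```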

20.
Construction of predictive reservoir models invariably involves interpretation and interpolation between limited available data and adoption of imperfect modeling assumptions that introduce significant subjectivity and uncertainty into the modeling process. In particular, uncertainty in the geologic continuity model can significantly degrade the quality of fluid displacement patterns and predictive modeling outcomes. Here, we address a standing challenge in flow model calibration under uncertainty in geologic continuity by developing an adaptive sparse representation formulation for prior model identification (PMI) during model calibration. We develop a flow-data-driven sparsity-promoting inversion to discriminate against distinct prior geologic continuity models (e.g., variograms). Realizations of reservoir properties from each geologic continuity model are used to generate sparse geologic dictionaries that compactly represent models from each respective prior. For inversion initially the same number of elements from each prior dictionary is used to construct a diverse geologic dictionary that reflects a wide range of variability and uncertainty in the prior continuity. The inversion is formulated as a sparse reconstruction problem that inverts the flow data to identify and linearly combine the relevant elements from the large and diverse set of geologic dictionary elements to reconstruct the solution. We develop an adaptive sparse reconstruction algorithm in which, at every iteration, the contribution of each dictionary to the solution is monitored to replace irrelevant (insignificant) elements with more geologically relevant (significant) elements to improve the solution quality. Several numerical examples are used to illustrate the effectiveness of the proposed approach for identification of geologic continuity in practical model calibration problems where the uncertainty in the prior geologic continuity model can lead to biased inversion results and prediction.  相似文献   
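The dictionary-based identification step can be sketched compactly: realizations from two candidate variogram priors form a joint dictionary, a linear operator stands in for the flow data, and a greedy orthogonal-matching-pursuit-style loop selects the atoms that explain the data, revealing which prior is relevant. The adaptive element replacement of the full algorithm is not reproduced, and all operators here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(14)
nx, n_per_prior = 80, 60
x = np.arange(nx, dtype=float)

def realizations(corr_len, n):
    C = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
    return rng.multivariate_normal(np.zeros(nx), C, n)

# geologic dictionary: atoms (realizations) from two candidate prior continuity models
D_short = realizations(3.0, n_per_prior)      # prior A: short-range variogram
D_long = realizations(20.0, n_per_prior)      # prior B: long-range variogram
D = np.vstack([D_short, D_long]).T            # columns are dictionary atoms
D /= np.linalg.norm(D, axis=0)

# "flow data": a linear proxy G applied to a true model drawn from prior B
G = rng.normal(size=(25, nx)) / np.sqrt(nx)
m_true = realizations(20.0, 1)[0]
d_obs = G @ m_true + rng.normal(0, 0.01, 25)

# greedy sparse reconstruction (OMP-style): pick atoms whose responses explain the data
A = G @ D                                     # responses of the dictionary atoms
chosen, residual = [], d_obs.copy()
for _ in range(8):
    scores = np.abs(A.T @ residual) / np.linalg.norm(A, axis=0)
    scores[chosen] = -np.inf                  # do not reselect atoms
    chosen.append(int(np.argmax(scores)))
    coef, *_ = np.linalg.lstsq(A[:, chosen], d_obs, rcond=None)
    residual = d_obs - A[:, chosen] @ coef

m_est = D[:, chosen] @ coef                   # reconstructed model
n_from_long = sum(c >= n_per_prior for c in chosen)
print(f"{n_from_long} of {len(chosen)} selected atoms come from the long-range prior")
```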
