Similar Documents
20 similar documents found
1.
Source identification methods for sudden water pollution incidents   (cited by 2: 0 self-citations, 2 by others)
To solve the source identification problem in sudden water pollution incidents quickly and accurately, a new tracing method is proposed that builds on differential evolution and Monte Carlo simulation. The method casts source identification as a Bayesian estimation problem and derives the posterior probability density function of the unknown parameters: source strength, location, and release time. Combining differential evolution with Monte Carlo simulation, it samples this posterior distribution, estimates the unknown parameters, and thereby determines the source term. In a numerical example compared against the Bayesian-Monte Carlo method, the proposed method cut the number of iterations by three quarters; the mean relative errors of source strength, location, and release time fell by 1.23%, 2.23%, and 4.15%, and the mean errors by 0.39%, 0.83%, and 1.49%, respectively. Its stability and reliability are clearly higher than those of the Bayesian-Monte Carlo method, it identifies sudden water pollution sources well, and it offers a new line of attack on the source-tracing difficulties in sudden water pollution incidents.
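A minimal sketch of the differential-evolution-plus-Monte-Carlo idea, using ter Braak-style DE-MC proposals on a toy 1-D advection-dispersion plume. The plume model, sensor layout, and all numbers are illustrative, not from the cited study; the release time is fixed here so that only strength and location are inferred:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D instantaneous-release plume (analytic advection-dispersion
# solution); dispersion D, velocity u, and observation time t assumed known.
D, u, t = 2.0, 1.0, 5.0
def plume(theta, x):
    M, x0 = theta
    return M / np.sqrt(4 * np.pi * D * t) * np.exp(-((x - x0 - u * t) ** 2) / (4 * D * t))

true_theta = np.array([10.0, 3.0])          # source strength, release location
xs = np.linspace(0.0, 20.0, 15)             # sensor positions
sigma = 0.02
data = plume(true_theta, xs) + rng.normal(0.0, sigma, xs.size)

def log_post(theta):
    # flat prior on M > 0 and 0 <= x0 <= 20, Gaussian measurement noise
    if theta[0] <= 0.0 or not (0.0 <= theta[1] <= 20.0):
        return -np.inf
    r = data - plume(theta, xs)
    return -0.5 * np.sum(r ** 2) / sigma ** 2

# DE-MC: each chain moves by a scaled difference of two other chains,
# accepted with the usual Metropolis rule.
n_chains, n_iter, d = 8, 4000, 2
gamma = 2.38 / np.sqrt(2 * d)
chains = np.column_stack([rng.uniform(1.0, 20.0, n_chains),
                          rng.uniform(0.0, 20.0, n_chains)])
logp = np.array([log_post(c) for c in chains])
for _ in range(n_iter):
    for i in range(n_chains):
        r1, r2 = rng.choice([j for j in range(n_chains) if j != i], 2, replace=False)
        prop = chains[i] + gamma * (chains[r1] - chains[r2]) + rng.normal(0.0, 1e-3, d)
        lp = log_post(prop)
        if np.log(rng.uniform()) < lp - logp[i]:
            chains[i], logp[i] = prop, lp

est = chains.mean(axis=0)   # point estimate from the converged population
```

The difference-vector proposal adapts its scale and orientation to the posterior automatically, which is the usual explanation for needing far fewer iterations than a plain Metropolis sampler.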

2.
An adequate representation of the detailed spatial variation of subsurface parameters for underground flow and mass transport simulation entails heterogeneous models. Uncertainty characterization generally calls for a Monte Carlo analysis of many equally likely realizations that honor both direct information (e.g., conductivity data) and information about the state of the system (e.g., piezometric head or concentration data). Thus, the problem faced is how to generate multiple realizations conditioned to parameter data and inverse-conditioned to dependent state data. We propose using a Markov chain Monte Carlo (MCMC) approach with block updating, combined with upscaling, to achieve this purpose. Our proposal presents an alternative block updating scheme that permits the application of MCMC to inverse stochastic simulation of heterogeneous fields and incorporates upscaling in a multi-grid approach to speed up the generation of the realizations. The main advantage of MCMC, compared to other methods capable of generating inverse-conditioned realizations (such as the self-calibrating or the pilot point methods), is that it does not require the solution of a complex optimization inverse problem, although it requires the solution of the direct problem many times.
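A toy illustration of block updating for inverse-conditioned simulation: blocks are proposed from the prior conditional on the rest of the field, so the prior terms cancel and the Metropolis ratio reduces to the likelihood ratio of the state data. The 1-D field, covariance, and observations are invented for the sketch, and the multi-grid/upscaling acceleration is omitted:

```python
import numpy as np

rng = np.random.default_rng(1)

# 1-D "conductivity" field with exponential prior covariance, updated one
# random block at a time; two noisy point values play the role of state data.
n, blk = 40, 8
idx = np.arange(n)
C = np.exp(-np.abs(idx[:, None] - idx[None, :]) / 10.0)

obs_idx = np.array([10, 30])
obs_val = np.array([1.0, -0.5])
s = 0.1
loglik = lambda f: -0.5 * np.sum((f[obs_idx] - obs_val) ** 2) / s ** 2

f, trace = np.zeros(n), []
for _ in range(4000):
    i = rng.integers(0, n - blk + 1)
    b = np.arange(i, i + blk)
    r = np.setdiff1d(idx, b)
    # prior conditional N(mu_c, C_c) of the block given the rest of the field
    Crr_inv = np.linalg.inv(C[np.ix_(r, r)])
    mu_c = C[np.ix_(b, r)] @ Crr_inv @ f[r]
    C_c = C[np.ix_(b, b)] - C[np.ix_(b, r)] @ Crr_inv @ C[np.ix_(r, b)]
    prop = f.copy()
    prop[b] = rng.multivariate_normal(mu_c, C_c + 1e-9 * np.eye(blk))
    # proposal ratio cancels the prior ratio -> accept on likelihood ratio only
    if np.log(rng.uniform()) < loglik(prop) - loglik(f):
        f = prop
    trace.append(f[obs_idx].copy())

post = np.array(trace[2000:]).mean(axis=0)   # posterior mean at the data points
```

Because each proposal is a valid draw from the prior given the unchanged nodes, every accepted state honors the prior statistics while gradually matching the state data.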

3.
Inverse problems are ubiquitous in the Earth Sciences. Many such problems are ill-posed in the sense that multiple solutions can be found that match the data to be inverted. To impose restrictions on these solutions, a prior distribution of the model parameters is required. In a spatial context this prior model can be as simple as a Multi-Gaussian law with prior covariance matrix, or could come in the form of a complex training image describing the prior statistics of the model parameters. In this paper, two methods for generating inverse solutions constrained to such a prior model are compared. The gradual deformation method treats the problem of finding an inverse solution as an optimization problem. Using a perturbation mechanism, the gradual deformation method searches (optimizes) in the prior model space for those solutions that match the data to be inverted. The perturbation mechanism guarantees that the prior model statistics are honored. However, it is shown with a simple example that this perturbation method does not necessarily draw samples accurately from a given posterior distribution when the inverse problem is framed within a Bayesian context. On the other hand, the probability perturbation method approaches the inverse problem as a data integration problem. This method explicitly deals with the problem of combining prior probabilities with pre-posterior probabilities derived from the data. It is shown that the sampling properties of the probability perturbation method approach the accuracy of well-known Markov chain Monte Carlo samplers such as the rejection sampler. The paper uses simple examples to illustrate the clear differences between these two methods.
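For reference, the rejection sampler mentioned as the accuracy benchmark can be written in a few lines for a toy 1-D problem (prior N(0,1), identity forward model, noisy datum); the posterior here is known analytically, which is what makes the check possible:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy problem: prior m ~ N(0,1), datum d = m + noise (std 0.5), observed d = 1.
d_obs, s = 1.0, 0.5
lik = lambda m: np.exp(-0.5 * (d_obs - m) ** 2 / s ** 2)
L_max = 1.0                                # likelihood is maximal at m = d_obs

samples = []
while len(samples) < 20000:
    m = rng.normal(0.0, 1.0)               # draw from the prior
    if rng.uniform() < lik(m) / L_max:     # accept with probability L / L_max
        samples.append(m)
samples = np.array(samples)

# Conjugate analytic posterior for comparison: N(0.8, 0.2)
```

Rejection sampling is exact but its acceptance rate collapses as data become informative or the dimension grows, which is why it serves only as a gold standard in small examples.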

4.
The Bayesian framework is the standard approach for data assimilation in reservoir modeling. This framework involves characterizing the posterior distribution of geological parameters in terms of a given prior distribution and data from the reservoir dynamics, together with a forward model connecting the space of geological parameters to the data space. Since the posterior distribution quantifies the uncertainty in the geologic parameters of the reservoir, the characterization of the posterior is fundamental for the optimal management of reservoirs. Unfortunately, due to the large-scale, highly nonlinear properties of standard reservoir models, characterizing the posterior is computationally prohibitive. Instead, more affordable ad hoc techniques, based on Gaussian approximations, are often used for characterizing the posterior distribution. Evaluating the performance of those Gaussian approximations is typically conducted by assessing their ability to reproduce the truth within the confidence interval provided by the ad hoc technique under consideration. This has the disadvantage of mixing up the approximation properties of the history matching algorithm employed with the information content of the particular observations used, making it hard to evaluate the effect of the ad hoc approximations alone. In this paper, we avoid this disadvantage by comparing the ad hoc techniques with a fully resolved state-of-the-art probing of the Bayesian posterior distribution. The ad hoc techniques whose performance we assess are based on (1) linearization around the maximum a posteriori estimate, (2) randomized maximum likelihood, and (3) ensemble Kalman filter-type methods. In order to fully resolve the posterior distribution, we implement a state-of-the-art Markov chain Monte Carlo (MCMC) method that scales well with respect to the dimension of the parameter space, enabling us to study realistic forward models, in two space dimensions, at a high level of grid refinement.
Our implementation of the MCMC method provides the gold standard against which the aforementioned Gaussian approximations are assessed. We present numerical synthetic experiments where we quantify the capability of each ad hoc Gaussian approximation in reproducing the mean and the variance of the posterior distribution (characterized via MCMC) associated with a data assimilation problem. Both single-phase and two-phase (oil–water) reservoir models are considered so that fundamental differences in the resulting forward operators are highlighted. The main objective of our controlled experiments was to exhibit the substantial discrepancies in the approximation properties of standard ad hoc Gaussian approximations. Numerical investigations of the type we present here will lead to a greater understanding of the cost-efficient, but ad hoc, Bayesian techniques used for data assimilation in petroleum reservoirs and hence ultimately to improved techniques with more accurate uncertainty quantification.
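The comparison of a linearization-around-MAP (Laplace) approximation against a fully resolved MCMC posterior can be reproduced in miniature on a toy scalar forward model; the model and all numbers are invented, and a plain random-walk Metropolis chain stands in for the dimension-robust MCMC used in the paper:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy nonlinear forward model; data, noise, and prior scales are illustrative.
g = lambda m: m + 0.2 * m ** 2
d_obs, s, prior_s = 1.0, 0.3, 1.0

def neg_log_post(m):
    return 0.5 * (d_obs - g(m)) ** 2 / s ** 2 + 0.5 * m ** 2 / prior_s ** 2

# MAP by dense grid search (adequate in one dimension)
grid = np.linspace(-3.0, 3.0, 20001)
m_map = grid[np.argmin(neg_log_post(grid))]

# Laplace (linearization-around-MAP) variance from finite-difference curvature
h = 1e-4
curv = (neg_log_post(m_map + h) - 2 * neg_log_post(m_map) + neg_log_post(m_map - h)) / h ** 2
laplace_var = 1.0 / curv

# Random-walk Metropolis as the fully resolved reference posterior
m, chain = 0.0, []
for _ in range(100000):
    prop = m + rng.normal(0.0, 0.5)
    if np.log(rng.uniform()) < neg_log_post(m) - neg_log_post(prop):
        m = prop
    chain.append(m)
chain = np.array(chain[10000:])
```

For a mildly nonlinear forward model the two characterizations nearly agree; the paper's point is that for realistic reservoir physics the Gaussian moments can deviate substantially from the MCMC reference.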

5.
Because of survey costs and test-site constraints, the test data available in geotechnical practice are usually limited, and limited data make it hard to estimate soil parameter statistics and slope reliability accurately. Bayesian methods can fuse the limited site information to reduce the estimated uncertainty of the soil parameters and thereby improve the assessed slope reliability. Most Bayesian updating studies, however, assume normal, lognormal, or uniform prior distributions for the parameters and a multivariate normal likelihood, and the reasonableness of these choices needs further scrutiny. This paper surveys the prior distributions and likelihood models commonly used in geotechnical Bayesian analysis and, taking an undrained clay slope as an example, uses an adaptive Bayesian updating method to examine systematically how the prior distribution and the likelihood function affect the inferred posterior distributions of spatially variable slope parameters and the updated reliability. The results show that the prior distribution has some influence on both: with lognormal or extreme-value type I priors, the inferred posterior distributions show less scatter. Beta and extreme-value type I priors give slope reliability results that are conservative and unconservative, respectively, while lognormal priors give intermediate values. By comparison, the influence of the likelihood function is more pronounced. Compared with other likelihood types, a likelihood built from a multivariate joint normal distribution reduces the estimated parameter uncertainty while producing results that agree better with the site information. In addition, the autocorrelation between measurement errors at different locations when constructing the likelihood also affects the posterior failure probability of the slope.
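The effect of the prior choice can be illustrated with a simple grid-based Bayesian update of an undrained shear strength under two different priors with matched mean and standard deviation; all values are illustrative, and the spatial variability and likelihood-model comparisons of the paper are not reproduced here:

```python
import numpy as np

# Candidate values of undrained shear strength s_u (kPa)
su = np.linspace(5.0, 80.0, 4000)
mu, sd = 30.0, 8.0                          # prior mean / standard deviation

norm_prior = np.exp(-0.5 * ((su - mu) / sd) ** 2)
zeta = np.sqrt(np.log(1.0 + (sd / mu) ** 2))     # lognormal with the same
lam = np.log(mu) - 0.5 * zeta ** 2               # mean and std as the normal
logn_prior = np.exp(-0.5 * ((np.log(su) - lam) / zeta) ** 2) / su

# Likelihood from three hypothetical site measurements, 4 kPa measurement error
obs = np.array([38.0, 35.0, 40.0])
lik = np.exp(-0.5 * np.sum((obs[:, None] - su[None, :]) ** 2, axis=0) / 4.0 ** 2)

def post_mean(prior):
    w = prior * lik
    return np.sum(su * w) / np.sum(w)

m_norm, m_logn = post_mean(norm_prior), post_mean(logn_prior)
```

With only a few measurements the likelihood already dominates, so the two posterior means differ only modestly; the gap widens as the data become scarcer or the priors more skewed, which is the regime the abstract studies.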

6.
Rock mechanical parameters and their uncertainties are critical to rock stability analysis, engineering design, and safe construction in rock mechanics and engineering. Back analysis is widely adopted in rock engineering to determine the mechanical parameters of the surrounding rock mass, but it does not normally consider uncertainty. This problem is addressed here by developing a system of Bayesian inferences that updates the mechanical parameters and their statistical properties from monitored field data, integrating the monitored data, prior knowledge of the geotechnical parameters, and a mechanical model of the rock tunnel through Markov chain Monte Carlo (MCMC) simulation. The proposed approach is illustrated on a circular tunnel with an analytical solution and then applied to an experimental tunnel at the Goupitan Hydropower Station, China. The mechanical properties and strength parameters of the surrounding rock mass were modeled as random variables. The displacement predicted with the parameters updated by the Bayesian inferences agreed closely with the monitored displacements, indicating that the Bayesian inferences fold the monitored data into the tunnel model and update its parameters dynamically. Further study indicated that the performance of the Bayesian inferences improves greatly when field monitoring data are supplied regularly. Bayesian inference is a significant new approach for determining the mechanical parameters of the surrounding rock mass in a tunnel model and contributes to safe construction in rock engineering.

7.
1D unbiased Bayesian inversion for magnetotellurics   (cited by 2: 0 self-citations, 2 by others)
Bayesian theory is applied to the unbiased uncertainty analysis of the one-dimensional (1D) magnetotelluric inverse problem. In the Bayesian framework, the measured data and prior information are contained in the posterior probability density (PPD), from which Bayesian inferences such as single-point model estimates and uncertainties can be drawn; obtaining them requires optimizing the inverse problem for the best-fit model and integrating the PPD by sampling in a high-dimensional model space. How completely, thoroughly, and efficiently the space is sampled strongly affects the inversion results. To make the sampling more efficient and complete, the numerical integration uses Metropolis-Hastings sampling in principal-component parameter space, with several sampling temperatures. The inversion employs both under-parameterized and over-parameterized approaches, and the data errors and the regularization factor are treated as random variables. The results yield the uncertainty of each parameter, the correlations between parameters, and the uncertainty distribution of the model at different depths. Inversion of the COPROD1 data set reveals a bimodal structure in model space. The non-geoelectric parameters are constrained by the inversion, showing that the data contain information not only about the geophysical model (conductivity, etc.) but also about these non-geoelectric parameters.
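Metropolis-Hastings sampling in principal-component parameter space can be sketched on a toy correlated 2-D target: proposals step along the eigenvectors of the target covariance, which decorrelates the parameters (the tempering and the under/over-parameterization of the paper are omitted, and the target is an invented Gaussian):

```python
import numpy as np

rng = np.random.default_rng(3)

# Highly correlated 2-D Gaussian target (stand-in for a correlated posterior)
C = np.array([[1.0, 0.95], [0.95, 1.0]])
C_inv = np.linalg.inv(C)
log_t = lambda x: -0.5 * x @ C_inv @ x

vals, vecs = np.linalg.eigh(C)            # principal axes of the target
x, acc, chain = np.zeros(2), 0, []
for _ in range(60000):
    # propose along principal components, step size scaled per component
    step = vecs @ (rng.normal(0.0, 1.0, 2) * np.sqrt(vals)) * 0.8
    prop = x + step
    if np.log(rng.uniform()) < log_t(prop) - log_t(x):
        x, acc = prop, acc + 1
    chain.append(x)
chain = np.array(chain[5000:])
```

Proposing isotropically in the rotated, rescaled space avoids the tiny steps that an axis-aligned random walk would need to keep a usable acceptance rate on such an elongated target.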

8.
Model calibration and history matching are important techniques to adapt simulation tools to real-world systems. When prediction uncertainty needs to be quantified, one has to use the respective statistical counterparts, e.g., Bayesian updating of model parameters and data assimilation. For complex and large-scale systems, however, even single forward deterministic simulations may require parallel high-performance computing. This often makes accurate brute-force and nonlinear statistical approaches infeasible. We propose an advanced framework for parameter inference or history matching based on the arbitrary polynomial chaos expansion (aPC) and strict Bayesian principles. Our framework consists of two main steps. In step 1, the original model is projected onto a mathematically optimal response surface via the aPC technique. The resulting response surface can be viewed as a reduced (surrogate) model. It captures the model’s dependence on all parameters relevant for history matching at high-order accuracy. Step 2 consists of matching the reduced model from step 1 to observation data via bootstrap filtering. Bootstrap filtering is a fully nonlinear and Bayesian statistical approach to the inverse problem in history matching. It allows post-calibration parameter and prediction uncertainty to be quantified and is more accurate than ensemble Kalman filtering or linearized methods. Through this combination, we obtain a statistical method for history matching that is accurate, yet has a computational speed that is more than sufficient to be developed towards real-time application. We motivate and demonstrate our method on the problem of CO2 storage in geological formations, using a low-parametric homogeneous 3D benchmark problem. In a synthetic case study, we update the parameters of a CO2/brine multiphase model on monitored pressure data during CO2 injection.
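The two-step structure (surrogate projection, then bootstrap filtering) can be sketched with an ordinary least-squares polynomial standing in for the aPC response surface and importance resampling as the bootstrap filter; the "expensive" forward model is a stand-in scalar function and all numbers are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)

# Stand-in for an expensive forward model (e.g., a pressure response)
forward = lambda k: np.sqrt(k) + 0.1 * k

# Step 1: degree-3 polynomial surrogate fitted to 8 forward runs
k_train = np.linspace(0.1, 3.0, 8)
coef = np.polyfit(k_train, forward(k_train), 3)
surrogate = lambda k: np.polyval(coef, k)

# Step 2: bootstrap filter (importance resampling) on the cheap surrogate
d_obs, s = forward(1.2) + 0.01, 0.05       # synthetic datum, noise std
prior = rng.uniform(0.1, 3.0, 20000)       # prior ensemble of the parameter
w = np.exp(-0.5 * (d_obs - surrogate(prior)) ** 2 / s ** 2)
w /= w.sum()
posterior = rng.choice(prior, 20000, p=w)  # resampling step
```

All 20,000 likelihood evaluations hit the surrogate rather than the forward model, which is the source of the speed-up; the surrogate's fitting error enters the posterior as an extra, usually small, bias.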

9.
Building of models in the Earth Sciences often requires the solution of an inverse problem: some unknown model parameters need to be calibrated with actual measurements. In most cases, the set of measurements cannot completely and uniquely determine the model parameters; hence multiple models can describe the same data set. Bayesian inverse theory provides a framework for solving this problem. Bayesian methods rely on the fact that the conditional probability of the model parameters given the data (the posterior) is proportional to the likelihood of observing the data and a prior belief expressed as a prior distribution of the model parameters. In case the prior distribution is not Gaussian and the relation between data and parameters (forward model) is strongly non-linear, one has to resort to iterative samplers, often Markov chain Monte Carlo methods, for generating samples that fit the data likelihood and reflect the prior model statistics. While theoretically sound, such methods can be slow to converge, and are often impractical when the forward model is CPU demanding. In this paper, we propose a new sampling method that can sample from a variety of priors and condition model parameters to a variety of data types. The method does not rely on the traditional Bayesian decomposition of posterior into likelihood and prior; instead, it uses so-called pre-posterior distributions, i.e. the probability of the model parameters given some subset of the data. The use of pre-posteriors makes it possible to decompose the data into so-called “easy data” (or linear data) and “difficult data” (or nonlinear data). The method relies on fast non-iterative sequential simulation to generate model realizations.
The difficult data is matched by perturbing an initial realization using a perturbation mechanism termed “probability perturbation.” The probability perturbation method moves the initial guess closer to matching the difficult data, while maintaining the prior model statistics and the conditioning to the linear data. Several examples are used to illustrate the properties of this method.

10.
This paper shows the application of the Bayesian inference approach in estimating spatial covariance parameters. This methodology is particularly valuable where the number of experimental data is small, as occurs frequently in modeling reservoirs in petroleum engineering or when dealing with hydrodynamic variables in groundwater hydrology. There are two main advantages of Bayesian estimation: first, the complete distribution of the parameters is estimated, and from this distribution it is straightforward to obtain point estimates, confidence regions, and interval estimates; second, all the prior information about the parameters (information available before the data are collected) enters the inference procedure through their prior distribution. The results obtained from simulation studies are discussed.
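A grid-based version of this idea for a single covariance parameter: the posterior of the range of an exponential covariance is evaluated from a small spatial sample using the exact Gaussian likelihood and a uniform prior (synthetic data, illustrative sizes):

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic 1-D sample from an exponential covariance with range a_true
n, a_true = 40, 2.0
x = np.sort(rng.uniform(0.0, 10.0, n))
C_true = np.exp(-np.abs(x[:, None] - x[None, :]) / a_true)
z = np.linalg.cholesky(C_true + 1e-10 * np.eye(n)) @ rng.normal(0.0, 1.0, n)

# Exact Gaussian log-likelihood on a grid of candidate ranges, uniform prior
a_grid = np.linspace(0.2, 8.0, 120)
loglik = np.empty(a_grid.size)
for k, a in enumerate(a_grid):
    C = np.exp(-np.abs(x[:, None] - x[None, :]) / a) + 1e-8 * np.eye(n)
    sign, logdet = np.linalg.slogdet(C)
    loglik[k] = -0.5 * (logdet + z @ np.linalg.solve(C, z))

post = np.exp(loglik - loglik.max())
post /= post.sum()
a_mean = np.sum(a_grid * post)        # posterior mean of the range
```

The full posterior over `a_grid` directly yields the point estimates and interval estimates the abstract highlights; with so few data the distribution stays visibly wide, which is exactly the information a single point estimate would hide.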

11.
DC resistivity inverse problems are ill-posed. For the Vertical Electrical Sounding method the acceptable solutions lie in very narrow elongated-shape regions in the model space. To characterize this ensemble of solutions is a central question. In a Bayesian framework this issue is solved adopting as solution the so-called model posterior probability distribution. However, due to the nonlinearity of the problem, this distribution is not explicitly known, or it is difficult to calculate. Therefore, algorithms that efficiently sample the model space according to it (importance sampling) are very desirable. The main goal of this paper is to numerically explore the performance of binary genetic algorithms as posterior importance sampling strategies. Their behavior is first analyzed using 2D synthetic posterior test functions bearing the relevant properties of the real geo-electrical inverse problem. The conclusions are then checked through the histogram reconstruction of parameters in a synthetic VES case and, eventually, in a real, higher-dimensional seawater coastal-intrusion problem, by comparing the results with those obtained with a theoretically correct Metropolis-Hastings importance sampler (simulated annealing without cooling). Percentile curves are introduced as an effective tool for risk assessment. We show that binary genetic algorithms perform well under very general assumptions: when roulette-wheel selection is used, the mutation rate is above 10%, and the algorithm does not incorporate elitism, the results do not depend on the values of the remaining tuning parameters. Finally, to improve the efficiency of the sampling strategy, we introduce a binary genetic algorithm with oriented search space. This is done with the help of linearization of the forward operator and singular value decomposition around the maximum posterior estimate. It is shown, also, that the logarithmic model parameterization is adequate for this task.
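A compact version of the binary GA used as a sampling heuristic, with the settings the abstract reports as robust (roulette-wheel selection, mutation above 10%, no elitism); the "posterior" is a toy 1-D density and the chromosome length is arbitrary:

```python
import numpy as np

rng = np.random.default_rng(7)

bits, n_pop, n_gen, pmut = 12, 60, 80, 0.12
lo_b, hi_b = 0.0, 100.0
target = lambda r: np.exp(-0.5 * ((r - 42.0) / 5.0) ** 2)   # toy posterior density

weights = 2.0 ** np.arange(bits)[::-1]
decode = lambda pop: lo_b + (pop @ weights) * (hi_b - lo_b) / (2 ** bits - 1)

pop = rng.integers(0, 2, (n_pop, bits))
for _ in range(n_gen):
    fit = target(decode(pop))
    p = fit / fit.sum()                      # roulette-wheel selection
    pop = pop[rng.choice(n_pop, n_pop, p=p)]
    for i in range(0, n_pop - 1, 2):         # single-point crossover on pairs
        c = rng.integers(1, bits)
        pop[i, c:], pop[i + 1, c:] = pop[i + 1, c:].copy(), pop[i, c:].copy()
    pop ^= (rng.uniform(size=pop.shape) < pmut).astype(pop.dtype)   # >10% mutation

final = decode(pop)
```

The high mutation rate and the absence of elitism are what keep the population spread over the high-posterior region instead of collapsing onto the single best individual, which is the property that makes the GA usable as an (approximate) importance sampler.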

12.
Rock physical parameters such as porosity and water saturation play an important role in the mechanical behavior of hydrocarbon reservoir rocks. A valid and reliable prediction of these parameters from seismic data is essential for reservoir characterization, management, and also geomechanical modeling. In this paper, the application of conventional methods such as Bayesian inversion and of computational intelligence methods, namely support vector regression (SVR) optimized by particle swarm optimization (PSO) and the adaptive network-based fuzzy inference system with subtractive clustering (ANFIS-SCM), is demonstrated for predicting porosity and water saturation. The prediction abilities offered by Bayesian inversion, SVR-PSO, and ANFIS-SCM were compared using a synthetic dataset and field data from a gas carbonate reservoir in Iran. In these models, seismic pre-stack data and attributes were utilized as the input parameters, while porosity and water saturation were the output parameters. Various statistical performance indexes were utilized to compare the performance of the estimation models. The results indicate that the ANFIS-SCM model has strong potential for indirect estimation of porosity and water saturation with a high degree of accuracy and robustness from seismic data and attributes in both the synthetic and the real cases of this study.

13.
Stochastic fractal (fGn and fBm) porosity and permeability fields are conditioned to given variogram, static (or hard), and multiwell pressure data within a Bayesian estimation framework. Because fGn distributions are normal/second-order stationary, it is shown that the Bayesian estimation methods based on the assumption of normal/second-order stationary distributions can be directly used to generate fGn porosity/permeability fields conditional to pressure data. However, because fBm is not second-order stationary, it is shown that such Bayesian estimation methods can be used with implementation of a pseudocovariance approach to generate fBm porosity/permeability fields conditional to multiwell pressure data. In addition, we provide methods to generate unconditional realizations of fBm/fGn fields honoring all variogram parameters. These unconditional realizations can then be conditioned to hard and pressure data observed at wells by using the randomized maximum likelihood method. Synthetic examples generated from one-, two-, and three-dimensional single-phase flow simulators are used to show the applicability of our methodology for generating realizations of fBm/fGn porosity and permeability fields conditioned to well-test pressure data and evaluating the uncertainty in reservoir performance predictions appropriately using these history-matched realizations.

14.

Recently, statistical distributions have been explored to provide estimates of the mineralogical diversity of Earth and Earth-like planets. In this paper, a Bayesian approach is introduced to estimate Earth’s undiscovered mineralogical diversity. Samples are generated from a posterior distribution of the model parameters using Markov chain Monte Carlo simulations such that estimates and inference are directly obtained. It was previously shown that the mineral species frequency distribution conforms to a generalized inverse Gauss–Poisson (GIGP) large number of rare events model. Even though the model fit was good, the population size estimate obtained by using this model was found to be unreasonably low by mineralogists. In this paper, several zero-truncated, mixed Poisson distributions are fitted and compared, where the Poisson-lognormal distribution is found to provide the best fit. Subsequently, the population size estimates obtained by Bayesian methods are compared to the empirical Bayes estimates. Species accumulation curves are constructed and employed to estimate the population size as a function of sampling size. Finally, the relative abundances, and hence the occurrence probabilities of species in a random sample, are calculated numerically for all mineral species in Earth’s crust using the Poisson-lognormal distribution. These calculations are connected and compared to the calculations obtained in a previous paper using the GIGP model for which mineralogical criteria of an Earth-like planet were given.


15.
In this paper, a Bayesian approach for updating a semi-empirical model for predicting excavation-induced maximum ground settlement using centrifuge test data is presented. The Bayesian approach involves three steps: (1) prior estimate of the maximum ground settlement and model bias factor, (2) establishment of the likelihood function and posterior distribution of the model bias factor using the settlement measurement in the centrifuge test, and (3) development of posterior distribution of the predicted maximum settlement. This Bayesian approach is demonstrated with a case study of a well-documented braced excavation, and the results show that the accuracy of the maximum settlement prediction can be improved and the model uncertainty can be reduced with Bayesian updating.
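For a normal prior and a normal measurement model of the bias factor b = measured/predicted settlement, the three-step update reduces to a conjugate calculation; all numbers below are illustrative, not from the cited centrifuge tests:

```python
import numpy as np

# Step 1: prior on the bias factor b and the semi-empirical prediction
mu0, sd0 = 1.0, 0.3        # prior mean / std of b (illustrative)
pred = 52.0                 # predicted maximum ground settlement (mm)
meas = 45.0                 # centrifuge settlement measurement (mm)
sd_m = 0.1                  # std of the observed bias from measurement error

# Step 2: posterior of b (normal prior x normal likelihood, conjugate update)
b_obs = meas / pred
prec = 1.0 / sd0 ** 2 + 1.0 / sd_m ** 2
mu_post = (mu0 / sd0 ** 2 + b_obs / sd_m ** 2) / prec
sd_post = np.sqrt(1.0 / prec)

# Step 3: posterior mean of the predicted maximum settlement
settle_mean = mu_post * pred
```

The posterior standard deviation of the bias factor is necessarily smaller than the prior one, which is the "model uncertainty can be reduced" statement in quantitative form.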

16.
In history matching of a lithofacies reservoir model, we attempt to find multiple realizations of the lithofacies configuration that are conditional to dynamic data and representative of the model uncertainty space. This problem can be formalized in the Bayesian framework. Given a truncated Gaussian model as a prior and the dynamic data with its associated measurement error, we want to sample from the conditional distribution of the facies given the data. A relevant way to generate conditioned realizations is to use Markov chain Monte Carlo (MCMC). However, the dimensions of the model and the computational cost of each iteration are two important pitfalls for the use of MCMC. Furthermore, classical MCMC algorithms mix slowly, that is, they will not explore the whole support of the posterior in the time of the simulation. In this paper, we extend the methodology already described in a previous work to the problem of history matching of a Gaussian-related lithofacies reservoir model. We first show how to drastically reduce the dimension of the problem by using a truncated Karhunen-Loève expansion of the Gaussian random field underlying the lithofacies model. Moreover, we propose an innovative criterion for choosing the number of components, based on the connexity function. Then, we show how we improve the mixing properties of classical single-chain MCMC, without increasing the global computational cost, by the use of parallel interacting Markov chains. Applying the dimension reduction and this innovative sampling method drastically lowers the number of iterations needed to sample efficiently from the posterior. We show the encouraging results obtained when applying the methodology to a synthetic history-matching case.
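The dimension-reduction step can be sketched on its own: a truncated Karhunen-Loève expansion of a 1-D Gaussian field reduces the parameterization from the grid size to a handful of coefficients (the truncation rule here is a simple 95%-variance criterion, not the connexity-based criterion of the paper, and the covariance is invented):

```python
import numpy as np

rng = np.random.default_rng(6)

n = 200
x = np.linspace(0.0, 1.0, n)
C = np.exp(-np.abs(x[:, None] - x[None, :]) / 0.2)   # exponential covariance
vals, vecs = np.linalg.eigh(C)
vals, vecs = vals[::-1], vecs[:, ::-1]               # descending eigenvalues

# Keep the leading terms that capture 95% of the prior variance
m = int(np.searchsorted(np.cumsum(vals) / vals.sum(), 0.95) + 1)

xi = rng.normal(0.0, 1.0, m)                         # KL coefficients
field = vecs[:, :m] @ (np.sqrt(vals[:m]) * xi)       # one realization
```

The MCMC then operates on the `m` standard normal coefficients `xi` instead of the `n` grid values, which is what makes each chain update affordable.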

17.
A Bayesian linear inversion methodology based on Gaussian mixture models and its application to geophysical inverse problems are presented in this paper. The proposed inverse method is based on a Bayesian approach under the assumptions of a Gaussian mixture random field for the prior model and a Gaussian linear likelihood function. The model for the latent discrete variable is defined to be a stationary first-order Markov chain. In this approach, a recursive exact solution to an approximation of the posterior distribution of the inverse problem is proposed. A Markov chain Monte Carlo algorithm can be used to efficiently simulate realizations from the correct posterior model. Two inversion studies based on real well log data are presented, and the main results are the posterior distributions of the reservoir properties of interest, the corresponding predictions and prediction intervals, and a set of conditional realizations. The first application is a seismic inversion study for the prediction of lithological facies, P- and S-impedance, where an improvement of 30% in the root-mean-square error of the predictions compared to the traditional Gaussian inversion is obtained. The second application is a rock physics inversion study for the prediction of lithological facies, porosity, and clay volume, where predictions slightly improve compared to the Gaussian inversion approach.
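The key closed-form property can be shown in one dimension: with a Gaussian mixture prior and a linear Gaussian likelihood, the posterior is again a Gaussian mixture whose component moments follow the usual linear-Gaussian update and whose weights are rescaled by each component's evidence (two hypothetical facies components, invented numbers):

```python
import numpy as np

d_obs, g, s2 = 2.0, 1.0, 0.25            # datum, linear operator, noise variance
w = np.array([0.5, 0.5])                  # prior facies weights
mu = np.array([0.0, 3.0])                 # component means
v = np.array([0.5, 0.5])                  # component variances

# Per-component linear-Gaussian update
v_post = 1.0 / (1.0 / v + g ** 2 / s2)
mu_post = v_post * (mu / v + g * d_obs / s2)

# Weights rescaled by each component's evidence N(d; g*mu, g^2*v + s2)
ev = np.exp(-0.5 * (d_obs - g * mu) ** 2 / (g ** 2 * v + s2)) / np.sqrt(g ** 2 * v + s2)
w_post = w * ev / np.sum(w * ev)

post_mean = np.sum(w_post * mu_post)      # posterior mean of the mixture
```

The updated weights are exactly the posterior facies probabilities, which is why this construction yields facies classification and continuous-property inversion in one pass.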

18.

Minimization of a stochastic cost function is commonly used for approximate sampling in high-dimensional Bayesian inverse problems with Gaussian prior distributions and multimodal posterior distributions. The density of the samples generated by minimization is not the desired target density, unless the observation operator is linear, but the distribution of samples is useful as a proposal density for importance sampling or for Markov chain Monte Carlo methods. In this paper, we focus on applications to sampling from multimodal posterior distributions in high dimensions. We first show that sampling from multimodal distributions is improved by computing all critical points instead of only minimizers of the objective function. For applications to high-dimensional geoscience inverse problems, we demonstrate an efficient approximate weighting that uses a low-rank Gauss-Newton approximation of the determinant of the Jacobian. The method is applied to two toy problems with known posterior distributions and a Darcy flow problem with multiple modes in the posterior.
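In the linear-Gaussian case, where minimization-based sampling is exact, the method reduces to randomized maximum likelihood: each sample minimizes a quadratic cost with perturbed data and a perturbed prior draw. A sketch with an invented 1×2 observation operator:

```python
import numpy as np

rng = np.random.default_rng(8)

G = np.array([[1.0, 0.5]])                 # linear observation operator (invented)
d_obs = np.array([1.0])
Cd = 0.1 * np.eye(1)                       # data-error covariance
Cm = np.eye(2)                             # prior covariance, prior mean zero

Cd_inv, Cm_inv = np.linalg.inv(Cd), np.linalg.inv(Cm)
A = np.linalg.inv(G.T @ Cd_inv @ G + Cm_inv)   # posterior covariance (linear case)

samples = []
for _ in range(5000):
    d_pert = d_obs + rng.multivariate_normal(np.zeros(1), Cd)
    m_pert = rng.multivariate_normal(np.zeros(2), Cm)
    # closed-form minimizer of the stochastic quadratic cost
    samples.append(A @ (G.T @ Cd_inv @ d_pert + Cm_inv @ m_pert))
samples = np.array(samples)

post_mean = A @ (G.T @ Cd_inv @ d_obs)     # analytic posterior mean
```

With a nonlinear observation operator the minimizers no longer follow the posterior, which is where the paper's importance weighting over all critical points (not only minimizers) comes in.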


19.
Bayesian probabilistic inference of the foundation settlement correction coefficient   (cited by 2: 0 self-citations, 2 by others)
After analyzing the deterministic and somewhat arbitrary way in which the settlement correction coefficient is chosen by conventional methods, Bayesian theory, which combines past information with current sample information, is introduced. For the red-clay subgrade of a passenger-dedicated rail line, the posterior distribution is used to obtain the range of the correction coefficient. The case study shows that, synthesizing past experience with the sample information, the prior probability of the correction coefficient can be estimated as uniform over an interval. The correction coefficients derived from the settlements measured in field loading tests and from the theoretically calculated settlements combine the field settlement measurements with the prior information; applying Bayesian statistics to the small-sample test data, the inferred posterior distribution of the correction coefficient is normal. Interval estimation of the posterior parameters gives an optimized range of [1.0, 1.7] for the settlement correction coefficient of red-clay foundations in this region, and probability distribution models of the correction coefficient under different loading conditions are analyzed.
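The update described above (uniform prior on an interval, normal likelihood from a small sample of measured/calculated settlement ratios) can be reproduced on a grid; the sample values and measurement standard deviation below are hypothetical, not the rail-line data:

```python
import numpy as np

m_grid = np.linspace(0.8, 2.0, 1201)
dm = m_grid[1] - m_grid[0]
prior = np.where((m_grid >= 1.0) & (m_grid <= 1.8), 1.0, 0.0)   # uniform prior

sample = np.array([1.32, 1.45, 1.28, 1.51])   # hypothetical measured ratios
s = 0.15                                       # assumed measurement std
lik = np.exp(-0.5 * np.sum((sample[:, None] - m_grid[None, :]) ** 2, axis=0) / s ** 2)

post = prior * lik
post /= post.sum() * dm                        # normalized posterior density

m_mean = np.sum(m_grid * post) * dm
cdf = np.cumsum(post) * dm                     # 95% credible interval from the CDF
lo, hi = m_grid[np.searchsorted(cdf, 0.025)], m_grid[np.searchsorted(cdf, 0.975)]
```

Away from the prior bounds the posterior is effectively the normal distribution the abstract reports, and `[lo, hi]` plays the role of the optimized coefficient range.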

20.
