Similar Literature
1.
2.
Groundwater model predictions are often uncertain due to inherent uncertainties in model input data. Monitored field data are commonly used to assess the performance of a model and reduce its prediction uncertainty. Given the high cost of data collection, it is imperative to identify the minimum number of required observation wells and to define the optimal locations of sampling points in space and depth. This study proposes a design methodology to optimize the number and location of additional observation wells that will effectively measure multiple hydrogeological parameters at different depths. For this purpose, we incorporated Bayesian model averaging and genetic algorithms into a linear data-worth analysis in order to conduct a three-dimensional search for new sampling locations. We evaluated the methodology by applying it to a heterogeneous coastal aquifer with limited hydrogeological data that is experiencing salt water intrusion (SWI). The aim of the model was to identify the best locations for sampling head and salinity data while reducing uncertainty in the prediction of multiple SWI variables. The resulting optimal locations for new observation wells varied with the defined design constraints. The optimal design (OD) depended on the ratio of the start-up cost of the monitoring program to the installation cost of the first observation well. The proposed methodology can contribute toward reducing the uncertainties associated with predicting multiple variables in a groundwater system.
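To make the linear data-worth step concrete, the following minimal numpy sketch ranks candidate observation wells by the prediction-variance reduction a single new measurement would buy. It assumes a Gaussian first-order setting; the covariance `C`, sensitivities `s` and `h`, and all numbers are invented for illustration and are not the authors' code (the study adds BMA and genetic-algorithm layers on top of this idea).

```python
import numpy as np

def data_worth(C, s, h, r):
    """First-order worth of one candidate observation.

    C : (n, n) prior parameter covariance
    s : (n,)  sensitivity of the prediction to the parameters
    h : (n,)  sensitivity of the candidate observation
    r : measurement-error variance
    Returns the reduction in prediction variance from adding the observation.
    """
    Ch = C @ h
    gain = np.outer(Ch, Ch) / (h @ Ch + r)   # Bayesian covariance update term
    return s @ gain @ s                      # variance removed from the prediction

# rank hypothetical candidate wells by their data worth
rng = np.random.default_rng(0)
n = 6
A = rng.standard_normal((n, n))
C = A @ A.T                                  # synthetic prior covariance
s = rng.standard_normal(n)
candidates = [rng.standard_normal(n) for _ in range(10)]
worth = [data_worth(C, s, h, r=0.1) for h in candidates]
print("best candidate well:", int(np.argmax(worth)))
```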

3.
Categorical data play an important role in a wide variety of spatial applications, while modeling and predicting this type of statistical variable has proved to be complex in many cases. Among other possible approaches, the Bayesian maximum entropy methodology has been developed and advocated for this goal and has been successfully applied in various spatial prediction problems. This approach aims to build a multivariate probability table from bivariate probability functions used as constraints that need to be fulfilled, in order to compute a posterior conditional distribution that accounts for hard or soft information sources. In this paper, our goal is to generalize the theoretical results further in order to account for a much wider range of information sources, such as probability inequalities. We first show how the maximum entropy principle can be implemented efficiently using a linear iterative approximation based on a minimum norm criterion, where the minimum norm solution is obtained at each step from simple matrix operations and converges to the requested maximum entropy solution. Based on this result, we then show how the maximum entropy problem can be related to the more general minimum divergence problem, which may involve equality and inequality constraints and which can be solved using iterated minimum norm solutions. This allows us to account for a much larger panel of information types, where more qualitative information, such as probability inequalities, can be used. When combined with a Bayesian data fusion approach, the method also handles potentially conflicting information. Although the theoretical results presented in this paper can be applied to any study (spatial or non-spatial) involving categorical data in general, the results are illustrated in a spatial context where the goal is to best predict the occurrence of cultivated land in Ethiopia based on crowdsourced information. The results emphasize the benefit of the methodology, which integrates conflicting information and provides a spatially exhaustive map of these occurrence classes over the whole country.
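A compact way to see the "probability table from marginal constraints" idea is iterative proportional fitting, which from a uniform start converges to the maximum-entropy joint consistent with the imposed marginals. This sketch handles only a two-way table with equality constraints; the paper's minimum-norm iterations, inequality constraints, and higher-dimensional tables go well beyond it, and all numbers here are illustrative.

```python
import numpy as np

def ipf(table, row_marg, col_marg, tol=1e-10, max_iter=1000):
    """Iterative proportional fitting: rescale a joint probability table
    until its marginals match the constraints. Starting from a uniform
    table, this converges to the maximum-entropy joint consistent with
    the marginals."""
    p = table.copy()
    for _ in range(max_iter):
        p *= (row_marg / p.sum(axis=1))[:, None]   # enforce row marginal
        p *= (col_marg / p.sum(axis=0))[None, :]   # enforce column marginal
        if np.allclose(p.sum(axis=1), row_marg, atol=tol):
            break
    return p

p0 = np.full((3, 3), 1 / 9)                        # uninformative starting table
joint = ipf(p0, np.array([0.5, 0.3, 0.2]), np.array([0.6, 0.3, 0.1]))
print(joint.round(4), joint.sum())
```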

4.
Owing to the rapidly increasing availability and diversity of information sources in the environmental sciences, there is a real need for sound statistical mapping techniques that can use them jointly within a single theoretical framework. As these information sources may vary in nature (continuous vs. categorical or qualitative), in spatial density, and in intrinsic quality (soft vs. hard data), the design of such techniques is a challenging issue. In this paper, an efficient method for combining spatially non-exhaustive categorical and continuous data in a mapping context is proposed, based on the Bayesian maximum entropy paradigm. This approach relies first on the definition of a mixed random field that can account for a stochastic link between categorical and continuous random fields through the use of a cross-covariance function. When incorporating general knowledge about the first- and second-order moments of these fields, it is shown that, under mild hypotheses, their joint distribution can be expressed as a mixture of conditional Gaussian prior distributions, with parameters estimated by entropy maximization. A posterior distribution that incorporates the various (soft or hard) continuous and categorical data at hand can then be obtained by a straightforward conditionalization step. The use and potential of the method is illustrated by way of a simulated case study. A comparison with a few common geostatistical methods in some limiting cases also emphasizes their similarities and differences, from both the theoretical and practical viewpoints. As expected, adding categorical information may significantly improve the spatial prediction of a continuous variable, making this approach powerful and very promising.

5.
Representing and quantifying uncertainty in climate change impact studies is a difficult task. Several sources of uncertainty arise in studies of hydrologic impacts of climate change, such as those due to the choice of general circulation models (GCMs), scenarios, and downscaling methods. Recently, much work has focused on uncertainty quantification and modeling in regional climate change impacts. In this paper, an uncertainty modeling framework is evaluated which uses a generalized uncertainty measure to combine GCM, scenario, and downscaling uncertainties. The Dempster–Shafer (D–S) evidence theory is used for representing and combining uncertainty from various sources. A significant advantage of the D–S framework over the traditional probabilistic approach is that it allows the allocation of a probability mass to sets or intervals, and can hence handle both aleatory (stochastic) uncertainty and epistemic (subjective) uncertainty. This paper shows how the D–S theory can be used to represent beliefs in hypotheses such as hydrologic drought or wet conditions, describe uncertainty and ignorance in the system, and give a quantitative measurement of belief and plausibility in results. The D–S approach has been used in this work for information synthesis using various evidence combination rules with different conflict modeling approaches. A case study is presented for hydrologic drought prediction using downscaled streamflow in the Mahanadi River at Hirakud in Orissa, India. Projections of the n most likely monsoon streamflow sequences are obtained from a conditional random field (CRF) downscaling model, using an ensemble of three GCMs for three scenarios, and are converted to monsoon standardized streamflow index (SSFI-4) series. This range is used to specify the basic probability assignment (bpa) for a Dempster–Shafer structure, which represents the uncertainty associated with each of the SSFI-4 classifications. These uncertainties are then combined across GCMs and scenarios using various evidence combination rules given by the D–S theory. A Bayesian approach is also presented for this case study, which models the uncertainty in projected frequencies of SSFI-4 classifications by deriving a posterior distribution for the frequency of each classification, using an ensemble of GCMs and scenarios. Results from the D–S and Bayesian approaches are compared, and the relative merits of each approach are discussed. Both approaches show an increasing probability of extreme, severe, and moderate droughts and a decreasing probability of normal and wet conditions in Orissa as a result of climate change.
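Dempster's rule of combination, the core operation the abstract relies on, is short enough to sketch directly. This toy example combines mass assignments from two hypothetical GCMs over a three-class frame; the class names and masses are invented, not the study's bpa values.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule: intersect focal sets, multiply masses, and
    renormalize by the non-conflicting mass."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

def belief(m, A):
    """Total mass of focal sets contained in A."""
    return sum(v for k, v in m.items() if k <= A)

def plausibility(m, A):
    """Total mass of focal sets intersecting A."""
    return sum(v for k, v in m.items() if k & A)

frame = frozenset({"drought", "normal", "wet"})
m_gcm1 = {frozenset({"drought"}): 0.6, frame: 0.4}            # evidence from GCM 1
m_gcm2 = {frozenset({"drought", "normal"}): 0.5, frame: 0.5}  # evidence from GCM 2
m = dempster_combine(m_gcm1, m_gcm2)
d = frozenset({"drought"})
print("Bel(drought) =", belief(m, d), " Pl(drought) =", plausibility(m, d))
```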

6.
We focus on the Bayesian estimation of strongly heterogeneous transmissivity fields conditional on data sampled at a set of locations in an aquifer. Log-transmissivity, Y, is modeled as a stochastic Gaussian process, parameterized through a truncated Karhunen–Loève (KL) expansion. We consider Y fields characterized by a short correlation scale compared to the size of the observed domain. These systems are associated with a KL decomposition that still requires a large number of parameters, thus hampering the efficiency of the Bayesian estimation of the underlying stochastic field. The distinctive aim of this work is to present an efficient approach for the stochastic inverse modeling of fully saturated groundwater flow in these types of strongly heterogeneous domains. The methodology is grounded in the construction of an optimal sparse KL decomposition, achieved by retaining only a limited set of modes in the expansion. Mode selection is driven by model selection criteria and is conditional on available data of hydraulic heads and (optionally) Y. Bayesian inversion of the optimal sparse KL expansion is then performed using Markov chain Monte Carlo (MCMC) samplers. As a test bed, we illustrate our approach through a suite of computational examples in which noisy head and Y values are sampled from a given randomly generated system. Our findings suggest that the proposed methodology yields a globally satisfactory inversion of the stochastic head and Y fields. Comparison of reference values against the corresponding MCMC predictive distributions suggests that observed values are well reproduced in a probabilistic sense. In a few cases, reference values at some unsampled locations (typically far from measurements) are not captured by the posterior probability distributions. In these cases, the quality of the estimation could be improved, e.g., by increasing the number of measurements and/or the threshold for the selection of KL modes.
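The truncated KL expansion itself is easy to demonstrate numerically. The sketch below builds the expansion for an exponential covariance on a 1-D grid and draws one log-transmissivity realization; the grid, variance, and correlation length are invented, and the MCMC inversion over the KL coefficients is omitted.

```python
import numpy as np

def kl_modes(coords, var, corr_len, n_modes):
    """Leading Karhunen-Loeve modes of an exponential covariance
    discretized on a set of grid points."""
    d = np.abs(coords[:, None] - coords[None, :])
    C = var * np.exp(-d / corr_len)              # exponential covariance matrix
    w, v = np.linalg.eigh(C)
    idx = np.argsort(w)[::-1][:n_modes]          # keep the largest eigenvalues
    return w[idx], v[:, idx]

x = np.linspace(0.0, 1.0, 200)
lam, phi = kl_modes(x, var=1.0, corr_len=0.1, n_modes=20)
xi = np.random.default_rng(1).standard_normal(20)  # coefficients the MCMC would infer
Y = phi @ (np.sqrt(lam) * xi)                      # one log-transmissivity realization
print("variance fraction captured:", lam.sum() / (1.0 * x.size))
```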

7.
Groundwater prediction models are subject to various sources of uncertainty. This study introduces a hierarchical Bayesian model averaging (HBMA) method to segregate and prioritize sources of uncertainty in a hierarchical structure and to conduct BMA for concentration prediction. A BMA tree of models is developed to understand the impact of individual sources of uncertainty and the propagation of uncertainty to model predictions. HBMA evaluates the relative importance of competing modeling propositions at each level of the BMA tree through the model weights. The HBMA method is applied to chloride concentration prediction for the “1,500‐foot” sand of the Baton Rouge area, Louisiana, from 2005 to 2029. Groundwater head data from 1990 to 2004 are used for model calibration. Four sources of uncertainty are considered, resulting in 180 flow and transport models for concentration prediction. The results show that the prediction variances of concentration from uncertain model elements are much higher than the prediction variance from uncertain model parameters. The HBMA method is able to quantify the contributions of individual sources of uncertainty to the total uncertainty.
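At every node of such a BMA tree the same two-moment combination rule applies: the averaged variance is the weighted within-model variance plus the between-model spread. A minimal sketch, with invented predictions and BIC-style weights standing in for the calibrated model weights:

```python
import numpy as np

def bma_moments(means, variances, weights):
    """BMA predictive mean and variance: weighted within-model variance
    plus between-model spread (the quantity propagated up a BMA tree)."""
    w = np.asarray(weights) / np.sum(weights)
    mu = w @ means
    within = w @ variances
    between = w @ (means - mu) ** 2
    return mu, within + between

# three competing models' chloride predictions at one well (values invented)
means = np.array([310.0, 290.0, 340.0])       # mg/L
variances = np.array([15.0, 20.0, 10.0])
bic = np.array([102.3, 104.1, 107.8])         # from calibration
weights = np.exp(-0.5 * (bic - bic.min()))    # BIC-based posterior model weights
print(bma_moments(means, variances, weights))
```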

8.
A probabilistic characterization method for seismic lithofacies identification
The distribution of reservoir lithofacies is a key parameter in reservoir characterization, and identifying lithofacies from seismic data typically involves considerable uncertainty. Traditional methods yield only a single deterministic facies distribution and cannot resolve the uncertainty of the inversion result, which increases the risk in reservoir appraisal. This paper introduces a probabilistic, multi-step inversion method for seismic lithofacies identification: a statistical relationship between the input and output quantities is established at each stage, and the probabilistic information from all stages is then fused to build the conditional probability relating seismic data to reservoir lithofacies, from which the facies probability distribution is inverted. Compared with traditional methods, the proposed approach characterizes the uncertainty of the geophysical response at each stage of seismic lithofacies identification through probabilistic relationships, and numerically simulates the propagation of uncertainty by fusing the stage-wise probabilities. The final inverted facies probabilities objectively and accurately reflect the uncertainty of the identification result, providing important reference information for reservoir appraisal and reservoir modeling. Applications to synthetic and field data verify the effectiveness of the method.
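The stage-wise fusion described above amounts to chaining conditional probabilities through intermediate variables and applying Bayes' rule. A toy sketch with two facies, three impedance classes, and three seismic classes (all states and probabilities invented for illustration):

```python
import numpy as np

p_facies = np.array([0.4, 0.6])                  # prior P(facies): sand, shale
p_imp_given_f = np.array([[0.6, 0.3, 0.1],       # P(impedance class | sand)
                          [0.1, 0.3, 0.6]])      # P(impedance class | shale)
p_seis_given_imp = np.array([[0.7, 0.2, 0.1],
                             [0.2, 0.6, 0.2],
                             [0.1, 0.2, 0.7]])   # P(seismic class | impedance class)

# fuse the stages: P(seismic | facies) = sum_imp P(seis | imp) P(imp | facies)
p_seis_given_f = p_imp_given_f @ p_seis_given_imp

def facies_posterior(seis_class):
    """P(facies | observed seismic class) by Bayes' rule over the chained stages."""
    post = p_seis_given_f[:, seis_class] * p_facies
    return post / post.sum()

print(facies_posterior(0))   # facies probabilities given seismic class 0
```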

9.
The spatial distribution of residual light non-aqueous phase liquid (LNAPL) is an important factor in reactive solute transport modeling studies. There is great uncertainty associated with both the areal limits of LNAPL source zones and the smaller-scale variability within those limits. A statistical approach is proposed to construct a probabilistic model for the spatial distribution of residual LNAPL, and it is applied to a site characterized by ultraviolet-induced-fluorescence cone penetration testing (CPT–UVIF). The uncertainty in the areal limits is explicitly addressed by a novel distance function (DF) approach. In modeling the small-scale variability within the areal limits, the CPT–UVIF data are used as the primary source of information, while soil texture and distance to the water table are treated as secondary data. Two widely used geostatistical techniques are applied for the data integration, namely sequential indicator simulation with locally varying means (SIS–LVM) and Bayesian updating (BU). A close match between the calibrated uncertainty band (UB) and the target probabilities demonstrates the performance of the proposed DF technique in characterizing uncertainty in the areal limits. A cross-validation study also shows that the integration of the secondary data sources substantially improves the prediction of contaminated and uncontaminated locations, and that the SIS–LVM algorithm gives a more accurate prediction of residual NAPL contamination. The proposed DF approach is useful in modeling the areal limits of non-stationary continuous or categorical random variables, and in providing a prior probability map of source zone sizes for use in Monte Carlo simulations of contaminant transport or Monte Carlo-type inverse modeling studies.

10.
Probability theory as logic (or Bayesian probability theory) is a rational inferential methodology that provides a natural and logically consistent framework for source reconstruction. This methodology fully utilizes the information provided by a limited number of noisy concentration data obtained from a network of sensors and combines it in a consistent manner with the available prior knowledge (a mathematical representation of relevant physical laws), hence providing a rigorous basis for assimilating these data into models of atmospheric dispersion for the purpose of contaminant source reconstruction. This paper addresses the application of this framework to the reconstruction of contaminant source distributions consisting of an unknown number of localized sources, using concentration measurements obtained from a sensor array. To this end, Bayesian probability theory is used to formulate the full joint posterior probability density function for the parameters of the unknown source distribution. A simulated annealing algorithm, applied in conjunction with a reversible-jump Markov chain Monte Carlo technique, is used to draw random samples of source distribution models from the posterior probability density function. The methodology is validated against a real (full-scale) atmospheric dispersion experiment involving a multiple point source release.
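To show the shape of such a posterior sampler, here is a deliberately stripped-down random-walk Metropolis sketch that recovers a single source's location and strength from noisy sensor data. The exponential-decay "dispersion kernel" is a stand-in for a real dispersion model, and the annealing and reversible-jump moves that handle an unknown number of sources are omitted; everything here is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

def forward(src_xy, q, sensors):
    """Toy dispersion kernel: concentration decays with distance to the source."""
    d = np.linalg.norm(sensors - src_xy, axis=1)
    return q * np.exp(-d)

sensors = rng.uniform(0, 10, size=(15, 2))
data = forward(np.array([4.0, 6.0]), 2.0, sensors) + rng.normal(0, 0.05, 15)

def log_post(theta):
    """Flat prior on the domain and Gaussian measurement-error likelihood."""
    xy, q = theta[:2], theta[2]
    if not (0 <= xy[0] <= 10 and 0 <= xy[1] <= 10 and q > 0):
        return -np.inf
    r = data - forward(xy, q, sensors)
    return -0.5 * np.sum(r**2) / 0.05**2

theta = np.array([5.0, 5.0, 1.0])
lp = log_post(theta)
samples = []
for _ in range(20000):                        # random-walk Metropolis sampler
    prop = theta + rng.normal(0, 0.1, size=3)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:  # accept/reject step
        theta, lp = prop, lp_prop
    samples.append(theta.copy())
print("posterior mean (x, y, q):", np.mean(samples[5000:], axis=0))
```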

11.
We develop a stochastic modeling approach based on spatial point processes of log-Gaussian Cox type for a collection of around 5000 landslide events provoked by a precipitation trigger in Sicily, Italy. Through the embedding into a hierarchical Bayesian estimation framework, we can use the integrated nested Laplace approximation methodology to perform inference and obtain the posterior estimates of spatially distributed covariate and random effects. Several mapping units are useful for partitioning a given study area in landslide prediction studies. These units hierarchically subdivide the geographic space, from the finest grid-based resolution up to morphodynamically oriented slope units. Here we integrate both mapping units into a single hierarchical model by treating the landslide triggering locations as a random point pattern. This approach diverges fundamentally from the near-universally used presence–absence structure for areal units, since we focus on modeling the expected landslide count jointly within the two mapping units. Predicting this landslide intensity provides more detailed and complete information than the classically used susceptibility mapping approach based on relative probabilities. To illustrate the model’s versatility, we compute absolute probability maps of landslide occurrences and check their predictive power over space. While the landslide community typically produces spatial predictive models for landslides only in the sense that covariates are spatially distributed, no actual spatial dependence has been explicitly integrated so far. Our novel approach features a spatial latent effect defined at the slope unit level, allowing us to assess the spatial influence that remains unexplained by the covariates in the model. For rainfall-induced landslides in regions where the rain gauge network is not sufficient to capture the spatial distribution of the triggering precipitation event, this latent effect provides a valuable image of the unobserved rainfall pattern.

12.
In this paper we combine a multiscale data integration technique introduced in [Lee SH, Malallah A, Datta-Gupta A, Higdon D. Multiscale data integration using Markov Random Fields. SPE Reservoir Evaluat Eng 2002;5(1):68–78] with upscaling techniques for spatial modeling of permeability. The main goal of this paper is to find fine-scale permeability fields based on coarse-scale permeability measurements. The approach introduced in the paper is hierarchical, and the conditional information from different length scales is incorporated into the posterior distribution using a Bayesian framework. Because of the complicated structure of the posterior distribution, Markov chain Monte Carlo (MCMC) based approaches are used to draw samples of the fine-scale permeability field.

13.
The present paper reviews the conceptual framework and development of the Bayesian Maximum Entropy (BME) approach. BME has been considered a significant breakthrough and contribution to applied stochastics, introducing an improved, knowledge-based modeling framework for spatial and spatiotemporal information. One objective of this work is an overview of distinct BME features. By offering a foundation free of the restrictive assumptions that limit comparable techniques, the ability to integrate a variety of prior knowledge bases, and rigorous accounting for both exact and uncertain data, the BME approach is credited with introducing modern spatiotemporal geostatistics. A second objective is to illustrate BME applications and adoption across numerous scientific disciplines. We summarize examples and real-world studies that encompass the perspective of the science of the total environment, including the atmosphere, lithosphere, hydrosphere, and ecosphere, while also noting applications that extend beyond these fields. This broad-ranging application record suggests that BME is an established, valuable tool for predictive spatial and space–time analysis and mapping. The review concludes with the present status of BME and tentative paths for future methodological research, enhancements, and extensions.

14.
In previous work, we presented a method for the estimation and correction of non-linear mathematical model structures, within a Bayesian framework, by merging uncertain knowledge about process physics with uncertain and incomplete observations of dynamical input-state-output behavior. The resulting uncertainty in the model input-state-output mapping is expressed as a weighted combination of an uncertain conceptual model prior and a data-derived probability density function, with weights depending on the conditional data density. Our algorithm is based on the use of iterative data assimilation to update a conceptual model prior using observed system data, and thereby construct a posterior estimate of the model structure (the mathematical form of the equation itself, not just its parameters) that is consistent both with physically based prior knowledge and with the information in the data. An important aspect of the approach is that it facilitates a clear differentiation between the influences of different types of uncertainties (initial condition, input, and mapping structure) on the model prediction. Further, if some prior assumptions regarding the structural (mathematical) forms of the model equations exist, the procedure can help reveal errors in those forms and how they should be corrected. This paper examines the properties of the approach by investigating two case studies in considerable detail. The results show how, and to what degree, the structure of a dynamical hydrological model can be estimated with little or no prior knowledge (or under conditions of incorrect prior information) regarding the functional forms of the storage–streamflow and storage–evapotranspiration relationships. The importance and implications of careful specification of the model prior are illustrated and discussed.

15.
Streamflow forecasting methods are moving towards probabilistic approaches that quantify the uncertainty associated with the various sources of error in the forecasting process. Multi-model averaging methods, which try to address modeling deficiencies by considering multiple models, are gaining much popularity. We have applied Bayesian Model Averaging (BMA) to an ensemble of twelve snow models that vary in their heat and melt algorithms, parameterization, and/or albedo estimation method. Three of the models use the temperature-based heat and melt routines of the SNOW17 snow accumulation and ablation model. Nine models use heat and melt routines based on a simplified energy balance approach and are varied by using three different albedo estimation schemes. Finally, different parameter sets were identified through automatic calibration with three objective functions. All models use the snow accumulation, liquid water transport, and ground surface heat exchange processes of the SNOW17. The resulting twelve snow models were combined using BMA. The individual models, the BMA predictive mean, and the BMA predictive variance were evaluated for six SNOTEL sites in the western U.S. The models performed best, and the BMA variance was lowest, at the colder sites with high winter precipitation and little mid-winter melting. An individual snow model would often outperform the BMA predictive mean. However, observed snow water equivalent (SWE) was captured within the 95% confidence intervals derived from the BMA variance on average 80% of the time at all sites. These results suggest that considering multiple snow model structures can provide useful uncertainty information for probabilistic hydrologic prediction.
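The coverage statistic quoted above can be reproduced mechanically: the BMA predictive distribution is a weighted mixture of the member models' distributions, so an interval and its coverage follow from mixture draws. A sketch with invented SWE predictions, Gaussian member distributions, and random weights:

```python
import numpy as np

rng = np.random.default_rng(3)

def bma_interval(means, sigmas, weights, n=100_000, level=0.95):
    """Central interval of the BMA predictive mixture by Monte Carlo:
    draw a member model by its weight, then draw from its Gaussian."""
    w = np.asarray(weights) / np.sum(weights)
    idx = rng.choice(len(w), size=n, p=w)
    draws = rng.normal(np.asarray(means)[idx], np.asarray(sigmas)[idx])
    a = (1 - level) / 2
    return np.quantile(draws, [a, 1 - a])

# twelve snow models' SWE predictions (mm) for one day at one site (invented)
means = rng.normal(500, 40, size=12)
sigmas = np.full(12, 25.0)
weights = rng.dirichlet(np.ones(12))
lo, hi = bma_interval(means, sigmas, weights)
obs = 520.0
print(f"95% BMA interval: [{lo:.0f}, {hi:.0f}] mm; covers obs: {lo <= obs <= hi}")
```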

16.
Well vulnerability: a quantitative approach for source water protection
The concept of vulnerability of drinking water sources is reviewed, and a quantitative approach for assessing well vulnerability for complex three-dimensional ground water systems is developed. The approach focuses on the relative expected impact of potential contaminant sources at unknown locations within a well capture zone, providing relative measures of intrinsic well vulnerability, including the expected times of arrival of a contaminant, the dispersion-related reduction in concentration, the time taken to breach a certain quality objective, and the corresponding exposure times. Thus, the result of the analysis includes the usual advective travel time information used in conventional wellhead protection analysis, plus a set of selected quantitative measures expressing the expected impact. The technique is based on adjoint theory and combines forward- and backward-in-time transport modeling using a standard numerical flow and transport code. The methodology is demonstrated using the case study of a complex glacial multiaquifer system in Ontario. The new approach will be useful in helping water managers develop more physically based and quantitative wellhead protection strategies.

17.
We propose a Bayesian fusion approach to integrate multiple geophysical datasets with different coverage and sensitivity. The fusion strategy is based on the capability of various geophysical methods to provide enough resolution to identify either subsurface material parameters or subsurface structure, or both. We focus on electrical resistivity as the target material parameter and on electrical resistivity tomography (ERT), electromagnetic induction (EMI), and ground penetrating radar (GPR) as the set of geophysical methods. However, extending the approach to different sets of geophysical parameters and methods is straightforward. The different geophysical datasets are entered into a trans-dimensional Markov chain Monte Carlo (McMC) search-based joint inversion algorithm. The trans-dimensional property of the McMC algorithm allows dynamic parameterisation of the model space, which in turn helps to avoid biasing the post-inversion results towards a particular model. Given that we are attempting to develop an approach that has practical potential, we discretize the subsurface into an array of one-dimensional earth models. Accordingly, the ERT data, which are collected using a two-dimensional acquisition geometry, are recast as a set of equivalent vertical electric soundings. The different data are inverted either individually or jointly to estimate one-dimensional subsurface models at discrete locations. We use Shannon's information measure to quantify the information obtained from the inversion of different combinations of geophysical datasets. Information from multiple methods is brought together by introducing a joint likelihood function and/or constraining the prior information. A Bayesian maximum entropy approach is used for the spatial fusion of the spatially dispersed one-dimensional model estimates and the mapping of the target parameter. We illustrate the approach with a synthetic dataset and then apply it to a field dataset. We show that the proposed fusion strategy succeeds not only in enhancing the subsurface information but also as a survey design tool that identifies the appropriate combination of geophysical methods and shows whether applying an individual method for further investigation of a specific site is beneficial.
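Shannon's information measure enters here as a simple entropy difference: the more a dataset combination sharpens the posterior on resistivity, the larger the entropy drop relative to the prior. A sketch with invented, discretized prior and posterior distributions:

```python
import numpy as np

def shannon_entropy(p):
    """Entropy (in nats) of a discrete distribution; zero bins are skipped."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

bins = np.arange(20)                          # discretized resistivity classes
prior = np.full(20, 1 / 20)                   # uninformative prior
post_single = np.exp(-0.5 * ((bins - 8) / 3.0) ** 2)   # one method alone
post_single /= post_single.sum()
post_joint = np.exp(-0.5 * ((bins - 8) / 1.2) ** 2)    # joint inversion
post_joint /= post_joint.sum()

for name, p in [("prior", prior), ("single", post_single), ("joint", post_joint)]:
    print(f"{name:7s} entropy: {shannon_entropy(p):.3f} nats")
# the entropy drop from the prior measures the information each combination adds
```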

18.
A multivariate spatial sampling design based on spatial vine copulas is presented that aims to simultaneously reduce the prediction uncertainty of multiple variables by selecting additional sampling locations based on the multivariate relationship between variables, the spatial configuration of existing locations, and the values of the observations at those locations. Novel aspects of the methodology include the development of optimal designs that use spatial vine copulas to estimate prediction uncertainty and, additionally, the use of transformation methods for dimension reduction to model multivariate spatial dependence. Spatial vine copulas capture non-linear spatial dependence within variables, whilst a chained transformation that uses non-linear principal component analysis captures the non-linear multivariate dependence between variables. The proposed design methodology is applied to two environmental case studies, and its performance is evaluated through partial redesigns of the original spatial designs. The first application is a soil contamination example that demonstrates the ability of the proposed methodology to address spatial non-linearity in the data. The second application is a forest biomass study that highlights the strength of the methodology in incorporating non-linear multivariate dependence into the design.

19.
Modelling glacier discharge is an important issue in hydrology and climate research. Glaciers represent a fundamental water resource where melting of ice and snow contributes to runoff, and they are also studied as natural global warming sensors. The GLACKMA association has implemented one of its Pilot Experimental Catchment areas on King George Island in Antarctica, which records values of the liquid discharge from the Collins glacier. In this paper, we propose the use of time-varying copula models for analyzing the relationship between air temperature and glacier discharge, which is clearly non-constant and non-linear through time. A seasonal copula model is defined in which both the marginal and copula parameters vary periodically over time following a seasonal dynamic. Full Bayesian inference is performed such that the marginal and copula parameters are estimated in a single step, in contrast with the usual two-step approach. Bayesian prediction and model selection are also carried out for the proposed model, such that Bayesian credible intervals can be obtained for the conditional glacier discharge given a value of the temperature at any time point. The proposed methodology is illustrated using the GLACKMA data, in which, in addition, a hydrological year of discharge data is missing because it could not be measured accurately due to problems with the sounding equipment.
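The building block of such a model is a copula log-likelihood whose dependence parameter is a periodic function of time. A sketch using a bivariate Gaussian copula with a hypothetical seasonal form for the dependence parameter (the paper's seasonal parameterization, marginals, and Bayesian estimation are not reproduced here):

```python
import numpy as np
from scipy import stats

def seasonal_rho(day, a=0.2, b=0.6):
    """Dependence parameter varying over the year (illustrative form only)."""
    return np.tanh(a + b * np.sin(2 * np.pi * day / 365.25))

def gaussian_copula_loglik(u, v, rho):
    """Log-density of a bivariate Gaussian copula at pseudo-observations (u, v)."""
    x, y = stats.norm.ppf(u), stats.norm.ppf(v)
    det = 1.0 - rho**2
    return -0.5 * np.log(det) - (rho**2 * (x**2 + y**2) - 2 * rho * x * y) / (2 * det)

# pseudo-observations: ranks of daily temperature and discharge mapped to (0, 1)
days = np.arange(365)
rho = seasonal_rho(days)
u, v = np.random.default_rng(4).uniform(0.01, 0.99, size=(2, 365))
print("total copula log-likelihood:", gaussian_copula_loglik(u, v, rho).sum())
```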

20.
MODFLOW 2000 head uncertainty, a first-order second moment method
A computationally efficient method to estimate the variance and covariance in piezometric head results computed through MODFLOW 2000 using a first-order second moment (FOSM) approach is presented. The methodology employs a first-order Taylor series expansion to combine model sensitivity with uncertainty in geologic data. MODFLOW 2000 is used to calculate both the ground water head and the sensitivity of head to changes in input data. From a limited number of samples, geologic data are extrapolated and their associated uncertainties are computed through a conditional probability calculation. Combining the spatially related sensitivity and input uncertainty produces the variance-covariance matrix, the diagonal of which is used to yield the standard deviation in MODFLOW 2000 head. The variance in piezometric head can be used for calibrating the model, estimating confidence intervals, directing exploration, and evaluating the reliability of a design. A case study illustrates the approach, where aquifer transmissivity is the spatially related uncertain geologic input. The FOSM methodology is shown to be applicable for calculating output uncertainty for (1) spatially related input and output data, and (2) multiple input parameters (transmissivity and recharge).
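The FOSM propagation itself is one line of linear algebra: the head covariance is the sensitivity (Jacobian) matrix sandwiched around the input covariance. A numpy sketch with an invented Jacobian and input variances (MODFLOW 2000's sensitivity process would supply the real J):

```python
import numpy as np

def fosm_head_std(J, C_in):
    """First-order second moment propagation: C_head = J C_in J^T.
    J holds the sensitivities of each head to each uncertain input."""
    C_head = J @ C_in @ J.T
    return np.sqrt(np.diag(C_head))           # standard deviation of each head

# sensitivities of heads at 4 nodes to log-transmissivity in 3 zones + recharge
J = np.array([[0.8, 0.1, 0.0, 1.2],
              [0.3, 0.5, 0.1, 1.0],
              [0.1, 0.6, 0.2, 0.9],
              [0.0, 0.2, 0.7, 1.1]])
C_in = np.diag([0.25, 0.25, 0.25, 0.04])      # input variances (logT zones, recharge)
print("head standard deviations:", fosm_head_std(J, C_in).round(3))
```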
