Similar Literature

20 similar documents found (search took 15 ms).
1.
Simulating fields of categorical geospatial variables from samples is crucial for many purposes, such as assessing the spatial uncertainty of natural resource distributions. However, effectively simulating complex categorical variables (i.e., multinomial classes) is difficult because of their nonlinearity and complex interclass relationships. The existing pure Markov chain approach for simulating multinomial classes has an apparent deficiency: underestimation of small classes, which largely limits the usefulness of the approach. The recently proposed Markov chain random field (MCRF) theory supports theoretically sound multi-dimensional Markov chain models. This paper conducts a comparative study between an MCRF model and the previous Markov chain model for simulating multinomial classes, to demonstrate that the MCRF model effectively solves the small-class underestimation problem. Simulation results show that the MCRF model reproduces all classes fairly, generates simulated patterns imitative of the original, and effectively reproduces the input transiograms in realizations. Occurrence probability maps are estimated to visualize the spatial uncertainty associated with each class and the optimal prediction map. It is concluded that the MCRF model provides a practically efficient estimator for simulating multinomial classes from grid samples.
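The small-class issue arises when classes are drawn sequentially from a transition probability matrix. A minimal one-dimensional sketch of Markov-chain class simulation (not the MCRF model itself; the 3-class transition matrix below is invented for illustration) is:

```python
import numpy as np

# Illustrative 1-D analogue of Markov-chain class simulation: classes are
# drawn sequentially from a transition probability matrix, and the class
# proportions in the realization are compared with the stationary
# proportions implied by the chain.
rng = np.random.default_rng(42)

# Hypothetical 3-class transition matrix; class 2 is a "small class".
P = np.array([[0.80, 0.15, 0.05],
              [0.30, 0.60, 0.10],
              [0.25, 0.25, 0.50]])

n = 20000
states = np.empty(n, dtype=int)
states[0] = 0
for i in range(1, n):
    states[i] = rng.choice(3, p=P[states[i - 1]])

simulated_props = np.bincount(states, minlength=3) / n

# Stationary distribution: left eigenvector of P for eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi = pi / pi.sum()
```

For this matrix the stationary proportions are roughly (0.59, 0.29, 0.12), so a long unconditional realization reproduces them; the underestimation discussed in the paper concerns the harder conditional, multi-dimensional setting.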

2.
Compositional Bayesian indicator estimation
Indicator kriging is widely used for mapping spatial binary variables and for estimating the global and local spatial distributions of variables in the geosciences. For continuous random variables, indicator kriging gives an estimate of the cumulative distribution function at a given threshold, which is then the estimate of a probability. Like any other kriging procedure, indicator kriging provides an estimation variance that, although not often used in applications, should be taken into account because it assesses the uncertainty of the estimate. An alternative approach to indicator estimation is proposed in this paper, in which the complete probability density function of the indicator estimate is evaluated. The procedure is described in a Bayesian framework, using a multivariate Gaussian likelihood and an a priori distribution that are combined according to Bayes' theorem to obtain a posterior distribution for the indicator estimate. From this posterior distribution, point estimates, interval estimates, and uncertainty measures can be obtained. Among the point estimates, the median of the posterior distribution is the maximum-entropy estimate, because there is a fifty-fifty chance of the unknown value being larger or smaller than the median; that is, there is maximum uncertainty in the choice between the two alternatives. In this sense the median is an indicator estimator, alternative to the kriging estimator, that includes its own uncertainty. The mode of the posterior distribution, on the other hand, coincides with the simple kriging estimator when a uniform prior is assumed. Additionally, because the indicator estimate can be considered a two-part composition whose domain of definition is the simplex, the method is extended to compositional Bayesian indicator estimation.
Bayesian indicator estimation and compositional Bayesian indicator estimation are illustrated with an environmental case study in which the probability of the content of a geochemical element in soil exceeding a particular threshold is of interest. The computer codes and their user guides are in the public domain and freely available.
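Extracting point and interval estimates from a posterior distribution of an indicator estimate, as described above, can be sketched as follows; the Beta posterior used here is a generic stand-in for the paper's Gaussian-likelihood construction:

```python
import numpy as np

# Generic illustration: given a posterior sample of a probability-valued
# indicator estimate in [0, 1], report the median (maximum-entropy point
# estimate), a 95% credible interval, and a crude histogram-based mode.
rng = np.random.default_rng(0)
posterior = rng.beta(4.0, 2.0, size=100_000)   # stand-in posterior sample

median_est = np.median(posterior)              # maximum-entropy estimate
lo, hi = np.percentile(posterior, [2.5, 97.5]) # 95% credible interval

# Crude mode estimate from a histogram of the posterior sample.
counts, edges = np.histogram(posterior, bins=100, range=(0.0, 1.0))
k = np.argmax(counts)
mode_est = 0.5 * (edges[k] + edges[k + 1])
```

For a Beta(4, 2) stand-in the median is near 0.686 and the mode near 0.75, illustrating how the two point estimators can differ for a skewed posterior.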

3.
In this study, we focus on a hydrogeological inverse problem, specifically monitoring soil moisture variations using tomographic ground penetrating radar (GPR) travel time data. Technical challenges in the inversion of GPR tomographic data include handling the non-uniqueness, nonlinearity, and high dimensionality of the unknowns. We have developed a new method for estimating soil moisture fields from crosshole GPR data. It uses a pilot-point method to provide a low-dimensional representation of the relative dielectric permittivity field of the soil, which is the primary object of inference: the field can be converted to soil moisture using a petrophysical model. We integrate a multi-chain Markov chain Monte Carlo (MCMC) Bayesian inversion framework with the pilot-point concept, a curved-ray GPR travel time model, and a sequential Gaussian simulation algorithm to estimate the dielectric permittivity at pilot-point locations distributed within the tomogram, as well as the corresponding geostatistical parameters (i.e., the spatial correlation range). We infer the dielectric permittivity as a probability density function, thus capturing the uncertainty in the inference. The multi-chain MCMC makes it possible to address the high-dimensional inverse problem posed by this inversion setup. The method is scalable in the number of chains and processors, and is useful for computationally demanding Bayesian model calibration in scientific and engineering problems. The proposed inversion approach successfully approximates the posterior density distributions of the pilot points and captures the true values. The computational efficiency, accuracy, and convergence behavior of the approach were also systematically evaluated by comparing inversion results obtained with different levels of noise in the observations, increased amounts of observational data, and increased numbers of pilot points.

4.
A Monte Carlo approach is described for quantifying the uncertainty of travel time estimates. A real (non-synthetic) and exhaustive data set of natural genesis is used as reference. Using an approach based on binary indicators, constraint interval data are easily accommodated in the modeling process. It is shown that incorporating imprecise data can drastically reduce the uncertainty in the estimates. It is also shown that unrealistic results are obtained when deterministic modeling is carried out using a kriging estimate of the transmissivity field. Problems related to using sequential indicator simulation to generate fields incorporating constraint interval data are discussed. The final results consist of 95% probability intervals of arrival times at selected control planes, reflecting the original uncertainty in the transmissivity maps.
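The final step described above, turning Monte Carlo arrival times into 95% probability intervals at control planes, can be sketched as follows (all arrival times are synthetic placeholders, not the paper's data):

```python
import numpy as np

# Minimal sketch: given arrival times simulated on many transmissivity
# realizations, report 95% probability intervals at each control plane.
rng = np.random.default_rng(1)

n_real, n_planes = 500, 4
# Hypothetical arrival times (days), lognormal to mimic positive skew.
arrivals = rng.lognormal(mean=np.log([10, 25, 60, 120]), sigma=0.4,
                         size=(n_real, n_planes))

lower = np.percentile(arrivals, 2.5, axis=0)    # per-plane lower bound
upper = np.percentile(arrivals, 97.5, axis=0)   # per-plane upper bound
median = np.percentile(arrivals, 50.0, axis=0)
```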

6.
Geostatistical simulation algorithms are routinely used to generate conditional realizations of the spatial distribution of petrophysical properties, which are then fed into complex transfer functions, e.g., a flow simulator, to yield a distribution of responses, such as the time to recover a given proportion of the oil. This latter distribution, often referred to as the space of uncertainty, cannot be defined analytically because of the complexity (non-linearity) of transfer functions, but it can be characterized algorithmically through the generation of many realizations. This paper compares the spaces of uncertainty generated by four of the most commonly used algorithms: sequential Gaussian simulation, sequential indicator simulation, p-field simulation, and simulated annealing. Conditional to 80 sample permeability values randomly drawn from an exhaustive 40×40 image, 100 realizations of the spatial distribution of permeability values are generated with each algorithm and fed into a pressure solver and a flow simulator. Principal component analysis is used to display the sets of realizations in the joint space of uncertainty of the response variables (effective permeability, times to reach 5% and 95% water cuts, and times to recover 10% and 50% of the oil). Attenuating ergodic fluctuations through a rank-preserving transform of permeability values substantially reduces the extent of the space of uncertainty for sequential indicator simulation and p-field simulation, while improving the prediction of the response variable by the mean of the output distribution. Differences between simulation algorithms are most pronounced for long-term responses (95% water cut and 50% oil recovery), with sequential Gaussian simulation yielding the most accurate prediction. In this example, using more than 20 realizations generally increases the size of the space of uncertainty only slightly.
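Projecting the realizations into the joint space of uncertainty with principal component analysis, as described above, can be sketched via an SVD on standardized responses (the response matrix below is invented, not the paper's simulator output):

```python
import numpy as np

# Sketch: project realizations into the joint space of uncertainty of the
# response variables with PCA, computed as an SVD of standardized responses.
rng = np.random.default_rng(7)

# Hypothetical response matrix: 100 realizations x 5 responses (e.g. k_eff,
# times to 5%/95% water cut, 10%/50% recovery), with some correlation.
base = rng.normal(size=(100, 2))
responses = np.column_stack([base[:, 0],
                             base[:, 0] + 0.3 * rng.normal(size=100),
                             base[:, 1],
                             base[:, 1] + 0.3 * rng.normal(size=100),
                             rng.normal(size=100)])

z = (responses - responses.mean(axis=0)) / responses.std(axis=0)
u, s, vt = np.linalg.svd(z, full_matrices=False)
scores = u * s                      # realization coordinates on the PCs
explained = s**2 / np.sum(s**2)     # variance fraction per component
```

Plotting the first two columns of `scores` per algorithm then visualizes how far each algorithm's cloud of realizations spreads in response space.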

7.
This paper investigates the effects of uncertainty in rock-physics models on reservoir parameter estimation using seismic amplitude variation with angle and controlled-source electromagnetic data. The reservoir parameters are related to electrical resistivity by the Poupon model and to elastic moduli and density by the Xu-White model. To handle uncertainty in the rock-physics models, we consider their outputs to be random functions with modes or means given by the predictions of those models, and we consider the parameters of the rock-physics models to be random variables with specified probability distributions. Using a Bayesian framework and Markov chain Monte Carlo sampling methods, we obtain estimates of the reservoir parameters together with information on the uncertainty of the estimation. The method is applied to a synthetic case study based on a layered reservoir model, and the results show that uncertainty in both the rock-physics models and their parameters may have significant effects on reservoir parameter estimation. When the biases in rock-physics models and their associated parameters are unknown, conventional joint inversion approaches, which treat rock-physics models as deterministic functions and the model parameters as fixed values, may produce misleading results. The stochastic method developed in this study provides an integrated approach for quantifying how uncertainty and biases in rock-physics models and their associated parameters affect the estimates of the reservoir parameters, and is therefore a more robust method for reservoir parameter estimation.

8.
The multi-Gaussian kriging technique has many applications in mining, soil science, environmental science, and other fields. In particular, in the local reserve estimation of a mineral deposit, multi-Gaussian kriging is employed to derive panel-wise tonnages by predicting the conditional probability of block grades. Integration of a suitable change-of-support model is also required to estimate functions of variables with a support larger than that of the samples. Conventionally, the underlying random function model is required to be stationary in order to carry out inference on the ore grade distribution and relevant statistics, with grade distributions and recovery functions estimated by multi-Gaussian kriging from samples within a supposedly homogeneous spatial domain. In reality, a simple stationary model often fails to represent complicated geological structure: it neither accounts for the obvious changes in local means and variances nor replicates the spatial continuity of the deposit, and hence produces unreliable outcomes. This study deals with the theoretical design of a non-stationary multi-Gaussian kriging model allowing change of support, and with its application to mineral reserve estimation. Local multivariate distributions are assumed here to be strictly stationary in the neighborhood of the panels. The local cumulative distribution function and related statistics for the panels are estimated using a distance kernel approach. A rigorous investigation through simulation experiments is performed to analyze the relevance of the developed model, followed by a case study on a copper deposit.

9.
The spatial distribution of residual light non-aqueous phase liquid (LNAPL) is an important factor in reactive solute transport modeling studies. Great uncertainty is associated both with the areal limits of LNAPL source zones and with smaller-scale variability within those limits. A statistical approach is proposed to construct a probabilistic model of the spatial distribution of residual LNAPL, and it is applied to a site characterized by cone penetration testing with ultra-violet-induced fluorescence (CPT-UVIF). The uncertainty in the areal limits is explicitly addressed by a novel distance function (DF) approach. In modeling the small-scale variability within the areal limits, the CPT-UVIF data are used as the primary source of information, while soil texture and distance to the water table are treated as secondary data. Two widely used geostatistical techniques are applied for the data integration, namely sequential indicator simulation with locally varying means (SIS-LVM) and Bayesian updating (BU). A close match between the calibrated uncertainty band (UB) and the target probabilities shows the performance of the proposed DF technique in characterizing uncertainty in the areal limits. A cross-validation study also shows that integrating the secondary data sources substantially improves the prediction of contaminated and uncontaminated locations, and that the SIS-LVM algorithm gives the more accurate prediction of residual LNAPL contamination. The proposed DF approach is useful for modeling the areal limits of non-stationary continuous or categorical random variables, and for providing a prior probability map of source zone sizes to be used in Monte Carlo simulations of contaminant transport or Monte Carlo-type inverse modeling studies.

10.
Interpolation techniques for spatial data are applied frequently in various fields of the geosciences. Although most conventional interpolation methods assume that first- and second-order statistics suffice to characterize random fields, researchers have realized that these methods cannot always provide reliable results, since geological and environmental phenomena tend to be very complex, presenting non-Gaussian distributions and/or non-linear inter-variable relationships. This paper proposes a new, flexible approach to the interpolation of spatial data. Suitable cross-variable higher-order spatial statistics are developed to measure the spatial relationship between the random variable at an unsampled location and those in its neighbourhood. Given the computed cross-variable higher-order spatial statistics, the conditional probability density function is approximated via polynomial expansions and then used to determine the interpolated value at the unsampled location as an expectation. In addition, the uncertainty associated with the interpolation is quantified by constructing prediction intervals for the interpolated values. The proposed method is applied to a mineral deposit dataset, and the results demonstrate that it outperforms kriging methods in uncertainty quantification. The introduction of the cross-variable higher-order spatial statistics noticeably improves the quality of the interpolation, since it enriches the information that can be extracted from the observed data; this benefit is substantial when working with data that are sparse or have non-trivial dependence structures.

11.
This paper proposes an approach to estimating the uncertainty in two EPA Storm Water Management Model (SWMM) parameters, percentage routed (PR) and saturated hydraulic conductivity (Ksat), which are used to calculate stormwater runoff volumes. The proposed methodology addresses uncertainty by developing probability distributions for urban hydrologic parameters through extensive calibration against observed flow data in the Philadelphia collection system. The established probability distributions are then applied to the Philadelphia Southeast district model through a Monte Carlo approach to estimate the uncertainty in predicted combined sewer overflow volumes attributable to hydrologic parameter estimation. Understanding urban hydrology is critical to defining urban water resource problems. A variety of land use types within Philadelphia, coupled with a history of cut and fill, has resulted in a patchwork of urban fill and native soils. The complexity of urban hydrology can make model parameter estimation, and defining model uncertainty, a difficult task. The development of probability distributions for hydrologic parameters, applied through Monte Carlo simulations, provided a significant improvement in estimating model uncertainty over traditional model sensitivity analysis. Copyright © 2013 John Wiley & Sons, Ltd.
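The Monte Carlo propagation step can be sketched as follows; the runoff relation and the PR/Ksat distributions below are toy placeholders, not the calibrated Philadelphia values:

```python
import numpy as np

# Hedged sketch: sample the two hydrologic parameters from assumed
# probability distributions and propagate them through a toy runoff
# relation to get an uncertainty band on runoff volume.
rng = np.random.default_rng(3)
n = 10_000

pr = rng.uniform(0.4, 0.9, n)                 # percentage routed (fraction)
ksat = rng.lognormal(np.log(5.0), 0.5, n)     # saturated K (mm/h), assumed

rain_mm, dur_h, area_m2 = 30.0, 2.0, 1.0e5    # one storm, one subcatchment
infiltration = np.minimum(ksat * dur_h, rain_mm)    # toy infiltration cap
runoff_mm = pr * np.maximum(rain_mm - infiltration, 0.0)
volumes_m3 = runoff_mm / 1000.0 * area_m2

v_lo, v_hi = np.percentile(volumes_m3, [5, 95])     # uncertainty band
```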

12.
Inverse distance interpolation for facies modeling
Inverse distance weighted interpolation is a robust and widely used estimation technique. In practical applications, inverse distance interpolation is often favored over kriging-based techniques when meaningful estimates of the field's spatial structure cannot be made. At present, application of inverse distance interpolation is limited to continuous random variables, and there is a need to extend the approach to categorical/discrete random variables. In this paper we propose such an extension using an indicator formalism. The applicability of inverse distance interpolation to categorical modeling is then illustrated using Total's Joslyn Lease facies data.
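The proposed indicator extension of inverse distance interpolation can be sketched as follows (the facies data below are invented, not the Joslyn Lease data):

```python
import numpy as np

# Minimal sketch of the indicator extension: inverse distance weights are
# applied to class indicator variables I(class == k), giving a weighted
# "probability" per class; the estimate is the class with the largest one.
def idw_classify(xy_obs, classes, xy0, power=2.0, eps=1e-12):
    d = np.linalg.norm(xy_obs - xy0, axis=1)
    w = 1.0 / (d + eps) ** power
    w = w / w.sum()
    labels = np.unique(classes)
    # Weighted sum of indicators for each candidate class k.
    probs = np.array([w[classes == k].sum() for k in labels])
    return labels[np.argmax(probs)], dict(zip(labels, probs))

xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [5.0, 5.0]])
facies = np.array(["sand", "sand", "shale", "shale"])
best, probs = idw_classify(xy, facies, np.array([0.4, 0.1]))
```

At the unsampled point (0.4, 0.1) the two nearby "sand" observations dominate the weights, so the classified facies is "sand".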

13.
Being a non-linear method based on a rigorous formalism and an efficient processing of various information sources, the Bayesian maximum entropy (BME) approach has proven to be a very powerful method for continuous spatial random fields, providing much more satisfactory estimates than traditional linear geostatistics (i.e., the various kriging techniques). This paper presents an extension of the BME formalism to categorical spatial random fields. In the first part of the paper, the indicator kriging and cokriging methods are briefly presented and discussed, with special emphasis on their inherent limitations from both the theoretical and the practical point of view. The second part presents the theoretical development of the BME approach for categorical variables. The three-stage procedure is explained, and the formulations for obtaining prior joint distributions and computing posterior conditional distributions are given for various typical cases. The last part of the paper consists of a simulation study assessing the performance of BME against the traditional indicator (co)kriging techniques. The results of these simulations highlight the theoretical limitations of the indicator approach (negative probability estimates, probability distributions that do not sum to one, etc.) as well as the much better performance of the BME approach, whose estimates are very close to the theoretical conditional probabilities that can be computed from the stated simulation hypotheses.

14.
Identification of rock boundaries and structural features from well log responses is a fundamental problem in geological field studies. In a complex geologic situation, however, such as in the presence of crystalline rocks where metamorphism leads to facies changes, it is not easy to extract accurate information from well log data using conventional artificial neural network (ANN) methods, and the inferences drawn by such methods can be ambiguous because of the strong overlap of well log signals, which are generally tainted with noise. Here, we develop an alternative ANN approach based on Bayesian statistics, using a Hybrid Monte Carlo (HMC)/Markov chain Monte Carlo (MCMC) inversion scheme, to model well log data from the German Continental Deep Drilling Program (KTB). The MCMC algorithm draws independent and identically distributed (i.i.d.) samples from the posterior probability distribution by Markov chain simulation, using the principles of statistical mechanics in Hamiltonian dynamics. In this algorithm, each trajectory is updated by approximating the Hamiltonian differential equations through a leapfrog discretization scheme. We examined the stability and efficiency of the HMC-based approach on data contaminated with different levels of colored noise. We also performed uncertainty analysis by estimating the standard deviation (STD) error map of the a posteriori covariance matrix at the network output for three types of lithofacies over the entire length of the lithosection of the KTB borehole. Our analyses demonstrate that the HMC-based approach provides a robust means of classifying complex lithofacies successions from noisy KTB borehole signals, and may therefore provide a useful guide for understanding crustal inhomogeneity and structural discontinuity in other tectonically critical and complex regions.
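The leapfrog discretization at the core of HMC can be sketched on a standard Gaussian target, for which the potential is U(q) = q²/2 and the total energy should be nearly conserved along a trajectory:

```python
import numpy as np

# Sketch of the leapfrog scheme used inside HMC: interleaved half-step
# momentum updates and full-step position updates approximate the
# Hamiltonian dynamics while nearly conserving the total energy.
def leapfrog(q, p, grad_U, eps, n_steps):
    q, p = float(q), float(p)
    p -= 0.5 * eps * grad_U(q)          # initial half-step for momentum
    for _ in range(n_steps - 1):
        q += eps * p                    # full position step
        p -= eps * grad_U(q)            # full momentum step
    q += eps * p
    p -= 0.5 * eps * grad_U(q)          # final half-step for momentum
    return q, p

# Standard Gaussian target: U(q) = q^2/2, so grad_U(q) = q.
H = lambda q, p: 0.5 * q * q + 0.5 * p * p
q0, p0 = 1.0, 0.5
q1, p1 = leapfrog(q0, p0, grad_U=lambda q: q, eps=0.01, n_steps=100)
energy_drift = abs(H(q1, p1) - H(q0, p0))
```

For this quadratic potential the exact dynamics is a rotation, q(t) = q0·cos t + p0·sin t, so the numerical trajectory can be checked directly.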

15.
Rainfall data in continuous space provide an essential input for most hydrological and water resources planning studies. The spatial distribution of rainfall is usually estimated from ground-based point rainfall data collected at sparsely positioned stations in a rain-gauge network. Kriging has become a widely used interpolation method for estimating the spatial distribution of climate variables, including rainfall. The objective of this study is to evaluate three geostatistical interpolation methods (ordinary kriging [OK], ordinary cokriging [OCK], and kriging with an external drift [KED]) and two deterministic methods (inverse distance weighting and radial basis functions) for enhanced spatial interpolation of monthly rainfall in the Middle Yarra River and Ovens River catchments in Victoria, Australia. Historical rainfall records from existing rain-gauge stations in the catchments over the 1980–2012 period are used for the analysis. A digital elevation model of each catchment provides supplementary information, in addition to rainfall, for the OCK and KED methods. The prediction performance of the interpolation methods is assessed through cross-validation. Results indicate that the geostatistical methods outperform the deterministic methods for spatial interpolation of rainfall, and that among the geostatistical methods OCK is the best interpolator for estimating the spatial rainfall distribution in both catchments, with the lowest prediction error between observed and estimated monthly rainfall. This study thus demonstrates that using elevation as an auxiliary variable in addition to rainfall data in a geostatistical framework can significantly enhance the estimation of rainfall over a catchment.
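The cross-validation used to compare the interpolators can be sketched with leave-one-out resampling; inverse distance weighting stands in here for the kriging variants, and the gauge data are invented:

```python
import numpy as np

# Sketch of leave-one-out cross-validation for a spatial interpolator:
# each gauge is removed in turn, predicted from the remaining gauges,
# and the prediction errors are accumulated into an RMSE.
rng = np.random.default_rng(11)

xy = rng.uniform(0, 100, size=(30, 2))              # hypothetical gauges
rain = 50 + 0.3 * xy[:, 0] + rng.normal(0, 2, 30)   # monthly rainfall (mm)

def idw_predict(xy_obs, vals, xy0, power=2.0):
    d = np.linalg.norm(xy_obs - xy0, axis=1)
    w = 1.0 / np.maximum(d, 1e-12) ** power
    return np.sum(w * vals) / np.sum(w)

errors = []
for i in range(len(rain)):                          # leave-one-out loop
    mask = np.arange(len(rain)) != i
    pred = idw_predict(xy[mask], rain[mask], xy[i])
    errors.append(pred - rain[i])
rmse = float(np.sqrt(np.mean(np.square(errors))))
```

Running the same loop with each candidate interpolator and comparing the RMSEs is the comparison strategy used in the study.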

16.
Traditional accuracy assessment of satellite-derived maps relies on a confusion matrix, and its associated indices, built by comparing ground truth observations and classification outputs at specific locations. These indices may be applied at the map level or at the class level, but they do not capture the spatial variation of the accuracy. Pixel-level thematic uncertainty measures derived from class membership probability vectors can provide such spatially explicit information. In this paper, a new information-based criterion, the equivalent reference probability, is introduced as a synoptic thematic uncertainty measure that has the advantage of taking the maximum probability value into account while accounting for the full set of probabilities. The fundamental theoretical properties of this indicator are first highlighted, and its use is then demonstrated on a real case study in Belgium. Results show that the proposed approach correlates positively with the quality of the classification and is more sensitive than the classical maximum probability criterion. Because this information-based criterion can be used to provide spatially explicit maps of thematic uncertainty, it offers substantial additional information on classification quality compared with conventional quality measures. Accordingly, it proved useful both for end-users and for map producers as a way to better understand the nature of the errors and to subsequently improve map quality.

17.
Gurdak JJ, McCray JE, Thyne G, Qi SL. Ground Water, 2007, 45(3): 348–361.
A methodology is proposed to quantify the prediction uncertainty associated with ground water vulnerability models developed by coupling multivariate logistic regression with a geographic information system (GIS). The method uses Latin hypercube sampling (LHS) to illustrate the propagation of input error and to estimate the uncertainty associated with the logistic regression predictions of ground water vulnerability. Central to the method is the assumption that prediction uncertainty in ground water vulnerability models is a function of input error propagated from uncertainty in the estimated logistic regression coefficients (model error) and in the values of the explanatory variables represented in the GIS (data error). Input probability distributions representing both the model error and data error sources of uncertainty were sampled simultaneously with a Latin hypercube approach, coupled with logistic regression calculations of the probability of elevated nonpoint source contaminants in ground water. The resulting probability distribution represents the prediction intervals and associated uncertainty of the ground water vulnerability predictions. The method is illustrated through a ground water vulnerability assessment of the High Plains regional aquifer. Results of the LHS simulations reveal significant prediction uncertainties that vary spatially across the regional aquifer. Additionally, the proposed method enables a spatial deconstruction of the prediction uncertainty that can lead to improved prediction of ground water vulnerability.
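The Latin hypercube step can be sketched as follows; the logistic regression coefficients and their error ranges are illustrative, not the High Plains values:

```python
import numpy as np

# Hedged sketch of LHS error propagation: model coefficients (model error)
# and an explanatory variable (data error) are sampled with a Latin
# hypercube and pushed through the logistic function.
rng = np.random.default_rng(5)

def latin_hypercube(n, dims, rng):
    # One stratified, randomly permuted uniform sample per dimension:
    # each column hits every one of the n strata exactly once.
    u = (rng.permuted(np.tile(np.arange(n), (dims, 1)), axis=1).T
         + rng.uniform(size=(n, dims))) / n
    return u

n = 2000
u = latin_hypercube(n, 3, rng)
b0 = -2.0 + 0.5 * (u[:, 0] - 0.5)      # intercept +/- model error
b1 = 0.8 + 0.2 * (u[:, 1] - 0.5)       # slope +/- model error
x = 1.0 + 2.0 * u[:, 2]                # explanatory variable +/- data error

p_vulnerable = 1.0 / (1.0 + np.exp(-(b0 + b1 * x)))
p_lo, p_hi = np.percentile(p_vulnerable, [2.5, 97.5])  # prediction interval
```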

18.
The specific objective of this paper is to propose a new flood frequency analysis method that considers both the uncertainty of probability distribution selection (model uncertainty) and the uncertainty of parameter estimation (parameter uncertainty). Based on Bayesian theory, the sampling distribution of quantiles (design floods) coupling these two kinds of uncertainty is derived, so that not only point estimators but also confidence intervals of the quantiles can be provided. Markov chain Monte Carlo sampling is adopted to overcome the difficulty of computing the integrals involved in estimating the sampling distribution. As an example, the proposed method is applied to flood frequency analysis at a gauge on the Huai River, China. It is shown that an approach considering only model uncertainty or only parameter uncertainty cannot fully account for the uncertainties in quantile estimation; instead, a method coupling the two should be employed. Furthermore, the proposed Bayesian method provides not only various quantile estimators but also a quantitative assessment of the uncertainties of flood frequency analysis.

19.
This study addresses the estimation of net irrigation requirement over a growing season under climate uncertainty. An ecohydrological model, building upon the stochastic differential equation of soil moisture dynamics, is employed as a basis to derive new analytical expressions for estimating the seasonal net irrigation requirement probabilistically. Two distinct irrigation technologies are considered. For micro irrigation technology, the probability density function of seasonal net irrigation depth (SNID) is derived by assessing the transient behavior of a stochastic process that is the time integral of a dichotomous Markov process. The probability mass function of SNID, which is a discrete random variable for traditional irrigation technology, is also presented using a marked renewal process with quasi-exponentially distributed time intervals. Comparing the results obtained from the presented models with those of a Monte Carlo approach verified the significance of the probabilistic expressions derived and the assumptions made.

20.
Parameter uncertainty in hydrologic modeling is crucial to flood simulation and forecasting. The Bayesian approach allows one to estimate parameters according to prior expert knowledge as well as observational data on model parameter values. This study assesses the performance of two popular uncertainty analysis (UA) techniques in evaluating model parameter uncertainty in flood simulations: generalized likelihood uncertainty estimation (GLUE) and a Bayesian method implemented with a Markov chain Monte Carlo sampling algorithm. The two methods were applied to the semi-distributed topographic hydrologic model (TOPMODEL), which includes five parameters, in a case study of a small humid catchment in southeastern China. The performance assessment was conducted with advanced tools suited to probabilistic simulations of continuous variables such as streamflow: graphical tools and scalar metrics were used to test several attributes of the simulation quality for selected flood events, namely deterministic accuracy and the accuracy of the 95% prediction probability uncertainty band (95PPU). Sensitivity analysis was conducted to identify the sensitive parameters that most affect the model output. The GLUE and Bayesian methods were then used to analyze the uncertainty of the sensitive parameters and to produce their posterior distributions, from which TOPMODEL simulations and the corresponding UA results were obtained. Results show that the form of the exponential decline in conductivity and the overland flow routing velocity were the sensitive TOPMODEL parameters in our case; small changes in these two parameters lead to large differences in the flood simulation results. Results also suggest that, for both UA techniques, most streamflow observations were bracketed by the 95PPU, with a containing ratio larger than 80%.
In comparison, GLUE gave narrower prediction uncertainty bands than the Bayesian method. The mode estimates of the parameter posterior distributions were found to yield better deterministic performance than the 50% percentiles for both the GLUE and Bayesian analyses. In addition, the simulation results calibrated with the Rosenbrock optimization algorithm agree better with the observations than the UA 50% percentiles, but slightly worse than the hydrographs from the mode estimates. These results clearly emphasize the importance of using model uncertainty diagnostic approaches in flood simulations.
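The GLUE procedure assessed above can be sketched on a toy linear model standing in for TOPMODEL:

```python
import numpy as np

# Minimal GLUE sketch on a toy model y = a*x: parameter sets are sampled,
# scored with a Nash-Sutcliffe-style likelihood, non-behavioral sets are
# discarded, and weighted 2.5/97.5 percentiles of the retained simulations
# form the 95% prediction uncertainty band (95PPU).
rng = np.random.default_rng(9)

x = np.linspace(0.0, 10.0, 50)
obs = 2.0 * x + rng.normal(0.0, 1.0, x.size)        # synthetic observations

a_samples = rng.uniform(0.0, 4.0, 5000)             # prior parameter sample
sims = a_samples[:, None] * x[None, :]              # one simulation per set

# Nash-Sutcliffe efficiency as the informal GLUE likelihood measure.
nse = 1.0 - np.sum((sims - obs) ** 2, axis=1) / np.sum((obs - obs.mean()) ** 2)
behavioral = nse > 0.5                              # behavioral threshold
weights = nse[behavioral] / nse[behavioral].sum()

def wpercentile(vals, w, q):
    # Weighted percentile via the cumulative weight of the sorted values.
    order = np.argsort(vals)
    cum = np.cumsum(w[order])
    return vals[order][np.searchsorted(cum, q / 100.0)]

band = np.array([[wpercentile(sims[behavioral, j], weights, 2.5),
                  wpercentile(sims[behavioral, j], weights, 97.5)]
                 for j in range(x.size)])
```

The fraction of observations falling inside `band` is the containing ratio reported in the study.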
