Similar Articles
A total of 20 similar articles were retrieved (search time: 421 ms).
1.
The multi-Gaussian model is used in geostatistical applications to predict functions of a regionalized variable and to assess uncertainty by determining local (conditional to neighboring data) distributions. The model relies on the assumption that the regionalized variable can be represented by a transform of a Gaussian random field with a known mean value, which is often a strong requirement. This article presents two variations of the model to account for an uncertain mean value. In the first one, the mean of the Gaussian random field is regarded as an unknown non-random parameter. In the second model, the mean of the Gaussian field is regarded as a random variable with a very large prior variance. The properties of the proposed models are compared in the context of non-linear spatial prediction and uncertainty assessment problems. Algorithms for the conditional simulation of Gaussian random fields with an uncertain mean are also examined, and problems associated with the selection of data in a moving neighborhood are discussed.
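As a rough illustration of how the treatment of the mean changes local conditional distributions, the sketch below contrasts simple kriging (mean assumed known) with ordinary kriging (mean treated as unknown), the latter corresponding to the limiting case of a random mean with a very large prior variance. The covariance model, data values and locations are invented for illustration and are not taken from the article.

```python
import numpy as np

def exp_cov(h, sill=1.0, corr_len=10.0):
    """Isotropic exponential covariance model (illustrative choice)."""
    return sill * np.exp(-np.abs(h) / corr_len)

# Invented 1-D data: locations and values of the (Gaussian-transformed) variable
x_data = np.array([2.0, 5.0, 9.0, 14.0])
z_data = np.array([0.3, -0.1, 0.8, 0.4])
x0 = 7.0                       # prediction location
mean_known = 0.0               # assumed known mean for simple kriging

d = np.abs(x_data[:, None] - x_data[None, :])
C = exp_cov(d)                          # data-to-data covariances
c0 = exp_cov(np.abs(x_data - x0))       # data-to-target covariances

# --- Simple kriging: mean assumed known ---
w_sk = np.linalg.solve(C, c0)
sk_mean = mean_known + w_sk @ (z_data - mean_known)
sk_var = exp_cov(0.0) - w_sk @ c0

# --- Ordinary kriging: mean treated as unknown (unbiasedness constraint) ---
n = len(x_data)
A = np.block([[C, np.ones((n, 1))], [np.ones((1, n)), np.zeros((1, 1))]])
b = np.append(c0, 1.0)
sol = np.linalg.solve(A, b)
w_ok, mu = sol[:n], sol[n]              # weights and Lagrange multiplier
ok_mean = w_ok @ z_data
ok_var = exp_cov(0.0) - w_ok @ c0 - mu  # accounts for the unknown mean

print(f"simple kriging  : mean={sk_mean:.3f}, var={sk_var:.3f}")
print(f"ordinary kriging: mean={ok_mean:.3f}, var={ok_var:.3f}")
```

In general the ordinary kriging variance is at least as large as the simple kriging variance, reflecting the additional uncertainty about the mean.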

2.
Ground water model calibration using pilot points and regularization (total citations: 9; self-citations: 0; citations by others: 9)
Doherty J. Ground Water, 2003, 41(2): 170–177.
Use of nonlinear parameter estimation techniques is now commonplace in ground water model calibration. However, there is still ample room for further development of these techniques in order to enable them to extract more information from calibration datasets, to more thoroughly explore the uncertainty associated with model predictions, and to make them easier to implement in various modeling contexts. This paper describes the use of "pilot points" as a methodology for spatial hydraulic property characterization. When used in conjunction with nonlinear parameter estimation software that incorporates advanced regularization functionality (such as PEST), use of pilot points can add a great deal of flexibility to the calibration process at the same time as it makes this process easier to implement. Pilot points can be used either as a substitute for zones of piecewise parameter uniformity, or in conjunction with such zones. In either case, they allow the disposition of areas of high and low hydraulic property value to be inferred through the calibration process, without the need for the modeler to guess the geometry of such areas prior to estimating the parameters that pertain to them. Pilot points and regularization can also be used as an adjunct to geostatistically based stochastic parameterization methods. Using the techniques described herein, a series of hydraulic property fields can be generated, all of which recognize the stochastic characterization of an area at the same time that they satisfy the constraints imposed on hydraulic property values by the need to ensure that model outputs match field measurements. Model predictions can then be made using all of these fields as a mechanism for exploring predictive uncertainty.
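A minimal sketch of the pilot-point idea with a Tikhonov-style regularization term is given below. It is not PEST: an invented linear operator G stands in for the groundwater simulator, pilot-point values are spread onto the grid with inverse-distance weights rather than kriging, and all locations, observations and the regularization weight are assumptions made for the example.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

n_cells, n_pilot = 50, 6
x_cells = np.linspace(0.0, 100.0, n_cells)
x_pilot = np.linspace(5.0, 95.0, n_pilot)        # pilot-point locations

def interpolate(logk_pilot):
    """Spread pilot-point log-K values onto the model grid (inverse-distance weights)."""
    w = 1.0 / (np.abs(x_cells[:, None] - x_pilot[None, :]) + 1e-6) ** 2
    w /= w.sum(axis=1, keepdims=True)
    return w @ logk_pilot

# Invented 'true' field and a stand-in linear forward model (a real case would run MODFLOW/PEST)
G = rng.normal(size=(12, n_cells)) / n_cells     # sensitivity of 12 'head' observations
logk_true = np.sin(x_cells / 15.0)
obs = G @ logk_true + rng.normal(scale=0.01, size=12)

def objective(logk_pilot, lam=0.1, preferred=0.0):
    misfit = np.sum((G @ interpolate(logk_pilot) - obs) ** 2)
    regularization = lam * np.sum((logk_pilot - preferred) ** 2)   # Tikhonov preferred-value term
    return misfit + regularization

res = minimize(objective, x0=np.zeros(n_pilot), method="L-BFGS-B")
print("estimated pilot-point log-K:", np.round(res.x, 3))
```

The regularization weight controls the trade-off between fitting the observations and staying close to the preferred parameter value, which is the flexibility the abstract refers to.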

3.
Monte-Carlo simulations of a two-dimensional finite element model of a flood in the southern part of Sicily were used to explore the parameter space of distributed bed-roughness coefficients. For many real-world events specific data are extremely limited so that there is not only fuzziness in the information available to calibrate the model, but fuzziness in the degree of acceptability of model predictions based upon the different parameter values, owing to model structural errors. Here the GLUE procedure is used to compare model predictions and observations for a certain event, coupled with both a fuzzy-rule-based calibration, and a calibration technique based upon normal and heteroscedastic distributions of the predicted residuals. The fuzzy-rule-based calibration is suited to an event of this kind, where the information about the flood is highly uncertain and arises from several different types of observation. The likelihood (relative possibility) distributions predicted by the two calibration techniques are similar, although the fuzzy approach enabled us to constrain the parameter distributions more usefully, to lie within a range which was consistent with the modellers' a priori knowledge of the system.
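The sketch below illustrates the general GLUE pattern with a fuzzy measure of acceptability: Monte Carlo sampling of a roughness coefficient, a trapezoidal membership function applied to an imprecisely known observation, and retention of the behavioural samples with their weights. The toy stage-roughness relation, the prior range and the fuzzy bounds are all invented and stand in for the finite element model and field evidence used in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def toy_flood_model(manning_n):
    """Stand-in for the 2-D finite element model: peak water level as a function of roughness."""
    return 2.0 + 8.0 * manning_n          # purely illustrative relationship

def trapezoidal_membership(value, a, b, c, d):
    """Fuzzy degree of acceptability: 0 outside [a, d], 1 inside [b, c], linear in between."""
    if value <= a or value >= d:
        return 0.0
    if b <= value <= c:
        return 1.0
    return (value - a) / (b - a) if value < b else (d - value) / (d - c)

# The observed peak level is only known to lie 'somewhere around' 2.4-2.6 m
obs_fuzzy = (2.30, 2.40, 2.60, 2.70)

# GLUE: Monte Carlo sampling of the roughness coefficient over its prior range
n_samples = rng.uniform(0.02, 0.08, size=5000)
weights = np.array([trapezoidal_membership(toy_flood_model(n), *obs_fuzzy) for n in n_samples])

behavioural = weights > 0.0               # retain only 'behavioural' parameter sets
w = weights[behavioural] / weights[behavioural].sum()
post_mean = np.sum(w * n_samples[behavioural])
print(f"behavioural samples: {behavioural.sum()}, weighted mean roughness: {post_mean:.4f}")
```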

4.
Producing accurate spatial predictions for wind power generation together with a quantification of uncertainties is required to plan and design optimal networks of wind farms. Toward this aim, we propose spatial models for predicting wind power generation at two different time scales: for annual average wind power generation, and for a high temporal resolution (typically wind power averages over 15-min time steps). In both cases, we use a spatial hierarchical statistical model in which spatial correlation is captured by a latent Gaussian field. We explore how such models can be handled with stochastic partial differential equation approximations of Matérn Gaussian fields together with Integrated Nested Laplace Approximations. We demonstrate the proposed methods on wind farm data from Western Denmark, and compare the results to those obtained with standard geostatistical methods. The results show that our method makes it possible to obtain fast and accurate predictions from posterior marginals for wind power generation. The proposed method is applicable in scientific areas as diverse as climatology, environmental sciences, earth sciences and epidemiology.
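Neither INLA nor the SPDE representation is reproduced here; as a very rough stand-in, the sketch below computes the exact Gaussian conditional (posterior) mean and variance of a latent field with a Matérn covariance observed with Gaussian noise, which is the conjugate special case of the latent Gaussian models the abstract describes. The coordinates, observations, and covariance parameters are all invented.

```python
import numpy as np
from scipy.spatial.distance import cdist
from scipy.special import kv, gamma

def matern_cov(h, sigma2=1.0, rho=20.0, nu=1.5):
    """Matérn covariance; the SPDE approach approximates fields with this covariance."""
    h = np.asarray(h, dtype=float)
    scaled = np.sqrt(2 * nu) * h / rho
    c = np.full_like(h, sigma2)
    nz = scaled > 0
    c[nz] = sigma2 * (2 ** (1 - nu) / gamma(nu)) * scaled[nz] ** nu * kv(nu, scaled[nz])
    return c

rng = np.random.default_rng(2)
sites = rng.uniform(0, 100, size=(30, 2))                          # invented wind-farm coordinates
y = np.sin(sites[:, 0] / 30.0) + rng.normal(scale=0.2, size=30)    # invented annual averages

tau2 = 0.2 ** 2                                                    # observation-noise variance
K = matern_cov(cdist(sites, sites))
grid = np.column_stack([np.linspace(0, 100, 25), np.full(25, 50.0)])
Ks = matern_cov(cdist(grid, sites))

# Gaussian-conjugate posterior for the latent field on the grid (a stand-in for INLA marginals)
A = K + tau2 * np.eye(len(y))
post_mean = Ks @ np.linalg.solve(A, y)
post_var = matern_cov(np.zeros(len(grid))) - np.sum(Ks * np.linalg.solve(A, Ks.T).T, axis=1)
print(np.round(post_mean[:5], 3), np.round(post_var[:5], 3))
```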

5.
In most groundwater applications, measurements of concentration are limited in number and sparsely distributed within the domain of interest. Therefore, interpolation techniques are needed to obtain most likely values of concentration at locations where no measurements are available. For further processing, for example, in environmental risk analysis, interpolated values should be given with uncertainty bounds, so that a geostatistical framework is preferable. Linear interpolation of steady-state concentration measurements is problematic because the dependence of concentration on the primary uncertain material property, the hydraulic conductivity field, is highly nonlinear, suggesting that the statistical interrelationship between concentration values at different points is also nonlinear. We suggest interpolating steady-state concentration measurements by conditioning an ensemble of the underlying log-conductivity field on the available hydrological data in a conditional Monte Carlo approach. Flow and transport simulations for each conditional conductivity field must meet the measurements within their given uncertainty. The ensemble of transport simulations based on the conditional log-conductivity fields yields conditional statistical distributions of concentration at points between observation points. This method implicitly meets physical bounds of concentration values and non-Gaussianity of their statistical distributions and obeys the nonlinearity of the underlying processes. We validate our method by artificial test cases and compare the results to kriging estimates assuming different conditional statistical distributions of concentration. Assuming a beta distribution in kriging leads to estimates of concentration with zero probability of concentrations below zero or above the maximal possible value; however, the concentrations are not forced to meet the advection-dispersion equation.
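The sketch below shows the ensemble pattern only: generate log-conductivity realizations, push each through a forward model, and read off the empirical conditional distribution of concentration at an unobserved location. The conditioning step on hydrological data is omitted, and an invented nonlinear function replaces the flow-and-transport simulator, so the numbers are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

n_cells, n_real = 40, 500
x = np.arange(n_cells, dtype=float)

def exp_cov(d, sill=1.0, corr_len=8.0):
    return sill * np.exp(-d / corr_len)

C = exp_cov(np.abs(x[:, None] - x[None, :]))
L = np.linalg.cholesky(C + 1e-8 * np.eye(n_cells))

# Unconditional log-K realizations; in the article these are conditioned on hydraulic data
logK = (L @ rng.normal(size=(n_cells, n_real))).T

def toy_transport(logk_field):
    """Invented nonlinear stand-in for a steady-state flow-and-transport simulation."""
    resistance = np.cumsum(np.exp(-logk_field))          # crude travel-time surrogate
    return np.exp(-resistance / resistance[-1] * 3.0)    # concentration in (0, 1], decreasing

conc = np.array([toy_transport(f) for f in logK])        # ensemble of concentration profiles

# Conditional statistics of concentration at an unobserved location (cell 25)
cell = 25
print("median:", np.median(conc[:, cell]).round(3),
      "5-95% range:", np.percentile(conc[:, cell], [5, 95]).round(3))
```

Because the statistics are taken from simulated concentrations rather than interpolated directly, they automatically respect physical bounds and non-Gaussian shapes, which is the point the abstract makes.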

6.
In this study, we focus on a hydrogeological inverse problem specifically targeting monitoring soil moisture variations using tomographic ground penetrating radar (GPR) travel time data. Technical challenges exist in the inversion of GPR tomographic data for handling non-uniqueness, nonlinearity and high-dimensionality of unknowns. We have developed a new method for estimating soil moisture fields from crosshole GPR data. It uses a pilot-point method to provide a low-dimensional representation of the relative dielectric permittivity field of the soil, which is the primary object of inference: the field can be converted to soil moisture using a petrophysical model. We integrate a multi-chain Markov chain Monte Carlo (MCMC)–Bayesian inversion framework with the pilot point concept, a curved-ray GPR travel time model, and a sequential Gaussian simulation algorithm, for estimating the dielectric permittivity at pilot point locations distributed within the tomogram, as well as the corresponding geostatistical parameters (i.e., spatial correlation range). We infer the dielectric permittivity as a probability density function, thus capturing the uncertainty in the inference. The multi-chain MCMC enables addressing high-dimensional inverse problems as required in the inversion setup. The method is scalable in terms of number of chains and processors, and is useful for computationally demanding Bayesian model calibration in scientific and engineering problems. The proposed inversion approach can successfully approximate the posterior density distributions of the pilot points, and capture the true values. The computational efficiency, accuracy, and convergence behaviors of the inversion approach were also systematically evaluated, by comparing the inversion results obtained with different levels of noise in the observations, increased amounts of observational data, and increased numbers of pilot points.
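A heavily simplified sketch of the inversion pattern follows: a single-chain random-walk Metropolis sampler over pilot-point permittivities, with straight horizontal rays and linear interpolation between pilot points. The multi-chain sampler, curved-ray solver, sequential Gaussian simulation and petrophysical conversion from the paper are all omitted, and the geometry, data and noise level are invented.

```python
import numpy as np

rng = np.random.default_rng(4)
c0 = 0.3                                   # speed of light in vacuum [m/ns]
borehole_sep = 5.0                         # invented crosshole separation [m]

depths = np.linspace(0.5, 9.5, 19)         # layer centres sampled by (assumed) horizontal rays
z_pilot = np.array([1.0, 3.5, 6.0, 8.5])   # pilot-point depths

def interp_eps(eps_pilot):
    """Linear interpolation of permittivity from pilot points to all layers."""
    return np.interp(depths, z_pilot, eps_pilot)

def travel_times(eps_layers):
    """Straight-ray crosshole travel times [ns]; the article uses a curved-ray solver."""
    return borehole_sep * np.sqrt(eps_layers) / c0

eps_true = interp_eps(np.array([9.0, 16.0, 12.0, 20.0]))          # invented 'truth'
sigma = 0.3
t_obs = travel_times(eps_true) + rng.normal(scale=sigma, size=depths.size)

def log_post(eps_pilot):
    if np.any(eps_pilot < 1.0) or np.any(eps_pilot > 40.0):       # flat prior bounds
        return -np.inf
    resid = travel_times(interp_eps(eps_pilot)) - t_obs
    return -0.5 * np.sum((resid / sigma) ** 2)

# Single-chain random-walk Metropolis over the pilot-point permittivities
current = np.full(4, 10.0)
lp = log_post(current)
samples = []
for _ in range(20000):
    prop = current + rng.normal(scale=0.5, size=4)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        current, lp = prop, lp_prop
    samples.append(current.copy())
samples = np.array(samples[5000:])                                # discard burn-in
print("posterior means:", samples.mean(axis=0).round(2))
```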

7.
The interactive multi-objective genetic algorithm (IMOGA) combines traditional optimization with an interactive framework that considers the subjective knowledge of hydro-geological experts in addition to quantitative calibration measures such as calibration errors and regularization to solve the groundwater inverse problem. The IMOGA is inherently a deterministic framework and identifies multiple large-scale parameter fields (typically head and transmissivity data are used to identify transmissivity fields). These large-scale parameter fields represent the optimal trade-offs between the different criteria (quantitative and qualitative) used in the IMOGA. This paper further extends the IMOGA to incorporate uncertainty both in the large-scale trends and in the small-scale variability (which cannot be resolved using the field data) in the parameter fields. The different parameter fields identified by the IMOGA represent the uncertainty in large-scale trends, and this uncertainty is modeled using a Bayesian approach where calibration error, regularization, and the expert’s subjective preference are combined to compute a likelihood metric for each parameter field. Small-scale (stochastic) variability is modeled using a geostatistical approach and added onto the large-scale trends identified by the IMOGA. This approach is applied to the Waste Isolation Pilot Plant (WIPP) case-study. Results, with and without expert interaction, are analyzed and the impact that expert judgment has on predictive uncertainty at the WIPP site is discussed. It is shown that for this case, expert interaction leads to more conservative solutions as the expert compensates for some of the lack of data and modeling approximations introduced in the formulation of the problem.

8.
The problem of updating a structural model and its associated uncertainties by utilizing structural response data is addressed. In an identifiable case, the posterior probability density function (PDF) of the uncertain model parameters for given measured data can be approximated by a weighted sum of Gaussian distributions centered at a number of discrete optimal values of the parameters at which some positive measure-of-fit function is minimized. The present paper focuses on the problem of model updating in the general unidentifiable case for which certain simplifying assumptions available for identifiable cases are not valid. In this case, the PDF is distributed in the neighbourhood of an extended and usually highly complex manifold of the parameter space that cannot be calculated explicitly. The computational difficulties associated with calculating the highly complex posterior PDF are discussed and a new adaptive algorithm, referred to as the tangential-projection (TP) algorithm, allowing for an efficient approximate representation of the above manifold and the posterior PDF is presented. Using this approximation, expressions for calculating the uncertain predictive response are established. A numerical example involving noisy data is presented to demonstrate the proposed method. Copyright © 2002 John Wiley & Sons, Ltd.

9.
It is common in geostatistics to use the variogram to describe the spatial dependence structure and to use kriging as the spatial prediction methodology. Both methods are sensitive to outlying observations and are strongly influenced by the marginal distribution of the underlying random field. Hence, they lead to unreliable results when applied to extreme value or multimodal data. As an alternative to traditional spatial modeling and interpolation we consider the use of copula functions. This paper extends existing copula-based geostatistical models. We show how location-dependent covariates, e.g., a spatial trend, can be accounted for in spatial copula models. Furthermore, we introduce geostatistical copula-based models that are able to deal with random fields having discrete marginal distributions. We propose three different copula-based spatial interpolation methods. By exploiting the relationship between bivariate copulas and indicator covariances, we present indicator kriging and disjunctive kriging. As a second method we present simple kriging of the rank-transformed data. The third method is a plug-in prediction and generalizes the frequently applied trans-Gaussian kriging. Finally, we report on the results obtained for the so-called Helicopter data set which contains extreme radioactivity measurements.
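The second of the three interpolation routes mentioned (simple kriging of rank-transformed data) can be sketched in a few lines. The copula fitting itself is not shown: the sketch assumes an exponential correlation on the normal scores and back-transforms the predicted score through the empirical quantile function of the data. Locations, values and the correlation length are invented.

```python
import numpy as np
from scipy.stats import norm, rankdata

rng = np.random.default_rng(5)

# Invented skewed observations (e.g. radioactivity readings) at 1-D locations
x_data = np.sort(rng.uniform(0, 100, 25))
z_data = np.exp(rng.normal(size=25))               # lognormal-looking, outlier-prone

# Rank transform to standard-normal scores (the marginal is handled separately from dependence)
u = rankdata(z_data) / (len(z_data) + 1)
y = norm.ppf(u)

def cov(h, corr_len=15.0):
    return np.exp(-np.abs(h) / corr_len)

C = cov(x_data[:, None] - x_data[None, :])
x0 = 42.0
c0 = cov(x_data - x0)

w = np.linalg.solve(C, c0)                          # simple kriging weights (zero-mean scores)
y0 = w @ y                                          # predicted score at x0
p0 = norm.cdf(y0)

# Back-transform through the empirical quantile function of the data
z0 = np.quantile(z_data, p0)
print(f"predicted (median-type) value at x0={x0}: {z0:.3f}")
```

Working on ranks makes the prediction insensitive to outliers in the raw values, which is the motivation given in the abstract.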

10.
Due to the rapidly increasing availability and diversity of information sources in environmental sciences, there is a real need for sound statistical mapping techniques that use them jointly inside a unique theoretical framework. As these information sources may vary with respect to their nature (continuous vs. categorical or qualitative), their spatial density, and their intrinsic quality (soft vs. hard data), the design of such techniques is a challenging issue. In this paper, an efficient method for combining spatially non-exhaustive categorical and continuous data in a mapping context is proposed, based on the Bayesian maximum entropy paradigm. This approach relies first on the definition of a mixed random field that can account for a stochastic link between categorical and continuous random fields through the use of a cross-covariance function. When incorporating general knowledge about the first- and second-order moments of these fields, it is shown that, under mild hypotheses, their joint distribution can be expressed as a mixture of conditional Gaussian prior distributions, with parameter estimates obtained from entropy maximization. A posterior distribution that incorporates the various (soft or hard) continuous and categorical data at hand can then be obtained by a straightforward conditionalization step. The use and potential of the method are illustrated by way of a simulated case study. A comparison with a few common geostatistical methods in some limiting cases also emphasizes their similarities and differences, from both the theoretical and practical viewpoints. As expected, adding categorical information may significantly improve the spatial prediction of a continuous variable, making this approach powerful and very promising.

11.
The groundwater inverse problem of estimating heterogeneous groundwater model parameters (hydraulic conductivity in this case) given measurements of aquifer response (such as hydraulic heads) is known to be an ill-posed problem, with multiple parameter values giving similar fits to the aquifer response measurements. This problem is further exacerbated due to the lack of extensive data, typical of most real-world problems. In such cases, it is desirable to incorporate expert knowledge in the estimation process to generate more reasonable estimates. This work presents a novel interactive framework, called the ‘Interactive Multi-Objective Genetic Algorithm’ (IMOGA), to solve the groundwater inverse problem considering different sources of quantitative data as well as qualitative expert knowledge about the site. The IMOGA is unique in that it looks at groundwater model calibration as a multi-objective problem consisting of quantitative objectives – calibration error and regularization – and a ‘qualitative’ objective based on the preference of the geological expert for different spatial characteristics of the conductivity field. All these objectives are then included within a multi-objective genetic algorithm to find multiple solutions that represent the best combination of all quantitative and qualitative objectives. A hypothetical aquifer case-study (based on the test case presented by Freyberg [Freyberg DL. An exercise in ground-water model calibration and prediction. Ground Water 1988;26(3)]), for which the ‘true’ parameter values are known, is used as a test case to demonstrate the applicability of this method. It is shown that using automated calibration techniques without expert interaction leads to parameter values that are not consistent with site-knowledge. Adding expert interaction is shown to improve not only the plausibility of the estimated conductivity fields but also the predictive accuracy of the calibrated model.
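The multi-objective genetic algorithm and the expert-interaction loop are not reproduced here; the small sketch below only shows the Pareto-dominance test used to retain candidate conductivity fields that are non-dominated with respect to three objectives (calibration error, regularization, and an expert preference score, here negated so that all objectives are minimized). All objective values are invented.

```python
import numpy as np

rng = np.random.default_rng(10)

# Invented objective values for 50 candidate conductivity fields:
# column 0: calibration error, column 1: regularization penalty, column 2: -expert preference
objectives = np.column_stack([
    rng.uniform(0.1, 2.0, 50),
    rng.uniform(0.0, 1.0, 50),
    -rng.uniform(0.0, 10.0, 50),
])

def is_dominated(i, obj):
    """Field i is dominated if some other field is no worse in all objectives and better in one."""
    others = np.delete(obj, i, axis=0)
    return np.any(np.all(others <= obj[i], axis=1) & np.any(others < obj[i], axis=1))

pareto = [i for i in range(len(objectives)) if not is_dominated(i, objectives)]
print("non-dominated candidate fields:", pareto)
```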

12.
In many fields of study, and certainly in hydrogeology, uncertainty propagation is a recurring subject. Usually, parametrized probability density functions (PDFs) are used to represent data uncertainty, which limits their use to particular distributions. Often, this problem is solved by Monte Carlo simulation, with the disadvantage that one needs a large number of calculations to achieve reliable results. In this paper, a method is proposed based on a piecewise linear approximation of PDFs. The uncertainty propagation with these discretized PDFs is distribution independent. The method is applied to the upscaling of transmissivity data and is carried out in two steps: the vertical upscaling of conductivity values from borehole data to aquifer scale, and the spatial interpolation of the transmissivities. The results of this first step are complete PDFs of the transmissivities at borehole locations reflecting the uncertainties of the conductivities and the layer thicknesses. The second step results in a spatially distributed transmissivity field with a complete PDF at every grid cell. We argue that the proposed method is applicable to a wide range of uncertainty propagation problems.
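A minimal sketch of distribution-independent propagation of discretized PDFs follows, for the simplest case where transmissivity is the sum of two independent layer contributions; the density of the sum is obtained by numerical convolution of the tabulated densities rather than by Monte Carlo. The distributions and units are invented and the full vertical-upscaling workflow of the article is not reproduced.

```python
import numpy as np
from scipy.stats import norm, lognorm

dx = 0.05
x = np.arange(0.0, 40.0, dx)          # transmissivity-contribution axis [m^2/d]

# Invented finely discretized PDFs of the two layer contributions (K_i * b_i);
# any tabulated (piecewise linear) density could be used instead of these parametric forms.
pdf1 = lognorm(s=0.4, scale=8.0).pdf(x)
pdf2 = norm(loc=12.0, scale=2.0).pdf(x)

# Density of the sum T = contribution1 + contribution2 via discrete convolution
pdf_T = np.convolve(pdf1, pdf2) * dx
t_axis = np.arange(len(pdf_T)) * dx

# Because the full PDF of T is retained, any statistic or exceedance probability follows directly
mean_T = np.sum(t_axis * pdf_T) * dx
p_above_25 = np.sum(pdf_T[t_axis > 25.0]) * dx
print(f"mean T = {mean_T:.2f} m^2/d, P(T > 25) = {p_above_25:.3f}")
```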

13.
Let us consider a large set of candidate parameter fields, such as hydraulic conductivity maps, on which we can run an accurate forward flow and transport simulation. We address the issue of rapidly identifying a subset of candidates whose response best matches a reference response curve. In order to keep the number of calls to the accurate flow simulator computationally tractable, a recent distance-based approach relying on fast proxy simulations is revisited, and turned into a non-stationary kriging method where the covariance kernel is obtained by combining a classical kernel with the proxy. Once the accurate simulator has been run for an initial subset of parameter fields and a kriging metamodel has been inferred, the predictive distributions of misfits for the remaining parameter fields can be used as a guide to select candidate parameter fields in a sequential way. The proposed algorithm, Proxy-based Kriging for Sequential Inversion (ProKSI), relies on a variant of the Expected Improvement, a popular criterion for kriging-based global optimization. A statistical benchmark of ProKSI’s performances illustrates the efficiency and the robustness of the approach when using different kinds of proxies.
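The sketch below captures only the sequential-selection pattern, not ProKSI itself: a stationary Gaussian-process (kriging) metamodel of the accurate misfit as a function of a proxy misfit value, with the standard Expected Improvement criterion used to choose the next candidate field to run through the accurate simulator. The non-stationary proxy-based kernel of the paper is replaced by a plain squared-exponential kernel, and all candidate data are invented.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(6)

# Invented setting: 200 candidate parameter fields, each summarized by a cheap proxy misfit;
# the accurate simulator has already been run for 15 of them.
proxy = np.sort(rng.uniform(0, 10, 200))
true_misfit = 5.0 + 3.0 * np.sin(proxy) + 0.5 * proxy      # hidden relation, for the demo only
evaluated = rng.choice(200, size=15, replace=False)
y = true_misfit[evaluated] + rng.normal(scale=0.1, size=15)

def k(a, b, sill=4.0, corr_len=2.0):
    return sill * np.exp(-0.5 * ((a[:, None] - b[None, :]) / corr_len) ** 2)

# GP (kriging) metamodel of the accurate misfit as a function of the proxy value
K = k(proxy[evaluated], proxy[evaluated]) + 0.1 ** 2 * np.eye(15)
Ks = k(proxy, proxy[evaluated])
alpha = np.linalg.solve(K, y - y.mean())
mu = y.mean() + Ks @ alpha
var = 4.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
sd = np.sqrt(np.clip(var, 1e-12, None))

# Expected Improvement below the best misfit found so far -> next field to simulate accurately
best = y.min()
z = (best - mu) / sd
ei = (best - mu) * norm.cdf(z) + sd * norm.pdf(z)
ei[evaluated] = 0.0
print("next candidate to simulate accurately:", int(np.argmax(ei)))
```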

14.
This work evaluated the spatial variability and distribution of heterogeneous hydraulic conductivity (K) in the Choushui River alluvial fan in Taiwan, using ordinary kriging (OK) and mean and individual sequential Gaussian simulations (SGS). A baseline flow model constructed by upscaling parameters was inversely calibrated to determine the pumping and recharge rates. Simulated heads using different K realizations were then compared with historically measured heads. A global/local simulated error between simulated and measured heads was analysed to assess the different spatial variabilities of various estimated K distributions. The results of a MODFLOW simulation indicate that the OK realization had the smallest sum of absolute mean simulation errors (SAMSE) and the SGS realizations preserved the spatial variability of the measured K fields. Moreover, the SAMSE increases as the spatial variability of the K field increases. The OK realization yields small local simulation errors in the measured K field of moderate magnitude, whereas the SGS realizations have small local simulation errors in the measured K fields, with high and low values. The OK realization of K can be applied to perform a deterministic inverse calibration. The mean SGS method is suggested for constructing a K field when the application focuses on extreme values of estimated parameters and small calibration errors, such as in a simulation of contaminant transport in heterogeneous aquifers. The individual SGS realization is useful in stochastically assessing the spatial uncertainty of highly heterogeneous aquifers. Copyright © 2004 John Wiley & Sons, Ltd.
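For readers unfamiliar with sequential Gaussian simulation, a bare-bones 1-D version is sketched below: visit the grid nodes along a random path, compute a simple-kriging mean and variance at each node from the data plus previously simulated nodes, draw from that local Gaussian, and add the drawn value to the conditioning set. There is no search neighbourhood and no normal-score transform/back-transform here, and the data, covariance model and grid are all invented.

```python
import numpy as np

rng = np.random.default_rng(7)

def cov(h, sill=1.0, corr_len=10.0):
    return sill * np.exp(-np.abs(h) / corr_len)

# Invented normal-score log-K data (score transform and back-transform omitted for brevity)
x_obs = np.array([5.3, 20.7, 33.1, 47.6])
y_obs = np.array([0.8, -0.5, 0.2, 1.1])

grid = np.arange(0.0, 50.0, 1.0)
known_x = list(x_obs)
known_y = list(y_obs)

path = rng.permutation(len(grid))            # random visiting sequence (one SGS realization)
sim = np.full(len(grid), np.nan)

for idx in path:
    xk = np.array(known_x)
    yk = np.array(known_y)
    C = cov(xk[:, None] - xk[None, :]) + 1e-10 * np.eye(len(xk))
    c0 = cov(xk - grid[idx])
    w = np.linalg.solve(C, c0)
    mean = w @ yk                            # simple kriging mean (zero global mean assumed)
    var = max(cov(0.0) - w @ c0, 0.0)
    sim[idx] = rng.normal(mean, np.sqrt(var))
    known_x.append(grid[idx])                # simulated node becomes conditioning data
    known_y.append(sim[idx])

print(np.round(sim[:10], 2))
```

Repeating the loop with different random paths and seeds gives the ensemble of realizations whose spatial variability the abstract compares against the smoother ordinary-kriging field.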

15.
This work introduces a new variational Bayes data assimilation method for the stochastic estimation of precipitation dynamics using radar observations for short-term probabilistic forecasting (nowcasting). A previously developed spatial rainfall model based on the decomposition of the observed precipitation field using a basis function expansion captures the precipitation intensity from radar images as a set of ‘rain cells’. The prior distributions for the basis function parameters are carefully chosen to have a conjugate structure for the precipitation field model to allow a novel variational Bayes method to be applied to estimate the posterior distributions in closed form, based on solving an optimisation problem, in a spirit similar to 3D VAR analysis, but seeking approximations to the posterior distribution rather than simply the most probable state. A hierarchical Kalman filter is used to estimate the advection field based on the assimilated precipitation fields at two times. The model is applied to tracking precipitation dynamics in a realistic setting, using UK Met Office radar data from both a summer convective event and a winter frontal event. The performance of the model is assessed both traditionally and using probabilistic measures of fit based on ROC curves. The model is shown to provide very good assimilation characteristics, and promising forecast skill. Improvements to the forecasting scheme are discussed.

16.
17.
Being a non-linear method based on a rigorous formalism and an efficient processing of various information sources, the Bayesian maximum entropy (BME) approach has proven to be a very powerful method in the context of continuous spatial random fields, providing much more satisfactory estimates than those obtained from traditional linear geostatistics (i.e., the various kriging techniques). This paper aims at presenting an extension of the BME formalism in the context of categorical spatial random fields. In the first part of the paper, the indicator kriging and cokriging methods are briefly presented and discussed. A special emphasis is put on their inherent limitations, both from the theoretical and practical point of view. The second part aims at presenting the theoretical developments of the BME approach for the case of categorical variables. The three-stage procedure is explained and the formulations for obtaining prior joint distributions and computing posterior conditional distributions are given for various typical cases. The last part of the paper consists of a simulation study for assessing the performance of BME over the traditional indicator (co)kriging techniques. The results of these simulations highlight the theoretical limitations of the indicator approach (negative probability estimates, probability distributions that do not sum up to one, etc.) as well as the much better performance of the BME approach. Estimates are very close to the theoretical conditional probabilities, that can be computed according to the stated simulation hypotheses.

18.
A calibration method to solve the groundwater inverse problem under steady- and transient-state conditions is presented. The method compares kriged and numerical head field gradients to modify hydraulic conductivity without the use of non-linear optimization techniques. The process is repeated iteratively until a close match with piezometric data is reached. The approach includes a damping factor to avoid divergence and oscillation of the solution in areas of low hydraulic gradient and a weighting factor to account for temporal head variation in transient simulations. The efficiency of the method in terms of computing time and calibration results is demonstrated with a synthetic field. It is shown that the proposed method provides parameter fields that reproduce both hydraulic conductivity and piezometric data in a few forward model solutions. Stochastic numerical experiments are conducted to evaluate the sensitivity of the method to the damping function and to the head field estimation errors.
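A toy 1-D illustration of the gradient-comparison idea is sketched below: where the simulated head gradient is steeper than the observed one, the local conductivity is increased, with a damping exponent to keep the update stable. The fixed-head column model, the damping value and the use of the full observed head profile (in place of heads kriged from sparse piezometers) are assumptions made for the example, and without a flux observation K is identified only up to a multiplicative constant.

```python
import numpy as np

rng = np.random.default_rng(8)

n, dx = 60, 1.0
H0, HL = 10.0, 0.0

def solve_heads(K):
    """Steady 1-D flow between fixed heads; head drop per cell = q * dx / K."""
    q = (H0 - HL) / np.sum(dx / K)
    return H0 - np.concatenate(([0.0], np.cumsum(q * dx / K)))

K_true = np.exp(rng.normal(0.0, 0.5, n))
h_obs = solve_heads(K_true)               # stands in for heads kriged from piezometric data
grad_obs = np.diff(h_obs) / dx

K = np.ones(n)                            # initial conductivity guess
omega = 0.5                               # damping factor against oscillation/divergence
for it in range(50):
    h_sim = solve_heads(K)
    grad_sim = np.diff(h_sim) / dx
    ratio = grad_sim / grad_obs           # >1 where the simulated gradient is too steep -> K too low
    K *= ratio ** omega                   # damped multiplicative correction, no optimizer needed
    if np.max(np.abs(h_sim - h_obs)) < 1e-6:
        break

print("iterations:", it + 1, "max |h_sim - h_obs|:", np.max(np.abs(h_sim - h_obs)))
```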

19.
Forest inventories are mostly based on field observations, and complete records of spatial tree coordinates are seldom taken. The lack of individual coordinates prevents the use of well-established statistical inference tools based on the likelihood function. However, the Takacs–Fiksel approach, based on equating two expectations derived from different measures, can be used routinely without any measurement of tree coordinates, just by considering nearest neighbour measurements and the counting of trees at some random positions. Despite this, little attention has been paid to the Takacs–Fiksel method in terms of the type of test function and the type of field observation data considered. Motivated by problems based on field observations, we present a simulation study to analyse and illustrate the quality of the parameter estimates for this estimation approach under distinct simulated scenarios, where several test functions and distinct forest sampling designs are taken into account. Indeed, the type of the chosen test function affects the resulting estimates in terms of the forest field observation considered.

20.
Stochastic optimization methods, such as genetic algorithms, search for the global minimum of the misfit function within a given parameter range and do not require any calculation of the gradients of the misfit surfaces. More importantly, these methods collect a series of models and associated likelihoods that can be used to estimate the posterior probability distribution. However, because genetic algorithms are not a Markov chain Monte Carlo method, the direct use of the genetic-algorithm-sampled models and their associated likelihoods produces a biased estimation of the posterior probability distribution. In contrast, Markov chain Monte Carlo methods, such as the Metropolis–Hastings and Gibbs sampler, provide accurate posterior probability distributions but at considerable computational cost. In this paper, we use a hybrid method that combines the speed of a genetic algorithm to find an optimal solution and the accuracy of a Gibbs sampler to obtain a reliable estimation of the posterior probability distributions. First, we test this method on an analytical function and show that the genetic algorithm method cannot recover the true probability distributions and that it tends to underestimate the true uncertainties. Conversely, combining the genetic algorithm optimization with a Gibbs sampler step enables us to recover the true posterior probability distributions. Then, we demonstrate the applicability of this hybrid method by performing one-dimensional elastic full-waveform inversions on synthetic and field data. We also discuss how an appropriate genetic algorithm implementation is essential to attenuate the "genetic drift" effect and to maximize the exploration of the model space. In fact, a wide and efficient exploration of the model space is important not only to avoid entrapment in local minima during the genetic algorithm optimization but also to ensure a reliable estimation of the posterior probability distributions in the subsequent Gibbs sampler step.
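A compact two-stage sketch of the hybrid idea is given below: a simple genetic algorithm locates a good model, and a component-wise Metropolis (Metropolis-within-Gibbs) sampler started from that model then characterizes the posterior. The 2-D analytical likelihood surface, the GA operators and the proposal scales are invented stand-ins for the full-waveform inversion setting of the paper.

```python
import numpy as np

rng = np.random.default_rng(9)

def log_like(m):
    """Invented 2-D likelihood surface standing in for a waveform-inversion misfit."""
    return -0.5 * (((m[0] - 2.0) / 0.3) ** 2 + ((m[1] + 1.0) / 0.8) ** 2)

lo, hi = np.array([-5.0, -5.0]), np.array([5.0, 5.0])

# --- Stage 1: simple genetic algorithm to locate a good starting model ---
pop = rng.uniform(lo, hi, size=(40, 2))
for gen in range(60):
    fit = np.array([log_like(m) for m in pop])
    parents = pop[np.argsort(fit)[-20:]]                     # truncation selection
    idx = rng.integers(0, 20, size=(40, 2))
    alpha = rng.uniform(size=(40, 1))
    pop = alpha * parents[idx[:, 0]] + (1 - alpha) * parents[idx[:, 1]]   # arithmetic crossover
    pop += rng.normal(scale=0.1, size=pop.shape)             # mutation keeps diversity (guards against 'genetic drift')
    pop = np.clip(pop, lo, hi)
best = pop[np.argmax([log_like(m) for m in pop])]

# --- Stage 2: Metropolis-within-Gibbs around the GA solution for posterior uncertainty ---
current, lp = best.copy(), log_like(best)
samples = []
for _ in range(20000):
    for j in range(2):                                       # update one component at a time
        prop = current.copy()
        prop[j] += rng.normal(scale=0.3)
        lp_prop = log_like(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            current, lp = prop, lp_prop
    samples.append(current.copy())
samples = np.array(samples[2000:])
print("posterior mean:", samples.mean(axis=0).round(3),
      "posterior std:", samples.std(axis=0).round(3))
```

The sampled standard deviations, rather than the spread of the final GA population, are what provide the unbiased uncertainty estimate the abstract emphasizes.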
