Similar documents
20 similar documents retrieved.
1.
This paper investigates the effects of uncertainty in rock-physics models on reservoir parameter estimation using seismic amplitude variation with angle and controlled-source electromagnetic data. The reservoir parameters are related to electrical resistivity by the Poupon model and to elastic moduli and density by the Xu-White model. To handle uncertainty in the rock-physics models, we consider their outputs to be random functions with modes or means given by the predictions of those rock-physics models, and we consider the parameters of the rock-physics models to be random variables defined by specified probability distributions. Using a Bayesian framework and Markov Chain Monte Carlo sampling methods, we are able to obtain estimates of reservoir parameters and information on the uncertainty in the estimation. The developed method is applied to a synthetic case study based on a layered reservoir model, and the results show that uncertainty in both rock-physics models and in their parameters may have significant effects on reservoir parameter estimation. When the biases in rock-physics models and in their associated parameters are unknown, conventional joint inversion approaches, which consider rock-physics models as deterministic functions and the model parameters as fixed values, may produce misleading results. The stochastic method developed in this study provides an integrated approach for quantifying how uncertainty and biases in rock-physics models and in their associated parameters affect the estimates of reservoir parameters, and is therefore a more robust method for reservoir parameter estimation.
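A minimal sketch of the stochastic inversion idea described above, assuming a single scalar reservoir parameter, a hypothetical linear rock-physics relation in place of the Xu-White and Poupon models, and a random-walk Metropolis sampler; the rock-physics model error simply inflates the likelihood variance.

```python
import numpy as np

# Minimal Metropolis sketch: one scalar reservoir parameter m (a porosity-like
# quantity), a hypothetical rock-physics relation g(m), and Gaussian errors.
rng = np.random.default_rng(0)

def g(m):
    # placeholder rock-physics relation (stand-in for the Xu-White/Poupon models)
    return 2.0 - 1.5 * m

d_obs = g(0.25) + rng.normal(0.0, 0.05)    # one synthetic observation
sigma_d, sigma_rp = 0.05, 0.10             # data error and rock-physics model error

def log_post(m):
    if not 0.0 < m < 0.4:                  # uniform prior on m
        return -np.inf
    var = sigma_d**2 + sigma_rp**2         # model uncertainty inflates the likelihood variance
    return -0.5 * (d_obs - g(m))**2 / var

m, samples = 0.2, []
for _ in range(20000):
    m_new = m + rng.normal(0.0, 0.02)      # random-walk proposal
    if np.log(rng.uniform()) < log_post(m_new) - log_post(m):
        m = m_new
    samples.append(m)

post = np.array(samples[5000:])            # discard burn-in
print(f"posterior mean {post.mean():.3f}, std {post.std():.3f}")
```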

2.
This paper examines the distribution of anomalous areas of maximum variation in ground motion parameter uncertainty, for different probabilities of exceedance in 50 years, within the region 100°-120°E, 29°-42°N. The aim is to explain why, in an inhomogeneous distribution model that accounts for the spatio-temporal non-uniformity of seismic activity, anomalous areas of maximum variation in ground motion parameter uncertainty appear at certain cities and towns as a result of seismicity parameter uncertainty within a seismic statistical region. We also examine how the uncertainty of site risk estimation caused by seismicity parameter uncertainty in a statistical region relates to the delineation of potential sources, as well as the reasons why these anomalous areas form. The results indicate that, in the inhomogeneous distribution model, seismicity parameter uncertainty has an unequal influence on the uncertainty of risk estimation at different sites within a statistical region, and that this influence depends on the scheme used to delineate potential sources. Anomalous areas of maximum variation in ground motion parameter uncertainty often appear within potential sources with Mu ≥ 8 (where Mu is the upper-limit magnitude of a potential source) and in their vicinity. In the homogeneous distribution model, by contrast, this influence is the same everywhere. The uncertainty of risk estimation at a given site depends on its location: sites in the central part of a statistical region are affected only by the seismicity parameter uncertainty of that region, whereas sites at or near the junction of two or three statistical regions may be subject to the combined influence of the seismicity parameter uncertainties of several regions.

3.
The inverse problem of parameter structure identification in a distributed parameter system remains challenging. Identifying a more complex parameter structure requires more data. There is also the problem of over-parameterization. In this study, we propose a modified Tabu search for parameter structure identification. We embed an adjoint state procedure in the search process to improve the efficiency of the Tabu search. We use Voronoi tessellation for automatic parameterization to reduce the dimension of the distributed parameter. Additionally, a coarse-fine grid technique is applied to further improve the effectiveness and efficiency of the proposed methodology. To avoid over-parameterization, at each level of parameter complexity we calculate the residual error for parameter fitting, the parameter uncertainty error and a modified Akaike Information Criterion. To demonstrate the proposed methodology, we conduct numerical experiments with synthetic data that simulate both discrete hydraulic conductivity zones and a continuous hydraulic conductivity distribution. Our results indicate that the Tabu search allied with the adjoint state method significantly improves computational efficiency and effectiveness in solving the inverse problem of parameter structure identification.

4.
Modern ground water characterization and remediation projects routinely require calibration and inverse analysis of large three-dimensional numerical models of complex hydrogeological systems. Hydrogeologic complexity can arise from various aquifer characteristics, including complicated spatial hydrostratigraphy and aquifer recharge from infiltration through an unsaturated zone. To keep the numerical models computationally efficient, compromises are frequently made in model development, particularly about the resolution of the computational grid and the numerical representation of the governing flow equation. The compromise is required so that the model can be used in calibration, parameter estimation, performance assessment, and analysis of sensitivity and uncertainty in model predictions. However, grid properties and resolution as well as applied computational schemes can have large effects on forward-model predictions and on inverse parameter estimates. We investigate these effects for a series of one- and two-dimensional synthetic cases representing saturated and variably saturated flow problems. We show that "conformable" grids, despite neglecting terms in the numerical formulation, can lead to accurate solutions of problems with complex hydrostratigraphy. Our analysis also demonstrates that, despite slower computer run times and higher memory requirements for a given problem size, the control volume finite-element method showed an advantage over finite-difference techniques in accuracy of parameter estimation for a given grid resolution for most of the test problems.

5.
In the last few decades hydrologists have made tremendous progress in using dynamic simulation models for the analysis and understanding of hydrologic systems. However, predictions with these models are often deterministic and as such they focus on the most probable forecast, without an explicit estimate of the associated uncertainty. This uncertainty arises from incomplete process representation, uncertainty in initial conditions, input, output and parameter error. The generalized likelihood uncertainty estimation (GLUE) framework was one of the first attempts to represent prediction uncertainty within the context of Monte Carlo (MC) analysis coupled with Bayesian estimation and propagation of uncertainty. Because of its flexibility, ease of implementation and its suitability for parallel implementation on distributed computer systems, the GLUE method has been used in a wide variety of applications. However, the MC based sampling strategy of the prior parameter space typically utilized in GLUE is not particularly efficient in finding behavioral simulations. This becomes especially problematic for high-dimensional parameter estimation problems, and in the case of complex simulation models that require significant computational time to run and produce the desired output. In this paper we improve the computational efficiency of GLUE by sampling the prior parameter space using an adaptive Markov Chain Monte Carlo scheme (the Shuffled Complex Evolution Metropolis (SCEM-UA) algorithm). Moreover, we propose an alternative strategy to determine the value of the cutoff threshold based on the appropriate coverage of the resulting uncertainty bounds. We demonstrate the superiority of this revised GLUE method with three different conceptual watershed models of increasing complexity, using both synthetic and real-world streamflow data from two catchments with different hydrologic regimes.
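A compact sketch of the GLUE idea with the behavioural cutoff chosen from the coverage of the resulting uncertainty bounds, assuming a toy two-parameter recession model and plain Monte Carlo sampling in place of the SCEM-UA sampler; the model, parameter ranges and thresholds are illustrative only.

```python
import numpy as np

# Toy GLUE: sample the prior, score each run, then lower the behavioural cutoff
# until the 95% prediction bounds cover roughly 95% of the observations.
rng = np.random.default_rng(2)
t = np.arange(100)

def model(a, b):                               # hypothetical recession hydrograph
    return a * np.exp(-b * t / 100.0)

q_obs = model(10.0, 3.0) + rng.normal(0, 0.3, t.size)

a_s = rng.uniform(5, 15, 5000)                 # Monte Carlo sampling of the prior
b_s = rng.uniform(1, 5, 5000)
sims = np.array([model(a, b) for a, b in zip(a_s, b_s)])
nse = 1 - np.sum((sims - q_obs) ** 2, axis=1) / np.sum((q_obs - q_obs.mean()) ** 2)

for cutoff in np.arange(0.95, 0.0, -0.05):     # relax the threshold until coverage is adequate
    behavioural = sims[nse >= cutoff]
    if len(behavioural) < 20:
        continue
    lo, hi = np.percentile(behavioural, [2.5, 97.5], axis=0)
    coverage = np.mean((q_obs >= lo) & (q_obs <= hi))
    if coverage >= 0.95:
        print(f"cutoff {cutoff:.2f}: {len(behavioural)} behavioural runs, coverage {coverage:.2f}")
        break
```

The point of tying the cutoff to coverage, rather than fixing it a priori, is that the resulting bounds are calibrated against the observations themselves.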

6.
Previously we have detailed an application of the generalized likelihood uncertainty estimation (GLUE) procedure to estimate spatially distributed uncertainty in models conditioned against binary pattern data contained in flood inundation maps. This method was applied to two sites where a single consistent synoptic image of inundation extent was available to test the simulation performance of the method. In this paper, we extend this to examine the predictive performance of the method for a reach of the River Severn, west‐central England. Uniquely for this reach, consistent inundation images of two major floods have been acquired from spaceborne synthetic aperture radars, as well as a high‐resolution digital elevation model derived using laser altimetry. These data thus allow rigorous split sample testing of the previous GLUE application. To achieve this, Monte Carlo analyses of parameter uncertainty within the GLUE framework are conducted for a typical hydraulic model applied to each flood event. The best 10% of parameter sets identified in each analysis are then used to map uncertainty in flood extent predictions using the method previously proposed for both an independent validation data set and a design flood. Finally, methods for combining the likelihood information derived from each Monte Carlo ensemble are examined to determine whether this has the potential to reduce uncertainty in spatially distributed measures of flood risk for a design flood. The results show that for this reach and these events, the method previously established is able to produce sharply defined flood risk maps that compare well with observed inundation extent. More generally, we show that even single, poor‐quality inundation extent images are useful in constraining hydraulic model calibrations and that values of effective friction parameters are broadly stationary between the two events simulated, most probably reflecting their similar hydraulics. Copyright © 2004 John Wiley & Sons, Ltd.

7.
Peiyue Li, Hui Qian, Jianhua Wu. 《水文研究》 2014, 28(4): 2293-2301
Accurate knowledge of hydrogeological parameters is essential for groundwater modeling, protection and remediation. Three methods frequently applied to the estimation of leaky aquifer parameters, the type-curve fitting method, the inflection-point method and the global curve-fitting method (GCFM), were compared using synthetic pumping tests. The results revealed that GCFM provided the best parameter estimates of the three methods, with fewer uncertainties associated with the estimation process. GCFM was also found to be time-saving and low-cost, and is therefore preferable to the other two methods for hydrogeological parameter estimation. Copyright © 2013 John Wiley & Sons, Ltd.
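A sketch of the global curve-fitting idea for a leaky aquifer, assuming Hantush-Jacob drawdown at a single observation well with arbitrarily chosen true parameters; the type-curve and inflection-point methods compared in the paper are not reproduced here.

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import curve_fit

# Global curve fitting: estimate T, S and the leakage factor B by fitting the
# whole drawdown curve at once to the Hantush-Jacob solution.
Q, r = 500.0, 30.0                                   # pumping rate (m3/d), radius (m)

def hantush_W(u, beta):                              # Hantush well function W(u, r/B)
    f = lambda y: np.exp(-y - beta**2 / (4.0 * y)) / y
    return quad(f, u, np.inf)[0]

def drawdown(t, T, S, B):                            # t in days, drawdown in metres
    u = r**2 * S / (4.0 * T * t)
    return np.array([Q / (4.0 * np.pi * T) * hantush_W(ui, r / B) for ui in np.atleast_1d(u)])

t_obs = np.logspace(-3, 0, 20)
rng = np.random.default_rng(3)
s_obs = drawdown(t_obs, T=250.0, S=2e-4, B=400.0) + rng.normal(0, 0.002, t_obs.size)

popt, pcov = curve_fit(drawdown, t_obs, s_obs, p0=[100.0, 1e-3, 200.0],
                       bounds=([1.0, 1e-6, 10.0], [1e4, 1e-1, 5e3]), x_scale="jac")
print("T, S, B =", popt, " 1-sigma =", np.sqrt(np.diag(pcov)))
```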

8.
Bayesian prestack inversion based on nonlinear quadratic programming   (total citations: 23; self-citations: 11; citations by others: 12)
The goal of prestack inversion is to obtain reliable estimates of formation parameters from seismic data based on elastic wave theory, which are then used to characterize the fluids and lithology of the formation. However, prestack inversion problems are high-dimensional and ill-posed, and are easily affected by various types of noise and by uncertainties in the acquisition process; reasonable constraints must therefore be imposed on the inversion to obtain stable and reliable solutions. This paper proposes a prestack three-parameter inversion method based on nonlinear quadratic programming. First, within the framework of Bayesian parameter estimation, the likelihood function is assumed to be Gaussian and the parameters to be inverted are assumed to follow a modified Cauchy distribution, which improves the resolution of the inversion results. Second, a covariance matrix is used to describe the correlation among the parameters, further improving the stability of the inversion results. Finally, the problem is reformulated as a nonlinear quadratic programming problem and solved under multiple constraints. Both synthetic experiments and field applications show that the proposed inversion method is computationally fast and yields good inversion results even at very low signal-to-noise ratios, providing additional physical-property parameters for further reservoir identification.
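A minimal sketch of the kind of objective described above: a Gaussian likelihood plus a Cauchy-type prior on the model parameters, minimised with a general-purpose optimiser. The forward operator, the three-parameter covariance coupling and the quadratic-programming formulation of the paper are replaced here by a generic linear operator and an unconstrained solver.

```python
import numpy as np
from scipy.optimize import minimize

# Gaussian-likelihood + Cauchy-prior objective on a sparse-spike model.
rng = np.random.default_rng(4)
n = 50
m_true = np.zeros(n); m_true[[10, 25, 40]] = [1.0, -0.8, 0.6]     # sparse reflectivity
G = np.array([[np.exp(-0.5 * ((i - j) / 2.0) ** 2) for j in range(n)] for i in range(n)])
d = G @ m_true + rng.normal(0, 0.05, n)                           # noisy synthetic data

sigma_d, sigma_c = 0.05, 0.1

def neg_log_post(m):
    misfit = 0.5 * np.sum((d - G @ m) ** 2) / sigma_d**2          # Gaussian likelihood
    prior = np.sum(np.log(1.0 + m**2 / sigma_c**2))               # Cauchy prior favours sparse, high-resolution solutions
    return misfit + prior

res = minimize(neg_log_post, np.zeros(n), method="L-BFGS-B")
print("largest recovered spikes at indices", np.argsort(np.abs(res.x))[-3:])
```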

9.
A number of challenges including instability, nonconvergence, nonuniqueness, nonoptimality, and lack of a general guideline for inverse modelling have limited the application of automatic calibration by generic inversion codes in solving the saltwater intrusion problem in real‐world cases. A systematic parameter selection procedure for the selection of a small number of independent parameters is applied to a real case of saltwater intrusion in a small island aquifer system in the semiarid region of the Persian Gulf. The methodology aims at reducing parameter nonuniqueness and uncertainty and the time spent on inverse modelling computations. Subsequent to the automatic calibration of the numerical model, uncertainty is analysed by constrained nonlinear optimization of the inverse model. The results define the percentage of uncertainty in the parameter estimation that will maintain the model inside a user‐defined neighbourhood of the best possible calibrated model. Sensitivity maps of both pressure and concentration for the small island aquifer system are also developed. These sensitivity maps indicate higher sensitivity of pressure to model parameters compared with concentration. These sensitivity maps serve as a benchmark for correlation analysis and also assist in the selection of observation points of pressure and concentration in the calibration process. Copyright © 2012 John Wiley & Sons, Ltd.

10.
This paper proposes an approach to estimating the uncertainty related to two EPA Storm Water Management Model (SWMM) parameters, percentage routed (PR) and saturated hydraulic conductivity (Ksat), which are used to calculate stormwater runoff volumes. The methodology proposed in this paper addresses uncertainty through the development of probability distributions for urban hydrologic parameters through extensive calibration to observed flow data in the Philadelphia collection system. The established probability distributions are then applied to the Philadelphia Southeast district model through a Monte Carlo approach to estimate the uncertainty in prediction of combined sewer overflow volumes as related to hydrologic model parameter estimation. Understanding urban hydrology is critical to defining urban water resource problems. A variety of land use types within Philadelphia, coupled with a history of cut and fill, have resulted in a patchwork of urban fill and native soils. The complexity of urban hydrology can make model parameter estimation and defining model uncertainty a difficult task. The development of probability distributions for hydrologic parameters applied through Monte Carlo simulations provided a significant improvement in estimating model uncertainty over traditional model sensitivity analysis. Copyright © 2013 John Wiley & Sons, Ltd.
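A sketch of the Monte Carlo propagation step, assuming illustrative Beta and lognormal distributions for PR and Ksat and a toy runoff/overflow relation in place of a full SWMM run of the Philadelphia district model; all numbers are arbitrary.

```python
import numpy as np

# Sample the two hydrologic parameters from their probability distributions and
# propagate them through a toy runoff/overflow calculation for one design storm.
rng = np.random.default_rng(5)
n = 10000
pr = rng.beta(6, 2, n)                                        # fraction of impervious runoff routed to the sewer
ksat = rng.lognormal(mean=np.log(5.0), sigma=0.5, size=n)     # saturated hydraulic conductivity, mm/h

rain, duration, area = 30.0, 2.0, 100.0                       # mm, h, ha
imperv_runoff = pr * 0.45 * rain                              # impervious contribution (mm over the catchment)
perv_runoff = np.maximum(rain - ksat * duration, 0.0) * 0.55  # pervious contribution (mm)
runoff_mm = imperv_runoff + perv_runoff
cso_m3 = np.maximum(runoff_mm - 10.0, 0.0) * area * 10.0      # volume above interceptor capacity (1 mm over 1 ha = 10 m3)

print("CSO volume, 5th/50th/95th percentile (m3):",
      np.percentile(cso_m3, [5, 50, 95]).round(0))
```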

11.
In 1988, an important publication moved model calibration and forecasting beyond case studies and theoretical analysis. It reported on a somewhat idyllic graduate student modeling exercise where many of the system properties were known; the primary forecasts of interest were heads in pumping wells after a river was modified. The model was calibrated using manual trial-and-error approaches where a model's forecast quality was not related to how well it was calibrated. Here, we investigate whether tools widely available today obviate the shortcomings identified 30 years ago. A reconstructed version of the 1988 true model was tested using increasing parameter estimation sophistication. The parameter estimation demonstrated the inverse problem was non-unique because only head data were available for calibration. When a flux observation was included, current parameter estimation approaches were able to overcome all calibration and forecast issues noted in 1988. The best forecasts were obtained from a highly parameterized model that used pilot points for hydraulic conductivity and was constrained with soft knowledge. Like the 1988 results, however, the best calibrated model did not produce the best forecasts due to parameter overfitting. Finally, a computationally frugal linear uncertainty analysis demonstrated that the single-zone model was oversimplified, with only half of the forecasts falling within the calculated uncertainty bounds. Uncertainties from the highly parameterized models had all six forecasts within the calculated uncertainty. The current results outperformed those of the 1988 effort, demonstrating the value of quantitative parameter estimation and uncertainty analysis methods.

12.
In climate models, the land–atmosphere interactions are described numerically by land surface parameterization (LSP) schemes. The continuing improvement in realism in these schemes comes at the expense of the need to specify a large number of parameters that are either directly measured or estimated. Also, an emerging problem is whether the relationships used in LSPs are universal and globally applicable. One plausible approach to evaluate this is to first minimize uncertainty in model parameters by calibration. In this paper, we conduct a comprehensive analysis of some model diagnostics using a slightly modified version of the Simple Biosphere 3 model for a variety of biomes located mainly in the Amazon. First, the degree of influence of each individual parameter in simulating surface fluxes is identified. Next, we estimate parameters using a multi‐operator genetic algorithm applied in a multi‐objective context and evaluate simulations of energy and carbon fluxes against observations. Compared with the default parameter sets, these parameter estimates improve the partitioning of energy fluxes in forest and cropland sites and provide better simulations of daytime increases in assimilation of net carbon during the dry season at forest sites. Finally, a detailed assessment of the parameter estimation problem was performed by decomposing the mean squared error into its contributions to the total model uncertainty. Analysis of the total prediction uncertainty reveals that the parameter adjustments significantly improve reproduction of the mean and variability of the flux time series at all sites and generally remove seasonality of the errors but do not improve dynamical properties. Our results demonstrate that error decomposition provides a meaningful and intuitive way to understand differences in model performance. To make further advancements in the knowledge of these models, we encourage the LSP community to adopt similar approaches in the future. Copyright © 2012 John Wiley & Sons, Ltd.
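A short illustration of the mean-squared-error decomposition mentioned above, separating bias, variability and correlation terms; the flux series are synthetic, and the split shown is the standard bias/variance/correlation decomposition rather than the exact diagnostic of the paper.

```python
import numpy as np

# MSE = (mean error)^2 + (difference in std)^2 + 2*std_s*std_o*(1 - r)
rng = np.random.default_rng(6)
obs = 5.0 + 2.0 * np.sin(np.linspace(0, 20, 500)) + rng.normal(0, 0.5, 500)
sim = 0.9 * obs + 1.2 + rng.normal(0, 0.7, 500)          # a biased, noisier "simulation"

mse = np.mean((sim - obs) ** 2)
bias2 = (sim.mean() - obs.mean()) ** 2                    # mean (bias) error
var_term = (sim.std() - obs.std()) ** 2                   # error in simulated variability
corr_term = 2 * sim.std() * obs.std() * (1 - np.corrcoef(sim, obs)[0, 1])  # timing/phase error

print(f"MSE {mse:.3f} = bias^2 {bias2:.3f} + variability {var_term:.3f} + correlation {corr_term:.3f}")
```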

13.
In this paper we extend the generalized likelihood uncertainty estimation (GLUE) technique to estimate spatially distributed uncertainty in models conditioned against binary pattern data contained in flood inundation maps. Untransformed binary pattern data already have been used within GLUE to estimate domain‐averaged (zero‐dimensional) likelihoods, yet the pattern information embedded within such sources has not been used to estimate distributed uncertainty. Where pattern information has been used to map distributed uncertainty it has been transformed into a continuous function prior to use, which may introduce additional errors. To solve this problem we use here ‘raw’ binary pattern data to define a zero‐dimensional global performance measure for each simulation in a Monte Carlo ensemble. Thereafter, for each pixel of the distributed model we evaluate the probability that this pixel was inundated. This probability is then weighted by the measure of global model performance, thus taking into account how well a given parameter set performs overall. The result is a distributed uncertainty measure mapped over real space. The advantage of the approach is that it both captures distributed uncertainty and contains information on global likelihood that can be used to condition predictions of further events for which observed data are not available. The technique is applied to the problem of flood inundation prediction at two test sites representing different hydrodynamic conditions. In both cases, the method reveals the spatial structure in simulation uncertainty and simultaneously enables mapping of flood probability predicted by the model. Spatially distributed uncertainty analysis is shown to contain information over and above that available from global performance measures. Overall, the paper highlights the different types of information that may be obtained from mappings of model uncertainty over real and n‐dimensional parameter spaces. Copyright © 2002 John Wiley & Sons, Ltd.
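A toy sketch of the pixel-wise weighting described above, assuming a small synthetic grid, randomly generated "simulated" inundation patterns in place of hydraulic-model output, and an intersection-over-union score as the zero-dimensional global performance measure.

```python
import numpy as np

# Weight each run's binary flood map by its global performance, then average to
# obtain a spatially distributed flood-probability map.
rng = np.random.default_rng(7)
ny, nx, n_runs = 20, 30, 200
observed = np.zeros((ny, nx), bool); observed[:, :12] = True        # observed binary flood map

sims = np.zeros((n_runs, ny, nx), bool)
for k in range(n_runs):
    width = rng.integers(6, 18)                                      # each run floods a different extent
    sims[k, :, :width] = True

inter = (sims & observed).sum(axis=(1, 2))                           # zero-dimensional global measure per run:
union = (sims | observed).sum(axis=(1, 2))                           # intersection over union with the observed map
F = inter / union

weights = F / F.sum()                                                # how well each parameter set performs overall
prob_map = np.tensordot(weights, sims.astype(float), axes=1)         # per-pixel flood probability, shape (ny, nx)
print("flood probability at columns 5, 11, 15:", prob_map[0, [5, 11, 15]].round(2))
```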

14.
The Heihe river basin, the second largest inland river basin in China, has attracted increasing attention due to ever-growing water resources and eco-environmental problems. In this article, the SWAT (Soil and Water Assessment Tool; http://www.brc.tamus.edu/swat/ ) model was applied to the upper reaches of the basin for a better understanding of the hydrological processes over the watershed. Parameter uncertainty and its contribution to model simulation uncertainty are the main foci. In model calibration, aggregate parameters instead of the original SWAT parameters were used to reduce the computing effort. The Bayesian approach was employed for parameter estimation and uncertainty analysis because its posterior distribution provides not only parameter estimates but also uncertainty analysis without a normality assumption. The results indicated that: (1) the SWAT model performs satisfactorily in this watershed as a whole, although some low and high flows were under- or overestimated, particularly in dry (e.g. 1991) and wet (e.g. 1996) years; (2) none of the calibrated parameters was normally distributed (they were essentially positively or negatively skewed) and the parameter uncertainties were relatively small; and (3) the contributions of parameter uncertainty to model simulation uncertainty were relatively small. Copyright © 2009 John Wiley & Sons, Ltd.

15.
Physical properties of alluvial environments typically feature a high degree of anisotropy and are characterized by dynamic interactions between the surface and the subsurface. Hydrogeological models are often calibrated under the assumptions of isotropic hydraulic conductivity fields and steady-state conditions. We aim at understanding how these simplifications affect predictions of the water table using physically based models and advanced calibration and uncertainty analysis approaches based on singular value decomposition and Bayesian analysis. Specifically, we present an analysis of the information content provided by steady-state hydraulic data compared to transient data with respect to the estimation of aquifer and riverbed hydraulic properties. We show that assuming isotropy or fixed anisotropy may generate biases both in the estimation of aquifer and riverbed parameters as well as in the predictive uncertainty of the water table. We further demonstrate that the information content provided by steady-state hydraulic heads is insufficient to jointly estimate the aquifer anisotropy together with the aquifer and riverbed hydraulic conductivities and that transient data can help to reduce the predictive uncertainty to a greater extent. The outcomes of the synthetic analysis are applied to the calibration of a dynamic and anisotropic alluvial aquifer in Switzerland (the Rhône River). The results of the synthetic and real world modeling and calibration exercises documented herein provide insight on future data acquisition as well as modeling and calibration strategies for these environments. They also provide an incentive for evaluation and estimation of commonly made simplifying assumptions in order to prevent underestimation of the predictive uncertainty.

16.
17.
This paper proposes a new approach to the problem of hydrological model calibration in ungauged basins. Satellite radar altimetric observations of river water level at the basin outlet are used to calibrate the model as a surrogate for streamflow data. To shift the calibration objective, the hydrological model is coupled with a hydraulic model describing the relation between streamflow and water stage. The methodology is illustrated by a case study in the Upper Mississippi Basin using TOPEX/Poseidon (T/P) satellite data. The generalized likelihood uncertainty estimation (GLUE) is employed for model calibration and uncertainty analysis. We found that, even without any streamflow information to regulate model behavior, the calibrated hydrological model can make fairly reasonable streamflow estimates. In order to illustrate the degree of additional uncertainty associated with shifting the calibration objective, and to identify its sources, the posterior distributions of hydrological parameters derived from calibration based on T/P data, streamflow data and T/P data with fixed hydraulic parameters are compared. The results show that the main source is model parameter uncertainty, while the contribution of remote sensing data uncertainty is minor. Furthermore, the influence of removing high-error satellite observations on streamflow estimation is also examined. Provided that the calibration data have sufficient temporal coverage, such data screening can eliminate some unrealistic parameter sets from the behavioral group. The study contributes to improving streamflow estimation in ungauged basins and to evaluating the value of remote sensing in hydrological modeling. Copyright © 2011 John Wiley & Sons, Ltd.

18.
Water level time series from groundwater production wells offer a transient dataset that can be used to estimate aquifer properties in areas with active groundwater development. This article describes a new parameter estimation method to infer aquifer properties from such datasets. Specifically, the method analyzes long‐term water level measurements from multiple, interacting groundwater production wells and relies on temporal water level derivatives to estimate the aquifer transmissivity and storativity. Analytically modeled derivatives are compared to derivatives calculated directly from the observed water level data; an optimization technique is used to identify best‐fitting transmissivity and storativity values that minimize the difference between modeled and observed derivatives. We demonstrate how the consideration of derivative (slope) behavior eliminates uncertainty associated with static water levels and well‐loss coefficients, enabling effective use of water level data from groundwater production wells. The method is applied to time‐series data collected over a period of 6 years from a municipal well field operating in the Denver Basin, Colorado (USA). The estimated aquifer properties are shown to be consistent with previously published values. The parameter estimation method is further tested using synthetic water level time series generated with a numerical model that incorporates the style of heterogeneity that occurs in the Denver Basin sandstone aquifers.
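A simplified sketch of the derivative-matching idea, assuming a Theis response to a single neighbouring production well at a known distance and arbitrarily chosen aquifer properties; in practice the observed slopes would come from differencing measured water levels, which is why static levels and well-loss constants drop out.

```python
import numpy as np
from scipy.optimize import least_squares

# Theis interference from a neighbouring production well at distance r; only the
# temporal slope dh/dt is matched, so static levels and well-loss constants cancel.
Q, r = 1000.0, 500.0                                  # pumping rate (m3/d), well spacing (m)

def dsdt(t, T, S):                                    # time derivative of the Theis drawdown (m/d)
    u = r**2 * S / (4.0 * T * t)
    return Q / (4.0 * np.pi * T) * np.exp(-u) / t

rng = np.random.default_rng(8)
t = np.linspace(0.2, 5.0, 80)                         # days since the neighbour started pumping
dhdt_obs = -dsdt(t, 50.0, 5e-4) + rng.normal(0.0, 0.01, t.size)   # "measured" water-level slopes

def residuals(p):
    T, S = p
    return dhdt_obs + dsdt(t, T, S)                   # dh/dt = -ds/dt while the neighbour pumps

sol = least_squares(residuals, x0=[100.0, 1e-3],
                    bounds=([1.0, 1e-6], [1e4, 1e-1]), x_scale=[100.0, 1e-3])
print("estimated T (m2/d) and S:", sol.x)             # should recover roughly 50 and 5e-4
```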

19.
The specific objective of this paper is to propose a new flood frequency analysis method that considers both the uncertainty of probability distribution selection (model uncertainty) and the uncertainty of parameter estimation (parameter uncertainty). Based on Bayesian theory, the sampling distribution of quantiles (design floods) coupling these two kinds of uncertainty is derived, so that not only point estimates but also confidence intervals of the quantiles can be provided. Markov Chain Monte Carlo is adopted to overcome the difficulty of computing the integrals required to estimate the sampling distribution. As an example, the proposed method is applied to flood frequency analysis at a gauge on the Huai River, China. It is shown that approaches considering only model uncertainty or only parameter uncertainty cannot fully account for the uncertainties in quantile estimation; instead, a method coupling the two should be employed. Furthermore, the proposed Bayesian-based method provides not only various quantile estimators but also a quantitative assessment of the uncertainties in flood frequency analysis.
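A rough sketch of combining distribution-choice and parameter uncertainty in a design-flood estimate, using a parametric bootstrap instead of the paper's MCMC sampler and AIC weights as a stand-in for posterior model probabilities; the annual maximum series and both candidate distributions are illustrative only.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
ams = stats.gumbel_r.rvs(loc=800, scale=250, size=45, random_state=rng)   # synthetic annual maxima
T = 100; p = 1 - 1 / T
candidates = {"gumbel": stats.gumbel_r, "lognormal": stats.lognorm}

# Model weights from AIC (a crude surrogate for posterior model probabilities)
aic, fits = {}, {}
for name, dist in candidates.items():
    params = dist.fit(ams)
    ll = np.sum(dist.logpdf(ams, *params))
    aic[name] = 2 * len(params) - 2 * ll
    fits[name] = params
w = np.array([np.exp(-0.5 * (aic[n] - min(aic.values()))) for n in candidates])
w /= w.sum()

# Parameter uncertainty by parametric bootstrap, mixed over the candidate models
q_samples = []
for _ in range(2000):
    name = rng.choice(list(candidates), p=w)
    dist = candidates[name]
    resample = dist.rvs(*fits[name], size=ams.size, random_state=rng)
    q_samples.append(dist.ppf(p, *dist.fit(resample)))
q_samples = np.array(q_samples)
print("100-yr flood: median %.0f, 90%% interval (%.0f, %.0f)"
      % (np.median(q_samples), *np.percentile(q_samples, [5, 95])))
```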

20.
A comparison of different methods for estimating T-year events is presented, all based on the Extreme Value Type I (EV1) distribution. Series of annual maximum floods from ten gauging stations on the South Island of New Zealand have been used. Different methods of predicting the 100-year event and the associated uncertainty have been applied: at-site estimation and regional index-flood estimation, with and without accounting for intersite correlation, using either the method of moments or the method of probability weighted moments for parameter estimation. Furthermore, estimation at ungauged sites was considered, applying either a log-linear relationship between the at-site mean annual flood and catchment characteristics or a direct log-linear relationship between 100-year events and catchment characteristics. Comparison of the results shows that the existence of at-site measurements significantly diminishes the prediction uncertainty and that the presence of intersite correlation tends to increase the uncertainty. A simulation study revealed that in regional index-flood estimation the method of probability weighted moments is preferable to the method of moments with regard to bias and RMSE.
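A short sketch of at-site 100-year estimation under the EV1 (Gumbel) distribution, comparing the method-of-moments and probability-weighted-moments estimators on a synthetic annual maximum series; the regional index-flood and intersite-correlation aspects of the paper are not reproduced.

```python
import numpy as np

# EV1 (Gumbel): x_T = xi + alpha * y_T, with y_T = -ln(-ln(1 - 1/T)).
rng = np.random.default_rng(10)
xi_true, alpha_true = 300.0, 80.0
u = rng.uniform(size=40)
ams = xi_true - alpha_true * np.log(-np.log(u))          # 40 years of synthetic annual maxima
gamma = 0.5772                                           # Euler-Mascheroni constant
T = 100.0
y_T = -np.log(-np.log(1.0 - 1.0 / T))                    # Gumbel reduced variate for T = 100

# Method of moments
alpha_mom = np.sqrt(6.0) * ams.std(ddof=1) / np.pi
xi_mom = ams.mean() - gamma * alpha_mom

# Probability-weighted moments
x = np.sort(ams)
n = x.size
b0 = x.mean()
b1 = np.sum((np.arange(1, n + 1) - 1) / (n - 1) * x) / n # unbiased estimate of beta_1
alpha_pwm = (2.0 * b1 - b0) / np.log(2.0)
xi_pwm = b0 - gamma * alpha_pwm

print("true Q100:", round(xi_true + alpha_true * y_T, 1))
print("MoM  Q100:", round(xi_mom + alpha_mom * y_T, 1))
print("PWM  Q100:", round(xi_pwm + alpha_pwm * y_T, 1))
```

Repeating this over many synthetic series is the kind of simulation study used to compare bias and RMSE of the two estimators.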
