Similar articles
 20 similar articles found (search time: 46 ms)
1.
This study evaluates alternative groundwater models with different recharge and geologic components at the northern Yucca Flat area of the Death Valley Regional Flow System (DVRFS), USA. Recharge over the DVRFS has been estimated using five methods, and five geological interpretations are available for the northern Yucca Flat area. Combining the recharge and geological components with additional modeling components that represent other hydrogeological conditions yields a total of 25 groundwater flow models. As all the models are plausible given the available data and information, evaluating model uncertainty becomes inevitable. At the same time, hydraulic parameters (e.g., hydraulic conductivity) are uncertain in each model, giving rise to parametric uncertainty. Propagation of the uncertainty in the models and model parameters through groundwater modeling causes predictive uncertainty in model predictions (e.g., hydraulic head and flow). Parametric uncertainty within each model is assessed using Monte Carlo simulation, and model uncertainty is evaluated using model averaging. Two model-averaging techniques (based on information criteria and on GLUE) are discussed. This study shows that the contribution of model uncertainty to predictive uncertainty is significantly larger than that of parametric uncertainty. For the recharge and geological components, uncertainty in the geological interpretations has a more significant effect on model predictions than uncertainty in the recharge estimates. In addition, weighted residuals vary more across the different geological models than across the different recharge models. Most of the calibrated observations are not important for discriminating between the alternative models, because their weighted residuals vary only slightly from one model to another.
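The interplay of parametric and model uncertainty described above can be sketched numerically. The following is a minimal illustration (not the DVRFS model): a toy head predictor stands in for each flow model, parametric uncertainty is propagated by Monte Carlo, and information-criterion weights w_i proportional to exp(-dIC/2) combine the alternative models. All functions and numbers are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def predict_head(log_k, recharge):
    # Toy stand-in for a groundwater flow model: head rises with
    # recharge and falls with log-conductivity (hypothetical numbers).
    return 100.0 + 20.0 * recharge - 5.0 * log_k

# Parametric uncertainty: Monte Carlo over uncertain log-K in each model.
recharge_models = [0.8, 1.0, 1.2]          # alternative recharge estimates
ensembles = [predict_head(rng.normal(1.0, 0.4, 5000), r)
             for r in recharge_models]

# Model uncertainty: information-criterion weights w_i ~ exp(-dIC/2).
ic = np.array([210.0, 209.0, 211.0])       # hypothetical AIC/BIC values
weights = np.exp(-(ic - ic.min()) / 2.0)
weights /= weights.sum()

# Decompose predictive variance into within-model (parametric) and
# between-model contributions.
means = np.array([e.mean() for e in ensembles])
within = float(np.sum(weights * np.array([e.var() for e in ensembles])))
between = float(np.sum(weights * (means - weights @ means) ** 2))
```

With these hypothetical numbers, the between-model term dominates the within-model term, echoing the qualitative finding of the abstract.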

2.
Watershed water quality models are increasingly used in management. However, simulations by such complex models often involve significant uncertainty, especially for non-conventional pollutants, which are often poorly monitored. This study first proposed an integrated framework for watershed water quality modeling. Within this framework, the Probabilistic Collocation Method (PCM) was then applied to a WARMF model of diazinon pollution to assess the modeling uncertainty. Based on PCM, a global sensitivity analysis method named PCM-VD (VD standing for variance decomposition) was also developed, which quantifies the variance contribution of each uncertain parameter. The study results validated the applicability of PCM and PCM-VD to the WARMF model. The PCM-based approach is much more efficient, in terms of computational time, than conventional Monte Carlo methods. It was also demonstrated that analysis using the PCM-based approach can provide insights into data collection, model structure improvement, and management practices. It was concluded that the PCM-based approach can play an important role in watershed water quality modeling, as an alternative to conventional Monte Carlo methods for accounting for parametric uncertainty and uncertainty propagation.
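PCM-VD itself requires polynomial chaos machinery, but the quantity it targets, the first-order variance contribution of each uncertain parameter, can be illustrated with a plain Monte Carlo "pick-freeze" Sobol estimator on a toy response function. The function and input ranges below are hypothetical, not WARMF.

```python
import numpy as np

rng = np.random.default_rng(1)

def response(x):
    # Toy water-quality response: one dominant linear input (x1),
    # plus weaker nonlinear inputs (hypothetical, not WARMF).
    return np.sin(x[:, 0]) + 2.0 * x[:, 1] + 0.3 * x[:, 2] ** 2

def first_order_indices(f, dim, n=20000):
    """First-order Sobol indices via the pick-freeze MC estimator."""
    a = rng.uniform(-np.pi, np.pi, size=(n, dim))
    b = rng.uniform(-np.pi, np.pi, size=(n, dim))
    ya, yb = f(a), f(b)
    s = np.empty(dim)
    for i in range(dim):
        ab = b.copy()
        ab[:, i] = a[:, i]            # freeze input i at A's value
        s[i] = np.mean(ya * (f(ab) - yb)) / ya.var()
    return s

s = first_order_indices(response, 3)   # variance share of each input
```

The dominant linear input receives by far the largest index, which is the kind of ranking PCM-VD produces at a fraction of the cost once a surrogate is available.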

3.
The groundwater interbasin flow, Qy, from the north of Yucca Flat into Yucca Flat simulated using the Death Valley Regional Flow System (DVRFS) model greatly exceeds assessments obtained using other approaches. This study aimed to understand the reasons for the overestimation and to examine whether the Qy estimate can be reduced. The two problems were tackled from the angle of model uncertainty by considering six models revised from the DVRFS model with different recharge components and hydrogeological frameworks. They were also tackled from the angle of parametric uncertainty for each model by first conducting Morris sensitivity analysis to identify important parameters and then conducting Monte Carlo simulations for those parameters. The uncertainty analysis is general and suitable for tackling similar problems; the Morris sensitivity analysis has been utilized to date in only a limited number of regional groundwater modeling studies. The simulated Qy values were evaluated using three kinds of calibration data (i.e., hydraulic head observations, discharge estimates, and constant-head boundary flow estimates). The evaluation results indicate that, within the current DVRFS modeling framework, the Qy estimate can only be reduced to about half of the original estimate without severely deteriorating the goodness-of-fit to the calibration data. They also indicate that it is necessary to develop a new hydrogeological framework to produce new flow patterns in the DVRFS model. The issues of hydrogeology and boundary flow are being addressed in a new version of the DVRFS model planned for release by the U.S. Geological Survey.
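The Morris method mentioned above screens parameters by averaging absolute "elementary effects", i.e., one-at-a-time finite differences taken from random base points. A minimal radial one-at-a-time variant (a simplification of the full trajectory design, applied to a hypothetical four-parameter toy model) looks like this:

```python
import numpy as np

rng = np.random.default_rng(2)

def model(x):
    # Toy head response to 4 scaled parameters in [0, 1] (hypothetical).
    return 5.0 * x[0] + 0.5 * x[1] + 2.0 * x[0] * x[2] + 0.1 * x[3]

def morris_mu_star(model, dim, n_base=50, delta=0.25):
    """Morris screening: mean absolute elementary effect per parameter."""
    effects = np.zeros((n_base, dim))
    for t in range(n_base):
        x = rng.uniform(0.0, 1.0 - delta, size=dim)  # random base point
        y0 = model(x)
        for i in range(dim):          # one-at-a-time perturbation
            xp = x.copy()
            xp[i] += delta
            effects[t, i] = (model(xp) - y0) / delta
    return np.abs(effects).mean(axis=0)

mu_star = morris_mu_star(model, 4)    # mu* ranks parameter influence
```

Parameters with small mu* (here x3) can be fixed at nominal values, so the subsequent Monte Carlo runs need only sample the influential ones.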

4.
Water balance variables were monitored in a farmed Mediterranean catchment characterized by a dense ditch network to allow separate estimation of the diffuse and concentrated recharge terms during flood events. The 27 ha central part of the catchment was equipped with (i) rain gauges, (ii) ditch gauge stations, (iii) piezometers, (iv) neutron probes, and (v) an eddy covariance mast including a 3D sonic anemometer and a fast hygrometer. The water balance was calculated for two autumnal rain and flood events. We also estimated the uncertainty of this approach with Monte Carlo simulations. Results show that, although the ditch area represents only 6% of the total study area, concentrated recharge appeared to be the main source of groundwater recharge. Indeed, it accounted for 40–50% of the total groundwater recharge for autumnal events, which are the major annual recharge events. This indicates that both concentrated and diffuse recharge should be taken into account in any hydrological modeling approach for Mediterranean catchments. It also means that, since ditches collect overland flow that is often heavily contaminated by chemicals, they may be places where groundwater contamination is likely to occur. The uncertainty analysis indicates that recharge estimates based on the water balance exhibit large uncertainty ranges. Nevertheless, the Monte Carlo simulations showed that concentrated recharge was higher than expected given the ditch area.
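The Monte Carlo propagation of measurement uncertainty through a water balance can be sketched as follows. The term values and error standard deviations are hypothetical, not the field values, and diffuse recharge is computed as the balance residual:

```python
import numpy as np

rng = np.random.default_rng(10)

# Event water balance (mm) with assumed measurement uncertainties:
# diffuse recharge R = P - ET - dS - Q (all values hypothetical).
n = 20000
P  = rng.normal(80.0, 4.0, size=n)     # rainfall
ET = rng.normal(5.0, 1.0, size=n)      # evapotranspiration (eddy covariance)
dS = rng.normal(20.0, 5.0, size=n)     # soil water storage change
Q  = rng.normal(25.0, 3.0, size=n)     # ditch outflow (concentrated term)

R = P - ET - dS - Q                    # diffuse recharge as residual

R_mean, R_std = R.mean(), R.std()
# For independent inputs, the term errors add in quadrature.
expected_std = np.sqrt(4.0**2 + 1.0**2 + 5.0**2 + 3.0**2)
```

The residual inherits every term's error, which is why balance-based recharge estimates carry the large uncertainty ranges the abstract reports.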

5.
We designed and evaluated a “tube seepage meter” for making point measurements of vertical seepage rates (q), collecting groundwater samples, and estimating vertical hydraulic conductivity (K) in streambeds. Laboratory testing in artificial streambeds showed that seepage rates from the tube seepage meter agreed well with expected values. Field testing of the tube seepage meter in a sandy-bottom stream with a mean seepage rate of about 0.5 m/day also agreed well with Darcian estimates (vertical hydraulic conductivity times head gradient) when averaged over multiple measurements. The uncertainties in q and K were evaluated with a Monte Carlo method and are typically 20% and 60%, respectively, for field data; they depend on the magnitude of the hydraulic gradient and the uncertainty in head measurements. The primary advantages of the tube seepage meter are its small footprint, its concurrent and colocated assessments of q and K, and that it can also be configured as a self-purging groundwater-sampling device.
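A Monte Carlo treatment of the Darcian estimate q = K times the head gradient can be sketched with hypothetical measurement uncertainties chosen to give a mean seepage rate near the 0.5 m/day mentioned above:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical measurements: vertical hydraulic conductivity K (m/day)
# and vertical head gradient, each with assumed measurement error.
n = 10000
K = rng.normal(4.0, 0.8, size=n)         # m/day, ~20% relative error
grad = rng.normal(0.125, 0.025, size=n)  # dimensionless head gradient

q = K * grad                             # Darcian seepage rate, m/day

q_mean = q.mean()
q_rel_uncert = q.std() / q_mean          # relative uncertainty of q
```

Because q is a product, its relative uncertainty is roughly the quadrature sum of the relative errors in K and the gradient, which is why small head-measurement errors matter so much at low gradients.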

6.
Pump-and-treat systems can prevent the migration of groundwater contaminants, and candidate systems are typically evaluated with groundwater models. Such models should be rigorously assessed to determine their predictive capabilities, and numerous tools and techniques for model assessment are available. While various assessment methodologies (e.g., model calibration, uncertainty analysis, and Bayesian inference) are well established for groundwater modeling, this paper calls attention to an alternative assessment technique known as screening-level sensitivity analysis (SLSA). SLSA can quickly quantify first-order (i.e., main-effects) measures of parameter influence in connection with various model outputs. Subsequent comparisons of parameter influence with respect to calibration vs. prediction outputs can suggest gaps in model structure and/or data. Thus, while SLSA has received little attention in the context of groundwater modeling and remedial system design, it can nonetheless serve as a useful and computationally efficient tool for preliminary model assessment. To illustrate the use of SLSA in the context of designing groundwater remediation systems, four SLSA techniques were applied to a hypothetical, yet realistic, pump-and-treat case study to determine the relative influence of six hydraulic conductivity parameters. The considered methods were: Taguchi design-of-experiments (TDOE); Monte Carlo statistical independence (MCSI) tests; average composite scaled sensitivities (ACSS); and elementary effects sensitivity analysis (EESA). In terms of performance, the various methods identified the same parameters as being the most influential for a given simulation output. Furthermore, results indicate that the background hydraulic conductivity (KBK) is important for predicting system performance, but calibration outputs are insensitive to this parameter. The observed insensitivity is attributed to a nonphysical specified-head boundary condition used in the model formulation, which effectively “staples” head values located within the conductivity zone. Thus, potential strategies for improving model predictive capabilities include additional data collection targeting the KBK parameter and/or revision of the model structure to reduce the influence of the specified-head boundary.

7.
Data assimilation is widely used to improve flood forecasting capability, especially through parameter inference, which requires statistical information on the uncertain input parameters (upstream discharge, friction coefficient) as well as on the variability of the water level and its sensitivity with respect to the inputs. For a particle filter or ensemble Kalman filter, stochastically estimating probability density functions and covariance matrices from Monte Carlo random sampling requires a large ensemble of model evaluations, limiting their use in real-time applications. To tackle this issue, fast surrogate models based on polynomial chaos and Gaussian processes can be used to represent the spatially distributed water level in place of solving the shallow water equations. This study investigates the use of these surrogates to estimate probability density functions and covariance matrices at a reduced computational cost and without loss of accuracy, with a view to ensemble-based data assimilation. The study focuses on 1-D steady-state flow simulated with MASCARET over the Garonne River (south-west France). Results show that both surrogates achieve performance similar to Monte Carlo random sampling, but for a much smaller computational budget; a few MASCARET simulations (on the order of 10–100) are sufficient to accurately retrieve covariance matrices and probability density functions all along the river, even where the flow dynamics are more complex due to heterogeneous bathymetry. This paves the way for the design of surrogate strategies suitable for representing unsteady open-channel flows in data assimilation.
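The surrogate idea (fit a cheap functional approximation from a handful of model runs, then compute the expensive ensemble statistics on the surrogate) can be sketched with an ordinary polynomial fit standing in for the polynomial chaos or Gaussian process surrogate. The "model" below is a hypothetical stage-friction relation, not MASCARET:

```python
import numpy as np

rng = np.random.default_rng(4)

def water_level_proxy(ks):
    # Hypothetical stand-in for a hydraulic model run: water level as a
    # smooth nonlinear function of the Strickler friction coefficient Ks.
    return 10.0 + 50.0 / ks + 0.002 * ks ** 2

# Fit a cheap surrogate from only 20 "model runs".
ks_train = np.linspace(20.0, 50.0, 20)
coef = np.polyfit(ks_train, water_level_proxy(ks_train), deg=4)
surrogate = np.poly1d(coef)

# Large-ensemble statistics now cost only surrogate evaluations.
ks_mc = rng.uniform(20.0, 50.0, size=100000)
h_mc = surrogate(ks_mc)
h_mean, h_std = h_mc.mean(), h_mc.std()

# The surrogate tracks the true model closely on the training range.
max_err = np.max(np.abs(surrogate(ks_train) - water_level_proxy(ks_train)))
```

A 100,000-member ensemble on the surrogate costs essentially nothing, while only 20 expensive model runs were needed, mirroring the order-of-10-100 budget reported in the abstract.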

8.
Characterization of a groundwater contaminant source using a Bayesian method (cited by 2: 1 self-citation, 1 by others)
Contaminant source identification in groundwater systems is critical for implementing remediation strategies, including gathering further samples for analysis as well as implementing and evaluating different remediation plans. Such problems are usually solved with the aid of groundwater modeling subject to considerable uncertainty, e.g., uncertainty in the hydraulic conductivity, measurement variance, and model structure error. Monte Carlo simulation of the flow model propagates the input uncertainty onto the model predictions of concentration measurements at monitoring sites, and a Bayesian approach provides the means to update the estimates. This paper presents an application of a dynamic framework coupled with a three-dimensional groundwater modeling scheme to contamination source identification in groundwater. Markov chain Monte Carlo (MCMC) is applied to infer the possible location and magnitude of the contamination source. Uncertainty in the heterogeneous hydraulic conductivity field is explicitly considered in evaluating the likelihood function. Unlike other inverse-problem approaches that provide a single, possibly incorrect, solution, the MCMC algorithm provides probability distributions over the estimated parameters. Results from this algorithm offer a probabilistic inference of the location and concentration of the released contamination. The convergence analysis of the MCMC chains demonstrates the effectiveness of the proposed algorithm. Directions for extending this study are also discussed.
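A random-walk Metropolis sampler, the simplest member of the MCMC family used above, can be sketched for a toy 1-D source identification problem. The plume model, well positions, and noise levels are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(5)

wells = np.array([2.0, 5.0, 8.0])          # monitoring well positions
true_loc, true_mag = 4.0, 10.0             # unknown source to recover

def forward(loc, mag):
    # Toy transport model: Gaussian plume decaying with distance.
    return mag * np.exp(-0.5 * (wells - loc) ** 2)

obs = forward(true_loc, true_mag) + rng.normal(0.0, 0.1, wells.size)

def log_like(loc, mag, sigma=0.1):
    r = obs - forward(loc, mag)
    return -0.5 * np.sum((r / sigma) ** 2)

# Random-walk Metropolis over (source location, source magnitude).
chain = np.empty((20000, 2))
x = np.array([5.0, 5.0])                   # deliberately wrong start
ll = log_like(*x)
for k in range(len(chain)):
    prop = x + rng.normal(0.0, [0.05, 0.3])
    ll_prop = log_like(*prop)
    if np.log(rng.uniform()) < ll_prop - ll:   # Metropolis acceptance
        x, ll = prop, ll_prop
    chain[k] = x

loc_est, mag_est = chain[5000:].mean(axis=0)   # posterior mean, post burn-in
```

The retained chain gives a full posterior over (location, magnitude) rather than a single point estimate, which is the practical advantage the abstract emphasizes.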

9.
This study introduces Bayesian model averaging (BMA) to deal with model structure uncertainty in groundwater management decisions. A robust optimized policy should take into account model parameter uncertainty as well as uncertainty in imprecise model structure. Owing to the limited amount of groundwater head data and hydraulic conductivity data, multiple simulation models are developed based on different head boundary condition values and semivariogram models of hydraulic conductivity. Instead of selecting the best simulation model, a variance-window-based BMA method is introduced into the management model to utilize all simulation models to predict chloride concentration. Given different semivariogram models, the spatially correlated hydraulic conductivity distributions are estimated by the generalized parameterization (GP) method, which combines the Voronoi zones and the ordinary kriging (OK) estimates. The model weights of BMA are estimated by the Bayesian information criterion (BIC) and the variance window in the maximum likelihood estimation. The simulation models are then weighted to predict chloride concentrations within the constraints of the management model. The methodology is implemented to manage saltwater intrusion in the “1,500-foot” sand aquifer in the Baton Rouge area, Louisiana. The management model aims to obtain optimal joint operations of the hydraulic barrier system and the saltwater extraction system to mitigate saltwater intrusion. A genetic algorithm (GA) is used to obtain the optimal injection and extraction policies. Using the BMA predictions, higher injection rates and pumping rates are needed to cover more constraint violations, which would not be covered if a single best model were used.
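The BIC-to-weight step of BMA is compact enough to show directly. The variance-window refinement is omitted here, and the BIC values and chloride predictions are hypothetical:

```python
import numpy as np

# Hypothetical BIC values for four alternative simulation models.
bic = np.array([132.4, 130.1, 135.0, 130.9])

# BMA model weights from BIC: w_i proportional to exp(-dBIC_i / 2).
delta = bic - bic.min()
weights = np.exp(-delta / 2.0)
weights /= weights.sum()

# Hypothetical chloride predictions (mg/L) from each model.
chloride = np.array([240.0, 255.0, 230.0, 250.0])
bma_prediction = float(weights @ chloride)
```

Because the averaged prediction pools all plausible models, a management constraint must hold against the whole weighted spread, not just the single best model's prediction, which is why the optimized pumping rates increase.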

10.
Today, in many countries, there exist sites with groundwater contaminated as a result of inappropriate handling or disposal of hazardous materials or wastes. Numerical modeling of such sites is an important tool for correctly predicting the spreading of the contamination plume and assessing the environmental risks associated with the site. Many uncertainties are associated with some of the parameters and initial conditions of such environmental numerical models, and statistical techniques are useful for dealing with them. This paper describes the methods of uncertainty propagation and global sensitivity analysis that were applied to a numerical model of radionuclide migration in a sandy aquifer in the area of the RRC “Kurchatov Institute” radwaste disposal site in Moscow, Russia. We consider 20 uncertain input parameters of the model and 20 output variables (contaminant concentrations in the observation wells predicted by the model for the end of 2010). Monte Carlo simulations allow calculating the uncertainty in the output values and analyzing the linearity and monotonicity of the relations between input and output variables. For the non-monotonic relations, sensitivity analyses are classically done with the Sobol sensitivity indices. The originality of this study is the use of modern surrogate models (called response surfaces), namely boosted regression trees constructed for each output variable, to calculate the Sobol indices by the Monte Carlo method. It is thus shown that the most influential parameters of the model are the distribution coefficients and the infiltration rate in the zone of strong pipe leaks on the site. Improving these parameters would considerably reduce the model prediction uncertainty.

11.
The paper discusses the performance and robustness of the Bayesian (probabilistic) approach to seismic tomography enhanced by the numerical Monte Carlo sampling technique. The approach is compared with two other popular techniques, namely the damped least-squares (LSQR) method and the general optimization approach. The theoretical considerations are illustrated by an analysis of seismic data from the Rudna (Poland) copper mine. Contrary to the LSQR and optimization techniques, the Bayesian approach allows for the construction not only of the “best-fitting” model of the sought velocity distribution but also of other estimators, for example the average model, which is often expected to be a more robust estimator than the maximum likelihood solution. We demonstrate that using the Markov chain Monte Carlo sampling technique within the Bayesian approach opens up the possibility of analyzing tomography imaging uncertainties with minimal additional computational effort compared to the robust optimization approach. On the basis of the considered example it is concluded that the Monte Carlo based Bayesian approach offers new possibilities for robust and reliable tomography imaging.

12.
Sites with a limited overburden over a stiff basement are of particular relevance for seismic site response. The characterization of such stratigraphies by means of surface wave methods poses some difficulties in interpretation. Indeed, the presence of sharp seismic contrasts between the sediments and the shallow bedrock is likely to make higher modes prominent in the surface wave apparent dispersion curve, which must be properly taken into account in order to provide reliable results. In this study a Monte Carlo algorithm based on a multimodal misfit function has been used for the inversion of experimental dispersion curves. Case histories related to the characterization of stations of the Italian accelerometric network are reported. Spectral ratios and amplification functions associated with each site are also evaluated to provide an independent benchmark test. The results show the robustness of the inversion method in such non-trivial conditions and the possibility of estimating the uncertainty related to solution non-uniqueness.

13.
During the past decades much progress has been made in the development of computer-based methods for parameter and predictive uncertainty estimation of hydrologic models. The goal of this paper is twofold. As part of this special anniversary issue, we first briefly review the most important historical developments in hydrologic model calibration and uncertainty analysis that have led to current perspectives. We then introduce the theory, concepts, and simulation results of a novel data assimilation scheme for joint inference of model parameters and state variables. This Particle-DREAM method combines the strengths of sequential Monte Carlo sampling and Markov chain Monte Carlo simulation and is especially designed for the treatment of forcing, parameter, model structural, and calibration data errors. Two different variants of Particle-DREAM are presented to satisfy assumptions regarding the temporal behavior of the model parameters. Simulation results using a 40-dimensional atmospheric “toy” model, the Lorenz attractor and a rainfall–runoff model show that the two variants, P-DREAM(VP) and P-DREAM(IP), require far fewer particles than current state-of-the-art filters to closely track the evolving target distribution of interest, and provide important insights into the information content of discharge data and the non-stationarity of model parameters. Our development follows formal Bayes, yet Particle-DREAM and its variants readily accommodate hydrologic signatures, informal likelihood functions or other (in)sufficient statistics if those better represent the salient features of the calibration data and simulation model used.
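The sequential Monte Carlo half of such a scheme can be illustrated by a plain bootstrap (SIR) particle filter on a one-state toy storage model (a much-simplified relative of Particle-DREAM, with hypothetical coefficients and noise levels):

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy storage model S_t = a*S_{t-1} + forcing_t + process noise,
# observed with error (all coefficients hypothetical).
n_part, n_steps, a = 500, 50, 0.8
forcing = rng.uniform(0.0, 2.0, n_steps)

truth = np.zeros(n_steps)
for t in range(1, n_steps):
    truth[t] = a * truth[t - 1] + forcing[t] + rng.normal(0.0, 0.1)
obs = truth + rng.normal(0.0, 0.2, n_steps)

# Bootstrap particle filter: propagate, weight, resample.
particles = rng.normal(0.0, 1.0, n_part)
est = np.zeros(n_steps)
for t in range(1, n_steps):
    particles = a * particles + forcing[t] + rng.normal(0.0, 0.1, n_part)
    w = np.exp(-0.5 * ((obs[t] - particles) / 0.2) ** 2)
    w /= w.sum()
    particles = particles[rng.choice(n_part, n_part, p=w)]
    est[t] = particles.mean()

rmse = float(np.sqrt(np.mean((est[1:] - truth[1:]) ** 2)))
```

Particle-DREAM improves on this basic filter by, among other things, using DREAM-style MCMC proposals so that far fewer particles are needed; the bootstrap version above shows only the propagate-weight-resample skeleton.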

14.
The goal of quantile regression is to estimate conditional quantiles for specified values of quantile probability using linear or nonlinear regression equations. These estimates are prone to “quantile crossing”, where regression predictions for different quantile probabilities do not increase as probability increases. In the context of the environmental sciences, this could, for example, lead to estimates of the magnitude of a 10-year return period rainstorm that exceed the 20-year storm, or similar nonphysical results. This problem, as well as the potential for overfitting, is exacerbated for small to moderate sample sizes and for nonlinear quantile regression models. As a remedy, this study introduces a novel nonlinear quantile regression model, the monotone composite quantile regression neural network (MCQRNN), that (1) simultaneously estimates multiple non-crossing, nonlinear conditional quantile functions; (2) allows for optional monotonicity, positivity/non-negativity, and generalized additive model constraints; and (3) can be adapted to estimate standard least-squares regression and non-crossing expectile regression functions. First, the MCQRNN model is evaluated on synthetic data from multiple functions and error distributions using Monte Carlo simulations. MCQRNN outperforms the benchmark models, especially for non-normal error distributions. Next, the MCQRNN model is applied to real-world climate data by estimating rainfall Intensity–Duration–Frequency (IDF) curves at locations in Canada. IDF curves summarize the relationship between the intensity and occurrence frequency of extreme rainfall over storm durations ranging from minutes to a day. Because annual maximum rainfall intensity is a non-negative quantity that should increase monotonically as the occurrence frequency and storm duration decrease, monotonicity and non-negativity constraints are key constraints in IDF curve estimation. 
In comparison to standard QRNN models, the ability of the MCQRNN model to incorporate these constraints, in addition to non-crossing, leads to more robust and realistic estimates of extreme rainfall.
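Quantile regression and the crossing problem can both be seen in miniature with the pinball loss: for a constant predictor the loss is minimized at the empirical tau-quantile, and estimates for increasing tau must not cross. The skewed sample below is synthetic (Gumbel, loosely evoking annual rainfall maxima), not the Canadian IDF data:

```python
import numpy as np

rng = np.random.default_rng(7)

def pinball_loss(y, g, tau):
    """Quantile ('pinball') loss of constant predictor g at level tau."""
    e = y - g
    return np.mean(np.maximum(tau * e, (tau - 1.0) * e))

# Synthetic skewed sample standing in for annual maximum rainfall.
y = rng.gumbel(loc=20.0, scale=5.0, size=5000)

# Minimize the pinball loss over a grid of constant predictors.
grid = np.linspace(0.0, 60.0, 601)
quant = {tau: grid[np.argmin([pinball_loss(y, g, tau) for g in grid])]
         for tau in (0.5, 0.9, 0.99)}

# Non-crossing requirement: estimates must increase with tau.
q50, q90, q99 = quant[0.5], quant[0.9], quant[0.99]
```

Independent minimization per tau happens to give ordered quantiles here, but with covariate-dependent regression functions nothing enforces the ordering, which is the crossing pathology the composite (all-tau-at-once) MCQRNN architecture is designed to rule out.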

15.
We present a methodology conducive to the application of a Galerkin model order reduction technique, Proper Orthogonal Decomposition (POD), to solve a groundwater flow problem driven by spatially distributed stochastic forcing terms. Typical applications of POD to reducing time-dependent deterministic partial differential equations (PDEs) involve solving the governing PDE at some observation times (termed snapshots), which are then used in the order reduction of the problem. Here, the application of POD to solve the stochastic flow problem relies on selecting the snapshots in the probability space of the random quantity of interest. This allows casting a standard Monte Carlo (MC) solution of the groundwater flow field into a Reduced Order Monte Carlo (ROMC) framework. We explore the robustness of the ROMC methodology by way of a set of numerical examples involving two-dimensional steady-state groundwater flow taking place within an aquifer of uniform hydraulic properties and subject to a randomly distributed recharge. We analyze the impact on the performance of the method of (i) the number of snapshots selected from the hydraulic head probability space, (ii) the associated number of principal components, and (iii) the key geostatistical parameters describing the heterogeneity of the distributed recharge. We find that our ROMC scheme can significantly improve the computational efficiency of a standard MC framework while keeping the same degree of accuracy in providing the leading statistical moments (i.e., mean and covariance) as well as the sample probability density of the state variable of interest.
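The POD step itself can be sketched on synthetic snapshot data: extract an orthogonal basis from the snapshots via the SVD, truncate by energy, and project new realizations onto the reduced basis. The three-mode "head fields" below are contrived so that the reduced basis is exact:

```python
import numpy as np

rng = np.random.default_rng(8)

# Snapshot matrix: each column is one Monte Carlo "head field" on a
# 200-node grid, built from 3 random modes (synthetic, for illustration).
n_nodes, n_snap = 200, 30
xg = np.linspace(0.0, 1.0, n_nodes)
modes_true = np.vstack([np.sin(np.pi * xg), np.sin(2 * np.pi * xg),
                        np.sin(3 * np.pi * xg)])
coeffs = rng.normal(size=(n_snap, 3)) * np.array([3.0, 1.0, 0.3])
snapshots = (coeffs @ modes_true).T          # shape (n_nodes, n_snap)

# POD basis = left singular vectors of the snapshot matrix.
u, s, _ = np.linalg.svd(snapshots, full_matrices=False)
energy = np.cumsum(s ** 2) / np.sum(s ** 2)
r = int(np.searchsorted(energy, 0.999) + 1)  # modes kept at 99.9% energy

# Project a new realization onto the reduced basis and reconstruct.
new = (rng.normal(size=3) * np.array([3.0, 1.0, 0.3])) @ modes_true
recon = u[:, :r] @ (u[:, :r].T @ new)
err = float(np.linalg.norm(recon - new) / np.linalg.norm(new))
```

In the ROMC setting the snapshots are drawn across the probability space of the random recharge rather than across time, but the SVD-truncate-project mechanics are the same.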

16.
17.
18.
J.J. Yu, Hydrological Sciences Journal, 2013, 58(12): 2117–2131
Abstract

A generalized likelihood uncertainty estimation (GLUE) framework coupled with artificial neural network (ANN) models in two surrogate schemes (GAE-S1 and GAE-S2) was proposed to improve the efficiency of uncertainty assessment in flood inundation modelling. The GAE-S1 scheme constructed an ANN to approximate the relationship between model likelihoods and uncertain parameters, facilitating sample acceptance/rejection without running the numerical model directly; it could thus speed up the Monte Carlo simulation in stochastic sampling. The GAE-S2 scheme established independent ANN models for water depth predictions to emulate the numerical models; it could facilitate efficient uncertainty analysis without additional model runs for the locations of concern under various scenarios. The results from a case study showed that both GAE-S1 and GAE-S2 had performance comparable to GLUE in terms of estimation of posterior parameters, prediction intervals of water depth, and probabilistic inundation maps, but with reduced computational requirements. The results also revealed that GAE-S1 possessed slightly better accuracy (with GLUE as the reference) than GAE-S2, but lower flexibility in application. This study sheds some light on how to apply different surrogate schemes when using numerical models for uncertainty assessment, and could help decision makers choose cost-effective ways of conducting flood risk analysis.
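The GLUE backbone that both surrogate schemes accelerate is itself short: sample parameters, score each set with an informal likelihood, keep the behavioural sets, and form likelihood-weighted prediction bounds. A toy depth model and hypothetical threshold stand in for the numerical inundation model:

```python
import numpy as np

rng = np.random.default_rng(9)

def flood_depth(c, stage=2.0):
    # Toy stand-in for an inundation model: depth vs. roughness c.
    return stage / (1.0 + c)

obs_depth = 1.25                                  # hypothetical observation
c_samples = rng.uniform(0.1, 2.0, size=5000)      # Monte Carlo parameter sets
sim = flood_depth(c_samples)
like = np.exp(-((sim - obs_depth) / 0.1) ** 2)    # informal likelihood

behavioural = like > 0.05                         # acceptance threshold
w = like[behavioural] / like[behavioural].sum()
pred = sim[behavioural]

# Likelihood-weighted 90% prediction interval for depth.
order = np.argsort(pred)
cdf = np.cumsum(w[order])
lo = pred[order][np.searchsorted(cdf, 0.05)]
hi = pred[order][np.searchsorted(cdf, 0.95)]
```

GAE-S1 replaces the `flood_depth` call inside the accept/reject loop with an ANN emulator of the likelihood, and GAE-S2 emulates the depth predictions themselves; the weighting and interval construction are unchanged.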

19.
Surface-wave tests are based on the solution of an inverse problem for shear-wave velocity profile identification from the experimentally measured dispersion curve. The main criticisms of these testing methodologies are related to the inverse problem solution and arise from the possible equivalence of different shear-wave velocity profiles. In this paper, some implications of solution non-uniqueness for seismic response studies are investigated using both numerical simulations and experimental data. A Monte Carlo approach to the inversion problem has been used to obtain a set of equivalent shear-wave velocity models. This selection is based on a statistical test which takes into account both data uncertainty and model parameterization. The resulting set of solutions (i.e., soil profiles) is then used to evaluate the seismic response with a conventional one-dimensional analysis. It is shown that profiles that are equivalent with respect to surface-wave testing are also equivalent with respect to site amplification, thus countering the criticism related to inversion uncertainty for the engineering use of surface-wave tests.

20.
Eutrophication of aquatic ecosystems is one of the most pressing water quality concerns in the United States and around the world. Bank erosion has been largely overlooked as a source of nutrient loading, despite field studies demonstrating that this source can account for the majority of the total phosphorus load in a watershed. Substantial effort has been made to develop mechanistic models to predict bank erosion and instability in stream systems; however, these models do not account for the inherent natural variability of input values. To quantify the impacts of this omission, uncertainty and sensitivity analyses were performed on the Bank Stability and Toe Erosion Model (BSTEM), a mechanistic model developed by the US Department of Agriculture – Agricultural Research Service (USDA-ARS) that simulates both mass wasting and fluvial erosion of streambanks. Generally, bank height, soil cohesion, and plant species were found to be most influential in determining the stability of clay (cohesive) banks. In addition to these three inputs, groundwater elevation, stream stage, and bank angle were also identified as important in sand (non-cohesive) banks. Slope and bank height are the dominant variables in fluvial erosion modeling, while erodibility and critical shear stress had low sensitivity indices; these indices do not, however, reflect the importance of critical shear stress in determining the timing of erosion events. These results identify important variables that should be the focus of data collection efforts, while also indicating which less influential variables may be set to assumed values. In addition, a probabilistic Monte Carlo modeling approach was applied to data from a watershed-scale sediment and phosphorus loading study on the Missisquoi River, Vermont, to quantify the uncertainty associated with these published results. While our estimates aligned well with previous deterministic modeling results, the uncertainty associated with these predictions suggests that they should be considered order-of-magnitude estimates only. Copyright © 2016 John Wiley & Sons, Ltd.

