Similar Literature
20 similar documents retrieved (search time: 437 ms).
1.
Highly detailed physically based groundwater models are often applied to make predictions of system states under unknown forcing. The required analysis of uncertainty is often infeasible due to the high computational demand. We combine two possible solution strategies: (1) the use of faster surrogate models; and (2) a robust data worth analysis combining quick first-order second-moment uncertainty quantification with null-space Monte Carlo techniques to account for parametric uncertainty. A structurally and parametrically simplified model and a proper orthogonal decomposition (POD) surrogate are investigated. Data worth estimations by both surrogates are compared against estimates by a complex MODFLOW benchmark model of an aquifer in New Zealand. Data worth is defined as the change in post-calibration predictive uncertainty of groundwater head, river-groundwater exchange flux, and drain flux data, compared to the calibrated model. It incorporates existing observations, potential new measurements of system states (“additional” data), as well as knowledge of model parameters (“parametric” data). The data worth analysis is extended to account for non-uniqueness of model parameters by null-space Monte Carlo sampling. Data worth estimates of the surrogates and the benchmark suggest good agreement for both surrogates in estimating the worth of existing data. The structural simplification surrogate only partially reproduces the worth of “additional” data and is unable to estimate “parametric” data, while the POD model is in agreement with the complex benchmark for both “additional” and “parametric” data. The variance of the POD data worth estimates suggests the need to account for parameter non-uniqueness, as presented here, to obtain robust results.
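The POD step above can be illustrated compactly: snapshots of full-model output are centred, decomposed with a thin SVD, and truncated to the leading modes, so that each model state is represented by a handful of coefficients. The following is a minimal sketch with synthetic snapshot data; the matrix sizes and the 99% energy criterion are illustrative assumptions, not values from the study.

```python
import numpy as np

# Hypothetical snapshot matrix: each column is one groundwater-head field
# from a run of the full model (n_cells x n_runs); values are synthetic.
rng = np.random.default_rng(0)
snapshots = rng.normal(50.0, 5.0, size=(1000, 40))

# Centre the snapshots and extract POD basis vectors via a thin SVD.
mean_field = snapshots.mean(axis=1, keepdims=True)
U, s, _ = np.linalg.svd(snapshots - mean_field, full_matrices=False)

# Keep enough modes to capture 99% of the snapshot "energy" (assumed cutoff).
energy = np.cumsum(s**2) / np.sum(s**2)
k = int(np.searchsorted(energy, 0.99)) + 1
basis = U[:, :k]

# A reduced-order state needs only k coefficients instead of 1000 cell values:
coeffs = basis.T @ (snapshots[:, [0]] - mean_field)
reconstruction = mean_field + basis @ coeffs
print(k, np.abs(reconstruction - snapshots[:, [0]]).max())
```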

2.
A new methodology for the development of bridge-specific fragility curves is proposed with a view to improving the reliability of loss assessment in road networks and prioritising retrofit of the bridge stock. The key features of the proposed methodology are the explicit definition of critical limit state thresholds for individual bridge components, with consideration of the effect of varying geometry, material properties, reinforcement and loading patterns on the component capacity; the methodology also includes the quantification of uncertainty in capacity, demand and damage state definition. Advanced analysis methods and tools (nonlinear static analysis and incremental dynamic response history analysis) are used for bridge component capacity and demand estimation, while reduced sampling techniques are used for uncertainty treatment. Whereas uncertainty in both capacity and demand is estimated from nonlinear analysis of detailed inelastic models, in practical application to bridge stocks, the demand is estimated through a standard response spectrum analysis of a simplified elastic model of the bridge. The simplified methodology can be efficiently applied to a large number of bridges (with different characteristics) within a road network, by means of ad hoc developed software involving the use of a generic (elastic) bridge model, which derives bridge-specific fragility curves. Copyright © 2016 John Wiley & Sons, Ltd.
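One common way to cast the component capacity/demand comparison above into a fragility curve is to fit a lognormal cumulative distribution to the intensity measures at which the nonlinear analyses first reach a limit state. The sketch below uses synthetic incremental dynamic analysis results; the record count, capacity values and intensity measure are illustrative assumptions rather than quantities from the paper.

```python
import numpy as np
from scipy import stats

# Hypothetical IDA results: peak ground acceleration (g) at which each of 30
# records first drives a bridge component past its limit state threshold.
rng = np.random.default_rng(1)
im_at_failure = rng.lognormal(mean=np.log(0.45), sigma=0.4, size=30)

# Lognormal fragility: P(damage | IM) = Phi((ln IM - ln theta) / beta),
# with median theta and dispersion beta fitted from the IDA sample.
theta = np.exp(np.mean(np.log(im_at_failure)))
beta = np.std(np.log(im_at_failure), ddof=1)

def fragility(im):
    return stats.norm.cdf((np.log(im) - np.log(theta)) / beta)

print(f"median capacity {theta:.2f} g, dispersion {beta:.2f}")
print(f"P(damage | IM = 0.3 g) = {fragility(0.3):.2f}")
```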

3.
Stochastic ground motion models produce synthetic time-histories by modulating a white noise sequence through functions that address spectral and temporal properties of the excitation. The resultant ground motions can then be used in simulation-based seismic risk assessment applications. This is established by relating the parameters of the aforementioned functions to earthquake and site characteristics through predictive relationships. An important concern related to the use of these models is the fact that, through current approaches to selecting these predictive relationships, compatibility with the seismic hazard is not guaranteed. This work offers a computationally efficient framework for the modification of stochastic ground motion models to match target intensity measures (IMs) for a specific site and structure of interest. This is set as an optimization problem with a dual objective. The first objective minimizes the discrepancy between the target IMs and the predictions established through the stochastic ground motion model for a chosen earthquake scenario. The second objective constrains the deviation from the model characteristics suggested by existing predictive relationships, guaranteeing that the resultant ground motions not only match the target IMs but are also compatible with regional trends. A framework leveraging kriging surrogate modeling is formulated for performing the resultant multi-objective optimization, and different computational aspects related to this optimization are discussed in detail. The illustrative implementation shows that the proposed framework can provide ground motions with high compatibility to target IMs with only small deviation from existing predictive relationships, and discusses approaches for selecting a final compromise between these two competing objectives.
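The modulation idea in the first sentence can be sketched directly: a white noise sequence is shaped by a temporal envelope and a spectral filter to produce a synthetic accelerogram. The sketch below is a deliberately crude, generic stand-in with made-up parameter values (envelope shape, 2.5 Hz dominant frequency); it is not the specific model or the predictive relationships of the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
dt, n = 0.01, 2000
t = np.arange(n) * dt
noise = rng.standard_normal(n)

# Temporal modulating function controlling build-up and decay (assumed form).
envelope = (t / 4.0) ** 2 * np.exp(-t / 4.0)

# Crude spectral shaping: weight the modulated noise in the frequency domain
# around an assumed dominant ground frequency of 2.5 Hz.
freq = np.fft.rfftfreq(n, dt)
spec = np.fft.rfft(noise * envelope)
shape = np.exp(-0.5 * ((freq - 2.5) / 1.5) ** 2)
motion = np.fft.irfft(spec * shape, n)
print(motion[:5])
```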

4.
Assessment of parameter and predictive uncertainty of hydrologic models is an essential part of the field of hydrology. However, during the past decades, research related to hydrologic model uncertainty has mostly been done with conceptual models. It is accepted that uncertainty in model predictions arises from measurement errors associated with the system input and output, from model structural errors, and from problems with parameter estimation. Unfortunately, non-conceptual models, such as black-box models, also suffer from these problems. In this paper, we take the artificial neural network (ANN) rainfall-runoff model as an example, and the Shuffled Complex Evolution Metropolis algorithm (SCEM-UA) is employed to analyse the parameter and predictive uncertainty of this model. Furthermore, based on the results of the uncertainty assessment, we arrive at a simpler incomplete-connection artificial neural network (ICANN) model with better performance than the original ANN rainfall-runoff model. These results not only indicate that SCEM-UA can be a useful tool for uncertainty analysis of ANN models, but also show that uncertainty does exist in ANN rainfall-runoff models. Additionally, they suggest that the ICANN model carries smaller uncertainty than the original ANN model.
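SCEM-UA itself evolves several parallel chains with complex shuffling steps; as a minimal stand-in for the underlying idea of sampling a parameter's posterior, the sketch below runs a single-chain Metropolis sampler on a toy one-parameter model with synthetic data. It is not SCEM-UA or the ANN model, just an illustration of how a chain yields a parameter uncertainty estimate.

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.arange(10.0)
obs = 2.0 * x + rng.normal(0, 0.5, 10)   # synthetic "runoff" observations

def log_likelihood(k):
    resid = obs - k * x
    return -0.5 * np.sum(resid**2) / 0.5**2

chain, k = [], 1.0
for _ in range(5000):
    k_new = k + rng.normal(0, 0.1)       # random-walk proposal
    if np.log(rng.uniform()) < log_likelihood(k_new) - log_likelihood(k):
        k = k_new
    chain.append(k)

posterior = np.array(chain[1000:])       # discard burn-in
print(posterior.mean(), posterior.std()) # parameter estimate and uncertainty
```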

5.
Distributed hydrological models can make predictions with much finer spatial resolution than the supporting field data. They will, however, usually not have a predictive capability at model grid scale due to limitations of data availability and uncertainty of model conceptualizations. In previous publications, we have introduced the Representative Elementary Scale (RES) concept as the theoretical minimum scale at which a model with a given conceptualization has a potential for obtaining a predictive accuracy corresponding to a given acceptable accuracy. The new RES concept has similarities to the 25-year-old Representative Elementary Area concept, but it differs in the sense that while Representative Elementary Area addresses similarity between subcatchments by sampling within the catchment, RES focuses on effects of data or conceptualization uncertainty by Monte Carlo simulations followed by a scale analysis. In the present paper, we extend and generalize the RES concept to a framework for assessing the minimum scale of potential predictability of a distributed model, applicable also to analyses of different model structures and data availabilities. We present three examples with RES analyses and discuss our findings in relation to Beven's alternative blueprint and environmental modeling philosophy from 2002. While Beven here addresses model structural and parameter uncertainties, he does not provide a thorough methodology for assessing to what extent model predictions for variables that are not measured can attain meaningful predictive accuracy, or whether this is impossible due to limitations in data and models. This shortcoming is addressed by the RES framework through its analysis of the relationship between the aggregation scale of model results and prediction uncertainty, and by considering how alternative model structures and data availabilities affect the results. We suggest that RES analysis should be applied in all modeling studies that aim to use simulation results at spatial scales smaller than the support scale of the calibration data.
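The core of a RES-style analysis, Monte Carlo simulation followed by a scale analysis, can be sketched as follows: an ensemble of gridded predictions is aggregated over progressively larger blocks and the ensemble spread is tracked against aggregation scale. The fields, ensemble size and block sizes below are synthetic assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(4)
n_real, n_cells = 100, 256
ensemble = rng.normal(1.0, 0.5, size=(n_real, n_cells))  # MC prediction fields

for block in (1, 4, 16, 64):
    # Average predictions over blocks of `block` neighbouring cells.
    agg = ensemble.reshape(n_real, n_cells // block, block).mean(axis=2)
    spread = agg.std(axis=0).mean()   # mean ensemble spread at this scale
    print(f"aggregation over {block:3d} cells: spread = {spread:.3f}")

# The smallest scale at which the spread drops below the acceptable accuracy
# would be the representative elementary scale for this synthetic setup.
```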

6.
We present a framework for design and deployment of decision support modeling based on metrics which have their roots in the scientific method. Application of these metrics to decision support modeling requires recognition of the importance of data assimilation and predictive uncertainty quantification in this type of modeling. The difficulties of implementing these procedures depend on the relationship between the data that are available for assimilation and the nature of the prediction(s) that a decision support model is required to make. Three different data/prediction contexts are identified. Unfortunately, groundwater modeling is generally aligned with the most difficult of these. It is suggested that these difficulties can generally be ameliorated through appropriate model design. This design requires strategic abstraction of parameters and processes in a way that is optimal for the making of one particular prediction but is not necessarily optimal for the making of another. It is further suggested that the focus of decision support modeling should be on the ability of a model to provide receptacles for decision-pertinent information rather than on its purported ability to simulate environmental processes. While models are compromised in both of these roles, this view makes it clear that simulation should serve data assimilation and not the other way around. Data assimilation enables the uncertainties of decision-critical model predictions to be quantified and possibly reduced. Decision support modeling requires this.

7.
Ye Zhang. Ground Water, 2014, 52(3): 343-351
Modeling and calibration of natural aquifers with multiple scales of heterogeneity is a challenging task due to limited subsurface access. While computer modeling plays an essential role in aquifer studies, large uncertainty exists in developing a conceptual model of an aquifer and in calibrating the model for decision making. Due to uncertainties such as a lack of understanding of subsurface processes and a lack of techniques to parameterize the subsurface environment (including hydraulic conductivity, source/sink rate, and aquifer boundary conditions), existing aquifer models often suffer from nonuniqueness in calibration, leading to poor predictive capability. A robust calibration methodology is needed that can address the simultaneous estimation of aquifer parameters, sources/sinks, and boundary conditions. In this paper, we propose a multistage and multiscale approach that addresses subsurface heterogeneity at multiple scales, while reducing uncertainty in estimating the model parameters and model boundary conditions. The key to this approach lies in the appropriate development, verification, and synthesis of existing and new techniques of static and dynamic data integration. In particular, based on a given set of observation data, new inversion techniques can first be used to estimate aquifer large-scale effective parameters and smoothed boundary conditions, based on which parameter and boundary condition estimation can be refined at increasing detail using standard or highly parameterized estimation techniques.
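The coarse-to-fine idea in the last sentence can be illustrated on a toy linear problem: first fit a single large-scale effective parameter, then refine zone values starting from that estimate. The sensitivity matrix, zone count and noise level below are made up; this is a sketch of the staging logic, not of the paper's inversion techniques.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(5)
J = rng.uniform(0.5, 1.5, size=(12, 4))        # toy head sensitivities
true_logk = np.array([1.2, 0.8, 1.0, 1.4])     # "true" zone log-conductivities
obs = J @ true_logk + rng.normal(0, 0.05, 12)  # synthetic observed heads

# Stage 1: one large-scale effective parameter for the whole aquifer.
res1 = least_squares(lambda k: J @ np.full(4, k[0]) - obs, x0=[1.0])
coarse = np.full(4, res1.x[0])

# Stage 2: refine all four zones, starting from the coarse estimate.
res2 = least_squares(lambda k: J @ k - obs, x0=coarse)
print("stage 1:", coarse, "stage 2:", res2.x)
```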

8.
A key point in the application of multi-model Bayesian averaging techniques to assess the predictive uncertainty in groundwater modelling applications is the definition of prior model probabilities, which reflect the prior perception about the plausibility of alternative models. In this work the influence of prior knowledge and prior model probabilities on posterior model probabilities, multi-model predictions, and conceptual model uncertainty estimations is analysed. The sensitivity to prior model probabilities is assessed using an extensive numerical analysis in which the prior probability space of a set of plausible conceptualizations is discretized to obtain a large ensemble of possible combinations of prior model probabilities. Additionally, the value of prior knowledge about alternative models in reducing conceptual model uncertainty is assessed by considering three example knowledge states, expressed as quantitative relations among the alternative models. A constrained maximum entropy approach is used to find the set of prior model probabilities that corresponds to the different prior knowledge states. For illustrative purposes, a three-dimensional hypothetical setup approximated by seven alternative conceptual models is employed. Results show that posterior model probabilities, leading moments of the predictive distributions, and estimations of conceptual model uncertainty are very sensitive to prior model probabilities, indicating the relevance of selecting proper prior probabilities. Additionally, including proper prior knowledge improves the predictive performance of the multi-model approach, expressed by reductions of the multi-model prediction variances by up to 60% compared with a non-informative case. However, the ratio of between-model to total variance does not substantially decrease. This suggests that the contribution of conceptual model uncertainty to the total variance cannot be further reduced based only on prior knowledge about the plausibility of alternative models. These results advocate including proper prior knowledge about alternative conceptualizations in combination with extra conditioning data to further reduce conceptual model uncertainty in groundwater modelling predictions. Copyright © 2009 John Wiley & Sons, Ltd.
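The quantities discussed above (posterior model probabilities, multi-model prediction variance and its between-model share) follow from a few lines of Bayesian model averaging arithmetic. The sketch below uses hypothetical numbers for three models rather than the paper's seven-model setup, and omits the constrained maximum entropy step for choosing the priors.

```python
import numpy as np

priors = np.array([0.5, 0.3, 0.2])      # assumed prior model probabilities
evidence = np.array([1.0, 2.0, 0.5])    # p(data | model), toy values
post = priors * evidence
post /= post.sum()                      # posterior model probabilities

means = np.array([10.0, 12.0, 9.0])     # per-model predictive means (toy)
variances = np.array([1.0, 1.5, 0.8])   # per-model predictive variances (toy)

bma_mean = np.sum(post * means)
within = np.sum(post * variances)                 # within-model variance
between = np.sum(post * (means - bma_mean) ** 2)  # conceptual (between-model)
print(post, bma_mean, between / (within + between))
```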

9.
In this paper, we assess the performance of the SIMulated CATchment model (SIMCAT) in predicting nitrate and soluble reactive phosphorus concentrations against four monitoring regimes with different spatial and temporal sampling frequencies. The Generalised Likelihood Uncertainty Estimation (GLUE) uncertainty framework is used, along with a general sensitivity analysis to understand relative parameter sensitivity. Improvements to model calibration are explored by introducing more detailed process representation using the Integrated Catchments (INCA) water quality model, driven by the European Hydrological Predictions for the Environment model. The results show how targeted sampling of headwater watercourses upstream of point discharges is essential for calibrating diffuse loads and can exert a strong influence on the whole-catchment model performance. Further downstream, if the point discharges and loads are accurately represented, then the improvement in the catchment-scale model performance is relatively small as more calibration points are added or frequency is increased. The higher-order, dynamic Integrated Catchments model of phosphorus dynamics, which incorporates sediment and biotic interaction, resulted in improved whole-catchment performance over SIMCAT, although there are still large epistemic uncertainties from land-phase export coefficients and runoff. However, the very large sampling errors in routine monitoring make it difficult to place confidence in the modelling, especially because we know phosphorus transport to be very episodic and driven by high-flow conditions, for which there are few samples. The environmental modelling community seems to have been stuck in this position for some time, and whilst it is useful to use an uncertainty framework to highlight these issues, it has not been widely adopted, perhaps because there is no clear mechanism to allow uncertainties to influence investment decisions. This raises the question as to whether it might be better to place a cost on uncertainty and use this to drive more data collection or improved models, before making investment decisions concerning, for example, mitigation strategies. Copyright © 2016 John Wiley & Sons, Ltd.

10.
This paper addresses the application of a data-based mechanistic (DBM) modelling approach using transfer function models (TFMs) with non-linear rainfall filtering to predict runoff generation from a semi-arid catchment (795 km²) in Tanzania. With DBM modelling, time series of rainfall and streamflow were allowed to suggest an appropriate model structure compatible with the data available. The model structures were evaluated by looking at how well the model fitted the data, and how well the parameters of the model were estimated. The results indicated that a parallel model structure is appropriate, with a proportion of the runoff being routed through a fast flow pathway and the remainder through a slow flow pathway. Finally, the study employed a Generalized Likelihood Uncertainty Estimation (GLUE) methodology to evaluate the parameter sensitivity and predictive uncertainty based on the feasible parameter ranges chosen from the initial analysis of recession curves and calibration of the TFM. Results showed that parameters that control the slow flow pathway are relatively more sensitive than those that control the fast flow pathway of the hydrograph. Within the GLUE framework, it was found that multiple acceptable parameter sets give a range of predictions. This was found to be an advantage, since it allows the possibility of assessing the uncertainty in predictions as conditioned on the calibration data and then using that uncertainty as part of the decision-making process arising from any rainfall-runoff modelling project. Copyright © 2001 John Wiley & Sons, Ltd.
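The identified parallel structure can be written as two first-order discrete-time transfer functions fed by a split of the (here unfiltered) rainfall; the non-linear rainfall filtering step of the DBM approach is omitted for brevity. All parameter values and the rainfall series below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)
rain = rng.gamma(0.3, 5.0, size=200)    # synthetic daily rainfall (mm)

split, a_fast, a_slow = 0.6, 0.5, 0.97  # assumed split and recession constants
fast = slow = 0.0
flow = np.empty_like(rain)
for t, u in enumerate(rain):
    # Each pathway: x_t = a * x_{t-1} + (1 - a) * input share.
    fast = a_fast * fast + (1 - a_fast) * split * u
    slow = a_slow * slow + (1 - a_slow) * (1 - split) * u
    flow[t] = fast + slow               # total simulated runoff

print(flow[:5])
```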

11.
In order to quantify total error affecting hydrological models and predictions, we must explicitly recognize errors in input data, model structure, model parameters and validation data. This paper tackles the last of these: errors in discharge measurements used to calibrate a rainfall-runoff model, caused by stage-discharge rating-curve uncertainty. This uncertainty may be due to several combined sources, including errors in stage and velocity measurements during individual gaugings, assumptions regarding a particular form of stage-discharge relationship, extrapolation of the stage-discharge relationship beyond the maximum gauging, and cross-section change due to vegetation growth and/or bed movement. A methodology is presented to systematically assess and quantify the uncertainty in discharge measurements due to all of these sources. For a given stage measurement, a complete PDF of true discharge is estimated. Consequently, new model calibration techniques can be introduced to explicitly account for the discharge error distribution. The method is demonstrated for a gravel-bed river in New Zealand, where all the above uncertainty sources can be identified, including significant uncertainty in cross-section form due to scour and re-deposition of sediment. Results show that rigorous consideration of uncertainty in flow data results in significant improvement of the model's ability to predict the observed flow. Copyright © 2010 John Wiley & Sons, Ltd.
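One ingredient of such an analysis, turning a stage reading into a PDF of true discharge, can be sketched by fitting a power-law rating curve and propagating the fit covariance. The gaugings below are synthetic and only the curve-fitting uncertainty is sampled; the paper's other sources (gauging error, extrapolation, cross-section change) are not represented.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(7)
stage = np.linspace(0.5, 3.0, 25)                       # gauged stages (m)
discharge = 4.0 * (stage - 0.2) ** 1.7 * rng.lognormal(0, 0.05, 25)

def rating(h, a, h0, b):
    # Power-law rating curve Q = a * (h - h0)^b.
    return a * (h - h0) ** b

popt, pcov = curve_fit(rating, stage, discharge, p0=[3.0, 0.1, 1.5],
                       bounds=([0.1, -0.5, 1.0], [10.0, 0.45, 3.0]))

# Sample rating-curve parameters from the fit covariance to approximate a
# PDF of true discharge for a stage reading of 2.0 m.
samples = rng.multivariate_normal(popt, pcov, size=2000)
q = np.array([rating(2.0, *p) for p in samples])
print(np.percentile(q, [5, 50, 95]))
```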

12.
Ground water model calibration using pilot points and regularization (total citations: 9; self-citations: 0; citations by others: 9)
Doherty J. Ground Water, 2003, 41(2): 170-177
Use of nonlinear parameter estimation techniques is now commonplace in ground water model calibration. However, there is still ample room for further development of these techniques in order to enable them to extract more information from calibration datasets, to more thoroughly explore the uncertainty associated with model predictions, and to make them easier to implement in various modeling contexts. This paper describes the use of "pilot points" as a methodology for spatial hydraulic property characterization. When used in conjunction with nonlinear parameter estimation software that incorporates advanced regularization functionality (such as PEST), use of pilot points can add a great deal of flexibility to the calibration process at the same time as it makes this process easier to implement. Pilot points can be used either as a substitute for zones of piecewise parameter uniformity, or in conjunction with such zones. In either case, they allow the disposition of areas of high and low hydraulic property value to be inferred through the calibration process, without the need for the modeler to guess the geometry of such areas prior to estimating the parameters that pertain to them. Pilot points and regularization can also be used as an adjunct to geostatistically based stochastic parameterization methods. Using the techniques described herein, a series of hydraulic property fields can be generated, all of which recognize the stochastic characterization of an area at the same time that they satisfy the constraints imposed on hydraulic property values by the need to ensure that model outputs match field measurements. Model predictions can then be made using all of these fields as a mechanism for exploring predictive uncertainty.
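The regularised estimation of pilot-point values can be condensed to a Tikhonov-style least-squares problem: minimise the misfit to field measurements plus a penalty on departures from a preferred value, then interpolate the estimated points onto the grid. The sketch below uses a toy linear stand-in for the model and made-up sizes; it illustrates the preferred-value regularisation idea, not PEST's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(8)
n_obs, n_pp = 15, 8
J = rng.normal(0, 1, size=(n_obs, n_pp))      # toy sensitivity matrix
p_true = rng.normal(0.0, 1.0, n_pp)           # "true" pilot-point values
d = J @ p_true + rng.normal(0, 0.1, n_obs)    # synthetic field measurements
p0 = np.zeros(n_pp)                           # preferred (prior) values

# Minimise ||J p - d||^2 + lam ||p - p0||^2 via the normal equations.
lam = 1.0
A = J.T @ J + lam * np.eye(n_pp)
p_est = np.linalg.solve(A, J.T @ d + lam * p0)
print(p_est)
```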

13.
J. J. Yu, X. S. Qin, O. Larsen. Hydrological Processes, 2015, 29(6): 1267-1279
A generalized likelihood uncertainty estimation (GLUE) method incorporating moving least squares (MLS) with entropy for stochastic sampling (denoted as GLUE-MLS-E) was proposed for uncertainty analysis of flood inundation modelling. The MLS with entropy (MLS-E) surrogate was established from the pairs of parameters/likelihoods generated by a limited number of direct model executions. It was then applied to approximate the model evaluation, facilitating target sample acceptance in GLUE during the Monte-Carlo-based stochastic simulation process. The results from a case study showed that the proposed GLUE-MLS-E method had a comparable performance to GLUE in terms of posterior parameter estimation and predicted confidence intervals; however, it could significantly reduce the computational cost. A comparison to other surrogate models, including MLS, quadratic response surface and artificial neural networks (ANN), revealed that the MLS-E outperformed the others in light of both the predicted confidence interval and the most likely value of water depths. ANN was shown to be a viable alternative, which performed slightly poorer than MLS-E. The proposed surrogate method in stochastic sampling is of practical significance in computationally expensive problems like flood risk analysis, real-time forecasting, and simulation-based engineering design, and has general applicability in many other numerical simulation fields that require extensive effort in uncertainty assessment. Copyright © 2014 John Wiley & Sons, Ltd.
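The surrogate step, approximating the likelihood surface from a limited set of (parameter, likelihood) pairs, can be sketched with a plain moving least squares fit: a locally weighted linear regression evaluated at each query point. The example is one-dimensional and synthetic, and the entropy-based component of MLS-E is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(9)
x = rng.uniform(0, 1, 30)                      # sampled parameter values
y = np.exp(-0.5 * ((x - 0.6) / 0.1) ** 2)      # "likelihoods" from model runs

def mls(xq, x, y, h=0.15):
    # Gaussian weights centred on the query point (sqrt for weighted LS).
    sw = np.sqrt(np.exp(-0.5 * ((x - xq) / h) ** 2))
    X = np.column_stack([np.ones_like(x), x - xq])
    beta, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
    return beta[0]                             # local fit evaluated at xq

print(mls(0.55, x, y), np.exp(-0.5 * ((0.55 - 0.6) / 0.1) ** 2))
```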

14.
General circulation model outputs are rarely used directly for quantifying climate change impacts on hydrology, due to their coarse resolution and inherent bias. Bias correction methods are usually applied to correct the statistical deviations of climate model outputs from the observed data. However, the use of bias correction methods for impact studies is often disputable, due to the lack of physical basis and the bias nonstationarity of climate model outputs. With the improvement in model resolution and reliability, it is now possible to investigate the direct use of regional climate model (RCM) outputs for impact studies. This study proposes an approach to use RCM simulations directly for quantifying the hydrological impacts of climate change over North America. With this method, a hydrological model (HSAMI) is specifically calibrated using the RCM simulations over the recent past period. The change in hydrological regimes for a future period (2041–2065) relative to the reference period (1971–1995), simulated using bias-corrected and nonbias-corrected simulations, is compared using mean flow, spring high flow, and summer–autumn low flow as indicators. Three RCMs driven by three different general circulation models are used to investigate the uncertainty of hydrological simulations associated with the choice of a bias-corrected or nonbias-corrected RCM simulation. The results indicate that the uncertainty envelope is generally watershed and indicator dependent. It is difficult to draw a firm conclusion about whether one method is better than the other. In other words, the bias correction method could bring further uncertainty to future hydrological simulations, in addition to the uncertainty related to the choice of a bias correction method. This implies that the nonbias-corrected results should be provided to end users alongside the bias-corrected ones, together with a detailed explanation of the bias correction procedure. This information would be especially helpful to assist end users in making the most informed decisions.
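As a concrete example of the kind of statistical bias correction being contrasted with direct RCM use, the sketch below applies empirical quantile mapping: each raw model value is replaced by the observed value at the same empirical quantile of the reference period. All three series are synthetic; the method choice (quantile mapping) is an assumption, since the abstract does not name one.

```python
import numpy as np

rng = np.random.default_rng(10)
obs_ref = rng.gamma(2.0, 3.0, 3650)    # observed, reference period
rcm_ref = rng.gamma(2.0, 4.0, 3650)    # RCM, reference period (biased)
rcm_fut = rng.gamma(2.3, 4.0, 3650)    # RCM, future period

def quantile_map(x, model_ref, observed_ref):
    # Empirical quantile of x within the model's reference distribution...
    q = np.searchsorted(np.sort(model_ref), x) / len(model_ref)
    # ...mapped onto the observed reference distribution.
    return np.quantile(observed_ref, q.clip(0.0, 1.0))

corrected = quantile_map(rcm_fut, rcm_ref, obs_ref)
print(rcm_fut.mean(), corrected.mean(), obs_ref.mean())
```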

15.
In this study, uncertainty in model input data (precipitation) and parameters is propagated through a physically based, spatially distributed hydrological model based on the MIKE SHE code. Precipitation uncertainty is accounted for using an ensemble of daily rainfall fields that incorporate four different sources of uncertainty, whereas parameter uncertainty is considered using Latin hypercube sampling. Model predictive uncertainty is assessed for multiple simulated hydrological variables (discharge, groundwater head, evapotranspiration, and soil moisture). Utilizing an extensive set of observational data, effective observational uncertainties for each hydrological variable are assessed. Considering not only model predictive uncertainty but also effective observational uncertainty leads to a notable increase in the number of instances, for which model simulation and observations are in good agreement (e.g., 47% vs. 91% for discharge and 0% vs. 98% for soil moisture). Effective observational uncertainty is in several cases larger than model predictive uncertainty. We conclude that the use of precipitation uncertainty with a realistic spatio-temporal correlation structure, analyses of multiple variables with different spatial support, and the consideration of observational uncertainty are crucial for adequately evaluating the performance of physically based, spatially distributed hydrological models.
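The Latin hypercube step for parameter uncertainty can be sketched in a few lines with scipy's quasi-Monte Carlo module: draw a stratified unit-cube sample and scale it to parameter ranges, one row per model run. The three parameters and their ranges below are made-up placeholders, not the MIKE SHE parameters of the study.

```python
from scipy.stats import qmc

sampler = qmc.LatinHypercube(d=3, seed=11)
unit = sampler.random(n=50)                # 50 stratified samples in [0, 1)^3
lower = [0.1, 1e-6, 0.05]                  # assumed lower bounds
upper = [0.8, 1e-3, 0.30]                  # assumed upper bounds
params = qmc.scale(unit, lower, upper)     # one parameter set per row
print(params[:3])
```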

16.
Despite the wealth of soil erosion models available for the prediction of both runoff and soil loss at a variety of scales, little quantification is made of uncertainty and error associated with model output. This in part reflects the need to produce unequivocal or optimal results for the end user, which will often be an unrealistic goal. This paper presents a conceptually simple methodology, Generalized Likelihood Uncertainty Estimation (GLUE), for assessing the degree of uncertainty surrounding output from a physically based soil erosion model, the Water Erosion Prediction Project (WEPP). The ability not only to be explicit about model error but also to evaluate future improvements in parameter estimation, observed data or scientific understanding is demonstrated. This approach is applied to two sets of soil loss/runoff plot replicates, one in the UK and one in the USA. Although it is demonstrated that observations can be largely captured within uncertainty bounds, results indicate that these uncertainty bounds are often wide, reflecting the need to qualify results that derive from ‘optimum’ parameter sets, and to accept the concept of equifinality within soil erosion models. Attention is brought to the problem of under-prediction of large events/over-prediction of small events, as an area where model improvements could be made, specifically in the case of relatively dry years. Finally it is proposed that such a technique of model evaluation be employed more widely within the discipline so as to aid the interpretation and understanding of complex model output. Copyright © 2000 John Wiley & Sons, Ltd.
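The GLUE recipe used here follows a standard pattern: sample parameter sets, score each against observations with a likelihood measure, retain the "behavioural" sets above a threshold, and report prediction bounds from the retained simulations. The sketch below runs this loop on a toy exponential model with synthetic data standing in for WEPP; the threshold and parameter ranges are assumptions.

```python
import numpy as np

rng = np.random.default_rng(12)
t = np.arange(20.0)
obs = 3.0 * np.exp(-t / 5.0) + rng.normal(0, 0.1, t.size)  # synthetic data

def model(a, tau):
    return a * np.exp(-t / tau)

a_s = rng.uniform(1.0, 5.0, 5000)
tau_s = rng.uniform(2.0, 10.0, 5000)
sims = model(a_s[:, None], tau_s[:, None])                 # 5000 simulations

# Nash-Sutcliffe efficiency as the (informal) likelihood measure.
nse = 1 - np.sum((sims - obs) ** 2, axis=1) / np.sum((obs - obs.mean()) ** 2)
keep = nse > 0.5                           # behavioural threshold (assumed)

# Unweighted 5-95% bounds for brevity; GLUE typically weights simulations by
# their likelihood when forming bounds.
lower = np.percentile(sims[keep], 5, axis=0)
upper = np.percentile(sims[keep], 95, axis=0)
print(keep.sum(), np.mean((obs >= lower) & (obs <= upper)))
```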

17.
Monte Carlo simulations of a two-dimensional depth-averaged distributed bed-roughness flow model, TELEMAC-2D, are used to model a detailed tracer dispersion test in a complex reach of the River Severn in the Generalized Likelihood Uncertainty Estimation (GLUE) framework. A time-efficient, zero-equation, spatially distributed eddy viscosity model is derived from physical reasoning and used to close the flow equations. It is shown to have the property of low numerical diffusion, avoiding recourse to a globally large value of the eddy viscosity. For models of complex river flows, there are typically so many degrees of freedom in the specification of distributed parameters, owing to the limitations of field data collection, that the identification of a unique model structure is unlikely. The data used here to constrain the model structure come from a continuous tracer injection experiment, comprising six spatially distributed time series of concentration measurements. Several hundred Monte Carlo simulations of different model structures were investigated, and it was found that multiple model structures produced feasible simulations of the tracer mixing, giving rise to the phenomenon of equifinality. Rather than optimizing the model structure on the basis of the constraining data, we derive relative possibility measures that express our relative degree of belief in each model structure. These measures can then be used as weights for assessing predictive uncertainty when using a range of model structures, to estimate the flow distribution under varying stages, or for providing maps indicating fully distributed confidence limits in the risk assessment process. Such an approach is used here and helps to identify the circumstances under which two-dimensional modelling can be useful. The framework is not limited to the model structures that are developed herein, and more advanced process representation techniques can be included as computational efficiency increases. Copyright © 2001 John Wiley & Sons, Ltd.

18.
Keith Beven, Andrew Binley. Hydrological Processes, 2014, 28(24): 5897-5918
This paper reviews the use of the Generalized Likelihood Uncertainty Estimation (GLUE) methodology in the 20 years since the paper by Beven and Binley in Hydrological Processes in 1992, which is now one of the most highly cited papers in hydrology. The original conception, the on-going controversy it has generated, the nature of different sources of uncertainty and the meaning of the GLUE prediction uncertainty bounds are discussed. The hydrological, rather than statistical, arguments about the nature of model and data errors and uncertainties that are the basis for GLUE are emphasized. The application of the Institute of Hydrology distributed model to the Gwy catchment at Plynlimon presented in the original paper is revisited, using a larger sample of models, a wider range of likelihood evaluations and new visualization techniques. It is concluded that there are good reasons to reject this model for that data set. This is a positive result in a research environment in that it requires improved models or data to be made available. In practice, there may be ethical issues of using outputs from models for which there is evidence for model rejection in decision making. Finally, some suggestions for what is needed in the next 20 years are provided. © 2013 The Authors. Hydrological Processes published by John Wiley & Sons, Ltd.

19.
A decision-aiding methodology for agricultural groundwater management is presented; it is based on the combination of a watershed model, a groundwater flow model, and an optimization model. This methodology was applied to an agricultural watershed in northeastern Greece. The watershed model used was the Soil and Water Assessment Tool (SWAT), which provided recharge rates for the aquifers. These recharge rates were imported into the well-known MODFLOW groundwater flow model. Both models were calibrated and verified using field data. Then, the nonlinear optimization problem was solved by a piecewise linearization process, in which the Simplex algorithm was applied sequentially. In addition to several pumping and climate-change sensitivity scenarios, a land use change scenario and a climate change scenario combining the three models were tested, showing the ability of this methodology to be used in the decision-making process. Copyright © 2012 John Wiley & Sons, Ltd.
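A single linear step of the optimization stage can be sketched with scipy's linprog (a Simplex-style LP solver): maximise total pumping subject to a drawdown constraint that is assumed linear in the pumping rates. The response coefficients, bounds and drawdown limit are invented for illustration; the paper solves the full nonlinear problem by piecewise linearization with sequential Simplex runs.

```python
import numpy as np
from scipy.optimize import linprog

response = np.array([0.004, 0.007, 0.005])  # m drawdown per (m^3/day) pumped
max_drawdown = 2.0                          # allowed drawdown at control point

res = linprog(c=-np.ones(3),                # maximise the sum of pumping rates
              A_ub=[response], b_ub=[max_drawdown],
              bounds=[(0, 400)] * 3, method="highs")
print(res.x, -res.fun)                      # optimal rates and total pumping
```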

20.