Similar Articles
20 similar articles found.
1.
Gurdak JJ, McCray JE, Thyne G, Qi SL. Ground Water, 2007, 45(3): 348-361
A methodology is proposed to quantify prediction uncertainty associated with ground water vulnerability models that were developed through an approach that coupled multivariate logistic regression with a geographic information system (GIS). This method uses Latin hypercube sampling (LHS) to illustrate the propagation of input error and estimate uncertainty associated with the logistic regression predictions of ground water vulnerability. Central to the proposed method is the assumption that prediction uncertainty in ground water vulnerability models is a function of input error propagation from uncertainty in the estimated logistic regression model coefficients (model error) and the values of explanatory variables represented in the GIS (data error). Input probability distributions that represent both model and data error sources of uncertainty were simultaneously sampled using a Latin hypercube approach with logistic regression calculations of probability of elevated nonpoint source contaminants in ground water. The resulting probability distribution represents the prediction intervals and associated uncertainty of the ground water vulnerability predictions. The method is illustrated through a ground water vulnerability assessment of the High Plains regional aquifer. Results of the LHS simulations reveal significant prediction uncertainties that vary spatially across the regional aquifer. Additionally, the proposed method enables a spatial deconstruction of the prediction uncertainty that can lead to improved prediction of ground water vulnerability.  相似文献   
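As a rough illustration of this sampling scheme (not the authors' code), the Python sketch below draws logistic-regression coefficients from their estimated standard errors (model error) and GIS attribute values from assumed data-error distributions with a Latin hypercube, then summarizes the resulting distribution of vulnerability probabilities; all numerical values are placeholders.

import numpy as np
from scipy.stats import qmc, norm

beta_mean = np.array([-2.0, 0.8, 1.5])   # intercept + 2 explanatory variables (assumed values)
beta_se   = np.array([0.3, 0.1, 0.4])    # coefficient standard errors -> "model error"
x_mean    = np.array([1.0, 0.6])         # GIS attribute values at one cell (assumed)
x_sd      = np.array([0.2, 0.1])         # data error for those attributes

n = 1000
u = qmc.LatinHypercube(d=5, seed=1).random(n)            # 5 uncertain inputs, LHS in [0, 1)
beta = norm.ppf(u[:, :3], loc=beta_mean, scale=beta_se)  # transform to coefficient samples
x    = norm.ppf(u[:, 3:], loc=x_mean,   scale=x_sd)      # transform to attribute samples
eta  = beta[:, 0] + (beta[:, 1:] * x).sum(axis=1)
p    = 1.0 / (1.0 + np.exp(-eta))                        # probability of elevated contaminant

print("median:", np.median(p), "90% prediction interval:", np.percentile(p, [5, 95]))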

2.
MODFLOW 2000 head uncertainty, a first-order second moment method
A computationally efficient method to estimate the variance and covariance in piezometric head results computed through MODFLOW 2000 using a first-order second moment (FOSM) approach is presented. This methodology employs a first-order Taylor series expansion to combine model sensitivity with uncertainty in geologic data. MODFLOW 2000 is used to calculate both the ground water head and the sensitivity of head to changes in input data. From a limited number of samples, geologic data are extrapolated and their associated uncertainties are computed through a conditional probability calculation. Combining the spatially related sensitivity and input uncertainty produces the variance-covariance matrix, the diagonal of which is used to yield the standard deviation in MODFLOW 2000 head. The variance in piezometric head can be used for calibrating the model, estimating confidence intervals, directing exploration, and evaluating the reliability of a design. A case study illustrates the approach, where aquifer transmissivity is the spatially related uncertain geologic input data. The FOSM methodology is shown to be applicable for calculating output uncertainty for (1) spatially related input and output data, and (2) multiple input parameters (transmissivity and recharge).  相似文献   
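For reference, the first-order second moment propagation sketched above can be written compactly, with p the uncertain inputs (e.g. log-transmissivity and recharge), \bar p their mean values, C_p their covariance from the conditional geostatistical calculation, and J the sensitivity matrix computed by MODFLOW 2000:

J_{ij} = \left.\frac{\partial h_i}{\partial p_j}\right|_{\bar p}, \qquad \operatorname{Cov}(h) \approx J\,C_p\,J^{\mathsf T}, \qquad \sigma_{h_i} = \sqrt{\bigl[J\,C_p\,J^{\mathsf T}\bigr]_{ii}}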

3.
This paper investigates the effects of uncertainty in rock-physics models on reservoir parameter estimation using seismic amplitude variation with angle and controlled-source electromagnetics data. The reservoir parameters are related to electrical resistivity by the Poupon model and to elastic moduli and density by the Xu-White model. To handle uncertainty in the rock-physics models, we consider their outputs to be random functions with modes or means given by the predictions of those rock-physics models and we consider the parameters of the rock-physics models to be random variables defined by specified probability distributions. Using a Bayesian framework and Markov Chain Monte Carlo sampling methods, we are able to obtain estimates of reservoir parameters and information on the uncertainty in the estimation. The developed method is applied to a synthetic case study based on a layered reservoir model and the results show that uncertainty in both rock-physics models and in their parameters may have significant effects on reservoir parameter estimation. When the biases in rock-physics models and in their associated parameters are unknown, conventional joint inversion approaches, which consider rock-physics models as deterministic functions and the model parameters as fixed values, may produce misleading results. The developed stochastic method in this study provides an integrated approach for quantifying how uncertainty and biases in rock-physics models and in their associated parameters affect the estimates of reservoir parameters and therefore is a more robust method for reservoir parameter estimation.  相似文献   

4.
In this study, we evaluate uncertainties propagated through different climate data sets in seasonal and annual hydrological simulations over 10 subarctic watersheds of northern Manitoba, Canada, using the variable infiltration capacity (VIC) model. Further, we perform a comprehensive sensitivity and uncertainty analysis of the VIC model using a robust and state-of-the-art approach. The VIC model simulations utilize the recently developed variogram analysis of response surfaces (VARS) technique that requires in this application more than 6,000 model simulations for a 30-year (1981–2010) study period. The method seeks parameter sensitivity, identifies influential parameters, and showcases streamflow sensitivity to parameter uncertainty at seasonal and annual timescales. Results suggest that the Ensemble VIC simulations match observed streamflow closest, whereas global reanalysis products yield high flows (0.5–3.0 mm day−1) against observations and an overestimation (10–60%) in seasonal and annual water balance terms. VIC parameters exhibit seasonal importance in VARS, and the choice of input data and performance metrics substantially affect sensitivity analysis. Uncertainty propagation due to input forcing selection in each water balance term (i.e., total runoff, soil moisture, and evapotranspiration) is examined separately to show both time and space dimensionality in available forcing data at seasonal and annual timescales. Reliable input forcing, the most influential model parameters, and the uncertainty envelope in streamflow prediction are presented for the VIC model. These results, along with some specific recommendations, are expected to assist the broader VIC modelling community and other users of VARS and land surface schemes, to enhance their modelling applications.  相似文献   

5.
Probabilistic-fuzzy health risk modeling
Health risk analysis of multi-pathway exposure to contaminated water involves the use of mechanistic models that include many uncertain and highly variable parameters. Currently, the uncertainties in these models are treated using statistical approaches. However, not all uncertainties in data or model parameters are due to randomness. Other sources of imprecision that may lead to uncertainty include scarce or incomplete data, measurement error, data obtained from expert judgment, or subjective interpretation of available information. These kinds of uncertainties and also the non-random uncertainty cannot be treated solely by statistical methods. In this paper we propose the use of fuzzy set theory together with probability theory to incorporate uncertainties into the health risk analysis. We identify this approach as probabilistic-fuzzy risk assessment (PFRA). Based on the form of available information, fuzzy set theory, probability theory, or a combination of both can be used to incorporate parameter uncertainty and variability into mechanistic risk assessment models. In this study, tap water concentration is used as the source of contamination in the human exposure model. Ingestion, inhalation and dermal contact are considered as multiple exposure pathways. The tap water concentration of the contaminant and cancer potency factors for ingestion, inhalation and dermal contact are treated as fuzzy variables while the remaining model parameters are treated using probability density functions. Combined utilization of fuzzy and random variables produces membership functions of risk to individuals at different fractiles of risk as well as probability distributions of risk for various alpha-cut levels of the membership function. The proposed method provides a robust approach in evaluating human health risk to exposure when there is both uncertainty and variability in model parameters. PFRA allows utilization of certain types of information which have not been used directly in existing risk assessment methods.  相似文献   
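One minimal way to realize the combined fuzzy/probabilistic calculation described here is to push the endpoints of each alpha-cut of the fuzzy inputs through a Monte Carlo evaluation of the exposure equation. The sketch below does this for an ingestion-only pathway; the triangular fuzzy numbers, exposure distributions and constants are illustrative assumptions, not the authors' PFRA implementation.

import numpy as np
rng = np.random.default_rng(42)

def tri_alpha_cut(a, b, c, alpha):
    # interval of a triangular fuzzy number (a, b, c) at membership level alpha
    return a + alpha * (b - a), c - alpha * (c - b)

def mc_risk(conc, sf, n=10000):
    # probabilistic part of the ingestion-only risk equation (illustrative distributions)
    ir = rng.lognormal(mean=np.log(2.0), sigma=0.3, size=n)   # intake rate, L/day
    bw = rng.normal(70.0, 10.0, size=n)                       # body weight, kg
    ef, ed, at = 350.0, 30.0, 70.0 * 365.0                    # d/yr, yr, d
    return conc * ir * ef * ed * sf / (bw * at)

for alpha in (0.0, 0.5, 1.0):
    c_lo, c_hi = tri_alpha_cut(0.005, 0.010, 0.020, alpha)    # mg/L, fuzzy tap water concentration
    s_lo, s_hi = tri_alpha_cut(0.05, 0.10, 0.20, alpha)       # (mg/kg-day)^-1, fuzzy potency factor
    r_lo, r_hi = mc_risk(c_lo, s_lo), mc_risk(c_hi, s_hi)
    print(alpha, np.percentile(r_lo, 95), np.percentile(r_hi, 95))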

6.
Representation and quantification of uncertainty in climate change impact studies are a difficult task. Several sources of uncertainty arise in studies of hydrologic impacts of climate change, such as those due to choice of general circulation models (GCMs), scenarios and downscaling methods. Recently, much work has focused on uncertainty quantification and modeling in regional climate change impacts. In this paper, an uncertainty modeling framework is evaluated, which uses a generalized uncertainty measure to combine GCM, scenario and downscaling uncertainties. The Dempster–Shafer (D–S) evidence theory is used for representing and combining uncertainty from various sources. A significant advantage of the D–S framework over the traditional probabilistic approach is that it allows for the allocation of a probability mass to sets or intervals, and can hence handle both aleatory or stochastic uncertainty, and epistemic or subjective uncertainty. This paper shows how the D–S theory can be used to represent beliefs in some hypotheses such as hydrologic drought or wet conditions, describe uncertainty and ignorance in the system, and give a quantitative measurement of belief and plausibility in results. The D–S approach has been used in this work for information synthesis using various evidence combination rules having different conflict modeling approaches. A case study is presented for hydrologic drought prediction using downscaled streamflow in the Mahanadi River at Hirakud in Orissa, India. Projections of n most likely monsoon streamflow sequences are obtained from a conditional random field (CRF) downscaling model, using an ensemble of three GCMs for three scenarios, which are converted to monsoon standardized streamflow index (SSFI-4) series. This range is used to specify the basic probability assignment (bpa) for a Dempster–Shafer structure, which represents uncertainty associated with each of the SSFI-4 classifications. These uncertainties are then combined across GCMs and scenarios using various evidence combination rules given by the D–S theory. A Bayesian approach is also presented for this case study, which models the uncertainty in projected frequencies of SSFI-4 classifications by deriving a posterior distribution for the frequency of each classification, using an ensemble of GCMs and scenarios. Results from the D–S and Bayesian approaches are compared, and relative merits of each approach are discussed. Both approaches show an increasing probability of extreme, severe and moderate droughts and decreasing probability of normal and wet conditions in Orissa as a result of climate change.  相似文献   
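The evidence-combination step can be illustrated with Dempster's rule; the sketch below combines two hypothetical basic probability assignments over drought classes and then reads off belief and plausibility. All masses are invented for illustration and are not the study's GCM/scenario values.

from itertools import product

def dempster_combine(m1, m2):
    # Dempster's rule for two basic probability assignments over frozenset focal elements
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y                       # mass assigned to conflicting evidence
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# illustrative bpa's over drought classes from two GCM/scenario combinations
D, N, W = frozenset({"dry"}), frozenset({"normal"}), frozenset({"wet"})
theta = D | N | W                                   # the full frame of discernment
m_gcm1 = {D: 0.5, N: 0.2, theta: 0.3}
m_gcm2 = {D: 0.4, W: 0.2, theta: 0.4}
m = dempster_combine(m_gcm1, m_gcm2)

belief_dry = sum(v for k, v in m.items() if k <= D)   # Bel(dry): mass fully committed to "dry"
plaus_dry  = sum(v for k, v in m.items() if k & D)    # Pl(dry): mass not excluding "dry"
print(m, belief_dry, plaus_dry)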

7.
Site response analysis is strongly influenced by the uncertainty associated with the definition of soil properties and model parameters. Deterministic, or even parametric, analyses are unable to assess such uncertainty systematically, since site characterisation can hardly be accurate enough for a deterministic prediction of site response; alternative approaches are therefore needed. A fully stochastic procedure for estimating the site amplification of ground motion is proposed and applied to a case study in central Italy. The methodology accounts for record-to-record variability in the input ground motion as well as uncertainty in the dynamic soil properties and in the definition of the soil model. In particular, their effect on response spectra at the ground surface is evaluated.

8.
Simulation of rainfall-runoff process in urban areas is of great importance considering the consequences and damages of extreme runoff events and floods. The first issue in flood hazard analysis is rainfall simulation. Large scale climate signals have been proved to be effective in rainfall simulation and prediction. In this study, an integrated scheme is developed for rainfall-runoff modeling considering different sources of uncertainty. This scheme includes three main steps of rainfall forecasting, rainfall-runoff simulation and future runoff prediction. In the first step, data driven models are developed and used to forecast rainfall using large scale climate signals as rainfall predictors. Due to high effect of different sources of uncertainty on the output of hydrologic models, in the second step uncertainty associated with input data, model parameters and model structure is incorporated in rainfall-runoff modeling and simulation. Three rainfall-runoff simulation models are developed for consideration of model conceptual (structural) uncertainty in real time runoff forecasting. To analyze the uncertainty of the model structure, streamflows generated by alternative rainfall-runoff models are combined, through developing a weighting method based on K-means clustering. Model parameters and input uncertainty are investigated using an adaptive Markov Chain Monte Carlo method. Finally, calibrated rainfall-runoff models are driven using the forecasted rainfall to predict future runoff for the watershed. The proposed scheme is employed in the case study of the Bronx River watershed, New York City. Results of uncertainty analysis of rainfall-runoff modeling reveal that simultaneous estimation of model parameters and input uncertainty significantly changes the probability distribution of the model parameters. It is also observed that by combining the outputs of the hydrological models using the proposed clustering scheme, the accuracy of runoff simulation in the watershed is remarkably improved up to 50% in comparison to the simulations by the individual models. Results indicate that the developed methodology not only provides reliable tools for rainfall and runoff modeling, but also adequate time for incorporating required mitigation measures in dealing with potentially extreme runoff events and flood hazard. Results of this study can be used in identification of the main factors affecting flood hazard analysis.  相似文献   

9.
In this paper, we analyse the uncertainty and parameter sensitivity of a conceptual water quality model, based on a travel time distribution (TTD) approach, simulating electrical conductivity (EC) in the Duck River, Northwest Tasmania, Australia for a 2-year period. Dynamic TTDs of stream water were estimated using the StorAge Selection (SAS) approach, which was coupled with two alternate methods to model stream water EC: (1) a solute-balance approach and (2) a water age-based approach. Uncertainty analysis using the Differential Evolution Adaptive Metropolis (DREAM) algorithm showed that (1) parameter uncertainty was a small contribution to the overall uncertainty; (2) most uncertainty was related to input data uncertainty and model structure; (3) slightly lower total error was obtained in the water age-based model than in the solute-balance model; and (4) using time-variant SAS functions reduced the model uncertainty markedly, which likely reflects the effect of dynamic hydrological conditions over the year affecting the relative importance of different flow pathways over time. Model parameter sensitivity analysis using the Variogram Analysis of Response Surfaces (VARS-TOOL) framework found that parameters directly related to the EC concentration were most sensitive: the rainfall concentration Crain in the solute-balance model and the parameter controlling the rate of change of EC with age (λ) in the age-based model. Model parameters controlling the age mixes of both evapotranspiration and streamflow water fluxes (i.e., the SAS function parameters) were influential for the solute-balance model. Little change in parameter sensitivity over time was found for the age-based concentration relationship; however, the parameter sensitivity was quite dynamic over time for the solute-balance approach. The overarching outcomes provide water quality modellers, engineers and managers with greater insight into catchment functioning and its dependence on hydrological conditions.

10.
Naturally occurring long-term mean annual base recharge to ground water in Nebraska was estimated with the help of a water-balance approach and an objective automated technique for base-flow separation involving minimal parameter-optimization requirements. Base recharge is equal to total recharge minus the amount of evapotranspiration coming directly from ground water. The estimation of evapotranspiration in the water-balance equation avoids the need to specify a contributing drainage area for ground water, which in certain cases may be considerably different from the drainage area for surface runoff. Evapotranspiration was calculated by the WREVAP model at the Solar and Meteorological Surface Observation Network (SAMSON) sites. Long-term mean annual base recharge was derived by determining the product of estimated long-term mean annual runoff (the difference between precipitation and evapotranspiration) and the base-flow index (BFI). The BFI was calculated from discharge data obtained from the U.S. Geological Survey's gauging stations in Nebraska. Mapping was achieved by using geographic information systems (GIS) and geostatistics. This approach is best suited for regional-scale applications. It does not require complex hydrogeologic modeling nor detailed knowledge of soil characteristics, vegetation cover, or land-use practices. Long-term mean annual base recharge rates in excess of 110 mm/year resulted in the extreme eastern part of Nebraska. The western portion of the state expressed rates of only 15 to 20 mm annually, while the Sandhills region of north-central Nebraska was estimated to receive twice as much base recharge (40 to 50 mm/year) as areas south of it.  相似文献   
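In equation form, and with illustrative numbers that are not taken from the paper, the base-recharge calculation is:

R_{\mathrm{base}} = (P - ET)\times \mathrm{BFI}, \qquad \text{e.g.}\quad (700 - 560)\ \mathrm{mm/yr} \times 0.5 = 70\ \mathrm{mm/yr}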

11.
A probabilistic characterization method for seismic lithofacies identification
Lithofacies distribution is an important parameter in reservoir characterization, and identifying reservoir lithofacies from seismic data is usually subject to considerable uncertainty. Traditional methods yield only a single, deterministic lithofacies distribution and cannot resolve the uncertainty of the inversion results, which increases the risk in reservoir evaluation. This paper introduces a multi-step, probability-based inversion method for seismic lithofacies identification: statistical relationships between the input and output quantities are established at each step, and the probabilistic information from all steps is then fused to construct the conditional probability relationship between seismic data and reservoir lithofacies, from which the probability of the lithofacies distribution is inverted. Compared with traditional methods, the proposed method characterizes the uncertainty of the geophysical response at each step of seismic lithofacies identification through probabilistic relationships and, by fusing the probabilistic information of all steps, numerically simulates the propagation of uncertainty. The final inverted lithofacies probabilities objectively and accurately reflect the uncertainty of the lithofacies identification results and provide important reference information for reservoir evaluation and reservoir modelling. Applications to synthetic and field data verify the effectiveness of the method.
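The paper's exact fusion scheme is not reproduced here, but the flavour of chaining step-wise statistical relationships into a conditional probability of facies given seismic data can be sketched with a simple Bayesian fusion that assumes the steps are conditionally independent; all probabilities below are invented.

import numpy as np

facies = ["shale", "brine sand", "gas sand"]
prior = np.array([0.6, 0.3, 0.1])            # prior facies proportions (illustrative)
lik_impedance = np.array([0.2, 0.5, 0.8])    # P(observed impedance class | facies), assumed
lik_vp_vs     = np.array([0.3, 0.4, 0.7])    # P(observed Vp/Vs class | facies), assumed

# fuse the step-wise probabilities into P(facies | seismic data), assuming independence
post = prior * lik_impedance * lik_vp_vs
post /= post.sum()
print(dict(zip(facies, np.round(post, 3))))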

12.
Estimating and mapping spatial uncertainty of environmental variables is crucial for environmental evaluation and decision making. For a continuous spatial variable, estimation of spatial uncertainty may be conducted in the form of estimating the probability of (not) exceeding a threshold value. In this paper, we introduced a Markov chain geostatistical approach for estimating threshold-exceeding probabilities. The differences of this approach compared to the conventional indicator approach lie with its nonlinear estimators—Markov chain random field models and its incorporation of interclass dependencies through transiograms. We estimated threshold-exceeding probability maps of clay layer thickness through simulation (i.e., using a number of realizations simulated by Markov chain sequential simulation) and interpolation (i.e., direct conditional probability estimation using only the indicator values of sample data), respectively. To evaluate the approach, we also estimated those probability maps using sequential indicator simulation and indicator kriging interpolation. Our results show that (i) the Markov chain approach provides an effective alternative for spatial uncertainty assessment of environmental spatial variables and the probability maps from this approach are more reasonable than those from conventional indicator geostatistics, and (ii) the probability maps estimated through sequential simulation are more realistic than those through interpolation because the latter display some uneven transitions caused by spatial structures of the sample data.  相似文献   
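The simulation route to a threshold-exceeding probability map reduces to counting, across realizations, how often each cell exceeds the threshold; a minimal sketch with random placeholder realizations (not Markov chain simulations) follows.

import numpy as np

rng = np.random.default_rng(7)
realizations = rng.gamma(shape=2.0, scale=1.5, size=(200, 50, 50))  # 200 simulated maps, 50x50 grid
threshold = 3.0                                                     # clay thickness threshold, metres (illustrative)

p_exceed = (realizations > threshold).mean(axis=0)   # per-cell exceedance probability
print(p_exceed.shape, p_exceed.min(), p_exceed.max())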

13.
2D Monte Carlo versus 2D Fuzzy Monte Carlo health risk assessment
Risk estimates can be calculated using crisp estimates of the exposure variables (i.e., contaminant concentration, contact rate, exposure frequency and duration, body weight, and averaging time). However, aggregate and cumulative exposure studies require a better understanding of exposure variables and the uncertainty and variability associated with them. Probabilistic risk assessment (PRA) studies use probability distributions for one or more variables of the risk equation in order to quantitatively characterize variability and uncertainty. Two-dimensional Monte Carlo Analysis (2D MCA) is one of the advanced modeling approaches that may be used to conduct PRA studies. In this analysis, the variables of the risk equation along with the parameters of these variables (for example, the mean and standard deviation of a normal distribution) are described in terms of probability density functions (PDFs). A variable described in this way is called a second-order random variable. Significant data or considerable insight into the uncertainty associated with these variables is necessary to develop the appropriate PDFs for these random parameters. Typically, the available data, and the accuracy and reliability of such data, are not sufficient for conducting a reliable 2D MCA. Thus, other theories and computational methods that propagate uncertainty and variability in exposure and health risk assessment are needed. One such theory is possibility analysis based on fuzzy set theory, which allows the utilization of incomplete information (incomplete information includes vague and imprecise information that is not sufficient to generate probability distributions for the parameters of the random variables of the risk equation) together with expert judgment. In this paper, as an alternative to 2D MCA, we propose a 2D Fuzzy Monte Carlo Analysis (2D FMCA) to overcome this difficulty. In this approach, instead of describing the parameters of PDFs used in defining the variables of the risk equation as random variables, we describe them as fuzzy numbers. This approach introduces new concepts and risk characterization methods. In this paper we provide a comparison of these two approaches relative to their computational requirements, data requirements and availability. For a hypothetical case, we also provide a comparative interpretation of the results generated.
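A bare-bones two-dimensional Monte Carlo loop of the kind discussed here, with the outer loop sampling uncertain distribution parameters and the inner loop sampling population variability, might look like the following; the distributions, numerical values and slope factor are illustrative only.

import numpy as np
rng = np.random.default_rng(1)

n_outer, n_inner = 200, 2000
p95 = np.empty(n_outer)
for i in range(n_outer):
    # outer loop: uncertainty about the distribution parameters themselves
    mu_c = rng.normal(0.010, 0.002)                        # mean concentration, mg/L (assumed)
    sd_c = rng.uniform(0.002, 0.006)
    # inner loop: variability across the exposed population
    conc = rng.normal(mu_c, sd_c, size=n_inner).clip(min=0.0)
    ir   = rng.lognormal(np.log(2.0), 0.3, size=n_inner)   # intake rate, L/day
    bw   = rng.normal(70.0, 10.0, size=n_inner)            # body weight, kg
    risk = conc * ir * 0.1 * 350 * 30 / (bw * 70 * 365)    # slope factor 0.1 (mg/kg-day)^-1, assumed
    p95[i] = np.percentile(risk, 95)

# spread of the 95th-percentile risk across outer draws reflects uncertainty about variability
print(np.percentile(p95, [5, 50, 95]))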

14.
M5 model tree based modelling of reference evapotranspiration
This paper investigates the potential of an M5 model tree based regression approach to model daily reference evapotranspiration using climatic data of the Davis station maintained by the California Irrigation Management Information System (CIMIS). Four inputs were used (solar radiation, average air temperature, average relative humidity, and average wind speed), with reference evapotranspiration calculated using a relation provided by CIMIS as the output. To compare the performance of the M5 model tree in predicting reference evapotranspiration, the FAO–56 Penman–Monteith equation and a calibrated Hargreaves–Samani relation were used. A comparison of results suggests that the M5 model tree approach works well in comparison to both the FAO–56 and the calibrated Hargreaves–Samani relations. To judge the generalization capability of the M5 model tree approach, the model created using the Davis data set was tested with the data sets of four different sites. Results from this part of the study suggest that the M5 model tree could successfully be employed in modeling reference evapotranspiration. Further, sensitivity analysis with the M5 model tree approach suggests the suitability of solar radiation, average air temperature, average relative humidity, and average wind speed as input parameters for modeling reference evapotranspiration. Copyright © 2009 John Wiley & Sons, Ltd.
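For context, the Hargreaves–Samani relation mentioned above has the standard form ET0 = k · Ra · (Tmean + 17.8) · (Tmax − Tmin)^0.5, with Ra the extraterrestrial radiation expressed as equivalent evaporation; local calibration typically adjusts the coefficient k (0.0023 by default). A small sketch with illustrative inputs rather than CIMIS data:

def hargreaves_samani(ra_mm_day, t_max, t_min, k_hs=0.0023):
    # standard Hargreaves-Samani form; k_hs is the coefficient that local calibration adjusts
    t_mean = 0.5 * (t_max + t_min)
    return k_hs * ra_mm_day * (t_mean + 17.8) * (t_max - t_min) ** 0.5

# illustrative summer day: Ra = 17 mm/day equivalent, Tmax = 32 C, Tmin = 15 C
print(round(hargreaves_samani(17.0, 32.0, 15.0), 2), "mm/day")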

15.
Factorial two-stage stochastic programming for water resources management
This study presents a factorial two-stage stochastic programming (FTSP) approach for supporting water resource management under uncertainty. FTSP is developed through the integration of factorial analysis and two-stage stochastic programming (TSP) methods into a general modeling framework. It can handle uncertainties expressed as probability distributions and interval numbers. This approach has two advantages in comparison to conventional inexact TSP methods. Firstly, FTSP inherits merits of conventional inexact two-stage optimization approaches. Secondly, it can provide detailed effects of uncertain parameters and their interactions on the system performance. The developed FTSP method is applied to a hypothetical case study of water resources systems analysis. The results indicate that significant factors and their interactions can be identified. They can be further analyzed for generating water allocation decision alternatives in municipal, industrial and agricultural sectors. Reasonable water allocation schemes can thus be formulated based on the resulting information of detailed effects from various impact factors and their interactions. Consequently, maximized net system benefit can be achieved.  相似文献   

16.
Traditional accuracy assessment of satellite-derived maps relies on a confusion matrix and its associated indices, built by comparing ground truth observations and classification outputs at specific locations. These indices may be applied at the map level or at the class level. However, the spatial variation of the accuracy is not captured by those statistics. Pixel-level thematic uncertainty measures derived from class membership probability vectors can provide such spatially explicit information. In this paper, a new information-based criterion, the equivalent reference probability, is introduced to provide a synoptic thematic uncertainty measure that has the advantage of taking the maximum probability value into account while still accounting for the full set of probabilities. The fundamental theoretical properties of this indicator were first highlighted, and its use was then demonstrated on a real case study in Belgium. Results showed that the proposed approach correlates positively with the quality of the classification and is more sensitive than the classical maximum probability criterion. As this information-based criterion can be used to provide spatially explicit maps of thematic uncertainty, it offers substantial additional information regarding classification quality compared to conventional quality measures. Accordingly, it proved to be useful for both end-users and map producers as a way to better understand the nature of the errors and to subsequently improve map quality.

17.
In previous work, we presented a method for estimation and correction of non-linear mathematical model structures, within a Bayesian framework, by merging uncertain knowledge about process physics with uncertain and incomplete observations of dynamical input-state-output behavior. The resulting uncertainty in the model input-state-output mapping is expressed as a weighted combination of an uncertain conceptual model prior and a data-derived probability density function, with weights depending on the conditional data density. Our algorithm is based on the use of iterative data assimilation to update a conceptual model prior using observed system data, and thereby construct a posterior estimate of the model structure (the mathematical form of the equation itself, not just its parameters) that is consistent both with physically based prior knowledge and with the information in the data. An important aspect of the approach is that it facilitates a clear differentiation between the influences of different types of uncertainties (initial condition, input, and mapping structure) on the model prediction. Further, if some prior assumptions regarding the structural (mathematical) forms of the model equations exist, the procedure can help reveal errors in those forms and how they should be corrected. This paper examines the properties of the approach by investigating two case studies in considerable detail. The results show how, and to what degree, the structure of a dynamical hydrological model can be estimated with little or no prior knowledge (or under conditions of incorrect prior information) regarding the functional forms of the storage–streamflow and storage–evapotranspiration relationships. The importance and implications of careful specification of the model prior are illustrated and discussed.

18.
Nowadays, Flood Forecasting and Warning Systems (FFWSs) are known as among the most inexpensive and efficient non-structural measures for flood damage mitigation in the world. The benefit-to-cost ratio of FFWSs has been reported to be several times that of other flood mitigation measures. Besides these advantages, uncertainty in flood predictions may affect the reliability and the benefits of these systems. Determining the reliability of advanced flood warning systems based on rainfall–runoff models is a challenge in assessing FFWS performance and is the subject of this study. In this paper, a stochastic methodology is proposed to provide the uncertainty band of the rainfall–runoff model and to calculate the probability of acceptable forecasts. The proposed method is based on Monte Carlo simulation and multivariate analysis of the predicted time and discharge error data sets. For this purpose, after the calibration of the rainfall–runoff model, the probability distributions of the input calibration parameters and the uncertainty band of the model are estimated through Bayesian inference. Then, data sets of the time and discharge errors are calculated using Monte Carlo simulation, and the probability of acceptable model forecasts is calculated by multivariate analysis of the data using copula functions. The proposed approach was applied to a small watershed in Iran as a case study. The results showed that rainfall–runoff modeling based on real-time precipitation alone is not enough to attain high performance for FFWSs in small watersheds, and that using weather forecasts as inputs to the rainfall–runoff models appears essential to increase lead times and the reliability of FFWSs in small watersheds. Copyright © 2013 John Wiley & Sons, Ltd.
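One plausible (hypothetical) realization of the copula step is a Gaussian copula fitted to the time- and discharge-error samples, from which the joint probability that both errors fall within acceptance tolerances is estimated; the normal marginals, tolerances and synthetic error samples below are assumptions, not the paper's data.

import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# placeholder error samples from Monte Carlo runs of a calibrated model
t_err = rng.normal(0.0, 1.5, 500)                     # peak-time error, hours
q_err = 0.6 * t_err + rng.normal(0.0, 2.0, 500)       # peak-discharge error, %, correlated

# Gaussian copula: dependence from Kendall's tau, marginals fitted separately
tau, _ = stats.kendalltau(t_err, q_err)
rho = np.sin(np.pi * tau / 2.0)
mt, st = stats.norm.fit(t_err)
mq, sq = stats.norm.fit(q_err)

z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=100000)
u = stats.norm.cdf(z)                                 # copula samples in the unit square
t_sim = stats.norm.ppf(u[:, 0], mt, st)               # back-transform to error units
q_sim = stats.norm.ppf(u[:, 1], mq, sq)

acceptable = (np.abs(t_sim) <= 2.0) & (np.abs(q_sim) <= 3.0)   # tolerances are illustrative
print("P(acceptable forecast):", acceptable.mean())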

19.
The Gassmann relations of poroelasticity provide a connection between the dry and the saturated elastic moduli of porous rock and are useful in a variety of petroleum geoscience applications. Because some uncertainty is usually associated with the input parameters, the propagation of error in the inputs into the final moduli estimates is immediately of interest. Two common approaches to error propagation are a first-order Taylor series expansion and Monte Carlo methods. The Taylor series approach requires derivatives, which are obtained either analytically or numerically, and is usually limited to a first-order analysis. The formulae for analytical derivatives were often prohibitively complicated before modern symbolic computation packages became prevalent, but they are now more accessible. We apply this method and present formulae for uncertainty in the predicted bulk and shear moduli for two forms of the Gassmann relations. Numerical results obtained with these uncertainty formulae are compared with Monte Carlo calculations as a form of validation and to illustrate the relative characteristics of the two approaches. Particular emphasis is given to the problem of correlated variables, which are often ignored in naïve approaches to error analysis. Up to the error levels at which the two methods were compared, the means agree, and the variance of the Monte Carlo estimate of the bulk modulus grows with input error.
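A sketch of symbolic first-order propagation through the Gassmann relation for the bulk modulus, including one correlated input pair, is given below; the moduli, standard deviations and correlation are illustrative values, not the paper's formulae or results.

import sympy as sp

Kdry, Kmin, Kfl, phi = sp.symbols('K_dry K_min K_fl phi', positive=True)
# standard Gassmann saturated bulk modulus
Ksat = Kdry + (1 - Kdry/Kmin)**2 / (phi/Kfl + (1 - phi)/Kmin - Kdry/Kmin**2)

params = (Kdry, Kmin, Kfl, phi)
grad = [sp.diff(Ksat, p) for p in params]

# first-order (Taylor) variance:  var(K_sat) ~ g^T C g, with C the input covariance matrix;
# illustrative values in GPa and fraction; off-diagonal terms carry the correlations
vals = {Kdry: 12.0, Kmin: 37.0, Kfl: 2.2, phi: 0.25}
sig  = [1.0, 2.0, 0.2, 0.02]                          # standard deviations (assumed)
corr = sp.Matrix([[1, 0, 0, -0.5],                    # e.g. K_dry negatively correlated with phi
                  [0, 1, 0, 0],
                  [0, 0, 1, 0],
                  [-0.5, 0, 0, 1]])
C = sp.Matrix(4, 4, lambda i, j: corr[i, j] * sig[i] * sig[j])
g = sp.Matrix([d.subs(vals) for d in grad])
var_Ksat = (g.T * C * g)[0, 0]
print(float(Ksat.subs(vals)), float(var_Ksat) ** 0.5)  # K_sat estimate and its standard deviation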

20.
Conceptual hydrological models are popular tools for simulating the land phase of the hydrological cycle. Uncertainty arises from a variety of sources such as input error, calibration and parameters. Hydrologic modeling research indicates that parametric uncertainty is one of the most important sources. The objective of this study was to evaluate parameter uncertainty and its propagation in rainfall-runoff modeling. This study modeled daily flows and calculated uncertainty bounds for the Karoon-III basin, southwest Iran, using HEC-HMS (SMA). The parameters were represented by probability distribution functions (PDFs), and the effect on simulated runoff was investigated using Latin Hypercube Sampling (LHS) within a Monte Carlo (MC) framework. The three parameters chosen, based on sensitivity analysis, were saturated hydraulic conductivity (Ks), the Clark storage coefficient (R) and the time of concentration (tc). Uncertainty associated with the parameters was accounted for by representing each with a probability distribution. Uncertainty bounds were calculated using parameter sets drawn by LHS from the parameter PDFs of the sub-basins and propagated through the model. Results showed that the maximum reliability (11%) resulted from propagating Ks. For all three parameters, underestimation was more frequent than overestimation. The maximum sharpness and standard deviation (STD) also resulted from propagating Ks. The cumulative distribution function (CDF) of flow and the uncertainty bounds showed that as flow increased, the width of the uncertainty bounds increased for all parameters.
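Reliability (the share of observations falling inside the bounds) and sharpness (the mean bound width) can be computed from an LHS ensemble as sketched below; the ensemble and observations are random placeholders, not the Karoon-III results.

import numpy as np
rng = np.random.default_rng(5)

# placeholder ensemble: 500 LHS parameter sets -> 365 daily flows each (m3/s)
ensemble = rng.lognormal(mean=3.0, sigma=0.4, size=(500, 365))
observed = rng.lognormal(mean=3.0, sigma=0.3, size=365)

lower = np.percentile(ensemble, 2.5, axis=0)
upper = np.percentile(ensemble, 97.5, axis=0)

reliability = np.mean((observed >= lower) & (observed <= upper))   # fraction of days bracketed
sharpness   = np.mean(upper - lower)                               # mean width of the 95% band
print(f"reliability = {reliability:.0%}, sharpness = {sharpness:.1f} m3/s")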
