Similar Literature
20 similar records found
1.
2.
The log-Pearson type 3 distribution is widely used in North America and Australia for fitting annual flood series. Four different versions of the method of moments used in fitting this distribution are compared using Monte Carlo simulated samples which reflect some of the characteristics of annual flood series observed at some Canadian rivers. The bias, standard error, root mean square error, and skew of the parameter estimates, and of estimates of events associated with different probabilities of occurrence, are examined, along with the correlation coefficients between the parameter estimates and between the sample moments used in each of the four methods of estimation. It is observed that variances, covariances and correlation coefficients calculated using the usual first-order asymptotic approximation may have considerable error and should therefore be used with caution. On the basis of the mean square error of events with return periods beyond the range covered by the sample, it is observed that a method proposed earlier which uses moments of order 1, 2 and 3 in real space performs better than the other three methods, even though some of those methods follow the recommendation put forward by some investigators that higher-order moments (order 3 or more) should be avoided in flood frequency estimation. The present study argues that higher-order moments should not be avoided simply because they are highly variable: the variability of the estimated design flood events is determined not only by the variability of the moments but also by the correlation between them. Some recommendations are given at the end of the study aimed at achieving better efficiency in flood frequency research at a time when more and more distributions and methods of estimation are being proposed.
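As a concrete illustration of moment-based fitting, here is a minimal sketch of the classical log-space method of moments for the log-Pearson type 3 distribution; the abstract's preferred variant matches moments of order 1, 2 and 3 in real space instead, which is not reproduced here, and the flood series below is hypothetical.

```python
import numpy as np
from scipy import stats

peaks = np.array([420., 310., 980., 560., 1250., 720., 480., 890., 640., 1100.])  # m^3/s, hypothetical
y = np.log(peaks)

# Sample moments of the logarithms
mean, std = y.mean(), y.std(ddof=1)
skew = stats.skew(y, bias=False)

# scipy's pearson3 is parametrized directly by (skew, loc=mean, scale=std),
# so the log-space method-of-moments fit is immediate.
lp3_log = stats.pearson3(skew, loc=mean, scale=std)

# 100-year flood: exceedance probability 0.01 in log space, back-transformed
q100 = np.exp(lp3_log.ppf(1 - 1 / 100))
print(f"Estimated 100-year flood: {q100:.0f} m^3/s")
```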

3.
Selection of a flood frequency distribution and associated parameter estimation procedure is an important step in flood frequency analysis. This is, however, a difficult task due to problems in selecting the best-fit distribution from the large number of candidate distributions and parameter estimation procedures available in the literature. This paper presents a case study with flood data from Tasmania in Australia, which examines four model selection criteria: the Akaike Information Criterion (AIC), the Akaike Information Criterion second-order variant (AICc), the Bayesian Information Criterion (BIC) and a modified Anderson–Darling Criterion (ADC). It has been found from Monte Carlo simulation that the ADC is more successful in recognizing the parent distribution correctly than the AIC and BIC when the parent is a three-parameter distribution; on the other hand, the AIC and BIC are better at recognizing the parent distribution correctly when the parent is a two-parameter distribution. From the seven probability distributions examined for Tasmania, it has been found that two-parameter distributions are preferable to three-parameter ones, with the Log Normal appearing to be the best choice. The paper also evaluates the three most widely used parameter estimation procedures for the Log Normal distribution: the method of moments (MOM), maximum likelihood estimation (MLE) and the Bayesian Markov Chain Monte Carlo method (BAY). It has been found that the BAY procedure provides better parameter estimates for the Log Normal distribution, resulting in flood quantile estimates with smaller bias and standard error than the MOM and MLE. The findings from this study would be useful in flood frequency analyses in other Australian states and other countries, in particular when selecting an appropriate probability distribution from a number of alternatives.
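A hedged sketch of the AIC/BIC part of such a model selection exercise, using scipy's maximum likelihood fits. The AICc and modified Anderson–Darling criteria used in the paper are not reproduced, and the annual maximum series is simulated rather than Tasmanian data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
peaks = stats.lognorm.rvs(s=0.6, scale=500, size=40, random_state=rng)  # hypothetical AM series

# (name, distribution, fit keyword args, number of fitted parameters k)
candidates = [
    ("Log Normal (2p)",  stats.lognorm,    {"floc": 0}, 2),
    ("Gumbel (2p)",      stats.gumbel_r,   {},          2),
    ("GEV (3p)",         stats.genextreme, {},          3),
    ("Pearson III (3p)", stats.pearson3,   {},          3),
]
n = len(peaks)
for name, dist, kw, k in candidates:
    params = dist.fit(peaks, **kw)                 # maximum likelihood fit
    loglik = dist.logpdf(peaks, *params).sum()
    aic = 2 * k - 2 * loglik                       # smaller is better
    bic = k * np.log(n) - 2 * loglik
    print(f"{name:17s} AIC={aic:8.1f}  BIC={bic:8.1f}")
```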

4.
Regression-based regional flood frequency analysis (RFFA) methods are widely adopted in hydrology. This paper compares two regression-based RFFA methods within a Bayesian generalized least squares (GLS) modelling framework: the quantile regression technique (QRT) and the parameter regression technique (PRT). In this study, the QRT focuses on the development of prediction equations for flood quantiles in the range of 2 to 100 years average recurrence interval (ARI), while the PRT develops prediction equations for the first three moments of the log Pearson Type 3 (LP3) distribution, namely the mean, standard deviation and skew of the logarithms of the annual maximum flows; these regional parameters are then used to fit the LP3 distribution to estimate the desired flood quantiles at a given site. It has been shown that, using a method similar to stepwise regression and employing statistics such as the model error variance, average variance of prediction, Bayesian information criterion and Akaike information criterion, the best set of explanatory variables in the GLS regression can be identified. A range of statistics and diagnostic plots have been adopted to evaluate the regression models. The method has been applied to 53 catchments in Tasmania, Australia. It has been found that catchment area and design rainfall intensity are the most important explanatory variables in predicting flood quantiles using the QRT. For the PRT, a total of four explanatory variables were adopted for predicting the mean, standard deviation and skew. The developed regression models satisfy the underlying model assumptions quite well; importantly, no outlier sites are detected in the regression diagnostics of the adopted regression equations. Based on one-at-a-time cross validation and a number of evaluation statistics, it has been found that for Tasmania the QRT provides more accurate flood quantile estimates for the higher ARIs, while the PRT provides relatively better estimates for the smaller ARIs. The RFFA techniques presented here can easily be adapted to other Australian states and countries to derive more accurate regional flood predictions. Copyright © 2011 John Wiley & Sons, Ltd.
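To make the QRT idea concrete, here is a simplified sketch regressing log flood quantiles on log catchment descriptors. Ordinary least squares is used as a stand-in for the paper's Bayesian GLS framework, and the catchment data are entirely hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
n_sites = 53
area = rng.uniform(10, 1000, n_sites)        # catchment area, km^2 (hypothetical)
intensity = rng.uniform(20, 80, n_sites)     # design rainfall intensity, mm/h (hypothetical)
log_q100 = 0.8 * np.log(area) + 1.2 * np.log(intensity) + rng.normal(0, 0.3, n_sites)

# Design matrix with intercept; solve by least squares (stand-in for GLS).
X = np.column_stack([np.ones(n_sites), np.log(area), np.log(intensity)])
beta, *_ = np.linalg.lstsq(X, log_q100, rcond=None)
print("intercept, log-area and log-intensity coefficients:", beta)

# Prediction at an ungauged site with hypothetical descriptors
x_new = np.array([1.0, np.log(250.0), np.log(45.0)])
print("predicted 100-year flood:", np.exp(x_new @ beta), "(ignoring retransformation bias)")
```

The PRT would instead fit three such regressions, one for each LP3 moment, and then compute any quantile from the fitted distribution.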

5.
Abstract

Flood frequency analysis can be carried out using two types of flood peak series, i.e. the annual maximum (AM) and the peaks-over-threshold (POT) series. This study presents a comparison of the results of both methods for data from the Litija 1 gauging station on the Sava River in Slovenia. Six commonly used distribution functions and three different parameter estimation techniques were considered in the AM analyses. The results showed a better performance for the method of L-moments (ML) when compared with conventional moments and maximum likelihood estimation. The combination of the ML and the log-Pearson type 3 distribution gave the best results of all the considered AM cases. The POT method gave better results than the AM method. The binomial distribution did not offer any noticeable improvement over the Poisson distribution for modelling the annual number of exceedances above the threshold.
Editor D. Koutsoyiannis

Citation Bezak, N., Brilly, M., and Šraj, M., 2014. Comparison between the peaks-over-threshold method and the annual maximum method for flood frequency analysis. Hydrological Sciences Journal, 59 (5), 959–977.
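A minimal sketch of the POT side of such a comparison, assuming a hypothetical daily record: count exceedances over a high threshold per year and check whether the annual number of peaks is roughly Poisson (mean close to variance), the question the abstract answers for the Litija 1 record.

```python
import numpy as np

rng = np.random.default_rng(7)
years = 30
daily = rng.gamma(shape=2.0, scale=50.0, size=(years, 365))  # hypothetical daily flows
threshold = np.quantile(daily, 0.995)

# Count exceedances per year (a real analysis would also enforce
# independence between peaks, e.g. a minimum separation in days).
counts = (daily > threshold).sum(axis=1)
print(f"threshold={threshold:.1f}, mean count={counts.mean():.2f}, "
      f"variance={counts.var(ddof=1):.2f}")
# mean close to variance supports the Poisson model for the annual number of
# peaks; variance noticeably below the mean would favour the binomial alternative.
```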

6.
The log-Gumbel distribution is one of the extreme value distributions widely used in flood frequency analysis. This paper examines the distribution with regard to quantile estimation and confidence intervals of quantiles. Specific estimation algorithms based on the methods of moments (MOM), probability weighted moments (PWM) and maximum likelihood (ML) are presented. The applicability of the estimation procedures and a comparison among the methods are illustrated with an application example using flood data from the St. Mary's River.
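A hedged sketch of two of the three estimators for the log-Gumbel distribution, both applied to the logarithms of a hypothetical flood series (not the St. Mary's River record): closed-form method of moments and scipy's maximum likelihood.

```python
import numpy as np
from scipy import stats

peaks = np.array([310., 540., 420., 980., 760., 455., 1210., 615., 385., 870.])  # hypothetical
y = np.log(peaks)

# MOM for the Gumbel distribution of the log-flows:
#   mean = loc + 0.5772*scale,  variance = (pi^2/6)*scale^2
scale_mom = y.std(ddof=1) * np.sqrt(6) / np.pi
loc_mom = y.mean() - 0.5772156649 * scale_mom

# MLE via scipy on the log-transformed sample
loc_mle, scale_mle = stats.gumbel_r.fit(y)

for tag, loc, scale in [("MOM", loc_mom, scale_mom), ("MLE", loc_mle, scale_mle)]:
    q100 = np.exp(stats.gumbel_r.ppf(0.99, loc, scale))  # back-transform to real space
    print(f"{tag}: 100-year quantile = {q100:.0f} m^3/s")
```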

7.
The index flood procedure coupled with the L-moments method is applied to the annual flood peaks data taken at all stream-gauging stations in Turkey having at least 15-year-long records. First, screening of the data is done based on the discordancy measure (Di) in terms of the L-moments. Homogeneity of the total geographical area of Turkey is tested using the L-moments based heterogeneity measure, H, computed on 500 simulations generated using the four-parameter Kappa distribution. The L-moments analysis of the recorded annual flood peaks data at 543 gauged sites indicates that Turkey as a whole is hydrologically heterogeneous, and 45 of the 543 gauged sites are discordant; these are discarded from further analyses. The catchment areas of these 543 sites vary from 9.9 to 75 121 km² and their mean annual peak floods vary from 1.72 to 3739.5 m³ s⁻¹. The probability distributions used in the analyses, whose parameters are computed by the L-moments method, are the generalized extreme value (GEV), generalized logistic (GLO), generalized normal (GNO), Pearson type III (PE3), generalized Pareto (GPA), and five-parameter Wakeby (WAK). Based on the L-moment ratio diagrams and the |Zdist|-statistic criteria, the GEV distribution is identified as the robust distribution for the study area (498 gauged sites). Hence, for estimation of flood magnitudes of various return periods in Turkey, a regional flood frequency relationship is developed using the GEV distribution. Next, the quantiles computed at all 543 gauged sites by the GEV and the Wakeby distributions are compared with the observed values of the same probability based on two criteria, mean absolute relative error and coefficient of determination. Results of these comparisons indicate that both the GEV and Wakeby distributions, whose parameters are computed by the L-moments method, are adequate in predicting quantile estimates. Copyright © 2011 John Wiley & Sons, Ltd.
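The discordancy and heterogeneity measures used above are built from sample L-moments. The following sketch computes them from unbiased probability weighted moments for a single hypothetical site.

```python
import numpy as np

def sample_lmoments(x):
    """Return (l1, l2, L-CV, L-skewness) from unbiased PWMs b0, b1, b2."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    j = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((j - 1) / (n - 1) * x) / n
    b2 = np.sum((j - 1) * (j - 2) / ((n - 1) * (n - 2)) * x) / n
    l1 = b0
    l2 = 2 * b1 - b0
    l3 = 6 * b2 - 6 * b1 + b0
    return l1, l2, l2 / l1, l3 / l2

rng = np.random.default_rng(3)
site = rng.gumbel(loc=300, scale=120, size=25)   # hypothetical annual peaks
l1, l2, lcv, lskew = sample_lmoments(site)
print(f"L-mean={l1:.1f}, L-scale={l2:.1f}, L-CV={lcv:.3f}, L-skew={lskew:.3f}")
```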

8.
The best information on which to base estimates of future flood frequencies is records of past flood events. Where there is a substantial record at the location for which estimates are desired, the estimation process is generally straightforward, although a variety of methods is used and there is major uncertainty in the estimates. In general, the frequency of future events is assumed to be indicated by the observed frequency of past events under constant controlling watershed conditions.

Techniques are available for using information on historical (pre-record) flood data to improve the reliability of flood frequency estimates. There are methods for detecting and managing extremely unusual actual events (outliers) and for improving the reliability of short-record estimates based on long-record data at related locations. Regional correlation analysis can be used to establish flood frequency estimates for locations where records are not available.

Detailed hydrologic analysis, usually involving rainfall-runoff studies, is required for establishing flood frequency relationships for modified conditions of the watershed or, in many cases, for establishing flood frequency estimates for newly formed drainage systems such as in urban areas and airports.

The principal use of flood frequency functions is to compare expected changes in flood damages (due to a contemplated action) with the economic and social costs or benefits of the contemplated action.
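Since the abstract turns on reading future frequencies from past records, a minimal worked example of an empirical frequency curve may help: Weibull plotting positions p_i = i/(n+1), one common convention among several, applied to a hypothetical record.

```python
import numpy as np

peaks = np.array([512., 340., 980., 455., 720., 615., 1280., 390., 560., 830.])  # hypothetical
sorted_desc = np.sort(peaks)[::-1]
n = len(sorted_desc)
ranks = np.arange(1, n + 1)
exceed_prob = ranks / (n + 1)          # Weibull plotting position
return_period = 1 / exceed_prob
for q, T in zip(sorted_desc[:3], return_period[:3]):
    print(f"Q={q:6.0f} m^3/s  ->  empirical return period ~ {T:.1f} years")
```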

9.
The principle of maximum entropy (POME) was employed to derive a new method of parameter estimation for the 2-parameter generalized Pareto (GP2) distribution. Monte Carlo simulated data were used to evaluate this method and compare it with the methods of moments (MOM), probability weighted moments (PWM), and maximum likelihood estimation (MLE). The parameter estimates yielded by POME were comparable or better within certain ranges of sample size and coefficient of variation.
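The POME estimators themselves are not reproduced here, but the following sketch shows two of the comparison methods for the GP2 on Monte Carlo data: closed-form method of moments (valid for shape below 1/2 in scipy's parametrization, whose shape sign differs from the usual hydrological convention) and maximum likelihood with the location fixed at zero.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
true_shape, true_scale = 0.1, 100.0
x = stats.genpareto.rvs(true_shape, scale=true_scale, size=100, random_state=rng)

# Method of moments (loc fixed at 0):
#   mean = scale/(1-c),  var = scale^2 / ((1-c)^2 (1-2c))
m, v = x.mean(), x.var(ddof=1)
c_mom = 0.5 * (1 - m**2 / v)
scale_mom = m * (1 - c_mom)

# Maximum likelihood with the location fixed at zero
c_mle, _, scale_mle = stats.genpareto.fit(x, floc=0)

print(f"MOM: shape={c_mom:.3f}, scale={scale_mom:.1f}")
print(f"MLE: shape={c_mle:.3f}, scale={scale_mle:.1f}")
```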

10.
The principle of maximum entropy (POME) was used to derive the Pearson type (PT) III distribution. The POME yielded the minimally prejudiced PT III distribution by maximizing the entropy subject to two appropriate constraints: the mean, and the mean of the logarithm of the real values about a constant greater than zero. This provided a unique method for parameter estimation. Historical flood data were used to evaluate this method and compare it with the methods of moments and maximum likelihood estimation.

11.
Bayes estimate of the probability of exceedance of annual floods
In this paper Lindley's Bayesian approximation procedure is used to obtain the Bayes estimate of the probability of exceedance of a flood discharge. The Bayes estimate of the probability of exceedance has been shown by S.K. Sinha to be equivalent to the estimate of the probability of exceedance from the predictive (Bayesian) distribution of a future flood discharge. Lindley's procedure avoids the evaluation of the complex ratios of multiple integrals common in a Bayesian analysis. The Bayes estimates are compared to those obtained by the method of maximum likelihood and the method of moments. The results show that the Bayes estimates of the probability of exceedance are larger, as expected, but have smaller posterior standard deviations.
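Lindley's approximation is not reproduced here. As a hedged stand-in, this sketch computes a fully analytical Bayes estimate of the exceedance probability under a simpler conjugate model, exponential flood magnitudes with a Gamma prior on the rate, and shows the same qualitative effect the abstract reports: the Bayes estimate exceeds the plug-in estimate (by Jensen's inequality). All numbers are hypothetical.

```python
import numpy as np

floods = np.array([420., 310., 560., 480., 640., 390., 710., 450.])  # hypothetical series
a, b = 1.0, 100.0                 # weakly informative Gamma(shape, rate) prior

# Conjugate update: posterior for the rate is Gamma(a + n, b + sum(x))
a_post = a + len(floods)
b_post = b + floods.sum()

q = 1000.0                        # design discharge of interest
# Posterior predictive exceedance P(X > q) = E[exp(-lambda*q)]
# = (b_post / (b_post + q)) ** a_post   (the Gamma Laplace transform)
p_exceed_bayes = (b_post / (b_post + q)) ** a_post
# Classical plug-in (ML) estimate for comparison: lambda_hat = 1/mean
p_exceed_ml = np.exp(-q / floods.mean())
print(f"Bayes: {p_exceed_bayes:.4f}   ML plug-in: {p_exceed_ml:.4f}")
```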

12.
The index flood method is widely used in regional flood frequency analysis (RFFA) but explicitly relies on the identification of ‘acceptable homogeneous regions’. This paper presents an alternative RFFA method, which is particularly useful when ‘acceptably homogeneous regions’ cannot be identified. The new RFFA method is based on the region of influence (ROI) approach where a ‘local region’ can be formed to estimate statistics at the site of interest. The new method is applied here to regionalize the parameters of the log‐Pearson 3 (LP3) flood probability model using Bayesian generalized least squares (GLS) regression. The ROI approach is used to reduce model error arising from the heterogeneity unaccounted for by the predictor variables in the traditional fixed‐region GLS analysis. A case study was undertaken for 55 catchments located in eastern New South Wales, Australia. The selection of predictor variables was guided by minimizing model error. Using an approach similar to stepwise regression, the best model for the LP3 mean was found to use catchment area and 50‐year, 12‐h rainfall intensity as explanatory variables, whereas the models for the LP3 standard deviation and skewness only had a constant term for the derived ROIs. Diagnostics based on leave‐one‐out cross validation show that the regression model assumptions were not inconsistent with the data and, importantly, no genuine outlier sites were identified. Significantly, the ROI GLS approach produced more accurate and consistent results than a fixed‐region GLS model, highlighting the superior ability of the ROI approach to deal with heterogeneity. This method is particularly applicable to regions that show a high degree of regional heterogeneity. Copyright © 2014 John Wiley & Sons, Ltd.
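A minimal sketch of the ROI-forming step: rank candidate catchments by distance in standardized attribute space and keep the nearest ones for regression at the target site. The attributes, their number and the region size are all hypothetical choices.

```python
import numpy as np

rng = np.random.default_rng(5)
n_sites = 55
attrs = np.column_stack([
    np.log(rng.uniform(10, 2000, n_sites)),   # log catchment area (hypothetical)
    rng.uniform(20, 90, n_sites),             # design rainfall intensity (hypothetical)
])

# Standardize attributes so each contributes equally to the distance
z = (attrs - attrs.mean(axis=0)) / attrs.std(axis=0)

target = 0                                     # index of the site of interest
dist = np.linalg.norm(z - z[target], axis=1)   # Euclidean distance in attribute space
roi = np.argsort(dist)[1:21]                   # nearest 20 sites, excluding the target
print("sites in the region of influence:", roi)
# A GLS regression for the LP3 parameters would then be fitted on this subset,
# with the ROI size chosen to minimize the model error variance.
```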

13.
Conventional flood frequency analysis is concerned with providing an unbiased estimate of the magnitude of the design flow exceeded with probability p, but sampling uncertainties imply that such estimates will, on average, be exceeded more frequently. An alternative approach, therefore, is to derive an estimator which gives an unbiased estimate of flow risk: the difference between the two magnitudes reflects uncertainties in parameter estimation. An empirical procedure has been developed to estimate the mean true exceedance probabilities of conventional estimates made using a GEV distribution fitted by probability weighted moments, and adjustment factors have been determined to enable the estimation of flood magnitudes exceeded with, on average, the desired probability.
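The abstract's central point is easy to reproduce by Monte Carlo: the average true exceedance probability of an estimated p = 0.01 flood exceeds 0.01 because of sampling uncertainty. The sketch below uses maximum likelihood instead of the paper's probability weighted moments, with a hypothetical GEV parent.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
parent = stats.genextreme(c=-0.1, loc=300, scale=100)   # hypothetical parent (scipy c = -kappa)
n, p, reps = 30, 0.01, 500

true_p = []
for _ in range(reps):
    sample = parent.rvs(size=n, random_state=rng)
    c, loc, scale = stats.genextreme.fit(sample)          # MLE stand-in for PWM
    q_hat = stats.genextreme.ppf(1 - p, c, loc, scale)    # estimated 100-year flood
    true_p.append(parent.sf(q_hat))                       # its true exceedance probability
print(f"nominal p = {p}, mean true exceedance probability = {np.mean(true_p):.4f}")
```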

14.
This paper reports the results of an investigation into flood simulation using areal rainfall estimated from combinations of gauged and radar rainfall, together with a rainfall–runoff model, on the Anseong-cheon basin in the southern part of Korea. The spatial and temporal characteristics and behaviour of rainfall are analysed using various approaches combining radar and rain gauges: (1) kriging of the rain gauge data alone; (2) radar data alone; (3) mean field bias (MFB) correction using both radar and rain gauges; and (4) the conditional merging technique (CM) using both radar and rain gauges. To evaluate these methods, statistics and hyetographs for rain gauge and radar rainfalls were compared using hourly radar rainfall data from the Imjin-river, Gangwha, rainfall radar site, Korea. Then, in order to evaluate the performance of flood estimates using the different rainfall estimation methods, rainfall–runoff simulation was conducted using the physics-based distributed hydrologic model Vflo, and the calculated flood hydrographs were compared with the observed one. Results show that the rainfall field estimated by the CM method improved flood estimates, because it optimally combines rainfall fields representing the actual spatial and temporal characteristics of rainfall. Copyright © 2008 John Wiley & Sons, Ltd.
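A minimal sketch of the mean field bias (MFB) correction, the simpler of the merging schemes compared above (conditional merging additionally interpolates the gauge-radar residual field). All rainfall fields and gauge locations are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)
radar = rng.gamma(2.0, 3.0, size=(50, 50))         # hypothetical hourly radar field, mm
gauge_ij = [(5, 7), (20, 33), (41, 12), (35, 44)]  # hypothetical gauge pixel locations
gauge_obs = np.array([radar[i, j] * 1.4 for i, j in gauge_ij])  # gauges read higher here

# MFB factor: ratio of mean gauge rainfall to mean radar rainfall at the gauges
radar_at_gauges = np.array([radar[i, j] for i, j in gauge_ij])
bias = gauge_obs.mean() / radar_at_gauges.mean()

radar_corrected = radar * bias                     # one multiplicative factor field-wide
print(f"MFB factor = {bias:.2f}")
```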

15.
A comparison of different methods for estimating T-year events is presented, all based on the Extreme Value Type I distribution. Series of annual maximum floods from ten gauging stations on the South Island of New Zealand have been used. Different methods of predicting the 100-year event and the associated uncertainty have been applied: at-site estimation and regional index-flood estimation, with and without accounting for intersite correlation, using either the method of moments or the method of probability weighted moments for parameter estimation. Furthermore, estimation at ungauged sites was considered, applying either a log-linear relationship between the at-site mean annual flood and catchment characteristics or a direct log-linear relationship between 100-year events and catchment characteristics. Comparison of the results shows that the existence of at-site measurements significantly diminishes the prediction uncertainty and that the presence of intersite correlation tends to increase the uncertainty. A simulation study revealed that in regional index-flood estimation the method of probability weighted moments is preferable to method-of-moments estimation with regard to bias and RMSE.
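A hedged sketch of regional index-flood estimation with the EV1 (Gumbel) distribution fitted by probability weighted moments: scale each record by its at-site mean, pool the dimensionless data, fit once, and rescale by the index flood. Sites and record lengths are hypothetical.

```python
import numpy as np

def gumbel_pwm(x):
    """Gumbel (loc, scale) from unbiased PWMs b0, b1."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    b0 = x.mean()
    b1 = np.sum((np.arange(1, n + 1) - 1) / (n - 1) * x) / n
    scale = (2 * b1 - b0) / np.log(2)
    return b0 - 0.5772156649 * scale, scale

rng = np.random.default_rng(4)
sites = [rng.gumbel(loc, loc / 4, size=30) for loc in (80., 150., 240., 400.)]

# Index-flood step: scale each record by its at-site mean, pool, fit once.
pooled = np.concatenate([s / s.mean() for s in sites])
loc_g, scale_g = gumbel_pwm(pooled)

# Regional 100-year growth factor, rescaled by a site's index flood (its mean)
growth_100 = loc_g - scale_g * np.log(-np.log(1 - 1 / 100))
print(f"100-year flood at site 0 ~ {sites[0].mean() * growth_100:.0f} m^3/s")
```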

16.
Frequency analysis for extreme floods and methods for estimating its uncertainty are popular subjects in hydrological research. In this study, uncertainties in extreme flood estimations for the upper Yangtze River were investigated using the Delta and profile likelihood function (PLF) methods, which were used to calculate confidence intervals for key parameters of the generalized extreme value distribution and for quantiles of extreme floods. Datasets of annual maximum daily flood discharge (AMDFD) from six hydrological stations located on the main stream and tributaries of the upper Yangtze River were selected. The results showed that the AMDFD data from the six stations followed the Weibull distribution, which has a short tail and is bounded above. Of the six stations, the narrowest confidence interval was found at the Yichang station, and the widest at the Cuntan station. Results also show that the record length and the return period are two key factors affecting the confidence interval: the width of the confidence intervals decreased with increasing record length, because more information was available, while it increased with increasing return period. In addition, the confidence intervals of design floods were similar for both methods at short return periods. However, there was a comparatively large difference between the two methods at long return periods, because the asymmetry of the PLF curve increases with the return period. This asymmetry allows the PLF method to better reflect the uncertainty of the design flood, suggesting that the PLF method is more suitable for uncertainty analysis in extreme flood estimation for the upper Yangtze River Basin.
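The Delta and PLF intervals are not reproduced here. As a hedged stand-in, the following parametric bootstrap sketch exhibits the two effects the study reports, intervals widening with return period and narrowing with record length, using a hypothetical GEV parent with a bounded (Weibull-type) upper tail like that found for the Yangtze stations.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
parent_c = 0.15                  # scipy c > 0 gives the bounded, Weibull-type tail
for n in (30, 80):
    sample = stats.genextreme.rvs(parent_c, loc=20000, scale=5000, size=n, random_state=rng)
    c, loc, scale = stats.genextreme.fit(sample)
    for T in (50, 500):
        boots = []
        for _ in range(300):     # parametric bootstrap replicates
            resamp = stats.genextreme.rvs(c, loc, scale, size=n, random_state=rng)
            cb, lb, sb = stats.genextreme.fit(resamp)
            boots.append(stats.genextreme.ppf(1 - 1 / T, cb, lb, sb))
        lo, hi = np.percentile(boots, [2.5, 97.5])
        print(f"n={n:3d}, T={T:3d}: 95% CI width = {hi - lo:,.0f} m^3/s")
```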

17.
Previously we have detailed an application of the generalized likelihood uncertainty estimation (GLUE) procedure to estimate spatially distributed uncertainty in models conditioned against binary pattern data contained in flood inundation maps. This method was applied to two sites where a single consistent synoptic image of inundation extent was available to test the simulation performance of the method. In this paper, we extend this to examine the predictive performance of the method for a reach of the River Severn, west‐central England. Uniquely for this reach, consistent inundation images of two major floods have been acquired from spaceborne synthetic aperture radars, as well as a high‐resolution digital elevation model derived using laser altimetry. These data thus allow rigorous split sample testing of the previous GLUE application. To achieve this, Monte Carlo analyses of parameter uncertainty within the GLUE framework are conducted for a typical hydraulic model applied to each flood event. The best 10% of parameter sets identified in each analysis are then used to map uncertainty in flood extent predictions using the method previously proposed for both an independent validation data set and a design flood. Finally, methods for combining the likelihood information derived from each Monte Carlo ensemble are examined to determine whether this has the potential to reduce uncertainty in spatially distributed measures of flood risk for a design flood. The results show that for this reach and these events, the method previously established is able to produce sharply defined flood risk maps that compare well with observed inundation extent. More generally, we show that even single, poor‐quality inundation extent images are useful in constraining hydraulic model calibrations and that values of effective friction parameters are broadly stationary between the two events simulated, most probably reflecting their similar hydraulics. Copyright © 2004 John Wiley & Sons, Ltd.
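A schematic sketch of the GLUE conditioning step described above: score each Monte Carlo parameter set against an observed inundation map with the binary pattern measure F = A/(A+B+C) and retain the best 10%. The "hydraulic model" here is a toy stand-in, and all fields are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(6)
dem = rng.uniform(0, 5, size=(40, 40))                  # hypothetical terrain, m
observed = dem < 2.0                                    # hypothetical observed flood extent

def toy_model(n_manning):
    """Toy stand-in: higher friction -> higher water level -> larger extent."""
    return dem < (1.2 + 20.0 * n_manning)

params = rng.uniform(0.01, 0.10, 500)                   # Manning's n samples
scores = []
for n_val in params:
    sim = toy_model(n_val)
    A = np.sum(sim & observed)      # wet in both model and observation
    B = np.sum(sim & ~observed)     # wet in model only (overprediction)
    C = np.sum(~sim & observed)     # dry in model, wet in observation (underprediction)
    scores.append(A / (A + B + C))

keep = np.argsort(scores)[-len(params) // 10:]          # behavioural top 10%
print(f"retained {len(keep)} parameter sets, best F = {max(scores):.3f}")
```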

18.
Changes in river flow regime have resulted in a surge in the number of methods of non-stationary flood frequency analysis. A common assumption is a time-invariant distribution function with time-dependent location and scale parameters, while the shape parameters are time-invariant. Here, instead of the location and scale parameters of the distribution, the mean and standard deviation are used. We analyse the accuracy of the two methods with respect to estimation of the time-dependent first two moments, the time-invariant skewness, and time-dependent upper quantiles. The method of maximum likelihood (ML) with a time covariate is confronted with the Two-Stage (TS) method (combining Weighted Least Squares and L-moments techniques). Comparison is made by Monte Carlo simulations. Assuming a parent distribution which ensures the asymptotic superiority of the ML method, the Generalized Extreme Value distribution with various values of the linearly time-varying first two moments, constant skewness, and various time-series lengths is considered. Analysis of the results indicates the superiority of the TS method in all analysed aspects. Moreover, the estimates from the TS method are more resistant to the choice of probability distribution, as demonstrated by case studies of Polish rivers.
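A minimal sketch of the ML side of such a comparison: a Gumbel model (a simpler stand-in for the study's GEV) with a linear trend in the location parameter, fitted by direct minimization of the negative log-likelihood on simulated data. The Two-Stage estimator is not reproduced.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(10)
t = np.arange(60)
x = rng.gumbel(loc=200 + 1.5 * t, scale=60)      # simulated non-stationary AM series

def nll(theta):
    a, b, log_scale = theta
    scale = np.exp(log_scale)                    # keep the scale positive
    return -stats.gumbel_r.logpdf(x, loc=a + b * t, scale=scale).sum()

res = optimize.minimize(nll, x0=[x.mean(), 0.0, np.log(x.std())], method="Nelder-Mead")
a, b, scale = res.x[0], res.x[1], np.exp(res.x[2])
print(f"location trend: {a:.1f} + {b:.2f}*t, scale = {scale:.1f}")

# Time-dependent 100-year quantile at year t: loc(t) - scale*ln(-ln 0.99)
q100_final = a + b * t[-1] - scale * np.log(-np.log(0.99))
print(f"100-year quantile in the final year: {q100_final:.0f} m^3/s")
```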

19.
The principle of maximum entropy (POME) was employed to derive a new method of parameter estimation for the 3-parameter log-logistic distribution (LLD3). Monte Carlo simulated data were used to evaluate this method and compare it with the methods of moments (MOM), probability weighted moments (PWM), and maximum likelihood estimation (MLE). Simulation results showed that POME's performance was superior in predicting quantiles of large recurrence intervals when population skew was greater than or equal to 2.0. In all other cases, POME's performance was comparable to other methods.

