Similar articles
20 similar articles retrieved; search took 31 ms.
1.
Regression‐based regional flood frequency analysis (RFFA) methods are widely adopted in hydrology. This paper compares two regression‐based RFFA methods using a Bayesian generalized least squares (GLS) modelling framework: the quantile regression technique (QRT) and the parameter regression technique (PRT). In this study, the QRT focuses on the development of prediction equations for flood quantiles in the range of 2- to 100-year average recurrence intervals (ARI), while the PRT develops prediction equations for the first three moments of the log Pearson Type 3 (LP3) distribution, namely the mean, standard deviation and skew of the logarithms of the annual maximum flows; these regional parameters are then used to fit the LP3 distribution and estimate the desired flood quantiles at a given site. It has been shown that, using a method similar to stepwise regression and employing a number of statistics such as the model error variance, average variance of prediction, Bayesian information criterion and Akaike information criterion, the best set of explanatory variables in the GLS regression can be identified. In this study, a range of statistics and diagnostic plots have been adopted to evaluate the regression models. The method has been applied to 53 catchments in Tasmania, Australia. It has been found that catchment area and design rainfall intensity are the most important explanatory variables in predicting flood quantiles using the QRT. For the PRT, a total of four explanatory variables were adopted for predicting the mean, standard deviation and skew. The developed regression models satisfy the underlying model assumptions quite well; importantly, no outlier sites are detected in the plots of the regression diagnostics of the adopted regression equations.
Based on ‘one‐at‐a‐time cross validation’ and a number of evaluation statistics, it has been found that for Tasmania the QRT provides more accurate flood quantile estimates for the higher ARIs while the PRT provides relatively better estimates for the smaller ARIs. The RFFA techniques presented here can easily be adapted to other Australian states and countries to derive more accurate regional flood predictions. Copyright © 2011 John Wiley & Sons, Ltd.
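The PRT back-transformation step described above can be sketched as follows: regional regression supplies the mean, standard deviation and skew of log10(annual maximum flow), and the LP3 quantile for a given ARI is recovered from them via the Pearson Type 3 distribution. The regional moment values below are hypothetical, not taken from the Tasmanian study.

```python
# Sketch of the PRT step: fit the LP3 distribution from the three regional
# moments of log10(Q) and read off flood quantiles. Moment values are assumed.
from scipy.stats import pearson3

mean_log, std_log, skew_log = 2.1, 0.35, -0.2  # hypothetical regional estimates

def lp3_quantile(ari_years, m, s, g):
    """Flood quantile (real units) for a given average recurrence interval."""
    aep = 1.0 / ari_years                            # annual exceedance probability
    y = pearson3.ppf(1.0 - aep, g, loc=m, scale=s)   # quantile of log10(Q)
    return 10.0 ** y

q2 = lp3_quantile(2, mean_log, std_log, skew_log)
q100 = lp3_quantile(100, mean_log, std_log, skew_log)
print(round(q2, 1), round(q100, 1))
```

With zero skew the LP3 reduces to a log-normal model, which gives a quick sanity check on the back-transformation.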

2.
Estimation of flood quantiles in ungauged catchments is a common problem in hydrology. For this, the log-linear regression model is widely adopted. However, in many cases, a simple log transformation may not be able to capture the complexity and nonlinearity in flood generation processes. This paper develops a generalized additive model (GAM) to deal with nonlinearity between the dependent and predictor variables in regional flood frequency analysis (RFFA) problems. Data from 85 gauged catchments in New South Wales, Australia, are used to compare the performance of a number of alternative RFFA methods with respect to variable selection, variable transformation and delineation of regions. Four RFFA methods are compared in this study: GAM with a fixed region, the log-linear model, canonical correlation analysis (to form neighbourhoods in the space of catchment attributes) and the region-of-influence approach. Based on the outcome of a leave-one-out validation approach, it has been found that the GAM method generally outperforms the other methods even without linking GAM with a neighbourhood/region-of-influence approach. The main strength of GAM is that it captures the nonlinearity between the dependent and predictor variables without any restrictive assumption. The findings of this study will encourage other researchers worldwide to apply GAM in RFFA studies, allowing development of more flexible and realistic RFFA models and their wider adoption in practice.
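The point about log-linear models missing nonlinearity can be illustrated with a toy experiment. This is not the paper's GAM (which would need a dedicated library such as pygam); a low-order polynomial stands in for a GAM-style smoother, and the synthetic curved relation between log area and log quantile is invented.

```python
# Synthetic flood quantiles with a curved dependence on catchment area,
# fitted by (a) a straight line in log-log space and (b) a degree-5
# polynomial standing in for a GAM smoother. The smoother should fit better.
import numpy as np

rng = np.random.default_rng(0)
log_area = np.sort(rng.uniform(1.0, 4.0, 200))          # log10 area, km^2
log_q = 0.8 * log_area + 0.3 * np.sin(2.0 * log_area) + rng.normal(0, 0.05, 200)

lin = np.polyval(np.polyfit(log_area, log_q, 1), log_area)      # log-linear fit
smooth = np.polyval(np.polyfit(log_area, log_q, 5), log_area)   # flexible fit

rmse_lin = float(np.sqrt(np.mean((log_q - lin) ** 2)))
rmse_smooth = float(np.sqrt(np.mean((log_q - smooth) ** 2)))
print(rmse_lin, rmse_smooth)
```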

3.
Various regional flood frequency analysis procedures are used in hydrology to estimate hydrological variables at ungauged or partially gauged sites. Relatively few studies have been conducted to evaluate the accuracy of these procedures and estimate the error induced in regional flood frequency estimation models. The objective of this paper is to assess the overall error induced in the residual kriging (RK) regional flood frequency estimation model. The two main error sources in specific flood quantile estimation using RK are the error induced by the local quantile estimation procedure and the error resulting from the regional quantile estimation process. Therefore, for an overall error assessment, the errors associated with these two steps must be quantified. Results show that the main source of error in RK is the error induced by the regional quantile estimation method. Results also indicate that the accuracy of the regional estimates increases with decreasing return periods. Copyright © 2010 John Wiley & Sons, Ltd.
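The two-step error decomposition can be made concrete with a toy calculation: if the local (at-site) estimation error and the regional estimation error are treated as independent, their variances add. Both variance values below are hypothetical, chosen only to mirror the paper's finding that the regional step dominates.

```python
# Toy error budget for residual kriging: independent error variances add,
# and the regional share can be compared against the local share.
import math

var_local = 0.04      # assumed variance of the at-site quantile estimate (log space)
var_regional = 0.15   # assumed variance added by the regional transfer

var_total = var_local + var_regional        # independence assumption
se_total = math.sqrt(var_total)
share_regional = var_regional / var_total   # fraction of error from regionalization
print(round(se_total, 3), round(share_regional, 3))
```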

4.
The index flood method is widely used in regional flood frequency analysis (RFFA) but explicitly relies on the identification of ‘acceptable homogeneous regions’. This paper presents an alternative RFFA method, which is particularly useful when ‘acceptably homogeneous regions’ cannot be identified. The new RFFA method is based on the region of influence (ROI) approach where a ‘local region’ can be formed to estimate statistics at the site of interest. The new method is applied here to regionalize the parameters of the log‐Pearson 3 (LP3) flood probability model using Bayesian generalized least squares (GLS) regression. The ROI approach is used to reduce model error arising from the heterogeneity unaccounted for by the predictor variables in the traditional fixed‐region GLS analysis. A case study was undertaken for 55 catchments located in eastern New South Wales, Australia. The selection of predictor variables was guided by minimizing model error. Using an approach similar to stepwise regression, the best model for the LP3 mean was found to use catchment area and 50‐year, 12‐h rainfall intensity as explanatory variables, whereas the models for the LP3 standard deviation and skewness only had a constant term for the derived ROIs. Diagnostics based on leave‐one‐out cross validation show that the regression model assumptions were not inconsistent with the data and, importantly, no genuine outlier sites were identified. Significantly, the ROI GLS approach produced more accurate and consistent results than a fixed‐region GLS model, highlighting the superior ability of the ROI approach to deal with heterogeneity. This method is particularly applicable to regions that show a high degree of regional heterogeneity.
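The core ROI mechanic above can be sketched in a few lines: standardize catchment attributes, rank candidate sites by distance to the target site in attribute space, and keep the nearest ones as the 'local region'. The attribute choice, ROI size and synthetic data are illustrative, not from the paper.

```python
# Sketch of forming a region of influence for site 0 out of 55 catchments.
import numpy as np

rng = np.random.default_rng(1)
# columns (assumed): log(area), log(50-yr 12-h rainfall intensity)
attrs = rng.normal(size=(55, 2))

z = (attrs - attrs.mean(axis=0)) / attrs.std(axis=0)   # standardize attributes
dist = np.linalg.norm(z - z[0], axis=1)                # distance to target site 0

n_roi = 15                                  # assumed ROI size
roi = np.argsort(dist)[:n_roi]              # indices of the nearest sites
print(roi[0], len(roi))
```

In practice the ROI size would be tuned by minimizing the GLS model error rather than fixed in advance.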

5.
6.
7.
Regional flood frequency analysis (RFFA) was carried out on data for 55 hydrometric stations in Namak Lake basin, Iran, for the period 1992–2012. Flood discharge of specific return periods was computed based on the log Pearson Type III distribution, selected as the best regional distribution. Independent variables, including physiographic, meteorological, geological and land-use variables, were derived and, using three strategies – gamma test (GT), GT plus classification and expert opinion – the best input combination was selected. To select the best technique for regionalization, support vector regression (SVR), adaptive neuro-fuzzy inference system (ANFIS), artificial neural network (ANN) and nonlinear regression (NLR) techniques were applied to predict peak flood discharge for 2-, 5-, 10-, 25-, 50- and 100-year return periods. The GT + ANFIS and GT + SVR models gave better performance than the ANN and NLR models in the RFFA. The results of the input variable selection showed that the GT technique improved the model performance.

8.
Estimation of design floods in ungauged catchments is a common problem in hydrology. Methods commonly adopted for this task are limited to peak flow estimation, e.g. index flood, rational and regression‐based methods. To estimate a complete design hydrograph, rainfall–runoff modelling is preferred. The currently recommended method in Australia, known as the Design Event Approach (DEA), has some serious limitations since it ignores the probabilistic nature of principal model inputs (such as temporal patterns (TP) and initial loss) except for design rainfall depth. A more holistic approach such as the Joint Probability Approach (JPA)/Monte Carlo Simulation Technique (MCST) can overcome some of the limitations associated with the DEA. Although the JPA/MCST has been investigated by many researchers, it has proved difficult to apply since its routine application needs readily available regional design data, such as stochastic rainfall duration, TP and losses, which are largely unavailable for Australian states. This paper presents regionalization of the model inputs/parameters for the JPA/MCST for eastern New South Wales (NSW), Australia, using data from 86 pluviograph stations and six catchments in NSW to regionalize the input distributions for application with the JPA/MCST. Independent testing on three test catchments shows that the regionalized JPA/MCST generally outperforms the at‐site DEA. The developed regionalized JPA/MCST can be applied at any arbitrary location in eastern NSW. Although primarily applicable to eastern NSW, the method and design data developed here can be adapted to other Australian states and countries. Copyright © 2013 John Wiley & Sons, Ltd.
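The JPA/MCST idea of treating all principal inputs as random rather than fixed can be sketched with a toy Monte Carlo loop: rainfall depth, storm duration and initial loss are sampled from assumed distributions, a crude runoff-coefficient model converts each storm to a peak index, and quantiles are read from the simulated peaks. Every distribution and parameter below is invented for illustration and is not the paper's regionalized design data.

```python
# Toy Monte Carlo simulation in the spirit of the JPA/MCST.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

depth = rng.lognormal(mean=4.0, sigma=0.5, size=n)     # storm rainfall, mm (assumed)
duration = rng.gamma(shape=2.0, scale=6.0, size=n)     # storm duration, h (assumed)
initial_loss = rng.uniform(5.0, 35.0, size=n)          # initial loss, mm (assumed)

excess = np.maximum(depth - initial_loss, 0.0)         # rainfall excess, mm
peak = 0.8 * excess / np.maximum(duration, 1.0)        # crude peak-flow index

q2 = float(np.quantile(peak, 1 - 1 / 2))               # ~2-year event
q50 = float(np.quantile(peak, 1 - 1 / 50))             # ~50-year event
print(round(q2, 2), round(q50, 2))
```

A real application would replace the runoff-coefficient line with a calibrated rainfall–runoff model and sample temporal patterns as well.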

9.
A new technique is developed for identifying groups for regional flood frequency analysis. The technique uses a clustering algorithm as a starting point for partitioning the collection of catchments. The groups formed using the clustering algorithm are subsequently revised to improve the regional characteristics based on three requirements that are defined for effective groups. The result is overlapping groups that can be used to estimate extreme flow quantiles for gauged or ungauged catchments. The technique is applied to a collection of catchments from India and the results indicate that regions with the desired characteristics can be identified using the technique. The use of the groups for estimating extreme flow quantiles is demonstrated for three example sites.  
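The clustering starting point can be sketched with a minimal k-means on standardized catchment attributes; the subsequent group revision against homogeneity requirements is not implemented here. The attribute values are synthetic, with two well-separated groups built in.

```python
# Pure-NumPy k-means as the starting partition for regional grouping.
import numpy as np

def kmeans(x, k, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    centers = x[rng.choice(len(x), k, replace=False)]
    for _ in range(iters):
        # assign each catchment to its nearest center
        labels = np.argmin(((x[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):                    # guard empty clusters
                centers[j] = x[labels == j].mean(axis=0)
    return labels, centers

rng = np.random.default_rng(3)
# synthetic attributes: two groups of 30 catchments with distinct characteristics
attrs = np.vstack([rng.normal(0, 1, (30, 3)), rng.normal(4, 1, (30, 3))])
z = (attrs - attrs.mean(0)) / attrs.std(0)             # standardize
labels, centers = kmeans(z, k=2)
print(sorted(np.bincount(labels).tolist()))
```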

10.
11.
Hydrological models used for flood prediction in ungauged catchments are commonly fitted to regionally transferred data. The key issue in this procedure is to identify hydrologically similar catchments; therefore, the dominant controls on the process of interest have to be known. In this study, we applied a new machine-learning-based approach to identify the catchment characteristics that can be used to identify the active processes controlling runoff dynamics. A random forest (RF) regressor has been trained to estimate the drainage velocity parameters of a geomorphologic instantaneous unit hydrograph (GIUH) in ungauged catchments, based on regionally available data. We analyzed the learning procedure of the algorithm and identified preferred donor catchments for each ungauged catchment. Based on the machine learning results for catchment grouping, a classification scheme for drainage network characteristics has been derived and applied in a flood forecasting case study. The results demonstrate that the RF could be trained properly with the selected donor catchments to successfully estimate the required GIUH parameters. Moreover, our results showed that drainage network characteristics can be used to identify the influence of geomorphological dispersion on the dynamics of catchment response.

12.
In this study, a quantitative assessment of uncertainty was made in connection with the calibration of Australian Water Balance Model (AWBM) for both gauged and ungauged catchment cases. For the gauged catchment, five different rainfall data sets, 23 different calibration data lengths and eight different optimization techniques were adopted. For the ungauged catchment case, the optimum parameter sets obtained from the nearest gauged catchment were transposed to the ungauged catchments, and two regional prediction equations were used to estimate runoff. Uncertainties were ascertained by comparing the observed and modelled runoffs by the AWBM on the basis of different combinations of methods, model parameters and input data. The main finding from this study was that the uncertainties in the AWBM modelling outputs could vary from −1.3% to 70% owing to different input rainfall data, −5.7% to 11% owing to different calibration data lengths and −6% to 0.2% owing to different optimization techniques adopted in the calibration of the AWBM. The performance of the AWBM model was found to be dominated mainly by the selection of appropriate rainfall data followed by the selection of an appropriate calibration data length and optimization algorithm. Use of relatively short data length (e.g. 3 to 6 years) in the calibration was found to generate relatively poor results. Effects of different optimization techniques on the calibration were found to be minimal. The uncertainties reported here in relation to the calibration and runoff estimation by the AWBM model are relevant to the selected study catchments, which are likely to differ for other catchments. The methodology presented in this paper can be applied to other catchments in Australia and other countries using AWBM and similar rainfall–runoff models. Copyright © 2014 John Wiley & Sons, Ltd.
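The dominance of rainfall input error can be illustrated with a deliberately minimal water-balance model. This single-store bucket is a stand-in for the AWBM (which uses three surface stores plus a baseflow store), and every parameter and the synthetic climate below are invented; the point is only that a 10% rainfall bias is amplified in the simulated runoff.

```python
# Minimal single-store water balance model: rainfall-input sensitivity sketch.
import numpy as np

def bucket_runoff(rain, pet, capacity=100.0):
    """Daily runoff (mm) from a single soil-moisture store."""
    store, runoff = 0.0, []
    for p, e in zip(rain, pet):
        store = max(store + p - e, 0.0)      # wet up, evaporate
        spill = max(store - capacity, 0.0)   # saturation excess becomes runoff
        store -= spill
        runoff.append(spill)
    return np.array(runoff)

rng = np.random.default_rng(7)
rain = rng.gamma(0.4, 12.0, size=3650)       # ~10 years of synthetic daily rain, mm
pet = np.full(3650, 3.0)                     # constant potential ET, mm/day (assumed)

q_base = bucket_runoff(rain, pet).sum()
q_biased = bucket_runoff(rain * 1.1, pet).sum()   # 10% rainfall input error
pct = 100.0 * (q_biased - q_base) / q_base
print(round(pct, 1))
```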

13.
14.
15.
Regional frequency analysis is an important tool for estimating design floods in ungauged catchments. The index flood is an important component of regionalized flood formulas. In the past, many formulas have been developed based on various numbers of calibration catchments (e.g. from fewer than 20 to several hundred). However, there is a lack of systematic research on the model uncertainties caused by the number of calibration catchments (i.e. what is the minimum number of calibration catchments, and how should the calibration catchments be chosen?). This study uses a statistical resampling technique to explore the impact of the number of calibration catchments on index flood estimation. The study is based on 182 catchments in England, and an index flood formula has been developed using an input variable selection technique from the data mining field. The formula has been used to explore the model uncertainty arising from a range of calibration catchment numbers (from 15 to 130). It is found that (1) as expected, the more catchments are used in calibration, the more reliable the developed models are (i.e. with a narrower band of uncertainty); (2) however, poor models are still possible with a large number of calibration catchments (e.g. 130), while good models are achievable with as few as 15 calibration catchments. This indicates that the number of calibration catchments is only one of the factors influencing model performance. The hydrological community should explore why a smaller calibration data set can produce a better model than a larger one. It is clear from this study that the information content of the calibration data set is equally if not more important than the number of calibration data. Copyright © 2011 John Wiley & Sons, Ltd.
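The resampling experiment can be sketched as follows: fit a log-log index-flood regression on random subsets of catchments of varying size and watch the spread of the fitted exponent shrink as the subset grows. The synthetic 'true' relation QI = 0.5 * A^0.75 (with noise) is invented and not the study's English formula.

```python
# Resampling sketch: spread of the fitted area exponent vs calibration size.
import numpy as np

rng = np.random.default_rng(11)
n_all = 182
log_a = rng.uniform(1.0, 3.5, n_all)                    # log10 catchment area
log_qi = np.log10(0.5) + 0.75 * log_a + rng.normal(0, 0.15, n_all)

def exponent_spread(n_cal, trials=500):
    """Std of the fitted exponent over random calibration subsets of size n_cal."""
    slopes = []
    for _ in range(trials):
        idx = rng.choice(n_all, n_cal, replace=False)
        slope, _ = np.polyfit(log_a[idx], log_qi[idx], 1)
        slopes.append(slope)
    return float(np.std(slopes))

spread_small = exponent_spread(15)
spread_large = exponent_spread(130)
print(round(spread_small, 4), round(spread_large, 4))
```

The spread narrows with more calibration catchments, but individual unlucky subsets can still land far from the true exponent, mirroring the study's finding that subset information content matters as much as subset size.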

16.
The specific objective of this paper is to propose a new flood frequency analysis method that considers the uncertainty of probability distribution selection (model uncertainty) as well as the uncertainty of parameter estimation (parameter uncertainty). Based on Bayesian theory, the sampling distribution of quantiles (design floods) coupling these two kinds of uncertainty is derived, so that not only point estimators but also confidence intervals of the quantiles can be provided. Markov chain Monte Carlo sampling is adopted to overcome the difficulty of computing the integrals involved in estimating the sampling distribution. As an example, the proposed method is applied to flood frequency analysis at a gauge on the Huai River, China. It is shown that an approach considering only model uncertainty or only parameter uncertainty cannot fully account for the uncertainties in quantile estimation; instead, a method coupling the two should be employed. Furthermore, the proposed Bayesian-based method provides not only various quantile estimators but also a quantitative assessment of the uncertainties in flood frequency analysis.
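The parameter-uncertainty half of this idea can be sketched with a Metropolis sampler: draw Gumbel parameters given an annual-maximum sample and read a credible interval for the 100-year quantile off the posterior. The Gumbel choice, flat priors, synthetic data and tuning constants are assumptions for the sketch; the paper additionally averages over candidate distributions (model uncertainty), which is not shown here.

```python
# Metropolis sampling of Gumbel parameters and the 100-year quantile.
import numpy as np

rng = np.random.default_rng(5)
data = rng.gumbel(loc=1000.0, scale=300.0, size=60)   # synthetic annual maxima

def log_lik(mu, beta):
    if beta <= 0:
        return -np.inf
    z = (data - mu) / beta
    return float(-len(data) * np.log(beta) - np.sum(z + np.exp(-z)))

mu, beta = float(np.mean(data)), float(np.std(data))  # crude starting values
ll = log_lik(mu, beta)
samples = []
for i in range(20000):
    mu_p, beta_p = mu + rng.normal(0, 30.0), beta + rng.normal(0, 15.0)
    ll_p = log_lik(mu_p, beta_p)
    if np.log(rng.uniform()) < ll_p - ll:             # flat prior: likelihood ratio
        mu, beta, ll = mu_p, beta_p, ll_p
    if i >= 5000:                                     # discard burn-in
        samples.append(mu - beta * np.log(-np.log(1 - 1 / 100)))  # 100-yr quantile

q_lo, q_med, q_hi = np.percentile(samples, [2.5, 50, 97.5])
print(round(q_lo), round(q_med), round(q_hi))
```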

17.
Physically based flood frequency models use readily available rainfall data and catchment characteristics to derive the flood frequency distribution. In the present study, a new physically based flood frequency distribution has been developed. This model uses a bivariate exponential distribution for rainfall intensity and duration, and the Soil Conservation Service Curve Number (SCS-CN) method for deriving the probability density function (pdf) of effective rainfall. The effective rainfall–runoff model is based on kinematic-wave theory. The results of applying this derived model to three Indian basins indicate that the model is a useful alternative for estimating flood flow quantiles at ungauged sites.
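The SCS-CN step referred to above converts storm depth to effective (runoff-producing) rainfall. The sketch below uses the standard SI-unit form of the method; the curve number value is illustrative.

```python
# Standard SCS-CN effective rainfall: Q = (P - Ia)^2 / (P - Ia + S).
def scs_cn_runoff(p_mm, cn):
    """Effective rainfall (mm) for storm depth p_mm under curve number cn."""
    s = 25400.0 / cn - 254.0          # potential maximum retention, mm
    ia = 0.2 * s                      # initial abstraction (standard 0.2*S)
    if p_mm <= ia:
        return 0.0                    # all rainfall abstracted, no runoff
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

print(round(scs_cn_runoff(100.0, 75), 1))
```

In the derived-distribution setting, this transformation is applied to the bivariate rainfall model to obtain the pdf of effective rainfall analytically.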

18.
The log-Gumbel distribution is one of the extreme value distributions widely used in flood frequency analysis. This distribution is examined in this paper with regard to quantile estimation and confidence intervals of quantiles. Specific estimation algorithms based on the method of moments (MOM), probability weighted moments (PWM) and maximum likelihood (ML) are presented. The applicability of the estimation procedures and a comparison among the methods are illustrated through an application example using the flood data of the St. Mary's River.
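The MOM variant can be sketched directly: treat ln(Q) as Gumbel-distributed, derive the scale and location from the sample mean and standard deviation of the logs, and back-transform the quantiles. The data here are synthetic, not the St. Mary's River record; PWM and ML estimators are not shown.

```python
# Method-of-moments fit of the log-Gumbel model and quantile back-transform.
import numpy as np

rng = np.random.default_rng(9)
flows = np.exp(rng.gumbel(loc=6.0, scale=0.4, size=80))   # synthetic peak flows

logs = np.log(flows)
euler = 0.5772156649                                 # Euler-Mascheroni constant
beta = np.sqrt(6.0) * np.std(logs, ddof=1) / np.pi   # MOM scale of ln(Q)
mu = np.mean(logs) - euler * beta                    # MOM location of ln(Q)

def quantile(t_years):
    """T-year flood quantile under the fitted log-Gumbel model."""
    y = -np.log(-np.log(1.0 - 1.0 / t_years))        # Gumbel reduced variate
    return float(np.exp(mu + beta * y))

print(round(quantile(10)), round(quantile(100)))
```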

19.
Flood frequency estimation is crucial in both engineering practice and hydrological research. Regional analysis of flood peak discharges is used to obtain more accurate estimates of flood quantiles in ungauged or poorly gauged catchments. It is based on the identification of homogeneous zones, where the probability distribution of annual maximum peak flows is invariant except for a scale factor represented by an index flood. The numerous applications of this method have highlighted that obtaining an accurate estimate of the index flood is a critical step, especially in ungauged or poorly gauged sections, where direct estimation by the sample mean of the annual flood series (AFS) is not possible or is inaccurate. In such cases, indirect methods have to be used. Most indirect methods are based upon empirical relationships that link the index flood to hydrological, climatological and morphological catchment characteristics, developed by means of multi-regression analysis, or on simplified lumped representations of rainfall–runoff processes. The limits of these approaches become increasingly evident as the size and spatial variability of the catchment increase. In these cases, the use of a spatially distributed, physically based hydrological model and time-continuous simulation of discharge can improve estimation of the index flood. This work presents an application of the FEST-WB model for the reconstruction of 29 years of hourly streamflows for an Alpine snow-fed catchment in northern Italy, to be used for index flood estimation. To extend the length of the simulated discharge time series, meteorological forcings given by daily precipitation and temperature at ground automatic weather stations are disaggregated to hourly resolution and then fed to FEST-WB. The accuracy of the method in estimating the index flood as a function of the length of the simulated series is discussed, and suggestions for the use of the methodology are provided.
Editor D. Koutsoyiannis
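The final step of the continuous-simulation approach is simple enough to sketch: given a multi-year simulated hourly discharge series, extract the annual maxima and take their mean as the index flood. The synthetic series below stands in for FEST-WB output; a shorter sub-series gives a noisier estimate, which is the accuracy trade-off the paper discusses.

```python
# Index flood from a simulated hourly discharge series (synthetic stand-in).
import numpy as np

rng = np.random.default_rng(21)
hours_per_year = 8760
n_years = 29
# synthetic hourly discharge standing in for a FEST-WB reconstruction
sim = rng.gamma(1.2, 5.0, size=n_years * hours_per_year)

annual_max = sim.reshape(n_years, hours_per_year).max(axis=1)   # one peak per year
index_flood = float(annual_max.mean())        # index flood = mean annual maximum

index_10yr = float(annual_max[:10].mean())    # estimate from a shorter series
print(round(index_flood, 1), round(index_10yr, 1))
```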

20.
Hydrological Sciences Journal (《水文科学杂志》), 2013, 58(1): 86–87.


Copyright © Beijing Qinyun Technology Development Co., Ltd. 京ICP备09084417号