Similar Literature
20 similar records found.
1.
As part I of a sequence of two papers, the L-moments previously developed by Hosking (J R Stat Soc Ser B Methodol 52(2):105–124, 1990) and the LH-moments of Wang (Water Resour Res 33(12):2841–2848, 1997) are revisited. New relationships are developed for regional homogeneity analysis by the LH-moments, and further establishment of regional homogeneity is investigated. Previous work of Hosking (1990) and Wang (1997) on L-moments and LH-moments for the generalized extreme value (GEV) distribution is extended to the generalized Pareto (GPA) and generalized logistic (GLO) distributions. The Karkhe watershed, located in western Iran, is used as the case study area. Regional homogeneity was investigated by first treating the entire study area as one regional cluster. The entire study area was designated "homogeneous" by the L-moments (L) but "heterogeneous" by all four levels of the LH-moments (L1 to L4). The k-means method was then used to investigate the case of two regional clusters. All levels of the L- and LH-moments designated the upper watershed (region A) "homogeneous" and the lower watershed (region B) "possibly homogeneous". The L3 level of the GPA and the L4 level of the GLO were selected for regions A and B, respectively. Wang (1997) identified a reversing trend in the improved performance of the GEV distribution at the LH-moments level L3 (during the goodness-of-fit test). Similar results were obtained in this research for the GEV distribution. However, for the GPA distribution the reversing trend started at L4 for region A and at L2 for region B. For the GLO, improved performance was observed at all levels (moving from L to L4) for both regions.
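As an illustration of the quantities these regional analyses are built on, the first four sample L-moments can be computed from unbiased probability-weighted moments (Hosking, 1990). A minimal Python sketch (function name is ours, not from the paper):

```python
from math import comb

def sample_lmoments(data):
    """First four sample L-moments (lambda1..lambda4) via the
    unbiased probability-weighted moments b0..b3 (Hosking, 1990)."""
    x = sorted(data)
    n = len(x)
    # b_r = (1/n) * sum_j C(j-1, r)/C(n-1, r) * x_(j)  (j is 1-based rank)
    b = [sum(comb(j, r) * x[j] for j in range(n)) / (n * comb(n - 1, r))
         for r in range(4)]
    l1 = b[0]
    l2 = 2 * b[1] - b[0]
    l3 = 6 * b[2] - 6 * b[1] + b[0]
    l4 = 20 * b[3] - 30 * b[2] + 12 * b[1] - b[0]
    return l1, l2, l3, l4
```

The L-moment ratios used in homogeneity and goodness-of-fit tests follow as t3 = l3/l2 (L-skewness) and t4 = l4/l2 (L-kurtosis).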

2.
Abstract

Statistical analysis of extreme events is often carried out to predict large return period events. In this paper, the use of partial L-moments (PL-moments) for estimating hydrological extremes from censored data is compared to that of simple L-moments. Expressions for parameter estimation are derived to fit the generalized logistic (GLO) distribution based on the PL-moments approach. Monte Carlo analysis is used to examine the sampling properties of PL-moments in fitting the GLO distribution to both GLO and non-GLO samples. Finally, both PL-moments and L-moments are used to fit the GLO distribution to 37 annual maximum rainfall series of the raingauge station Kampung Lui (3118102) in Selangor, Malaysia, and it is found that the analysis of censored rainfall samples by PL-moments improves the estimation of large return period events.

Editor D. Koutsoyiannis; Associate editor K. Hamed

Citation Zakaria, Z.A., Shabri, A. and Ahmad, U.N., 2012. Estimation of the generalized logistic distribution of extreme events using partial L-moments. Hydrological Sciences Journal, 57 (3), 424–432.

3.
Abstract

A parameter estimation method is proposed for fitting the generalized extreme value (GEV) distribution to censored flood samples. Partial L-moments (PL-moments), which are variants of L-moments and analogous to "partial probability weighted moments", are defined for the analysis of such flood samples. Expressions are derived to calculate PL-moments directly from uncensored annual floods and to fit the parameters of the GEV distribution using PL-moments. Results of a Monte Carlo simulation study show that the sampling properties of PL-moments, with censoring of flood samples up to 30%, are similar to those of simple L-moments, and that PL-moments and LH-moments (higher-order L-moments) also have similar sampling properties. Finally, simple L-moments, LH-moments, and PL-moments are used to fit the GEV distribution to 75 annual maximum flow series of Nepalese and Irish catchments, and it is found that, in some situations, both LH- and PL-moments can produce a better fit to the larger flow values than simple L-moments.

4.
Abstract

Statistical analysis of extremes is often used for predicting high return-period events. In this paper, the trimmed L-moments with the one smallest value trimmed, TL-moments (1,0), are introduced as an alternative way to estimate floods for high return periods. The TL-moments (1,0) can reduce the undesirable influence that a small value in the statistical sample might have on the estimation of a large return-period event. The main objective of this study is to derive the TL-moments (1,0) for the generalized Pareto (GPA) distribution. The performance of the TL-moments (1,0) was compared with that of L-moments through Monte Carlo simulation based on streamflow data of northern Peninsular Malaysia. The results show that, in some cases, the TL-moments (1,0) are a better option than L-moments for modelling those series.

Citation Ahmad, U.N., Shabri, A. & Zakaria, Z.A. (2011) Trimmed L-moments (1,0) for the generalized Pareto distribution. Hydrological Sciences Journal, 56(6), 1053–1060.
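Sample TL-moments with arbitrary trimming can be computed with the general estimator of Elamir and Seheult (2003), of which TL-moments (1,0) are the special case s=1, t=0. A sketch (function name ours; the paper's own derivations for the GPA are not reproduced here):

```python
from math import comb

def tl_moment(data, r, s=1, t=0):
    """r-th sample TL-moment with the s lowest and t highest conceptual
    order statistics trimmed; s=1, t=0 gives the TL-moments (1,0)."""
    x = sorted(data)
    n = len(x)
    total = 0.0
    for i in range(1, n + 1):          # i = rank of x_(i)
        w = sum((-1) ** k * comb(r - 1, k)
                * comb(i - 1, r + s - 1 - k) * comb(n - i, t + k)
                for k in range(r))
        total += w * x[i - 1]
    return total / (r * comb(n, r + s + t))
```

With s=t=0 the estimator reduces to the ordinary sample L-moments, which makes it easy to compare trimmed and untrimmed fits on the same series.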

5.
6.
The index flood procedure coupled with the L-moments method is applied to the annual flood peak data taken at all stream-gauging stations in Turkey having at least 15-year-long records. First, screening of the data is done based on the discordancy measure (Di) in terms of the L-moments. Homogeneity of the total geographical area of Turkey is tested using the L-moments based heterogeneity measure, H, computed on 500 simulations generated using the four-parameter Kappa distribution. The L-moments analysis of the recorded annual flood peak data at 543 gauged sites indicates that Turkey as a whole is hydrologically heterogeneous, and 45 of the 543 gauged sites are discordant; these are discarded from further analyses. The catchment areas of these 543 sites vary from 9.9 to 75,121 km2 and their mean annual peak floods vary from 1.72 to 3739.5 m3 s−1. The probability distributions used in the analyses, whose parameters are computed by the L-moments method, are the generalized extreme value (GEV), generalized logistic (GLO), generalized normal (GNO), Pearson type III (PE3), generalized Pareto (GPA), and five-parameter Wakeby (WAK). Based on the L-moment ratio diagrams and the |Zdist|-statistic criteria, the GEV distribution is identified as the robust distribution for the study area (498 gauged sites). Hence, for estimation of flood magnitudes of various return periods in Turkey, a regional flood frequency relationship is developed using the GEV distribution. Next, the quantiles computed at all 543 gauged sites by the GEV and Wakeby distributions are compared with the observed values of the same probability based on two criteria: mean absolute relative error and coefficient of determination. Results of these comparisons indicate that both the GEV and Wakeby distributions, whose parameters are computed by the L-moments method, are adequate in predicting quantile estimates. Copyright © 2011 John Wiley & Sons, Ltd.
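The GEV fitting step in such regional studies commonly uses Hosking's (1990) closed-form L-moment solution: the shape parameter is obtained from the L-skewness via a rational approximation, and scale and location follow in closed form. A sketch, assuming the parameterization x(F) = ξ + α(1 − (−ln F)^k)/k (function names ours):

```python
import math

def gev_from_lmoments(l1, l2, t3):
    """GEV (location xi, scale alpha, shape k) from L-moments l1, l2
    and L-skewness t3, via Hosking's (1990) approximation for k."""
    c = 2.0 / (3.0 + t3) - math.log(2) / math.log(3)
    k = 7.8590 * c + 2.9554 * c * c      # accurate for -0.5 < t3 < 0.5
    gk = math.gamma(1.0 + k)
    alpha = l2 * k / ((1.0 - 2.0 ** (-k)) * gk)
    xi = l1 - alpha * (1.0 - gk) / k
    return xi, alpha, k

def gev_quantile(xi, alpha, k, F):
    """GEV quantile x(F) = xi + alpha*(1 - (-ln F)**k)/k, for k != 0."""
    return xi + alpha * (1.0 - (-math.log(F)) ** k) / k
```

Evaluating `gev_quantile` at F = 1 − 1/T then gives the T-year flood estimate used in the regional growth curve.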

7.
Abstract

A parameter estimation method is proposed for fitting probability distribution functions to low flow observations. LL-moments are variants of L-moments that are analogous to LH-moments, which were defined for the analysis of floods. LL-moments give higher weights to the small observations. Expressions are given that relate them to the probability distribution function for the case of normal, Weibull and power distributions. Sampling properties of the LL-moments and of the distribution parameters and quantiles estimated by them are found by a Monte Carlo simulation study. It is shown on an example that the low flow quantile estimates obtained by LL-moments may be significantly different from those obtained by L-moments.

8.
Development of design flood hydrographs using probability density functions
Probability density functions (PDFs) are used to fit the shape of hydrographs and have been widely used for the development of synthetic unit hydrographs by many hydrologists. Nevertheless, modelling the shapes of continuous stream flow hydrographs, which are probabilistic in nature, is rare. In the present study, a novel approach was followed to model the shape of stream flow hydrographs using PDFs and subsequently to develop design flood hydrographs for various return periods. Four continuous PDFs, namely the two-parameter Beta, Weibull, Gamma and Lognormal, were employed to fit the shape of the hydrographs of 22 years at a site on the Brahmani River in eastern India. The shapes of the observed and PDF-fitted hydrographs were compared, and the root mean square error, error of peak discharge (EQP) and error of time to peak (ETP) were computed. The best-fitted shape and scale parameters of all PDFs were subjected to frequency analysis and the quantiles corresponding to the 20-, 50-, 100- and 200-year return periods were estimated. The estimated parameters of each return period were used to develop the flood hydrographs for the 20-, 50-, 100- and 200-year return periods. The peak discharges of the developed design flood hydrographs were compared with the design discharges estimated from the frequency analysis of 22 years of annual peak discharges at that site. The peak discharge produced by the Lognormal PDF was very close to the estimated design discharge in the case of the 20-year flood hydrograph. On the other hand, the peak discharge obtained using the Weibull PDF was in close agreement with the estimated design discharge obtained from frequency analysis for the 50-, 100- and 200-year return periods. The ranking of the PDFs based on estimation of the peak of the design flood hydrograph for the 50-, 100- and 200-year return periods was found to have the following order: Weibull > Beta > Lognormal > Gamma. Copyright © 2009 John Wiley & Sons, Ltd.

9.
Large data sets covering large areas and time spans and composed of many different independent sources raise the question of the degree of harmonization obtained. The present study is an analysis of the harmonization with respect to the moment magnitude Mw within the earthquake catalogue for central, northern, and northwestern Europe (CENEC). The CENEC earthquake catalogue (Grünthal et al., J Seismol, 2009) contains parameters for over 8,000 events in the time period 1000–2004 with magnitude Mw ≥ 3.5. Only about 2% of the data used for CENEC have original Mw magnitudes derived directly from digital data. Some of the local catalogues and data files provide Mw, but calculated by the respective agency from other magnitude measures or intensity. About 60% of the local data give strength measures other than Mw, and these have to be transformed by us using available formulae or new regressions based on original Mw data. Although all events are thus unified to Mw magnitude, inhomogeneity in the Mw obtained from over 40 local catalogues and data files and 50 special studies is inevitable. Two different approaches have been followed to investigate the compatibility of the different Mw sets throughout CENEC. The first harmonization check is performed using Mw from moment tensor solutions from SMTS and Pondrelli et al. (Phys Earth Planet Inter 130:71–101, 2002; Phys Earth Planet Inter 164:90–112, 2007). The method to derive the SMTS is described, e.g., by Braunmiller et al. (Tectonophysics 356:5–22, 2002) and Bernardi et al. (Geophys J Int 157:703–716, 2004), and the data are available to a greater extent since 1997. One check is made against the Mw given in national catalogues and another against the Mw derived by applying different empirical relations developed for CENEC. The second harmonization check concerns the vast majority of data in CENEC, related to earthquakes prior to 1997 or for which no moment tensor based Mw exists.
In this case, an empirical relation for the dependence of Mw on epicentral intensity (I0) and focal depth (h) was derived for 41 master events, i.e., earthquakes located all over central Europe with high-quality data. To also include the data lacking h, the corresponding depth-independent relation for these 41 events was derived. These equations are compared with the different sets of data from which CENEC has been composed, and the goodness of fit is demonstrated for each set. The vast majority of the events are very well or reasonably consistent with the respective relation, so that the data can be said to be harmonized with respect to Mw, but there are exceptions, which are discussed in detail.

10.
Let {Y, Yi, −∞ &lt; i &lt; ∞} be a doubly infinite sequence of identically distributed and asymptotically linear negative quadrant dependent random variables, and let {ai, −∞ &lt; i &lt; ∞} be an absolutely summable sequence of real numbers. We are inspired by Wang et al. (Econometric Theory 18:119–139, 2002) and by Salvadori (Stoch Environ Res Risk Assess 17:116–140, 2003), who used linear combinations of order statistics to estimate the quantiles of the generalized Pareto and extreme value distributions. In this paper, we prove the complete convergence of the weighted sums under some suitable conditions. The results obtained improve and generalize the results of Li et al. (1992) and Zhang (1996), and extend those for negatively associated sequences and ρ*-mixing sequences. CIC Number O211; AMS (2000) Subject Classification 60F15, 60G50. Research supported by the National Natural Science Foundation of China.

11.
12.
The generalized Pareto distribution is one of the popular models in the environmental sciences. Scientists in these areas are often interested in comparing the values of an environmental variable under two different conditions, locations, etc. This would require the study of the ratio X/(X+Y) where X and Y are independent generalized Pareto random variables. In this note, the exact distribution of X/(X+Y) is derived, which turns out to involve the Gauss hypergeometric function. An application of this result is provided to assess the relative extremity of rainfall for 14 locations in Florida. Some computer programs for use in the applications are also provided.
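The exact distribution derived in the paper involves the Gauss hypergeometric function, but one basic property of the ratio is easy to verify by simulation: when X and Y are i.i.d., R = X/(X+Y) is symmetric about 1/2. A sketch using inverse-CDF sampling of the generalized Pareto, x(u) = σ((1−u)^(−ξ) − 1)/ξ; the parameter values below are illustrative, not from the paper:

```python
import random

def gpa_sample(n, sigma, xi, rng):
    """Draw n generalized Pareto variates by inversion (xi != 0)."""
    return [sigma * ((1.0 - rng.random()) ** (-xi) - 1.0) / xi
            for _ in range(n)]

rng = random.Random(42)
x = gpa_sample(100_000, 1.0, 0.2, rng)
y = gpa_sample(100_000, 1.0, 0.2, rng)
# X, Y i.i.d.  =>  R = X/(X+Y) and 1-R share the same distribution,
# so E[R] = 1/2 and R is confined to (0, 1) even for heavy-tailed X, Y
r = [a / (a + b) for a, b in zip(x, y)]
mean_r = sum(r) / len(r)
```

With different shape or scale parameters for X and Y the symmetry breaks, which is exactly the comparison of "relative extremity" the exact distribution quantifies.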

13.
The spatial distributions of severely damaged (red-tagged) buildings and of breaks in the water distribution system following the 1994 Northridge, California, earthquake (ML = 6.4) are investigated relative to the local characteristics of surficial geology. The pipe breaks are used as an indicator of nonlinear soil response, and the red-tagged buildings as an indicator of severe shaking. The surficial geology is described by several generalized categories based on age, textural character and thickness of the near-surface layer. Two regions are studied: the San Fernando Valley and Los Angeles-Santa Monica. The analysis shows that there is no simple correlation between damage patterns and surficial geology. Single-family wood-frame buildings were damaged less when built on fine silt and clay (0–3 m thick) from the late Holocene.

14.
Hydrological Sciences Journal, 2013, 58(3), 550–567
Abstract

The multivariate extension of the logistic model with generalized extreme value (GEV) marginals is applied to provide a regional at-site flood estimate. The maximum likelihood estimators of the parameters were obtained numerically by using a multivariable constrained optimization algorithm. The asymptotic results were checked by distribution sampling techniques in order to establish whether or not those results can be utilized for small samples. A region in northern Mexico with 21 gauging stations was selected to apply the model. Results were compared with those obtained by the most popular univariate distributions, the bivariate approach of the logistic model, and three regional methods: station-year, index flood and L-moments. The results show that there is a reduction in the standard error of fit when estimating the parameters of the marginal distribution with the trivariate distribution instead of its univariate and bivariate counterparts, and that differences between at-site and regional at-site design events can be significant as the return period increases.

15.
An important problem in frequency analysis is the selection of an appropriate probability distribution for a given sample of data. This selection is generally based on goodness-of-fit tests. The goodness-of-fit method is an effective means of examining how well a sample agrees with an assumed probability distribution as its population. However, goodness-of-fit tests based on empirical distribution functions give equal weight to the differences between the empirical and theoretical distribution functions at all observations. To overcome this drawback, the modified Anderson–Darling test was suggested by Ahmad et al. (1988b). In this study, the critical values of the modified Anderson–Darling test statistic are revised using simulation experiments with extended ranges of the shape parameters for the GEV and GLO distributions, and a power study is performed to assess the performance of the modified Anderson–Darling test. The results of the power study show that the modified Anderson–Darling test is more powerful than traditional tests such as the χ2, Kolmogorov–Smirnov, and Cramér–von Mises tests. In addition, to compare the results of these goodness-of-fit tests, the modified Anderson–Darling test is applied to annual maximum rainfall data in Korea.
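Critical values for EDF statistics are typically obtained exactly as the abstract describes: simulate many samples from the hypothesized distribution, compute the statistic for each, and take an empirical quantile. A sketch for the classical Anderson–Darling A² with a fully specified (uniform) null; the tail-weighted statistic of Ahmad et al. would simply replace `ad_stat`, and all names and settings here are ours:

```python
import math
import random

def ad_stat(u):
    """Classical Anderson-Darling A^2 for probability-transformed
    data u_i = F(x_i), assumed ~ U(0,1) under the null."""
    u = sorted(u)
    n = len(u)
    s = sum((2 * i + 1) * (math.log(u[i]) + math.log(1.0 - u[n - 1 - i]))
            for i in range(n))
    return -n - s / n

def critical_value(n, alpha=0.05, reps=5000, seed=1):
    """Empirical (1-alpha) quantile of A^2 under the null, by simulation."""
    rng = random.Random(seed)
    stats = sorted(ad_stat([rng.random() for _ in range(n)])
                   for _ in range(reps))
    return stats[int((1.0 - alpha) * reps)]
```

For the fully specified case the asymptotic 5% point is about 2.49, so `critical_value(50)` should land near that; when parameters are estimated from the sample (as for the GEV and GLO here), the simulation must refit the distribution inside each replicate, which is why the critical values depend on the shape parameter.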

16.
On June 15, 1995 at 00:15 GMT a devastating earthquake (ML 6.2) occurred at the western end of the Gulf of Corinth. It was followed 15 min later by the largest aftershock (ML 5.4). The main event was located by the University of Patras Seismological Network (PATNET) at the northern side of the Gulf of Corinth graben. The second event (ML 5.4) was also located by PATNET near the city of Egion, on a fault parallel to the major Eliki fault that defines the southern boundary of the Gulf of Corinth graben. A seismogenic volume that spans the villages of Akrata (SE) and Rodini (NW) and extends to Eratini (NE) was defined by the aftershock sequence, which includes 858 aftershocks of magnitude greater than ML 2 that occurred during the first seventeen days. The distribution of hypocentres in cross-section does not immediately suggest a planar distribution but rather defines a volume of about 15 km (depth) by 35 km (NW-SE) by 20 km (NE-SW).

17.
The aim of this paper is to illustrate the effects of spatial organization of lake chains and associated storage thresholds upon lake-overflow behaviour, and specifically their impact upon large scale flow connectivity and the flood frequency of lake overflows. The analysis was carried out with the use of a multiple bucket model of the lake chain system, consisting of a network of both lakes and associated catchment areas, which explicitly incorporated within it three storage thresholds: a catchment field capacity threshold that governs catchment subsurface stormflow, a total storage capacity threshold that governs catchment surface runoff, and a lake storage capacity threshold that determines lake overflow. The model is driven by rainfall inputs generated by a stochastic rainfall model that is able to capture rainfall variability at a wide range of time scales. The study is used to gain insights into the process controls of lake-overflow generation, and in particular, to explore the crucial role of factors relating to lake organization, such as the average catchment area to lake area (AC/AL) ratio and the distribution of AC/AL with distance in the downstream direction (increasing or decreasing). The study showed that the average AC/AL value was the most important factor determining the frequency of occurrence and magnitude of floods from a landscape consisting of lake chains. The larger the average AC/AL value the more runoff is generated from catchments thus increasing both the occurrence and magnitude of lake overflows. In this case the flood frequency curve reflects that of the catchment area, and lake organization does not play an important role. When AC/AL is small the landscape is lake dominated, the spatial organization of lakes has a significant impact on lake connectivity, and consequently on flood frequency. 
One of the aspects of lake organization that may have a significant influence on lake connectivity is the spatial distribution of AC/AL from upstream to downstream (increasing or decreasing). In a landscape in which AC/AL increases downstream, lake overflow will occur more frequently relative to a similar landscape (i.e. identical AC/AL) with a constant value of AC/AL. When AC/AL decreases downstream, however, runoff inputs from the upstream parts will trigger lake overflow in the downstream parts, and consequently, full connectivity may be achieved leading to increased flood frequencies.
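The threshold logic of such a bucket model is compact: a catchment bucket spills surface runoff above its total storage capacity and releases subsurface stormflow above its field capacity, and a lake spills only above its own storage capacity. A minimal single-catchment, single-lake sketch; all names, the 0.1 drainage coefficient, and the capacities are illustrative assumptions, not the paper's model:

```python
def step(catchment, lake, rain):
    """One time step of a toy catchment-lake bucket model; returns lake
    overflow. catchment/lake are dicts of storages and thresholds."""
    catchment["s"] += rain
    inflow = 0.0
    # surface runoff: everything above the total storage capacity
    if catchment["s"] > catchment["total_cap"]:
        inflow += catchment["s"] - catchment["total_cap"]
        catchment["s"] = catchment["total_cap"]
    # subsurface stormflow: a fraction of storage above field capacity
    if catchment["s"] > catchment["field_cap"]:
        q = 0.1 * (catchment["s"] - catchment["field_cap"])
        catchment["s"] -= q
        inflow += q
    lake["s"] += inflow
    # lake overflow: everything above the lake storage capacity
    overflow = max(0.0, lake["s"] - lake["cap"])
    lake["s"] -= overflow
    return overflow
```

Chaining several such lake units in series, with AC/AL varied along the chain, reproduces the connectivity experiments described above: downstream overflow occurs only once upstream thresholds are exceeded.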

18.
The generalized gamma (GG) distribution has a density function that can take on many possible forms commonly encountered in hydrologic applications. This fact has led many authors to study the properties of the distribution and to propose various estimation techniques (method of moments, mixed moments, maximum likelihood, etc.). We discuss some of the most important properties of this flexible distribution and present a flexible method of parameter estimation, called the generalized method of moments (GMM), which combines any three moments of the GG distribution. The main advantage of this general method is that it has many of the previously proposed methods of estimation as special cases. We also give a general formula for the variance of the T-year event XT obtained by the GMM, along with a general formula for the parameter estimates and for the covariances and correlation coefficients between any pair of such estimates. By applying the GMM and carefully choosing the order of the moments that are used in the estimation, one can significantly reduce the variance of T-year events for the range of return periods that are of interest.
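Any moment-based estimator for the GG distribution rests on its raw-moment formula, E[X^r] = a^r Γ((d+r)/p)/Γ(d/p) in Stacy's parameterization (assumed here; the paper's notation may differ). The formula can be checked by simulation, since X = a·G^(1/p) with G a standard gamma variate of shape d/p. Sketch, with illustrative parameter values:

```python
import math
import random

def gg_moment(a, d, p, r):
    """r-th raw moment of the generalized gamma (Stacy parameterization)."""
    return a ** r * math.gamma((d + r) / p) / math.gamma(d / p)

def gg_sample(n, a, d, p, rng):
    """Sample the GG via the power transform of a standard gamma variate."""
    return [a * rng.gammavariate(d / p, 1.0) ** (1.0 / p) for _ in range(n)]

rng = random.Random(7)
xs = gg_sample(200_000, a=2.0, d=3.0, p=1.5, rng=rng)
emp_mean = sum(xs) / len(xs)   # should approach gg_moment(2, 3, 1.5, 1)
```

A GMM fit then equates three such analytic moments (of chosen orders) to their sample counterparts and solves for (a, d, p) numerically; the choice of moment orders is what drives the variance reduction discussed in the abstract.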

19.
Abstract

Regional frequency analysis of annual maximum flood data comprising 407 stations from 11 countries of southern Africa is presented. Forty-one homogeneous regions are identified. The L-moments of the observed data indicate that the possible underlying frequency distributions are Pearson type 3 (P3), 3-parameter lognormal (LN3), generalized Pareto (GPA) or generalized extreme value (GEV). Simulation experiments for the selection of the most suitable flood frequency procedure indicate that Pearson type 3/probability weighted moments (P3/PWM) and log-Pearson type 3/method of moments (LP3/MOM) are suitable procedures for the region.

20.
This paper empirically investigates the asymptotic behaviour of the flood probability distribution and, more precisely, the possible occurrence of heavy-tailed distributions, generally predicted by multiplicative cascades. Since heavy tails considerably increase the frequency of extremes, they have many practical and societal consequences. A French database of 173 daily discharge time series is analyzed. These series correspond to various climatic and hydrological conditions, with drainage areas ranging from 10 to 10^5 km2, and are from 22 to 95 years long. The peaks-over-threshold method has been used with a set of semi-parametric estimators (Hill and generalized Hill estimators) and parametric estimators (maximum likelihood and L-moments). We discuss the respective interest of the estimators and compare their estimates of the shape parameter of the probability distribution of the peaks. We emphasize the influence of the number of highest observations selected for use in the estimation procedure and, in this respect, the particular interest of the semi-parametric estimators. Nevertheless, the various estimators agree on the prevalence of heavy tails, and we point out some links between their presence and hydrological and climatic conditions.
