Similar Documents
20 similar documents found (search time: 31 ms)
1.
The beta-κ distribution is a special case of the generalized beta distribution of the second kind. In previous studies, the beta-p and beta-κ distributions have played important roles in representing extreme events, and the present paper therefore adopts the beta-κ distribution. The method of moments and the method of L-moments are used to estimate the parameters of the beta-κ distribution. To demonstrate the performance of the proposed model, the paper presents a simulation study using three estimation methods (including maximum likelihood estimation) and both beta-κ and non-beta-κ samples. In addition, the paper evaluates the performance of the beta-κ distribution against two widely used extreme value distributions (the GEV and Gumbel distributions) on two sets of actual data on extreme events.
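The L-moment machinery underlying such fits can be sketched via probability-weighted moments. This is a generic illustration of sample L-moments, not the paper's beta-κ estimator; the exponential sample is invented purely for the check.

```python
import numpy as np

def sample_lmoments(x):
    """First two sample L-moments (l1, l2) via probability-weighted moments."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    b0 = x.mean()
    # b1 = (1/n) * sum_{j} ((j-1)/(n-1)) * x_(j); arange(n) supplies (j-1).
    b1 = np.sum(np.arange(n) * x) / (n * (n - 1))
    return b0, 2 * b1 - b0  # l1 = b0, l2 = 2*b1 - b0

# Check on an exponential sample, where theory gives l1 = mu and l2 = mu/2.
rng = np.random.default_rng(11)
l1, l2 = sample_lmoments(rng.exponential(scale=10.0, size=20000))
```

For a beta-κ fit, these sample L-moments would then be equated to the distribution's theoretical L-moments and solved for the parameters.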

2.
This paper considers the problem of analyzing the temporal and spatial structure of particulate matter (PM) data, with emphasis on high-level \(\text {PM}_{10}\). The proposed method combines a generalized extreme value (GEV) distribution with a multiscale concept from scaling-property theory used in hydrology. In this study, we use hourly \(\text {PM}_{10}\) data observed over 5 years at 25 stations located in the Seoul metropolitan area, Korea. For our analysis, we calculate monthly maximum values for various duration times and area coverages at each station, and show that their distribution follows a GEV distribution. In addition, we identify that the GEV parameters of \(\text {PM}_{10}\) maxima obey a new scaling property, termed the 'piecewise linear scaling property', for certain duration times. Using this property, we construct a 12-month return level map of hourly \(\text {PM}_{10}\) data for any arbitrary d-hour duration. Furthermore, we extend the study to the spatio-temporal multiscale structure of \(\text {PM}_{10}\) extremes over different temporal and spatial scales.
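The block-maxima/GEV step can be sketched with `scipy.stats.genextreme`. The hourly series below is synthetic (the study uses real Seoul station records), and note that scipy's shape parameter `c` equals minus the usual ξ.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)
# Synthetic hourly concentrations standing in for PM10 (invented values).
hourly = rng.gamma(shape=2.0, scale=30.0, size=60 * 720)
monthly_max = hourly.reshape(60, 720).max(axis=1)  # 60 monthly block maxima

# Fit a GEV distribution to the monthly maxima by maximum likelihood.
c, loc, scale = genextreme.fit(monthly_max)

# 12-month return level: exceeded on average once every 12 monthly blocks.
rl_12 = genextreme.ppf(1 - 1 / 12, c, loc=loc, scale=scale)
```

In the paper's setting this fit would be repeated per duration time d and area coverage, and the fitted parameters examined for the piecewise linear scaling property.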

3.
Floods have changed in a complex manner, triggered by the changing environment (i.e., intensified human activities and global warming). Hence, for better flood control and mitigation in the future, bivariate frequency analysis of flood and extreme precipitation events needs to be performed within the context of the changing environment. Given this, the Pettitt test and wavelet coherence transform analysis are used in combination to identify the period with a transformed flood-generating mechanism. Subsequently, the primary and secondary return periods of annual maximum flood (AMF) discharge and extreme precipitation (Pr) during the identified period are derived based on a copula. Meanwhile, the conditional probabilities of different flood discharge magnitudes occurring under various extreme precipitation scenarios are estimated using the joint dependence structure between AMF and Pr. Moreover, a Monte Carlo-based algorithm is used to evaluate the uncertainties of the above copula-based analyses robustly. Two catchments located on the Loess Plateau are selected as study regions: the Weihe River Basin (WRB) and the Jinghe River Basin (JRB). Results indicate that: (1) the periods 1994–2014 and 1981–2014 are identified as having a transformed flood-generating mechanism in the WRB and JRB, respectively; (2) the primary and secondary return periods for AMF and Pr are examined, and the chance of different AMF magnitudes occurring under varying Pr scenarios is also elucidated from the joint distribution of AMF and Pr. Nevertheless, the associated uncertainties are considerable, which greatly challenges future flood mitigation measures. The results of this study offer a technical reference for copula-based frequency analysis under a changing environment at regional and global scales.
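The primary ("OR") and secondary ("AND") joint return periods can be sketched with a Gumbel-Hougaard copula; the dependence parameter below is an assumed illustrative value, not the paper's fitted one.

```python
import numpy as np

def gumbel_copula(u, v, theta):
    """Gumbel-Hougaard copula C(u, v; theta), theta >= 1."""
    s = (-np.log(u)) ** theta + (-np.log(v)) ** theta
    return np.exp(-s ** (1.0 / theta))

theta = 2.0           # assumed dependence between AMF and Pr (illustrative)
u = v = 1 - 1 / 50.0  # both marginals at their 50-year level

# Primary ("OR") joint return period: AMF or Pr exceeds its 50-year level.
T_or = 1.0 / (1.0 - gumbel_copula(u, v, theta))
# Secondary ("AND") joint return period: both exceed their 50-year levels.
T_and = 1.0 / (1.0 - u - v + gumbel_copula(u, v, theta))
```

With positive dependence, the "OR" event is more frequent and the "AND" event rarer than either marginal 50-year event, so T_or < 50 < T_and.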

4.
The problem of fitting a probability distribution, here the log-Pearson Type III distribution, to extreme floods is considered from the point of view of two numerical and three non-numerical criteria. The six fitting techniques considered include classical techniques (maximum likelihood, moments of logarithms of flows) and new methods such as mixed moments and the generalized method of moments developed by two of the co-authors. The latter method consists of fitting the distribution using moments of different orders; in particular, the SAM (Sundry Averages Method) variant uses the moments of order 0 (geometric mean), 1 (arithmetic mean), and –1 (harmonic mean), and leads to a smaller variance of the parameters. The criteria used to select the method of parameter estimation are:
– the two statistical criteria of mean square error and bias;
– the two computational criteria of program availability and ease of use;
– the user-related criterion of acceptability.
These criteria are transformed into value functions or fuzzy set membership functions, and then three Multiple Criteria Decision Modelling (MCDM) techniques, namely composite programming, ELECTRE, and MCQA, are applied to rank the estimation techniques.

5.
ABSTRACT

Flood quantile estimation based on partial duration series (peaks over threshold, POT) represents a noteworthy alternative to the classical annual maximum approach, since it enlarges the available information spectrum. Here the POT approach is discussed with reference to its benefits in increasing the robustness of flood quantile estimations. The classical POT approach assumes a Poisson distribution for the annual number of exceedances, although this can be questionable in some cases. Therefore, the Poisson distribution is compared with two other distributions (binomial and Gumbel-Schelling). The results show that only rarely is there a difference from the Poisson distribution. In the second part we investigate the robustness of flood quantiles derived from different approaches, in the sense of their temporal stability against the occurrence of extreme events. Besides the classical approach using annual maxima series (AMS) with the generalized extreme value distribution and different parameter estimation methods, two different applications of POT are tested. Both are based on monthly maxima above a threshold, but one also uses trimmed L-moments (TL-moments). It is shown how quantile estimations based on this "robust" POT approach (rPOT) become more robust than AMS-based methods, even in the case of occasional extraordinary extreme events.
Editor M.C. Acreman; Associate editor A. Viglione
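The classical Poisson/GPD version of the POT return level can be sketched as follows; the discharge record is invented and stands in for a real gauge series.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
years = 40
# Synthetic daily discharge record (invented values).
q = rng.lognormal(mean=3.0, sigma=0.6, size=365 * years)

u = np.quantile(q, 0.98)   # POT threshold
exc = q[q > u] - u         # threshold excesses
lam = exc.size / years     # mean annual number of exceedances (Poisson rate)

# Fit the GPD to the excesses, with location fixed at zero.
xi, _, beta = genpareto.fit(exc, floc=0)

# Classical Poisson-GPD T-year return level.
T = 100
if abs(xi) > 1e-6:
    x_T = u + (beta / xi) * ((lam * T) ** xi - 1.0)
else:
    x_T = u + beta * np.log(lam * T)
```

The rPOT variant of the paper additionally replaces the parameter estimation with TL-moments, which down-weights the largest excesses; that step is not shown here.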

6.
The robustness to the largest sample element of large quantile estimates obtained by the method of moments (MOM) and the method of L-moments (LMM) was evaluated and compared. Quantiles were estimated with the log-logistic and log-Gumbel distributions, both lower-bounded two-parameter distributions in which the coefficient of variation (CV) serves as the shape parameter. The results of these two methods were also compared with those of the maximum likelihood method (MLM). Since identifying and eliminating outliers in a single sample requires knowledge of the sample's parent distribution, which is unknown, one estimates it using a parameter estimation method that is relatively robust to the largest element in the sample; in practice this means the method should be robust to extreme elements (including outliers). The effect of dropping the largest element of the series on large quantile values was assessed for various coefficient of variation (CV) / sample size (N) combinations using Monte Carlo sampling experiments. The results were compared with those obtained from a single representative sample (the first-order approximation), consisting of both the average values (Ex_i) for every position (i) of an ordered sample and the theoretical quantiles based on a plotting position formula (PP). The ML estimates of large quantiles were found to be the most robust to the largest element of a sample, except for small samples, where MOM estimates were more robust. Comparing the other two methods with respect to large quantile estimation, MOM was found to be more robust for small and moderate samples drawn from distributions with a zero lower bound, as shown for the log-Gumbel and log-logistic distributions. The results from the representative samples were fairly compatible with the Monte Carlo simulation results: the Ex-sample results were closer to the Monte Carlo results for smaller CV values, and to the PP-sample results for greater CV values.
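The core experiment, dropping the largest element and measuring the shift of a large quantile, can be sketched for one replicate; `scipy.stats.fisk` is scipy's name for the log-logistic distribution, and the ML fit here is illustrative rather than the paper's exact MOM/LMM/MLM comparison.

```python
import numpy as np
from scipy.stats import fisk  # scipy's log-logistic distribution

rng = np.random.default_rng(7)
sample = fisk.rvs(c=3.0, scale=100.0, size=50, random_state=rng)

def q99_ml(x):
    """ML estimate of the 0.99 quantile under a log-logistic model (loc = 0)."""
    c, _, scale = fisk.fit(x, floc=0)
    return fisk.ppf(0.99, c, scale=scale)

full = q99_ml(sample)
dropped = q99_ml(np.delete(sample, np.argmax(sample)))  # remove largest element
rel_shift = abs(full - dropped) / full  # sensitivity of the large quantile
```

In the study this shift is averaged over many Monte Carlo replicates for each CV/N combination and each estimation method.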

7.
Extreme precipitation events are rare and mostly occur on a relatively small local scale, which introduces marked uncertainties into the analysis of their characteristics. Using daily precipitation data covering 1959–2009 from 62 stations over the Pearl River Basin, the percentile method (PM) and the absolute critical value method (ACVM) are applied to define extreme precipitation thresholds (EPT), and their different impacts on the spatio-temporal distribution of extreme precipitation events are analyzed. The findings show: (1) Using the K-means clustering algorithm on precipitation indices and the topography, longitude, and latitude of each station, the whole basin is divided into eight precipitation zones. (2) The extreme indices, including extreme precipitation frequency, extreme precipitation proportion, and the proportion of extreme n-day precipitation, calculated by PM are markedly higher than those calculated by ACVM over the five decades; this is particularly obvious in low-precipitation areas such as the northwest of the basin, since more daily precipitation events are treated as extreme in this region when the EPT is defined by PM. (3) The spatial distributions of extreme frequencies calculated by the two methods differ considerably across the basin. The distribution calculated by ACVM shows a high-value center in the southeast coastal areas and a low-value center in the northwest mountain areas, whereas the extreme frequencies calculated by PM are distributed evenly over the basin, which is clearly inconsistent with the empirical result that an area with heavy precipitation usually has a high extreme precipitation frequency, and vice versa.
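The PM-versus-ACVM contrast can be sketched for two stations. All values below (wet-day model, 0.1 mm wet-day cutoff, 50 mm absolute threshold) are assumptions for illustration, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(1)
ndays = 365 * 51
# Invented daily precipitation (mm/day) for a wet and a dry station.
wet = rng.gamma(2.0, 12.0, size=ndays) * (rng.random(ndays) < 0.45)
dry = rng.gamma(2.0, 4.0, size=ndays) * (rng.random(ndays) < 0.25)

def pm_threshold(x, q=0.95):
    """Percentile method: station-relative threshold from wet days (> 0.1 mm)."""
    return np.quantile(x[x > 0.1], q)

ACVM_THRESHOLD = 50.0  # one absolute critical value for every station (assumed)

pm_wet, pm_dry = pm_threshold(wet), pm_threshold(dry)
acvm_counts = ((wet > ACVM_THRESHOLD).sum(), (dry > ACVM_THRESHOLD).sum())
```

The percentile method adapts to each station (pm_dry is much lower than pm_wet), so both stations yield many "extreme" days; the absolute threshold flags almost none at the dry station, which is the mechanism behind finding (2).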

8.
The paper deals with probability estimates of temperature extremes (annual temperature maxima and heat waves) in the Czech Republic. Two statistical methods of probability estimation are compared: one based on stochastic modelling of the time series of daily maximum temperature (TMAX) using a first-order autoregressive (AR(1)) model, the other consisting of fitting an extreme value distribution to the sample of annual temperature peaks. The AR(1) model is able to reproduce the main characteristics of heat waves, though the estimated probabilities should be treated as upper limits because of deficiencies in the AR(1) model's simulation of temperature variability. Theoretical extreme value distributions do not yield good results when applied to the maximum annual lengths of heat waves and of periods of tropical days (TMAX ≥ 30°C), but they are the best method for estimating the probability and recurrence time of annual one-day temperature extremes. However, there are some difficulties in the application: the two-parameter Gumbel distribution and the three-parameter generalized extreme value (GEV) distribution may lead to different results, particularly for long return periods, and the resulting values also depend on the chosen parameter estimation procedure. Based on our findings, testing the shape parameter of the GEV distribution and using the L-moments technique for parameter estimation may be recommended. Application of the appropriate statistical tools indicates that the heat wave, and particularly the long period of consecutive tropical days, in 1994 was probably a rarer event than the record-breaking temperatures exceeding 40°C in July 1983. An improvement of the probability estimate of the 1994 heat wave may be expected from a more sophisticated model of the temperature series.

9.
The most general approach to studying the recurrence law of rare, very large events relies on the limit theorems of extreme value theory. In this paper, we use the generalized Pareto distribution (GPD). The unknown GPD parameters are typically determined by the method of maximum likelihood (ML). However, ML estimation is only optimal for fairly large samples (>200–300), whereas in many practically important cases there are only dozens of large events. It is shown that for a small number of events, the highest accuracy with the GPD is provided by the method of quantiles (MQ). To illustrate these methodical results, we compiled data sets characterizing the tails of the distributions for typical subduction zones, regions of intracontinental seismicity, and zones of mid-oceanic (MO) ridges. This approach paves the way for a new method of seismic risk assessment: instead of the unstable characteristic Mmax, the largest possible magnitude, it is recommended to use quantiles of the distribution of the random maximum over a future time interval. The results of calculating such quantiles are presented.
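A method-of-quantiles GPD fit can be sketched by matching two empirical quantiles exactly; matching at the 0.5 and 0.9 quantiles is an illustrative choice, and the paper's exact MQ scheme may differ. The 60 "large events" are simulated.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import genpareto

rng = np.random.default_rng(3)
# ~60 threshold excesses, as in the "dozens of large events" setting (invented).
exc = genpareto.rvs(c=0.2, scale=0.5, size=60, random_state=rng)

def gpd_mq(x, p1=0.5, p2=0.9):
    """Method of quantiles: choose (xi, sigma) so the GPD quantiles at
    p1 and p2 match the empirical quantiles exactly."""
    q1, q2 = np.quantile(x, [p1, p2])
    r = q2 / q1
    # The GPD quantile ratio is monotone in xi, so a root bracket suffices.
    f = lambda xi: ((1 - p2) ** -xi - 1) / ((1 - p1) ** -xi - 1) - r
    xi = brentq(f, -0.99, 5.0)
    sigma = q1 * xi / ((1 - p1) ** -xi - 1)
    return xi, sigma

xi_hat, sigma_hat = gpd_mq(exc)
fitted_q90 = genpareto.ppf(0.9, xi_hat, scale=sigma_hat)
```

By construction the fitted GPD reproduces the two matched empirical quantiles, which is what makes the estimator stable for small samples.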

10.
Bayesian probability theory is an appropriate and useful method for estimating parameters in seismic hazard analysis. Bayesian approaches base the analysis on a posterior belief, and their particular strength is the ability to account for parameter uncertainty in probabilistic relations as well as a priori knowledge. In this study, we use the Bayesian approach to estimate maximum values of peak ground acceleration (Amax) in Iran, together with quantiles of the relevant probability distributions over a desired future time interval. The main assumptions are a Poissonian character of the flow of seismic events and the Gutenberg-Richter distribution law. Maps of the maximum possible values of Amax and of the 90% quantile of the distribution of maximum Amax over a future time interval of 100 years are presented. According to the results, the largest Amax is estimated for Bandar Abbas at 0.3g and the smallest for Esfahan at 0.03g. Finally, the Bayesian estimates are compared with those obtained from probabilistic seismic hazard (PSH) methods based on the conventional Cornell (1968) approach. The distribution functions of Amax for future time intervals of 100 and 475 years are calculated at a 90% confidence level.

11.
This paper presents an approach to estimating the probability distribution of annual discharges Q based on rainfall-runoff modelling with multiple rainfall events. The approach rests on prior knowledge of the probability distribution of annual maximum daily rainfall totals P in a natural catchment, random disaggregation of those totals into hourly values, and rainfall-runoff modelling. The presented Multi-Event Simulation of Extreme Flood (MESEF) method combines the design event method, based on single-rainfall-event modelling, with the continuous simulation method used for estimating maximum discharges of a given exceedance probability from rainfall-runoff models. The flood quantiles estimated using MESEF are then compared to those estimated using the classical statistical method based on observed data.

12.
In this short note, I comment on the research of Pisarenko et al. (Pure Appl. Geophys 171:1599–1624, 2014) regarding extreme value theory and statistics in the case of earthquake magnitudes. The link between the generalized extreme value distribution (GEVD), as an asymptotic model for the block maxima of a random variable, and the generalized Pareto distribution (GPD), as a model for the peaks over threshold (POT) of the same random variable, is presented more clearly. Pisarenko et al. (2014) neglected to note that the approximations by the GEVD and GPD work only asymptotically in most cases. This is particularly so for the truncated exponential distribution (TED), a popular distribution model for earthquake magnitudes. I explain why the classical models and methods of extreme value theory and statistics do not work well for truncated exponential distributions, and consequently why these classical methods should not be used for the estimation of the upper bound magnitude and the corresponding parameters. Furthermore, I comment on various issues of statistical inference in Pisarenko et al. and propose alternatives. I argue why the GPD and GEVD would work for various types of stochastic earthquake processes in time, and not only for the homogeneous (stationary) Poisson process assumed by Pisarenko et al. (2014). The crucial point for earthquake magnitudes is the poor convergence of their tail distribution to the GPD, not the earthquake process over time.

13.
Frequency analysis of climate extreme events in Zanjan, Iran
In this study, the generalized extreme value (GEV) distribution and the generalized Pareto distribution (GPD) were fitted to the maximum and minimum temperature, maximum wind speed, and maximum precipitation series of Zanjan. Maximum (minimum) daily and absolute annual observations of the Zanjan station from 1961 to 2011 were used. The parameters of the distributions were estimated using the maximum likelihood method. Quantiles corresponding to return periods of 2, 5, 10, 25, 50, and 100 years were calculated. Both candidate distributions fitted to the extreme event series were found to be statistically reasonable. Most of the observations from 1961 to 2011 fell within the 1–10-year return period range. Low extremal index (θ) values were found for the excesses of maximum and minimum temperature over a high threshold, indicating the occurrence of consecutive high peaks. To filter the dependent observations and obtain a set of approximately independent threshold excesses, a declustering method was applied that separates the excesses into clusters; the declustered peaks were then fitted to the GPD. In both models, the shape parameters for extreme precipitation and extreme wind speed were close to zero, and the shape parameter was less negative in the GPD than in the GEV. This leads to significantly lower return period estimates for high extremes with the GPD model.
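The declustering step and the crude cluster-based estimate of the extremal index θ can be sketched as follows; the run length, threshold, and toy series are invented for illustration, not taken from the Zanjan data.

```python
def decluster(values, threshold, run=3):
    """Runs declustering: exceedances separated by fewer than `run`
    non-exceedances belong to one cluster; keep each cluster's peak."""
    peaks, cluster, gap = [], [], 0
    for v in values:
        if v > threshold:
            cluster.append(v)
            gap = 0
        elif cluster:
            gap += 1
            if gap >= run:          # cluster has ended
                peaks.append(max(cluster))
                cluster, gap = [], 0
    if cluster:                     # close a cluster running to the end
        peaks.append(max(cluster))
    return peaks

series = [1, 9, 8, 2, 1, 1, 1, 7, 1, 10, 2, 1, 1, 1, 6]
peaks = decluster(series, threshold=5)      # cluster maxima
n_exc = sum(v > 5 for v in series)          # raw exceedances
theta_hat = len(peaks) / n_exc              # crude extremal index estimate
```

Here five raw exceedances collapse into three clusters, giving θ ≈ 0.6; the cluster peaks, now approximately independent, are what would be passed to the GPD fit.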

14.
Hydrological frequency analysis is the most widely used method for estimating the risk of extreme values. The statistical distributions most used to fit extreme value data in hydrology can be grouped into three classes according to their tail behavior: class C of regularly varying distributions, class D of sub-exponential distributions, and class E of exponential distributions. The Halphen type A (HA) and Halphen type B (HB) distributions are separated by the Gamma distribution; these three distributions belong to class D and can be displayed in the (δ1, δ2) moment-ratio diagram. In this study, a statistical test for discriminating between HA, HB, and the Gamma distribution is developed. The methodology is based on: (1) the generation of N samples of different sizes n around the Gamma curve; (2) the determination of confidence zones around the Gamma curve for each fixed moment-ratio pair (δ1, δ2); and (3) the calculation of the type II error β and the power of the test, 1−β, for a fixed significance level α. Results show that the test is powerful, especially for high coefficients of skewness. The test will be included in the Decision Support System of the HYFRAN-PLUS software.

15.
16.
17.
The observation of extreme waves at FINO 1 during storm Britta on 1 November 2006 initiated a series of research studies on the mechanisms behind them. The roles of atmospheric stability and of open cell structures have been investigated previously, but not conclusively. To improve our understanding of these processes, which is essential for good forecasts of similarly important events offshore, this study revisits the development of storm Britta using a coupled atmosphere-wave modelling system, wind and wave measurements from ten stations across the North Sea, cloud images, and Synthetic Aperture Radar (SAR) data. It is found that a standard state-of-the-art model is capable of capturing the important characteristics of a major storm like Britta, including the storm path, peak wind speed, the open cells, and the peak significant wave height (Hs) for the open sea. It is also demonstrated that the open cells make a negligible contribution to the development of the extreme Hs observed at FINO 1, while stability alone is not sufficient to explain it. The controlling conditions for the development of Britta's extreme Hs at FINO 1 are the persistent strong winds and a long, undisturbed fetch over a long period.

18.
Moment tensors of ten Witwatersrand mine tremors
Ground motions, recorded both underground and at the surface in two of the South African gold mining districts, were inverted to determine complete moment tensors for 10 mining-induced tremors in the magnitude range 1.9 to 3.3. The resulting moment tensors fall into two separate categories. Seven of the events involve substantial coseismic volumetric reduction ΔV together with normal faulting entailing shear deformation ΣAD̄, where the summation is over fault planes of area A and average slip D̄. For these events the ratio ΔV/ΣAD̄ ranges from 0.58 to 0.92, with an average value of 0.71. For the remaining three events ΔV is not significantly different from zero; these events are largely double-couple sources involving normal faulting. Surprisingly, the two types of source mechanism appear to be very distinct, in that there is not a continuous distribution of the source mix from ΔV = 0 to ΔV ≈ ΣAD̄. Presumably, the coseismic closure indicates substantial interaction between a mine stope and adjacent shear failure in the surrounding rock, under the influence of an ambient stress field whose maximum principal stress is oriented vertically.

19.
In a previous paper (Makropoulos and Burton, 1983) the seismic risk of the circum-Pacific belt was examined using a whole-process technique reduced to three representative parameters related to the physical release of strain energy: M1, the annual modal magnitude determined using the Gutenberg-Richter relationship; M2, the magnitude equivalent to the total strain energy release rate per annum; and M3, the upper bound magnitude equivalent to the maximum strain energy release in a region. The risk analysis is extended here using the part-process statistical model of Gumbel's third asymptotic distribution of extreme values. The circum-Pacific belt is chosen as a complete earthquake data set, and the stability postulate, from which asymptotic distributions of extremes are deduced to give results similar to those obtained from whole-process or exact distributions of extremes, is successfully checked. Additionally, when the fit of the Gumbel III asymptotic distribution is compared with Gumbel I using reduced chi-squared, it is seen to be preferable in all cases, and it also allows extension to an upper-bounded range of magnitude occurrences. Examining the regional seismicity generates several seismic risk results; for example, the annual mode for all regions is greater than m(1) = 7.0, with the maximum being in the Japan, Kurile, Kamchatka region at m(1) = 7.6. Overall, the most hazardous areas are situated in this northwestern region and also diagonally opposite in the southeastern circum-Pacific. Relationships are established between the Gumbel III parameters and the quantities m1(1), X2 and the upper bound ω, quantities notionally similar to M1, M2 and M3, although ω is shown to be systematically larger than M3; this gives a physical link through strain energy release to seismic risk statistics.
In all regions of the circum-Pacific similar results are obtained for M1, M2 and M3 and the notionally corresponding statistical quantities m1(1), X2 and ω, demonstrating that the relationships obtained are valid over a wide range of seismotectonic environments.

20.
ABSTRACT

There is a lack of suitable methods for creating precipitation scenarios that can be used to realistically estimate peak discharges with very low probabilities. On the one hand, existing methods are methodologically questionable when it comes to physical system boundaries; on the other, the spatio-temporal representativeness of precipitation patterns as system input is limited. In response, this paper proposes a method of deriving spatio-temporal precipitation patterns and presents a step towards methodologically sound estimation of infrequent floods using a worst-case approach. A Monte Carlo approach generates a wide range of spatio-temporal distributions of an extreme precipitation event, each of which is tested with a rainfall–runoff model that produces a corresponding hydrograph. From these numerous hydrographs and their peak discharges, the physically plausible spatio-temporal distributions that lead to the highest peak discharges are identified and can be used for further investigations.
Editor A. Castellarin; Associate editor E. Volpi
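The Monte Carlo worst-case search can be sketched in one (temporal) dimension: a fixed event depth is randomly disaggregated into hourly slices, each pattern is routed through a toy unit hydrograph, and the pattern with the highest peak is kept. The event depth, Dirichlet disaggregation, and linear-reservoir response are all assumptions for illustration, not the paper's model chain.

```python
import numpy as np

rng = np.random.default_rng(5)

def peak_discharge(hyeto):
    """Toy linear-reservoir unit hydrograph; returns the hydrograph peak."""
    uh = np.exp(-np.arange(10) / 3.0)
    return np.convolve(hyeto, uh).max()

P_TOTAL = 100.0  # fixed extreme event depth in mm (assumed)
peaks = np.array([
    peak_discharge(P_TOTAL * rng.dirichlet(np.ones(24)))  # random hourly split
    for _ in range(2000)
])

worst_case = peaks.max()  # most critical temporal pattern found
uniform = peak_discharge(np.full(24, P_TOTAL / 24))  # evenly spread reference
```

Concentrated patterns produce markedly higher peaks than the uniform hyetograph; in the paper the search additionally varies the spatial distribution and screens the candidates for physical plausibility.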
