Similar Literature
20 similar documents found (search time: 625 ms)
1.
The deterioration of process plant assets has a major negative impact on the safety of their operation. Risk-based integrity modeling provides a methodology to quantify the risks posed by an aging asset, and thus a means to protect human life and financial investment and to prevent environmental damage arising from asset failures. The methodology models the uncertainty in material degradation using probability distributions, known as priors. Using Bayes' theorem, one may update the prior distribution with actual inspection data to obtain a posterior distribution. Although the choice of priors is often subjective, a rational consensus can be achieved through judgmental studies and by analyzing generic data from the same or similar installations. The first part of this paper presents a framework for risk-based integrity modeling, including a methodology to select the prior distributions for the various types of corrosion degradation mechanisms, namely uniform, localized and erosion corrosion. Several statistical tests were conducted on data extracted from the literature to determine which prior distribution fits the data best. Once the underlying distribution has been confirmed, its parameters can be estimated. In the second part, the selected priors are tested and validated using actual plant inspection data obtained from assets in operation. It is found that uniform corrosion is best described by the 3P-Weibull and 3P-Lognormal distributions, localized corrosion by the Type I extreme value and 3P-Weibull distributions, and erosion corrosion by the 3P-Weibull, Type I extreme value, or 3P-Lognormal distributions.
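A minimal sketch of the distribution-selection step, using hypothetical metal-loss data (scipy's `weibull_min` and `lognorm` stand in for the 3P-Weibull and 3P-Lognormal, since both carry a location parameter as the third parameter):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Hypothetical uniform-corrosion data: metal loss (mm) from repeated inspections.
loss = stats.weibull_min.rvs(1.8, loc=0.2, scale=1.1, size=300, random_state=rng)

candidates = {
    "3P-Weibull":   stats.weibull_min,  # shape, loc (3rd parameter), scale
    "3P-Lognormal": stats.lognorm,      # shape, loc (3rd parameter), scale
}

results = {}
for name, dist in candidates.items():
    params = dist.fit(loss)                         # maximum-likelihood fit
    ks = stats.kstest(loss, dist.cdf, args=params)  # indicative goodness-of-fit check
    results[name] = (params, ks.pvalue)
    print(f"{name}: KS p-value = {ks.pvalue:.3f}")
```

The Kolmogorov-Smirnov p-values here are only indicative (parameters are estimated from the same sample); the paper's actual test battery is not reproduced.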

2.
On B. Epstein's statistical model for the distribution of earthquake magnitudes    Cited: 1 time (self-citations: 0, by others: 1)
Ma Fengshi, Acta Seismologica Sinica, 1982, 4(4): 426-433
This paper points out that the earthquake magnitude distribution function G(y) = exp(-e^(-y)), y ≥ 0, proposed by B. Epstein is not a Type I maximum value distribution but a Poisson-exponential compound extreme value distribution, because G(y) has a jump of e^(-1) at y = 0. When the earthquake record contains a year with no earthquakes, Epstein's method is no longer applicable. This paper therefore presents not only a more general theory and method but also a new computational method, analogous to that used for computing the distribution of typhoon-induced ocean wave heights. The method has a clear advantage for regions with scarce earthquake data.
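The jump noted above is easy to verify numerically: G(y) = exp(-e^(-y)) restricted to y ≥ 0 assigns probability mass exp(-1) to the point y = 0, which a continuous Type I extreme value law cannot do.

```python
import math

def G(y: float) -> float:
    """Epstein's magnitude distribution G(y) = exp(-e^(-y)), defined for y >= 0."""
    return math.exp(-math.exp(-y)) if y >= 0 else 0.0

# The distribution jumps from 0 to exp(-1) at y = 0: a point mass,
# which is impossible for the (continuous) Type I maximum value distribution.
jump = G(0.0) - 0.0
print(jump, math.exp(-1))  # both ≈ 0.3679
```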

3.
A new approach to evaluate the extreme value distribution (EVD) of the response and reliability of general multi-DOF nonlinear stochastic structures is proposed. The approach is based on the recently developed probability density evolution method, which enables the instantaneous probability density functions of the stochastic responses to be captured. In the proposed method, a virtual stochastic process is first constructed to satisfy the condition that the extreme value of the response equals the value of the constructed process at a certain instant of time. The probability density evolution method is then applied to evaluate the instantaneous probability density function of the response, yielding the EVD. The reliability is therefore available through a simple integration over the safe domain. A numerical algorithm is developed using the Number Theoretical Method to select the discretized representative points. Further, a hyper-ball is imposed to sieve the points from the preceding point set in the hypercube. In the numerical examples, the EVD of random variables is evaluated and compared with the analytical solution. A frame structure is analyzed to capture the EVD of the response and the dynamic reliability. The investigations indicate that the proposed approach provides reasonable accuracy and efficiency.

4.
It is common in geostatistics to use the variogram to describe the spatial dependence structure and to use kriging as the spatial prediction methodology. Both methods are sensitive to outlying observations and are strongly influenced by the marginal distribution of the underlying random field. Hence, they lead to unreliable results when applied to extreme value or multimodal data. As an alternative to traditional spatial modeling and interpolation we consider the use of copula functions. This paper extends existing copula-based geostatistical models. We show how location-dependent covariates, e.g. a spatial trend, can be accounted for in spatial copula models. Furthermore, we introduce geostatistical copula-based models that are able to deal with random fields having discrete marginal distributions. We propose three different copula-based spatial interpolation methods. By exploiting the relationship between bivariate copulas and indicator covariances, we present indicator kriging and disjunctive kriging. As a second method we present simple kriging of the rank-transformed data. The third method is a plug-in prediction that generalizes the frequently applied trans-Gaussian kriging. Finally, we report the results obtained for the so-called Helicopter data set, which contains extreme radioactivity measurements.
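One ingredient of the rank-based method is the transformation of the data to normal scores before simple kriging; a minimal illustration (our own sketch, not the authors' code) using mid-rank plotting positions:

```python
import numpy as np
from scipy import stats

def normal_scores(x: np.ndarray) -> np.ndarray:
    """Rank-transform data to standard normal scores (a Gaussian-copula margin)."""
    ranks = stats.rankdata(x)        # ranks 1..n, ties averaged
    u = (ranks - 0.5) / len(x)       # plotting positions in (0, 1)
    return stats.norm.ppf(u)         # map to N(0, 1) scores

z = normal_scores(np.array([3.1, 0.2, 7.5]))
print(z)  # symmetric scores; the middle-ranked value maps to 0
```

Simple kriging would then be applied to `z`, with predictions back-transformed through the empirical distribution function.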

5.
The beta-κ distribution is a special case of the generalized beta distribution of the second kind. In previous studies, the beta-p and beta-κ distributions have played important roles in representing extreme events, and the present paper therefore uses the beta-κ distribution. The paper uses the method of moments and the method of L-moments to estimate the parameters of the beta-κ distribution and, to demonstrate the performance of the proposed model, presents a simulation study using three estimation methods (including maximum likelihood) on beta-κ and non-beta-κ samples. In addition, the paper evaluates the performance of the beta-κ distribution against two widely used extreme value distributions (the GEV and Gumbel distributions) on two sets of actual data on extreme events.

6.
Abstract

A new theoretically based distribution for frequency analysis is proposed. The extended three-parameter Burr XII distribution includes as special cases the generalized Pareto distribution, which is used to model exceedances over a threshold; the log-logistic distribution, which is also advocated in flood frequency analysis; and the Weibull distribution, which is part of the generalized extreme value distribution used to model annual maxima. The extended Burr distribution is flexible enough to approximate extreme value distributions. Note that the generalized Pareto and generalized extreme value distributions are limiting results for modelling exceedances over a threshold and block extremes, respectively. From a modelling perspective, generalization may be necessary in order to obtain a better fit; the extended three-parameter Burr XII distribution is therefore a meaningful candidate distribution for frequency analysis. Maximum likelihood estimation for this distribution is investigated in the paper, and the use of the distribution is demonstrated using data from China.
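As a sketch (not the authors' code), the Burr XII distribution is available in scipy as `burr12`, with cdf F(x) = 1 - (1 + x^c)^(-d) in standard form; a maximum-likelihood fit of the three-parameter (c, d, scale) version can be run directly:

```python
import numpy as np
from scipy import stats

# Burr XII cdf identity: F(x) = 1 - (1 + x^c)^(-d)  (loc = 0, scale = 1)
c, d = 2.0, 3.0
assert abs(stats.burr12.cdf(1.0, c, d) - (1 - (1 + 1.0**c) ** (-d))) < 1e-12

# MLE fit of the three-parameter (c, d, scale) form to synthetic "annual maxima".
rng = np.random.default_rng(1)
sample = stats.burr12.rvs(c, d, scale=10.0, size=2000, random_state=rng)
c_hat, d_hat, loc_hat, scale_hat = stats.burr12.fit(sample, floc=0.0)
print(c_hat, d_hat, scale_hat)  # roughly recovers (2, 3, 10)
```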

7.
The goal of quantile regression is to estimate conditional quantiles for specified values of quantile probability using linear or nonlinear regression equations. These estimates are prone to “quantile crossing”, where regression predictions for different quantile probabilities do not increase as probability increases. In the context of the environmental sciences, this could, for example, lead to estimates of the magnitude of a 10-year return period rainstorm that exceed the 20-year storm, or similar nonphysical results. This problem, as well as the potential for overfitting, is exacerbated for small to moderate sample sizes and for nonlinear quantile regression models. As a remedy, this study introduces a novel nonlinear quantile regression model, the monotone composite quantile regression neural network (MCQRNN), that (1) simultaneously estimates multiple non-crossing, nonlinear conditional quantile functions; (2) allows for optional monotonicity, positivity/non-negativity, and generalized additive model constraints; and (3) can be adapted to estimate standard least-squares regression and non-crossing expectile regression functions. First, the MCQRNN model is evaluated on synthetic data from multiple functions and error distributions using Monte Carlo simulations. MCQRNN outperforms the benchmark models, especially for non-normal error distributions. Next, the MCQRNN model is applied to real-world climate data by estimating rainfall Intensity–Duration–Frequency (IDF) curves at locations in Canada. IDF curves summarize the relationship between the intensity and occurrence frequency of extreme rainfall over storm durations ranging from minutes to a day. Because annual maximum rainfall intensity is a non-negative quantity that should increase monotonically as the occurrence frequency and storm duration decrease, monotonicity and non-negativity constraints are key constraints in IDF curve estimation. In comparison to standard QRNN models, the ability of the MCQRNN model to incorporate these constraints, in addition to non-crossing, leads to more robust and realistic estimates of extreme rainfall.
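The building block of any quantile regression model, MCQRNN included, is the "pinball" (quantile) loss; a minimal definition (illustrative only, not the paper's code):

```python
import numpy as np

def pinball_loss(tau: float, y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Average quantile (pinball) loss for quantile probability tau."""
    e = np.asarray(y_true, dtype=float) - np.asarray(y_pred, dtype=float)
    return float(np.mean(np.where(e >= 0, tau * e, (tau - 1) * e)))

# Under-predicting the 0.9 quantile costs 9x more than over-predicting by the
# same amount, which pulls the fitted curve up toward the 90th percentile.
print(pinball_loss(0.9, np.array([1.0]), np.array([0.0])))  # 0.9
print(pinball_loss(0.9, np.array([0.0]), np.array([1.0])))  # ≈ 0.1
```

Minimizing this loss for several `tau` values at once, with a shared monotone architecture, is what lets MCQRNN keep the estimated quantile functions from crossing.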

8.
Abstract

Two probability density functions (pdfs) popular in hydrological analyses, namely the log-Gumbel (LG) and log-logistic (LL), are discussed with respect to (a) their applicability to hydrological data and (b) the drawbacks resulting from their mathematical properties. This paper, the first in a two-part series, examines the classical problem in which the considered pdf is assumed to be the true distribution. The most significant drawback is that the statistical moments of LG and LL exist only for a very limited range of parameters. For these parameters, a very rapid increase of the skewness coefficient as a function of the coefficient of variation is observed (especially for the log-Gumbel distribution), which is seldom seen in hydrological data. These probability distributions can therefore be applied with confidence only to extreme situations; in other cases there is an important disagreement between the empirical data and the theoretical distributions in their tails, which is crucial for characterizing the distribution asymmetry. The limited range of shape parameters in both distributions makes analyses that rely on the interpretation of moments (such as the method of moments) inconvenient. It is also shown that the often-used L-moments are not sufficient for characterizing the location, scale and shape parameters of the pdfs, particularly when attention is paid to the tail of the probability distribution. The maximum likelihood method guarantees asymptotic convergence of the estimators beyond the domain of existence of the first two moments (or L-moments), but it is not sufficiently sensitive to the shape of the upper tail.

9.
The temporal-spatial resolution of input-data-induced uncertainty in a watershed-based water quality model, Hydrologic Simulation Program-FORTRAN (HSPF), is investigated in this study. The temporal-resolution-induced uncertainty is described using the coefficient of variation (CV). The CV is found to decrease with decreasing temporal resolution; it follows a log-normal relation with time interval for temperature data and a power-law relation for rainfall data. The temporal-scale uncertainties in the temperature and rainfall data follow a general extreme value distribution and a Weibull distribution, respectively. The Nash-Sutcliffe coefficient (NSC) is employed to represent the spatial-resolution-induced uncertainty. The spatial resolution uncertainty in the dissolved oxygen and nitrate-nitrogen concentrations simulated using HSPF is observed to follow a general extreme value distribution and a log-normal distribution, respectively. The probability density functions (PDFs) provide new insights into the effect of the temporal scale and spatial resolution of input data on the uncertainties involved in watershed modelling and total maximum daily load calculations. This study reveals non-symmetric distributions of uncertainty in water quality modelling, which can simplify weather and water quality monitoring and reduce the cost involved in flow and water quality monitoring. Copyright © 2011 John Wiley & Sons, Ltd.
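The decrease of the CV with coarser temporal resolution is a generic averaging effect and is easy to reproduce with synthetic rainfall (our illustration, not the paper's data):

```python
import numpy as np

rng = np.random.default_rng(7)
hourly = rng.gamma(shape=0.5, scale=2.0, size=24 * 365)  # synthetic hourly rainfall

def cv(x: np.ndarray) -> float:
    """Coefficient of variation: standard deviation divided by the mean."""
    return float(np.std(x) / np.mean(x))

daily = hourly.reshape(365, 24).sum(axis=1)  # aggregate to daily totals
print(cv(hourly), cv(daily))  # the CV drops as the time interval grows
```

For independent increments the CV of an n-interval total falls like 1/sqrt(n), which is why coarser input data carry less relative variability.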

10.
Starting from a recent paper by Murshed (Stoch Environ Res Risk Assess 25:897–911, 2011), which showed the good performance of the Beta-k distribution in analyzing extreme hydrologic events, we propose in this paper the use of two new four-parameter distribution functions strongly related to the Beta-k distribution, namely the Beta-Dagum and the Beta-Singh-Maddala distributions. In more detail, the new distributions are generalizations of reparametrizations of the Beta-k and Beta-p distributions, respectively. For these distributions, particular interpretations in terms of maxima and minima of sequences of random variables can be derived, and the maximal and minimal domains of attraction can be obtained. Moreover, the method of maximum likelihood, the method of moments and the method of L-moments are examined for estimating the parameters. Finally, two applications to real data on maxima and minima of river flows are reported, in order to show the potential of these two models in extreme event analysis.

11.
Hans Van de Vyver, Hydrological Processes, 2018, 32(11): 1635-1647
Rainfall intensity–duration–frequency (IDF) curves are a standard tool in urban water resources engineering and management. They express how return levels of extreme rainfall intensity vary with duration. The simple scaling property of extreme rainfall intensity with respect to duration determines the form of IDF relationships. It is assumed that the annual maximum intensity follows the generalized extreme value (GEV) distribution. As is well known, for simple scaling processes the location and scale parameters of the GEV distribution obey a power law with the same exponent. Although the simple scaling hypothesis is commonly used as a suitable working assumption, the multiscaling approach provides a more general framework. We present a new IDF relationship formulated on the basis of the multiscaling property, in which the GEV location and scale parameters have different scaling exponents. We then apply a Bayesian framework to estimate the multiscaling GEV model and to choose the most appropriate model. It is shown that model performance increases when the multiscaling approach is used. The new model for IDF curves reproduces the data very well and has a reasonable degree of complexity without overfitting the data.
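A sketch of the scaling step (with assumed synthetic GEV location parameters standing in for per-duration fits, which are omitted): under a power law μ(d) = μ₁ d^(-η), the exponent η is the slope of log μ against log d.

```python
import numpy as np

durations = np.array([5, 10, 30, 60, 360, 1440])  # minutes
mu1, eta = 40.0, 0.7                              # assumed law: mu(d) = mu1 * d**(-eta)
mu = mu1 * durations ** (-eta)                    # stand-in for per-duration GEV locations

# Estimate the scaling exponent by least squares in log-log space.
slope, intercept = np.polyfit(np.log(durations), np.log(mu), 1)
print(-slope, np.exp(intercept))  # recovers eta = 0.7 and mu1 = 40
# In the multiscaling model, the same regression on the GEV scale parameter
# is allowed to return a *different* exponent.
```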

12.
Random variable simulation has been applied to many applications in hydrological modelling, flood risk analysis, environmental impact assessment, etc. However, computer codes for simulating the distributions commonly used in hydrological frequency analysis are not available in most software libraries. This paper presents a frequency-factor-based method for random number generation from five distributions (normal, log-normal, extreme-value type I, Pearson type III and log-Pearson type III) commonly used in hydrological frequency analysis. The proposed method is shown to produce random numbers of the desired distributions through three means of validation: (1) graphical comparison of cumulative distribution functions (CDFs) and empirical CDFs derived from generated data; (2) properties of estimated parameters; (3) type I error of a goodness-of-fit test. An advantage of the method is that it does not require CDF inversion, and the frequency factors of the five distributions involve only the standard normal deviate. Copyright © 2006 John Wiley & Sons, Ltd.
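A minimal sketch of the frequency-factor idea for the extreme-value type I (Gumbel) case (our own illustration, not the paper's code): draw a standard normal deviate z, convert it to a non-exceedance probability, map that to the Gumbel frequency factor K, and set x = mean + K · std.

```python
import numpy as np
from scipy import stats

GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def gumbel_frequency_factor(p: np.ndarray) -> np.ndarray:
    """Gumbel (EV1) frequency factor for non-exceedance probability p."""
    return -(np.sqrt(6) / np.pi) * (GAMMA + np.log(-np.log(p)))

def gumbel_rvs(mean: float, std: float, size: int, rng) -> np.ndarray:
    z = rng.standard_normal(size)  # only a standard normal deviate is needed
    p = stats.norm.cdf(z)          # non-exceedance probability of the deviate
    return mean + gumbel_frequency_factor(p) * std

rng = np.random.default_rng(0)
x = gumbel_rvs(mean=100.0, std=25.0, size=50_000, rng=rng)
print(x.mean(), x.std())  # close to 100 and 25
```

For the Pearson type III case, the paper's approach would substitute an approximation such as Wilson-Hilferty for the frequency factor; the normal deviate remains the only random ingredient.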

13.
The research presented in this paper involves the application of the joint probability method to the estimation of extreme water levels resulting from astronomical tides and surge residuals and the investigation of the effects of tide–surge interactions on extreme water levels. The distribution of tide peaks was analysed from field records (<20 years) and a 46-year dataset of monthly maximum tidal amplitudes. Large surges were extracted from both field records and a numerical model hindcast covering the 48 largest storm events in the Irish Sea over the period 1959–2005. Extreme storm surges and tides were independently modelled using the generalised extreme value statistical model, and derived probability distributions were used to compute extreme water levels. An important, and novel, aspect of this research is an analysis of tide–surge interactions and their effects on total water level; where interactions exist, they lead to lower total water levels than in the case of independency. The degree of decrease varies with interaction strength, magnitude of surge peak at a particular phase of tide and the distribution of peaks over a tidal cycle. Therefore, including interactions in the computation of extreme levels may provide very useful information at the design stage of coastal protection systems.

14.
15.
Forecasting of extreme events and phenomena that respond to non-Gaussian heavy-tailed distributions (e.g., extreme environmental events, rock permeability, rock fracture intensity, earthquake magnitudes) is essential to environmental and geoscience risk analysis. In this paper, new parametric heavy-tailed distributions are devised starting from the exponential power probability density function (pdf), which is modified by explicitly including higher-order “cumulant parameters” into the pdf. Instead of dealing with whole power random variables, novel “residual” random variables are proposed to reconstruct the cumulant generating function. The expected value of a residual random variable with the corresponding pdf for order G gives the input higher-order cumulant parameter. Thus, each parametric pdf is used to simulate a random variable containing residuals that yield, on average, the expected cumulant parameter. The cumulant parameters allow the formulation of heavy-tailed skewed pdfs beyond the lognormal to handle extreme events. Monte Carlo simulation of heavy-tailed distributions with higher-order parameters is demonstrated with a simple example for permeability.

16.
The method of Relative Entropy with Fractile constraints (REF method) is explained and applied to model extreme compound hydrological phenomena, such as extreme sea levels under storm conditions. Also presented is a simple method of Tail Entropy Approximation (TEA), which amounts to a correction of traditional statistical estimates for extreme observations. Distribution assumptions are necessary but downplayed in the REF method, relegating the prior distribution to the role of an extrapolation function. The estimates are objective in an information-theoretical sense. They also satisfy a strict requirement of self-consistency that is generally not satisfied by standard statistical methods: invariance under monotonic transformations of the random variable. Historical records of storm surge levels in the Netherlands and annual maximum tidal heights for Sheerness, UK, are used as examples. Comparison is made with distributions obtained using other methods. It is concluded that the tail entropy approximation provides simple, objective estimates of extremes in the tail beyond the range of observations.

17.
18.
The Halphen family of distributions is a flexible and complete system for fitting sets of independent and identically distributed observations. Recently, it has been shown that this family of distributions represents a potential alternative to the generalized extreme value distributions for modelling extreme hydrological events. The existence of jointly sufficient statistics for parameter estimation leads to optimality of the method of maximum likelihood (ML). Nevertheless, the ML method requires numerical approximations, leading to less accurate values. Estimators obtained by the method of moments (MM), in contrast, are explicit and fast to compute; however, even though the MM method gives good results, it is not optimal. In order to combine the advantages of ML (optimality) and MM (efficiency and fast computation), two new mixed methods are proposed in this paper: one direct and one iterative, denoted the direct mixed method (MMD) and the iterative mixed method (MMI), respectively. An overall comparison of the four estimation methods (MM, ML, MMD and MMI) was performed using Monte Carlo simulations for the three Halphen distributions. In general, the MMI method can be recommended for the three Halphen distributions, as it performs best in the majority of cases encountered in hydrology. The principal idea of the mixed methods MMD and MMI could be generalized to other distributions with complicated density functions.

19.
Hydrological Sciences Journal, 2013, 58(2): 367-386
Abstract

Extremes of streamflow are usually modelled using heavy-tailed distributions. When scrutinizing annual flow maxima or peaks over a threshold, the largest elements in a sample are often suspected to be low-quality data, outliers or values corresponding to much longer return periods than the observation period. In the case of floods, since interest is focused mainly on estimating the right-hand tail of the distribution function, the sensitivity of large quantiles to the extreme elements of a series becomes a problem of special concern. This study investigated the sensitivity problem for the log-Gumbel distribution by generating samples of different sizes and different values of the coefficient of L-variation by means of Monte Carlo experiments. Parameters of the log-Gumbel distribution were estimated by the probability weighted moments (PWM) method, both for complete samples and for samples deprived of their largest element. In the latter case, Hosking's concept of the “A”-type PWM with Type II censoring was employed: the largest value was censored above the random threshold T corresponding to the non-exceedance probability F_T. The effect of the value of F_T on the performance of the quantile estimates was then examined. Experimental results show that omission of the largest sample element need not result in a decrease in the accuracy of large quantile estimates obtained from the log-Gumbel model by the PWM method.
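A sketch of the PWM estimation step for the Gumbel component, using the standard unbiased PWM/L-moment estimators (for the log-Gumbel model the same estimators are applied to the logs of the data):

```python
import numpy as np

def gumbel_pwm_fit(sample: np.ndarray) -> tuple[float, float]:
    """Estimate Gumbel (EV1) location xi and scale alpha from unbiased PWMs."""
    x = np.sort(np.asarray(sample, dtype=float))
    n = len(x)
    b0 = x.mean()
    b1 = np.sum((np.arange(1, n + 1) - 1) / (n - 1) * x) / n  # unbiased PWM beta_1
    l1, l2 = b0, 2 * b1 - b0                                  # first two L-moments
    alpha = l2 / np.log(2)                 # Gumbel: lambda_2 = alpha * ln 2
    xi = l1 - 0.5772156649015329 * alpha   # Gumbel: lambda_1 = xi + gamma * alpha
    return xi, alpha

rng = np.random.default_rng(3)
data = rng.gumbel(loc=10.0, scale=2.0, size=20_000)
print(gumbel_pwm_fit(data))  # close to (10, 2)
# Log-Gumbel: gumbel_pwm_fit(np.log(positive_sample))
```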

20.
This paper reviews the experience of applying the limiting distributions of extreme value theory (the generalized Pareto distribution, GPD, and the generalized extreme value distribution, GEV) to derive the distributions of maximum earthquake magnitudes, and of the related peak ground accelerations, over future time intervals of a given duration. The results of analyzing global and regional earthquake catalogs and peak ground accelerations during earthquakes are described. It is shown that the magnitude of the strongest possible earthquake, M_max (and analogous characteristics for other types of data), which is often used in seismic risk assessment, is potentially unstable. We suggest a stable alternative to M_max in the form of the quantiles Q_q(τ) of the maximal earthquake that could occur during a future time interval of length τ. The characteristic maximal event M_c, introduced in our previous publications, is another helpful robust scalar parameter. All the cases of approximation of the tails of empirical distributions studied in our works turned out to be finite (bounded); however, the rightmost point of these distributions, M_max, is often poorly detectable and unstable. Therefore, the M_max parameter has low practical value.
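A sketch of the Q_q(τ) idea under one common convention (assumed here, not taken from the paper): if the annual maximum magnitude follows a GEV distribution F, the maximum over τ independent years has distribution F^τ, so Q_q(τ) = F^(-1)(q^(1/τ)).

```python
from scipy import stats

def Q(q: float, tau: float, c: float, loc: float, scale: float) -> float:
    """q-quantile of the maximum over tau years, annual maxima ~ GEV(c, loc, scale)."""
    return float(stats.genextreme.ppf(q ** (1.0 / tau), c, loc=loc, scale=scale))

# Example: Gumbel-type annual maxima (shape c = 0); median of the 50-year maximum.
print(Q(0.5, 50, c=0.0, loc=6.0, scale=0.4))
# Unlike M_max, Q(q, tau) stays finite and stable even when the fitted upper
# endpoint of the distribution is poorly constrained.
```

The hypothetical loc/scale values are illustrative only; in practice they would come from a catalog fit.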
