Similar Literature
20 similar documents found.
1.
Stability analysis generally relies on an estimate of the failure probability P. When information is scarce, incomplete, imprecise or vague, this estimate is imprecise. To represent epistemic uncertainty, possibility distributions have been shown to be a more flexible tool than probability distributions. The joint propagation of possibilistic and probabilistic information can rely on advanced techniques such as the classical random sampling of the cumulative probability distribution F and of the intervals from the possibility distributions π. The imprecise probability P is then associated with a random interval, which can be summarized by a pair of indicators bounding it. In the present paper, we propose a graphical tool to explore the sensitivity of these indicators. This is done by means of the contribution-to-sample-probability-of-failure plot, based on the ordering of the randomly generated confidence levels associated with the quantiles of F and with the α-cuts of π. This presents several advantages: (1) the contributions of both types of uncertainty, aleatory and epistemic, can be compared in a single setting; (2) the analysis is conducted as a post-processing step, i.e. at no extra computational cost; (3) it highlights the regions of the quantiles and of the nested intervals which contribute most to the bounds on P. The method is applied to two case studies (a mine pillar and a steep-slope stability analysis) to investigate the need for extra data acquisition on parameters whose imprecision can hardly be modelled by probabilities because the available information is so scarce (the extraction ratio and the cliff geometry, respectively).
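A minimal numerical sketch of the joint propagation scheme described above, with entirely hypothetical load/strength values: the probabilistic input is sampled through its CDF F, the possibilistic input through random α-cuts of a triangular π, and each run yields an interval that bounds the failure probability.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

def alpha_cut(support, core, alpha):
    """Alpha-cut of a triangular possibility distribution pi: the interval
    shrinks from the support (alpha = 0) down to the core (alpha = 1)."""
    lo = support[0] + alpha * (core - support[0])
    hi = support[1] - alpha * (support[1] - core)
    return lo, hi

# Probabilistic input sampled through its CDF F; possibilistic input
# represented by randomly sampled confidence levels (alpha-cuts of pi).
loads = rng.normal(100.0, 15.0, n)              # aleatory variable (kPa)
alphas = rng.uniform(size=n)                    # epistemic confidence levels
s_lo, s_hi = alpha_cut((110.0, 180.0), 140.0, alphas)  # strength intervals

# Each run yields a failure interval; averaging gives the pair of indicators:
p_lower = (loads > s_hi).mean()  # belief: failure for every strength in the cut
p_upper = (loads > s_lo).mean()  # plausibility: failure for at least one value

print(f"P(failure) bounded by [{p_lower:.4f}, {p_upper:.4f}]")
```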

2.
Significant uncertainties are associated with the definition of both the exploration targeting criteria and the computational algorithms used to generate mineral prospectivity maps. In prospectivity modeling, the input and computational uncertainties are generally made implicit by a series of best-guess or best-fit decisions taken on the basis of incomplete and imprecise information. The individual uncertainties are then compounded and propagated into the final prospectivity map as an implicit combined uncertainty that cannot be directly analyzed or used for decision making. This paper proposes a new approach to explicitly define the uncertainties of individual targeting criteria and propagate them through a computational algorithm to evaluate the combined uncertainty of a prospectivity map. Applied to fuzzy logic prospectivity models, this approach involves replacing point estimates of fuzzy membership values with statistical distributions deemed representative of the likely variability of those membership values. Uncertainty is then propagated through a fuzzy logic inference system by Monte Carlo simulation. The final prospectivity map is represented by a grid of statistical distributions of fuzzy prospectivity. Such modeling of uncertainty in prospectivity analyses allows a better definition of exploration target quality, as uncertainty is consistently captured, propagated and visualized in a transparent manner. The explicit uncertainty information of prospectivity maps can support further risk analysis and decision making. The proposed probabilistic fuzzy logic approach can be used in any area of the geosciences to model the uncertainty of complex fuzzy systems.
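A hedged sketch of the probabilistic fuzzy logic idea for a single grid cell, assuming two hypothetical targeting criteria whose fuzzy memberships are drawn from beta distributions and combined with the common fuzzy-gamma operator (the paper's actual criteria and inference system will differ):

```python
import numpy as np

rng = np.random.default_rng(1)
n_sim = 5_000

# Instead of point fuzzy memberships, each criterion gets a beta
# distribution expressing the expert's uncertainty about the membership.
mu_fault = rng.beta(8, 2, n_sim)     # proximity to faults: high, fairly certain
mu_geochem = rng.beta(3, 3, n_sim)   # geochemical anomaly: poorly constrained

# Fuzzy-gamma operator, a common compromise between fuzzy AND and fuzzy OR
gamma = 0.8
f_and = mu_fault * mu_geochem
f_or = 1 - (1 - mu_fault) * (1 - mu_geochem)
prospectivity = f_and ** (1 - gamma) * f_or ** gamma

# The cell's prospectivity is now a distribution, not a single number
print(f"median {np.median(prospectivity):.3f}, "
      f"90% interval [{np.percentile(prospectivity, 5):.3f}, "
      f"{np.percentile(prospectivity, 95):.3f}]")
```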

3.
Accurate and reliable displacement forecasting plays a key role in landslide early warning. However, owing to the epistemic uncertainties associated with landslide systems, errors are unavoidable and sometimes significant in traditional deterministic point forecasting. Transforming point forecasting into probabilistic forecasting is essential for quantifying the associated uncertainties and improving the reliability of landslide displacement forecasts. This paper proposes a hybrid approach based on bootstrap, extreme learning machine (ELM), and artificial neural network (ANN) methods to quantify these uncertainties via probabilistic forecasting. The hybrid approach consists of two steps. First, a bootstrap-based ELM is applied to estimate the true regression mean of landslide displacement and the corresponding variance of the model uncertainty. Second, an ANN is used to estimate the variance of the noise. Reliable prediction intervals (PIs) can then be computed by combining the true regression mean, the model-uncertainty variance, and the noise variance. The performance of the proposed hybrid approach was validated using monitoring data from the Shuping landslide, Three Gorges Reservoir area, China. The results suggest that the bootstrap-ELM-ANN approach can perform probabilistic forecasting in the medium and long term and quantify the uncertainties associated with displacement forecasting for colluvial landslides with step-like deformation in the Three Gorges Reservoir area.
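A rough sketch of the two-step prediction-interval construction on synthetic data, using scikit-learn's MLPRegressor as a stand-in for both the ELM ensemble and the noise ANN (the paper's actual models, data, and training scheme differ):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(7)

# Synthetic stand-in for a monitored displacement series (mm)
t = np.linspace(0, 10, 200)[:, None]
y = 50 + 8 * t.ravel() + 5 * np.sin(3 * t.ravel()) + rng.normal(0, 2, 200)

# Step 1: bootstrap ensemble estimates the regression mean and model variance
B, preds = 20, []
for _ in range(B):
    idx = rng.integers(0, len(y), len(y))
    net = MLPRegressor(hidden_layer_sizes=(20,), max_iter=3000, random_state=0)
    preds.append(net.fit(t[idx], y[idx]).predict(t))
preds = np.array(preds)
mean, var_model = preds.mean(axis=0), preds.var(axis=0)

# Step 2: a second network estimates the noise variance from squared residuals
r2 = np.maximum((y - mean) ** 2 - var_model, 1e-6)
noise_net = MLPRegressor(hidden_layer_sizes=(20,), max_iter=3000, random_state=0)
var_noise = np.exp(noise_net.fit(t, np.log(r2)).predict(t))

# 95% prediction interval combines both variance components
half = 1.96 * np.sqrt(var_model + var_noise)
print(f"PI width at the last monitoring point: {2 * half[-1]:.1f} mm")
```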

4.
A fact generally overlooked in many geotechnical uncertainty analyses is that the input data of a model may be correlated. While this correlation may influence the system response, epistemic uncertainty, i.e. lack of knowledge of this correlation, appears as a risk factor. This paper discusses how a negative correlation between cohesion (c′) and friction angle (φ′), together with their associated uncertainties, can influence both the bearing resistance of a shallow strip footing and the estimation of its safety. A probabilistic approach that considers both the negative correlation and the uncertainty is used in this work as a reference. This method is compared with Eurocode 7 variants that do not account for the correlation. These variants, the resistance and material factoring methods, appear to be more or less conservative depending on the degree of negative correlation between c′ and φ′, their associated uncertainties, and the soil configuration. Finally, the proposed probabilistic comparison shows that the material factoring method is more conservative than the resistance factoring method.
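A minimal sketch of how the negative c′–φ′ correlation can be propagated to bearing resistance, assuming hypothetical soil statistics and Vesic-type bearing capacity factors (not necessarily the design equations used in the paper):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50_000

# Correlated samples of c' (kPa) and phi' (deg) via a Gaussian copula
rho = -0.5                                    # negative correlation, as studied
L = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))
z = L @ rng.standard_normal((2, n))
c = np.maximum(10.0 + 3.0 * z[0], 0.1)        # hypothetical mean/std dev
phi = np.radians(np.clip(30.0 + 3.0 * z[1], 5.0, 45.0))

# Vesic-type bearing capacity factors for a strip footing
# (width B = 1 m, surcharge q = 18 kPa, unit weight gamma = 18 kN/m3)
Nq = np.exp(np.pi * np.tan(phi)) * np.tan(np.pi / 4 + phi / 2) ** 2
Nc = (Nq - 1) / np.tan(phi)
Ng = 2 * (Nq + 1) * np.tan(phi)
qu = c * Nc + 18.0 * Nq + 0.5 * 18.0 * 1.0 * Ng

# Rerunning with rho = 0 isolates the effect of ignoring the correlation
print(f"mean qu = {qu.mean():.0f} kPa, 5% fractile = {np.percentile(qu, 5):.0f} kPa")
```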

5.
The assessment of rock-fall hazards is subject to significant uncertainty, which is not fully considered in general practice and research. This paper reviews and classifies the various sources of this uncertainty. Taking a generic risk assessment framework as a starting point, a probabilistic model is presented that consistently combines the different types of uncertainty to obtain a unified estimate of rock-fall risk. An important aspect of the model is that it allows all available information to be incorporated, including physical and empirical models, observations and expert knowledge, by means of Bayesian updating. Detailed formulations are developed for various types of information. Finally, two examples considering rock-fall risk on roads, with and without protection structures, illustrate the application of the probabilistic modeling framework to practical problems.
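A toy example of the Bayesian updating step for a rock-fall frequency, using Gamma-Poisson conjugacy with assumed prior parameters and observation counts (the paper's formulations cover far richer information types):

```python
import numpy as np

# Prior on annual rock-fall frequency (events/yr) from expert judgement:
# Gamma(alpha, beta) with prior mean alpha/beta = 0.5 events/yr (assumed)
alpha, beta = 2.0, 4.0

# Bayesian updating with site observations: 3 events in 10 years (assumed)
k, T = 3, 10.0
alpha_post, beta_post = alpha + k, beta + T   # Gamma-Poisson conjugacy

rng = np.random.default_rng(5)
lam = rng.gamma(alpha_post, 1.0 / beta_post, 100_000)  # posterior draws
p_hit = 0.02                     # conditional P(vehicle hit | event), hypothetical
annual_risk = lam * p_hit        # posterior distribution of annual risk

print(f"posterior mean frequency: {lam.mean():.3f}/yr, "
      f"95% credible risk bound: {np.percentile(annual_risk, 95):.4f}/yr")
```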

6.
In Canada, Montreal is the city with the second-highest seismic risk, owing to its relatively high seismic hazard, old infrastructure and high population density. The region is characterised by moderate seismic activity with no recent record of a major earthquake. The lack of historical strong ground motion records for the region contributes to large uncertainties in hazard estimates. Among the sources of uncertainty, the attenuation function is the main contributor, and its effect on risk estimates is investigated here. Epistemic uncertainty was considered by obtaining damage estimates for three attenuation functions developed for Eastern North America. The results indicate that loss estimates are highly sensitive to the choice of attenuation function and suggest that epistemic uncertainty should be considered both in the definition of the hazard function and in loss estimation methodologies. Seismic loss estimates are performed for a seismic threat with a 2% probability of exceedance in 50 years, corresponding to the design-level earthquake in the National Building Code of Canada, using HAZUS-MH4 for the Montreal region over 522 census tracts. The study estimates that in the average scenario roughly 5% of the building stock would be damaged, with direct economic losses of 1.4 billion dollars. In the worst case, approximately 500 people would be injured or killed, for a calculated time of occurrence of 2 p.m.
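An illustrative sketch of how the choice of attenuation function drives loss estimates. The three relations below use placeholder coefficients, not the Eastern North America relations used in the study, and the damage function and exposure value are invented:

```python
import numpy as np

# Placeholder attenuation functions (NOT the ENA relations from the study),
# returning PGA in cm/s^2, to show epistemic spread across models.
def gmpe_a(M, R): return np.exp(1.8 + 1.15 * M - 1.10 * np.log(R + 10))
def gmpe_b(M, R): return np.exp(2.2 + 1.05 * M - 1.20 * np.log(R + 10))
def gmpe_c(M, R): return np.exp(1.4 + 1.25 * M - 1.00 * np.log(R + 10))

def loss_fraction(pga_g):
    # Toy damage function for an aggregate building class
    return np.clip(1.5 * (pga_g - 0.05), 0.0, 1.0)

M, R = 6.5, 25.0        # scenario magnitude and distance (km), assumed
exposure = 1.4e9        # exposed building value ($), illustrative only
for name, gmpe in [("A", gmpe_a), ("B", gmpe_b), ("C", gmpe_c)]:
    pga_g = gmpe(M, R) / 981.0          # cm/s^2 -> g
    loss = exposure * loss_fraction(pga_g)
    print(f"GMPE {name}: PGA {pga_g:.2f} g, loss ~ {loss / 1e6:.0f} M$")
```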

7.
Seismic hazard analysis is based on data and models, both of which are imprecise and uncertain. In particular, the interpretation of historical information into earthquake parameters, e.g. earthquake size and location, yields ambiguous and imprecise data. Models based on probability distributions have been developed to quantify and represent these uncertainties. Nevertheless, the majority of procedures applied in seismic hazard assessment take neither these uncertainties nor the variance of the results into account. Therefore, a procedure based on Bayesian statistics was developed to estimate return periods for different ground motion intensities (MSK scale). Bayesian techniques provide a mathematical model for estimating the distribution of random variables in the presence of uncertainties. The developed method estimates the probability distribution of the number of occurrences in a Poisson process described by the rate parameter λ. The input data are the historical occurrences of intensities at a particular site, represented by a discrete probability distribution for each earthquake. The calculation of these historical occurrences requires careful preparation of all input parameters, i.e. a modelling of their uncertainties. The results show that the variance of the recurrence rate is smaller in regions with higher seismic activity than in less active regions. It can also be demonstrated that long return periods cannot be estimated with confidence, because the period of observation is too short. This indicates that the long return periods obtained by seismic source methods merely reflect the delineated seismic sources and the chosen earthquake-size distribution law.
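A small sketch of the Bayesian recurrence estimation, assuming each historical earthquake contributes only a probability that the site intensity was reached, and a vague Gamma prior on the Poisson rate (all parameter values are invented):

```python
import numpy as np

rng = np.random.default_rng(11)

# Each historical earthquake contributes a *probability* that the site
# intensity reached at least MSK VII (imprecise macroseismic data, assumed)
p_exceed = np.array([0.9, 0.6, 0.3, 0.15, 0.05])
T_obs = 300.0                       # years of observation (assumed)

# Simulate the uncertain occurrence count, then update a vague Gamma prior
n_sim = 50_000
k = rng.binomial(1, p_exceed, (n_sim, p_exceed.size)).sum(axis=1)
lam = rng.gamma(0.5 + k, 1.0 / (0.01 + T_obs))   # Gamma-Poisson posterior draws

rp = 1.0 / lam                      # distribution of the return period
print(f"return period of I >= VII: median {np.median(rp):.0f} yr, "
      f"90% interval [{np.percentile(rp, 5):.0f}, {np.percentile(rp, 95):.0f}] yr")
```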

8.
9.
The least squares Monte Carlo method is a decision evaluation method that can capture the effect of uncertainty and the value of flexibility in a process. It is a stochastic approximate dynamic programming approach to decision making, based on a forward simulation coupled with a recursive algorithm that produces a near-optimal policy, and it relies on Monte Carlo simulation to produce convergent results. This incurs a significant computational cost when the method is used to evaluate decisions in reservoir engineering, because many reservoir simulations must be run. The objective of this study was to enhance the performance of the least squares Monte Carlo method by improving the sampling of the technical uncertainties used to generate the production profiles. The probabilistic collocation method has been proven to be a robust and efficient uncertainty quantification method. Using its sampling scheme to approximate the sampling of the technical uncertainties significantly reduces the computational requirements of the decision evaluation; we therefore introduce the least squares probabilistic collocation method. The decision evaluation considered a number of technical and economic uncertainties. Three reservoir case studies were used: a simple homogeneous model, the PUNQ-S3 model, and a modified portion of the SPE10 model. The results show that the sampling techniques of the probabilistic collocation method produce relatively accurate responses compared with the original method. Possible enhancements are discussed for adapting the least squares probabilistic collocation method to more realistic and complex reservoir models, and future work aims to apply the method to high-dimensional decision scenarios for different chemical enhanced oil recovery processes using real reservoir data.
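A compact Longstaff-Schwartz-style sketch of the least squares Monte Carlo recursion for a monthly abandon-or-continue decision. Prices follow a GBM stand-in and all economic numbers are hypothetical; the paper additionally replaces the plain Monte Carlo sampling below with probabilistic collocation points:

```python
import numpy as np

rng = np.random.default_rng(9)
n_paths, n_steps, dt, r = 5_000, 24, 1 / 12, 0.08
disc = np.exp(-r * dt)

# Simulated oil price paths (GBM as a stand-in for the technical model)
p = np.empty((n_paths, n_steps + 1))
p[:, 0] = 60.0
for t in range(n_steps):
    p[:, t + 1] = p[:, t] * np.exp(-0.02 * dt
                                   + 0.3 * np.sqrt(dt) * rng.standard_normal(n_paths))

rate, opex = 1_000.0, 45_000.0        # bbl/month, $/month (hypothetical)
cash = rate * p[:, 1:] - opex         # monthly operating cash flow

# Backward recursion: regress the discounted continuation value on a
# polynomial basis of price to obtain the near-optimal policy
value = np.maximum(cash[:, -1], 0.0)  # terminal abandon-or-produce decision
for t in range(n_steps - 2, -1, -1):
    cont = np.polyval(np.polyfit(p[:, t + 1], disc * value, 2), p[:, t + 1])
    keep = cash[:, t] + cont > 0.0    # abandon when expected value is negative
    value = np.where(keep, cash[:, t] + disc * value, 0.0)

print(f"expected project value with abandonment flexibility: ${value.mean():,.0f}")
```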

10.
With the recent transition to a more risk-based approach in flood management, flood risk models—a key component of flood risk management—are becoming increasingly important. Such models combine information from four components: (1) the flood hazard (mostly inundation depth), (2) the exposure (e.g. land use), (3) the value of the elements at risk and (4) the susceptibility of the elements at risk to hydrologic conditions (e.g. depth–damage curves). All of these components contain a certain degree of uncertainty, which propagates through the calculation and accumulates in the final damage estimate. In this study, an effort has been made to assess the influence of uncertainty in these four components on the final damage estimate. Different land-use data sets and damage models were used to represent the uncertainties in the exposure, value and susceptibility components. For the flood hazard component, inundation depth was varied systematically to estimate the sensitivity of flood damage estimates to this component. The results indicate that, assuming the uncertainty in inundation depth is about 25 cm (about 15% of the mean inundation depth), the total uncertainty surrounding the final damage estimate in the case study area can amount to a factor of 5–6. The value of the elements at risk and the depth–damage curves are the most important sources of uncertainty and can each introduce a factor of about 2 into the final damage estimates. Very large uncertainties in inundation depth would be needed to have a similar effect, which seems highly unrealistic. Hence, to reduce the uncertainties surrounding potential flood damage estimates, these two components deserve priority in future flood damage research. While absolute estimates of flood damage exhibit considerable uncertainty (the above-mentioned factor of 5–6), estimates of proportional changes in flood damage (defined as the change in flood damage as a percentage of a base situation) are much more robust.
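A toy propagation of the four components' uncertainties for a single exposed cell, with assumed distributions for depth, value, and the depth-damage slope (numbers chosen only to echo the ~15% depth uncertainty mentioned above):

```python
import numpy as np

rng = np.random.default_rng(10)
n = 20_000

# One hypothetical residential cell: damage = value x depth-damage fraction
depth = rng.normal(1.6, 0.25, n)       # inundation depth (m), ~15% std dev
value = rng.normal(1.0e6, 0.3e6, n)    # value of elements at risk (EUR), assumed

# Uncertain depth-damage curve: fraction = min(a * depth, 1), slope a uncertain
a = rng.normal(0.35, 0.10, n)
frac = np.clip(a * np.maximum(depth, 0.0), 0.0, 1.0)
damage = value * frac

q5, q95 = np.percentile(damage, [5, 95])
print(f"damage 90% range: {q5:,.0f} - {q95:,.0f} EUR (factor {q95 / q5:.1f})")
```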

11.
The hazard assessment of potential earthquake-induced landslides is an important aspect of the study of earthquake-induced landslides. In this study, we assessed the hazard of potential earthquake-induced landslides in Huaxian County with a new hazard assessment method based on probabilistic seismic hazard analysis and the Newmark cumulative displacement model. The model considers a comprehensive suite of information, including the seismic activity and engineering geological conditions of the study area, and simulates the uncertainty of the strength parameters of the engineering geological rock groups using the Monte Carlo method. Unlike previous assessments based on ground motions at a single exceedance-probability level, the hazard obtained by the present method accounts for the possibility of earthquake-induced landslides occurring in different parts of the study area in the future. The resulting hazard assessment shows good agreement with the historical distribution of earthquake-induced landslides, indicating that it properly reflects the macroscopic rules governing the development of such landslides in the study area and can provide a reference framework for landslide risk management and land planning.
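A minimal Monte Carlo sketch of a Newmark-type assessment for one slope cell, using an infinite-slope factor of safety and a Jibson (2007)-type displacement regression; all parameter values are assumed and the regression is a stand-in for whatever displacement model the paper uses:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 20_000

# Monte Carlo over uncertain strength parameters of one slope cell (assumed)
c = rng.lognormal(np.log(20.0), 0.3, n)             # cohesion (kPa)
phi = np.radians(rng.normal(32.0, 3.0, n))          # friction angle
alpha, t, gamma = np.radians(25.0), 3.0, 19.0       # slope, thickness (m), kN/m3

# Infinite-slope static factor of safety (dry conditions, simplified)
FS = c / (gamma * t * np.sin(alpha)) + np.tan(phi) / np.tan(alpha)
ac = np.maximum((FS - 1.0) * np.sin(alpha), 1e-4)   # critical acceleration (g)

amax = 0.4                                          # PGA (g) from the PSHA, assumed
ratio = np.clip(ac / amax, 1e-4, 0.9999)
# Jibson (2007)-type empirical regression for Newmark displacement (cm)
logD = 0.215 + np.log10((1 - ratio) ** 2.341 * ratio ** -1.438) \
       + rng.normal(0.0, 0.51, n)
Dn = 10 ** logD

print(f"P(Dn > 5 cm | PGA = 0.4 g) = {(Dn > 5).mean():.3f}")
```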

12.
Wang Ying, Zhang Qiang, Wang Su-Ping, Wang Jin-Song, Yao Yu-bi. Natural Hazards (2017) 87(2): 899–918
A formal Bayesian approach that uses the Markov chain Monte Carlo method to estimate the uncertainties of natural hazards has attracted significant attention in recent years, and a fuzzy graph can be regarded as an estimate of the relationships of interest in a risk system. The challenge with such approaches, however, is to adequately account for uncertainty when little prior knowledge and few adequate measurements are available. This paper proposes a new adaptive Bayesian framework, based on the conventional Bayesian scheme and the optimal information diffusion model, to calculate more precisely the conditional probabilities in the fuzzy graph, so as to recognize relationships and estimate uncertainty in natural disasters with scant data. The methodology is applied to the relationship between earthquake magnitude and isoseismal area using strong-motion earthquake observations, and is compared with other techniques, including classic Bayesian regression and artificial neural networks. The results show that the new method outperforms the main existing methods when data are incomplete.
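A sketch of normal information diffusion used to turn a sparse magnitude versus isoseismal-area sample into conditional probabilities on a discrete grid; the data and diffusion coefficients below are invented (the paper optimises the latter):

```python
import numpy as np

# Scant observations of (magnitude M, log isoseismal area S) pairs (invented)
M = np.array([5.0, 5.5, 6.1, 6.8, 7.2])
S = np.array([2.9, 3.3, 3.8, 4.3, 4.7])   # hypothetical log10(km^2)

# Discrete monitoring grids and diffusion (kernel) widths
Mg, Sg = np.linspace(4.5, 7.5, 13), np.linspace(2.5, 5.0, 11)
hM, hS = 0.4, 0.3   # diffusion coefficients (assumed; the paper optimises these)

# Normal information diffusion: spread each sample over the 2-D grid
q = np.zeros((Mg.size, Sg.size))
for m, s in zip(M, S):
    w = np.exp(-((Mg[:, None] - m) ** 2) / (2 * hM ** 2)
               - ((Sg[None, :] - s) ** 2) / (2 * hS ** 2))
    q += w / w.sum()              # each sample contributes one unit of "information"

# Conditional probabilities P(S | M) -- the edges of the fuzzy graph
cond = q / q.sum(axis=1, keepdims=True)
m_idx = np.argmin(np.abs(Mg - 6.5))
print("P(S | M = 6.5):", np.round(cond[m_idx], 3))
```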

13.
Probabilistic Assessment of Earthquake Insurance Rates for Turkey
A probabilistic model is presented to obtain a realistic estimate of earthquake insurance rates for reinforced concrete buildings in Turkey. The model integrates information on seismic hazard with information on expected earthquake damage to engineering facilities in a systematic way, yielding estimates of earthquake insurance premiums. To demonstrate the application of the proposed probabilistic method, earthquake insurance rates are computed for reinforced concrete buildings in five cities located in different seismic zones of Turkey. The resulting rates are compared with the rates currently charged by insurance companies. The earthquake insurance rates are found to be sensitive to the assumptions on seismic hazard and on the damage probability matrices, and to increase significantly with increasing violation of the code requirements.
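A back-of-the-envelope sketch of how hazard and damage information combine into a premium; the occurrence rates, damage ratios, and loading factor below are assumed, not taken from the paper:

```python
import numpy as np

# Annual occurrence rates for intensity levels at a hypothetical site, and
# mean damage ratios from a damage probability matrix for code-conforming
# reinforced concrete buildings (all values assumed for illustration)
intensity = ["VI", "VII", "VIII", "IX"]
occ_rate = np.array([0.020, 0.008, 0.003, 0.001])   # events/yr
mean_dmg = np.array([0.02, 0.08, 0.25, 0.55])       # mean damage ratio

ead = (occ_rate * mean_dmg).sum()   # expected annual damage ratio
loading = 1.5                       # expense/profit loading factor (assumed)
premium = ead * loading

print(f"pure premium: {ead * 1000:.2f} per mille, "
      f"gross premium: {premium * 1000:.2f} per mille of insured value")
```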

14.
The character and importance of uncertainty in dam safety risk analysis drive how risk assessments are used in practice. The current interpretation is that uncertainty comprises, in addition to the aleatory risk that arises from presumed randomness in the world, the epistemic aspects of irresolution in a model or forecast, specifically model and parameter uncertainty. This is true in part, but it is not all there is to uncertainty in risk analysis. The physics of hazards and of failure may be poorly understood, which goes beyond uncertainty in its conventional sense. There may be alternative scenarios of future conditions, for example non-stationarity in the environment, which cannot easily be forecast. There may also be deep uncertainties of the type associated with climate change: situations in which analysts do not know, or do not agree on, the system characterisation relating actions to consequences or the probability distributions for key parameters. All of these facets are part of the uncertainty in risk analysis with which we must deal.

15.
Conventional liquefaction potential assessment methods (also known as simplified methods) rely heavily on empirical correlations based on observations from case histories. Here, a probabilistic framework is developed to incorporate uncertainties in the prediction of earthquake ground motion, cyclic resistance, and cyclic demand. The results of a probabilistic seismic hazard assessment, site response analyses, and liquefaction potential analyses are convolved to derive a relationship for the annual probability and return period of liquefaction. A random-field spatial model is employed to quantify the spatial uncertainty associated with the in-situ measurements of the geotechnical material.
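A minimal sketch of convolving a seismic hazard curve with a liquefaction fragility to obtain an annual rate and return period; the hazard-curve form and fragility parameters are assumed:

```python
import numpy as np
from scipy.stats import lognorm

# Hazard: annual exceedance rate of PGA, nu(a) = k * a^(-b) (assumed form)
a = np.linspace(0.05, 1.0, 200)
nu = 4e-4 * a ** -2.2

# Fragility: P(liquefaction | PGA) as a lognormal CDF (assumed parameters)
p_liq = lognorm.cdf(a, s=0.5, scale=0.25)

# Convolution: annual rate = integral of P(liq | a) * |d nu / da| da
occurrence_density = -np.gradient(nu, a)        # rate density of events at PGA a
da = a[1] - a[0]
rate = np.sum(p_liq * occurrence_density) * da

print(f"annual P(liquefaction) ~ {rate:.2e}, return period ~ {1 / rate:.0f} yr")
```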

16.
This paper presents a methodology to evaluate the seismic reliability of geostructures in an optimal way. Taguchi design of experiments is adopted to find the most efficient and cost-effective combination of material properties in the uncertainty domain. Twelve uniform and mixed design models are tested. A polynomial-based response surface meta-model is built for each one, and the accuracy of prediction is examined using 10,000 Monte Carlo simulations. A two-dimensional gravity dam is used as a vehicle for probabilistic transient analyses. Record-to-record ground motion variability is also included, using over one hundred earthquake records selected on the basis of probabilistic seismic hazard analysis. The dynamic sensitivity of the epistemic random variables is evaluated for the first time. Finally, an efficient and practical procedure is proposed for determining the reliability index of geostructures. This approach can in fact be generalised to any type of engineering structure facing multi-hazard problems.
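A hedged sketch of the meta-model workflow: fit a quadratic response surface to a small design of experiments over two epistemic variables (a simple grid stands in for the Taguchi arrays, and an algebraic function stands in for the transient dam analyses), then run 10,000 Monte Carlo simulations on the surface to get a reliability index:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(16)

def dam_response(E, c):
    # Stand-in for an expensive transient FE analysis: a "capacity minus
    # demand" style performance value (g < 0 means failure); invented form.
    return 1.2 - 0.8 / E - 0.05 / c

# Small design of experiments over the two epistemic variables
# (a Taguchi orthogonal array would replace this plain grid)
E_pts, c_pts = np.meshgrid(np.linspace(0.5, 2.0, 4), np.linspace(0.2, 1.0, 4))
X = np.column_stack([E_pts.ravel(), c_pts.ravel()])
y = dam_response(X[:, 0], X[:, 1])

# Quadratic polynomial response surface (least squares fit)
A = np.column_stack([np.ones(len(X)), X, X ** 2, (X[:, 0] * X[:, 1])[:, None]])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# 10,000 Monte Carlo simulations on the cheap meta-model (assumed inputs)
E = rng.lognormal(0.0, 0.25, 10_000)
c = rng.lognormal(-0.7, 0.3, 10_000)
Am = np.column_stack([np.ones(E.size), E, c, E ** 2, c ** 2, E * c])
gm = Am @ coef
pf = (gm < 0).mean()
beta = norm.ppf(1 - pf) if 0 < pf < 1 else np.inf
print(f"P_f = {pf:.4f}, reliability index beta = {beta:.2f}")
```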

17.
Hazus-MH earthquake modeling in the central USA
This investigation was undertaken to assess the sensitivity of the Hazus-MH (v2.0) earthquake model to model parameters and to guide the selection of these parameters for realistic earthquake-loss assessment in the central USA. To accomplish these goals, we performed several sensitivity analyses and a validation assessment using earthquake damage surveys from the 2008 M5.2 Mt. Carmel, Illinois earthquake. We evaluated the sensitivity of the Hazus-MH earthquake model to the selection of seismic hazard data, attenuation function, soils data, liquefaction data, and structural fragility curves. These sensitivity analyses revealed that earthquake damage, loss, and casualty estimates are most sensitive to the seismic hazard data and the selection of the attenuation function. The selection of the seismic hazard data and attenuation function varied earthquake damages and capital-stock losses by ±68% and casualty estimates by ±84%. The validation assessment revealed that Hazus-MH overpredicted observed damages by 68–221%, depending on the model parameters employed. The model run using region-specific soils, liquefaction, and structural fragility curves produced the most realistic damage estimate (within 68% of actual damages). Damage estimates using default Hazus-MH parameters were overpredicted by 155%. The uncertainties identified here are similar to those recognized in other Hazus-MH validation assessments. Despite the uncertainties in Hazus-MH earthquake-loss estimates, such estimates are still useful for planning and response so long as the limitations of the results are properly conveyed to planners, decision makers, emergency responders, and the public.

18.
19.
The past 12 years have seen significant steps forward in the science and practice of coastal flood analysis. This paper recounts and critically assesses these advances, while helping to identify next steps for the field, and then focuses on a key problem: connecting the probabilistic characterization of flood hazards to their physical mechanisms. Our investigation into the effects of natural structure on storm surge probabilities shows that several types of spatial, temporal, and process-related organization affect key assumptions made in many of the methods used to estimate these probabilities. Following a brief introduction to general historical methods, we analyze the two joint probability methods used in most tropical cyclone hazard and risk studies today: the surface response function and Bayesian quadrature. A major difference between the two is that the response function creates continuous surfaces, which can be interpolated or extrapolated on a fine scale if necessary, whereas Bayesian quadrature optimizes a set of probability masses, which cannot be directly interpolated or extrapolated. Several examples are given showing significant impacts of natural structure that should not be neglected in tropical cyclone hazard and risk assessment, including: (1) differences between omnidirectional and direction-dependent sampling of storms in near-coastal areas; (2) the impact of surge probability discontinuities on the treatment of epistemic uncertainty; (3) the ability to reduce aleatory uncertainty when sampling over larger spatial domains; and (4) the need to quantify trade-offs between aleatory and epistemic uncertainties in long-term stochastic sampling.
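A bare-bones joint probability sketch: storm parameters are sampled from assumed climatological distributions and pushed through a stand-in response surface to build surge exceedance rates (a real surface response function would be trained on hydrodynamic simulations):

```python
import numpy as np

rng = np.random.default_rng(19)

def surge_surface(dp, rmax, theta):
    """Stand-in for a trained surface response function eta(storm params), in m.
    The functional form and coefficients are invented for illustration."""
    return 0.04 * dp * (1 + 0.3 * np.cos(np.radians(theta))) * (rmax / 30) ** 0.3

# Joint probability integral: annual rate of surge exceedance at one site
lam = 0.25                                    # storms/yr entering the domain
n = 200_000
dp = rng.gamma(4.0, 12.0, n)                  # central pressure deficit (hPa)
rmax = rng.lognormal(np.log(30), 0.4, n)      # radius to maximum winds (km)
theta = rng.uniform(-90, 90, n)               # heading relative to coast (deg)

eta = surge_surface(dp, rmax, theta)
for s in (2.0, 3.0, 4.0):
    rate = lam * (eta > s).mean()
    print(f"surge > {s} m: rate {rate:.4f}/yr, ~{1 / rate:.0f}-yr return period")
```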

20.

In this paper, seismic risk scenarios for Bucharest, the capital city of Romania, are proposed and assessed. Bucharest has one of the highest seismic risk levels in Europe, owing to a combination of relatively high seismic hazard and a building stock erected mainly before the devastating Vrancea earthquake of 1977. The seismic risk of Bucharest is assessed using the most recent information on the characteristics of the residential building stock. Ground motion amplitudes are evaluated from random fields obtained by coupling a ground motion model derived for the Vrancea intermediate-depth seismic source with a spatial correlation model. The risk evaluation method applied in this study is based on the well-known macroseismic method. For several structural typologies, the vulnerability parameters are evaluated from a damage survey of 18,000 buildings in Bucharest after the March 1977 earthquake. The resulting risk metrics are compared with those from studies in the literature that apply different risk assessment methodologies, in order to gain a better view of the uncertainties associated with a city-level seismic risk study. Finally, the impact of several Vrancea intermediate-depth earthquake scenarios is evaluated; the results show that the earthquake with the epicenter closest to Bucharest appears to be the most damaging.
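A small sketch of the macroseismic method's core relation, the mean damage grade as a function of intensity and vulnerability index (a Lagomarsino-Giovinazzi-type formula; the building classes and counts below are invented, not Bucharest survey data):

```python
import numpy as np

def mean_damage_grade(I, V, Q=2.3):
    """Macroseismic method (Lagomarsino-Giovinazzi type): mean damage grade
    on the EMS-98 scale 0-5 from intensity I and vulnerability index V."""
    return 2.5 * (1 + np.tanh((I + 6.25 * V - 13.1) / Q))

# Hypothetical building classes in one district (vulnerabilities and counts assumed)
V = np.array([0.90, 0.74, 0.62])      # vulnerability indices per class
count = np.array([1200, 2600, 800])   # number of buildings per class

for I in (7.0, 8.0):                  # scenario macroseismic intensities
    muD = mean_damage_grade(I, V)
    print(f"I = {I}: mean damage grades {np.round(muD, 2)}, "
          f"district average {np.average(muD, weights=count):.2f}")
```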

