Similar Literature
 Found 20 similar articles; search took 484 ms
1.
This paper presents a methodology to represent and propagate epistemic uncertainties within a scenario-based earthquake risk model. Unlike randomness, epistemic uncertainty stems from incomplete, vague or imprecise information, and this source of uncertainty still lacks adequate tools in seismic risk analysis. We propose to use possibility theory to represent three types of epistemic uncertainty, namely imprecision, model uncertainty and vagueness due to qualitative information. For illustration, an earthquake risk assessment for the city of Lourdes (Southern France) using this approach is presented. Once adequately represented, the uncertainties are propagated and result in a family of probabilistic damage curves. These are synthesized, using the concept of fuzzy random variables, into a pair of indicators bounding the true probability of exceeding a given damage grade. The gap between the two probabilistic indicators reflects the imprecise character of the model uncertainty, thus picturing the extent of what is ignored, and can be used in risk management.
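The "pair of indicators bounding the true probability" can be illustrated with a minimal sketch: when a fragility-model parameter is only known to lie in an interval, propagating both endpoints brackets the exceedance probability. The lognormal fragility form and all numbers below are illustrative assumptions, not the Lourdes model.

```python
import math

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def p_exceed(pga, median, beta):
    """Lognormal fragility: P(damage >= grade | pga)."""
    return normal_cdf(math.log(pga / median) / beta)

pga = 0.25                          # g, scenario ground motion (assumed)
median_lo, median_hi = 0.20, 0.35   # imprecise median-capacity interval, g (assumed)
beta = 0.5                          # lognormal standard deviation (assumed)

# Lower median capacity -> higher exceedance probability, and vice versa,
# so the interval endpoints bound the true probability.
p_upper = p_exceed(pga, median_lo, beta)
p_lower = p_exceed(pga, median_hi, beta)
```

The gap `p_upper - p_lower` plays the role of the imprecision indicator described in the abstract.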

2.
The character and importance of uncertainty in dam safety risk analysis drives how risk assessments are used in practice. The current interpretation of uncertainty is that, in addition to the aleatory risk which arises from presumed uncertainty in the world, it comprises the epistemic aspects of irresolution in a model or forecast, specifically model and parameter uncertainty. This is true in part but it is not all there is to uncertainty in risk analysis. The physics of hazards and of failure may be poorly understood, which goes beyond uncertainty in its conventional sense. There may be alternative scenarios of future conditions, for example non-stationarity in the environment, which cannot easily be forecast. There may also be deep uncertainties of the type associated with climate change. These are situations in which analysts do not know or do not agree on the system characterisation relating actions to consequences or on the probability distributions for key parameters. All of these facets are part of the uncertainty in risk analysis with which we must deal.

3.
Seismic hazard analysis is based on data and models, both of which are imprecise and uncertain. In particular, the interpretation of historical information into earthquake parameters, e.g. earthquake size and location, yields ambiguous and imprecise data. Models based on probability distributions have been developed in order to quantify and represent these uncertainties. Nevertheless, the majority of the procedures applied in seismic hazard assessment take no account of these uncertainties, nor do they show the variance of the results. Therefore, a procedure based on Bayesian statistics was developed to estimate return periods for different ground motion intensities (MSK scale). Bayesian techniques provide a mathematical model to estimate the distribution of random variables in the presence of uncertainty. The developed method estimates the probability distribution of the number of occurrences in a Poisson process described by the rate parameter λ. The input data are the historical occurrences of intensities at a particular site, represented by a discrete probability distribution for each earthquake. The calculation of these historical occurrences requires careful preparation of all input parameters, i.e. a modelling of their uncertainties. The results show that the variance of the recurrence rate is smaller in regions with higher seismic activity than in less active regions. It can also be demonstrated that long return periods cannot be estimated with confidence, because the period of observation is too short. This indicates that the long return periods obtained by seismic source methods only reflect the delineated seismic sources and the chosen earthquake-size distribution law.
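The core Bayesian step for a Poisson occurrence rate can be sketched with the standard Gamma-Poisson conjugate update. This is a minimal sketch under assumed prior parameters; the paper's actual likelihood works with imprecise (distribution-valued) occurrence counts, which this sketch omits.

```python
def posterior_rate(alpha, beta, n_events, t_years):
    """Conjugate update for a Poisson rate.

    Prior: lambda ~ Gamma(shape=alpha, rate=beta).
    Likelihood: n_events observed in t_years (Poisson process).
    Returns posterior shape, posterior rate, and posterior mean of lambda.
    """
    alpha_post = alpha + n_events
    beta_post = beta + t_years
    return alpha_post, beta_post, alpha_post / beta_post

# Illustrative numbers: vague prior, 6 intensity exceedances in 300 years of record.
a_post, b_post, mean_rate = posterior_rate(alpha=0.5, beta=0.01,
                                           n_events=6, t_years=300)
return_period = 1.0 / mean_rate   # mean return period in years
```

The posterior variance `a_post / b_post**2` shrinks as the record lengthens, which is consistent with the abstract's observation that short observation periods cannot constrain long return periods.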

4.
Some Bayesian methods of dealing with inaccurate or vague data are introduced in the framework of seismic hazard assessment. Inaccurate data affected by heterogeneous errors are modeled by a probability distribution instead of the usual value-plus-random-error representation; these data are generically called imprecise. The earthquake size and the number of events in a certain time are modeled as imprecise data. Imprecise data allow us to introduce into the estimation procedures the uncertainty inherent in the inaccuracy and heterogeneity of the measuring systems from which the data were obtained. The problem of estimating the parameter of a Poisson process is shown to be feasible by the use of Bayesian techniques and imprecise data. This background technique can be applied to a general problem of seismic hazard estimation. Initially, data in a regional earthquake catalog are assumed imprecise both in size and location (i.e. errors in the epicenter or spreading over a given source). By means of scattered attenuation laws, the regional catalog can be translated into a so-called site catalog of imprecise events. The site catalog is then used to estimate return periods or occurrence probabilities, taking into account all sources of uncertainty. Special attention is paid to priors in the Bayesian estimation. They can be used to introduce additional information as well as scattered frequency-size laws for local events. A simple example is presented to illustrate the capabilities of this methodology.

5.
A Bayesian procedure for Probabilistic Tsunami Hazard Assessment
In this paper, a Bayesian procedure is implemented for Probabilistic Tsunami Hazard Assessment (PTHA). The approach is general and modular, incorporating all significant information relevant to the hazard assessment, such as theoretical and empirical background, analytical or numerical models, and instrumental and historical data. The procedure provides the posterior probability distribution, which integrates the prior probability distribution based on physical knowledge of the process and the likelihood based on the historical data. The method also deals with aleatory and epistemic uncertainties, incorporating in a formal way all sources of relevant uncertainty, from the tsunami generation process to the wave propagation and impact on the coasts. The modular structure of the procedure is flexible and easy to modify and/or update whenever new models and/or information become available. Finally, the procedure is applied to a hypothetical region, Neverland, to clarify the PTHA evaluation in a realistic case.
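The "posterior = prior x likelihood" integration described here can be sketched on a discrete grid of candidate tsunami occurrence rates. The grid values, uniform prior, and historical counts below are illustrative assumptions, not data from the paper.

```python
import math

rates = [0.005, 0.01, 0.02, 0.05]    # candidate events/year (assumed grid)
prior = [0.25, 0.25, 0.25, 0.25]     # physics-based prior (uniform for the sketch)
n_obs, t_obs = 3, 400                # assumed historical record: 3 events in 400 yr

def poisson_like(lam, n, t):
    """Poisson likelihood of observing n events in time t at rate lam."""
    mu = lam * t
    return math.exp(-mu) * mu**n / math.factorial(n)

# Bayes' rule on the grid: multiply prior by likelihood, then normalise.
unnorm = [p * poisson_like(r, n_obs, t_obs) for p, r in zip(prior, rates)]
z = sum(unnorm)
posterior = [u / z for u in unnorm]
```

The posterior concentrates on the rate most consistent with the record (here 0.01/yr, i.e. an expected 4 events in 400 years against 3 observed), while retaining spread that reflects the short record.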

6.
Geological boundary lines on three-dimensional cross-sections are essential base data for building 3D geological structure models, and their uncertainty affects both the geometry and the attribute distribution of the model. Statistical uncertainty analyses premised on a single error distribution mask the influence of other probability-distribution shapes on the model. Moving beyond the single-distribution assumption, this paper uses the Monte Carlo method to simulate the sampling of geological boundary lines from cross-section data under different probability distributions, and hence the uncertainty in the spatial distribution of those boundaries. Exploiting the coupling between the spatial position of a geological boundary and its geological attributes, we propose a quantitative visualization of the boundary's spatial uncertainty through the probability distribution of geological attributes, and examine the spatial uncertainty of geological boundaries under several probability distributions on a real cross-section. The case study shows that Monte Carlo-based uncertainty analysis can move beyond the single error-distribution assumption and, combined with geological attribute probabilities, can fully reveal the coupling between the intrinsic uncertainty of the modelling data and the external geometry of the model.
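The Monte Carlo step can be sketched by sampling a boundary's position under two different error models and comparing the spread of the realizations. The mapped depth and error magnitudes are illustrative assumptions, not the paper's cross-section data.

```python
import random
import statistics

rng = random.Random(42)
depth_mapped = 120.0   # mapped boundary depth on the cross-section, m (assumed)
half_width = 2.0       # assumed error magnitude, m
n = 10_000

# Same nominal error magnitude, two different distributional assumptions.
normal_draws = [rng.gauss(depth_mapped, half_width) for _ in range(n)]
uniform_draws = [rng.uniform(depth_mapped - half_width,
                             depth_mapped + half_width) for _ in range(n)]

# The choice of error distribution changes the uncertainty envelope:
# a uniform +/- w error has standard deviation w / sqrt(3) ~ 0.577 w.
sd_normal = statistics.stdev(normal_draws)
sd_uniform = statistics.stdev(uniform_draws)
```

Repeating this per boundary point, and mapping the empirical distribution of draws back to geological attributes, gives the kind of quantitative uncertainty visualization the abstract describes.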

7.
Typically, if uncertainty in subsurface parameters is addressed at all, it is done using probability theory. Probability theory handles only one of the two types of uncertainty (the aleatory kind), so epistemic uncertainty is neglected. Dempster–Shafer evidence theory (DST) is an approach that allows analysis of both epistemic and aleatory uncertainty. In this paper, DST combination rules are used to combine measured field data on permeability with the expert opinions of hydrogeologists (subjective information) to examine uncertainty. Dempster's rule of combination is chosen as the primary combination rule owing to its established theoretical development and the simplicity of the data. Since Dempster's rule has drawn some criticism, two other combination rules (Yager's rule and the Hau–Kashyap method), which attempt to correct the problems that can be encountered with Dempster's rule, were also examined. With the particular data sets used here, no combination rule was clearly superior; Dempster's rule appears to suffice when the conflict amongst the evidence is low.
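Dempster's rule of combination, the rule the paper centres on, can be sketched for two basic probability assignments over subsets of a frame of discernment. The permeability classes and mass values below are toy assumptions, not the paper's field or expert data.

```python
def dempster_combine(m1, m2):
    """Combine two basic probability assignments by Dempster's rule.

    m1, m2: dicts mapping focal elements (frozensets) to mass in [0, 1].
    Mass falling on the empty set (conflict) is discarded and the rest
    renormalised by 1 - K, where K is the total conflict.
    """
    combined, conflict = {}, 0.0
    for a, w1 in m1.items():
        for b, w2 in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + w1 * w2
            else:
                conflict += w1 * w2
    if conflict >= 1.0:
        raise ValueError("total conflict: Dempster's rule undefined")
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# Toy evidence about a permeability class {low, high}.
low, high = frozenset({"low"}), frozenset({"high"})
both = low | high
m_field = {low: 0.6, both: 0.4}     # "field measurement" source (assumed)
m_expert = {high: 0.5, both: 0.5}   # "expert opinion" source (assumed)
m = dempster_combine(m_field, m_expert)
```

Here the conflict K = 0.6 x 0.5 = 0.3 is moderate, the regime in which the abstract reports Dempster's rule to suffice; Yager's rule would instead move that conflicting mass to the full frame rather than renormalising.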

8.
ABSTRACT

A fact that is generally overlooked in many geotechnical uncertainty analyses is that the input data of a model may be correlated. While this correlation may influence the system response, epistemic uncertainty, i.e. lack of knowledge of this correlation, appears as a risk factor. This paper discusses how a negative correlation between cohesion (c′) and friction angle (φ′), with their associated uncertainties, can influence both the bearing resistance of a shallow strip footing and the estimation of its safety. A probabilistic approach that considers both the negative correlation and the uncertainty is used in this work as a reference. This method is compared to Eurocode 7 variants that do not account for the correlation. These variants, the resistance factoring and material factoring methods, appear more or less conservative depending on the degree of negative correlation between c′ and φ′, their associated uncertainties, and the soil configuration. Finally, the proposed probabilistic comparison shows that the material factoring method is more conservative than the resistance factoring one.
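The reference probabilistic approach can be sketched by sampling negatively correlated c′ and φ′ pairs and pushing them through a capacity function. The means, standard deviations, correlation coefficient, and the simplified capacity proxy below are all illustrative assumptions; the actual Eurocode 7 bearing-resistance formula is not reproduced.

```python
import math
import random

def correlated_pair(mu_c, sd_c, mu_phi, sd_phi, rho, rng):
    """Sample a (c', phi') pair with linear correlation rho via a Cholesky-style
    construction from two independent standard normals."""
    z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
    c = mu_c + sd_c * z1
    phi = mu_phi + sd_phi * (rho * z1 + math.sqrt(1 - rho**2) * z2)
    return c, phi

def capacity_proxy(c, phi_deg):
    """Placeholder for the bearing-resistance formula; monotone in c' and phi'."""
    return c * (1.0 + math.tan(math.radians(phi_deg)))

rng = random.Random(0)
samples = [correlated_pair(mu_c=10.0, sd_c=2.0, mu_phi=30.0, sd_phi=3.0,
                           rho=-0.5, rng=rng) for _ in range(20_000)]
qs = sorted(capacity_proxy(c, p) for c, p in samples)
q5 = qs[int(0.05 * len(qs))]   # 5% fractile of the capacity proxy
```

Because a high c′ tends to come with a low φ′ under negative correlation, the spread of the capacity is narrower than in the uncorrelated case, which is why ignoring the correlation makes the factoring methods more or less conservative.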

9.
A slope reliability analysis method for limited-data conditions is proposed based on the Bootstrap resampling technique. Conventional slope reliability analysis methods are briefly reviewed, and the Bootstrap method is used to simulate the statistical uncertainty in the probability distribution functions of the shear strength parameters. Taking an infinite slope as an example, the influence of uncertainty in both the distribution parameters and the distribution type of the shear strength parameters on slope reliability is studied. The results show that sample means, sample standard deviations and AIC values estimated from limited data exhibit considerable variability, which in turn produces pronounced statistical uncertainty in the probability distribution functions of the shear strength parameters. When this statistical uncertainty is taken into account, the slope reliability index should be reported as a confidence interval at a given confidence level rather than as the fixed value of conventional reliability analysis. The range of the confidence interval on the reliability index widens as the factor of safety increases, and reliability indices computed considering uncertainty in both the distribution parameters and the distribution type show greater variability and wider confidence intervals. The Bootstrap method thus provides an effective route for simulating the statistical uncertainty of shear strength parameter distributions and for evaluating slope reliability under limited-data conditions.
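The Bootstrap step can be sketched directly: resample a small shear-strength data set with replacement and examine the scatter of the resampled statistics. The friction-angle values are illustrative assumptions, not the paper's data.

```python
import random
import statistics

# Assumed small data set of friction angles (degrees) from limited site tests.
phi_data = [28.1, 30.4, 29.2, 31.5, 27.8, 30.0, 29.7, 32.1]

def bootstrap_stats(data, n_boot, seed=0):
    """Return n_boot bootstrap-resampled means and standard deviations."""
    rng = random.Random(seed)
    means, sds = [], []
    for _ in range(n_boot):
        resample = [rng.choice(data) for _ in data]
        means.append(statistics.mean(resample))
        sds.append(statistics.stdev(resample))
    return means, sds

means, sds = bootstrap_stats(phi_data, n_boot=2000)

# Percentile bootstrap 95% confidence interval on the mean friction angle.
means.sort()
ci_low = means[int(0.025 * len(means))]
ci_high = means[int(0.975 * len(means))]
</gr>```

The scatter of `means` and `sds` is exactly the statistical uncertainty the abstract refers to; feeding each resampled distribution into a reliability analysis yields an interval of reliability indices rather than a single value.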

10.
Hu Guohua, Xia Jun. Journal of Glaciology and Geocryology, 2002, 24(4): 433-437
Based on probability theory and grey system theory, and using the basic concepts of grey probability, grey probability distribution, grey expectation and grey variance, a grey-probability quantification method for the non-abrupt environmental risk degree is established to address both the stochastic and the grey uncertainty of environmental systems. Non-abrupt environmental risk is attributed to these two kinds of uncertainty: the distributions of the variables governing environmental capacity and environmental load consumption are treated as grey probability distributions, and an environmental risk degree in grey-probability form is used to quantify the non-abrupt failure risk of the environmental system. This grey-probability risk degree is then converted into an ordinary system failure risk rate, which is computed with an improved first-order second-moment method. As a worked example, the method is applied to estimating the organic pollution risk degree of the Cangxi reach of the Jialing River.

11.
Leakage of CO2 and displaced brine from geologic carbon sequestration (GCS) sites into potable groundwater or to the near-surface environment is a primary concern for safety and effectiveness of GCS. The focus of this study is on the estimation of the probability of CO2 leakage along conduits such as faults and fractures. This probability is controlled by (1) the probability that the CO2 plume encounters a conductive fault that could serve as a conduit for CO2 to leak through the sealing formation, and (2) the probability that the conductive fault(s) intersected by the CO2 plume are connected to other conductive faults in such a way that a connected flow path is formed to allow CO2 to leak to environmental resources that may be impacted by leakage. This work is designed to fit into the certification framework for geological CO2 storage, which represents vulnerable resources such as potable groundwater, health and safety, and the near-surface environment as discrete “compartments.” The method we propose for calculating the probability of the network of conduits intersecting the CO2 plume and one or more compartments includes four steps: (1) assuming that a random network of conduits follows a power-law distribution, a critical conduit density is calculated based on percolation theory; for densities sufficiently smaller than this critical density, the leakage probability is zero; (2) for systems with a conduit density around or above the critical density, we perform a Monte Carlo simulation, generating realizations of conduit networks to determine the leakage probability of the CO2 plume (P_leak) for different conduit length distributions, densities and CO2 plume sizes; (3) from the results of Step 2, we construct fuzzy rules to relate P_leak to system characteristics such as system size, CO2 plume size, and parameters describing conduit length distribution and uncertainty; (4) finally, we determine the CO2 leakage probability for a given system using the fuzzy rules.
The method can be extended to brine leakage risk by using the size of the pressure perturbation above some cut-off value as the effective plume size. The proposed method provides a quick way of estimating the probability of CO2 or brine leaking into a compartment for evaluation of GCS leakage risk. In addition, it incorporates the uncertainty in the system parameters and provides the uncertainty range of the estimated probability.

12.
Accurate water quality analysis, scientific study of the F− distribution in groundwater, and geochemical knowledge combined with spatial information on geology and climate are necessary to understand the source/cause, type and level of F− contamination. The Dindigul district is a hard-rock terrain and is marked as one of the fluoride-rich areas in Tamilnadu owing to the occurrence of various rock types that include fluoride-bearing minerals. The F− content of groundwater can thus originate from the dissolution of fluoride-bearing minerals in the bedrock. Eighty-six representative groundwater samples from Dindigul district were collected during two different seasons and analysed for F− and the other major cations and anions. The study area is chiefly composed of hornblende-biotite gneiss and charnockite; in addition, untreated tannery effluents are discharged at many places in the study area. A Geographical Information System technique was adopted to study the sources of F−, and the F− in the study area was found to be mainly attributable to a geogenic source.

13.
ABSTRACT

Turning-bands simulation is a valuable and highly useful tool for solving various geological-mining, environmental and geological-engineering problems in which it is essential to determine the uncertainty of the estimates of simulated values Zs (realizations) and assess the risk. This paper presents an investigative methodology and the results of calculations connected with the use of conditional turning-bands simulation and bundled indicator kriging, making it possible to analyse risk at different levels of uncertainty in the optimization of exploitation problems encountered in the mining of the polymetallic copper ore deposits in the Lubin-Sieroszowice region (Foresudetic monocline, SW Poland). Examples are provided of the evaluation of simulated values Zs and of the probability P of average values Z* of the deposit parameters within a block (block R-3) located in the Rudna Mine area.

14.
15.
The uncertainty in the spatial distributions of consolidation settlement (s_c) and consolidation time (t_p) for Songdo New City is evaluated using a probabilistic procedure. Ordinary kriging and three theoretical semivariogram models are used to estimate the spatial distributions of the geo-layers that affect s_c and t_p in this study. The spatial maps of the mean (μ) and standard deviation (σ) of s_c and t_p are determined using a first-order second-moment method based on the evaluated statistics and probability density functions (PDFs) of the soil properties. It is shown that the coefficients of variation (COVs) of the compression ratio [C_c/(1 + e_0)] and the coefficient of consolidation (c_v) are the most influential factors on the uncertainties of s_c and t_p, respectively. The μ and σ of s_c and t_p, as well as the probability that s_c exceeds 100 cm [P(s_c > 100 cm)] and the probability that t_p exceeds 36 months [P(t_p > 36 months)], are observed to be larger in Sect. 1 than in the other sections because the thickness of the consolidating layer in Sect. 1 is the largest in the entire study area. The area requiring additional fill after consolidation increases as the COV of C_c/(1 + e_0) increases and as the probabilistic design criterion (α) decreases. It is also shown that the area requiring prefabricated vertical drain installation increases as the COV of c_v increases and as α decreases. The design procedure presented in this paper could be used in the decision-making process for the design of geotechnical structures in coastal reclamation areas.
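The first-order second-moment (FOSM) step can be sketched for a simplified settlement model s_c = H · CR · log10((σ0 + Δσ)/σ0) with CR = C_c/(1 + e_0) treated as the single random input. All numbers are illustrative assumptions, not the Songdo data, and the real analysis carries several random inputs.

```python
import math

H, sigma0, dsigma = 10.0, 100.0, 80.0   # layer thickness (m) and stresses (kPa), assumed
mu_CR, cov_CR = 0.20, 0.25              # mean and CoV of Cc/(1+e0), assumed

def settle(CR):
    """Simplified primary consolidation settlement (m)."""
    return H * CR * math.log10((sigma0 + dsigma) / sigma0)

# FOSM: first-order Taylor expansion about the mean input.
mu_s = settle(mu_CR)
dsdCR = H * math.log10((sigma0 + dsigma) / sigma0)  # d(settle)/d(CR), exact here
sd_s = abs(dsdCR) * (cov_CR * mu_CR)                # sigma_s = |ds/dCR| * sigma_CR
```

Because this settlement model is linear in CR, FOSM is exact and the CoV of s_c equals the CoV of CR; with several correlated inputs the method sums the squared sensitivity-weighted variances instead.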

16.
Approximate local confidence intervals are constructed from uncertainty models in the form of the conditional distribution of the random variable Z given values of the variables {Z_i, i = 1, ..., n}. When the support of the variable Z is any support other than that of the data, the conditional distributions require a change-of-support correction. This paper investigates the effect of change of support on the approximate local confidence intervals constructed by cumulative indicator kriging, class indicator kriging, and probability kriging under a variety of conditions. The conditions are generated by three simulated deposits with grade distributions of successively higher degrees of skewness; a point support and two different block supports are considered. The paper also compares the confidence intervals obtained from these methods using the most commonly used measures of confidence interval effectiveness.

17.
Significant uncertainties are associated with the definition of both the exploration targeting criteria and the computational algorithms used to generate mineral prospectivity maps. In prospectivity modeling, the input and computational uncertainties are generally made implicit by making a series of best-guess or best-fit decisions on the basis of incomplete and imprecise information. The individual uncertainties are then compounded and propagated into the final prospectivity map as an implicit combined uncertainty that is impossible to analyze directly and use for decision making. This paper proposes a new approach to explicitly define the uncertainties of individual targeting criteria and propagate them through a computational algorithm to evaluate the combined uncertainty of a prospectivity map. Applied to fuzzy logic prospectivity models, this approach involves replacing point estimates of fuzzy membership values by statistical distributions deemed representative of the likely variability of the corresponding fuzzy membership values. Uncertainty is then propagated through a fuzzy logic inference system by applying Monte Carlo simulations. The final prospectivity map is represented by a grid of statistical distributions of fuzzy prospectivity. Such modeling of uncertainty in prospectivity analyses allows better definition of exploration target quality, as the understanding of uncertainty is consistently captured, propagated and visualized in a transparent manner. The explicit uncertainty information of prospectivity maps can support further risk analysis and decision making. The proposed probabilistic fuzzy logic approach can be used in any area of geosciences to model the uncertainty of complex fuzzy systems.
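The Monte Carlo propagation step can be sketched by drawing each fuzzy membership value from a distribution and combining the draws with a fuzzy AND (minimum) operator. The triangular distributions and two-criterion setup are illustrative assumptions standing in for the paper's expert-defined distributions and full inference system.

```python
import random

rng = random.Random(1)
n = 5000
prospectivity = []
for _ in range(n):
    # Uncertain membership values for two targeting criteria (assumed ranges):
    # random.triangular(low, high, mode)
    mu_geochem = rng.triangular(0.5, 0.9, 0.7)
    mu_struct = rng.triangular(0.3, 0.8, 0.6)
    # Fuzzy AND via the minimum operator, one realization per iteration.
    prospectivity.append(min(mu_geochem, mu_struct))

# The result for this grid cell is a distribution, not a single score.
prospectivity.sort()
p10 = prospectivity[int(0.1 * n)]
p90 = prospectivity[int(0.9 * n)]
```

Repeated per grid cell, the (p10, p90) spread is exactly the per-cell uncertainty envelope that the abstract proposes to map alongside the prospectivity itself.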

18.
The statistical technique of discriminant analysis, associated with the calculation of an information coefficient, has been applied to the concentrations of 37 chemical elements to calculate the mixing of stream sediments of different origin in the Mignone river basin.
Discriminant analysis has been based on sample catchment basins (SCBs), defined as the part of the drainage basin between two consecutive sampling points along the same stream branch, and on the identification of 4 different litho-geochemical groups. This approach has been used to define the membership probability values for every sample by applying Bayes' rule and calculating posterior probabilities. The degree of uncertainty of each group assignment has been evaluated using an information coefficient, based on a classification entropy index, with a procedure analogous to that used for processing membership values in fuzzy analysis. The maximum theoretical concentration that can be expected in soils near the sampling point (enhanced concentration) has then been calculated from both the measured and the membership values by introducing a specific enhancement function. Theoretical background concentrations at every sampling point have also been calculated by weighting the average concentration in each group with the membership values for each sample. These have been subsequently compared with the measured and enhanced concentrations to identify anomalous areas.
The distribution maps of arsenic and vanadium in the Mignone river basin (central Italy) have been drawn according to this technique, leading to the identification of areas of potential risk for human health.
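The classification step described here, Bayes' rule for group membership plus an entropy-based information coefficient, can be sketched as follows. The prior and likelihood values are illustrative assumptions, not the Mignone-basin discriminant scores.

```python
import math

def posterior(priors, likelihoods):
    """Posterior group-membership probabilities by Bayes' rule."""
    unnorm = [p * l for p, l in zip(priors, likelihoods)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

def entropy_index(post):
    """Normalised classification entropy: 0 = certain assignment,
    1 = maximally ambiguous over the groups."""
    h = -sum(p * math.log(p) for p in post if p > 0)
    return h / math.log(len(post))

priors = [0.25, 0.25, 0.25, 0.25]   # four litho-geochemical groups (assumed equal)
likes = [0.02, 0.30, 0.05, 0.01]    # assumed per-group likelihoods for one sample
post = posterior(priors, likes)
uncertainty = entropy_index(post)
```

A sample dominated by one group gets a low `uncertainty`, while one that mixes sediments from several SCB groups gets a value near 1, which is how the information coefficient flags ambiguous assignments.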

19.
This paper presents a methodology for assessing local probability distributions by disjunctive kriging when the available data set contains some imprecise measurements, such as noisy or soft information or interval constraints. The basic idea consists in replacing the set of imprecise data by a set of pseudo-hard data simulated from their posterior distribution; an iterative algorithm based on the Gibbs sampler is proposed to achieve this simulation step. The whole procedure is repeated many times and the final result is the average of the disjunctive kriging estimates computed from each simulated data set. Being data-independent, the kriging weights need to be calculated only once, which enables fast computing. The simulation procedure requires encoding each datum as a pre-posterior distribution and assuming a Markov property to allow the updating of pre-posterior distributions into posterior ones. Although it suffers from some imperfections, disjunctive kriging turns out to be a much more flexible approach than conditional expectation, because of the vast class of models that allow its computation, namely isofactorial models.

20.
Uncertainty evaluation of the silicomolybdenum-yellow spectrophotometric determination of metasilicic acid in groundwater
Using a sequential uncertainty-propagation model, the uncertainty of the silicomolybdenum-yellow spectrophotometric determination of metasilicic acid in groundwater (DZ/T 0064.62-93) is evaluated. The uncertainty of the measurement result arises mainly from three components: the standard solution, the fitting of the calibration curve, and the measurement process; the uncertainties in the molar masses of silica and metasilicic acid are small enough to be neglected. The calibration curve is fitted by errors-in-both-variables regression. After each uncertainty component is quantified, the combined standard uncertainty of the result is obtained, and multiplying it by the coverage factor of 2 at the 95% confidence level gives the expanded uncertainty.
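The final budget step, combining the three standard-uncertainty components and expanding with a coverage factor of 2, can be sketched directly. The component values are illustrative assumptions, not the DZ/T 0064.62-93 measurement data.

```python
import math

# Assumed standard-uncertainty components (mg/L), one per source named above.
u_standard_solution = 0.8   # from preparing the standard solution
u_curve_fit = 0.5           # from fitting the calibration curve
u_measurement = 0.6         # from the measurement process

# Combined standard uncertainty: root-sum-of-squares of independent components.
u_combined = math.sqrt(u_standard_solution**2 + u_curve_fit**2 + u_measurement**2)

# Expanded uncertainty with coverage factor k = 2 (approx. 95% confidence).
U_expanded = 2.0 * u_combined
```

A reported result would then read "x ± U_expanded mg/L (k = 2)", with the negligible molar-mass components simply left out of the quadrature sum.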
