Similar literature
20 similar documents retrieved (search time: 31 ms)
1.
Probabilistic seismic risk assessment for spatially distributed lifelines is less straightforward than for individual structures. While procedures such as the ‘PEER framework’ have been developed for risk assessment of individual structures, these are not easily applicable to distributed lifeline systems, due to difficulties in describing ground‐motion intensity (e.g. spectral acceleration) over a region (in contrast to ground‐motion intensity at a single site, which is easily quantified using Probabilistic Seismic Hazard Analysis), and since the link between the ground‐motion intensities and lifeline performance is usually not available in closed form. As a result, Monte Carlo simulation (MCS) and its variants are well suited for characterizing ground motions and computing resulting losses to lifelines. This paper proposes a simulation‐based framework for developing a small but stochastically representative catalog of earthquake ground‐motion intensity maps that can be used for lifeline risk assessment. In this framework, Importance Sampling is used to preferentially sample ‘important’ ground‐motion intensity maps, and K‐Means Clustering is used to identify and combine redundant maps in order to obtain a small catalog. The effects of sampling and clustering are accounted for through a weighting on each remaining map, so that the resulting catalog is still a probabilistically correct representation. The feasibility of the proposed simulation framework is illustrated by using it to assess the seismic risk of a simplified model of the San Francisco Bay Area transportation network. A catalog of just 150 intensity maps is generated to represent hazard at 1038 sites from 10 regional fault segments causing earthquakes with magnitudes between five and eight. The risk estimates obtained using these maps are consistent with those obtained using conventional MCS utilizing many orders of magnitude more ground‐motion intensity maps. Therefore, the proposed technique can be used to drastically reduce the computational expense of a simulation‐based risk assessment, without compromising the accuracy of the risk estimates. This will facilitate computationally intensive risk analysis of systems such as transportation networks. Finally, the study shows that the uncertainties in the ground‐motion intensities and the spatial correlations between ground‐motion intensities at various sites must be modeled in order to obtain unbiased estimates of lifeline risk. Copyright © 2010 John Wiley & Sons, Ltd.  相似文献
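As a rough illustration of the sampling-and-reduction idea described above, the sketch below draws ground-motion intensity maps by importance sampling on magnitude, collapses near-duplicate maps with K-means, and carries the importance weights into the reduced catalog so that exceedance probabilities are approximately preserved. The toy hazard model, the uniform magnitude proposal, the exponential spatial correlation, and every numerical value are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch: importance sampling of ground-motion intensity maps,
# then K-means reduction to a small weighted catalog.
# Toy hazard model and all parameters are illustrative assumptions.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_sites, n_maps, n_catalog = 50, 2000, 30

# Target magnitude distribution: truncated Gutenberg-Richter, b = 1, M in [5, 8]
b, m_min, m_max = 1.0, 5.0, 8.0
beta = b * np.log(10)

def gr_pdf(m):
    c = beta / (1 - np.exp(-beta * (m_max - m_min)))
    return c * np.exp(-beta * (m - m_min))

# Importance-sampling proposal: uniform in [5, 8] to over-sample large magnitudes
m = rng.uniform(m_min, m_max, n_maps)
weights = gr_pdf(m) / (1.0 / (m_max - m_min))          # target / proposal
weights /= weights.sum()

# Toy ground-motion model: median lnSA grows with magnitude, plus
# spatially correlated residuals (shared inter-event + site-to-site noise)
site_x = np.linspace(0, 100, n_sites)                   # km along a line
dist = np.abs(site_x[:, None] - site_x[None, :])
corr = np.exp(-dist / 20.0)                             # exponential correlation
L = np.linalg.cholesky(corr + 1e-10 * np.eye(n_sites))
ln_sa = (-1.0 + 0.6 * (m[:, None] - m_min)              # median term
         + 0.3 * rng.standard_normal(n_maps)[:, None]   # inter-event residual
         + 0.5 * (rng.standard_normal((n_maps, n_sites)) @ L.T))  # intra-event

# K-means on the maps; each cluster keeps its centroid map and the summed weight
km = KMeans(n_clusters=n_catalog, n_init=10, random_state=0).fit(ln_sa)
catalog_maps = km.cluster_centers_
catalog_w = np.array([weights[km.labels_ == k].sum() for k in range(n_catalog)])

# Check: exceedance probability of a threshold at site 0, full set vs reduced catalog
thr = 0.5
print("full :", np.sum(weights * (ln_sa[:, 0] > thr)))
print("small:", np.sum(catalog_w * (catalog_maps[:, 0] > thr)))
```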

2.
The distribution of earthquake magnitudes plays a crucial role in the estimation of seismic hazard parameters. Due to the complexity of earthquake magnitude distribution, non-parametric approaches are recommended over classical parametric methods. The main deficiency of the non-parametric approach is the lack of complete magnitude data in almost all cases. This study aims to introduce an imputation procedure for completing earthquake catalog data that will allow the catalog to be used for non-parametric density estimation. Using a Monte Carlo simulation, the efficiency of the introduced approach is investigated. This study indicates that when a magnitude catalog is incomplete, the imputation procedure can provide an appropriate tool for seismic hazard assessment. As an illustration, the imputation procedure was applied to estimate the earthquake magnitude distribution in Tehran, the capital city of Iran.  相似文献
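A minimal sketch of the general idea (not the authors' procedure): magnitudes missing from the incomplete part of a catalog are imputed by resampling from a truncated-exponential (Gutenberg-Richter) model fitted to the complete part, and the completed catalog is then passed to a kernel density estimator. The detection model, the completeness threshold of M 5, and all other numbers are assumptions.

```python
# Minimal sketch: impute magnitudes missing from an incomplete catalog,
# then estimate the magnitude density non-parametrically (Gaussian KDE).
# The truncated-exponential imputation model and all numbers are assumptions.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)

# "True" catalog: truncated Gutenberg-Richter magnitudes (b = 1) in [4, 8]
beta, m_min, m_max, n_true = np.log(10), 4.0, 8.0, 5000
u = rng.uniform(size=n_true)
m_true = m_min - np.log(1 - u * (1 - np.exp(-beta * (m_max - m_min)))) / beta

# Observed catalog: small events are under-reported (detection probability
# rises from 0.3 at M4 to 1.0 at M5), so the catalog is incomplete below M5
p_detect = np.clip(0.3 + 0.7 * (m_true - m_min), 0.3, 1.0)
observed = m_true[rng.uniform(size=n_true) < p_detect]
n_missing = n_true - observed.size

# Imputation: draw the missing magnitudes from a truncated exponential fitted
# to the complete part of the catalog (M >= 5), restricted to [4, 5)
m_c = 5.0
beta_hat = 1.0 / (observed[observed >= m_c].mean() - m_c)   # approximate MLE of the GR slope
u = rng.uniform(size=n_missing)
m_imp = m_min - np.log(1 - u * (1 - np.exp(-beta_hat * (m_c - m_min)))) / beta_hat
completed = np.concatenate([observed, m_imp])

# Non-parametric density estimates before and after imputation
print("KDE at M=4.5, incomplete:", gaussian_kde(observed)(np.array([4.5]))[0])
print("KDE at M=4.5, completed :", gaussian_kde(completed)(np.array([4.5]))[0])
print("true exponential density:", beta * np.exp(-beta * 0.5)
      / (1 - np.exp(-beta * (m_max - m_min))))
```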

3.
Two key issues distinguish probabilistic seismic risk analysis of a lifeline or portfolio of structures from that of a single structure. Regional analysis must consider the correlation among lifeline components or structures in the portfolio, and the larger scope makes it much more computationally demanding. In this paper, we systematically identify and compare alternative methods for regional hazard analysis that can be used as the first part of a computationally efficient regional probabilistic seismic risk analysis that properly considers spatial correlation. Specifically, each method results in a set of probabilistic ground motion maps with associated hazard‐consistent annual occurrence probabilities that together represent the regional hazard. The methods are compared according to how replicable and computationally tractable they are and the extent to which the resulting maps are physically realistic, consistent with the regional hazard and regional spatial correlation, and few in number. On the basis of a conceptual comparison and an empirical comparison for Los Angeles, we recommend a combination of simulation and optimization approaches: (i) Monte Carlo simulation with importance sampling of the earthquake magnitudes to generate a set of probabilistic earthquake scenarios (defined by source and magnitude); (ii) the optimization‐based probabilistic scenario method, a mixed‐integer linear program, to reduce the size of that set; (iii) Monte Carlo simulation to generate a set of probabilistic ground motion maps, varying the number of maps sampled from each earthquake scenario so as to minimize the sampling variance; and (iv) the optimization‐based probabilistic scenario again to reduce the set of probabilistic ground motion maps. Copyright © 2012 John Wiley & Sons, Ltd.  相似文献   

4.
The rapid development of modern cities has made them increasingly dependent on lifeline engineering systems. The post-earthquake performance of lifeline systems directly determines the recovery of daily life and production as well as the progress of emergency and rescue work, so reliability analysis of lifeline systems under earthquake action is of great importance. This paper introduces two algorithms for computing the seismic reliability of large networks: the minimal-path recursive decomposition algorithm and the minimal-cut recursive decomposition algorithm. On this basis, the two algorithms are applied to the gas supply system of Shenyang. The results show that, when selected and used appropriately, these two algorithms can efficiently evaluate the reliability of large lifeline engineering systems under different seismic intensities.  相似文献

5.
During flood events, breaching of flood defences along a river system can have a significant reducing effect on downstream water levels and flood risks. This paper presents a Monte Carlo based flood risk framework for policy decision making, which takes this retention effect into account. The framework is developed to estimate societal flood risk in terms of potential numbers of fatalities and associated probabilities. It is tested on the Rhine–Meuse delta system in the Netherlands, where floods can be caused by high flows in the Rhine and Meuse rivers and/or high sea water levels in the North Sea. Importance sampling is applied in the Monte Carlo procedure to increase the computational efficiency of the flood risk computations. This paper focuses on the development and testing of efficient importance sampling strategies for the framework. The development of an efficient importance sampling strategy for river deltas is more challenging than for non-tidal rivers where only discharges are relevant, because the relative influence of river discharge and sea water level on flood levels differs from location to location. As a consequence, sampling methods that are efficient and accurate for one location may be inefficient for other locations or, worse, may introduce errors in computed design water levels. Nevertheless, in the case study described in this paper the required simulation time was reduced by a factor of 100 after the introduction of an efficient importance sampling method in the Monte Carlo framework, while at the same time the accuracy of the Monte Carlo estimates was improved.  相似文献

6.
In risk analysis, a complete characterization of the concentration distribution is necessary to determine the probability of exceeding a threshold value. The most popular method for predicting concentration distribution is Monte Carlo simulation, which samples the cumulative distribution function with a large number of repeated operations. In this paper, we first review the three most commonly used Monte Carlo (MC) techniques: the standard Monte Carlo, Latin Hypercube sampling, and Quasi Monte Carlo. The performance of these three MC approaches is investigated. We then apply the stochastic collocation method (SCM) to risk assessment. Unlike the MC simulations, the SCM does not require a large number of simulations of flow and solute equations. In particular, the sparse grid collocation method and probabilistic collocation method are employed to represent the concentration in terms of polynomials and unknown coefficients. The sparse grid collocation method takes advantage of Lagrange interpolation polynomials while the probabilistic collocation method relies on polynomial chaos expansions. In both methods, the stochastic equations are reduced to a system of decoupled equations, which can be solved with existing solvers and whose results are used to obtain the expansion coefficients. Then the cumulative distribution function is obtained by sampling the approximate polynomials. Our synthetic examples show that among the MC methods, the Quasi Monte Carlo gives the smallest variance for the predicted threshold probability due to its superior convergence property and that the stochastic collocation method is an accurate and efficient alternative to MC simulations.  相似文献
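The difference between the three sampling schemes reviewed above can be seen by estimating the same threshold-exceedance probability repeatedly with each of them and comparing the scatter of the estimates. A minimal sketch follows, in which the lognormal "concentration" model and the threshold are assumptions made only for illustration.

```python
# Minimal sketch: estimate P(C > threshold) for a toy lognormal "concentration"
# with standard Monte Carlo, Latin Hypercube sampling, and quasi-Monte Carlo
# (scrambled Sobol), and compare the spread of the estimates over repeated trials.
import numpy as np
from scipy.stats import norm, qmc

rng = np.random.default_rng(2)
n, trials, thr = 256, 200, 3.0

def concentration(z):
    # toy model: concentration is lognormal in two standard-normal inputs
    return np.exp(0.8 * z[:, 0] + 0.6 * z[:, 1])

def estimate(z):
    return np.mean(concentration(z) > thr)

est = {"MC": [], "LHS": [], "QMC": []}
for t in range(trials):
    est["MC"].append(estimate(rng.standard_normal((n, 2))))
    u = qmc.LatinHypercube(d=2, seed=t).random(n)
    est["LHS"].append(estimate(norm.ppf(u)))
    u = qmc.Sobol(d=2, scramble=True, seed=t).random(n)
    est["QMC"].append(estimate(norm.ppf(u)))

for k, v in est.items():
    print(f"{k:>3}: mean={np.mean(v):.4f}  std={np.std(v):.5f}")
```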

7.
A new probabilistic analytical approach to evaluate the seismic system reliability of large lifeline systems is presented in this paper. The algorithm takes the shortest path from the source to the terminal of a node-weight or edge-weight network as the decomposition policy; using the Boolean laws of set operations and probabilistic operation principles, a recursive decomposition process can then be constructed. For a general weight network, the modified Torrieri method (NTR/T method) is introduced and combined with the suggested algorithm. Therefore, the recursive decomposition algorithm may be applied to evaluate the seismic reliability of general lifeline systems. A series of case studies, including a practical district electric power network system and a large urban water supply system, show that the suggested algorithm provides a useful probabilistic analysis tool for the seismic reliability evaluation of large lifeline systems. Copyright © 2002 John Wiley & Sons, Ltd.  相似文献
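The core of a path-based recursive decomposition can be sketched in a few lines: condition on the edges of one source-terminal path so that the complementary events are mutually exclusive, and recurse on the correspondingly reduced graphs. The sketch below is a generic two-terminal version checked against exhaustive enumeration on a toy bridge network; it does not include the NTR/T extension or the efficiency measures described in the paper.

```python
# Minimal sketch of the recursive decomposition idea for two-terminal network
# reliability: condition on the edges of one source-terminal path so that the
# complementary events are disjoint, and recurse on the reduced graphs.
# Generic illustration only; edge probabilities below are assumptions.
import itertools
import networkx as nx

def two_terminal_reliability(G, s, t):
    """P(s connected to t); edges fail independently; edge attr 'p' = P(edge works)."""
    if not nx.has_path(G, s, t):
        return 0.0
    path = nx.shortest_path(G, s, t)
    edges = list(zip(path[:-1], path[1:]))
    rel = 1.0
    for u, v in edges:                       # event: the whole path works
        rel *= G[u][v]["p"]
    prefix = 1.0
    for i, (u, v) in enumerate(edges):       # event: e1..e(i-1) work, e_i fails
        Gi = G.copy()
        for a, b in edges[:i]:
            Gi[a][b]["p"] = 1.0              # conditioned to work
        Gi.remove_edge(u, v)                 # conditioned to fail
        rel += prefix * (1.0 - G[u][v]["p"]) * two_terminal_reliability(Gi, s, t)
        prefix *= G[u][v]["p"]
    return rel

def brute_force(G, s, t):
    """Exhaustive enumeration of all edge states, for checking the result."""
    edges = list(G.edges())
    total = 0.0
    for states in itertools.product([0, 1], repeat=len(edges)):
        p = 1.0
        H = nx.Graph()
        H.add_nodes_from(G.nodes())
        for (u, v), up in zip(edges, states):
            p *= G[u][v]["p"] if up else 1.0 - G[u][v]["p"]
            if up:
                H.add_edge(u, v)
        total += p * nx.has_path(H, s, t)
    return total

# Five-edge "bridge" network with 90%-reliable links
G = nx.Graph()
for u, v in [("s", "a"), ("s", "b"), ("a", "t"), ("b", "t"), ("a", "b")]:
    G.add_edge(u, v, p=0.9)
print("recursive decomposition:", two_terminal_reliability(G, "s", "t"))
print("exhaustive enumeration :", brute_force(G, "s", "t"))
```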

8.
Recent earthquake events evidenced that damage of structural components in a lifeline network may cause prolonged disruption of lifeline services, which eventually results in significant socio‐economic losses in the affected area. Despite recent advances in network reliability analysis, the complexity of the problem and various uncertainties still make it a challenging task to evaluate the post‐hazard performance and connectivity of lifeline networks efficiently and accurately. In order to overcome such challenges and take advantage of merits of multi‐scale analysis, this paper develops a multi‐scale system reliability analysis method by integrating a network decomposition approach with the matrix‐based system reliability (MSR) method. In addition to facilitating system reliability analysis of large‐size networks, the multi‐scale approach enables optimizing the level of computational effort on subsystems; identifying the relative importance of components and subsystems at multiple scales; and providing a collaborative risk management framework. The MSR method is uniformly applied for system reliability analyses at both the lower‐scale (for link failure) and the higher‐scale (for system connectivity) to obtain the probability of general system events, various conditional probabilities, component importance measures, statistical correlation between subsystem failures and parameter sensitivities. The proposed multi‐scale analysis method is demonstrated by its application to a gas distribution network in Shelby County of Tennessee. A parametric study is performed to determine the number of segments during the lower‐scale MSR analysis of each pipeline based on the strength of the spatial correlation of seismic intensity. It is shown that the spatial correlation should be considered at both scales for accurate reliability evaluation. The proposed multi‐scale analysis approach provides an effective framework of risk assessment and decision support for lifeline networks under earthquake hazards. Copyright © 2009 John Wiley & Sons, Ltd.  相似文献   

9.
Incremental dynamic analysis (IDA) is presented as a powerful tool to evaluate the variability in the seismic demand and capacity of non‐deterministic structural models, building upon existing methodologies of Monte Carlo simulation and approximate moment‐estimation. A nine‐story steel moment‐resisting frame is used as a testbed, employing parameterized moment‐rotation relationships with non‐deterministic quadrilinear backbones for the beam plastic‐hinges. The uncertain properties of the backbones include the yield moment, the post‐yield hardening ratio, the end‐of‐hardening rotation, the slope of the descending branch, the residual moment capacity and the ultimate rotation reached. IDA is employed to accurately assess the seismic performance of the model for any combination of the parameters by performing multiple nonlinear time‐history analyses for a suite of ground motion records. Sensitivity analyses on both the IDA and the static pushover level reveal the yield moment and the two rotational‐ductility parameters to be the most influential for the frame behavior. To propagate the parametric uncertainty to the actual seismic performance we employ (a) Monte Carlo simulation with Latin hypercube sampling, (b) point‐estimate and (c) first‐order second‐moment techniques, thus offering competing methods that represent different compromises between speed and accuracy. The final results provide firm ground for challenging current assumptions in seismic guidelines on using a median‐parameter model to estimate the median seismic performance and employing the well‐known square‐root‐sum‐of‐squares rule to combine aleatory randomness and epistemic uncertainty. Copyright © 2009 John Wiley & Sons, Ltd.  相似文献
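As a simple illustration of how the uncertainty-propagation options compare, the sketch below applies the first-order second-moment (FOSM) approximation and Monte Carlo with Latin hypercube sampling to a toy capacity function of three uncertain hinge parameters; the functional form and all parameter statistics are assumptions, not the frame model of the paper.

```python
# Minimal sketch: first-order second-moment (FOSM) uncertainty propagation
# versus Monte Carlo with Latin hypercube sampling, for a toy "collapse
# capacity" function of three uncertain hinge parameters. The functional form
# and all parameter statistics are illustrative assumptions.
import numpy as np
from scipy.stats import norm, qmc

# Toy response: collapse capacity as a function of yield moment ratio,
# hardening ratio and ultimate rotation (normalized, independent normals)
def capacity(my, ah, thu):
    return 1.8 * my**0.6 * (1.0 + 2.0 * ah) * thu**0.4

mu = np.array([1.0, 0.03, 1.0])          # means
sigma = np.array([0.15, 0.01, 0.15])     # standard deviations

# --- FOSM: first-order Taylor expansion around the mean ---
g0 = capacity(*mu)
grad = np.empty(3)
h = 1e-5
for i in range(3):
    d = np.zeros(3)
    d[i] = h
    grad[i] = (capacity(*(mu + d)) - capacity(*(mu - d))) / (2 * h)
var_fosm = np.sum((grad * sigma) ** 2)
print(f"FOSM : mean={g0:.3f}  std={np.sqrt(var_fosm):.3f}")

# --- Monte Carlo with Latin hypercube sampling ---
n = 10_000
u = qmc.LatinHypercube(d=3, seed=0).random(n)
x = mu + sigma * norm.ppf(u)             # independent normal parameters
samples = capacity(x[:, 0], x[:, 1], x[:, 2])
print(f"LHS  : mean={samples.mean():.3f}  std={samples.std():.3f}")
```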

10.
The extrapolation in time of an earthquake sequence considered as a multidimensional stochastic point process is discussed. Estimates of seismic risk for both long- and short-term predictions are considered and an algorithm for the calculations is proposed. Several examples of short-term extrapolations are carried out by means of Monte Carlo simulations of the process. An assessment of the predictability of the seismic process shows that the catalog of strong earthquakes (M ≥ 7.0) contains about 0.4 bits of information per earthquake for the particular model of the process applied here.  相似文献

11.
In the last few decades hydrologists have made tremendous progress in using dynamic simulation models for the analysis and understanding of hydrologic systems. However, predictions with these models are often deterministic and as such they focus on the most probable forecast, without an explicit estimate of the associated uncertainty. This uncertainty arises from incomplete process representation, uncertainty in initial conditions, input, output and parameter error. The generalized likelihood uncertainty estimation (GLUE) framework was one of the first attempts to represent prediction uncertainty within the context of Monte Carlo (MC) analysis coupled with Bayesian estimation and propagation of uncertainty. Because of its flexibility, ease of implementation and its suitability for parallel implementation on distributed computer systems, the GLUE method has been used in a wide variety of applications. However, the MC based sampling strategy of the prior parameter space typically utilized in GLUE is not particularly efficient in finding behavioral simulations. This becomes especially problematic for high-dimensional parameter estimation problems, and in the case of complex simulation models that require significant computational time to run and produce the desired output. In this paper we improve the computational efficiency of GLUE by sampling the prior parameter space using an adaptive Markov Chain Monte Carlo scheme (the Shuffled Complex Evolution Metropolis (SCEM-UA) algorithm). Moreover, we propose an alternative strategy to determine the value of the cutoff threshold based on the appropriate coverage of the resulting uncertainty bounds. We demonstrate the superiority of this revised GLUE method with three different conceptual watershed models of increasing complexity, using both synthetic and real-world streamflow data from two catchments with different hydrologic regimes.  相似文献   
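To illustrate why MCMC sampling of the prior parameter space finds behavioral simulations more efficiently than plain uniform sampling, the sketch below compares uniform Monte Carlo with a basic random-walk Metropolis sampler on a toy recession model. The model, the likelihood, and the behavioral cutoff are assumptions, and the plain Metropolis sampler stands in for (and is much simpler than) the SCEM-UA algorithm used in the paper.

```python
# Minimal sketch: uniform Monte Carlo over the prior versus random-walk
# Metropolis sampling, counting how often "behavioral" parameter sets are found.
# Toy recession model and all settings are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(0, 30.0)
true_q0, true_k = 12.0, 0.25
obs = true_q0 * np.exp(-true_k * t) + rng.normal(0, 0.3, t.size)

def log_like(q0, k):
    sim = q0 * np.exp(-k * t)
    return -0.5 * np.sum((sim - obs) ** 2) / 0.3**2

lo, hi = np.array([1.0, 0.01]), np.array([30.0, 1.0])     # uniform prior bounds
behavioral = log_like(true_q0, true_k) - 10.0             # "behavioral" cutoff

# Plain Monte Carlo over the prior
n = 5000
mc = lo + (hi - lo) * rng.uniform(size=(n, 2))
mc_hits = np.mean([log_like(*x) > behavioral for x in mc])

# Random-walk Metropolis over the same prior
x = lo + (hi - lo) * rng.uniform(size=2)
lp, chain = log_like(*x), []
for _ in range(n):
    prop = x + rng.normal(0, [0.5, 0.02])
    if np.all(prop > lo) and np.all(prop < hi):            # reject outside prior
        lp_prop = log_like(*prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            x, lp = prop, lp_prop
    chain.append(x.copy())
chain = np.array(chain)
mcmc_hits = np.mean([log_like(*x) > behavioral for x in chain[1000:]])

print(f"behavioral fraction, uniform MC: {mc_hits:.3f}")
print(f"behavioral fraction, Metropolis: {mcmc_hits:.3f}")
```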

12.
A methodology for the performance‐based seismic risk assessment of classical columns is presented. Despite their apparent instability, classical columns are, in general, earthquake resistant, as proven by the fact that many classical monuments have survived many strong earthquakes over the centuries. Nevertheless, the quantitative assessment of their reliability and the understanding of their dynamic behavior are not easy, because of the fundamental nonlinear character and the sensitivity of their response. In this paper, a seismic risk assessment is performed for a multidrum column using Monte Carlo simulation with synthetic ground motions. The ground motions adopted contain a high‐ and a low‐frequency component, combining the stochastic method and a simple analytical pulse model to simulate the directivity pulse contained in near source ground motions. The deterministic model for the numerical analysis of the system is three‐dimensional and is based on the Discrete Element Method. Fragility curves are produced conditional on magnitude and distance from the fault and also on scalar intensity measures for two engineering demand parameters, one concerning the intensity of the response during the ground shaking and the other the residual deformation of the column. Three performance levels are assigned to each engineering demand parameter. Fragility analysis demonstrated some of the salient features of these spinal systems under near‐fault seismic excitations, such as their decreased vulnerability for very strong earthquakes of magnitude 7 or larger. The analysis provides useful results regarding the seismic reliability of classical monuments and decision making during the restoration process. Copyright © 2013 John Wiley & Sons, Ltd.  相似文献
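The fragility-curve construction step can be sketched generically: simulate exceedance or non-exceedance of a demand threshold at several intensity levels and fit a lognormal CDF to the binary outcomes by maximum likelihood. The toy demand model, the capacity value, and the intensity levels below are assumptions, not the Discrete Element results of the paper.

```python
# Minimal sketch: building a fragility curve from Monte Carlo response data by
# fitting a lognormal CDF with maximum likelihood to binary exceedance data.
# The toy demand model, capacity, and intensity-measure (IM) levels are assumptions.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(4)

im_levels = np.linspace(0.1, 1.2, 12)          # toy intensity measure levels
n_per_level = 200
capacity = 0.35                                 # demand threshold (e.g. normalized rotation)

# Toy demand model: lognormal demand whose median grows with IM
records = []
for im in im_levels:
    demand = np.exp(np.log(0.4 * im**1.3) + 0.5 * rng.standard_normal(n_per_level))
    records.append((im, int(np.sum(demand > capacity)), n_per_level))

def neg_log_lik(params):
    ln_theta, beta = params
    ll = 0.0
    for im, n_exc, n_tot in records:
        p = norm.cdf((np.log(im) - ln_theta) / beta)
        p = np.clip(p, 1e-12, 1 - 1e-12)
        ll += n_exc * np.log(p) + (n_tot - n_exc) * np.log(1 - p)
    return -ll

res = minimize(neg_log_lik, x0=[np.log(0.6), 0.5], method="Nelder-Mead")
theta, beta = np.exp(res.x[0]), res.x[1]
print(f"fragility: median IM = {theta:.3f}, dispersion beta = {beta:.3f}")
for im, n_exc, n_tot in records[::3]:
    print(f"IM={im:.2f}  empirical={n_exc/n_tot:.2f}  "
          f"fitted={norm.cdf(np.log(im/theta)/beta):.2f}")
```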

13.
The paper discusses the performance and robustness of the Bayesian (probabilistic) approach to seismic tomography enhanced by the numerical Monte Carlo sampling technique. The approach is compared with two other popular techniques, namely the damped least-squares (LSQR) method and the general optimization approach. The theoretical considerations are illustrated by an analysis of seismic data from the Rudna (Poland) copper mine. Contrary to the LSQR and optimization techniques the Bayesian approach allows for construction of not only the “best-fitting” model of the sought velocity distribution but also other estimators, for example the average model which is often expected to be a more robust estimator than the maximum likelihood solution. We demonstrate that using the Markov Chain Monte Carlo sampling technique within the Bayesian approach opens up the possibility of analyzing tomography imaging uncertainties with minimal additional computational effort compared to the robust optimization approach. On the basis of the considered example it is concluded that the Monte Carlo based Bayesian approach offers new possibilities of robust and reliable tomography imaging.  相似文献   

14.
In this paper, a new probabilistic analytical approach, the minimal cut-based recursive decomposition algorithm (MCRDA), is presented to evaluate the seismic reliability of large-scale lifeline systems. Based on the minimal cut searching algorithm, the approach calculates the disjoint minimal cuts one by one using the basic procedure of the recursive decomposition method. At the same time, the process obtains the disjoint minimal paths of the system. In order to improve the computation efficiency, probabilistic inequality is used to calculate a solution that satisfies the prescribed error bound. A series of case studies show that MCRDA converges rapidly when the edges of the systems have low reliabilities. Therefore, the approach can be used to evaluate large-scale lifeline systems subjected to strong seismic wave excitation.  相似文献   

15.
An analytical approximation for the calculation of the stationary reliability of linear dynamic systems with higher‐dimensional output under Gaussian excitation is presented. For systems with certain parameters theoretical and computational issues are discussed for two topics: (1) the correlation of failure events at different parts of the failure boundary and (2) the approximation of the conditional out‐crossing rate across the failure boundary by the unconditional one. The correlation in the first topic is approximated by a multivariate integral, which is evaluated numerically by an efficient algorithm. For the second topic some existing semi‐empirical approximations are discussed and a new one is introduced. The extension to systems with uncertain parameters requires the calculation of a multi‐dimensional reliability integral over the space of the uncertain parameters. An existing asymptotic approximation is used for this task and an efficient scheme for numerical calculation of the first‐ and second‐order derivatives of the integrand is presented. Stochastic simulation using an importance sampling approach is also considered as an alternative method, especially for cases where the dimension of the uncertain parameters is moderately large. Comparisons between the proposed approximations and Monte Carlo simulation for some examples related to earthquake excitation are made. It is suggested that the proposed analytical approximations are appropriate for problems that require a large number of consistent error estimates of the probability of failure, as occurs in reliability‐based design optimization. Numerical problems regarding computational efficiency may arise when the dimension of both the output and the uncertain parameters is large. Copyright © 2006 John Wiley & Sons, Ltd.  相似文献   

16.
Fragility curves represent the conditional probability that a structure's response may exceed the performance limit for a given ground motion intensity. Conventional methods for computing building fragilities are either based on statistical extrapolation of detailed analyses on one or two specific buildings or make use of Monte Carlo simulation with these models. However, the Monte Carlo technique usually requires a relatively large number of simulations to obtain a sufficiently reliable estimate of the fragilities, and it is computationally expensive and time consuming to simulate the required thousands of time history analyses. In this paper, high‐dimensional model representation based response surface method together with the Monte Carlo simulation is used to develop the fragility curve, which is then compared with that obtained by using Latin hypercube sampling. It is used to replace the algorithmic performance‐function with an explicit functional relationship, fitting a functional approximation, thereby reducing the number of expensive numerical analyses. After the functional approximation has been made, Monte Carlo simulation is used to obtain the fragility curve of the system. Copyright © 2012 John Wiley & Sons, Ltd.  相似文献   

17.
Connectivity reliability analysis of urban water supply networks under earthquakes   Cited 1 time in total (0 self-citations, 1 other)
何双华  赵洋  宋灿 《地震学刊》2011,(5):585-589
Considering the random nature of both seismic action effects and pipeline resistance, a probabilistic prediction model for buried pipeline elements is established to assess their damage state during an earthquake. The water supply network is simplified into an edge-weighted directed graph, and a Monte Carlo simulation procedure is used to approximately reproduce the damage state of each pipe segment; the connectivity reliability of the network is then analyzed using a graph-theoretic method and a fuzzy relation matrix method, respectively. Because the Monte Carlo algorithm replaces an exact probabilistic analysis with the approximate frequency with which each network node remains connected to the source node, 5000 simulations were run for the example network to obtain stable results. The example analysis shows that the connectivity reliability results obtained by the graph-theoretic method and the fuzzy relation matrix method are essentially equal.  相似文献
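A minimal sketch of the Monte Carlo connectivity check described above: sample the survival state of every pipe and count how often each demand node stays connected to the source node. The small example network and its pipe failure probabilities are assumptions, not the network studied in the paper.

```python
# Minimal sketch: Monte Carlo connectivity reliability of a toy water network.
# Node "W" is the source; edge attribute 'pf' is the probability that the pipe
# fails under the considered seismic intensity (all values are assumptions).
import numpy as np
import networkx as nx

rng = np.random.default_rng(5)

edges = [("W", "A", 0.05), ("W", "B", 0.10), ("A", "B", 0.08),
         ("A", "C", 0.15), ("B", "D", 0.12), ("C", "D", 0.10),
         ("C", "E", 0.20), ("D", "E", 0.15)]
G = nx.Graph()
for u, v, pf in edges:
    G.add_edge(u, v, pf=pf)

demand_nodes = ["A", "B", "C", "D", "E"]
n_sim = 5000
connected = {n: 0 for n in demand_nodes}

for _ in range(n_sim):
    H = nx.Graph()
    H.add_nodes_from(G.nodes())
    for u, v, data in G.edges(data=True):
        if rng.uniform() > data["pf"]:          # pipe survives this simulation
            H.add_edge(u, v)
    reachable = nx.node_connected_component(H, "W")
    for n in demand_nodes:
        connected[n] += n in reachable

for n in demand_nodes:
    print(f"node {n}: connectivity reliability ~ {connected[n] / n_sim:.3f}")
```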

18.
Completeness analysis of the earthquake catalog of North China   Cited 6 times in total (1 self-citation, 6 other)
刘杰  陈棋福 《地震》1996,16(1):59-67
This paper develops a method proposed by Tinti et al. (1985) for assessing the incompleteness of earthquake catalogs. The method consists of three steps: removal of aftershocks and swarms, identification of relative completeness, and calculation of absolute completeness. After aftershocks and swarms are removed, an idealized catalog (i.e., one from which no earthquakes are missing) is assumed to follow a stationary Poisson process; for an actual catalog, goodness-of-fit tests are used to divide it into a set of time intervals, each of which is relatively complete but has a different earthquake occurrence rate. If at least one interval of the catalog is judged to be absolutely complete, then the true…  相似文献

19.
Taking the Luotuocheng (骆驼城) earthen ruins in Gaotai, Gansu Province, a site under national key cultural relics protection, as the study object, the site was investigated in turn by field survey, laboratory geotechnical testing, statistical analysis, and numerical simulation. Reliability theory was introduced into finite element dynamic analysis: a program was written in the APDL language, and the Monte Carlo method was used to run one thousand sampled numerical simulations for a probabilistic seismic analysis, thereby providing a new method for evaluating the seismic safety of earthen heritage sites. The results offer an important reference for the seismic protection and strengthening of the Luotuocheng ruins, and also provide a basis for disaster prevention and mitigation work at similar earthen sites.  相似文献

20.
Fast performance uncertainty estimation via pushover and approximate IDA   Cited 1 time in total (0 self-citations, 1 other)
Approximate methods based on the static pushover are introduced to estimate the seismic performance uncertainty of structures having non‐deterministic modeling parameters. At their basis lies the use of static pushover analysis to approximate Incremental Dynamic Analysis (IDA) and estimate the demand and capacity epistemic uncertainty. As a testbed we use a nine‐storey steel frame having beam hinges with uncertain moment–rotation relationships. Their properties are fully described by six, randomly distributed, parameters. Using Monte Carlo simulation with Latin hypercube sampling, a characteristic ensemble of structures is created. The Static Pushover to IDA (SPO2IDA) software is used to approximate the IDA capacity curve from the appropriately post‐processed results of the static pushover. The approximate IDAs allow the evaluation of the seismic demand and capacity for the full range of limit‐states, even close to global dynamic instability. Moment‐estimating techniques such as Rosenblueth's point estimating method and the first‐order, second‐moment (FOSM) method are adopted as simple alternatives to obtain performance statistics with only a few simulations. The pushover is shown to be a tool that combined with SPO2IDA and moment‐estimating techniques can supply the uncertainty in the seismic performance of first‐mode‐dominated buildings for the full range of limit‐states, thus replacing semi‐empirical or code‐tabulated values (e.g. FEMA‐350), often adopted in performance‐based earthquake engineering. Copyright © 2009 John Wiley & Sons, Ltd.  相似文献   
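As a compact illustration of the moment-estimating side of this approach, the sketch below applies Rosenblueth's 2^K point-estimate method to a toy capacity function of three uncertain parameters and checks it against Monte Carlo; the response function and the parameter statistics are assumptions, not the frame or SPO2IDA results of the paper.

```python
# Minimal sketch: Rosenblueth's 2^K point-estimate method for the mean and
# standard deviation of a response depending on K uncorrelated, symmetrically
# distributed parameters, checked against Monte Carlo. Toy function and
# parameter statistics are illustrative assumptions.
import itertools
import numpy as np

rng = np.random.default_rng(6)

# Toy response: a capacity-like quantity depending on 3 hinge parameters
def response(x):
    my, ah, mu_theta = x
    return 2.0 * my * (1.0 + 1.5 * ah) * mu_theta**0.5

means = np.array([1.0, 0.05, 4.0])
stds = np.array([0.10, 0.02, 0.8])

# Rosenblueth: evaluate at every (mu_i +/- sigma_i) corner, equal weights 1/2^K
corners = [response(means + np.array(s) * stds)
           for s in itertools.product([-1.0, 1.0], repeat=3)]
m1 = np.mean(corners)                       # first moment
m2 = np.mean(np.square(corners))            # second moment
print(f"point estimate: mean={m1:.3f}  std={np.sqrt(m2 - m1**2):.3f}")

# Monte Carlo reference (independent normal parameters)
x = means + stds * rng.standard_normal((100_000, 3))
y = 2.0 * x[:, 0] * (1.0 + 1.5 * x[:, 1]) * np.sqrt(np.clip(x[:, 2], 0, None))
print(f"Monte Carlo   : mean={y.mean():.3f}  std={y.std():.3f}")
```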
