Similar Documents
20 similar documents retrieved.
1.
Because artificial open channels are costly infrastructure, their design should ensure reliability as well as optimality in project cost. This paper presents a reliability analysis of composite channels that considers the uncertainty associated with design parameters such as friction factors, longitudinal slope, channel width, side slope, and flow depth. The study also accounts for uncertainties in watershed characteristics, rainfall intensity, and drainage area to quantify the uncertainty of runoff. For uncertainty modeling, the advanced first-order second-moment (AFOSM) method and Monte Carlo simulation are used, and the results of the two approaches show good agreement. A reliability index is then presented that can be used to design a composite channel to convey the design discharge at a specified risk or probability of failure, and its sensitivity to the different channel design parameters is analyzed. To validate the effectiveness of the approach, reliability values and safety factors for variable system-loading scenarios are obtained under static and dynamic environments. The sensitivity analysis shows that flow depth and bed width are the parameters that most strongly influence the safety factor and reliability.
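As a rough illustration of the load-capacity framing in this abstract, the sketch below estimates a failure probability and reliability index for a trapezoidal composite channel by Monte Carlo simulation, with capacity from Manning's equation. All parameter statistics are invented placeholders, not the paper's data.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
N = 200_000

# Hypothetical parameter statistics (illustrative only, not the paper's data)
n  = rng.normal(0.015, 0.0015, N)              # Manning roughness
S0 = rng.normal(1.0e-3, 1.0e-4, N)             # longitudinal slope
b  = rng.normal(10.0, 0.3, N)                  # bed width [m]
z  = rng.normal(2.0, 0.1, N)                   # side slope (horizontal:vertical)
y  = rng.normal(2.5, 0.05, N)                  # flow depth [m]
Q_load = rng.lognormal(np.log(60.0), 0.25, N)  # runoff-derived design discharge [m3/s]

# Channel capacity from Manning's equation for a trapezoidal section
A = (b + z * y) * y                            # flow area
P = b + 2.0 * y * np.sqrt(1.0 + z ** 2)        # wetted perimeter
Q_cap = A * (A / P) ** (2.0 / 3.0) * np.sqrt(np.clip(S0, 1e-8, None)) / n

# Performance function: failure when capacity drops below the load
g = Q_cap - Q_load
pf_mc = np.mean(g < 0.0)                       # Monte Carlo failure probability
beta_mc = norm.ppf(1.0 - pf_mc)                # equivalent reliability index
beta_fosm = g.mean() / g.std()                 # crude mean-value second-moment check

print(f"P_f = {pf_mc:.4f}, beta_MC = {beta_mc:.2f}, beta_FOSM = {beta_fosm:.2f}")
```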

2.
Previous comparison studies on seismic isolation have demonstrated its beneficial and detrimental effects on the structural performance of high‐speed rail bridges during earthquakes. Striking a balance between these 2 competing effects requires proper tuning of the controlling design parameters in the design of the seismic isolation system. This results in a challenging problem for practical design in performance‐based engineering, particularly when the uncertainty in seismic loading needs to be explicitly accounted for. This problem can be tackled using a novel probabilistic performance‐based optimum seismic design (PPBOSD) framework, which has been previously proposed as an extension of the performance‐based earthquake engineering methodology. For this purpose, a parametric probabilistic demand hazard analysis is performed over a grid in the seismic isolator parameter space, using high‐throughput cloud‐computing resources, for a California high‐speed rail (CHSR) prototype bridge. The derived probabilistic structural demand hazard results conditional on a seismic hazard level and unconditional, i.e., accounting for all seismic hazard levels, are used to define 2 families of risk features, respectively. Various risk features are explored as functions of the key isolator parameters and are used to construct probabilistic objective and constraint functions in defining well‐posed optimization problems. These optimization problems are solved using a grid‐based, brute‐force approach as an application of the PPBOSD framework, seeking optimum seismic isolator parameters for the CHSR prototype bridge. This research shows the promising use of seismic isolation for CHSR bridges, as well as the potential of the versatile PPBOSD framework in solving probabilistic performance‐based real‐world design problems.  相似文献   
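To make the grid-based, brute-force idea concrete, here is a minimal sketch: risk features are tabulated over a two-parameter isolator grid and the feasible grid point with the smallest objective is selected. The two closed-form "risk surfaces" are placeholders standing in for the demand-hazard results, and the parameter names (post-yield stiffness ratio, normalized yield strength) are assumptions, not the study's actual design variables.

```python
import numpy as np

# Hypothetical isolator parameter grid: kd = post-yield stiffness ratio, Fy = normalized yield strength
kd = np.linspace(0.02, 0.20, 40)
Fy = np.linspace(0.03, 0.15, 40)
KD, FY = np.meshgrid(kd, Fy, indexing="ij")

# Placeholder surrogates for two risk features, e.g. the mean annual rate of exceeding
# a deck-drift limit (objective) and a bearing-displacement limit (constraint).
objective  = (KD - 0.08) ** 2 + 0.5 * (FY - 0.07) ** 2     # to be minimized
constraint = 0.002 / (FY + 0.01) + 5.0 * KD ** 2           # must stay below a cap

# Brute-force search: mask infeasible points, pick the smallest remaining objective
feasible = constraint <= 0.08
masked = np.where(feasible, objective, np.inf)
i, j = np.unravel_index(np.argmin(masked), masked.shape)
print(f"optimum: kd = {kd[i]:.3f}, Fy = {Fy[j]:.3f}, objective = {objective[i, j]:.4f}")
```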

3.
In this study, a simulation-based fuzzy chance-constrained programming (SFCCP) model is developed based on possibility theory. The model is solved through an indirect search approach that integrates fuzzy simulation, artificial neural network, and simulated annealing techniques. The approach has the advantages of (1) handling simulation and optimization problems under uncertainty associated with fuzzy parameters, (2) providing additional information (i.e., the possibility of constraint satisfaction) that indicates how much confidence one can place in the decision results, (3) alleviating the computational burden of the optimization process, and (4) reducing the chance of being trapped in local optima. The model is applied to a petroleum-contaminated aquifer in western Canada to support the optimal design of groundwater remediation systems. The model solutions provide optimal groundwater pumping rates for 3-, 5-, and 10-year pumping schemes. It is observed that uncertainty significantly affects the remediation strategies; to mitigate such impacts, additional cost is required either for an increased pumping rate or for enhanced site characterization.

4.
A stochastic optimization model based on an adaptive feedback correction process and surrogate-model uncertainty was proposed and applied to remediation strategy design at a dense non-aqueous phase liquid (DNAPL)-contaminated groundwater site. One hundred initial training samples were obtained using the Latin hypercube sampling method. A surrogate model of a multiphase flow simulation model was constructed from these samples employing the self-adaptive particle swarm optimization kriging (SAPSOKRG) method. An optimization model was built using the SAPSOKRG surrogate model as a constraint. An adaptive feedback correction process was then designed and applied to iteratively update the training samples, the surrogate model, and the optimization model. Results showed that the training samples, the surrogate model, and the optimization model were effectively improved. However, the surrogate model is an approximation of the simulation model, and some degree of uncertainty remains even after this improvement. Therefore, residuals between the surrogate model and the simulation model were calculated, and an uncertainty analysis was conducted. Based on the uncertainty analysis results, a stochastic optimization model was constructed and solved to obtain optimal remediation strategies at different confidence levels (60, 70, 80, 90, and 95%) and under different remediation objectives (average DNAPL removal rate ≥70, ≥75, ≥80, ≥85, and ≥90%). The optimization results demonstrated that the higher the confidence level and the remediation objective, the more expensive the remediation. Decision makers can therefore weigh remediation costs, confidence levels, and remediation objectives to make an informed choice. This also allows decision makers to determine the reliability of a selected strategy and provides a new tool for the design of DNAPL-contaminated groundwater remediation.
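A minimal sketch of the surrogate-plus-uncertainty idea, using an ordinary Gaussian-process (kriging) surrogate from scikit-learn instead of the paper's SAPSOKRG implementation: the simulator is a toy removal-rate function, the residual standard deviation plays the role of surrogate uncertainty, and the chance-constrained check keeps a confidence-level margin on the remediation objective. All numbers and function forms are hypothetical.

```python
import numpy as np
from scipy.stats import norm, qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Placeholder "simulation model": DNAPL removal rate (%) as a function of two pumping rates.
def simulator(x):
    q1, q2 = x[..., 0], x[..., 1]
    return 100.0 * (1.0 - np.exp(-0.04 * q1 - 0.03 * q2))

# Latin hypercube training samples over the pumping-rate design space
sampler = qmc.LatinHypercube(d=2, seed=1)
X = qmc.scale(sampler.random(100), l_bounds=[0.0, 0.0], u_bounds=[60.0, 60.0])
y = simulator(X)

# Kriging (Gaussian-process) surrogate of the simulator
gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=20.0),
                              normalize_y=True).fit(X, y)

# Residual-based uncertainty: spread of surrogate errors on extra check points
X_check = qmc.scale(sampler.random(50), [0.0, 0.0], [60.0, 60.0])
sigma_res = np.std(simulator(X_check) - gp.predict(X_check))

# Chance-constrained check: removal >= 80% must still hold after subtracting a
# confidence-level margin on the surrogate prediction.
def satisfies(x, target=80.0, confidence=0.90):
    pred = gp.predict(np.atleast_2d(x))[0]
    return pred - norm.ppf(confidence) * sigma_res >= target

print(satisfies([45.0, 40.0]))
```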

5.
Dam overtopping risk assessment considering inspection program
Safety inspections of large dams in Taiwan are conducted every 5 years, a practice that does not take the uncertainty of dam conditions into consideration. The goal of this study is to determine the optimal dam inspection interval by considering the overtopping risk while incorporating the uncertainty of spillway gate availability. Earlier studies assessed overtopping risk only from the uncertainties in reservoir properties and the natural randomness of hydrologic events, without accounting for the availability of the spillway gates; as a result, the overtopping risk could be underestimated. In this study, a concept is proposed to evaluate dam overtopping that takes spillway gate availability into account. The framework consists of three parts: (1) evaluation of the conditional overtopping risk for different numbers of malfunctioning spillway gates; (2) evaluation of spillway gate availability; and (3) dam inspection scheduling. Overtopping risk, inspection cost, and dam-break cost are then weighed to determine the optimal inspection schedule. The methodology is applied to the Shihmen Reservoir in Taiwan to evaluate its time-dependent overtopping risk. Results show that the overtopping risk computed with spillway gate availability taken into account is higher than the risk computed without it.
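The core of the framework's first two parts can be written in a few lines: weight the conditional overtopping risks by the probability of each gate-availability state (here a simple binomial model with identical, independent gates) and sum by the total probability theorem. The gate count, unavailability, and conditional risks below are invented for illustration, not Shihmen Reservoir values.

```python
import numpy as np
from scipy.stats import binom

# Illustrative numbers, not from the Shihmen Reservoir study
n_gates = 5          # number of spillway gates
p_unavail = 0.05     # probability that any single gate is unavailable on demand

# Conditional annual overtopping risk given k gates out of service (k = 0..5);
# in the actual framework these come from hydrologic/reservoir routing simulations.
cond_risk = np.array([1e-4, 3e-4, 1e-3, 4e-3, 2e-2, 1e-1])

# Total probability theorem: weight each conditional risk by its gate-availability state
p_k = binom.pmf(np.arange(n_gates + 1), n_gates, p_unavail)
total_risk = np.sum(p_k * cond_risk)

print(f"risk ignoring gate failures : {cond_risk[0]:.2e}")
print(f"risk with gate availability : {total_risk:.2e}")
```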

6.
Typical pump-and-treat (PAT) optimization problems involve the design of pumping schemes that minimize cost while meeting a set of constraints. Because information about the hydrogeological system is scarce, stochastic modeling approaches can be used to assess tradeoffs between optimality and reliability. Under a stochastic approach, the constrained single-objective problem may be turned into a multiobjective problem by substituting the constraint inequalities with an additional objective function (OF) that accounts for the reliability of the PAT process. In this work, two approaches are analyzed: in one case, the additional OF is the probability of failure of a given remediation policy; in the other, the additional OF is the recourse, namely the penalty cost induced by the violation of constraints. To overcome the overwhelming computational cost of stochastic simulation, surrogate forms of the OFs are introduced. In the test case under investigation, these functions are estimated by kriging interpolation of the OF over a series of data points obtained from stochastic simulations of flow and transport, and are calibrated against stochastic optimization solutions. The analysis of the two approaches for addressing the cost-versus-reliability tradeoff indicates that the recourse accounts not only for the frequency of constraint violations, as the probability of failure does, but also for the intensity with which they occur. Ultimately, the recourse method allows less restrictive policies to be considered, although these may be highly sensitive to the choice of the penalty functions.
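The difference between the two additional objective functions can be illustrated directly on stochastic simulation output. In the sketch below, hypothetical concentrations at a compliance well over many conductivity realizations are compared against a limit: the probability of failure counts only how often the constraint is violated, while the recourse also weighs how badly. The numbers and the penalty form are placeholders, not the paper's test case.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical Monte Carlo output: compliance-well concentration for one candidate
# pumping policy, across 1000 hydraulic-conductivity realizations.
c_sim = rng.lognormal(mean=np.log(4.0), sigma=0.5, size=1000)   # mg/L
c_limit = 5.0                                                    # regulatory limit, mg/L
penalty = 2.0e4                                                  # cost per mg/L of exceedance

# Objective 1: probability of failure (frequency of violations only)
prob_failure = np.mean(c_sim > c_limit)

# Objective 2: expected recourse (frequency AND intensity of violations)
expected_recourse = np.mean(penalty * np.maximum(c_sim - c_limit, 0.0))

print(f"P(failure)        = {prob_failure:.3f}")
print(f"expected recourse = {expected_recourse:,.0f} (cost units)")
```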

7.
This paper investigates numerical optimization of dense nonaqueous phase liquid (DNAPL) site remediation design considering effects of prediction and measurement uncertainty. Results are presented for a hypothetical problem involving remediation using thermal source reduction (TSR) and bioremediation with electron donor (ED) injection. Pump-and-treat is utilized as a backup measure if compliance criteria are not met. Remediation system design variables are optimized to minimize expected net present value (ENPV) cost. Adaptive criteria are assumed for real-time control of TSR and ED duration. Source zone dissolved concentration data enabled more reliable and lower cost operation of TSR than soil concentration data, but using both soil and dissolved data improved results sufficiently to more than offset the additional cost. Decisions to terminate remediation and monitoring or to initiate pump-and-treat are complicated by measurement noise. Simultaneous optimization of monitoring frequency, averaging period, and lookback periods to confirm decisions, in addition to remediation design variables, reduced ENPV cost. Results indicate that remediation design under conditions of uncertainty is affected by subtle interactions and tradeoffs between design variables, compliance rules, site characteristics, and uncertainty in model predictions and monitoring data. Optimized designs yielded cost savings of up to approximately 50% compared with a nonoptimized design based on common engineering practices. Significant improvements in accuracy and reductions in cost were achieved by recalibrating the model to data collected during remediation and re-optimizing design variables. Repeating this process periodically is advisable to minimize total costs and maximize reliability.  相似文献   

8.
The St. Vénant equations for unsteady flow in open channels and the Muskingum method are written both in their conventional forms and in the state-space formulation. The hydrodynamic equation of motion is solved by the method of state trajectory variation and the result for the first-order variation in the state-space variables is used as a basis of linking the parameters of the Muskingum model with the hydraulic parameters of the open channel reach. The results are applicable to any shape of cross-section and to any type of friction law.  相似文献   
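For reference, the conventional Muskingum recursion that the paper recasts in state-space form can be sketched as below; the link between K, X and the reach hydraulics derived in the paper is not reproduced here, so K and X are simply treated as given inputs.

```python
import numpy as np

def muskingum_route(inflow, K, X, dt):
    """Route an inflow hydrograph through a reach with the standard Muskingum recursion.

    K  : storage time constant [h] (in the paper, tied to the hydraulics of the reach)
    X  : weighting factor (0 <= X <= 0.5)
    dt : time step [h]
    """
    denom = 2.0 * K * (1.0 - X) + dt
    c0 = (dt - 2.0 * K * X) / denom
    c1 = (dt + 2.0 * K * X) / denom
    c2 = (2.0 * K * (1.0 - X) - dt) / denom

    outflow = np.empty_like(inflow, dtype=float)
    outflow[0] = inflow[0]
    for t in range(1, len(inflow)):
        # One-step state update: O_t = c0*I_t + c1*I_{t-1} + c2*O_{t-1}
        outflow[t] = c0 * inflow[t] + c1 * inflow[t - 1] + c2 * outflow[t - 1]
    return outflow

# Synthetic triangular inflow hydrograph [m3/s], hourly steps
I = np.concatenate([np.linspace(10, 100, 10), np.linspace(100, 10, 19)])
print(muskingum_route(I, K=3.0, X=0.2, dt=1.0).round(1))
```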

9.
Current reliability‐based control techniques have been successfully applied to linear systems; however, incorporation of stochastic nonlinear behavior of systems in such control designs remains a challenge. This paper presents two reliability‐based control algorithms that minimize failure probabilities of nonlinear hysteretic systems subjected to stochastic excitations. The proposed methods include constrained reliability‐based control (CRC) and unconstrained reliability‐based control (URC) algorithms. Accurate probabilistic estimates of nonlinear system responses to stochastic excitations are derived analytically using enhanced stochastic averaging of energy envelope proposed previously by the authors. Convolving these demand estimates with capacity models yields the reliability of nonlinear systems in the control design process. The CRC design employs the first‐level and second‐level optimizations sequentially where the first‐level optimization solves the Hamilton–Jacobi–Bellman equation and the second‐level optimization searches for optimal objective function parameters to minimize the probability of failure. In the URC design, a single optimization minimizes the probability of failure by directly searching for the optimal control gain. Application of the proposed control algorithms to a building on nonlinear foundation has shown noticeable improvements in system performance under various stochastic excitations. The URC design appears to be the most optimal method as it reduced the probability of slight damage to 8.7%, compared with 11.6% and 19.2% for the case of CRC and a stochastic linear quadratic regulator, respectively. Under the Kobe ground motion, the normalized peak drift displacement with respect to stochastic linear quadratic regulator is reduced to 0.78 and 0.81 for the URC and CRC cases, respectively, at comparable control force levels. Copyright © 2017 John Wiley & Sons, Ltd.  相似文献   

10.
This paper revisits the phenomenon of dynamic soil‐structure interaction (SSI) with a probabilistic approach. For this purpose, a twofold objective is pursued. First, the effect of SSI on inelastic response of the structure is studied considering the prevailing uncertainties. Second, the consequence of practicing SSI provisions of the current seismic design codes on the structural performance is investigated in a probabilistic framework. The soil‐structure system is modeled by the sub‐structure method. The uncertainty in the properties of the soil and the structure is described by random variables that are input to this model. Monte Carlo sampling analysis is employed to compute the probability distribution of the ductility demand of the structure, which is selected as the metrics for the structural performance. In each sample, a randomly generated soil‐structure system is subjected to a randomly selected and scaled ground motion. To comprehensively model the uncertainty in the ground motion, a suite of 3269 records is employed. An extensive parametric study is conducted to cover a wide range of soil‐structure systems. The results reveal the probability that SSI increases the ductility demand of structures designed based on the conventional fixed‐based assumption but built on flexible soil in reality. The results also show it is highly probable that practicing SSI provisions of modern seismic codes increase the ductility demand of the structure. Copyright © 2016 John Wiley & Sons, Ltd.  相似文献   

11.
Highly non-linear seismic trace inversion problems can be solved efficiently by an implementation of Tabu Search, a meta-heuristic method related to artificial intelligence. The implementation under consideration is a deterministic, global search that combines the advantages of a local search, giving a quick descent to local misfit minima, with an ability to cross misfit barriers in the model space. Once Tabu Search has found an area of low misfit, it performs an extensive exploration of its deepest points. This property makes it possible to use Tabu Search for a semiquantitative resolution and uncertainty analysis of the inverse problem.
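A minimal Tabu Search sketch on a toy discretized inversion: each iteration takes the best non-tabu single-parameter move (even uphill, which is what lets the search cross misfit barriers), recent moves are kept on a tabu list, and the best model seen so far is retained. The five-parameter "trace" and the grid are invented stand-ins for the seismic problem.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy "trace inversion": recover a 1-D reflectivity-like model from noisy data.
true_model = np.array([0.0, 0.3, -0.2, 0.5, 0.1])
data = true_model + rng.normal(0.0, 0.02, true_model.size)
grid = np.round(np.arange(-0.6, 0.61, 0.1), 1)         # discretized parameter values

def misfit(m):
    return np.sum((m - data) ** 2)

# Minimal tabu search: greedy single-parameter moves; recently used moves are tabu.
current = np.zeros_like(true_model)
best, best_fit = current.copy(), misfit(current)
tabu, tabu_len = [], 10
for _ in range(300):
    neighbours = []
    for i in range(current.size):
        for v in grid:
            if v != current[i] and (i, v) not in tabu:
                cand = current.copy()
                cand[i] = v
                neighbours.append(((i, v), cand, misfit(cand)))
    move, current, fit = min(neighbours, key=lambda t: t[2])   # best non-tabu move, even uphill
    tabu.append(move)
    if len(tabu) > tabu_len:
        tabu.pop(0)
    if fit < best_fit:                                         # keep the overall best model
        best, best_fit = current.copy(), fit

print(best, round(best_fit, 4))
```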

12.
Uncertainty plagues every effort to model subsurface processes and every decision made on the basis of such models. Given this pervasive uncertainty, virtually all practical problems in hydrogeology can be formulated in terms of (ecologic, monetary, health, regulatory, etc.) risk. This review deals with hydrogeologic applications of recent advances in uncertainty quantification, probabilistic risk assessment (PRA), and decision-making under uncertainty. The subjects discussed include probabilistic analyses of exposure pathways, PRAs based on fault tree analyses and other systems-based approaches, PDF (probability density functions) methods for propagating parametric uncertainty through a modeling process, computational tools (e.g., random domain decompositions and transition probability based approaches) for quantification of geologic uncertainty, Bayesian algorithms for quantification of model (structural) uncertainty, and computational methods for decision-making under uncertainty (stochastic optimization and decision theory). The review is concluded with a brief discussion of ways to communicate results of uncertainty quantification and risk assessment.  相似文献   

13.
A model and criteria for seismic reliability optimization of frame structures
Taking frame structures as the research object, this paper proposes a model for structural seismic reliability optimization, in which the optimal structure is expressed as the one with the minimum probability of seismic failure under the constraints of a given upper bound on material mass and lower bounds on member cross-sectional dimensions. All constraint functions in the model are linear functions of the design variables, so the feasible region is a convex domain. The Kuhn-Tucker conditions are used to derive the criteria that the optimal solution must satisfy, from which an iterative optimization scheme can be constructed. The physical meaning of the optimality criteria is discussed, and an analytical procedure is given for the partial derivatives of the objective function with respect to the design variables.
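In generic form, the optimization model and its Kuhn-Tucker (KKT) criteria can be written as follows, with x the vector of design variables (member sizes), P_f the seismic failure probability, and the linear constraints collecting the material-mass upper bound and the section-size lower bounds; this is the standard statement of such a problem, not the paper's specific derivation.

```latex
% Generic form: minimize the seismic failure probability under linear design constraints
\min_{\mathbf{x}} \; P_f(\mathbf{x})
\quad \text{s.t.} \quad
g_j(\mathbf{x}) = \mathbf{a}_j^{\mathsf{T}} \mathbf{x} - b_j \le 0, \qquad j = 1,\dots,m

% Kuhn-Tucker (KKT) criteria satisfied by the optimum x^*
\nabla P_f(\mathbf{x}^{*}) + \sum_{j=1}^{m} \lambda_j \mathbf{a}_j = \mathbf{0},
\qquad \lambda_j \ge 0,
\qquad \lambda_j \, g_j(\mathbf{x}^{*}) = 0 .
```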

14.
A probabilistic description of the inversion of noisy data
Based on Bayesian theory, this paper presents a concrete workflow and methods for processing noisy geophysical data, consisting mainly of likelihood-function estimation and posterior-probability computation. We extend the concept of a data vector to a set of data vectors and, by introducing a degree of confidence in the data space, transfer the data noise onto the probability density function in the model space, thereby obtaining a likelihood function that reflects the uncertainty of the data themselves. Because the method avoids manual intervention in the data space during processing, it guarantees that the probability density in the model space reflects the data noise alone, offering high information fidelity and preservation of feasible solutions. To obtain a posterior distribution that incorporates prior information, a probabilistic analysis method using weighting matrices is proposed; it introduces geological information directly in the model space and strongly constrains the non-uniqueness of the inversion caused by noise. The whole workflow is demonstrated with magnetotelluric inversion as an example.
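A grid-based sketch of the likelihood-posterior computation for a toy one-parameter problem (estimating a single log-resistivity from repeated noisy readings) is given below; it illustrates how known data noise becomes a probability density over the model space and how a prior is multiplied in, but it is far simpler than the magnetotelluric workflow of the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy 1-D "inversion": estimate a half-space log10-resistivity m from noisy observations.
m_true = 2.0                                    # true log10 resistivity
sigma_d = 0.1                                   # known data-noise standard deviation
d_obs = m_true + rng.normal(0.0, sigma_d, 8)    # repeated noisy measurements

m_grid = np.linspace(1.0, 3.0, 601)             # discretized model space

# Likelihood: data noise mapped onto a PDF over the model space
log_like = -0.5 * np.sum((d_obs[None, :] - m_grid[:, None]) ** 2, axis=1) / sigma_d ** 2

# Prior carrying "geological" information (here a broad Gaussian around a prior guess)
log_prior = -0.5 * ((m_grid - 2.3) / 0.5) ** 2

# Posterior = likelihood * prior, normalized on the grid
log_post = log_like + log_prior
post = np.exp(log_post - log_post.max())
post /= np.trapz(post, m_grid)

print(f"posterior mean = {np.trapz(m_grid * post, m_grid):.3f}")
```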

15.
The specific objective of the paper is to propose a new flood frequency analysis method that considers both the uncertainty of probability distribution selection (model uncertainty) and the uncertainty of parameter estimation (parameter uncertainty). Based on Bayesian theory, the sampling distribution of quantiles or design floods coupling these two kinds of uncertainty is derived, so that not only point estimators but also confidence intervals of the quantiles can be provided. Markov chain Monte Carlo is adopted to overcome the difficulty of computing the integrals involved in estimating the sampling distribution. As an example, the proposed method is applied to flood frequency analysis at a gauge on the Huai River, China. It is shown that approaches considering only model uncertainty or only parameter uncertainty cannot fully account for the uncertainties in quantile estimation; instead, a method coupling the two uncertainties should be employed. Furthermore, the proposed Bayesian method provides not only various quantile estimators but also a quantitative assessment of the uncertainties of flood frequency analysis.
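A stripped-down sketch of the MCMC part: random-walk Metropolis sampling of the parameters of one candidate distribution (Gumbel) for a synthetic annual-maximum series, giving the sampling distribution of the 100-year quantile with a point estimate and a credible interval. Handling model uncertainty as in the paper would additionally weight several candidate distributions; the data and flat priors here are placeholders.

```python
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(5)

# Synthetic annual-maximum flood series [m3/s]; the study uses a Huai River gauge record.
flows = gumbel_r.rvs(loc=800.0, scale=250.0, size=40, random_state=rng)

def log_post(theta):
    loc, scale = theta
    if scale <= 0.0:
        return -np.inf
    return np.sum(gumbel_r.logpdf(flows, loc=loc, scale=scale))   # flat prior

# Random-walk Metropolis over the Gumbel parameters (parameter uncertainty only)
theta = np.array([flows.mean(), flows.std()])
lp = log_post(theta)
samples = []
for _ in range(20_000):
    prop = theta + rng.normal(0.0, [20.0, 15.0])
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)
samples = np.array(samples[5_000:])                 # drop burn-in

# Sampling distribution of the 100-year quantile (non-exceedance probability 0.99)
q100 = gumbel_r.ppf(0.99, loc=samples[:, 0], scale=samples[:, 1])
print(f"100-yr flood: median = {np.median(q100):.0f} m3/s, "
      f"90% interval = ({np.percentile(q100, 5):.0f}, {np.percentile(q100, 95):.0f})")
```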

16.
Anyone working on inverse problems is aware of their ill-posed character. For inverse problems, this concept, proposed by J. Hadamard in 1902, admits revision, since it is closely related to their ill-conditioning and to the use of local optimization methods to find their solution. A more general and interesting approach with regard to risk analysis and epistemological decision making consists in analyzing the existence of families of equivalent model parameters that are compatible with the prior information and predict the observed data within the same error bounds. In other words, the ill-posed character (ill-conditioning) of discrete inverse problems means that their solution is uncertain. Traditionally, nonlinear inverse problems in discrete form have been solved via local optimization methods with regularization, but linear analysis techniques fail to account for the uncertainty in the solution that is adopted. Consequently, uncertainty analysis in nonlinear inverse problems has been approached in a probabilistic (Bayesian) framework, but these methods are hindered by the curse of dimensionality and by the high computational cost of solving the corresponding forward problems. Global optimization techniques are very attractive, but most of the time they are heuristic and have the same limitations as Monte Carlo methods. New research is needed to provide uncertainty estimates, especially for high-dimensional nonlinear inverse problems with very costly forward problems. After the discredit of deterministic methods and some initial years of Bayesian fever, the pendulum now seems to be swinging back, because practitioners are aware that uncertainty analysis in high-dimensional nonlinear inverse problems cannot (and should not) be solved via random sampling methodologies. The main reason is that the uncertainty "space" of nonlinear inverse problems has a mathematical structure that is embedded in the forward physics and in the observed data. Problems with such structure should therefore be approached via linear algebra and optimization techniques. This paper provides new insights for understanding uncertainty from a deterministic point of view, a necessary step toward designing more efficient methods to sample the uncertainty region(s) of equivalent solutions.

17.
This paper presents optimization and uncertainty analysis of operation policies for the Hirakud reservoir system in Orissa state, India. The Hirakud reservoir project serves multiple purposes, namely flood control, irrigation, and power generation, in that order of priority. A 10-daily reservoir operation model is formulated to maximize annual hydropower production subject to flood-control restrictions, irrigation requirements, and various other physical and technical constraints. The reservoir operation model is solved using the elitist-mutated particle swarm optimization (EMPSO) method, and the uncertainty in release decisions and end-of-period storages is analyzed. Comparing the annual hydropower production obtained by the EMPSO method with the historical annual hydropower shows a good chance of improving system performance by operating the reservoir optimally. The analysis also reveals that reservoir inflow is a highly uncertain variable that significantly influences the operational decisions for the reservoir system. Hence, to account for the uncertainty in inflow, the reservoir operation model is solved for different exceedance probabilities of inflow. The uncertainty in inflows is represented through probability distributions such as the normal, lognormal, exponential, and generalized extreme value distributions, and the best-fitting model is selected to obtain inflows for different exceedance probabilities. The reservoir operation model is then solved using the EMPSO method to arrive at suitable operation policies corresponding to the various inflow scenarios. The results show that the annual hydropower generated decreases as the inflow exceedance probability increases. The resulting operation policies provide confidence in release decisions and could therefore be useful for reservoir operation.
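The inflow-scenario step can be sketched as follows: fit the candidate distributions to an inflow sample, choose the best fit (here by the Kolmogorov-Smirnov statistic, one possible criterion; the paper's criterion may differ), and invert it at the required exceedance probabilities. The synthetic inflow series is a placeholder for the observed Hirakud record.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

# Synthetic 10-daily inflow sample [Mm3]; the study uses observed Hirakud inflows.
inflows = stats.lognorm.rvs(s=0.6, scale=900.0, size=60, random_state=rng)

# Candidate distributions named in the abstract
candidates = {
    "normal": stats.norm,
    "lognormal": stats.lognorm,
    "exponential": stats.expon,
    "GEV": stats.genextreme,
}

# Fit each candidate and rank by the Kolmogorov-Smirnov statistic
fits = {}
for name, dist in candidates.items():
    params = dist.fit(inflows)
    ks = stats.kstest(inflows, dist.cdf, args=params).statistic
    fits[name] = (ks, dist, params)

best_name, (ks, dist, params) = min(fits.items(), key=lambda kv: kv[1][0])
print(f"best-fitting distribution: {best_name} (KS = {ks:.3f})")

# Inflows for selected exceedance probabilities P(X > x) = p  ->  x = F^{-1}(1 - p)
for p in (0.50, 0.75, 0.90):
    print(f"exceedance {p:.0%}: inflow = {dist.ppf(1.0 - p, *params):.0f} Mm3")
```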

18.
Seismic optimization design of building structures based on zeroth-order and first-order optimization algorithms
Aimed at the third-level seismic fortification objective of "no collapse under a major earthquake", a seismic optimization design method for building structures is proposed. Following the design philosophy of "obtaining the best design with the same investment", a mathematical optimization model is established in which, under strong seismic excitation, the maximum interstory drift of the structure is minimized as the objective while a volume constraint is satisfied. A dynamic finite element model and an efficient explicit dynamic analysis method are used for the structural analysis that yields the maximum interstory drift, and zeroth-order and first-order optimization algorithms are used separately to solve the optimization model. Through secondary development based on the explicit dynamic analysis software ANSYS/LS-DYNA, the seismic optimization design of a three-dimensional frame structure is carried out. Numerical results show that the method obtains solutions of high quality, and the seismic performance of the building structure is greatly improved after the optimization design.
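The two optimization routes can be imitated with SciPy on a toy drift surrogate: a derivative-free (zeroth-order) solver and a gradient-based (first-order) solver both minimize the maximum interstory drift subject to a material-volume constraint. The drift function, member lengths, and size limits are invented placeholders; in the paper the drift is obtained from explicit dynamic analysis in ANSYS/LS-DYNA.

```python
import numpy as np
from scipy.optimize import minimize

# Design variables: member section areas of two member groups [m2] (placeholder model)
def max_drift(x):
    # Toy surrogate: drift decreases as sections grow, with diminishing returns
    return 0.05 / (1.0 + 2.0 * x[0]) + 0.04 / (1.0 + 3.0 * x[1])

lengths = np.array([40.0, 30.0])     # total member length per group [m]
volume_cap = 6.0                     # allowable material volume [m3]
x0 = np.array([0.05, 0.05])

cons = [
    {"type": "ineq", "fun": lambda x: volume_cap - lengths @ x},   # volume limit
    {"type": "ineq", "fun": lambda x: x - 0.01},                   # lower size bounds
    {"type": "ineq", "fun": lambda x: 0.2 - x},                    # upper size bounds
]

# Zeroth-order (derivative-free) solution
res0 = minimize(max_drift, x0, method="COBYLA", constraints=cons)
# First-order solution (finite-difference gradients)
res1 = minimize(max_drift, x0, method="SLSQP", constraints=cons)

print("zeroth-order:", res0.x.round(4), "drift:", round(res0.fun, 5))
print("first-order :", res1.x.round(4), "drift:", round(res1.fun, 5))
```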

19.
Existing riverbank riprap could face the risk of failure if the flood regime changes in the future. In addition, changed sediment transport in rivers, a possible result of climate change, affects the failure risk of flood protection measures. Evaluating this potential failure is the primary issue in riprap stability and safety assessment. The consequences of bank failure are likely uncontrolled erosion and flooding, with disastrous consequences in residential areas or damage to infrastructure. Thus, a probabilistic analysis of riprap failure that considers the different failure mechanisms under flood and sediment-transport uncertainties is required to assess embankment stability. In this article, a probabilistic assessment model based on the Monte Carlo simulation method, moment analysis methods, and the Rosenblueth point estimation method is presented to quantify the failure risk of riprap used as riverbank protection. The probability of failure in the different modes, namely direct block erosion, toe scouring, and overtopping, is defined by taking into account the river bed level variation due to bedload transport, described by a probabilistic function of the peak discharge. The comparison of the three models revealed good agreement (average deviation of less than 2%) in the estimated riprap failure probability. The model is a strategic tool for identifying critical river reaches and helps in preparing risk maps, so it can cover the engineering aspects of environmental stability in rivers protected with riprap.
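For one failure mode, Rosenblueth's point-estimate method is short enough to show in full: evaluate the performance function at the 2^n combinations of mean plus or minus one standard deviation, form its first two moments, and convert the resulting reliability index into a failure probability. The block-erosion performance function and statistics below are hypothetical, not the article's data.

```python
import numpy as np
from itertools import product
from scipy.stats import norm

# Rosenblueth's 2^n point-estimate method for an illustrative failure mode
# (direct block erosion): g = critical velocity of the block - acting flow velocity.
means = np.array([3.2, 2.4])    # [critical velocity, flow velocity] in m/s (hypothetical)
stds  = np.array([0.4, 0.5])

def g(x):
    v_crit, v_flow = x
    return v_crit - v_flow       # failure when g < 0

# Evaluate g at the 2^n points mu_i +/- sigma_i, each with weight 1/2^n
# (uncorrelated, symmetrically distributed variables assumed).
signs = np.array(list(product((-1.0, 1.0), repeat=len(means))))
values = np.array([g(means + s * stds) for s in signs])
w = 1.0 / len(values)

mean_g = np.sum(w * values)
std_g = np.sqrt(np.sum(w * values ** 2) - mean_g ** 2)

beta = mean_g / std_g            # second-moment reliability index
pf = norm.cdf(-beta)             # failure probability under a normal assumption for g
print(f"beta = {beta:.2f}, P_f = {pf:.3f}")
```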

20.
Introduction  To determine a seismic design standard, examine a seismic design or predict seismic damage, the parameters of ground motion …
