Similar Documents
20 similar documents found (search time: 687 ms)
1.
The development of robust risk assessment procedures for offshore oil and gas operations is a major element in assessing the potential feedback between planned activities and the environment. We illustrate a methodological and computational framework conducive to (1) a quantitative risk analysis of deepwater well barrier failures and subsequent hydrocarbon release to the environment and (2) the analysis of the value of deploying conventional and/or innovative mitigation measures. Our methodological framework is grounded in historical records and combines the use of Dynamic Event Trees and Decision Trees, from which we estimate the probability of occurrence and impact of post-blowout events. Each sequence of response actions, undertaken immediately after the event or in the subsequent days, is considered within the context of appropriately structured event paths. This approach is conducive to an estimate of the expected value of key decisions and underlying technologies, with an emphasis on their potential to reduce the oil spill volume, which can critically impact the environment. Our study yields an original comparative analysis of diverse intervention strategies, and forms a basis for guiding future efforts towards the development and deployment of technologies and operating procedures yielding maximum benefit in terms of safety of operations and environmental protection.
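The event-tree valuation described above can be sketched as a toy calculation. All probabilities, durations and flow rates below are invented for illustration; the two strategies and their branches are assumptions, not values from the study.

```python
# Toy dynamic-event-tree sketch: expected spill volume after a blowout,
# comparing two hypothetical intervention strategies.

FLOW_RATE = 50_000  # barrels/day, assumed constant release rate

def expected_spill(branches):
    """Expected spill volume over an event tree given as
    (probability, days_until_well_is_killed) branches."""
    assert abs(sum(p for p, _ in branches) - 1.0) < 1e-9
    return sum(p * days * FLOW_RATE for p, days in branches)

# Strategy A: rely on relief-well drilling only (slow but reliable).
relief_well = [(0.9, 90), (0.1, 120)]        # on-time vs. delayed success

# Strategy B: try a capping stack first, fall back to the relief well.
capping_stack = [(0.7, 15),                  # cap succeeds quickly
                 (0.27, 90), (0.03, 120)]    # cap fails -> relief well

vol_a = expected_spill(relief_well)
vol_b = expected_spill(capping_stack)
value_of_capping = vol_a - vol_b             # expected barrels avoided
print(vol_a, vol_b, value_of_capping)
```

The difference in expected spill volume is one way to express the "value of a key decision or technology" mentioned in the abstract.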

2.
Temporal and spatial distribution of dam failure events in China
Reservoirs play a vital role in economic development and flood control. Nevertheless, both human and natural factors may lead to dam failures with catastrophic consequences. Analyzing data on dam failure events from 1954 to 2003 with the method of energy spectrum analysis, this paper studies the periodicity of dam failures. The failure rate at different dam ages is analyzed; climate is found to be the main factor affecting the rate of dam failure. Climate diagrams are used to analyze the spatial distribution of dam failure events in China. High dam-failure rates recur with 25-year and 12.5-year periodicities. As a function of service age, the percentage of dam failures follows an L-shaped distribution: the first 5 years of operation, known as the "infant period", carry a much higher probability of failure than any later period. The failure rate in areas near or north of the 400 mm annual isopluvial line is notably higher than in other areas, and areas with large temperature differences among seasons have a high annual average dam failure rate.
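A minimal periodogram is one way to sketch the kind of energy-spectrum analysis applied to an annual failure-count series. The synthetic series below (a 25-year cycle plus noise over the 1954–2003 window) is an assumption for illustration only.

```python
# Periodogram sketch: detect a dominant periodicity in an annual series.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1954, 2004)                  # 50 annual values
counts = 10 + 5 * np.sin(2 * np.pi * years / 25.0) + rng.normal(0, 1, years.size)

# Periodogram: squared FFT amplitudes of the mean-removed series.
x = counts - counts.mean()
power = np.abs(np.fft.rfft(x)) ** 2
freqs = np.fft.rfftfreq(x.size, d=1.0)         # cycles per year

# Skip the zero-frequency bin when looking for the dominant cycle.
dominant_period = 1.0 / freqs[1:][np.argmax(power[1:])]
print(dominant_period)
```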

3.
Extreme high precipitation amounts are among environmental events with the most disastrous consequences for human society. This paper deals with the identification of ‘homogeneous regions’ according to statistical characteristics of precipitation extremes in the Czech Republic, i.e. the basic and most important step toward the regional frequency analysis. Precipitation totals measured at 78 stations over 1961–2000 are used as an input dataset. Preliminary candidate regions are formed by the cluster analysis of site characteristics, using the average-linkage clustering and Ward’s method. Several statistical tests for regional homogeneity are utilized, based on the 10-yr event and the variation of L-moment statistics. In compliance with results of the tests, the area of the Czech Republic has been divided into four homogeneous regions. The findings are supported by simulation experiments proposed to evaluate stability of the test results. Since the regions formed reflect also climatological differences in precipitation regimes and synoptic patterns causing high precipitation amounts, their future application may not be limited to the frequency analysis of extremes.
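The clustering step above can be sketched with standard tools. The eight fictitious stations and their characteristics (latitude, elevation, mean annual precipitation) are invented placeholders; the real study used 78 stations and then tested homogeneity with L-moment statistics, which is not reproduced here.

```python
# Sketch: candidate regions via Ward clustering of site characteristics.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import zscore

sites = np.array([
    [49.1, 200,  550], [49.2, 230,  560],   # lowland, drier
    [49.8, 400,  700], [49.9, 420,  710],   # uplands
    [50.6, 900, 1100], [50.7, 950, 1150],   # mountains, wet
    [50.0, 300,  620], [50.1, 310,  630],
])

# Standardize so no single characteristic dominates the distance metric.
Z = linkage(zscore(sites, axis=0), method="ward")
labels = fcluster(Z, t=3, criterion="maxclust")
print(labels)
```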

4.
Electrical conductivity mapping is a prerequisite tool for hydrogeological or environmental studies. Its interpretation still remains qualitative but advantages can be expected from a quantitative approach. However a full 3D interpretation is too laborious a task in comparison with the limited cost and time which are involved in the majority of such field studies. It is then of value to define the situations where lateral variations are sufficiently smooth for a 1D model to describe correctly the underlying features. For slingram conductivity measurements, criteria allowing an approximate 1D inversion are defined: these mainly consist of a limited rate of variation over three times the intercoil spacing. In geological contexts where the weathering has generated a conductive intermediate layer between the underlying sound rock and the soil, this processing can be applied to determine the thickness of the conductive layer from the apparent resistivity map when the other geoelectrical parameters are known. The examples presented illustrate this application.

5.
We discuss the methodology to use in searching for statistical relationships between catastrophic events such as earthquakes and volcanic eruptions on the one hand and astronomical or geographic data on the other. It is pointed out that data must be normalized before any statistical probabilistic analysis. Examples of studies are provided whose authors arrive at demonstrably false inferences owing to the absence of correct normalization.

6.
Field-based palaeoflood event reconstruction has the potential to contribute to our understanding of long-term landscape evolution. However, the reconstruction of past flow event histories (magnitude and frequency) over long (Quaternary) timescales is fraught with difficulties. Here we make a preliminary exploration of some of the practicalities of flood reconstruction from fluvial terrace archives, using commonly available sedimentological and geomorphological observations from a field perspective. We utilize the Manning equation and palaeostage indicators to reconstruct historic events that can be used as benchmarks for a lesser-used competence-based approach, which is applied to coarse-grained strath terrace deposits. We evaluate the results against gauged records for extreme and catastrophic events that affected the same region in 1973 and 2012. The findings suggest that the competence approach is most effectively applied to terrace deposits if the channel geometry is taken into account when sampling, both in cross-section and in longitudinal section, and calibrated against the sedimentology for palaeo-flow depth. Problems can arise where constrictive channel geometries allow boulder jams to develop, acting as sediment traps for the coarsest material and leading to downstream ‘boulder starvation’. Useful sites to target for palaeoflood reconstruction, therefore, lie upstream of such constrictive reaches, where the coarsest transportable bedload has been effectively trapped. Sites to avoid lie downflow, where the deposited material would poorly represent palaeoflood competence. Underestimation arising from maximum-boulder preservation and limited section exposure appears to outweigh possible overestimation concerns related to fluid density and unsteady flow characteristics such as instantaneous acceleration forces. Flood data derived from river terrace deposits suggest that, in this environmental setting, basal terrace geometries and the coarse boulder lags common to many terrace sequences are likely the result of extreme flow events that were subsequently infilled by lesser-magnitude flood events. Copyright © 2016 John Wiley & Sons, Ltd.
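The Manning-equation step of such a reconstruction can be sketched as follows. The rectangular channel geometry, roughness n and slope S are illustrative assumptions, not values from the study.

```python
# Manning-equation sketch: palaeoflood discharge from channel geometry
# and a palaeostage indicator implying a flow depth.
import math

def manning_discharge(width, depth, slope, n):
    """Q = (1/n) * A * R^(2/3) * S^(1/2) for a rectangular cross-section."""
    area = width * depth                        # flow area A (m^2)
    wetted_perimeter = width + 2 * depth
    R = area / wetted_perimeter                 # hydraulic radius (m)
    return (1.0 / n) * area * R ** (2.0 / 3.0) * math.sqrt(slope)

# Assumed: palaeostage indicator implies ~4 m depth in a 25 m wide reach.
Q = manning_discharge(width=25.0, depth=4.0, slope=0.002, n=0.035)
print(round(Q, 1))  # discharge in m^3/s
```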

7.
Earthquake hazard parameters are estimated by the application of the maximum likelihood method. The technique is based on a procedure which utilizes data of different quality, e.g., those in which the uncertainty in the assessment of the magnitudes is great and those in which the magnitudes are computed with great precision. In other words, the data were extracted from both historical (incomplete) and recorded (complete) files. The historical part of the catalogue contains only the strongest events, whereas the complete part can be divided into several sub-catalogues, each assumed to be complete above a specified magnitude threshold. Uncertainty in the determination of magnitudes has also been taken into account. The method allows us to estimate the earthquake hazard parameters: the maximum regional magnitude, Mmax; the activity rate, λ, of the seismic events; and the well-known value β (b = β·log10e), where b is the slope of the magnitude-frequency relationship. All these parameters are of physical significance. The mean return periods, RP, of earthquakes with magnitude M ≥ m are also determined. The method is applied to the island of Crete and the adjacent area, where catastrophic earthquakes are known from the historical era. The area is divided in a cellular manner, which allows the analysis of localized hazard parameters and the representation of their regional variation. Finally, the seismic hazard for shallow events is expressed by (a) the annual probability of exceedance of a specified magnitude and (b) the return periods (in years) expected for given magnitudes. This hazard analysis is useful for both theoretical and practical reasons and provides a tool for earthquake-resistant design in areas of both low and high seismicity.
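For the complete part of such a catalogue, the maximum-likelihood estimate of β (the Aki/Utsu estimator) and the resulting return periods can be sketched on a synthetic catalogue. The threshold magnitude, activity rate and true β below are assumed values.

```python
# ML estimate of the magnitude-frequency slope from a complete catalogue.
import math, random

random.seed(1)
m_min = 4.0
beta_true = 2.0                                    # assumed, i.e. b ≈ 0.87
# Above m_min, magnitudes follow an exponential law with rate beta.
mags = [m_min + random.expovariate(beta_true) for _ in range(5000)]

# Aki/Utsu ML estimator: beta_hat = 1 / (mean(M) - m_min); b = beta*log10(e).
beta_hat = 1.0 / (sum(mags) / len(mags) - m_min)
b_hat = beta_hat * math.log10(math.e)

# Mean return period of events with M >= m under the exponential model,
# given an assumed activity rate lam (events/year) above m_min.
lam = 3.0
def return_period(m):
    return 1.0 / (lam * math.exp(-beta_hat * (m - m_min)))

print(b_hat, return_period(6.0))
```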

8.
This work aims to provide a dynamic assessment of flood risk and community resilience by explicitly accounting for variable human behaviour, e.g. risk-taking and awareness-raising attitudes. We consider two different types of socio-hydrological systems: green systems, whereby societies deal with risk only via non-structural measures, and technological systems, whereby risk is dealt with also by structural measures, such as levees. A stylized model of human–flood interactions is first compared to real-world data collected at two test sites (People’s Republic of Bangladesh and the city of Rome, Italy) and then used to explore plausible trajectories of flood risk. The results show that flood risk in technological systems tends to be significantly lower than in green systems. However, technological systems may undergo catastrophic events, which lead to much higher losses. Furthermore, green systems prove to be more resilient than technological ones, which makes them more capable of withstanding environmental and social changes.

9.
According to UK Government surveys, concern for the environment is growing. Environmental regulation of the industry is becoming wider in its scope and tougher in its implementation. Various techniques are available to assess how the industry can drive down its environmental impact and comply with environmental regulation. Environmental Assessments (EA) required by European law do not cover the whole life cycle of the project they analyse. Life Cycle Analysis (LCA) was developed to assess the environmental loadings of a product, process or activity over its entire life cycle. It was the first technique used in environmental analysis to adopt what was described as a holistic approach; however, it falls short of this approach by assessing neither accidental emissions nor indirect environmental impacts. Cost Benefit Analysis (CBA) offers the opportunity to value environmental effects and appraise a project on the basis of costs and benefits. Not all environmental effects can be valued, and for those that can, there is considerable uncertainty in their valuation and occurrence. CBA cannot satisfactorily measure the total environmental risk of a project. Consequently there is a need for a technique that overcomes the failures of project-level EA, LCA and CBA, and assesses total environmental risk. Many organizations, such as the British Medical Association, the European Oilfield Speciality Chemicals Association, the Royal Ministry of Petroleum and Energy (Norway) and Shell Expro, now recognize that a holistic approach is an integral part of assessing total risk. The Brent SPAR case study highlights the interdisciplinary nature required of any environmental analysis. Holistic Environmental Assessment is recommended as such an environmental analysis.

10.
The development of large ring lasers has made accurate detection of rotational seismic waves possible over a wide range of amplitudes and frequencies. Due to their insensitivity to translational motion, optical angular rate sensors are very attractive for applications in seismology, geodesy, and even fundamental physics. However, the operation of large ring lasers in the near field is difficult due to their mass, size, and environmental requirements. Hence, fiber-optic gyros may be used for seismic applications where mobility is more important and where high rotation rates are expected. Such sensors can also be used to correct standard seismometers subjected to tilt. In this paper we present the current state of experimental research dedicated to the application of fiber-optic gyros in seismology. The test results demonstrate that tactical-grade optical sensors are capable of measuring small rotation fluctuations down to 10⁻⁵ rad/s. However, seismometer correction seems feasible only for rotation rates of about 10⁻³ rad/s and higher. This limitation is caused by the excessive noise in the output of the fiber-optic gyro. Possible measures to overcome this problem are discussed, as well as the advantages of optical gyros for strong-motion seismology.

11.
Data from the fluxgate magnetometer used in the China–US cooperative project (SMALL) from January to June 2001 were compared with data from a magnetic variometer over the same period. The results show that the diurnal variation patterns, diurnal amplitudes, extreme values, and the times of those extremes recorded by the two instruments agree well, indicating that the fluxgate magnetometer data from this project are reliable. Typical geomagnetic pulsation events presented here show that the instrument can provide precisely timed, high-quality geomagnetic data.

12.
This paper summarizes the latest developments, future prospects, and proposed countermeasures concerning reservoir sedimentation and channel scour downstream of the Three Gorges Reservoir (TGR) on the Yangtze River in China. Three key results have been found. (1) The incoming sediment load to the TGR has been significantly lower than expected. (2) The accumulated volume of sediment deposition in the TGR is smaller than expected because the overall sediment delivery ratio is relatively low, and deposition in the near-dam area of the TGR is still developing. (3) River bed scour in the reaches downstream of the Gezhouba Dam is still occurring, and channel scour has extended as far downstream as the Hukou reach. Significantly, although sedimentation of the TGR has been less problematic than expected since the start of operation, more attention should be paid to possible increases in sediment risks from dependence on upstream sediment control, deposition in the reservoir, and scour along the middle Yangtze River. (1) Sediment trapped by dams built along the upper Yangtze River, and the billions of tons of loose material on unstable slopes produced by the Wenchuan Earthquake, could become new sediment sources for the upper Yangtze River. More seriously, possible release of this sediment into the upper Yangtze River due to new earthquakes or extreme climate events could overwhelm the river system and produce catastrophic consequences. (2) Increasing sediment deposition in the TGR is harmful to the safety and efficiency of project operation and navigation. (3) The drastic scour along the middle Yangtze River has intensified down-cutting of the riverbed and erosion of revetments, and has already increased the risk to flood control structures and ecological safety. It is suggested to continue the Field Observation Program, to initiate research programs, and to focus on sedimentation risks.

13.
For the UK continental shelf we developed a Bayesian Belief Network-GIS framework to visualise relationships between cumulative human pressures, sensitive marine landscapes, and landscape vulnerability; to assess the consequences of potential marine planning objectives; and to map uncertainty-related changes in management measures. Results revealed that the spatial assessment of footprints and intensities of human activities had more influence on landscape vulnerabilities than the type of landscape sensitivity measure used. We addressed questions regarding the consequences of potential planning targets and the necessary management measures with a spatially explicit assessment of their consequences. We conclude that the BN-GIS framework is a practical tool that allows the visualisation of relationships, the spatial assessment of uncertainty related to spatial management scenarios, and the engagement of different stakeholder views, and that enables quick updates with new spatial data and relationships. Ultimately, such BN-GIS based tools can support the decision-making process used in adaptive marine management.
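The core Bayesian-network computation behind such a framework can be sketched in miniature: a pressure node and a landscape-sensitivity node feed a vulnerability node through a conditional probability table. All probabilities below are invented placeholders, not values from the study.

```python
# Minimal Bayesian-network-style sketch: marginal vulnerability from a
# pressure node, a sensitivity node, and an assumed CPT.

p_pressure_high = 0.3
p_sensitive = 0.5

# P(vulnerable | pressure_high, sensitive) — an assumed CPT.
cpt = {(True, True): 0.9, (True, False): 0.4,
       (False, True): 0.3, (False, False): 0.05}

# Marginalize over both parent nodes.
p_vulnerable = sum(
    cpt[(hp, sens)]
    * (p_pressure_high if hp else 1 - p_pressure_high)
    * (p_sensitive if sens else 1 - p_sensitive)
    for hp in (True, False) for sens in (True, False)
)
print(p_vulnerable)
```

In the real framework each grid cell of the GIS layer would carry its own pressure and sensitivity inputs, yielding a vulnerability map rather than a single number.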

14.
Spatial modeling of drought events using max-stable processes
With their severe environmental and socioeconomic impacts, drought events rank among the most far-reaching natural disasters. The effects are tremendous in rain-fed agricultural areas, such as those in Africa. We analyzed and modeled the spatio-temporal statistical behavior of the Normalized Difference Vegetation Index as a risk indicator for drought, reflecting its stochastic effects on vegetation. The study used a data set for Rwanda obtained from multitemporal satellite remote sensor measurements during a 14-year period and divided into season-specific spatial random fields. Maximal deviations from average conditions were modeled with max-stable Brown–Resnick processes, taking methodological and computational challenges into account. These challenges are caused by the large spatial extent and the relatively short time span covered by the data. Extensive simulations enabled us to go beyond the observations and thus to estimate several important characteristics of extreme drought events, such as their expected return period.
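A full spatial Brown–Resnick fit is beyond a short sketch, but the single-site analogue of the return-period estimate (fitting a univariate GEV to block maxima and inverting it for a return level) can be illustrated on synthetic data. The GEV parameters and sample below are assumptions for illustration only.

```python
# Univariate GEV sketch: return levels from synthetic seasonal maxima.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)
# Synthetic block maxima (e.g. seasonal maximum NDVI deviation).
maxima = genextreme.rvs(c=-0.1, loc=10.0, scale=2.0, size=500, random_state=rng)

c, loc, scale = genextreme.fit(maxima)

def return_level(m):
    """Value exceeded on average once every m blocks (m-block return level)."""
    return genextreme.ppf(1.0 - 1.0 / m, c, loc=loc, scale=scale)

print(return_level(20), return_level(100))
```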

15.
Unpreparedness is often the main cause of the economic and social damages caused by floods. To mitigate these impacts, short-term forecasting has been the focus of several studies during the past decades; however, less attention has been paid to flood predictions at longer lead times. Here, we use forecasts by six models from the North American Multi-Model Ensemble project with lead times from 0.5 to 9.5 months to predict the seasonal duration of floods above four National Weather Service flood categories (“action,” “flood,” “moderate” and “major”). We focus on 202 U.S. Geological Survey gage stations across the U.S. Midwest and use a statistical framework which considers precipitation, temperature, and antecedent wetness conditions as predictors. We find that the prediction skill for the duration of floods in the “action” and “flood” categories is overall low, largely because of the low accuracy of the climate forecasts rather than the errors introduced by the statistical models. The prediction skill slightly improves at the shortest lead times (i.e., from 0.5 to 2.5 months) during spring in the Northern Great Plains, where antecedent wetness conditions play an important role in the generation of floods. It is very difficult to draw strong conclusions with respect to the “moderate” and “major” flood categories because of the limited number of available events.

16.
This paper develops a new method for decision-making under uncertainty. The method, Bayesian Programming (BP), addresses a class of two-stage decision problems with features that are common in environmental and water resources applications. BP is applicable to two-stage combinatorial problems characterized by uncertainty in unobservable parameters, only some of which is resolved upon observation of the outcome of the first-stage decision. The framework also naturally accommodates stochastic behavior, which has the effect of impeding uncertainty resolution. With the incorporation of systematic methods for decision search and Monte Carlo methods for Bayesian analysis, BP addresses limitations of other decision-analytic approaches for this class of problems, including conventional decision tree analysis and stochastic programming. The methodology is demonstrated with an illustrative problem of water quality pollution control. Its effectiveness for this problem is compared to alternative approaches, including a single-stage model in which expected costs are minimized and a deterministic model in which uncertain parameters are replaced by their mean values. A new term, the expected value of including uncertainty resolution, or EVIUR, is introduced and evaluated for the illustrative problem. It is a measure of the worth of incorporating the experimental value of decisions into an optimal decision-making framework. For the illustrative problem, the two-stage adaptive management framework extracted up to approximately 50% of the gains of perfect information. The strengths and limitations of the method are discussed and conclusions are presented.
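The EVIUR idea can be illustrated with a toy two-stage problem: compare committing to one decision under the prior against starting small, observing the state, and then upgrading only if needed. All states, costs and probabilities below are invented for illustration and are not from the paper's water-quality example.

```python
# Toy EVIUR sketch: single-stage vs. two-stage adaptive decision.

p_high = 0.4                          # prior probability pollution is "high"
states = {"high": p_high, "low": 1 - p_high}

# Stage-1 construction cost by decision, then stage-2 damages/retrofits.
cost1 = {"large": {"high": 10, "low": 10},
         "small": {"high": 4,  "low": 4}}
cost2 = {"large": {"high": 0,  "low": 0},
         "small": {"high": 12, "low": 0}}   # emergency retrofit if small+high

def expected(d):
    return sum(p * (cost1[d][s] + cost2[d][s]) for s, p in states.items())

# Single-stage: commit to one decision under the prior.
single_stage = min(expected(d) for d in cost1)

# Two-stage: build small, observe the state, then upgrade only if needed
# (a planned upgrade after observation is assumed cheaper: cost 7, not 12).
adaptive = cost1["small"]["low"] + p_high * 7
eviur = single_stage - adaptive       # value of including uncertainty resolution
print(single_stage, adaptive, eviur)
```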

17.
In most groundwater applications, measurements of concentration are limited in number and sparsely distributed within the domain of interest. Therefore, interpolation techniques are needed to obtain most likely values of concentration at locations where no measurements are available. For further processing, for example, in environmental risk analysis, interpolated values should be given with uncertainty bounds, so that a geostatistical framework is preferable. Linear interpolation of steady-state concentration measurements is problematic because the dependence of concentration on the primary uncertain material property, the hydraulic conductivity field, is highly nonlinear, suggesting that the statistical interrelationship between concentration values at different points is also nonlinear. We suggest interpolating steady-state concentration measurements by conditioning an ensemble of the underlying log-conductivity field on the available hydrological data in a conditional Monte Carlo approach. Flow and transport simulations for each conditional conductivity field must meet the measurements within their given uncertainty. The ensemble of transport simulations based on the conditional log-conductivity fields yields conditional statistical distributions of concentration at points between observation points. This method implicitly meets physical bounds of concentration values and non-Gaussianity of their statistical distributions and obeys the nonlinearity of the underlying processes. We validate our method by artificial test cases and compare the results to kriging estimates assuming different conditional statistical distributions of concentration. Assuming a beta distribution in kriging leads to estimates of concentration with zero probability of concentrations below zero or above the maximal possible value; however, the concentrations are not forced to meet the advection-dispersion equation.
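The conditioning idea can be sketched with a deliberately simple accept/reject scheme: keep only those random log-conductivity draws whose simulated concentration matches the measurement within its uncertainty, then read off the conditional distribution at an unobserved point. The one-parameter "forward model" below is a stand-in assumption, not the flow-and-transport solver of the paper.

```python
# Conditional Monte Carlo sketch: rejection sampling on a toy forward model.
import math, random

random.seed(7)

def concentration(logK, x):
    # Toy forward model: concentration decays with distance at a rate
    # controlled by the conductivity (higher K -> slower decay).
    return math.exp(-x / math.exp(logK))

c_obs, sigma, x_obs, x_pred = 0.6, 0.05, 1.0, 2.0

ensemble = []
while len(ensemble) < 500:
    logK = random.gauss(0.0, 1.0)              # prior on log-conductivity
    if abs(concentration(logK, x_obs) - c_obs) < 2 * sigma:  # accept/reject
        ensemble.append(concentration(logK, x_pred))

mean_pred = sum(ensemble) / len(ensemble)      # conditional mean at x_pred
print(mean_pred)
```

Because every ensemble member comes from the forward model, the predicted values automatically respect physical bounds, unlike a generic kriging estimate.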

18.
Using basic geographic data for the Beijing–Tianjin–Hebei region, localized rapid earthquake-disaster assessment models, localized decision-support templates, and offline rapid assessment and emergency mapping technologies, we implemented a localized earthquake emergency work platform for Beijing that supports both Web and mobile clients. To meet the needs of remote emergency response and mobile work, the platform provides Android-based functions including rapid earthquake triggering, rapid disaster assessment, decision support, automatic information push, emergency command and dispatch, routine operation and maintenance management, and comprehensive information query. During earthquake emergency response, the platform removes the constraints of location and time, immediately producing rapid assessment reports, decision-support reports, and thematic emergency maps, and thereby provides scientific and efficient information services for Beijing's earthquake emergency command decisions. Since entering operation, the platform has proved effective in numerous earthquake emergency responses.

19.
We advance our own definitions of the following terms: catastrophic volcanic eruption (CE), catastrophic supereruption (CSE), and different-rank and different-type episodes and phases of volcanic catastrophism (VC). All eruptions are subdivided into three classes according to the volume and weight of the erupted and transported (juvenile and resurgent) material, whatever its chemical composition: class I (>0.5 km³), class II (≥5 km³), and class III, or supereruptions (>50 km³). We characterize the types and varieties of CEs and CSEs, most of which are the main components of identified VC episodes and phases. The primary phenomena to be considered include catastrophic events of the 19th to 21st centuries, not only in the Kuril–Kamchatka region but also in other volcanic areas. These events have been studied in detail by modern methods and can serve as approximate models for reconstructing similar past events, especially regarding their dynamics, productivity, and catastrophic impact.

20.
The mechanical and statistical characteristics of acoustic emission (AE) events during stable sliding are investigated through a laboratory experiment using a granite specimen with a pre-cut fault. Numerous AE events are found to be generated on the pre-cut fault, indicating that microscopically unstable fracture occurs during macroscopically stable sliding. The composite focal mechanism solution of AE events is determined from the first-motion directions of P-waves. The determined mechanism is consistent with the double-couple mechanism expected for slip on the pre-cut fault. The source radii of large AE events are estimated to be about 10 mm from the widths of the first P-wave pulses. These results indicate that the AE events are generated by shear fracture whose faulting area is part of the pre-cut fault plane. The occurrence of AE events as a stochastic process approximately obeys the Poisson process, once the effect of mutually dependent events constituting clusters is corrected for. The observed amplitude-frequency relation of AE events approximately follows a power law over a limited amplitude range. As the macroscopic sliding rate increases, the number of AE events per unit sliding distance decreases. This rate dependence of the AE activity is qualitatively consistent with the observation, reported in the literature, that the real area of contact between sliding surfaces decreases with an increase in sliding rate.
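The two statistical checks mentioned above can be sketched on a synthetic AE catalogue: exponential (Poisson) inter-event times, and a power-law amplitude-frequency relation fitted by maximum likelihood. The event rate and power-law exponent below are assumed values for illustration.

```python
# Poisson and power-law checks on a synthetic AE catalogue.
import math, random

random.seed(3)
n = 4000
rate = 2.0                                        # assumed events per second
waits = [random.expovariate(rate) for _ in range(n)]
amps = [random.paretovariate(1.5) for _ in range(n)]  # power-law amplitudes

# Poisson check: exponential waiting times have coefficient of variation ≈ 1.
mean_w = sum(waits) / n
var_w = sum((w - mean_w) ** 2 for w in waits) / n
cv = math.sqrt(var_w) / mean_w

# Power-law exponent via the Hill (maximum-likelihood) estimator,
# valid here because the synthetic amplitudes have lower bound 1.
alpha_hat = n / sum(math.log(a) for a in amps)
print(cv, alpha_hat)
```

Clustered (mutually dependent) events would push the coefficient of variation above 1, which is why the abstract corrects for clusters before the Poisson comparison.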


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号