Similar Documents
15 similar documents found (search time: 27 ms)
1.
Ivan G. Wong 《Natural Hazards》2014,72(3):1299-1309
The occurrence of several recent “extreme” earthquakes, with their significant loss of life, and the apparent lack of preparedness for such disasters have raised the question of whether such events are accounted for in modern seismic hazard analyses. In light of the great 2011 Tohoku-Oki earthquake, were the questions of “how big, how bad, and how often” addressed in probabilistic seismic hazard analyses (PSHA) in Japan, one of the most earthquake-prone but also most earthquake-prepared countries in the world? Guidance on how to properly perform PSHAs exists but may not be followed, for a whole range of reasons, not all of them technical. One of the major emphases of these guidelines is the recognition that there are significant uncertainties in our knowledge of earthquake processes, and that these uncertainties need to be fully incorporated into PSHAs. If such uncertainties are properly accounted for, extreme events can be captured more often than not. This is not to say that no surprises will occur; that is the nature of trying to characterize a natural process such as earthquake generation, whose properties also have random (aleatory) uncertainties. It must be stressed that no PSHA is ever final, because new information and data need to be continuously monitored and addressed, often requiring an updated PSHA.
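The aleatory-uncertainty point is concrete: a PSHA hazard curve combines scenario recurrence rates with the lognormal scatter of a ground-motion model. A minimal sketch, with entirely hypothetical scenario rates, median PGAs, and sigma (not values from any published study):

```python
import math

def p_exceed(a, median, sigma_ln):
    """P(ground motion > a) under a lognormal ground-motion model
    (the aleatory-uncertainty term)."""
    z = (math.log(a) - math.log(median)) / sigma_ln
    return 0.5 * math.erfc(z / math.sqrt(2))

# Hypothetical earthquake scenarios at a site: (annual rate, median PGA in g)
scenarios = [(0.05, 0.10), (0.01, 0.25), (0.001, 0.60)]

sigma = 0.6   # assumed lognormal standard deviation of the ground-motion model
target = 0.3  # PGA level of interest, in g

# Total annual exceedance rate: sum of rate x conditional exceedance probability
lam = sum(rate * p_exceed(target, med, sigma) for rate, med in scenarios)

# Poisson probability of at least one exceedance in a 50-year exposure window
p50 = 1 - math.exp(-lam * 50)
print(lam, p50)
```

Note how the rare, large scenario still contributes: with the lognormal tail included, even a median well below the target level yields a non-negligible exceedance probability, which is exactly how properly treated uncertainty lets a PSHA "capture" extreme events.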

2.
Hydrogeology Journal - There is a pressing need to improve public administration of groundwater abstraction and use, given the global need to achieve sustainable resource exploitation and to reduce...

3.
Much of the coal consumed in the USA since the end of the nineteenth century has been produced from the Pennsylvanian strata of the Appalachian basin. Even though the quantities mined then were smaller than they are today, this basin yielded 70–80% of the nation's annual coal production from the end of the nineteenth century until the early 1970s. During the last 25 years, the proportion of the nation's coal produced annually from the Appalachian basin has declined markedly, and today it is only about 40% of the total. The amount of coal produced annually in the Appalachian basin, however, has been rising slowly over the last several decades, and has generally ranged from 400 to 500 million tons (Mt) per year. A large proportion of Appalachian historical production has come from relatively few counties in southwestern Pennsylvania, northern and southern West Virginia, eastern Kentucky, Virginia and Alabama. Many of these counties are decades past their years of peak production, and several are almost depleted of economic deposits of coal. Because the current major consumer of Appalachian coal is the electric power industry, coal quality, especially sulfur content, has a great impact on marketability. High-sulfur coal deposits in western Pennsylvania and Ohio are in low demand compared with the lower-sulfur coals of Virginia and southern West Virginia. Only seven counties in the basin that have produced 500 Mt or more exhibit increasing rates of production at relatively high levels. Of these, six are in the central part of the basin and only one, Greene County, Pennsylvania, is in the northern part. Decline rate models, based on production decline rates and the decline rate of the estimated “potential” reserve, indicate that Appalachian basin annual coal production will be 200 Mt or less by the middle of the twenty-first century.
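The decline-rate projection can be illustrated with a simple exponential decline model. The initial production and the decline rate below are illustrative assumptions chosen to match the ranges quoted in the abstract, not the study's fitted parameters:

```python
import math

# Exponential decline model: P(t) = P0 * exp(-d * t)
P0 = 450.0  # assumed current annual production, Mt (mid-range of 400-500 Mt)
d = 0.015   # assumed effective decline rate per year (hypothetical)

def production(years_ahead):
    """Projected annual production (Mt) after the given number of years."""
    return P0 * math.exp(-d * years_ahead)

# Years until annual production falls to 200 Mt: solve P0*exp(-d*t) = 200
t_200 = math.log(P0 / 200.0) / d
print(round(t_200))
```

With these assumed values the 200 Mt threshold is crossed roughly five decades out, consistent with the abstract's "middle of the twenty-first century" projection.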

4.
Applied flood risk analyses, especially in urban areas, very often pose the question of how detailed the analysis needs to be in order to give a realistic figure of the expected risk. The methods used in research and in practical applications range from very basic approaches with numerous simplifying assumptions up to very sophisticated, data- and computation-intensive applications, on both the hazard and the vulnerability side of the risk. In order to shed some light on the question of the model complexity required for flood risk analyses whose outputs sufficiently fulfil the task at hand, a number of combinations of models of different complexity on both the hazard and the vulnerability side were tested in a case study. The different models can be organized in a model matrix of complexity levels. On the hazard side, the approaches/models selected were (A) linear interpolation of gauge water levels intersected with a digital elevation model (DEM), (B) a mixed 1D/2D hydraulic model with simplifying assumptions (LISFLOOD-FP) and (C) a 2D zero-inertia hydraulic model based on the Saint-Venant equations, considering the built environment and infrastructure. On the vulnerability side, the models used for the estimation of direct damage to residential buildings are, in order of increasing complexity: (I) meso-scale stage-damage functions applied to CORINE land cover data, (II) the rule-based meso-scale model FLEMOps+ using census data on the municipal building stock and CORINE land cover data and (III) a rule-based micro-scale model applied to a detailed building inventory. Besides the inundation depths, the latter two models consider different building types and qualities as well as the level of private precaution and contamination of the floodwater. The models were applied to Eilenburg, a municipality in eastern Germany. It suffered extraordinary damage during the flood of August 2002, which was well documented, as were the inundation extent and depths.
These data provide an almost unique data set for the validation of flood risk analyses. The analysis shows that the combination of the 1D/2D hydraulic model and the meso-scale damage model FLEMOps+ performed best, providing the best compromise between data requirements, simulation effort and accuracy of the results. The more detailed approaches suffered from complex model set-up, high data requirements and long computation times.
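The simplest vulnerability model in the matrix, a meso-scale stage-damage function, amounts to interpolating a depth-damage lookup table and multiplying by the exposed asset value. The curve and building value below are hypothetical illustrations, not the FLEMOps+ coefficients or any CORINE-derived values:

```python
def damage_fraction(depth_m, curve):
    """Relative damage at a given inundation depth, by linear interpolation
    within a (depth [m], damage fraction) lookup table."""
    pts = sorted(curve)
    if depth_m <= pts[0][0]:
        return pts[0][1]
    for (d0, f0), (d1, f1) in zip(pts, pts[1:]):
        if depth_m <= d1:
            return f0 + (f1 - f0) * (depth_m - d0) / (d1 - d0)
    return pts[-1][1]  # clamp beyond the last tabulated depth

# Hypothetical residential stage-damage curve
residential = [(0.0, 0.0), (0.5, 0.10), (1.0, 0.22), (2.0, 0.40), (4.0, 0.70)]

# Expected direct damage for one building at 1.5 m inundation depth
loss = damage_fraction(1.5, residential) * 250_000  # assumed building value, EUR
print(loss)
```

The more complex models in the matrix refine exactly this step, conditioning the damage fraction on building type, quality, precaution and contamination rather than on depth alone.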

5.
Warning systems are increasingly applied to reduce damage caused by different magnitudes of rockslides and rockfalls. In an integrated risk-management approach, the optimal risk mitigation strategy is identified by comparing the achieved effectiveness and cost; estimating the reliability of the warning system is the basis for such considerations. Here, we calculate the reliability and effectiveness of the warning system installed in Preonzo prior to a major rockfall in May 2012. “Reliability” is defined as the ability of the warning system to forecast the hazard event and to prevent damage. To be cost-effective, the warning system should forecast an event with a limited number of false alarms to avoid unnecessary costs for intervention measures. The analysis shows that to be reliable, warning systems should be designed as fail-safe constructions. They should incorporate components with low failure probabilities, high redundancy, low warning thresholds, and additional control systems. In addition, the experts operating the warning system should have limited risk tolerance. In an additional hypothetical probabilistic analysis, we investigate the effect of the risk attitude of the decision makers and of the number of sensors on the probability of detecting the event and initiating a timely evacuation, as well as on the related intervention cost. The analysis demonstrates that quantitative assessments can support the identification of optimal warning system designs and decision criteria.
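The redundancy trade-off the abstract describes can be sketched with a binomial model: more sensors raise the detection probability, while a stricter voting rule (more sensors required to trigger) suppresses false alarms at a small cost in detection. The per-sensor probability below is an assumed value, not a figure from the Preonzo system:

```python
from math import comb

def p_detect(p_sensor, n, k=1):
    """P(at least k of n independent, identical sensors trigger),
    computed as a binomial tail sum."""
    return sum(comb(n, i) * p_sensor**i * (1 - p_sensor)**(n - i)
               for i in range(k, n + 1))

p = 0.9  # assumed per-sensor probability of detecting the event

print(p_detect(p, 1))      # single sensor, no redundancy
print(p_detect(p, 3))      # 1-of-3 redundancy: detection improves sharply
print(p_detect(p, 3, 2))   # 2-of-3 voting: fewer false alarms, slight loss
```

The same binomial tail, applied to per-sensor false-alarm probabilities instead, quantifies the unnecessary-intervention cost side of the design problem.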

6.
The optimal delay time between the contour holes in rock blasting has been studied by theoretical and empirical research in Sweden, regarding ground vibrations, increase in crack frequency, radial crack length and finally overbreak (half cast factor). The model test presented in this paper concerns controlled contour blasting in tunnelling, and the full-scale blasts concern tunnelling, road cutting, and dimensional stone quarrying. The results indicate that the micro-sequential contour blasting technique (contour holes fired in sequence with a delay on the order of 1–2 ms) is superior to simultaneous initiation regarding both blast-induced ground vibrations and crack frequency increase in the rock mass. Both of these evaluation methods reflect the conditions deeper in the remaining rock mass. Simultaneous initiation, however, is superior to micro-sequential contour blasting regarding both the half cast factor and the length of radial cracks emanating from the blastholes. These two parameters are more related to the surface conditions after blasting. The industrial applications of this new knowledge are the use of micro-sequential contour blasting when ground vibrations are of greater concern than the contour, for example, in trench blasting or quarrying in urban areas, and the use of simultaneous initiation when an even rock surface is of high priority.

7.
Vink  Ryanne  Varró  Krisztina 《GeoJournal》2021,86(2):963-978
GeoJournal - Building on the insights of scholarship highlighting specific aspects of the forming of place meanings and experiences during running (events), this paper aims at applying a more...

8.
9.
Gillian Rose 《Geoforum》2009,40(1):46-54
The paper begins with the recent interest in what Mbembe has called ‘necropolitics’: the politics of the distribution of life and death by modern sovereign states. The necropolitical works through the production of spatialities, visualities, and bodies that are classed, racialised and gendered in particular ways. The paper explores a series of such productions in its discussion of the British press coverage of the bombs that exploded on London’s public transport system on 7 July 2005, and in particular the photographs used by the newspapers. It argues that the newspapers pictured bodies as gendered and racialised, with the former fixed and visible much more clearly than the latter. It further argues that the newspapers differentiated between various bodies by assuming that a certain sort of care was deserved only by some of those involved in the bombings. Finally, the paper examines how the coverage worked to place the readers of the newspapers in a specific position in the necropolitical order of power as citizens who care only for certain people, and in a particular way. The paper concludes by considering the implications of that specific caring for contemporary necropolitics and its visualities and spatialities.

10.
Steinmann, then professor of geology at Freiburg (Germany), more than 100 years ago wondered about the southern end of the extensional Rhinegraben and proposed that elements of the graben penetrated the contractional Jura. In particular, he recognized the “Schwarzwaldlinie” in the southern prolongation of the eastern border of the southern Rhinegraben, a line-up of topographic as well as structural irregularities. He conjectured that it was caused by normal faults of the Rhinegraben system. Subsequently, 100 years ago, Buxtorf (1907) proposed the hypothesis that the Jura was a thin-skinned nappe sheared off on Triassic evaporites. In the autochthonous basement underneath the wrinkled skin, the “Schwarzwald line” is difficult to define. It probably consists of a gentle flexure punctuated by faults that approximately coincides with Steinmann’s original projection, although he sought to identify its constituent faults in the badly deformed allochthonous skin. Current data place the thin-skin elements of the Schwarzwald line in a more westerly, allochthonous position where most of them were reactivated into sinistrally transpressional structures.

11.
This article interrogates how social media can provide a platform for contesting dominant discourses. It does so through the lens of competitive eating, demonstrating that amateur competitive eaters use social media sites to challenge and subvert mass media representations of their sport while concomitantly upholding normative notions of healthy eating and bodies. Competitors consider themselves to be skilful athletes who discipline and train their bodies to eat. They regard their eating practices, which are often depicted in the mass media as uncontrolled and gluttonous, as controlled ingestion, and present an alternative perspective of their ‘sport’ – a perspective that stresses health, physical expertise and a fit, trained body over voracity and insatiability. Social media acts as a ‘precipitating agency’ for the creation of these alternative definitions of disciplined eating, as well as the construction of new digital eating identities. Instead of focusing on the food being ingested and the ‘Carnivalesque’ practice of competitive eating, we draw attention to the performers’ voices and the ways they attend to the mechanics of gurgitation, including methods of chewing, swallowing and stomach stretching, and their ability to manage, regulate and operate ingestivity. As hegemonic discourses align the notion of ‘good eating’ with discipline, order and restraint, competitive eating is thus revealed to be a practice that mirrors and appropriates, yet also ultimately reproduces, conventional narratives. Social media is, in turn, shown to be a political tool for counter-discursive practices that are produced in dialogue with, and concomitantly uphold and contest, normative discourses of mass media.

12.
Di Matteo  Dante 《GeoJournal》2021,86(3):1465-1480

The widespread popularity reached by food trucks (FTs) has led to the reshaping of many food events into new forms of street food events (SFEs), in which the FTs become the main attractors for visitors rather than a mere supporting element of the event. Such SFEs have rapidly been recognized as a pivotal place marker for attracting visitors from within and beyond regional boundaries. It is therefore important to understand visitors' motivations to attend an SFE, both for shaping policies and for supporting business decision-making, since visitors' overall perceptions relate to a form of loyalty towards the visited destination, a linkage that might encourage revisit intentions. This study applies an ordered multinomial model to an SFE taking place in the Abruzzo region (Italy), and the main findings reveal that visitors' perceptions tend to strengthen when they find memorable atmospheres and non-routine food specialties at the event, suggesting a search for hedonic consumption. In this regard, the study provides implications on how SFEs might act as an enhancer for local and regional development and how they might help preserve the economic and social fabric of smaller and larger communities.


13.

The interpretation of aquifer responses to pumping tests is an important tool for assessing aquifer geometry and properties, which are critical in the assessment of water resources or in environmental remediation. However, the responses of aquifers, measured as time-drawdown relationships in monitoring wells, are nonunique solutions that are affected by many factors. Jacob’s Zoo is a collection of graphical interpretations that allows students and practitioners to develop an intuitive feel for how natural hydrogeological systems work, and to develop a set of skills that provides a better understanding of aquifer properties far beyond the interpretation of pumping tests. Jacob’s Zoo, based on the work of Jacob (1950), fosters a deeper understanding, although few practitioners realize the full utility of the method. Jacob CE (1950) Flow of groundwater. In: Rouse H (ed) Engineering Hydraulics. Wiley, New York, pp 321–386.
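The graphical interpretations built on Jacob's work typically start from the Cooper-Jacob straight-line approximation to the Theis drawdown solution. A minimal sketch with entirely hypothetical test parameters:

```python
import math

def jacob_drawdown(Q, T, S, r, t):
    """Cooper-Jacob straight-line approximation of confined-aquifer drawdown.
    Q: pumping rate [m3/d], T: transmissivity [m2/d], S: storativity [-],
    r: distance from pumping well to monitoring well [m], t: time [d].
    Valid when u = r^2 * S / (4*T*t) is small (roughly < 0.05)."""
    return (2.3 * Q / (4 * math.pi * T)) * math.log10(2.25 * T * t / (r**2 * S))

# Hypothetical confined-aquifer pumping test
s = jacob_drawdown(Q=500.0, T=250.0, S=2e-4, r=30.0, t=1.0)
u = 30.0**2 * 2e-4 / (4 * 250.0 * 1.0)  # check the validity criterion
print(round(s, 2), u)
```

On a semilog plot of drawdown versus log t, this relation is a straight line whose slope gives T and whose time-axis intercept gives S, which is why the semilog straight line is the workhorse of the graphical method.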


14.
Hydrothermal volatile-solubility and partitioning experiments were conducted with fluid-saturated haplogranitic melt, H2O, CO2, and S in an internally heated pressure vessel at 900 °C and 200 MPa; three additional experiments were conducted with iron-bearing melt. The run-product glasses were analyzed by electron microprobe, FTIR, and SIMS; they contain ≤0.12 wt% S, ≤0.097 wt% CO2, and ≤6.4 wt% H2O. Apparent values of log fO2 for the experiments at run conditions were computed from the [S6+/(S6+ + S2−)] ratio of the glasses, and they range from NNO −0.4 to NNO +1.4. The C–O–H–S fluid compositions at run conditions were computed by mass balance; they contained 22–99 mol% H2O, 0–78 mol% CO2, 0–12 mol% S, and <3 wt% alkalis. Eight S-free experiments were conducted to determine the H2O and CO2 concentrations of melt and fluid compositions and to compare them with prior experimental results for C–O–H fluid-saturated rhyolite melt, and the agreement is excellent. Sulfur partitions very strongly in favor of the fluid in all experiments, and the presence of S modifies the fluid compositions and, hence, the CO2 solubilities in coexisting felsic melt. The square of the mole fraction of H2O in the melt increases linearly, from 0.05 to 0.25, with the H2O concentration of the fluid. The mole fraction of CO2 in the melt increases linearly, from 0.0003 to 0.0045, with the CO2 concentration of C–O–H–S fluids. Interestingly, the CO2 concentration in melts from relatively reduced runs (log fO2 ≤ NNO +0.3) that contain 2.5–7 mol% S in the fluid decreases significantly with increasing S in the system. This response to the changing fluid composition causes the H2O and CO2 solubility curve for C–O–H–S fluid-saturated haplogranitic melts at 200 MPa to shift to values near those modeled for C–O–H fluid-saturated, S-free rhyolite melt at 150 MPa.
The concentration of S in haplogranitic melt increases linearly with increasing S in C–O–H–S fluids, but these data show significant dispersion that likely reflects the strong influence of fO2 on S speciation in melt and fluid. Importantly, the partitioning of S between fluid and melt does not vary with the H2O/(H2O + CO2) ratio of the fluid. The fluid–melt partition coefficients for H2O, CO2, and S and the atomic C/S ratios of the run-product fluids are virtually identical to thermodynamic constraints on volatile partitioning and the H, S, and C contents of pre-eruptive magmatic fluids and volcanic gases for subduction-related magmatic systems, confirming that these experiments are relevant to natural eruptive systems.
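The linear fluid-melt relations quoted above can be turned into rough interpolation formulas. The endpoint mapping below is an illustrative reading of the quoted ranges (fluid H2O of 22–99 mol% mapped onto a melt XH2O² of 0.05–0.25, and fluid CO2 of 0–78 mol% onto a melt XCO2 of 0.0003–0.0045), not the study's fitted regressions:

```python
def melt_xh2o(fluid_h2o_mol_pct):
    """Mole fraction of H2O in the melt; its SQUARE is taken to vary
    linearly with the H2O concentration of the fluid (mol%)."""
    x2 = 0.05 + (0.25 - 0.05) * (fluid_h2o_mol_pct - 22.0) / (99.0 - 22.0)
    return x2 ** 0.5

def melt_xco2(fluid_co2_mol_pct):
    """Mole fraction of CO2 in the melt, linear in fluid CO2 (mol%)."""
    return 0.0003 + (0.0045 - 0.0003) * fluid_co2_mol_pct / 78.0

# Water-rich and CO2-rich endmembers of the quoted fluid range
print(round(melt_xh2o(99.0), 3))
print(round(melt_xco2(78.0), 4))
```

Note the functional contrast the abstract emphasizes: CO2 dissolution is linear in fluid CO2, whereas for water it is the square of the melt mole fraction that is linear in fluid H2O, consistent with H2O dissolving partly as dissociated species.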

15.
The debate on genetic modification (GM) is persistent, polarized and mainly involves organized groups at the national level. With the European Union’s new policy of coexistence, commercial cultivation of GM crops is expected by the Dutch Ministry of Agriculture, Nature and Food Quality within the next few years, especially maize (Bt) and potato (Phytophthora resistance and starch production). This makes the debate relevant for those directly confronted with this cultivation: the inhabitants of local rural communities. In the Netherlands, stakeholders formulated coexistence rules to prevent problems between conventional, organic and GM farmers who grow their crops in the same limited land area. Little is known, however, regarding the perceptions of the non-farming inhabitants of rural communities (“the neighbours”) in the debate. This paper presents the results of a focus group-based argumentative analysis of whether (and how) GM issues play a decisive role among non-farming inhabitants of four rural communities in the Netherlands. We analysed the arguments in relation to a conceptual model that describes the potential rise and dynamics from a pre-Nimby ambivalence towards an outspoken Nimby position. We observed that the GM debate was given very little priority relative to other national issues on the political agenda and that more social cohesion correlates with fewer arguments in the national debate. It is argued that this mechanism keeps the Nimby ambivalence in an undetermined mode, which in turn diminishes the chances of radical rural-based protest against local GM cultivation of crops.
