Sort by: 60 results found (search time: 31 ms)
31.
32.
Empirical fragility assessment of buildings affected by the 2011 Great East Japan tsunami using improved statistical models  Cited by: 3 (self-citations: 2, other citations: 1)
Tsunamis are destructive natural phenomena that cause extensive damage to the built environment, affecting the livelihoods and economies of the impacted nations. This has been demonstrated by the tragic events of the Indian Ocean tsunami in 2004 and the Great East Japan tsunami in 2011. Following such events, a few studies have attempted to assess the fragility of the existing building inventory by constructing empirical stochastic functions, which relate the damage to a measure of tsunami intensity. However, these studies typically fit a linear statistical model to the available damage data, which are aggregated in bins of similar tsunami intensity. Such a procedure cannot deal well with aggregated data or with very low and very high damage probabilities, nor does it result in the most realistic representation of tsunami-induced damage. Deviating from this trend, the present study adopts the more realistic generalised linear models, which address the aforementioned disadvantages. The proposed models are fitted to a damage database of 178,448 buildings surveyed in the aftermath of the 2011 Japanese tsunami, provided by the Ministry of Land, Infrastructure, Transport and Tourism in Japan. In line with the results obtained in previous studies, the fragility curves show that wooden buildings (the dominant construction type in Japan) are the least resistant to tsunami loading. The diagnostics show that taking into account both the building's construction type and the tsunami flow depth is crucial to the quality of the damage estimation, and that these two variables do not act independently. In addition, the diagnostics reveal that tsunami flow depth estimates low levels of damage reasonably well; however, it is not the most representative measure of tsunami intensity for high damage states (especially structural damage).
Further research using disaggregated damage data and additional explanatory variables is required in order to obtain reliable model estimates of building damage probability.
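As a minimal sketch of the generalised-linear-model approach described in the abstract above, a binomial GLM with a logit link can be fitted to disaggregated building-level data by Newton-Raphson (iteratively reweighted least squares). The data below are synthetic, not the MLIT survey, and `fit_logit_fragility` is an illustrative name rather than anything from the paper:

```python
import numpy as np

def fit_logit_fragility(depth, damaged, n_iter=25):
    """Fit P(damage) = 1 / (1 + exp(-(b0 + b1*depth))) by Newton-Raphson
    maximum likelihood, i.e. a binomial GLM with a logit link."""
    X = np.column_stack([np.ones_like(depth), depth])
    beta = np.zeros(2)
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        W = p * (1.0 - p)                  # IRLS weights
        grad = X.T @ (damaged - p)         # score vector
        hess = X.T @ (X * W[:, None])      # Fisher information
        beta += np.linalg.solve(hess, grad)
    return beta

# Synthetic example: damage probability rising with tsunami flow depth
rng = np.random.default_rng(0)
depth = rng.uniform(0.0, 10.0, 5000)
true_p = 1.0 / (1.0 + np.exp(-(-4.0 + 1.0 * depth)))
damaged = (rng.uniform(size=depth.size) < true_p).astype(float)

b0, b1 = fit_logit_fragility(depth, damaged)
```

Because each building enters as its own binary observation, this avoids the information loss of fitting a line to binned damage ratios.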
33.
34.
Toru Kouyama Takeshi Imamura Masato Nakamura Takehiko Satoh Yoshihumi Futaana 《Planetary and Space Science》2012,60(1):207-216
An improved cloud tracking method for deriving wind velocities from successive planetary images was developed. The new method incorporates into the traditional cross-correlation method an algorithm that corrects for erroneous cloud motion vectors by re-determining the most plausible correlation peak among all of the local maxima on the correlation surface by comparing each vector with its neighboring vectors. The newly developed method was applied to the Venusian violet images obtained by the Solid State Imaging system (SSI) onboard the Galileo spacecraft during its Venus flyby. Although the results may be biased by the choice of spatial scale of atmospheric features, cloud tracking is the most practical means of estimating the wind velocities with extensive spatial and temporal coverage. The two-dimensional distribution of the horizontal wind vector field over 5 days was obtained. It was found from these wind maps that the solar-fixed component in 1990 was similar to that in 1982 obtained by the Pioneer Venus orbiter. The deviation of the instantaneous zonal wind field from the solar-fixed component shows a distinct wavenumber-1 structure in the equatorial region. On the assumption that this structure is a manifestation of an equatorial Kelvin wave, the phase relationship between the zonal wind and the cloud brightness suggests a short photochemical lifetime of the violet absorber. The momentum deposition by this Kelvin wave, which is subject to radiative damping, would induce a westward mean-wind acceleration of ~0.3 m s⁻¹ per Earth day.
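The peak re-determination idea can be illustrated with a toy sketch (the correlation surface, neighbor vectors, and function names below are hypothetical, not taken from the paper): among all local maxima of the correlation surface, keep the displacement closest to the median of the neighboring motion vectors rather than the global maximum:

```python
import numpy as np

def local_maxima(corr):
    """Return (row, col) of strict local maxima on a 2-D correlation surface."""
    peaks = []
    for i in range(1, corr.shape[0] - 1):
        for j in range(1, corr.shape[1] - 1):
            patch = corr[i - 1:i + 2, j - 1:j + 2]
            if corr[i, j] == patch.max() and np.count_nonzero(patch == corr[i, j]) == 1:
                peaks.append((i, j))
    return peaks

def plausible_vector(corr, neighbor_vectors, origin):
    """Re-determine the displacement vector: among all local correlation maxima,
    pick the one most consistent with (closest to) the median neighbor vector."""
    ref = np.median(neighbor_vectors, axis=0)
    candidates = [np.array(p) - origin for p in local_maxima(corr)]
    return min(candidates, key=lambda v: np.linalg.norm(v - ref))

# Synthetic surface: spurious global peak at (2, 2), consistent peak at (6, 7)
corr = np.zeros((11, 11))
corr[2, 2] = 1.0      # erroneous correlation maximum
corr[6, 7] = 0.8      # weaker peak consistent with the surrounding flow
neighbors = np.array([[1.2, 2.1], [0.8, 1.9], [1.1, 2.2]])  # nearby wind vectors
v = plausible_vector(corr, neighbors, origin=np.array([5, 5]))
```

Here the global maximum would give the outlier displacement (-3, -3); the neighbor comparison recovers (1, 2) instead.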
35.
I. Charvet A. Suppasri F. Imamura 《Stochastic Environmental Research and Risk Assessment (SERRA)》2014,28(7):1853-1867
Tsunamis are disastrous events typically causing loss of life and extreme damage to the built environment, as shown by the recent disaster that struck the east coast of Japan in 2011. In order to quantitatively estimate damage in tsunami-prone areas, some studies have used a probabilistic approach and derived fragility functions. However, the models chosen do not provide a statistically sound representation of the data. This study applies advanced statistical methods in order to address these limitations. The area of study is the city of Ishinomaki in Japan, the worst affected area during the 2011 event and one for which an extensive amount of detailed building damage data has been collected. Ishinomaki city displays a variety of geographical environments that would have significantly affected tsunami flow characteristics, namely a plain, a narrow coast backed by high topography (terrain), and a river. The fragility analysis assesses the relative structural vulnerability between these areas, and reveals that the buildings surrounding the river were less likely to be damaged. The damage probabilities for the terrain area (with relatively higher flow depths and velocities) were lower than or similar to those for the plain, which confirms the beneficial role of coastal protection. The model diagnostics show that tsunami flow depth alone is a poor predictor of tsunami damage for reinforced concrete and steel structures, and that for all structures other variables are influential and need to be taken into account in order to improve fragility estimations. In particular, evidence shows that debris impact contributed to a significant amount of the non-structural damage.
36.
A numerical model for far-field tsunamis and its application to predict damages done to aquaculture  Cited by: 2 (self-citations: 0, other citations: 2)
The 1960 Chilean tsunami, which traveled the Pacific Ocean and caused extensive damage in Japan, is simulated from its generation to its terminal effects on coastal areas. In the computation of ocean propagation by the linear long-wave theory, a new technique is introduced to keep the same accuracy as the linear Boussinesq equation while reducing the CPU time as well as the computer memory. In the coastal transformation computation, the energy dissipation due to sea-bottom scouring is suggested to be included, particularly in the case of long bays. To obtain accurate results, the current velocity requires finer spatial grids than the water surface elevation. Damage done to pearl culture rafts is explained in terms of the computed current velocity.
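A minimal 1-D staggered-grid solver for the linear long-wave equations conveys the flavor of such propagation computations. This is only a sketch: the paper's actual 2-D scheme, its Boussinesq-accuracy correction, and the Pacific bathymetry are not reproduced, and all values below (basin size, depth, initial hump) are illustrative:

```python
import numpy as np

def step(eta, u, h, g, dt, dx):
    """Advance the linear long-wave (shallow-water) equations one time step
    on a staggered grid with closed (zero-flux) boundaries."""
    u -= g * dt / dx * np.diff(eta)                # du/dt = -g d(eta)/dx
    flux = np.concatenate(([0.0], h * u, [0.0]))   # volume flux h*u at cell faces
    eta -= dt / dx * np.diff(flux)                 # d(eta)/dt = -d(h*u)/dx
    return eta, u

# Deep-ocean test basin: 200 cells of 10 km, 4 km depth (wave speed ~198 m/s)
g, h, dx, dt = 9.81, 4000.0, 1.0e4, 20.0           # CFL = c*dt/dx ~ 0.4 < 1
x = dx * np.arange(200)
eta = np.exp(-((x - 1.0e6) / 1.0e5) ** 2)          # initial 1 m Gaussian hump
u = np.zeros(199)
total0 = eta.sum()
for _ in range(500):
    eta, u = step(eta, u, h, g, dt, dx)
```

With zero-flux boundaries the scheme conserves the total water volume exactly, which is a standard sanity check for long-wave codes.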
37.
K. Nishiizumi M. Imamura C.P. Kohl M.T. Murrell J.R. Arnold G.P. Russ 《Earth and Planetary Science Letters》1979,44(3):409-419
The activity of solar-cosmic-ray-produced ⁵³Mn has been measured as a function of depth in the upper 100 g/cm² (~55 cm) of lunar cores 60009–60010 and 12025–12028. Additional samples which supplement our earlier work were analyzed from the Apollo 15 and 16 drill stems. These data, taken in conjunction with our previously published results and the ²²Na and ²⁶Al data of the Battelle Northwest group, indicate that in at least three of the four cases studied the regolith has been measurably disturbed within the last 10 m.y. In one case gardening to 19 g/cm² is required. Activities measured in the uppermost 2 g/cm² indicate frequent mixing within this depth range. No undisturbed profiles were observed, nor were any major discontinuities observed in the profiles. The Monte Carlo gardening model of Arnold has been used to derive profiles for the gardened moon-wide average of ⁵³Mn and ²⁶Al as a function of depth. The ⁵³Mn and ²⁶Al experimental results are compared with these theoretical predictions. Agreement is good in several respects, but the calculated depths of disturbance appear to be too low.
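Arnold's gardening model itself is not reproduced here, but the idea of impact-driven mixing can be caricatured in a few lines (layer thicknesses, event counts, and the initial profile below are arbitrary assumptions): each impact event homogenises the column above a random depth, so shallow layers are mixed frequently while layers below the maximum overturn depth stay undisturbed:

```python
import numpy as np

def garden(profile, n_events, max_depth, rng):
    """Crude Monte Carlo 'gardening': each event homogenises the column
    above a random depth (a stand-in for Arnold's overturn model)."""
    profile = profile.copy()
    for _ in range(n_events):
        d = rng.integers(1, max_depth + 1)   # overturn depth, in layers
        profile[:d] = profile[:d].mean()     # instantaneously mix that slab
    return profile

rng = np.random.default_rng(1)
depth = np.arange(50)                        # layer index ~ depth in g/cm^2
initial = np.exp(-depth / 20.0)              # undisturbed activity profile
mixed = garden(initial, n_events=200, max_depth=10, rng=rng)
```

After many events the near-surface activity gradient is flattened, which is the kind of signature the measured ⁵³Mn profiles are compared against.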
38.
Measurements of cosmic-ray-produced ⁵³Mn are reported for a series of lunar surface samples down to a depth of 416 g/cm². These results clearly illustrate the decrease in activity with depth as the incident galactic cosmic rays are absorbed. Below 60 g/cm² the production rate decreases exponentially with a mean attenuation length, λ, of about 220 g/cm². These results indicate that, at the Apollo 15 site, the lunar regolith has been unmixed, on a meter scale, for the last 5 m.y. The neutron activation technique for ⁵³Mn, which allowed samples smaller than 200 mg to be used for these measurements, is described.
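The exponential fall-off quoted above translates directly into a relative production-rate estimate below 60 g/cm². The sketch uses only the λ ≈ 220 g/cm² attenuation length from the abstract; the normalisation `p_ref` and the function name are illustrative assumptions:

```python
import numpy as np

def mn53_production(depth, p_ref, d_ref=60.0, lam=220.0):
    """53Mn production rate below d_ref g/cm^2, falling off exponentially
    with the ~220 g/cm^2 attenuation length reported above."""
    return p_ref * np.exp(-(np.asarray(depth, dtype=float) - d_ref) / lam)

# e.g. relative production at the 416 g/cm^2 bottom of the sampled column,
# normalised to the rate at 60 g/cm^2
rel = mn53_production(416.0, p_ref=1.0)
```

At the deepest sample the rate has thus dropped to roughly a fifth of its value at 60 g/cm², consistent with the reported absorption of the galactic cosmic rays.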
39.
Disasters in Viet Nam are discussed by compiling recent data on the geophysical and social environments, the frequency of disasters, and the human and financial losses in 1953–1991. Examination of the yearly frequency of, and damage caused by, typhoons indicates that losses rose during the 1980s despite a constant or decreasing frequency, suggesting that prevention programs were inadequate. The two successive typhoons of 1985 are described as the most catastrophic disaster in 100 years, in which high waves combined with high tides destroyed the dike system and flooded a large area in the central part of Viet Nam; this suggests serious deficiencies in prevention efforts, especially in coastal areas. Disasters on the coast have been significant because of the rapid growth of the population in the lowlands and the destruction of coastal environments, such as coastal erosion caused by the deforestation of mangroves and a short supply of sand. As an example, coastal erosion at Ha Nam Nimh province in the northern part of Viet Nam, at an average receding speed of around 15 m/year, is described.
40.