Similar Articles (20 found)
1.
Aftershock rates appear to follow a power-law decay, but assessing the aftershock frequency immediately after an earthquake, as well as during the evolution of a seismic excitation, remains essential for evaluating the imminent seismic hazard. The purpose of this work is to study the temporal distribution of triggered earthquakes on short time scales following a strong event, and a multiple seismic sequence was chosen for this purpose. Statistical models are applied to the 1981 Corinth Gulf sequence, comprising three strong events (M = 6.7, M = 6.5, and M = 6.3) between 24 February and 4 March. The non-homogeneous Poisson process outperforms the simple Poisson process in modeling the aftershock sequence, whereas the Weibull process is more appropriate for capturing the short-term behavior but less suitable for describing long-term seismicity. The aftershock data define a smooth curve of declining rate, and a long-tailed theoretical model fits the data better than a rapidly declining exponential function, as supported by the quantitative results derived from the survival function. An autoregressive model is also applied to the seismic sequence, shedding more light on the stationarity of the time series.
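The power-law decay of aftershock rates described above can be sketched numerically with a modified Omori (non-homogeneous Poisson) rate. This is a minimal illustration, not the paper's fitted model; the parameter values K, c and p are hypothetical.

```python
import math

def omori_rate(t, K=100.0, c=0.05, p=1.1):
    """Modified Omori law: aftershock rate (events/day) t days after the mainshock."""
    return K / (t + c) ** p

def expected_count(t1, t2, K=100.0, c=0.05, p=1.1):
    """Expected number of aftershocks in [t1, t2] days: the closed-form
    integral of the power-law rate (non-homogeneous Poisson mean)."""
    if p == 1.0:
        return K * math.log((t2 + c) / (t1 + c))
    return K / (1.0 - p) * ((t2 + c) ** (1.0 - p) - (t1 + c) ** (1.0 - p))

# The long tail: days 7-14 still carry a non-negligible share of the activity,
# unlike a rapidly declining exponential with a comparable initial rate.
first_day = expected_count(0.0, 1.0)
second_week = expected_count(7.0, 14.0)
```

The closed-form integral is what makes likelihood fits of such models tractable on short time scales.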

2.
Major earthquakes (i.e., mainshocks) typically trigger a sequence of lower-magnitude events clustered both in time and space. Recent advances in seismic hazard analysis stochastically model aftershock occurrence (given the main event) as a nonhomogeneous Poisson process whose rate decays in time as a negative power law. Risk management in the post-event emergency phase has to deal with this short-term seismicity. In fact, because the structural systems of interest might have suffered some damage in the mainshock, possibly worsened by damaging aftershocks, the failure risk may be large until the intensity of the sequence reduces or the structure is repaired. At the state of the art, the quantitative assessment of aftershock risk is aimed at building tagging, that is, at regulating occupancy. This study, on the basis of age-dependent stochastic processes, derives closed-form approximations for the aftershock reliability of simple nonevolutionary elastic-perfectly-plastic damage-cumulating systems, conditional on different information about the structure. Results show that, where the hypotheses apply, the developed models may serve as a basis for handy tools enabling risk-informed tagging by stakeholders and decision makers. Copyright © 2013 John Wiley & Sons, Ltd.

3.
Nonuniform friction as a physical basis for earthquake mechanics   (Cited by: 2; self-citations: 0; others: 2)
A review of simple models and observations suggests that the main first-order features of active faulting (mechanical instability, the frequency-magnitude relations, seismic and aseismic slip, seismic radiation, incoherency, and rupture stoppage) may be explained by a single characteristic of crustal faults: the spatial variation of the effective frictional stress, which resists slippage on faults. Fault-offset data suggest that rupture propagation ceases in regions of high resistance, which act as barriers. In these regions slippage is associated with negative stress drop. The spacing λ and the amplitude A(λ) of the barriers, as inferred from the frequency-magnitude and moment relation for earthquakes, obey a simple statistical relation A(λ) ∝ λ^p. On the scale of particle motion, this variability of frictional stress provides a mechanical instability which may be associated with the concept of dynamic friction. The rapid particle motion in the model is invariably preceded by accelerated creep. The particle acceleration is highly irregular, giving rise to an almost random acceleration record on the fault. The particle displacement is relatively smooth, giving rise to a simple displacement time function in the far field. Rupture propagation time is approximately proportional to the gradient of frictional stress along the fault. Consequently, sharp changes of this stress may cause multiple events and other long-period irregularities in the fault motion. The power density spectrum associated with the frictional stress implies that the stress may be related to a Poisson distribution of barrier lengths. The autocorrelation of such a distribution yields a correlation length k_L^(-1), similar perhaps to Haskell's (1964) and Aki's (1967) correlation lengths inferred from spectral analysis of seismic waves.
The partial incoherency of faulting implies that preseismic deformation may be significantly incoherent; consequently, the prediction of small to moderate earthquakes may be subject to inherent uncertainties. We conclude that frictional stress heterogeneities may be necessary and sufficient to explain active faulting associated with small and moderate earthquakes.

4.
A mixed model is proposed to fit the earthquake interevent time distribution. In this model, the whole distribution is constructed by mixing the distribution of clustered seismicity with a suitable distribution of background seismicity. Namely, the fit is tested assuming a clustered-seismicity component modeled by a non-homogeneous Poisson process and a background component modeled using different hypothetical models (exponential, gamma, and Weibull). For southern California, Japan, and Turkey, the best fit is found when a Weibull distribution is used as the model for background seismicity. Our study uses the earthquake random sampling method we introduced recently; it is applied here to account for space-time clustering of earthquakes at different distances from a given source and to increase the number of samples used to estimate the earthquake interevent time distribution and its power-law scaling. For Japan, the contribution of clustered pairs of events to the whole distribution is analyzed for different magnitude cutoffs, m_c, and different time periods. The results show that the power laws are mainly produced by the dominance of correlated pairs at short and long time ranges. In particular, both power laws, observed at short and long time ranges, can be attributed to time-space clustering revealed by the standard Gardner and Knopoff declustering windows.
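The mixing idea above can be sketched as a two-component density. This is a hypothetical illustration, not the fitted model from the paper: the clustered component is approximated by a fast-decaying exponential and the background by a Weibull density, with illustrative parameters w, k, lam, and beta.

```python
import math

def cluster_pdf(t, beta=0.5):
    """Clustered (short-term) component: fast-decaying exponential density (days)."""
    return math.exp(-t / beta) / beta

def background_pdf(t, k=0.8, lam=50.0):
    """Background component: Weibull density with shape k and scale lam (days)."""
    return (k / lam) * (t / lam) ** (k - 1.0) * math.exp(-((t / lam) ** k))

def mixed_pdf(t, w=0.3):
    """Mixture of the two components, with weight w on the clustered part."""
    return w * cluster_pdf(t) + (1.0 - w) * background_pdf(t)

# Midpoint-rule check that the mixture is a proper density over [0, 1000] days.
dt = 0.01
total = sum(mixed_pdf((i + 0.5) * dt) * dt for i in range(100_000))
```

The short-time excess comes from the clustered term and the long tail from the Weibull background, mirroring the two power-law regimes discussed in the abstract.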

5.
An earthquake catalog derived from the detection of seismically generated T-waves is used to study the time-clustering behavior of moderate-size (M ≥ 3.0) earthquakes between 15 and 35°N along the Mid-Atlantic Ridge (MAR). Within this region, the distribution of inter-event times is consistent with a non-periodic, non-random, clustered process. The highest degrees of clustering are associated temporally with large mainshock-aftershock sequences; however, some swarm-like activity also is evident. Temporal fluctuations characterized by a power spectral density P(f) that decays as 1/f^α are present within the time sequence, with α ranging from 0.12 to 0.55 for different regions of the spreading axis. This behavior is negligible at time scales less than ~5×10^3 s, and earthquake occurrence becomes less clustered (smaller α) as increasing size thresholds are applied to the catalog. A power-law size-frequency scaling for Mid-Atlantic Ridge earthquakes also can be demonstrated using the distribution of acoustic magnitudes, or source levels. Although fractal seismic behavior has been linked to the structure of the underlying fault population in other environments, power-law fault size distributions have not been observed widely in the mid-ocean ridge setting.

6.
Bayes' theorem has possible application to earthquake prediction because it can be used to represent the dependence of the inter-arrival time (T) of the next event on the magnitude (M) of the preceding earthquake (Ferraes, 1975; Bufe et al., 1977; Shimazaki and Nakata, 1980; Sykes and Quittmeyer, 1981). First, we derive the basic formulas, assuming that the earthquake process behaves as a Poisson process. Under this assumption the likelihood probabilities are determined by the Poisson distribution (Ferraes, 1985), after which we introduce the conjugate family of Gamma prior distributions. Finally, to maximize the posterior Bayesian probability P(τ|M) we use calculus and introduce the analytical condition dP(τ|M)/dτ = 0. Subsequently we estimate the occurrence of the next future large earthquake to be felt in Mexico City. Given the probabilistic model, the prediction is obtained from the data set that includes all events with M ≥ 7.5 felt in Mexico City from 1900 to 1985. These earthquakes occur in the Middle America Trench along Mexico, but are felt in Mexico City. To see the full significance of the analysis, we give the result using two models: (1) the Poisson-Gamma, and (2) the Poisson-Exponential (a special case of the Gamma). Using the Poisson-Gamma model, the next expected event will occur in the time interval τ = 2.564 years from the last event (which occurred on September 19, 1985), or equivalently approximately in April 1988. Using the Poisson-Exponential model, the next expected damaging earthquake will occur in the time interval τ = 2.381 years from the last event, or equivalently in January 1988. The two models thus yield predicted occurrence times in very strong agreement.
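The conjugate Poisson-Gamma update underlying this kind of analysis can be sketched as follows. The prior parameters and event counts below are illustrative placeholders, not the paper's Mexico City data.

```python
def gamma_posterior(alpha0, beta0, n_events, t_span):
    """Conjugate update: a Gamma(alpha0, beta0) prior on the Poisson rate,
    combined with n_events observed over t_span years, gives a
    Gamma(alpha0 + n_events, beta0 + t_span) posterior."""
    return alpha0 + n_events, beta0 + t_span

def expected_interarrival(alpha, beta):
    """Point estimate of the mean waiting time: 1 / E[rate] = beta / alpha."""
    return beta / alpha

# Hypothetical numbers: 9 events with M >= 7.5 in an 85-year window,
# combined with a weakly informative Gamma(2, 4) prior.
a, b = gamma_posterior(2.0, 4.0, n_events=9, t_span=85.0)
tau = expected_interarrival(a, b)
```

The conjugacy is what allows the closed-form maximization of the posterior that the abstract describes.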

7.
After a moderate-to-strong earthquake, seismic detection may miss some microseismic events owing to interference from coda waves, degrading the completeness of the earthquake catalog. In this study, a waveform template matching method is applied to the sequence of the 2020 Jiashi MS6.4 earthquake in Xinjiang. Compared with the original unified catalog of the China Earthquake Networks Center, 1,756 additional microseismic events were detected, a 1.3-fold increase in the number of events. Based on the augmented aftershock catalog, the minimum magnitude of completeness is ML1.2 and the seismicity b-value is 0.76, both lower than the original catalog's values of ML1.6 and 0.77. Analysis of the sequence in the Jiashi source region shows that the foreshock sequence exhibited a dense intensification of seismicity shortly (within 36 hours) before the mainshock, accompanied by correspondingly low b-values; after the mainshock, the magnitude of completeness was initially high and then slowly decreased over time, stabilizing with periodic fluctuations. This study improves the completeness of the earthquake catalog for the Jiashi source region and provides a key data foundation for finely characterizing the spatiotemporal evolution of earthquake sequences in this area.

8.
We describe a fully automated seismic event detection and location system providing real-time estimates of the epicentral parameters of both local and distant earthquakes. The system uses 12 telemetered short-period stations, with a regional aperture of 350 km, as well as two 3-component broad-band stations. Detection and location of teleseismic events is achieved independently and concurrently on the short-period and long-period channels. The long-period data are then used to obtain an estimate of the seismic moment M_0 of the earthquake through the mantle magnitude M_m, as introduced by Okal and Talandier (1989). In turn, this estimate of M_0 is used to infer the expected tsunami amplitude at Papeete within 15 minutes of the recording of Rayleigh waves. The performance of the method is discussed in terms of the accuracy of the epicentral parameters and seismic moment obtained in real time, as compared to the values later published by the reporting agencies. Our estimates are usually within 3 degrees of the reported epicenter, and the standard deviation on the seismic moment is only 0.19 magnitude units for a population of 154 teleseismic events.

9.
Following the increase in seismic activity which occurred near Isernia (Molise, Central Italy) in January 1986, a digital seismic network of four stations with three-component, short-period seismometers was installed in the area by the Osservatorio Vesuviano. The temporary network had an average station spacing of about 8–10 km and, in combination with permanent local seismic stations, allowed the accurate determination of earthquake locations during an operating period of about one month. Among the 1517 detected earthquakes, 170 events were selected with standard errors on epicentre and depth not greater than 0.5 and 1.5 km, respectively. The most frequent focal depths ranged between 4 and 8 km, while the epicentre distribution covered a small area NE of Isernia of about 10 km². A main rupture zone could not be clearly identified from the spatial distribution of the earthquakes, suggesting a rupture mechanism involving heterogeneous materials. The activity was characterized by low energy levels, the largest earthquake, on January 18, 1986, having M_D = 4.0. The time sequence of events and the pattern of seismic energy release revealed strong temporal clustering, similar to the behaviour commonly associated with seismic swarms.

10.
A new detection and location algorithm for a single seismic array is described. The algorithm improves on methods of joint polarization analysis and beamforming. It has been implemented in the new automatic detector-locator program UDL (Universal Detector Locator) and used to process data from the aftershock sequence of the strongest intraplate earthquake (M_w = 6.1) in Storfjorden, Spitsbergen archipelago. The temporal behavior of the aftershock sequence is presented. It was found that the sequence does not satisfy the Omori law and shows periodicity for events with M < 0. Possible reasons for this phenomenon are discussed.

11.
Earthquakes occur frequently near the Changning shale gas development zone in the southern Sichuan Basin: in recent years nearly ten moderate events of ML > 4.0 and more than ten thousand small and micro events of ML1.0–3.0 have occurred there, and the hazard risk continues to rise. Because the permanent stations of the national seismic network are sparsely distributed, precise information on microseismic events below magnitude 1 is difficult to capture, so analyzing the evolution of seismic risk in the shale gas development zone from near-field microseismic monitoring data has become a pressing scientific problem. Based on a specially deployed...

12.
Joint hypocenter determination is performed for intermediate and deep earthquakes of the Tyrrhenian Sea region. This analysis allowed us to obtain a catalogue of 70 well-located events in this peculiar Benioff zone, which is characterized by quite low seismic activity compared to the Pacific deep-earthquake regions. The method used for the analysis is that of Frohlich (1979), a variant of the successive-approximation technique, which allows use of a great number of events and stations while saving computer memory. The results show a spoon-shaped Benioff zone, dipping NW in the Tyrrhenian Sea to 500 km depth. 32 reliable fault-plane solutions have been determined using these new earthquake locations, confirming the predominance of down-dip compression in the central part of the slab and more complex motion along the borders of the zone, as previously suggested by Gasparini et al. (1982).

13.
Tinti  S.  Mulargia  F. 《Pure and Applied Geophysics》1985,123(2):199-220
The apparent magnitude of an earthquake, y, is defined as the observed magnitude value; it differs from the true magnitude m because of the experimental noise n. If f(m) is the density distribution of the magnitude m, and if g(n) is the density distribution of the error n, then the density distribution of y is simply computed by convolving f and g, i.e. h(y) = f*g. If the distinction between y and m is not realized, any statistical analysis based on the frequency-magnitude relation of the earthquakes is bound to produce questionable results. In this paper we investigate the impact of the apparent-magnitude idea on the statistical methods that study the earthquake distribution by taking into account only the largest (or extremal) earthquakes. We use two approaches: the Gumbel method, based on Gumbel theory (Gumbel, 1958), and the Poisson method introduced by Epstein and Lomnitz (1966). Both methods are concerned with the asymptotic properties of the magnitude distributions. Therefore, we study and compare the asymptotic behaviour of the distributions h(y) and f(m) under suitable hypotheses on the nature of the experimental noise. We investigate in detail two distinct cases: first, two-side-limited symmetrical noise, i.e. noise that is bound to assume values inside a limited region; and second, normal noise, i.e. noise that is distributed according to a normal symmetric distribution. We further show that disregarding the noise generally leads to biased results and that, in the framework of the apparent magnitude, the Poisson approach preserves its usefulness, while the Gumbel method gives rise to a curious paradox.
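The convolution h(y) = f*g can be checked numerically. The sketch below assumes a shifted-exponential (Gutenberg-Richter) f and normal g with illustrative parameters BETA, SIGMA, and M0; for this particular pair the tail of h equals the tail of f inflated by exp(β²σ²/2), which is one concrete way the noise biases frequency-magnitude statistics.

```python
import math

BETA = 2.3    # Gutenberg-Richter exponent, beta = b * ln(10) with b = 1
SIGMA = 0.2   # standard deviation of the normal magnitude noise
M0 = 3.0      # completeness threshold of the true magnitudes

def f(m):
    """True-magnitude density: shifted exponential above M0."""
    return BETA * math.exp(-BETA * (m - M0)) if m >= M0 else 0.0

def g(n):
    """Normal noise density."""
    return math.exp(-n * n / (2.0 * SIGMA ** 2)) / (SIGMA * math.sqrt(2.0 * math.pi))

def h(y, dm=0.005):
    """Apparent-magnitude density h = f * g by direct numerical convolution."""
    return sum(f(M0 + i * dm) * g(y - (M0 + i * dm)) * dm for i in range(2000))

# Well above M0 the convolution inflates the tail by exp(BETA^2 SIGMA^2 / 2),
# so ignoring the noise overstates the rate of large apparent magnitudes.
inflation = math.exp(BETA ** 2 * SIGMA ** 2 / 2.0)
```
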

14.
The method of relative seismic moment tensor determination proposed by Strelitz (1980) is extended (a) from an interactive time-domain analysis to an automated frequency-domain procedure, and (b) from an analysis of subevents of complex deep-focus earthquakes to the study of the individual source mechanisms of small events recorded at few stations. The method was applied to the recovery of the seismic moment tensor components of 95 intermediate-depth earthquakes with M_L = 2.6–4.9 from the Vrancea region, Romania. The main feature of the obtained fault plane solutions is the horizontality of the P axes and the nonhorizontal orientation of the T axes (inverse faulting). Those events with high fracture energy per unit area of the fault can be grouped unambiguously into three depth intervals: 102–106 km, 124–135 km and 141–152 km. Moreover, their fault plane solutions are similar to those of all strong and most moderate events from this region, and the last two damaging earthquakes (November 10, 1940 with M_W = 7.8 and March 4, 1977 with M_W = 7.5) occurred within the third and first depth intervals, respectively. This suggests a possible correlation at these depths between fresh fracture of rocks and the occurrence of strong earthquakes.

15.
The study of seismic anomalies, related both to the temporal trend of aftershock sequences and to the temporal series of mainshocks, is important for understanding the physical processes underlying the existence and characteristics of seismic precursors. The purpose of this work is to highlight some methodological aspects related to the observation of possible anomalies in the temporal decay of an aftershock sequence, by means of several parameters. We focused our analysis on the Papua New Guinea seismic sequence that occurred on November 16, 2000; the magnitude of the mainshock is M = 8.2. The observed temporal series of shocks per day can be considered the sum of a deterministic contribution and a stochastic contribution. If the decay can be modeled as a nonstationary Poisson process with intensity function n(t) = K(t + c)^(-p) + K_1, the number of aftershocks in a small time interval Δt has mean value n(t)Δt and standard deviation σ = √(n(t)Δt). We observe some variations in seismicity that can be considered seismic anomalies before the occurrence of a large aftershock. The data, checked according to completeness criteria, come from the USGS NEIC data bank.
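The intensity function and the Poisson error bar quoted above can be sketched directly; the parameter values below are illustrative, not the fitted Papua New Guinea values.

```python
import math

def intensity(t, K=800.0, c=0.1, p=1.0, K1=2.0):
    """n(t) = K (t + c)^(-p) + K1: power-law aftershock decay plus a
    constant background rate, in events per day (illustrative values)."""
    return K * (t + c) ** (-p) + K1

def window_stats(t, dt=1.0):
    """Expected count n(t) * dt in a small window and its Poisson standard deviation."""
    mu = intensity(t) * dt
    return mu, math.sqrt(mu)

def is_anomalous(count, mu, sigma, n_sigma=2.0):
    """Flag a daily count outside mu +/- n_sigma * sigma as a candidate anomaly."""
    return abs(count - mu) > n_sigma * sigma

mu5, sig5 = window_stats(5.0)  # expected count and sigma on day 5
```

Counts falling well outside the mu ± 2σ band are the kind of deviation the abstract treats as a possible precursor of a large aftershock.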

16.
王鹏  侯金欣  吴朋 《中国地震》2017,33(4):453-462
After the mainshock of a moderate-to-strong earthquake sequence, many events are missed within a short time owing to large station-epicenter distances, coda interference, and overlapping waveforms, and catalog completeness directly affects the scientific reliability of post-event trend assessment and aftershock sequence analysis. In this study, a GPU-accelerated template matching method is applied to continuous waveforms recorded from 1 to 12 August 2017 to detect events missed before and after the Jiuzhaigou MS7.0 earthquake. Using 1,033 high signal-to-noise events from the network catalog as templates, 4,854 events were identified from 7 days before to 5 days after the mainshock, 3.3 times the network's locatable catalog. In addition to corrections of single-station events in the network catalog, 1,797 missed events were detected, lowering the magnitude of completeness from 1.6 to 1.4. Based on the completed catalog, the activity of the 8 August 2017 Jiuzhaigou MS7.0 sequence was analyzed. The results show that the foreshock sequence exhibited a dense intensification of seismicity shortly before the mainshock, with the b-value in a low state, possibly reflecting accelerated creep before rupture of the deep fault. Over time, the magnitude of completeness of the aftershock sequence gradually decreased and stabilized, the b-value showed a slow upward trend, and the aftershock sequence is expected to continue decaying over a long period.
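A b-value such as the ones tracked in this study is commonly estimated with Aki's maximum-likelihood formula; the sketch below assumes that estimator (the abstract does not state which one was used).

```python
import math

def b_value(mags, mc):
    """Aki's maximum-likelihood b-value estimate from magnitudes at or
    above the completeness magnitude mc: b = log10(e) / (mean(M) - mc)."""
    sample = [m for m in mags if m >= mc]
    mean_m = sum(sample) / len(sample)
    return math.log10(math.e) / (mean_m - mc)
```

With this estimator, a higher mean magnitude above mc yields a lower b, so the low pre-mainshock b-values reported above correspond to a relative excess of larger events.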

17.
A stochastic triggering (epidemic) model incorporating short-term clustering was fitted to the instrumental earthquake catalog of Italy, for events with local magnitude 2.6 and greater, to optimize its ability to retrospectively forecast 33 target events of magnitude 5.0 and greater that occurred in the period 1990–2006. To obtain an unbiased evaluation of the information value of the model, forecasts of each event use parameter values obtained from data up to the end of the year preceding the target event. The results of the test are given in terms of the probability gain of the epidemic-type aftershock sequence (ETAS) model relative to a time-invariant Poisson model for each of the 33 target events. These probability gains range from 0.93 to 32,000, with ten of the target events yielding a probability gain of at least 10. As the forecasting capability of the ETAS model is based on seismic activity recorded prior to the target earthquakes, the highest probability gains are associated with the occurrence of secondary mainshocks during seismic sequences. However, in nine of these cases, the largest mainshock of the sequence was marked by a probability gain larger than 50, having been preceded by smaller-magnitude earthquakes. The overall performance of the epidemic model has been evaluated by means of four popular statistical criteria: the relative operating characteristic diagram, the R score, the probability gain, and the log-likelihood ratio. These tests confirm the superior performance of the method with respect to a spatially varying, time-invariant Poisson model. Nevertheless, the method is characterized by a high false alarm rate, which would make its application in real circumstances problematic.
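The probability gain of a time-varying forecast over a time-invariant Poisson reference can be sketched, for a single target window, as a ratio of occurrence probabilities. This is a simplified illustration of the concept, not the paper's exact likelihood-based computation.

```python
import math

def probability_gain(rate_model, rate_reference, dt):
    """Ratio of the probabilities of at least one target event in a window of
    length dt under the time-varying model vs. the Poisson reference rate."""
    p_model = 1.0 - math.exp(-rate_model * dt)
    p_reference = 1.0 - math.exp(-rate_reference * dt)
    return p_model / p_reference
```

A gain above 1 means the model concentrated probability on the window where the target event occurred; gains below 1 (as for the 0.93 case above) mean the reference did better.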

18.
The 3 strongest earthquakes, M ≥ 7.0, which have occurred since 1973 in the area of Greece were preceded by a specific increase of earthquake activity in the lower magnitude range. This activation is depicted by algorithm M8. This algorithm of intermediate-term earthquake prediction was originally designed for diagnosis of Times of Increased Probability (TIPs) of the strongest earthquakes, M ≥ 8.0, worldwide (Keilis-Borok and Kossobokov, 1984). At present the algorithm is being retrospectively tested for smaller magnitudes in different seismic regions (Keilis-Borok and Kossobokov, 1986, 1988). A TIP refers to a time period of 5 years and an area whose linear size is proportional to, and several times larger than, that of the incipient earthquake source. Altogether the TIPs diagnosed by algorithm M8 in the area of Greece occupy less than 20%, and the Times of Expectation (TEs) about 10%, of the total space-time domain considered. There is also a current TIP for the southeastern Aegean Sea for 1988–1992; it may specify the long-term prediction given in Wyss and Baer (1981a,b). The results of this study are further evidence favoring the applicability of algorithm M8 in diverse seismotectonic environments and magnitude ranges, and they indirectly support the hypothesis of self-similarity of earthquake activity. They also imply the possibility of intermediate-term prediction of the strongest earthquakes in the area of Greece.

19.
Simulation of synthetic earthquake catalogs is one effective way to remedy the incompleteness of existing catalogs, compensate for the scarcity of large-earthquake records, and support related seismological research. Based on a Poisson model of seismicity and the Gutenberg-Richter magnitude-frequency relation, this study uses the Monte Carlo method, which can realistically describe processes of a random nature and physical experiments, to simulate earthquake catalogs for the Fenwei seismic belt over future periods of 30, 50, and 100 years, and subjects them to statistical tests. The analysis shows that the simulated catalogs conform to the prescribed seismicity parameters and to the assumed Poisson distribution. Based on the simulated catalogs, future earthquake trends in the region are analyzed to provide a reference for seismic hazard analysis.
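The Monte Carlo recipe described above (Poissonian occurrence times plus Gutenberg-Richter magnitudes) can be sketched as follows; all parameter values are illustrative placeholders, not the Fenwei belt's actual seismicity parameters.

```python
import math
import random

def simulate_catalog(years=30.0, annual_rate=4.0, b=1.0,
                     m_min=4.0, m_max=8.0, seed=1):
    """Synthetic catalog: Poissonian occurrence (exponential interevent times)
    with truncated Gutenberg-Richter magnitudes drawn by inverse transform."""
    rng = random.Random(seed)
    beta = b * math.log(10.0)
    norm = 1.0 - math.exp(-beta * (m_max - m_min))
    t, catalog = 0.0, []
    while True:
        t += rng.expovariate(annual_rate)   # exponential gap -> Poisson process
        if t > years:
            break
        u = rng.random()                    # inverse CDF of the truncated G-R law
        m = m_min - math.log(1.0 - u * norm) / beta
        catalog.append((t, m))
    return catalog

cat = simulate_catalog()
```

Statistical checks like those in the abstract then amount to verifying that the simulated interevent times are exponential and the magnitude counts follow the prescribed b-value.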

20.
张晖  谭毅培  马婷  翟浩  张珂  李娟 《中国地震》2021,37(2):430-441
Horinger, Inner Mongolia, lies within the Yinshan seismic belt on the northern margin of the Ordos block, where strong earthquakes of magnitude 6 and above have occurred repeatedly throughout history. On 30 March 2020 an ML4.5 earthquake occurred at Horinger, breaking the long quiescence of ML ≥ 4.0 earthquakes in the Yinshan seismic belt since 2005. Studying the seismogenic structure of this earthquake sequence is important for analyzing the regional stress state and seismic hazard; however, the stations of the Inner Mongolia seismic network are relatively sparse, and its earthquake monitoring capability is lower than in other parts of North China, so...

