Similar Articles (20 results)
1.
The application of a newly developed physics-based earthquake simulator to the active faults inferred from aeromagnetic data in southern Calabria has produced a synthetic catalog spanning 100 kyr and including more than 18,000 earthquakes of magnitude ≥ 4.0. This catalog exhibits temporal, spatial and magnitude features that resemble those of the observed seismicity. As an example of the potential use of synthetic catalogs, a map of peak ground acceleration (PGA) for a given exceedance probability over the territory under investigation was produced by applying a simple attenuation law to all events reported in the synthetic catalog. This map was compared with the existing hazard map currently used in the national seismic building regulations. The comparison shows strong agreement between our results and the values given in the present Italian seismic building code, despite the latter being based on a different methodology. The same agreement is not found when comparing our present study with the results of a previous study based on the same methodology but a different geological model.
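The hazard-mapping step this abstract describes, applying an attenuation law to every synthetic event and reading off the PGA at a given exceedance probability, can be sketched for a single site. The attenuation coefficients and the Poisson occurrence assumption below are illustrative placeholders, not the ones used in the paper:

```python
import math

def pga_attenuation(magnitude, distance_km, c0=-1.5, c1=0.5, c2=-1.3):
    """Hypothetical attenuation law: log10(PGA in g) = c0 + c1*M + c2*log10(R).
    The coefficients are placeholders, not the paper's."""
    return 10 ** (c0 + c1 * magnitude + c2 * math.log10(max(distance_km, 1.0)))

def pga_for_exceedance(events, catalog_years, target_prob=0.10, horizon_years=50.0):
    """PGA exceeded with probability `target_prob` within `horizon_years` at one
    site, assuming Poisson occurrence. `events` is a list of (magnitude,
    distance_km) pairs for that site drawn from the synthetic catalog."""
    pgas = sorted((pga_attenuation(m, r) for m, r in events), reverse=True)
    for k, pga in enumerate(pgas):
        # the (k+1)-th largest PGA is exceeded k+1 times in catalog_years
        rate = (k + 1) / catalog_years
        prob = 1.0 - math.exp(-rate * horizon_years)
        if prob >= target_prob:
            return pga
    return 0.0
```

Repeating this at every grid node of the study area yields the hazard map; the long synthetic catalog is what makes the empirical exceedance rates stable.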

2.
The earthquake rupture process generally involves the activity of several faults rather than a single fault. A new method combining fuzzy clustering and principal component analysis makes it possible to reconstruct the three-dimensional structure of the faults involved in an earthquake, provided the aftershocks are distributed uniformly around the active fault planes. Given a set of seismic events, the optimal fault structure can be determined by our new method. Each sub-fault plane is fully characterized by its central location, length, width, strike and dip. The resolution determines the number of fault segments needed to describe the earthquake catalog: the higher the resolution, the finer the structure of the reconstructed fault segments. The new method successfully reconstructs fault segments from synthetic earthquake catalogs. Taking the 28 June 1992 Landers earthquake in southern California as an example, the reconstructed fault segments are consistent with faults already known from geological maps, or with blind faults that appear quite frequently in longer-term catalogs.
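The plane-fitting step that principal component analysis contributes to methods of this kind can be sketched for a single cluster of hypocenters. This is a minimal illustration, not the authors' full algorithm (the fuzzy-clustering stage that separates the clusters is omitted), and the strike/dip output follows the usual right-hand-rule convention:

```python
import numpy as np

def fit_fault_plane(hypocenters):
    """Fit a plane to one cluster of hypocenters (rows of x=east, y=north,
    z=up, any length unit) by PCA: the eigenvector of the covariance matrix
    with the smallest eigenvalue is the plane normal."""
    pts = np.asarray(hypocenters, dtype=float)
    center = pts.mean(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov((pts - center).T))
    normal = eigvecs[:, 0]              # smallest-variance direction
    if normal[2] < 0:                   # make the normal point upward
        normal = -normal
    dip = np.degrees(np.arccos(np.clip(normal[2], -1.0, 1.0)))
    dip_dir = np.degrees(np.arctan2(normal[0], normal[1]))  # azimuth of dip direction
    strike = (dip_dir - 90.0) % 360.0   # right-hand-rule strike
    return center, strike, dip
```

The in-plane eigenvectors and eigenvalues (not returned here) give the orientation and extent needed for the length and width of the rectangular sub-fault.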

3.
We report the analysis of over 16 years of fault creep and seismicity data from part of the creeping section of the San Andreas fault to examine and assess the temporal association between creep events and subsequent earthquakes. The goal is a long-term evaluation of creep events as a potential earthquake precursor. We constructed a catalog of creep events from available digital creepmeter data and compared it to a declustered seismicity catalog for the area between San Juan Bautista and San Benito, California, for 1980 to 1996. For magnitude thresholds of 3.8 and above and time windows of 5 to 10 days, we find relatively high success rates (40% to 55% 'hits') but also very high false alarm rates (generally above 90%). These success rates are statistically significant (0.0007 < P < 0.04). We also tested the actual creep event catalog against two different types of synthetic seismicity catalogs, and found that creep events are followed closely in time by earthquakes from the real catalog far more frequently than the average for the synthetic catalogs, generally by more than two standard deviations. We find no identifiable spatial pattern between the creep events and the earthquakes that are hit or missed. We conclude that there is a significant temporal correlation between creep events and subsequent small to moderate earthquakes, but that additional information (such as from other potential precursory phenomena) is required to reduce the false alarm rate to an acceptable level.
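The hit-rate and false-alarm bookkeeping used in precursor tests of this kind can be sketched as follows. The definitions (a 'hit' is an earthquake preceded by a creep event within the window; a false alarm is a creep event not followed by an earthquake) are our reading of the abstract, not necessarily the paper's exact procedure:

```python
def precursor_scores(creep_times, quake_times, window_days=10.0):
    """Success rate: fraction of earthquakes preceded by a creep event within
    `window_days`. False-alarm rate: fraction of creep events NOT followed by
    an earthquake within the window. All times in days."""
    hits = sum(1 for q in quake_times
               if any(q - window_days <= c < q for c in creep_times))
    alarms_ok = sum(1 for c in creep_times
                    if any(c < q <= c + window_days for q in quake_times))
    success = hits / len(quake_times) if quake_times else 0.0
    false_alarm = 1.0 - (alarms_ok / len(creep_times)) if creep_times else 0.0
    return success, false_alarm
```

Running the same scoring on synthetic (e.g., time-shuffled) quake catalogs, as the paper does, gives the null distribution against which the real success rate is judged.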

4.
The rupture process of a large earthquake generally involves the activity of multiple faults: the seismogenic fault is not a single planar fault but a combination of several fault planes. Using the principle that clustered small earthquakes occur near fault planes, and assuming that hypocenters follow a three-dimensional normal distribution around the center of each sub-fault plane, we present a new method combining GK fuzzy clustering with principal component analysis to reconstruct the three-dimensional structure of an active fault network. The method first applies GK fuzzy clustering to the full hypocenter catalog to obtain its partition matrix; it then uses the partition matrix with a suitable threshold to remove outlier hypocenters and extract planar sub-clusters; finally, for each sub-cluster, under the three-dimensional normal assumption, it determines the location, strike and dip of the 95% confidence rectangular region of the fault-plane distribution. Given the events of an earthquake catalog, the method yields a set of optimal fault-plane regions consistent with the assumptions, each sub-fault characterized by its central location, length, width, strike and dip. To test the new method, computer simulations were performed first; the results show that the algorithm successfully reconstructs the faults underlying a synthetic earthquake catalog. Finally, the method was applied to precisely relocated aftershock data of the Landers earthquake in southern California; the reconstruction agrees well with previously published results, demonstrating the effectiveness of the new method.

5.
A seismically active region is modelled as a system of absolutely rigid blocks separated by infinitely thin plane faults. The interaction of the blocks along the fault planes and with the underlying medium is viscoelastic. The system of blocks moves as a consequence of the prescribed motion of the boundary blocks and of the underlying medium. When the stress on some part of a fault plane exceeds a certain strength level, a stress drop ("a failure") occurs, which can in turn cause failures in other parts of the fault planes. In our model the failures represent earthquakes, and the numerical simulation produces a synthetic earthquake catalog. The procedure is applied to the numerical modelling of the dynamics of a block structure that approximates the tectonic structure of the Vrancea region. The resulting synthetic earthquake catalog has a spatial distribution of epicenters close to the real distribution, and the frequency-magnitude relations (Gutenberg-Richter curves) obtained for the synthetic and real catalogs share some common features.

6.
We employ a computationally efficient fault system earthquake simulator, RSQSim, to explore effects of earthquake nucleation and fault system geometry on earthquake occurrence. The simulations incorporate rate- and state-dependent friction, high-resolution representations of fault systems, and quasi-dynamic rupture propagation. Faults are represented as continuous planar surfaces, surfaces with a random fractal roughness, and discontinuous fractally segmented faults. Simulated earthquake catalogs have up to 10^6 earthquakes that span a magnitude range from ~M4.5 to M8. The seismicity has strong temporal and spatial clustering in the form of foreshocks and aftershocks and occasional large-earthquake pairs. Fault system geometry plays the primary role in establishing the characteristics of stress evolution that control earthquake recurrence statistics. Empirical density distributions of earthquake recurrence times at a specific point on a fault depend strongly on magnitude and take a variety of complex forms that change with position within the fault system. Because fault system geometry is an observable that greatly impacts recurrence statistics, we propose using fault system earthquake simulators to define the empirical probability density distributions for use in regional assessments of earthquake probabilities.

7.
Numerical experiments on methods for estimating the magnitude of completeness from earthquake catalogs
We apply five methods for estimating the magnitude of completeness (Mc) to synthetic earthquake catalogs generated by three different models, and compare their strengths and weaknesses. We find that the median-based analysis of the segment slope (MBASS) is suited to catalogs in which the network's detection capability varies rapidly with magnitude in the incomplete range and is heterogeneous in time, but it requires a large number of events. The Mc-by-b-value-stability approach (MBS) suits catalogs in which detection capability decays slowly with decreasing magnitude, but it is time-consuming. The maximum-curvature technique (MAXC) and the goodness-of-fit test (GFT) both underestimate Mc and need a correction term. The entire-magnitude-range method (EMR) generally gives stable, moderate Mc estimates and is suitable when events are few and a higher tolerance for missed earthquakes is acceptable. We hope this study helps researchers choose the most appropriate Mc estimation method for catalogs of different character, and we point out some pitfalls to avoid when estimating Mc.
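As a minimal illustration of one of the five methods, the maximum-curvature (MAXC) estimate of Mc, together with the empirical correction term the abstract says MAXC needs, can be sketched as:

```python
from collections import Counter

def maxc_mc(magnitudes, bin_width=0.1, correction=0.2):
    """Maximum-curvature (MAXC) estimate of the completeness magnitude Mc:
    the magnitude bin with the most events, plus an empirical correction
    (MAXC alone tends to underestimate Mc). The +0.2 default is a commonly
    used adjustment, not a value taken from this paper."""
    bins = Counter(round(m / bin_width) * bin_width for m in magnitudes)
    peak = max(bins, key=bins.get)      # bin of maximum frequency
    return round(peak + correction, 10)
```

Because the non-cumulative frequency-magnitude distribution of a complete Gutenberg-Richter catalog decays monotonically, its modal bin marks where incompleteness begins, which is why the peak is a natural (if biased-low) Mc estimator.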

8.
Recently the equilibrium property of ergodicity was identified in an earthquake fault system (Tiampo et al., Phys. Rev. Lett. 91, 238501, 2003; Phys. Rev. E 75, 066107, 2007). Ergodicity in this context not only requires that the system is stationary for these networks at the applicable spatial and temporal scales, but also implies that they are in a state of metastable equilibrium, one in which the ensemble averages can be substituted for temporal averages when studying their behavior in space and time. In this work we show that this property can be used to identify those regions of parameter space which are stationary when applied to the seismicity of two naturally-occurring earthquake fault networks. We apply this measure to one particular seismicity-based forecasting tool, the Pattern Informatics index (Tiampo et al., Europhys. Lett. 60, 481–487, 2002; Rundle et al., Proc. National Acad. Sci., U.S.A., Suppl. 1, 99, 2463, 2002), in order to test the hypothesis that the identification of ergodic regions can be used to improve and optimize forecasts that rely on historic seismicity catalogs. We also apply the same measure to synthetic catalogs in order to better understand the physical process that affects this accuracy. We show that, in particular, ergodic regions defined by magnitude and time period provide more reliable forecasts of future events in both natural and synthetic catalogs, and that these improvements can be directly related to specific features or properties of the catalogs that impact the behavior of their spatial and temporal statistics.

9.
Automated rainfall simulator for variable rainfall on urban green areas
Rainfall simulators can enhance our understanding of the hydrologic processes affecting the total runoff to urban drainage systems. This knowledge can be used to improve urban drainage designs. In this study, a rainfall simulator is developed to simulate rainfall on urban green surfaces. The rainfall simulator is controlled by a microcomputer programmed to replicate the temporal variation in rainfall intensity of historical events, as well as synthetic events with constant intensity, on an area of 1 m². The performance of the rainfall simulator is tested under laboratory conditions with regard to the spatial uniformity of the rainfall, the kinetic energy of the raindrops, and the ability to replicate historical and synthetic rainfall events with temporally varying intensity. The rainfall simulator is applied in the field to evaluate its functionality under field conditions and the influence of wind on simulated rainfall. Finally, a field study is carried out on the relationship between runoff, soil volumetric water content, and surface slope. Performance and field tests show that the simulated rainfall has a uniform spatial distribution, whereas the kinetic energy of the raindrops is slightly higher than that of comparable rainfall simulators. The simulator performs best in low wind speed conditions and replicates historical and synthetic rainfall events well, matching both intensity variations and accumulated rainfall depth. The field study shows good correlation between rainfall, runoff, infiltration, soil water content, and surface slope.

10.
We use an efficient earthquake simulator that incorporates rate-state constitutive properties and discretizes the fault surfaces with the boundary element method to generate synthetic earthquakes in a fault system. The rate-and-state seismicity equation is then employed to calculate the seismicity rate in a region of interest using the Coulomb stress transferred from the main shocks in the fault system. The Coulomb stress transfer is obtained by resolving the stresses induced by the fault-patch slips onto optimally oriented fault planes. The example results show that immediately after a main shock the aftershocks concentrate in the vicinity of the rupture area, owing to positive stress transfer, and then disperse into the surrounding region toward the background rate distribution. The number of aftershocks near the rupture region decays with time as the Omori aftershock decay law predicts. These results demonstrate that the rate-and-state fault system earthquake simulator, and the seismicity equations based on rate-state friction nucleation of earthquakes, are well suited to characterizing the aftershock distribution in regional assessments of earthquake probabilities.
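The rate-and-state seismicity response to a Coulomb stress step can be illustrated with Dieterich's (1994) closed-form rate equation, which produces exactly the behavior described above: an elevated rate near positively stressed areas that relaxes, Omori-like, toward the background rate. The parameter values are arbitrary placeholders, not values from the paper:

```python
import math

def seismicity_rate(t, stress_step, background_rate=1.0, a_sigma=0.1, t_a=100.0):
    """Dieterich (1994) seismicity rate after a Coulomb stress step.
    `stress_step` and `a_sigma` (constitutive parameter a times normal stress)
    share units (e.g., MPa); t and the aftershock duration t_a share units
    (e.g., days). R -> background_rate for t >> t_a."""
    gamma = (math.exp(-stress_step / a_sigma) - 1.0) * math.exp(-t / t_a)
    return background_rate / (1.0 + gamma)
```

A positive step elevates the rate (aftershocks), a negative step suppresses it (stress shadow), and both relax to the background over the aftershock duration t_a.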

11.
The numerical modeling of actually observed events contained in the Baikal earthquake catalog allowed the production of synthetic catalogs. These catalogs reflect real situations that can give rise to heterogeneities in earthquake catalogs acquired by seismological monitoring. For each of the resulting 65 catalogs we calculated the slope γ and the seismic activity A10 (in the Russian usage, the seismicity rate, i.e., the constant term in the frequency-size relation) of the recurrence relation. We also investigated the effects of changes in estimates of earthquake energy class on the parameters of the recurrence relation. The slope γ was shown to be relatively insensitive, but the seismic activity A10 sensitive, to changes in earthquake energy class, so A10 can be used to test an earthquake catalog for heterogeneity. Formulas were derived relating the parameters of the recurrence relations to the scaling factors of two energy scales; these yield the recurrence parameters for one scale, provided the ratio of the respective scaling factors and the recurrence parameters for the other scale are known.
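The recurrence relation whose slope γ and activity A10 are discussed above can be fitted, in its simplest form, by least squares over energy-class bins. This is a minimal sketch (at least two occupied bins assumed); the paper's exact binning and fitting procedure may differ:

```python
import math
from collections import Counter

def recurrence_parameters(energy_classes, years):
    """Least-squares fit of the recurrence relation
        log10 N(K) = A10 - gamma * (K - 10),
    where N(K) is the annual event rate in integer energy-class bin K,
    A10 is the seismic activity (rate at K = 10) and gamma is the slope.
    Requires at least two occupied bins."""
    counts = Counter(int(k) for k in energy_classes)
    xs = [k - 10 for k in counts]
    ys = [math.log10(counts[k] / years) for k in counts]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope, my - slope * mx      # gamma, A10
```

Comparing A10 (and γ) across subsets of a catalog is the heterogeneity test the abstract describes: a shift in the energy-class scale moves A10 while leaving γ nearly unchanged.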

12.
A new method for deriving fault parameters from the clustering of earthquake locations
Because of the complexity of geological structure and the heterogeneity of the medium over large regions, the geometry of seismogenic fault planes is generally very complicated. If the rupture process of a large earthquake involves the activity of multiple faults, the seismogenic fault is not a single planar fault but a combination of several fault planes. Using the spatial clustering of earthquake locations, i.e., the assumption that hypocenters cluster near fault planes, we combine a robust expansion algorithm with principal component analysis to develop a new method for reconstructing the three-dimensional structure of an active fault network. The method starts each time from a concentration of hypocenters, expands a sub-fault plane using hypothesis testing, and thereby obtains multiple sub-fault planes. Hypocenters are then reassigned among the sub-fault planes by the criterion that each point belongs to its nearest fault plane; some sub-fault planes are merged or deleted under certain assumptions; finally, the parameters of each sub-fault plane are determined by principal component analysis. From an earthquake catalog, the method thus yields a set of rectangular regions describing the fault-network structure, each rectangular fault plane determined by its position, strike and dip. Computer simulations show that the new method successfully reconstructs the fault planes underlying a synthetic earthquake catalog. Finally, applied to part of the aftershock catalog of the Landers earthquake of 28 June 1992 in southern California, the derived sub-fault parameters agree well with known geological ruptures or blind faults.

13.
Forecasts of future earthquake hazard in the San Francisco Bay region (SFBR) depend on the distribution assumed for the magnitude of future events. Based on the limited observed data, it is not possible to statistically distinguish between many distributions with very different tail behavior, including the modified and truncated Gutenberg-Richter distributions and a composite distribution assembled by the Working Group on California Earthquake Probabilities. There is consequent ambiguity in the estimated probability of very large, and hence damaging, events. A related question is whether the energy released in earthquakes is a small or a large proportion of the energy stored in the crust, corresponding loosely to the ideas of self-organized criticality and intermittent criticality, respectively. The SFBR has experienced three observed accelerating moment release (AMR) cycles, terminating in the 1868 Hayward, 1906 San Andreas and 1989 Loma Prieta events. A simple stochastic model based on elastic rebound has been shown to be capable of producing repeated AMR cycles in large synthetic catalogs. We propose that such catalogs can provide the basis of a test of a given magnitude distribution, via comparisons between the AMR properties of the real and synthetic data. Our results show that the truncated Gutenberg-Richter distribution produces AMR behavior closest to that observed, with the proviso that the magnitude parameters b and m_max are such that a sequence of large events suppressing activity for several centuries is unlikely. Repeated simulation from the stochastic model using such distributions produces 30-year hazard estimates at various magnitudes, which are compared with the estimates of the 2003 Working Group on California Earthquake Probabilities.
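The AMR analyses referred to above rest on the cumulative Benioff strain curve, whose accelerating growth before a large event is fitted with a power law of time-to-failure, ε(t) = A + B(tf − t)^m. A minimal sketch of the strain computation, using the standard Gutenberg-Richter energy-magnitude relation (the power-law fitting step is omitted):

```python
import math

def benioff_strain(magnitudes):
    """Cumulative Benioff strain: running sum of sqrt(seismic energy),
    with energy in joules from log10 E = 1.5*M + 4.8.
    Input magnitudes must be in time order."""
    total, curve = 0.0, []
    for m in magnitudes:
        total += math.sqrt(10 ** (1.5 * m + 4.8))
        curve.append(total)
    return curve
```

Because energy grows steeply with magnitude, the curve is dominated by the larger events, which is what makes its curvature a useful discriminant between magnitude distributions with different tails.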

14.
Waveform cross correlation is an efficient tool for the detection and characterization of seismic signals. For the purposes of the Comprehensive Nuclear-Test-Ban Treaty, cross correlation can globally reduce the detection threshold by 0.3 to 0.4 magnitude units. However, the technique critically depends on the availability of master events. In Part I of this paper, we demonstrated that in seismically active regions the best master events (grand masters), replicated over a regular grid, improve the efficiency of signal detection and event finding. In aseismic areas, there are two approaches to populating the global grid of master events for the International Monitoring System: replication of grand masters, and calculation of synthetic seismograms for master events at the global grid nodes. The efficiency of synthetic templates depends on the accuracy of the shape and amplitude predictions, controlled by focal depth and mechanism, source function, velocity structure, and attenuation along the master/station path. Here we test three focal mechanisms (explosion, thrust fault, and the actual Harvard CMT solution for one of the April 11, 2012 Sumatra aftershocks) and two velocity structures (ak135 and CRUST 2.0). Sixteen synthetic master events were distributed over a 1° × 1° grid covering the zone of aftershocks. We built five cross correlation standard event lists (XSELs) and compared their detections and events with those built using the real and grand master events, as well as with the Reviewed Event Bulletin of the International Data Centre. The XSELs built using an explosion source with ak135, and the reverse fault with an isotropic radiation pattern, demonstrate performance similar to that of the real and grand masters. We have thus shown quantitatively that it is possible to cover all aseismic areas with synthetic masters without significant loss in cross-correlation-based seismic monitoring capability.
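The core detection step, sliding a master-event template along a continuous trace and thresholding the normalized cross-correlation, can be sketched in plain Python. This is an illustration of the generic technique, not the International Data Centre's implementation:

```python
import math

def normalized_xcorr(trace, template):
    """Slide a master-event template along a trace; return the normalized
    cross-correlation coefficient (in [-1, 1]) at each lag."""
    n = len(template)
    t_mean = sum(template) / n
    t0 = [x - t_mean for x in template]
    t_norm = math.sqrt(sum(x * x for x in t0))
    out = []
    for lag in range(len(trace) - n + 1):
        seg = trace[lag:lag + n]
        s_mean = sum(seg) / n
        s0 = [x - s_mean for x in seg]
        s_norm = math.sqrt(sum(x * x for x in s0))
        if s_norm == 0.0 or t_norm == 0.0:
            out.append(0.0)             # flat segment: no correlation defined
        else:
            out.append(sum(a * b for a, b in zip(s0, t0)) / (s_norm * t_norm))
    return out

def detect(trace, template, threshold=0.8):
    """Lags where the correlation exceeds the detection threshold."""
    cc = normalized_xcorr(trace, template)
    return [i for i, c in enumerate(cc) if c >= threshold]
```

Because the coefficient is amplitude-normalized, a weak event with the same waveform shape as the master still correlates near 1, which is why templates lower the detection threshold relative to energy detectors.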

15.
In this work, we review some recent studies of earthquakes that use either real catalogs or synthetic data from model systems. A common feature of all these works is the use of q-statistics as a tool.

16.
Progress in the study of solar sources and their effects on the near-Earth space environment requires that the corresponding information be systematized in databases and catalogs covering the entire observation period of each geoeffective phenomenon and including, as far as possible at the time of compilation, all characteristics of the phenomena themselves and of their sources on the Sun. A uniform presentation of the information as a series of similar catalogs covering long time intervals is of particular importance. The large amount of information collected in such catalogs requires modern methods of organization and presentation that allow navigation between individual parts of a catalog and quick searches for events and their characteristics. This is implemented in the presented Catalog of Solar Proton Events in the 23rd Cycle of Solar Activity, part of a sequence of catalogs (six separate issues) covering the period from 1970 to 2009 (solar cycles 20–23).

17.
Determining the focal mechanism of earthquakes helps us to better define faults and understand the stress regime. This technique can be helpful in the oil and gas industry, where it can be applied to microseismic events. The objective of this paper is to find double couple focal mechanisms, excluding scalar seismic moments, and the depths of small earthquakes using data from relatively few local stations. This objective is met by generating three-component synthetic seismograms to match the observed normalized velocity seismograms. We first calculate Green's functions given an initial estimate of the earthquake's hypocentre, the locations of the seismic recording stations and a 1D velocity model of the region for a series of depths. Then, we calculate the moment tensor for different combinations of strikes, dips and rakes for each depth. These moment tensors are combined with the Green's functions and then convolved with a source time function to produce synthetic seismograms. We use a grid search to find the synthetic seismogram with the largest objective function that best fits all three components of the observed velocity seismogram. These parameters define the focal mechanism solution of an earthquake. We tested the method using three earthquakes in Southern California with moment magnitudes of 5.0, 5.1 and 4.4, using the frequency range 0.1–2.0 Hz. The source mechanisms of the events were determined independently using data from a multitude of stations. Our results, obtained from as few as three stations, generally match those obtained by the Southern California Earthquake Data Center. The main advantage of this method is that we use relatively high-frequency full waveforms, including those from short-period instruments, which makes it possible to find the focal mechanism and depth of earthquakes using as few as three stations when the velocity structure is known.

18.
With dense seismic arrays and advanced imaging methods, regional three-dimensional (3D) Earth models have become more accurate. It is now increasingly feasible and advantageous to use a 3D Earth model to better locate earthquakes and invert their source mechanisms by fitting synthetics to observed waveforms. In this study, we develop an approach to determine both the earthquake location and the source mechanism from waveform information. The observed waveforms are filtered in different frequency bands and separated into windows for the individual phases. Instead of picking arrival times, traveltime differences are measured by cross-correlation between synthetic waveforms based on the 3D Earth model and observed waveforms. The earthquake location is determined by minimizing the cross-correlation traveltime differences. We then fix the horizontal location of the earthquake and perform a grid search in depth to determine the source mechanism at each point by fitting the synthetic and observed waveforms. This new method is verified by a synthetic test with noise added to the synthetic waveforms and a realistic station distribution. We apply the method to a series of MW 3.4–5.6 earthquakes in the Longmenshan fault (LMSF) zone, a region with rugged topography between the eastern margin of the Tibetan Plateau and the western part of the Sichuan Basin. The results show that our solutions improve the waveform fits compared to the source parameters from the catalogs we used, and that the location can be better constrained than with the amplitude-only approach. Furthermore, source solutions computed with realistic topography fit the observed waveforms better than those without, indicating the need to take topography into account in regions of rugged terrain.

19.
Missing early aftershocks following relatively large or moderate earthquakes can significantly bias the analysis of seismic catalogs. In this paper, we systematically address the missing-aftershock problem for five earthquake sequences associated with moderate-size events in inland Japan, using a stochastic replenishing method. The method is based on the notion that if a point process (e.g., an earthquake sequence) with time-independent marks (e.g., magnitudes) is completely observed, it can be transformed into a homogeneous Poisson process by a bi-scale empirical transformation. We use the Japan Meteorological Agency (JMA) earthquake catalog to select the aftershock data and replenish the missing early events using the later, complete part of each aftershock sequence. The time window for each sequence spans from six months before the mainshock to three months after. The semi-automatic spatial selection uses a clustering method for the epicentral selection of earthquakes. Results for the original JMA catalog and the replenished datasets are compared to gain insight into the biases that missing early aftershocks may introduce into the estimation of the Omori-Utsu law parameters, which characterize the decay of aftershock activity with time after the mainshock. We also compared the Omori-Utsu parameter estimates for two datasets following the same mainshock: the first is the replenished sequence, while the second was obtained by waveform-based analysis detecting early aftershocks not recorded in the JMA catalog. Our results demonstrate that the Omori-Utsu parameters estimated from the replenished datasets are robust with respect to the threshold magnitude used. Even with aftershock time windows as short as three days, the replenished datasets provide stable parameter estimates. The p-values for all analyzed sequences are about 1.1, and the c-values are significantly smaller than those of the original datasets. These findings show that the replenishment method is a fast, reliable approach to the missing-aftershock problem.
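The Omori-Utsu law whose parameters (K, c, p) are at issue above can be written as a rate function plus its time integral, which gives the expected number of aftershocks in any window. The parameter values below are placeholders for illustration, not estimates from the paper:

```python
def omori_rate(t, K=100.0, c=0.05, p=1.1):
    """Omori-Utsu aftershock rate n(t) = K / (t + c)**p, t in days after
    the mainshock. Small c means the decay starts almost immediately;
    artificially large c is the classic signature of missing early events."""
    return K / (t + c) ** p

def expected_count(t1, t2, K=100.0, c=0.05, p=1.1):
    """Expected number of aftershocks between t1 and t2: the integral of the
    rate (closed form valid for p != 1)."""
    def F(t):
        return (t + c) ** (1.0 - p) / (1.0 - p)
    return K * (F(t2) - F(t1))
```

Fitting (K, c, p) to a replenished versus an original sequence and comparing the recovered c-values is exactly the kind of comparison the study reports.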

20.
Careful observation has shown that mining-induced seismicity follows a multimodal distribution, which we assume arises from several distinct physical processes. The two major modes, however, arise from seismic events associated in some way with geological features on the one hand, and events associated, among other things, with fracturing in the volume of extreme stress concentration ahead of the stope faces on the other. We call the former "genuine" events and the latter "spurious" events.

Untangling these modes has been a major problem for researchers wishing to work with unimodal seismic catalogs. Partial separation of the genuine events can be obtained by careful selection from a scatter diagram of log (radiated seismic energy) against log (scalar seismic moment), or equivalently by selecting a threshold magnitude from inspection of the Gutenberg-Richter diagram. This threshold is usually considerably greater than the threshold of completeness achievable by modern seismic networks in mines.

The main objective of this paper is to demonstrate that a simple neural network can improve this separation. In this study, for example, simple elimination below the threshold log (scalar seismic moment) = 9.5 left 206 genuine events in the catalog. After running the eliminated events through a trained neural network, an additional 72 genuine events were found, an increase of nearly 35%.

This has important consequences for statistical hazard analysis and for the identification of active geological structures in mines.
