1.
Different time series were constructed from the data set containing all the seismic events recorded by the Parkfield network between 1969 and 1987. These series were analyzed to determine whether an attractor exists in the phase space of the dynamical system characterizing seismic activity and to tentatively establish its dimension. The study has yielded ambiguous results. For all the time series analyzed, the dimension of the attractor appears to be higher than 12, and the correlation function of the seismic time series is indistinguishable from that of a series of random numbers of the same length. The lack of difference between the scaling parameters of the two series suggests that, for all practical purposes, the seismic time series cannot be discriminated from a random series.
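A correlation-function approach of this kind is usually implemented as the Grassberger-Procaccia correlation integral: the scalar series is delay-embedded, the fraction of point pairs closer than a radius r is computed, and the slope of log C(r) versus log r estimates the attractor dimension. The sketch below is a minimal illustration of that standard procedure, not the paper's own code; the embedding delay, embedding dimension, and radii are arbitrary choices.

```python
import numpy as np
from scipy.spatial.distance import pdist

def delay_embed(x, dim, tau):
    """Time-delay embedding of a scalar series into dim-dimensional vectors."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

def correlation_integral(points, radii):
    """Grassberger-Procaccia C(r): fraction of point pairs closer than each radius."""
    d = pdist(points)
    return np.array([(d < r).mean() for r in radii])

def correlation_dimension(x, dim, tau, radii):
    """Estimate the correlation dimension as the slope of log C(r) vs log r."""
    c = correlation_integral(delay_embed(x, dim, tau), radii)
    ok = c > 0
    slope, _ = np.polyfit(np.log(radii[ok]), np.log(c[ok]), 1)
    return slope

# Illustrative use on a random series, the comparison baseline in the abstract.
rng = np.random.default_rng(0)
series = rng.standard_normal(2000)
radii = np.logspace(-0.5, 1.0, 20)
print(correlation_dimension(series, dim=12, tau=1, radii=radii))
```

For a genuinely random series the estimated slope keeps growing as the embedding dimension is increased instead of saturating at a finite value, which is the behaviour the abstract describes for the Parkfield series.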
2.
In this work, we apply the Pattern Informatics technique to one surface expression of the underlying stress field, the seismicity, in order to study the Parkfield–Coalinga interaction in the years preceding the 1983 Coalinga earthquake. We find that significant anomalous seismicity changes occur in this region during the mid-1970s, prior to the Coalinga earthquake, indicating a reduction in the probability of an event at Parkfield while the probability of an event at Coalinga increases. This suggests that one event did not trigger or hinder the other, but rather that the dynamics of the earthquake system are a function of stress-field changes on a larger spatial and temporal scale.
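The Pattern Informatics approach, as described in the literature it builds on, grids the region, treats the normalized seismicity rate in each cell as a time series, and interprets the squared average change in that rate, relative to its spatial mean, as a map of probability change. The sketch below is a simplified illustration of that idea only; the published method's grid size, normalization, and base-time averaging differ in detail, and all names here are hypothetical.

```python
import numpy as np

def pi_change_map(counts, t1, t2):
    """
    Simplified Pattern Informatics-style probability-change map (illustrative only).

    counts : array (n_steps, n_cells), event counts per grid cell and time step.
    t1, t2 : step indices bounding the change interval (0 < t1 < t2).
    """
    changes = []
    for tb in range(t1):                              # average over base times tb < t1
        i1 = counts[tb:t1].mean(axis=0)               # mean rate over [tb, t1)
        i2 = counts[tb:t2].mean(axis=0)               # mean rate over [tb, t2)
        # Normalize each rate map across cells before differencing.
        z1 = (i1 - i1.mean()) / (i1.std() + 1e-12)
        z2 = (i2 - i2.mean()) / (i2.std() + 1e-12)
        changes.append(z2 - z1)
    mean_change = np.mean(changes, axis=0)
    dp = mean_change ** 2                             # squared mean rate change
    return dp - dp.mean()                             # anomaly relative to the spatial mean
```

Cells where the returned value is strongly positive are read as having an increased probability of a future event, and strongly negative cells as a decreased probability, which is the kind of contrast the abstract reports between the Coalinga and Parkfield areas.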
3.
4.
5.
Y. Y. Kagan, Tectonophysics, 1997, 270(3-4): 207-219
This note discusses three interconnected statistical problems concerning the Parkfield sequence of moderate earthquakes and the Parkfield prediction experiment: (a) Is it possible that the quasi-periodic Parkfield sequence of characteristic earthquakes is not an uncommon, specific phenomenon (the research hypothesis), but can instead be explained by preferential selection from available earthquake catalogs? To this end we formulate the null hypothesis (earthquakes occur according to a Poisson process in time and their sizes follow the Gutenberg-Richter relation). We test whether the null hypothesis can be rejected as an explanation for the Parkfield sequence. (b) If the null hypothesis cannot be refuted, what is the probability of a magnitude m ≥ 6 earthquake occurring in the Parkfield region? (c) The direct goal of the Parkfield experiment is the registration of precursory phenomena prior to an m6 earthquake. However, in the absence of the characteristic earthquake, can the experiment resolve, within a reasonable time, which of the two competing hypotheses is true? Statistical analysis is hindered by an insufficiently rigorous definition of the research model and by inadequate or ambiguous data. However, we show that the null hypothesis cannot be decisively rejected. The quasi-periodic pattern of intermediate-size earthquakes in the Parkfield area is a statistical event likely to occur by chance if it has been preferentially selected from available earthquake catalogs. The observed magnitude-frequency curves for small and intermediate earthquakes in the Parkfield area agree with the theoretical distribution computed on the basis of a modified Gutenberg-Richter law (gamma distribution), using deformation rates for the San Andreas fault. We show that the size distribution of the Parkfield characteristic earthquakes can also be attributed to selection bias. According to the null hypothesis, the yearly probability of a m ≥ 6 earthquake originating in the Parkfield area is less than 1%, signifying that several more decades of observation may be needed before the expected event occurs. By its design, the Parkfield experiment cannot be expected to yield statistically significant conclusions on the validity of the research hypothesis for many decades.
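The null-hypothesis probability statement translates into simple Poisson waiting-time arithmetic: with a yearly rate λ, the chance of at least one m ≥ 6 event in T years is 1 − e^(−λT). The numbers below use an illustrative rate of 0.01/yr, consistent with "less than 1% per year" but not a value quoted from the paper.

```python
import numpy as np

# Under the null hypothesis, m >= 6 Parkfield events follow a Poisson process.
# Illustrative yearly rate consistent with "less than 1% per year":
lam = 0.01  # events per year

def prob_at_least_one(rate, years):
    """Probability of at least one event within `years` for a Poisson process of `rate`."""
    return 1.0 - np.exp(-rate * years)

print(prob_at_least_one(lam, 1))    # ~0.01 : one year
print(prob_at_least_one(lam, 30))   # ~0.26 : three decades of monitoring
print(np.log(2) / lam)              # ~69 yr: median waiting time for the next event
```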
6.
The occurrence of the September 28, 2004, Mw = 6.0 mainshock at Parkfield, California, has significantly increased the mean and aperiodicity of the series of time intervals between mainshocks on this segment of the San Andreas fault. We use five different statistical distributions as renewal models to fit this new series and to estimate the time-dependent probability of the next Parkfield mainshock. Three of these distributions (lognormal, gamma, and Weibull) are frequently used in reliability and time-to-failure problems. The other two come from physically based models of earthquake recurrence (the Brownian Passage Time model and the Minimalist model). The differences resulting from these five renewal models are emphasized.
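For any renewal model, the time-dependent probability in question is the conditional probability that the next interevent time falls within a forecast horizon, given the time already elapsed since the last mainshock: P = [F(t + Δt) − F(t)] / [1 − F(t)]. The sketch below evaluates this for lognormal, gamma, Weibull, and Brownian Passage Time (inverse Gaussian) distributions via scipy.stats; the parameter values, elapsed time, and horizon are illustrative only, not the fitted values from the paper, and the simulation-based Minimalist model is omitted because it has no closed-form distribution function.

```python
from scipy import stats

def conditional_prob(dist, elapsed, horizon):
    """
    P(next mainshock within `horizon` years | `elapsed` years since the last one)
    for a frozen scipy.stats renewal distribution `dist`.
    """
    return (dist.cdf(elapsed + horizon) - dist.cdf(elapsed)) / dist.sf(elapsed)

# Illustrative parameters only: a mean recurrence of roughly 25 yr with
# substantial aperiodicity (not the fitted values from the paper).
models = {
    "lognormal": stats.lognorm(s=0.6, scale=25.0),
    "gamma":     stats.gamma(a=3.0, scale=25.0 / 3.0),
    "Weibull":   stats.weibull_min(c=1.5, scale=28.0),
    # Brownian Passage Time = inverse Gaussian; in scipy's parameterization the
    # mean is mu * scale and the coefficient of variation is sqrt(mu), so this
    # gives mean 25 yr and aperiodicity 0.6.
    "BPT":       stats.invgauss(mu=0.36, scale=25.0 / 0.36),
}
for name, dist in models.items():
    print(name, round(conditional_prob(dist, elapsed=10.0, horizon=5.0), 3))
```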