91.
Numerical models are starting to be used to determine the future behaviour of seismic faults and fault networks. Their ultimate goal is to forecast future large earthquakes. To use them for this task, each model must be synchronized with the current state of the actual fault or fault network it simulates (just as meteorologists synchronize their models with the atmosphere by incorporating current atmospheric data). However, lithospheric dynamics is largely unobservable: important parameters cannot (or can only rarely) be measured in nature. Earthquakes, though, provide indirect but measurable clues to the stress and strain state of the lithosphere, which should be helpful for synchronizing the models. The rupture area is one such measurable parameter. Here we explore how it can be used to synchronize fault models with one another and to forecast synthetic earthquakes in a simple but stochastic (random) fault model. By imposing the rupture areas of the synthetic earthquakes of this model on other models, the latter become partially synchronized with the first one. We use these partially synchronized models to successfully forecast most of the largest earthquakes generated by the first model; this forecasting strategy outperforms others that take only the earthquake series into account. Our results suggest that a promising way to synchronize more detailed models with real faults is to force them to reproduce the sequence of previous earthquake ruptures on the faults. This hypothesis could be tested in the future with more detailed models and actual seismic data.
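The synchronization scheme lends itself to a toy demonstration. The sketch below is an invented minimal example, not the model used in the study: a stochastic one-dimensional fault whose rupture areas are imposed on a clone started from different (unknown) stresses, after which the clone issues a crude size alarm before each event. The loading rule, rupture rule, and alarm criterion are all assumptions for illustration.

```python
# Minimal invented sketch: a stochastic 1-D fault, plus a clone partially
# synchronized by imposing on it the rupture areas "observed" in the first
# model. Not the authors' model; every rule below is an illustrative assumption.
import numpy as np

rng = np.random.default_rng(1)
N = 200                               # number of fault cells
LOAD = rng.uniform(0.8, 1.2, size=N)  # per-cell stressing rate (arbitrary units)
PROP = 0.6                            # rupture propagates through cells above this stress

def next_event(stress):
    """Load all cells until the first one reaches the failure threshold (1.0),
    then grow the rupture while neighbouring cells sit above PROP."""
    dt = ((1.0 - stress) / LOAD).min()
    stress += LOAD * dt
    hypo = int(np.argmax(stress))
    a, b = hypo, hypo + 1
    while a > 0 and stress[a - 1] > PROP + rng.uniform(-0.05, 0.05):
        a -= 1
    while b < N and stress[b] > PROP + rng.uniform(-0.05, 0.05):
        b += 1
    return dt, a, b

def apply_rupture(stress, a, b):
    """Drop the ruptured cells to a random residual stress."""
    stress[a:b] = rng.uniform(0.0, 0.2, size=b - a)

def longest_run_above(stress, level):
    """Length of the longest contiguous patch of cells above `level`."""
    best = run = 0
    for s in stress:
        run = run + 1 if s > level else 0
        best = max(best, run)
    return best

real = rng.uniform(0.0, 1.0, size=N)   # the "true" fault (stresses unobservable)
clone = rng.uniform(0.0, 1.0, size=N)  # model started from different stresses

hits = missed = alarms = n_large = 0
for _ in range(5000):
    dt, a, b = next_event(real)
    clone += LOAD * dt                 # clone is loaded for the same elapsed time;
                                       # its cells only reset when an observed rupture covers them
    # Crude forecast issued BEFORE the rupture area is revealed: alarm if the
    # clone already contains a long contiguous patch of highly stressed cells.
    alarm = longest_run_above(clone, PROP) > 40
    large = (b - a) > 40
    alarms += alarm
    if large:
        n_large += 1
        hits += alarm
        missed += not alarm
    # Synchronization step: impose the observed rupture area on the clone.
    apply_rupture(real, a, b)
    apply_rupture(clone, a, b)

print(f"large events: {n_large}, forecast by the clone: {hits}, "
      f"missed: {missed}, total alarms: {alarms}")
```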
92.
The Great Lisbon earthquake has the largest documented felt area of any shallow earthquake and an estimated magnitude of 8.5–9.0. The associated tsunami ravaged the coast of SW Portugal and the Gulf of Cadiz, with reported run-up heights of 5–15 m. Several source regions offshore SW Portugal have been proposed (e.g. the Gorringe Bank and the Marquis de Pombal fault), but no single source appears able to account for the great seismic moment as well as all the historical tsunami amplitude and travel-time observations. A shallow, east-dipping fault plane beneath the Gulf of Cadiz, associated with active subduction beneath Gibraltar, represents a candidate source for the Lisbon earthquake of 1755. Here we consider the fault parameters implied by this hypothesis, with respect to total slip, seismic moment, and recurrence interval, to test the viability of this source. The geometry of the seismogenic zone is obtained from deep crustal studies and can be represented by an east-dipping fault plane with mean dimensions of 180 km (N–S) × 210 km (E–W). For 10 m of co-seismic slip an Mw 8.64 event results, and for 20 m of slip an Mw 8.8 earthquake is generated. Thus, for a convergence rate of about 1 cm/yr, an event of this magnitude could occur every 1000–2000 years. Available kinematic and sedimentological data are in general agreement with such a recurrence interval. Tsunami waveform modeling indicates that a subduction source in the Gulf of Cadiz can partly satisfy the historical observations of wave amplitudes and arrival times, though discrepancies remain for some stations. A macroseismic analysis is performed using site-effect functions calculated from isoseismals of instrumentally recorded strong earthquakes in the region (M7.9 in 1969 and M6.8 in 1964). The resulting synthetic isoseismals for the 1755 event suggest that a subduction source, possibly combined with an additional source at the NW corner of the Gulf of Cadiz, can satisfactorily explain the historically observed seismic intensities. Further studies are needed to sample the turbidites in the adjacent abyssal plains to better document the source region and to calibrate more precisely the chronology of great earthquakes in this region.
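The magnitude and recurrence figures quoted above follow directly from the standard seismic-moment relation. The sketch below reproduces them from the stated fault dimensions and slip; the rigidity of 30 GPa is an assumed typical crustal value, since the abstract does not state which rigidity was used.

```python
# Hedged check of the quoted figures: Mw from seismic moment M0 = mu * A * D
# and recurrence from slip / convergence rate. The 30 GPa rigidity is an
# assumed typical value, not stated in the abstract.
import math

RIGIDITY_PA = 3.0e10               # assumed shear modulus
LENGTH_M, WIDTH_M = 180e3, 210e3   # fault dimensions from the abstract (m)
CONVERGENCE_M_PER_YR = 0.01        # ~1 cm/yr convergence

def moment_magnitude(slip_m: float) -> float:
    """Mw = (2/3) * (log10(M0) - 9.1), with M0 in N*m."""
    m0 = RIGIDITY_PA * LENGTH_M * WIDTH_M * slip_m
    return (2.0 / 3.0) * (math.log10(m0) - 9.1)

def recurrence_years(slip_m: float) -> float:
    """Years needed to accumulate the co-seismic slip at the convergence rate."""
    return slip_m / CONVERGENCE_M_PER_YR

for slip in (10.0, 20.0):
    print(f"slip {slip:4.1f} m -> Mw {moment_magnitude(slip):.2f}, "
          f"recurrence ~{recurrence_years(slip):.0f} yr")
# slip 10.0 m -> Mw 8.64, recurrence ~1000 yr
# slip 20.0 m -> Mw 8.84, recurrence ~2000 yr
```

With these assumptions the 10 m case reproduces the quoted Mw 8.64 exactly, and the 20 m case rounds to the quoted Mw 8.8.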
93.
Chinese Journal of Geochemistry, 2006, 25(B08): 239.
Methylmercury (MeHg) is a powerful neurotoxicant in humans. Hair and blood have long been the biomarkers of choice for MeHg exposure in epidemiological studies: total hair mercury (Hg) content, as well as organic blood Hg concentration, reflects exposure to organic Hg from food consumption. Extensive studies establishing a constant, linear relation between MeHg intake and Hg levels in hair and blood were conducted by government officials to establish guidelines on safe levels of MeHg exposure, which were translated into threshold daily fish consumption rates (usually expressed as μg MeHg per kg body weight). Nowadays, in most epidemiologic studies blood or hair Hg level is commonly used as a valid proxy for human MeHg exposure through fish consumption, without relating this signal to the actual fish consumption patterns of the populations studied. Human variability in mercury toxicokinetics has been identified, and measurement error has been pointed out as a substantial contributor to the observed variability, particularly where dietary information is retrospective and self-reported. However, experimental evidence indicates that significant variability may also exist among individuals in the biokinetics of mercury, and recent findings from population-based studies conducted through the COMERN initiative revealed that MeHg metabolic processes may vary greatly across populations. In fact, it is unlikely that the magnitude of the difference measured between observed and expected Hg levels, given the reported intake, can be entirely explained by laboratory measurement errors or reporting bias.
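The linear intake-to-biomarker relation mentioned above is usually formalized as a one-compartment, steady-state model. The sketch below is a minimal illustration of that idea; the parameter values (absorption fraction, fraction of the absorbed dose in blood, elimination rate, blood volume, hair-to-blood ratio) are typical figures assumed for illustration, not those of any study cited in the abstract.

```python
# Minimal one-compartment, steady-state sketch relating daily MeHg intake to
# blood and hair Hg levels. All parameter values are assumed typical figures,
# used here only to illustrate the linear intake-to-biomarker relation.

ABSORPTION = 0.95            # fraction of ingested MeHg absorbed (assumed)
FRACTION_TO_BLOOD = 0.05     # fraction of the absorbed dose found in blood (assumed)
ELIMINATION_PER_DAY = 0.014  # first-order elimination rate constant (assumed)
BLOOD_VOLUME_L = 5.0         # adult blood volume (assumed)
HAIR_TO_BLOOD = 250.0        # hair (ug/g) to blood (mg/L) ratio (assumed)

def steady_state_blood_hg(intake_ug_per_kg_day: float, body_weight_kg: float = 65.0) -> float:
    """Blood Hg (ug/L) at steady state for a constant daily MeHg intake."""
    intake_ug_day = intake_ug_per_kg_day * body_weight_kg
    return (intake_ug_day * ABSORPTION * FRACTION_TO_BLOOD) / (ELIMINATION_PER_DAY * BLOOD_VOLUME_L)

def hair_hg(blood_ug_per_l: float) -> float:
    """Hair Hg (ug/g) inferred from blood Hg via the assumed hair:blood ratio."""
    return HAIR_TO_BLOOD * blood_ug_per_l / 1000.0   # ug/L -> mg/L before applying the ratio

if __name__ == "__main__":
    intake = 0.1  # ug MeHg per kg body weight per day (illustrative)
    blood = steady_state_blood_hg(intake)
    print(f"blood ~ {blood:.1f} ug/L, hair ~ {hair_hg(blood):.1f} ug/g")
```

Because every term in the model is a constant multiplier, doubling the intake doubles both biomarkers; the inter-individual variability discussed in the abstract corresponds to person-to-person spread in these assumed constants.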
94.
The three most important components of an operational flood warning system are: (1) a rainfall measuring system; (2) a soil moisture updating system; and (3) a surface discharge measuring system. Surface-based networks for these systems are largely inadequate in many parts of the world, and this inadequacy particularly affects the tropics, which are most vulnerable to flooding hazards. Furthermore, the tropical regions comprise developing countries lacking the financial resources for such surface-based monitoring. The heritage of research on evaluating the potential for measuring discharge from space has now evolved into an agenda for a mission dedicated to space-based surface discharge measurements. This mission, together with two other upcoming space-based missions, one for rainfall measurement (Global Precipitation Measurement, GPM) and one for soil moisture measurement (Hydrosphere State, HYDROS), holds promise for designing a fully space-borne system for early warning of floods. Such a system, if operational, stands to offer tremendous socio-economic benefit to many flood-prone developing nations of the tropical world. However, two competing aspects need careful assessment to justify the viability of such a system: (1) cost-effectiveness given surface data scarcity; and (2) flood prediction uncertainty arising from uncertainty in the remote sensing measurements. This paper presents the flood-hazard mitigation opportunities offered by the assimilation of the three proposed space missions within the context of these two competing aspects. The discussion is cast from the perspective of current understanding of the prediction uncertainties associated with space-based flood prediction. A conceptual framework for a fully space-borne system for early warning of floods is proposed, and the need to validate such a system retrospectively on historical data comprising floods and their associated socio-economic impact is stressed. This proposal, if pursued through the wide interdisciplinary effort recommended herein, promises to enhance the utility of the three space missions beyond what their individual agendas can be expected to offer.
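To make the second competing aspect concrete, the sketch below propagates an assumed multiplicative error in satellite rainfall estimates through a very simple runoff-coefficient model by Monte Carlo sampling. Both the 30% rainfall error level and the runoff model are illustrative assumptions, not part of the proposed framework.

```python
# Illustrative Monte Carlo propagation of satellite rainfall error into runoff.
# The lognormal ~30% rainfall error and the crude runoff-coefficient model are
# assumptions for illustration only.
import random
import statistics

RUNOFF_COEFF = 0.6       # fraction of rainfall converted to runoff (assumed)
CATCHMENT_KM2 = 1000.0   # catchment area (assumed)
RAIN_ERROR_SIGMA = 0.3   # ~30% multiplicative error in satellite rainfall (assumed)

def daily_runoff_m3s(rain_mm_per_day: float) -> float:
    """Very crude daily-average runoff from a uniform daily rainfall depth."""
    volume_m3 = rain_mm_per_day / 1000.0 * CATCHMENT_KM2 * 1e6 * RUNOFF_COEFF
    return volume_m3 / 86400.0

def monte_carlo(true_rain_mm: float, n: int = 10000):
    """Spread of predicted runoff when the rainfall input carries multiplicative error."""
    sims = []
    for _ in range(n):
        observed = true_rain_mm * random.lognormvariate(0.0, RAIN_ERROR_SIGMA)
        sims.append(daily_runoff_m3s(observed))
    return statistics.mean(sims), statistics.stdev(sims)

mean_q, sd_q = monte_carlo(true_rain_mm=100.0)
print(f"runoff ~ {mean_q:.0f} +/- {sd_q:.0f} m^3/s "
      f"(error-free value {daily_runoff_m3s(100.0):.0f} m^3/s)")
```

Even this toy setup shows how a modest rainfall measurement error translates into a comparable spread in predicted runoff, which is the kind of prediction uncertainty the paper argues must be assessed before relying on a fully space-borne system.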
95.
The area of Serravalle, located in the northern part of the town of Vittorio Veneto (TV), NE Italy, has been the target of a seismic microzonation campaign. Ten seismic stations were deployed for a seven-month period to record in continuous mode: three were installed on bedrock outcrops and seven on sedimentary sites with variable cover thickness. Spectral analyses were performed on the collected dataset using the Generalized Inversion Technique (GIT, e.g. Andrews, 1986). In particular, spectral ratios were calculated for each station relative to the average of the three reference bedrock sites; these ratios provide quantitative estimates of the seismic motion amplification occurring at each monitored site. Two sites show high amplification values, five times larger than the signal amplitude at the reference sites, at well-defined peak frequencies of 5 Hz. Results for the other stations show smaller site amplifications spread over a broad range of frequencies. The sites where the highest amplifications were recorded all lie on the left bank of the Meschio River and in areas farther away from its outlet into the plain, correlating with the presence of thick layers of Quaternary deposits.
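The reference-site spectral-ratio step described above can be sketched in a few lines: the amplitude spectrum at each sedimentary site is divided by the average spectrum of the bedrock reference stations. The sketch below shows only this ratio step, not the full generalized inversion; window length, tapering, and the synthetic demo traces are assumptions for illustration.

```python
# Minimal sketch of a reference-site spectral ratio: site amplitude spectrum
# divided by the mean amplitude spectrum of the bedrock references. This is
# not the generalized inversion used in the study; windowing and the demo
# signals are illustrative assumptions.
import numpy as np

def amplitude_spectrum(trace, dt):
    """One-sided FFT amplitude spectrum of a demeaned, Hann-windowed trace."""
    trace = trace - trace.mean()
    spec = np.abs(np.fft.rfft(trace * np.hanning(trace.size)))
    freqs = np.fft.rfftfreq(trace.size, d=dt)
    return freqs, spec

def site_to_reference_ratio(site_trace, reference_traces, dt):
    """Spectral ratio of one site relative to the average of the reference sites."""
    freqs, site_spec = amplitude_spectrum(site_trace, dt)
    ref_specs = np.array([amplitude_spectrum(tr, dt)[1] for tr in reference_traces])
    return freqs, site_spec / ref_specs.mean(axis=0)

# Synthetic demo: a "site" record with a 5 Hz resonance versus flat noise references.
rng = np.random.default_rng(0)
dt, n = 0.01, 4096
t = np.arange(n) * dt
refs = [rng.normal(size=n) for _ in range(3)]
site = rng.normal(size=n) + 3.0 * np.sin(2 * np.pi * 5.0 * t)
freqs, ratio = site_to_reference_ratio(site, refs, dt)
print(f"largest spectral ratio near {freqs[np.argmax(ratio)]:.1f} Hz")
```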
96.
This study investigates the extent to which people's views on the causes and preventability of earthquake damage might be influenced by their degree of exposure to the hazard, as well as by the information they have been given about it. The results show that providing hazard zoning information influences judgements on the preventability and causes of damage, but this effect depends on the degree of hazard faced by residents. In low-hazard zones, information leads to the view that causes are manageable, whereas in high-hazard zones information may induce a degree of fatalism. The use of public information in risk management needs to take into account the degree of risk faced by the recipients.
97.
98.
99.
100.
Preparedness is a key dependent variable in many studies examining people's response to disasters such as earthquakes. A feature of many studies on this issue, however, is the lack of attention given to psychometric issues when constructing measures of preparedness. With regard to earthquake preparation, for example, many studies could be greatly improved by the use of a valid and reliable measure of preparedness. This research developed such a measure, assessing both low-level preparedness, such as having an emergency kit, and high-level preparedness, such as altering home structures to mitigate damage. Studies of Wellington (New Zealand) residents, using two samples totalling n = 652, showed that 23 items measuring these different aspects of earthquake preparation could be combined into a reliable, valid, unifactorial scale. This brief scale should have utility in multivariate studies of earthquake preparation, either as a dependent variable, where preparation is the outcome of primary interest, or as one of several independent variables, where preparation and other measures predict another outcome variable.
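One standard reliability check used when building such scales is Cronbach's alpha over the item-by-respondent matrix. The sketch below is a generic illustration with simulated data whose dimensions merely mirror the abstract (23 items, 652 respondents); it is not the analysis actually carried out in the study.

```python
# Illustrative computation of Cronbach's alpha for a k-item preparedness scale.
# The simulated single-factor responses are placeholders; only the alpha
# formula itself is standard:
#   alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: (n_respondents, n_items) matrix of item scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

# Simulated data: 652 respondents, 23 items driven by a single latent factor,
# loosely mirroring the sample size and item count reported in the abstract.
rng = np.random.default_rng(42)
latent = rng.normal(size=(652, 1))
items = 0.7 * latent + rng.normal(size=(652, 23))
print(f"Cronbach's alpha = {cronbach_alpha(items):.2f}")
```

A unifactorial structure of the kind reported would typically also be checked with a factor analysis of the same item matrix; the alpha shown here only addresses internal consistency.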