Showing results 271–280 of 295.
271.
Semantic similarity is central to semantically enabled processing of geospatial data. It is used to measure the degree of potential semantic interoperability between data sets or different geographic information systems (GIS). Similarity is essential for dealing with vague data queries, vague concepts or natural language, and it is the basis for semantic information retrieval and integration. The choice of similarity measure strongly influences the conceptual design and the functionality of a GIS. The goal of this article is to survey theories of semantic similarity measurement and to review how these approaches, originally developed as psychological models to explain human similarity judgment, can be used in geographic information science. According to their knowledge representation and notion of similarity, we classify existing similarity measures into geometric, feature, network, alignment and transformational models. The article reviews each of these models and outlines its notion of similarity and its metric properties. We then evaluate the semantic similarity models with respect to the requirements for semantic similarity measurement between geospatial data. The article concludes by comparing the similarity measures and giving general advice on how to choose an appropriate semantic similarity measure; their advantages and disadvantages point to their suitability for different tasks.
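One of the feature models the survey classifies is Tversky's ratio model, which scores similarity from shared versus distinctive features and, unlike geometric models, need not be symmetric. The sketch below is a minimal illustration of that idea; the example feature sets for geographic concepts are invented for demonstration and are not taken from the article.

```python
def tversky_similarity(a, b, alpha=0.5, beta=0.5):
    """Tversky's ratio model of similarity between two feature sets.

    alpha and beta weight the features distinctive to a and b,
    respectively; with alpha != beta the measure is asymmetric.
    """
    a, b = set(a), set(b)
    common = len(a & b)          # features shared by both concepts
    only_a = len(a - b)          # features only in a
    only_b = len(b - a)          # features only in b
    return common / (common + alpha * only_a + beta * only_b)


# Hypothetical feature sets for two geographic concepts:
river = {"water", "flows", "natural"}
canal = {"water", "flows", "artificial"}
print(tversky_similarity(river, canal))  # symmetric with alpha == beta
```

With alpha=1, beta=0 the score becomes the fraction of the first concept's features matched by the second, so comparing a specific concept to a more general one yields a different value than the reverse comparison.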
272.
273.
Passive acoustic telemetry was used to monitor the movements of cownose rays (Rhinoptera bonasus) within the Caloosahatchee River estuary in Southwest Florida. Twelve rays were tracked within the river between January 2004 and May 2005 for periods of up to 234 days. Linear home range was calculated for all individuals and ranged between 0 and 18.4 km (daily) and 1 and 22.3 km (overall). Ray position within the river was compared to changing water quality parameters throughout the study. Although home range size did not increase with increasing salinity, individuals did occur farther upriver with decreasing flow rates and increasing salinity. No differences were detected between day and night distribution patterns. Movement and presence patterns demonstrated significant use of the estuarine river across all months, indicating that cownose rays in southwest Florida may not undertake the long seasonal migrations established for other parts of their range.
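In river telemetry studies, a linear home range is commonly taken as the span between an animal's most upriver and most downriver detections along the channel axis. The abstract does not give its exact computation, so the sketch below is a plausible minimal version, assuming detections are already reduced to (date, river-kilometre) pairs.

```python
from collections import defaultdict


def linear_home_range(positions_km):
    """Span (km) between the extreme detection positions along the river."""
    return max(positions_km) - min(positions_km)


def daily_linear_ranges(detections):
    """detections: iterable of (date, river_km) pairs.

    Returns a dict mapping each date to that day's linear home range;
    a single detection on a day yields a range of 0.
    """
    by_day = defaultdict(list)
    for day, km in detections:
        by_day[day].append(km)
    return {day: linear_home_range(kms) for day, kms in by_day.items()}
```

An overall range near 22 km with much smaller daily ranges, as reported here, would indicate rays shifting position along the river over weeks rather than traversing it daily.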
274.
For the Tortonian, Steppuhn et al. [Steppuhn, A., Micheels, A., Geiger, G., Mosbrugger, V., 2006. Reconstructing the Late Miocene climate and oceanic heat flux using the AGCM ECHAM4 coupled to a mixed-layer ocean model with adjusted flux correction. Palaeogeography, Palaeoclimatology, Palaeoecology, 238, 399–423] performed a model simulation that considers a generally lower palaeorography, a weaker ocean heat transport and an atmospheric CO2 concentration of 353 ppm. This Tortonian simulation reproduces some realistic trends: the high latitudes are warmer than today and the meridional temperature gradient is reduced. However, the Tortonian run also shows some deficiencies, such as too-cool mid-latitudes, which may be due to an underestimated pCO2 in the atmosphere. As a sensitivity study, we perform a further model experiment in which we additionally increase the atmospheric carbon dioxide concentration (700 ppm). In this CO2 sensitivity experiment we find global warming and a globally more intense water cycle compared with the previous Tortonian run. The high latitudes in particular are warmer in the Tortonian CO2 sensitivity run, which leads to a lower amount of Arctic sea ice and a reduced equator-to-pole temperature difference. Our Tortonian CO2 sensitivity study broadly agrees with results from recent climate model experiments that consider an increase of CO2 during the next century (e.g. [Cubasch, U., Meehl, G.A., Boer, G.J., Stouffer, R.J., Dix, M., Noda, A., Senior, C.A., Raper, S., Yap, K.S., 2001. Projections of Future Climate Change. In: Houghton, J.T., Ding, Y., Griggs, D.J., Noguer, M., van der Linden, P.J., Dai, X., Maskell, K., Johnson, C.A. (eds.), Climate Change 2001: The Scientific Basis. Contribution of Working Group I to the Third Assessment Report of the Intergovernmental Panel on Climate Change. Cambridge University Press, Cambridge, 525–582]), suggesting that the climatic response to a higher atmospheric CO2 concentration is almost independent of the different settings of boundary conditions (Tortonian versus today). To validate the Tortonian model simulations, we perform a quantitative comparison with terrestrial proxy data. This comparison demonstrates that the Tortonian CO2 sensitivity experiment tends to be more realistic than the previous Tortonian simulation by Steppuhn et al. (2006). However, a carbon dioxide concentration as high as 700 ppm is questionable for the Late Miocene, and it cannot explain the shortcomings of our Tortonian run with 'normal' CO2. To fully understand the Late Miocene climate, further model experiments should also consider the palaeovegetation.
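The expected first-order effect of raising CO2 from 353 to 700 ppm can be gauged with the standard simplified logarithmic expression for CO2 radiative forcing, ΔF = 5.35 ln(C/C0) W/m². This formula is not from the paper itself; it is the widely used IPCC-era approximation, shown here only to give a sense of the forcing step between the two experiments.

```python
import math


def co2_forcing(c_ppm, c0_ppm):
    """Simplified logarithmic radiative forcing (W/m^2) for a change in
    atmospheric CO2 from c0_ppm to c_ppm (standard approximation,
    coefficient 5.35 W/m^2)."""
    return 5.35 * math.log(c_ppm / c0_ppm)


# Forcing step between the two Tortonian experiments (353 -> 700 ppm):
delta_f = co2_forcing(700.0, 353.0)
print(f"{delta_f:.2f} W/m^2")
```

The result, roughly 3.7 W/m², is close to the canonical forcing of a CO2 doubling, consistent with the near-doubling from 353 to 700 ppm in the sensitivity run.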
275.
276.
277.
The ShakeMap software automatically generates maps of peak ground motion parameters (shakemaps) and of instrumental intensity soon after an earthquake. Recorded data are fundamental to obtaining accurate results. When observations are not available, ShakeMap relies on ground motion predictive equations, but because of unmodelled site conditions or finite-fault effects, large uncertainties may appear, mainly in the near-source area where damage is relevant. In this paper, we aim to account for source effects in ShakeMap by computing synthetics to be used alongside observations and ground motion predictive equations when near-source data are not available. To be effective, the computation of the synthetics, as well as of the finite fault, should be done in near real time. We therefore computed rapid synthetic seismograms with a stochastic approach, including the main fault features obtained through inversion of regional and teleseismic data. The rapidity of the calculation rests on a number of assumptions and simplifications that need testing before the procedure can run in automatic mode. To assess the performance of our procedure, we carried out a retrospective validation using as a case study the Mw = 6.3 earthquake that occurred in central Italy on April 6, 2009. In that case, the first shakemaps, generated a few minutes after the earthquake, suffered large uncertainties in the ground motion estimates in the area close to the epicenter because of the lack of near-field data. To verify our approach, we recomputed shakemaps for the L'Aquila earthquake, integrating the data available at different elapsed times after the earthquake with synthetics, and we compared our shaking maps with the final shakemap obtained when all the data were available.
Our analysis shows that (1) when near-source data are missing, integrating real data with synthetics reduces the discrepancies between computed and actual ground shaking maps, mainly in the near-field zone where the damage is relevant, and (2) the approach we adopted is promising for reducing such discrepancies and could easily be implemented in ShakeMap, although some a priori calibration is necessary before it runs in automatic mode.
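Stochastic ground-motion simulation of the kind mentioned here typically builds on an omega-squared source spectrum with a Brune corner frequency. The abstract does not detail the authors' implementation, so the sketch below only illustrates those two standard ingredients; the stress-drop value and the moment-magnitude conversion are illustrative assumptions, and constant scaling factors of the full spectrum are omitted.

```python
import math


def brune_corner_frequency(m0_dyne_cm, stress_drop_bars, beta_km_s=3.5):
    """Brune-model corner frequency (Hz):
    fc = 4.9e6 * beta * (stress_drop / M0)^(1/3),
    with M0 in dyne-cm, stress drop in bars, shear velocity in km/s."""
    return 4.9e6 * beta_km_s * (stress_drop_bars / m0_dyne_cm) ** (1.0 / 3.0)


def omega_squared_spectrum(f_hz, m0, fc_hz):
    """Relative omega-squared source acceleration spectrum shape:
    (2*pi*f)^2 * M0 / (1 + (f/fc)^2).  Constant factors (radiation
    pattern, geometric spreading, site terms) are omitted."""
    return (2 * math.pi * f_hz) ** 2 * m0 / (1 + (f_hz / fc_hz) ** 2)


# Illustrative values for an Mw 6.3 event (Hanks-Kanamori conversion,
# assumed 100-bar stress drop):
m0 = 10 ** (1.5 * 6.3 + 16.05)          # seismic moment, dyne-cm
fc = brune_corner_frequency(m0, 100.0)  # corner frequency, Hz
```

Above the corner frequency the acceleration spectrum flattens, which is why the high-frequency shaking that controls peak ground motion is so sensitive to the assumed stress drop.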
278.
The ocean floor is one of the main accumulation sites of marine debris, yet the study of this kind of debris still lags behind that of shorelines. It is therefore necessary to identify the methods used to evaluate this debris and how the results are presented and interpreted. From the available literature on benthic marine debris (26 studies), six sampling methods were recorded: bottom trawl net, sonar, submersible, snorkeling, scuba diving and manta tow. The most frequently used method was the bottom trawl net, followed by the three diving methods. Most of the debris was classified according to its former use, and the results were usually expressed as items per unit area. To facilitate comparisons of contamination levels among sites and regions, some standardization requirements are suggested.
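The "items per unit area" standardization the review calls for is straightforward for trawl surveys: the swept area is the net opening width times the tow distance, and counts are then normalized to a common reference area. The sketch below is a minimal version of that calculation, assuming a hectare (10,000 m²) as the reference area; the example numbers are invented.

```python
def trawl_swept_area_m2(net_width_m, tow_distance_m):
    """Area (m^2) swept by one trawl tow: net opening width x distance."""
    return net_width_m * tow_distance_m


def debris_density(item_count, swept_area_m2, reference_area_m2=10_000):
    """Debris items normalized to a reference area (default: per hectare)."""
    return item_count * reference_area_m2 / swept_area_m2


# Hypothetical tow: 10 m net opening towed for 1 km, recovering 7 items.
area = trawl_swept_area_m2(10.0, 1000.0)   # 10,000 m^2
density = debris_density(7, area)          # items per hectare
```

Reporting in a common unit like items per hectare (or per km²) is what makes trawl, diver-transect and sonar surveys comparable across sites and regions.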
279.
280.