A total of 627 search results were retrieved; the first ten are listed below.
1.
顾吉林  汤宏山  刘淼  耿杨  于月  陶涛 《地理科学》2019,39(3):516-523
Concentration data for the atmospheric pollutants PM2.5, PM10, SO2, NO2, CO and O3 in the Dalian region were statistically analyzed for June-December 2015 and June-December 2016. Aerosol optical depth (AOD) over the Dalian region was retrieved from MODIS data on the ENVI software platform, and regression modeling was used to examine the correlation between AOD and the pollutant concentrations recorded at 10 ground monitoring stations in the region. In the regression models, AOD was taken as the independent variable and the concentrations of PM2.5, PM10, SO2, NO2, CO and O3 as the dependent variables; linear, logarithmic, cubic, power and exponential functions were fitted in SPSS, and the best-fitting model was selected by comparing the goodness of fit R2, in order to assess the feasibility of monitoring air pollution with satellite-retrieved AOD. The results show that the best-fitting models relating AOD to NO2, PM2.5 and PM10 are all cubic, with R2 of 0.685, 0.801 and 0.845, respectively; the best-fitting models for O3 and SO2 are exponential, with R2 of 0.367 and 0.482; and the best-fitting model for CO is logarithmic, with R2 of 0.810. These results provide data for analyzing the sources of atmospheric aerosol pollution and for its control.
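A minimal sketch of the model-comparison step described above, assuming the SPSS workflow is reproduced with SciPy: the five candidate regression forms (linear, logarithmic, cubic, power, exponential) are fitted to AOD-pollutant pairs and ranked by R2. The variable names and data below are hypothetical placeholders, not the Dalian measurements.

```python
# Compare the five candidate regression forms listed in the abstract and
# pick the one with the highest R^2 for a given pollutant against AOD.
import numpy as np
from scipy.optimize import curve_fit

def r_squared(y, y_hat):
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return 1.0 - ss_res / ss_tot

# Candidate functional forms: pollutant concentration as a function of AOD.
models = {
    "linear":      lambda x, a, b:       a * x + b,
    "logarithmic": lambda x, a, b:       a * np.log(x) + b,
    "cubic":       lambda x, a, b, c, d: a * x**3 + b * x**2 + c * x + d,
    "power":       lambda x, a, b:       a * np.power(x, b),
    "exponential": lambda x, a, b:       a * np.exp(b * x),
}

def best_fit(aod, conc):
    """Fit every candidate model and return (name, R^2) of the best one."""
    scores = {}
    for name, f in models.items():
        try:
            params, _ = curve_fit(f, aod, conc, maxfev=10000)
            scores[name] = r_squared(conc, f(aod, *params))
        except RuntimeError:          # fit did not converge
            scores[name] = -np.inf
    best = max(scores, key=scores.get)
    return best, scores[best]

# Example with synthetic data (placeholders, not real station records):
rng = np.random.default_rng(0)
aod = rng.uniform(0.1, 1.5, 200)
pm25 = 40 * aod**3 - 20 * aod**2 + 30 * aod + 5 + rng.normal(0, 3, 200)
print(best_fit(aod, pm25))            # expected to favour the cubic form
```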
2.
Inland water bodies are globally threatened by environmental degradation and climate change. On the other hand, new water bodies can be designed during landscape restoration (e.g. after coal mining). Effective management of new water resources requires continuous monitoring; in situ surveys are, however, extremely time-demanding. Remote sensing has been widely used for identifying water bodies. However, the use of optical imagery is constrained by accuracy problems related to the difficulty in distinguishing water features from other surfaces with low albedo, such as tree shadows. This is especially true when mapping water bodies of different sizes. To address these problems, we evaluated the potential of integrating hyperspectral data with LiDAR (hereinafter the "integrative approach"). The study area consisted of several spoil heaps containing heterogeneous water bodies with a high variability of shape and size. We utilized object-based classification (Support Vector Machine) based on: (i) hyperspectral data; (ii) LiDAR variables; (iii) integration of both datasets. In addition, we classified the hyperspectral data using pixel-based approaches (K-means, spectral angle mapper). Individual approaches (hyperspectral data, LiDAR data and integrative approach) resulted in 2-22.4 % underestimation of the water surface area (i.e., omission error) and 0.4-1.5 % overestimation (i.e., commission error). The integrative approach yielded an improved discrimination of open water surface compared to the other approaches (omission error of 2 % and commission error of 0.4 %). We also evaluated the success of detecting individual ponds; the integrative approach was the only one capable of detecting the water bodies with both omission and commission errors below 10 %. Finally, the assessment of misclassification reasons showed a successful elimination of shadows in the integrative approach. Our findings demonstrate that the integration of hyperspectral and LiDAR data can greatly improve the identification of small water bodies and can be applied in practice to support the mapping of restoration processes.
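As a rough illustration of the integrative object-based SVM step, the sketch below stacks hyperspectral and LiDAR features per object and trains an SVM with scikit-learn; the feature set, object statistics and labels are synthetic placeholders rather than the study's actual data.

```python
# Train an SVM on per-object feature vectors that combine hyperspectral
# statistics with LiDAR-derived variables (the "integrated" feature set).
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
n_objects = 500

# Hypothetical per-object features: mean reflectance of a few hyperspectral
# bands plus LiDAR attributes (e.g. height above ground, return intensity).
hyperspectral = rng.normal(size=(n_objects, 20))
lidar = rng.normal(size=(n_objects, 3))
X = np.hstack([hyperspectral, lidar])      # integrated feature set
y = rng.integers(0, 2, n_objects)          # 1 = water, 0 = non-water (placeholder labels)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10, gamma="scale"))
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```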
3.
Obtaining the depth of closure (DoC) accurately is a fundamental issue in coastal engineering, since good results for coastal structures and beach nourishment depend largely on the DoC. Currently there are two ways of obtaining the DoC: mathematical formulations and profile surveys. However, these methods can incur important errors if the characteristics and morphology of the area are not taken into account, or if a sufficiently long time series is not available. In this work the DoC is obtained from a break in the trend of sediment size with depth: in general, sediment size decreases as depth increases, but at a certain point this tendency reverses and the size increases before decreasing again. Taking the point where the minimum sediment size occurs before this increase as the DoC, the error incurred is small compared with other methods. If the Standard Deviation of Depth Change (SDDC) method is taken as the most accurate, the error incurred by the proposed method is less than 7%. In addition, the DoC obtained with the sediment method always lies outside the zone of bar movement, whereas the profile-survey methods (using profiles with 2 cm precision) sometimes place the DoC within the active zone of bar movement. At the relative minimum of the median sediment size, grain sizes of 0.063 and 0.125 mm predominate in the composition of the sample. Therefore, this new method allows the DoC to be located precisely in a fast and simple way. Furthermore, it has the advantage of not being affected by modifications that the study area and the cross-shore beach profile may undergo.
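The core idea of the proposed sediment-based method can be illustrated with a short sketch: scan the median grain size along a cross-shore profile and take the depth of the first relative minimum (before the size increases again) as the DoC. The profile values below are synthetic, not survey data.

```python
# Locate the DoC as the depth at which median grain size (d50) reaches a
# relative minimum before increasing again along the cross-shore profile.
import numpy as np

def doc_from_grain_size(depth, d50):
    """Return the depth of the first relative minimum of d50.

    depth and d50 must be ordered from shallow to deep.
    """
    for i in range(1, len(d50) - 1):
        if d50[i] < d50[i - 1] and d50[i] < d50[i + 1]:
            return depth[i]
    return None  # no trend reversal found

depth = np.arange(1.0, 12.0, 0.5)             # m below mean sea level (synthetic)
d50 = 0.40 * np.exp(-0.3 * depth) + 0.02      # decreasing size trend (mm)
d50[depth > 7.0] += 0.05                      # size increase beyond ~7 m depth
print("estimated DoC:", doc_from_grain_size(depth, d50), "m")
```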
4.
Machine learning algorithms are an important measure with which to perform landslide susceptibility assessments, but most studies use GIS-based classification methods to conduct susceptibility zonation. This study presents a machine learning approach based on the C5.0 decision tree (DT) model and the K-means cluster algorithm to produce a regional landslide susceptibility map. Yanchang County, a typical landslide-prone area located in northwestern China, was taken as the area of interest to introduce the proposed application procedure. A landslide inventory containing 82 landslides was prepared and subsequently randomly partitioned into two subsets: training data (70% landslide pixels) and validation data (30% landslide pixels). Fourteen landslide influencing factors were considered in the input dataset and were used to calculate the landslide occurrence probability based on the C5.0 decision tree model. Susceptibility zonation was implemented according to the cut-off values calculated by the K-means cluster algorithm. The validation results of the model performance analysis showed that the AUC (area under the receiver operating characteristic (ROC) curve) of the proposed model was the highest, reaching 0.88, compared with traditional models (support vector machine (SVM) = 0.85, Bayesian network (BN) = 0.81, frequency ratio (FR) = 0.75, weight of evidence (WOE) = 0.76). The landslide frequency ratio and frequency density of the high susceptibility zones were 6.76/km2 and 0.88/km2, respectively, which were much higher than those of the low susceptibility zones. The top 20% interval of landslide occurrence probability contained 89% of the historical landslides but only accounted for 10.3% of the total area. Our results indicate that the distribution of high susceptibility zones was more focused without containing more "stable" pixels. Therefore, the obtained susceptibility map is suitable for application to landslide risk management practices.
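A hedged sketch of the probability-plus-clustering workflow: a decision tree predicts a per-pixel landslide occurrence probability and K-means supplies the zonation cut-off values. scikit-learn's CART implementation stands in for the C5.0 model, and the factor values and labels are random placeholders.

```python
# Decision-tree probabilities + K-means cut-offs for susceptibility zonation.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)
n_pixels, n_factors = 5000, 14                 # 14 influencing factors
X = rng.normal(size=(n_pixels, n_factors))
y = rng.integers(0, 2, n_pixels)               # 1 = landslide, 0 = non-landslide

tree = DecisionTreeClassifier(max_depth=8, random_state=0).fit(X, y)
prob = tree.predict_proba(X)[:, 1]             # landslide occurrence probability

# K-means on the probabilities: boundaries between clusters act as cut-offs.
km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(prob.reshape(-1, 1))
centers = np.sort(km.cluster_centers_.ravel())
cutoffs = (centers[:-1] + centers[1:]) / 2     # midpoints between cluster centres
print("susceptibility cut-off values:", np.round(cutoffs, 3))
```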
5.
One important step in binary modeling of environmental problems is the generation of absence datasets, which are traditionally produced by random sampling and can undermine the quality of the outputs. To solve this problem, this study develops the Absence Point Generation (APG) toolbox, a Python-based ArcGIS toolbox for the automated construction of absence datasets for geospatial studies. The APG employs a frequency ratio analysis of four commonly used and important driving factors, namely altitude, slope degree, topographic wetness index, and distance from rivers, and considers the presence-location buffer and density layers to define the low potential or susceptibility zones where absence datasets are generated. To test the APG toolbox, we applied two benchmark algorithms, random forest (RF) and boosted regression trees (BRT), in a case study investigating groundwater potential with three absence datasets, i.e., the APG, random sampling, and the selection of absence samples (SAS) toolbox. BRT-APG and RF-APG had area under the receiver operating curve (AUC) values of 0.947 and 0.942, while BRT and RF performed more weakly with the SAS and random datasets. This corresponds to AUC improvements for BRT and RF of 7.2% and 9.7% over the random dataset, and of 6.1% and 5.4% over the SAS dataset, respectively. The APG also affected the importance of the input factors and the pattern of the groundwater potential maps, which proves the importance of absence points in environmental binary problems. The proposed APG toolbox could easily be applied to other environmental hazards such as landslides, floods, gully erosion, and land subsidence.
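The frequency-ratio idea behind the APG toolbox can be sketched as follows for a single driving factor: classes in which presence points are under-represented relative to their areal share (FR below 1) become candidate zones for absence sampling. The factor values, class breaks and presence locations below are hypothetical, not the toolbox's actual implementation.

```python
# Frequency ratio per factor class = share of presence points in the class
# divided by the share of the study area in the class; low-FR classes are
# flagged as candidate zones for absence-point generation.
import numpy as np

rng = np.random.default_rng(7)
factor = rng.uniform(0, 1000, 100_000)                  # e.g. altitude of each cell (m)
presence_idx = rng.choice(100_000, 300, replace=False)  # presence locations (placeholder)

bins = np.linspace(0, 1000, 6)                          # five altitude classes
area_share = np.histogram(factor, bins=bins)[0] / factor.size
pres_share = np.histogram(factor[presence_idx], bins=bins)[0] / presence_idx.size

frequency_ratio = pres_share / area_share               # FR per class
low_potential = frequency_ratio < 1.0                   # under-represented classes
print("FR per class:", np.round(frequency_ratio, 2))
print("candidate absence classes:", np.where(low_potential)[0])
```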
6.
The 2008 Wenchuan earthquake triggered more than 300 large seismic landslides and caused enormous loss of life and property; studying their mechanisms therefore has theoretical significance as well as practical value for disaster prevention and mitigation. After reviewing existing research on these seismic landslides and carrying out repeated field investigations, we found that many large seismic landslides were accompanied by ground shaking distinct from that of the Wenchuan main shock; we term this landslide ground shaking, or landslide shaking for short. On the basis of the evidence for, origin of, and characteristics of landslide shaking, and according to how the landslide-shaking force combines with the main-shock force and how this combination affects large seismic landslides, the Wenchuan earthquake landslides are divided into three types: main-shock-type, delayed-type and co-shaking-type seismic landslides. Main-shock-type landslides slide before the main shock ends; their dominant destabilizing forces are the main-shock force and gravity, with landslide shaking absent or negligible. Co-shaking-type landslides also slide before the main shock ends, but in addition to the main-shock force and gravity, the landslide-shaking force plays an important role in their failure. Delayed-type landslides slide after the main shock ends, with the landslide-shaking force and gravity as the dominant destabilizing factors. Considering that the ground shaking produced by large seismic landslides is analogous to earthquakes generated by active faults, a method for estimating landslide-shaking acceleration is proposed, and on this basis the initiation and rapid-movement mechanisms of each landslide type are analyzed.
7.
Fossil diatom ooze was discovered in seafloor sediments at a water depth of about 7000 m in the "Jiaolong" manned submersible sea-trial area of the Mariana Trench, the first time China has recovered diatom fossil ooze from such a great depth by shipboard sampling. Laboratory analysis of the diatom fossils shows that the upper ~10 cm of a gravity core consists of Ethmodiscus rex diatom ooze. The fossils are composed mainly of E. rex, occur as fragments in enormous numbers, and are accompanied by tropical pelagic planktonic taxa such as Azpeitia. The development of this diatom ooze indicates that an E. rex bloom event once occurred in the area, which is of considerable significance for paleoceanographic and paleontological research.
8.
Prestack depth imaging of seismic data in complex areas such as salt structures requires extensive velocity model updating. In many cases, salt boundaries can be difficult to identify due to a lack of seismic reflectivity. Traditional amplitude-based segmentation methods do not properly tackle this problem, resulting in extensive manual editing. This paper presents a selection of seismic attributes that can reveal texture differences between the salt diapirs and the surrounding geology, as opposed to amplitude-sensitive attributes that are used in the case of well-defined boundaries. The approach consists of first extracting selected texture attributes, then using these attributes to train a classifier to estimate the probability that each pixel in the data set belongs to one of the following classes: near-horizontal layering, highly dipping areas, and the inside of the salt, which appears more like a low-amplitude area with small variations in texture. To find the border between the inside of the salt and the highly dipping surroundings, the posterior probability of the salt class is input to a graph-cut algorithm that produces a smooth, continuous border. An in-line seismic section and a timeslice from a 3D North Sea data set were employed to test the proposed approach. Comparisons between the automatically segmented salt contours and the corresponding contours as provided by an experienced interpreter showed a high degree of similarity.
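A simplified sketch of the first two steps (texture attributes and class probabilities), assuming GLCM measures and a random forest as generic stand-ins for the attributes and classifier used in the paper; the seismic patches are synthetic placeholders and the final graph-cut smoothing step is omitted.

```python
# Texture attributes per patch (GLCM contrast/homogeneity/energy) feed a
# classifier that outputs the probability of the "salt interior" class.
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(3)

def texture_features(patch, levels=32):
    """GLCM contrast, homogeneity and energy for one quantized image patch."""
    q = (patch * (levels - 1)).astype(np.uint8)
    glcm = graycomatrix(q, distances=[1], angles=[0, np.pi / 2], levels=levels,
                        symmetric=True, normed=True)
    return [graycoprops(glcm, p).mean()
            for p in ("contrast", "homogeneity", "energy")]

# Synthetic training patches for two of the classes named in the abstract:
# smooth, low-variation "salt interior" vs. strongly layered surroundings.
patches, labels = [], []
for _ in range(100):
    patches.append(rng.uniform(0.4, 0.6, (16, 16)))                   # low variation
    labels.append(1)                                                  # 1 = salt
    layered = np.tile(np.linspace(0, 1, 16), (16, 1)).T               # layered texture
    patches.append(np.clip(layered + rng.normal(0, 0.05, (16, 16)), 0, 1))
    labels.append(0)                                                  # 0 = layered

X = np.array([texture_features(p) for p in patches])
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)
print("P(salt) for a smooth test patch:",
      clf.predict_proba([texture_features(rng.uniform(0.4, 0.6, (16, 16)))])[0, 1])
```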
9.
Phytoplankton biomass and primary production were monitored in the Hauraki Gulf and on the northeastern continental shelf, New Zealand, using ship surveys, moored instruments and satellite observations (1998-2001), capturing variability across a range of space and time scales. A depth-integrated primary production model (DIM) was used to predict integrated productivity from surface parameters, enabling region-specific estimates from satellite data. The shelf site was dominated by pico-phytoplankton, with low chlorophyll-a (<1 mg m−3) and annual production (136 g C m−2 yr−1). In contrast, the gulf contained a micro/nano-phytoplankton-dominated community, with relatively high chlorophyll-a (>1 mg m−3) and annual production (178 g C m−2 yr−1). Biomass and productivity responded to physico-chemical factors: a combination of light, critical mixing depths and/or nutrient limitation, particularly new nitrate-N. Relatively low biomass and production were observed during 1999. This coincided with inter-annual variability in the timing and extent of upwelling- and downwelling-favourable along-shelf wind stress, influencing the fluxes of new nitrate-N to the shelf and gulf. Relationships with the Southern Oscillation Index are also discussed. Our multi-scaled sampling highlighted details associated with stratification and de-stratification events, as well as deep sub-surface chlorophyll-a not visible to satellite sensors. This study demonstrates the importance of multi-scaled sampling in obtaining estimates of regional production and its responses to physico-chemical forcing.
10.
Some thoughts on remote sensing monitoring of land cover   (Total citations: 1; self-citations: 0; citations by others: 1)
This paper discusses the prominent problems in land cover remote sensing monitoring at home and abroad. It analyzes how well land cover classification systems fit their intended objectives; summarizes the characteristics and shortcomings of existing classification algorithms and compares the differences and performance of small-scale and large-scale monitoring techniques; examines the application problems addressed by land cover monitoring at different scales and the classification performance under changes of spatial scale; and analyzes the classification accuracy of existing monitoring systems, the sources of their errors, and possible solutions.