A total of 9,696 search results were found (search time: 31 ms).
991.
The geochemistry of lake sediments was used to identify anthropogenic factors influencing aquatic ecosystems of sub-alpine lakes in the western United States during the past century. Sediment cores were recovered from six high-elevation lakes in the central Great Basin of the United States. The proxies used to examine the degree of recent anthropogenic environmental change include spheroidal carbonaceous particles (SCPs), mercury (Hg), and sediment organic content estimated using loss-on-ignition. Chronologies for the sediment cores, developed using ²¹⁰Pb, indicate the cores span the twentieth century. Mercury flux varied between lakes, but all exhibited increasing fluxes during the mid-twentieth century. The mean ratio of modern (post-A.D. 1985) to preindustrial (pre-A.D. 1880) Hg flux was 5.2, which is comparable to the results of previous studies conducted in western North America. Peak SCP flux for all lakes occurred between approximately A.D. 1940 and A.D. 1970, after which time the SCP flux was greatly reduced. The reduction in SCP input is likely due to better controls on combustion sources. Measured Hg concentrations and calculated sedimentation rates suggest atmospheric Hg flux increased in the early 1900s, from A.D. 1920 to A.D. 1990, and at present. Atmospheric deposition is the primary source of the anthropogenic inputs of Hg and SCPs to these high-elevation lakes. The input of SCPs, which is largely driven by regional sources, has declined with the implementation of national pollution control regulations. Mercury deposition in the Great Basin has most likely been influenced more by regional inputs.
992.
A critical component of maintaining biodiversity in fragmented habitats is maintaining connectivity among the usable fragments. Least cost path (LCP) analysis is a tool that can be used to predict the ability of an organism to move from one habitat patch to another, based on geographical features of the landscape and life history traits of the organism. While this analysis has been used for terrestrial habitats, it is rarely applied to aquatic environments. Aquatic hypoxic conditions occur when dissolved oxygen falls below 2 mg/L. These conditions can create barriers in the water column that can either force fish to leave a habitat or avoid that habitat altogether. Using the lower St. Johns River (LSJR) estuary in Florida, USA, as a study system, the ability of an adult silver perch, Bairdiella chrysoura, to escape a large-scale hypoxic event was modeled using a multicriteria LCP approach. Criterion-specific cost grids were constructed based upon current speed, risk of predation, and whether oxygen levels in the habitat area were normoxic (>5.5 mg/L) or hypoxic (<2.0–1.5 mg/L) as a function of water depth for the LSJR. The criterion cost grids were combined using relative weighting to produce the multicriteria cost grid used to implement the LCP analysis. Three origin and destination point locations within the LSJR study area were selected for modeling whether or not a silver perch would be able to escape a hypoxic zone. Since the LCP model will always determine an LCP from the specified origin point location, ecologically relevant swimming capacities for silver perch under normoxic and hypoxic conditions were then applied to assess the model and to determine whether the fish would be able to reach areas unimpacted by hypoxia. The LCP model and the swimming capacity results for this study predict that, under normoxic conditions, fish movement was unimpeded. During the rapidly developing hypoxic event that was modeled, the results from the LCP model indicate that the fish could move outside the hypoxic zone, but when swimming capacities were applied to the model, the silver perch could not escape. Ecologically, the results of this study suggest that silver perch would experience high mortality under a rapidly developing hypoxic event. Additionally, the results of this study indicate that an LCP model can be applied to an aquatic habitat, as long as the cost grids incorporate relevant abiotic and biotic factors.
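A minimal sketch of the multicriteria least-cost-path idea described above: criterion cost grids are combined with relative weights and a least-cost route is traced across the combined surface. The grid values, weights, and the use of scikit-image's route_through_array are illustrative assumptions, not the study's actual GIS workflow or data.

```python
# Minimal least-cost path (LCP) sketch: combine criterion cost grids with
# relative weights, then route between origin and destination cells.
# Grid values, weights, and cell sizes are illustrative, not the study's data.
import numpy as np
from skimage.graph import route_through_array

ny, nx = 50, 80
rng = np.random.default_rng(0)

# Criterion-specific cost grids (higher = more costly to traverse).
cost_current   = rng.uniform(1, 3, (ny, nx))      # swimming against the current
cost_predation = rng.uniform(1, 5, (ny, nx))      # predation risk
cost_oxygen    = np.ones((ny, nx))                # normoxic baseline
cost_oxygen[:, 30:50] = 50.0                      # hypoxic band acts as a near-barrier

# Relative weighting of the criteria (assumed weights).
weights = {"current": 0.3, "predation": 0.2, "oxygen": 0.5}
combined = (weights["current"] * cost_current
            + weights["predation"] * cost_predation
            + weights["oxygen"] * cost_oxygen)

# Least-cost path from an origin cell to a destination cell.
origin, destination = (25, 5), (25, 75)
path, total_cost = route_through_array(combined, origin, destination,
                                       fully_connected=True, geometric=True)
print(f"path length: {len(path)} cells, accumulated cost: {total_cost:.1f}")
```

In the study's terms, the resulting path would then be checked against ecologically relevant swimming capacities to decide whether the fish could actually traverse it before conditions became lethal.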
993.
This study investigates the physical conditions (water depth, current speed, salinity, temperature) in Lianzhou Bay, a shallow coastal bay in southern China, during two expeditions in the dry and wet seasons of 2011. Based on these expedition data, basic hydrodynamic parameters such as the Brunt-Väisälä frequency, Richardson number, Rossby radius, and resonance period are calculated. The results show that Lianzhou Bay is characterized by a comparatively small freshwater input and weak stratification. Strong tides, which are spatially uniform within the bay, cause turbulent mixing. The residence time of the water is shorter in winter due to a stronger coastal current in that season. Taking this water movement into account may help to reduce the harmful ecological impact of aquaculture wastewater discharge.
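For orientation, the parameters named above follow from standard textbook definitions; the short sketch below evaluates them for placeholder values (not the Lianzhou Bay observations): the Brunt-Väisälä frequency N = sqrt((g/ρ0)·Δρ/Δz), the gradient Richardson number Ri = N²/(Δu/Δz)², the barotropic Rossby radius √(gH)/f, and the Merian resonance period 2L/√(gH) of a closed basin.

```python
# Back-of-the-envelope versions of the hydrodynamic parameters named in the
# abstract, using textbook formulas; all input values are placeholders.
import numpy as np

g = 9.81          # gravitational acceleration, m s^-2
rho0 = 1020.0     # reference density, kg m^-3

# Density increase and velocity difference over a layer of thickness dz (assumed).
drho, du, dz = 0.5, 0.2, 5.0          # kg m^-3, m s^-1, m

N2 = (g / rho0) * (drho / dz)         # Brunt-Vaisala frequency squared, s^-2
N = np.sqrt(N2)

Ri = N2 / (du / dz) ** 2              # gradient Richardson number (dimensionless)

H = 10.0                              # water depth, m (assumed)
lat = 21.5                            # approximate latitude, deg
f = 2 * 7.2921e-5 * np.sin(np.radians(lat))   # Coriolis parameter, s^-1
L_ext = np.sqrt(g * H) / f            # barotropic (external) Rossby radius, m

L_basin = 20e3                        # basin length scale, m (assumed)
T_res = 2 * L_basin / np.sqrt(g * H)  # Merian period of a closed-basin seiche, s

print(f"N = {N:.3e} s^-1, Ri = {Ri:.2f}, "
      f"Rossby radius = {L_ext/1e3:.0f} km, resonance period = {T_res/3600:.1f} h")
```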
994.
Contourite drifts of alternating sand and mud, shaped by the Labrador Current, formed during the late Quaternary in Flemish Pass seaward of the Grand Banks of Newfoundland, Canada. The drifts preserve a record of Labrador Current flow variations through the last glacial maximum. A high-resolution seismic profile and a transect of four cores were collected across Beothuk drift on the southeast side of Flemish Pass. Downcore and lateral trends in grain size and sedimentation rate provide evidence that, between 16 and 13 ka, sediment was partitioned across Beothuk drift and the adjacent Flemish Pass floor by a strong current flow but, from 29 to 16 ka, sedimentation was more of a blanketing style, represented by draped reflections interpreted as being due to a weaker current. The data poorly resolve the low sedimentation rates since 13 ka, but the modern Labrador Current in Flemish Pass is the strongest it has been in at least the past 29 ka. Pre-29 ka current flow is interpreted based on reflection architecture in seismic profiles. A prominent drift on the southwestern side of Flemish Pass formed above a mid-Miocene erosion surface, but was buried by a mass-transport deposit after the penultimate glacial maximum and after drift deposition switched to eastern Flemish Pass. These findings illustrate the temporal complexity of drift sedimentation and provide the first detailed proxy for Labrador Current flow since the last glacial maximum.
995.
In Global Navigation Satellite Systems (GNSS) using L-band frequencies, the ionosphere causes signal delays that correspond to link-related range errors of up to 100 m. In a first-order approximation, the range error is proportional to the total electron content (TEC) of the ionosphere. Whereas this first-order range error can be corrected in dual-frequency measurements by a linear combination of the carrier-phase or code ranges of both frequencies, single-frequency users need additional information to mitigate the ionospheric error. This information can be provided by TEC maps deduced from corresponding GNSS measurements or by ionospheric models. In this paper we discuss and compare different ionospheric correction methods for single-frequency users. The focus is on the comparison of the positioning quality using dual-frequency measurements, the Klobuchar model, the NeQuick model, the IGS TEC maps, the Neustrelitz TEC Model (NTCM-GL), and the reconstructed NTCM-GL TEC maps, both provided via the ionosphere data service SWACI (http://swaciweb.dlr.de) in near real time. For that purpose, data from different locations covering several days in 2011 and 2012 are investigated, including periods of quiet and disturbed ionospheric conditions. By applying the NTCM-GL-based corrections instead of the Klobuchar model, positioning accuracy improvements of up to several meters have been found for the European region, depending on the ionospheric conditions. Furthermore, at mid- and low latitudes the NTCM-GL model provides results comparable to NeQuick during the considered time periods. Moreover, in regions with a dense GNSS ground station network the reconstructed NTCM-GL TEC maps are partly at the same level as the final IGS TEC maps.
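As background to the single- versus dual-frequency discussion, the sketch below evaluates the standard first-order ionospheric range error, 40.3·TEC/f² (metres, with TEC in electrons per m²), and shows how the ionosphere-free combination of two code ranges removes it. The TEC value and pseudoranges are assumed numbers, not data from the paper.

```python
# First-order ionospheric range error and the dual-frequency ionosphere-free
# combination, using standard textbook formulas; numbers are illustrative.
f1 = 1575.42e6   # GPS L1 frequency, Hz
f2 = 1227.60e6   # GPS L2 frequency, Hz
TECU = 1.0e16    # electrons per m^2 in one TEC unit

def iono_delay(tec_tecu: float, freq_hz: float) -> float:
    """First-order slant range error in metres: 40.3 * TEC / f^2."""
    return 40.3 * tec_tecu * TECU / freq_hz**2

tec = 50.0  # slant TEC in TECU (assumed value)
d1, d2 = iono_delay(tec, f1), iono_delay(tec, f2)
print(f"L1 delay: {d1:.2f} m, L2 delay: {d2:.2f} m")

# Dual-frequency users remove the first-order term with the ionosphere-free
# linear combination of the code ranges P1, P2 (synthetic pseudoranges here).
rho = 22_000_000.0            # geometric range plus non-dispersive terms, m
P1, P2 = rho + d1, rho + d2
P_if = (f1**2 * P1 - f2**2 * P2) / (f1**2 - f2**2)
# The 1/f^2 term cancels exactly; the residual is at numerical rounding level.
print(f"ionosphere-free combination error: {P_if - rho:.3e} m")
```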
996.
We introduce a probabilistic framework for vulnerability analysis and use it to quantify the current and future vulnerability of the US water supply system. We also determine the contributions of hydro-climatic and socio-economic drivers to the changes in projected vulnerability. For all scenarios and global climate models examined, the US Southwest, including California, and the southern Great Plains were consistently found to be the most vulnerable. For most of the US, the largest contributions to changes in vulnerability come from changes in supply. However, for some areas of the West, changes in vulnerability are caused mainly by changes in demand. These changes in supply and demand result mainly from changes in evapotranspiration rather than from changes in precipitation. Importantly, changes in vulnerability from projected changes in the standard deviations of precipitation and evapotranspiration are of about the same magnitude as, or larger than, those from changes in the corresponding means over most of the US, except in large areas of the Great Plains, in central California, and in southern and central Texas.
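The paper's framework is not reproduced here, but as a rough illustration of why changes in the standard deviation of a driver can matter as much as changes in its mean, the Monte Carlo sketch below treats vulnerability as the probability that supply falls below a fixed demand; the distributions and numbers are assumptions for illustration only, not the authors' model.

```python
# Minimal Monte Carlo illustration (not the paper's framework): vulnerability
# taken as the probability that supply falls below demand, compared for a
# shift in the mean versus a shift in the standard deviation of supply.
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
demand = 80.0  # fixed demand threshold (arbitrary units, assumed)

def vulnerability(mean_supply: float, sd_supply: float) -> float:
    """Fraction of Monte Carlo draws in which supply < demand."""
    supply = rng.normal(mean_supply, sd_supply, n)
    return float((supply < demand).mean())

base       = vulnerability(100.0, 20.0)   # baseline climate (assumed)
mean_shift = vulnerability(95.0, 20.0)    # 5% lower mean supply
sd_shift   = vulnerability(100.0, 26.0)   # 30% larger supply variability

print(f"baseline: {base:.3f}, mean shift: {mean_shift:.3f}, sd shift: {sd_shift:.3f}")
```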
997.
As land use change (LUC), including deforestation, is a patchy process, estimating the impact of LUC on carbon emissions requires spatially accurate underlying data on biomass distribution and change. The methods currently adopted to estimate the spatial variation of above- and below-ground biomass in tropical forests, in particular the Brazilian Amazon, are usually based on remote sensing analyses coupled with field datasets, which tend to be relatively scarce and often limited in their spatial distribution. There are notable differences among the resulting biomass maps found in the literature. These differences subsequently result in relatively high uncertainties in the carbon emissions calculated from land use change, and have a larger impact when biomass maps are coded into biomass classes referring to specific ranges of biomass values. In this paper we analyze the differences among recently published biomass maps of the Amazon region, including the official information used by the Brazilian government for its communication to the United Nations Framework Convention on Climate Change. The estimated average pre-deforestation biomass in the four maps, for the areas of the Amazon region that had been deforested during the 1990–2009 period, varied from 205 ± 32 Mg ha⁻¹ during 1990–1999 to 216 ± 31 Mg ha⁻¹ during 2000–2009. The biomass values of the areas deforested in 2011 were between 7 and 24% higher than for the average deforested areas during 1990–1999, suggesting that, although there was variation in the mean value, deforestation was tending to occur in increasingly carbon-dense areas, with consequences for carbon emissions. To summarize, our key findings were: (i) the current maps of Amazonian biomass show substantial variation in both total biomass and its spatial distribution; (ii) carbon emission estimates from deforestation are highly dependent on the spatial distribution of biomass as determined by any single biomass map, and on the deforestation process itself; (iii) future deforestation in the Brazilian Amazon is likely to affect forests with higher biomass than those deforested in the past, resulting in smaller reductions in carbon dioxide emissions than expected purely from the recent reductions in deforestation rates; and (iv) the current official estimate of carbon emissions from Amazonian deforestation is probably an overestimate, because the recent loss of higher-biomass forests has not been taken into account.
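To make the dependence of emission estimates on the biomass map explicit, the sketch below applies simple bookkeeping: emissions ≈ deforested area × biomass density × carbon fraction × 44/12. Only the two mean biomass densities come from the abstract; the area and the carbon fraction of 0.47 are assumed, and this is not the official Brazilian accounting method.

```python
# Illustrative bookkeeping of how biomass-map differences propagate into
# emission estimates. Only the biomass densities come from the abstract;
# the deforested area and carbon fraction are assumptions.
area_ha = 1.0e6            # deforested area, ha (assumed)
carbon_fraction = 0.47     # Mg C per Mg dry biomass (commonly used value)
co2_per_c = 44.0 / 12.0    # Mg CO2 per Mg C

for label, biomass in [("1990-1999 mean", 205.0), ("2000-2009 mean", 216.0)]:
    emissions = area_ha * biomass * carbon_fraction * co2_per_c  # Mg CO2
    print(f"{label}: {emissions/1e6:.0f} Mt CO2 per 1 Mha deforested")
```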
998.
This study presents the first consolidation of palaeoclimate proxy records from multiple archives to develop statistical rainfall reconstructions for southern Africa covering the last two centuries. State-of-the-art ensemble reconstructions reveal multi-decadal rainfall variability in the summer and winter rainfall zones. A decrease in precipitation amount over time is identified in the summer rainfall zone. No significant change in precipitation amount occurred in the winter rainfall zone, but rainfall variability has increased over time. Generally synchronous rainfall fluctuations between the two zones are identified on decadal scales, with common wet (dry) periods reconstructed around 1890 (1930). A strong relationship between seasonal rainfall and sea surface temperatures (SSTs) in the surrounding oceans is confirmed. Coherence among decadal-scale fluctuations of southern African rainfall, regional SST, SSTs in the Pacific Ocean and rainfall in south-eastern Australia suggests SST-rainfall teleconnections across the southern hemisphere. Temporal breakdowns of the SST-rainfall relationship in the southern African regions and the connection between the two rainfall zones are observed, for example during the 1950s. Our results confirm the complex interplay between large-scale teleconnections, regional SSTs and local effects in modulating multi-decadal southern African rainfall variability over long timescales.
999.
A monthly index based on the persistence of the westerly winds over the English Channel is constructed for 1685–2008 using daily data from ships' logbooks and comprehensive marine meteorological datasets. The so-called Westerly Index (WI) provides the longest instrumental record of atmospheric circulation currently available. Anomalous WI values are associated with spatially coherent climatic signals in temperature and precipitation over large areas of Europe, which are stronger for precipitation than for temperature, and stronger in winter and summer than in the transitional seasons. Overall, the WI series accords with the known European climatic history, and reveals that the frequency of the westerlies in the eastern Atlantic during the twentieth century and the Late Maunder Minimum was not exceptional in the context of the last three centuries. It is shown that the WI provides additional and complementary information to the North Atlantic Oscillation (NAO) indices. The analysis of the WI series during the industrial era indicates an overall good agreement with the winter and high-summer NAO, with the exception of several multidecadal periods of weakened correlation. These decoupled periods between the frequency and the intensity of the zonal flow are interpreted on the basis of several sources of non-stationarity affecting the centres of variability of the North Atlantic and their teleconnections. Comparisons with NAO reconstructions and long instrumental indices extending back to the seventeenth century suggest that similar situations have occurred in the past, which calls for caution when reconstructing the past atmospheric circulation from climatic proxies. The robustness and extent of its climatic signal, the length of the series, and its instrumental nature make the WI an excellent benchmark for proxy calibration in Europe and Greenland.
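As a schematic of a persistence-based index of this kind (not the authors' exact construction from logbook records), the sketch below counts the days per month with wind directions falling in an assumed westerly sector, using synthetic daily data; the sector bounds and data are illustrative assumptions.

```python
# Schematic westerly index: number of days per month with winds from an
# assumed westerly sector. Synthetic data; not the logbook-based WI itself.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
days = pd.date_range("1750-01-01", "1759-12-31", freq="D")
wind_dir = rng.uniform(0, 360, len(days))         # daily wind direction, deg

westerly = (wind_dir >= 225) & (wind_dir <= 315)  # SW-NW sector taken as "westerly"
wi = (pd.Series(westerly, index=days)
        .resample("MS")
        .sum())                                   # westerly days per month

print(wi.head())
```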
1000.
The predictability of the Arctic sea ice is investigated at the interannual time scale using decadal experiments performed within the framework of the fifth phase of the Coupled Model Intercomparison Project with the CNRM-CM5.1 coupled atmosphere–ocean global climate model. The predictability of summer Arctic sea ice extent is found to be weak and not to exceed 2 years. In contrast, robust prognostic potential predictability (PPP) up to several years is found for winter sea ice extent and volume. This predictability is regionally contrasted. The marginal seas in the Atlantic sector and the central Arctic show the highest potential predictability, while the marginal seas in the Pacific sector are barely predictable. The PPP is shown to decrease drastically in the more recent period. Regarding sea ice extent, this decrease is explained by a strong reduction of its natural variability in the Greenland–Iceland–Norwegian Seas due to the quasi-disappearance of the marginal ice zone in the center of the Greenland Sea. In contrast, the decrease of predictability of sea ice volume arises from the combined effect of a reduction of its natural variability and an increase in its chaotic nature. The latter is attributed to a thinning of sea ice cover over the whole Arctic, making it more sensitive to atmospheric fluctuations. In contrast to the PPP assessment, the prediction skill as measured by the anomaly correlation coefficient is found to be mostly due to external forcing. Yet, in agreement with the PPP assessment, a weak added value of the initialization is found in the Atlantic sector. Nevertheless, the trend-independent component of this skill is not statistically significant beyond the forecast range of 3 months. These contrasted findings regarding potential predictability and prediction skill arising from the initialization suggest that substantial improvements can be made in order to enhance the prediction skill.
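The two diagnostics named in this abstract can be illustrated with standard definitions: prognostic potential predictability as PPP = 1 − (ensemble-spread variance)/(climatological variance), and the anomaly correlation coefficient (ACC) of the ensemble mean against a verification series. The sketch below uses synthetic ensemble data and is not the CNRM-CM5.1 analysis itself.

```python
# Sketch of PPP and ACC with standard definitions on synthetic decadal
# hindcast anomalies: shared predictable signal plus member noise.
import numpy as np

rng = np.random.default_rng(2)
n_start, n_member, n_lead = 10, 5, 8     # start dates, members, lead years

signal = rng.normal(0, 1.0, (n_start, 1, n_lead))        # predictable part
noise  = rng.normal(0, 0.5, (n_start, n_member, n_lead))  # member spread
fcst = signal + noise                                     # (start, member, lead)
obs = signal[:, 0, :] + rng.normal(0, 0.3, (n_start, n_lead))

clim_var = fcst.var()                                 # climatological variance
spread_var = fcst.var(axis=1, ddof=1).mean(axis=0)    # mean ensemble spread per lead
ppp = 1.0 - spread_var / clim_var                     # PPP(lead)

ens_mean = fcst.mean(axis=1)                          # (start, lead)
acc = np.array([np.corrcoef(ens_mean[:, k], obs[:, k])[0, 1]
                for k in range(n_lead)])              # ACC(lead)

print("PPP by lead:", np.round(ppp, 2))
print("ACC by lead:", np.round(acc, 2))
```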