961.
The ECOMAN project was initiated in 2001 by the University of Plymouth, UK, Plymouth Marine Laboratory and the Department for Environment, Food and Rural Affairs (DEFRA) to address the need for more pragmatic assessment techniques linking environmental degradation with its causes. The primary aim of the project was to develop an evidence-based approach in which suites of easy-to-use, cost-effective and environmentally valid biological responses (biomarkers) could be used together to assess the health of coastal systems through the general condition of individuals. A range of sub-lethal endpoints, chosen to reflect successive levels of biological organisation (molecular, cellular, physiological), was evaluated in common coastal organisms representing different feeding types (filter feeding, grazing, predation) and habitat requirements (estuary, rocky shore). Initially, the suite of biomarkers was used in laboratory studies to determine the relative sensitivities of key species within different functional groups to common contaminants. These results were then validated in field studies performed in a range of ecosystems exhibiting different degrees and signatures of contamination. Here, an example is provided of a field study in the Humber Estuary, UK, which illustrates how multivariate statistical analysis can be used to identify patterns of response that discriminate between contaminated and clean sites. The use of a holistic, integrated approach of this kind is advocated as a practical means of assessing the impact of chemical contamination on organismal health and of ranking the status of marine ecosystems.
962.
A representative economic model is used to analyze local policies to reestablish full employment. It addresses three types of barriers: (1) rigid wages, (2) occupational (or industrial) immobility, and (3) geographic immobility. These factors are considered in the framework of three types of policies: (1) laissez-faire, (2) narrowly targeted employment incentives, and (3) broadly targeted incentives. The paper concludes that the usefulness of the abstract model will depend upon the ability of local development officials to combine the implications of the model with information about the local economic development narrative. In turn, the narrative depends upon in-depth local knowledge of history, institutions, practices and personalities.
963.
A simple binary mixing model is used to determine the isotopic ratios of lead (Pb) pollution sources to a lake located near a smelter closed because of excessive Pb aerosols (Horseshoe Lake, Madison County, Illinois, USA). As a control, we also examine a relatively unpolluted lake in a rural area of southern Illinois (Horseshoe Lake, Alexander County). Sediment cores were taken from both lakes and analyzed for Pb and Pb isotopes by ICP-MS. The mixing model shows that the Madison County Horseshoe Lake has had three different sources of Pb in its history. The first source is sediment from the Mississippi River with an intermediate 206Pb/207Pb ratio (1.223 ± 0.009), which dominated inputs in pre-settlement times. From 1750 to 1933, the pollution Pb had the high 206Pb/207Pb ratio (1.256 ± 0.005) characteristic of ore from the southeast Missouri Pb mines. The most recently deposited pollution Pb comes from a source with a low 206Pb/207Pb ratio (1.202 ± 0.005). This source is similar in isotopic composition to pollution Pb found by several other investigators in the eastern US and probably represents the mixture of ores used in modern industrial processes. It is unclear from the isotopic composition whether this source at Horseshoe Lake is the local Pb smelter or vehicle exhaust. The sediment core from Horseshoe Lake, Alexander County, shows a less variable isotopic composition. The binary mixing model yielded a source composition of 1.225 ± 0.003 before 1850 and 1.231 ± 0.003 after that date. The change does not indicate a pollution source, but may reflect a shift in the sources of natural sediment, with slightly different isotopic ratios, to the lake. Our results show the value of simple binary mixing models for reconstructing the isotopic composition of Pb sources to lakes.
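A two-endmember mixing calculation of the kind described above can be sketched in a few lines. This is a generic illustration, not the study's code: the endmember 206Pb/207Pb ratios are taken from the abstract, while the sample value and the assumption of comparable Pb concentrations in both end members are hypothetical.

```python
def mixing_fraction(r_sample, r_end1, r_end2):
    """Fraction of end-member 1 in a two-component mixture, assuming
    the two end members contribute comparable Pb concentrations."""
    return (r_sample - r_end2) / (r_end1 - r_end2)

# Ratios from the study: Mississippi River sediment (1.223) and
# southeast Missouri ore (1.256); the sample ratio 1.240 is hypothetical.
f_ore = mixing_fraction(1.240, 1.256, 1.223)  # fraction of ore-derived Pb
```

In practice the mixture is weighted by Pb concentration as well as isotopic ratio, so a full treatment would mix concentration-weighted ratios rather than the ratios alone.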
964.
This paper synthesizes the results from the model intercomparison exercise among regionalized global energy-economy models conducted in the context of the RECIPE project. The economic adjustment effects of long-term climate policy are investigated based on the cross-comparison of the intertemporal optimization models ReMIND-R and WITCH as well as the recursive-dynamic computable general equilibrium model IMACLIM-R. A number of robust findings emerge. If the international community takes immediate action to mitigate climate change, the costs of stabilizing atmospheric CO2 concentrations at 450 ppm (roughly 530–550 ppm-e), discounted at 3%, are estimated at 1.4% or less of global consumption over the twenty-first century. Second-best settings with either a delay in climate policy or restrictions on the deployment of low-carbon technologies can result in substantial increases in mitigation costs. A delay of global climate policy until 2030 would render the 450 ppm target unachievable. Renewables and CCS are found to be the most critical mitigation technologies, and all models project a rapid switch of investments away from freely emitting energy conversion technologies towards renewables, CCS and nuclear. Concerning end-use sectors, the models consistently show an almost full-scale decarbonization of the electricity sector by the middle of the twenty-first century, while the decarbonization of non-electric energy demand, in particular in the transport sector, remains incomplete in all mitigation scenarios. The results suggest that assumptions about low-carbon alternatives for non-electric energy demand are of key importance for the costs and achievability of very low stabilization scenarios.
965.
Combining policies to remove carbon dioxide (CO2) from the atmosphere with policies to reduce emissions could decrease CO2 concentrations faster than possible via natural processes. We model the optimal selection of a dynamic portfolio of abatement, research and development (R&D), and negative emission policies under an exogenous CO2 constraint and with stochastic technological change. We find that near-term abatement is not sensitive to the availability of R&D policies, but the anticipated availability of negative emission strategies can reduce the near-term abatement optimally undertaken to meet 2°C temperature limits. Further, planning to deploy negative emission technologies shifts optimal R&D funding from "carbon-free" technologies into "emission intensity" technologies. Making negative emission strategies available enables an 80% reduction in the cost of keeping year 2100 CO2 concentrations near their current level. However, negative emission strategies are less important if the possibility of tipping points rules out using late-century net negative emissions to temporarily overshoot the CO2 constraint earlier in the century.
966.
Coral reefs and other coastal ecosystems such as seagrasses and mangroves are widely recognized to provide protection against the devastating effects of strong waves associated with tsunamis and storms. The predicted warming climate brings to the fore the role of these ecosystems in providing protection against stronger typhoons that can produce more devastating waves of greater amplitude. We performed a model simulation of storm-generated waves on a Philippine reef located along the path of tropical storms; on average, at least 10 typhoons pass through the study site yearly. A model to simulate wave propagation was developed using the Simulating Waves Nearshore (SWAN) and DELFT3D-WAVE simulation software. Scenarios involving local monsoonal wind forcing and storm conditions were simulated. In addition, as climate change may also result in increased relative sea level, scenarios with 0.3 m and 1 m rises in sea level were also used in the wave model simulations. Results showed that the extensive reef system at the site helped dissipate wave energy, which in turn reduced wave run-up on land. A significant reduction in wave energy was observed in both climate change (stronger wind and higher sea level) and non-climate-change scenarios. The present study was conducted on a reef whose coral cover is in excellent condition (50 to 80% coral cover). Estimates of coral reef growth are of the same order of magnitude as estimates of relative sea level rise based on tide gauge and satellite altimeter data, so the role of reefs in attenuating wave energy may be maintained if coral reef growth can keep up with the change in sea level. Nonetheless, to maintain reef growth, it is imperative to manage coral reef ecosystems sustainably and to eliminate the stressors that are within human control. Minimizing activities such as illegal and destructive blast and poison fishing, pollution and siltation is crucial to reducing the impacts of high-energy waves that may intensify with climate change.
967.
Probabilistic seasonal predictions of rainfall that incorporate proper uncertainties are essential for climate risk management. In this study, three different multi-model ensemble (MME) approaches are used to generate probabilistic seasonal hindcasts of the Indian summer monsoon rainfall based on a set of eight global climate models for the 1982–2009 period. The three MME approaches differ in how they calculate the spread of the forecast distribution, treated as a Gaussian, while all three use the simple multi-model average over each subdivision to define the mean of the forecast distribution. The first two approaches use the within-ensemble spread and the error residuals of the ensemble-mean hindcasts, respectively, to compute the variance of the forecast distribution. The third approach uses the correlation between the ensemble-mean hindcasts and the observations to define the spread via a signal-to-noise ratio. Hindcasts are verified against high-resolution gridded rainfall data from the India Meteorological Department in terms of meteorological subdivision spatial averages. Using the correlation to calculate the spread provides better skill than the other two methods in terms of the rank probability skill score. To further improve skill, an additional method is used to generate multi-model probabilistic predictions based on simple averaging of tercile-category probabilities from the individual models. When this method is used, the skill of the probabilistic forecasts improves compared with defining the forecast mean from the multi-model ensemble mean and then estimating probabilities. Even so, the skill of probabilistic predictions of Indian monsoon rainfall remains low.
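A Gaussian forecast distribution of the kind described above yields tercile-category probabilities directly from the normal CDF. The sketch below is generic, not the study's code: the standardized tercile boundaries (±0.43 for a unit-variance climatology) and the example mean and spread are illustrative values.

```python
import math

def norm_cdf(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def tercile_probs(mu, sigma, t_low, t_high):
    """Below-, near-, and above-normal probabilities for a Gaussian
    forecast with mean mu and spread sigma, given climatological
    tercile boundaries t_low < t_high."""
    p_below = norm_cdf((t_low - mu) / sigma)
    p_above = 1.0 - norm_cdf((t_high - mu) / sigma)
    return p_below, 1.0 - p_below - p_above, p_above

# A wet forecast (mean above the upper tercile boundary) shifts
# probability mass into the above-normal category.
probs = tercile_probs(0.6, 1.0, -0.43, 0.43)
```

In the correlation-based third approach, sigma could plausibly be set to something like the climatological spread times sqrt(1 - r**2), where r is the ensemble-mean/observation correlation, though the abstract does not spell out the exact formula.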
968.
969.
A mechanism contributing to centennial variability of the Atlantic Meridional Overturning Circulation (AMOC) is tested with multi-millennial control simulations of several coupled general circulation models (CGCMs). These are a substantially extended integration of the 3rd Hadley Centre Coupled Climate Model (HadCM3), the Kiel Climate Model (KCM), and the Max Planck Institute Earth System Model (MPI-ESM). Significant AMOC variability on time scales of around 100 years is simulated in these models. The centennial mechanism links changes in the strength of the AMOC with oceanic salinities and surface temperatures, and with atmospheric phenomena such as the Intertropical Convergence Zone (ITCZ). Two of the three models reproduce all aspects of the mechanism, with the third (MPI-ESM) reproducing most of them. A comparison with a high-resolution paleo-proxy for sea surface temperatures (SSTs) north of Iceland over the last 4,000 years, also linked to the ITCZ, suggests that elements of this mechanism may also be detectable in the real world.
970.
The TerraSAR-X (TSX) synthetic aperture radar (SAR) marks the recent emergence of a new generation of spaceborne radar sensors that can for the first time lay claim to localization accuracies in the sub-meter range. The TSX platform’s extremely high orbital stability and the sensor’s hardware timing accuracy combine to enable direct measurements of atmospheric refraction and solid Earth movements. By modeling these effects for individual TSX acquisitions, absolute pixel geolocation accuracy on the order of several centimeters can be achieved without need for even a single tiepoint. A 16-month time series of images was obtained over a fixed test site, making it possible to validate both an atmospheric refraction and a solid Earth tide model, while at the same time establishing the instrument’s long-term stability. These related goals were achieved by placing trihedral corner reflectors (CRs) at the test site and estimating their phase centers with centimeter-level accuracy using differential GPS (DGPS). Oriented in pairs toward a given satellite track, the CRs could be seen as bright “points” in the images, providing a geometric reference set. SAR images from the high-resolution spotlight (HS) mode were obtained in alternating ascending and descending orbit configurations. The highest-resolution products were selected for their small sample dimensions, as positions can be more precisely determined. Based on the delivered product annotations, the CR image positions were predicted, and these predictions were compared with their measured image positions both before and after compensation for atmospheric refraction and systematic solid Earth deviations. It was possible to show that when the atmospheric distortion and Earth tides are taken into account, the TSX HS products have geolocation accuracies far exceeding the specified requirements. Furthermore, this accuracy was maintained for the duration of the 16-month test period. 
It was thus demonstrated that with a correctly calibrated sensor, and after accounting for atmospheric and tidal effects, tiepoint-free geolocation is possible with TSX with an absolute product accuracy of about 5 cm.
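To first order, the atmospheric correction described above amounts to removing a tropospheric path delay mapped from zenith to the radar line of sight. The sketch below uses a simple flat-atmosphere 1/cos mapping assumption; the zenith delay value and incidence angle are illustrative, not taken from the TSX products.

```python
import math

def slant_delay(zenith_delay_m, incidence_deg):
    """Map a zenith tropospheric delay (meters) to the radar slant range
    using a 1/cos mapping function (flat-atmosphere approximation)."""
    return zenith_delay_m / math.cos(math.radians(incidence_deg))

# A typical zenith delay of roughly 2.4 m, viewed at 35 degrees incidence,
# stretches the apparent slant range by close to 3 m. Correcting for this
# (plus solid Earth tides) is what brings geolocation to the cm level.
extra_range_m = slant_delay(2.4, 35.0)
```

More rigorous processing would split the delay into hydrostatic and wet components with dedicated mapping functions, but the 1/cos form captures the geometry.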