Access type:
  Fee-based full text     6131 articles
  Free                    581 articles
  Free within China       161 articles
By subject area:
  Surveying and Mapping   250 articles
  Atmospheric Sciences    654 articles
  Geophysics              2184 articles
  Geology                 2358 articles
  Oceanography            349 articles
  Astronomy               524 articles
  Multidisciplinary       196 articles
  Physical Geography      358 articles
By publication year (number of articles):
  2021: 23    2020: 15    2019: 17    2018: 447   2017: 390
  2016: 261   2015: 163   2014: 136   2013: 177   2012: 674
  2011: 455   2010: 153   2009: 168   2008: 158   2007: 148
  2006: 162   2005: 879   2004: 909   2003: 676   2002: 197
  2001: 89    2000: 71    1999: 38    1998: 30    1997: 39
  1996: 23    1995: 20    1994: 21    1993: 15    1992: 15
  1991: 21    1990: 24    1989: 16    1988: 9     1987: 19
  1986: 9     1985: 17    1984: 15    1983: 14    1982: 15
  1981: 17    1980: 12    1979: 9     1978: 10    1977: 9
  1976: 9     1975: 13    1974: 6     1973: 10    1972: 6
A total of 6873 results were retrieved (search time: 15 ms).
961.
We present a web application named Let-It-Rain that generates 1-h temporal resolution synthetic rainfall time series using the modified Bartlett–Lewis rectangular pulse (MBLRP) model, a type of Poisson stochastic rainfall generator. Let-It-Rain, which can be accessed at http://www.LetItRain.info, adopts a web-based framework that combines ArcGIS Server on the server side for parameter value dissemination with JavaScript on the client side to implement the MBLRP model. This enables any desktop or mobile user with internet access and a web browser to obtain a synthetic rainfall time series, with only a few mouse clicks, at any location for which the parameter regionalization work has been completed (currently the contiguous United States and the Republic of Korea). Let-It-Rain shows satisfactory performance in reproducing the observed rainfall mean, variance, autocorrelation, and probability of zero rainfall at hourly through daily accumulation levels. It also performs reasonably well in reproducing watershed runoff depth and peak flow. We expect that Let-It-Rain can stimulate uncertainty analysis of hydrologic variables across the world.
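As a hedged illustration of the Poisson-cluster family to which the MBLRP model belongs, the sketch below simulates the classical Bartlett–Lewis rectangular-pulse structure (Poisson storm arrivals, Poisson cell arrivals within a storm-activity window, exponential cell durations and intensities) and aggregates it to an hourly series. It omits the parameter randomization that defines the "modified" model behind Let-It-Rain, and all parameter values are placeholders rather than regionalized values.

```python
import numpy as np

rng = np.random.default_rng(42)

def blrp_hourly_series(hours=24 * 365,
                       storm_rate=0.02,     # storms per hour, placeholder
                       cell_rate=0.4,       # cell arrivals per hour within a storm, placeholder
                       storm_duration=10.0, # mean storm-activity window (h), placeholder
                       cell_duration=2.0,   # mean cell duration (h), placeholder
                       cell_intensity=3.0): # mean cell intensity (mm/h), placeholder
    """Toy Bartlett-Lewis rectangular-pulse generator aggregated to an hourly series."""
    rain = np.zeros(hours)
    t = rng.exponential(1 / storm_rate)
    while t < hours:
        window = rng.exponential(storm_duration)
        # cell origins: one at the storm origin, then Poisson arrivals inside the window
        origins = [t] + list(t + np.cumsum(rng.exponential(1 / cell_rate, size=50)))
        for start in origins:
            if start > t + window:
                break
            dur = rng.exponential(cell_duration)
            inten = rng.exponential(cell_intensity)
            lo, hi = int(start), int(min(start + dur, hours))
            # deposit the rectangular pulse onto whole hours (coarse, but adequate for a sketch)
            for h in range(lo, min(hi + 1, hours)):
                overlap = min(start + dur, h + 1) - max(start, h)
                if overlap > 0:
                    rain[h] += inten * overlap
        t += rng.exponential(1 / storm_rate)  # next storm origin
    return rain

series = blrp_hourly_series()
print(f"mean = {series.mean():.3f} mm/h, P(zero) = {np.mean(series == 0):.2f}")
```

Calibrating such a generator so that the synthetic series reproduces the observed mean, variance, autocorrelation, and probability of zero rainfall is exactly the regionalized parameter-estimation step that Let-It-Rain serves from ArcGIS Server.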
962.
Extreme flood events have detrimental effects on society, the economy and the environment. Widespread flooding across South East Queensland in 2011 and 2013 resulted in the loss of lives and significant cost to the economy. In this region, flood risk planning and the use of traditional flood frequency analysis (FFA) to estimate both the magnitude and frequency of the 1-in-100 year flood are severely limited by short gauging station records, which average 42 years in length in Eastern Australia and often poorly represent extreme flood events. The main aim of this study is to test an alternative method of estimating flood frequency, the Probabilistic Regional Envelope Curve (PREC) approach, which integrates additional spatial information on extreme flood events. To better define and constrain a working definition of an extreme flood, an Australian Envelope Curve is also produced from the available gauging station data. Results indicate that the PREC method produces significant changes at the larger recurrence intervals (≥100 years) for gauges with either too few or too many extreme flood events. A decision-making process is provided to ascertain when this method is preferable for FFA.
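The full PREC procedure (regional pooling and the assignment of exceedance probabilities to the envelope) is not reproduced here; the sketch below only illustrates the envelope-curve idea underlying it, fitting an upper-bounding relation Q = C·A^b to hypothetical record floods. The station data and resulting coefficients are placeholders, not Australian records.

```python
import numpy as np

# Hypothetical gauging-station records: (catchment area km^2, record peak discharge m^3/s)
area = np.array([120.0, 540.0, 2300.0, 8900.0, 15200.0, 40100.0])
peak = np.array([310.0, 980.0, 2600.0, 5400.0, 7100.0, 11800.0])

# Classical envelope form: Q = C * A^b  =>  log Q = log C + b log A.
# Fit a slope to the record floods, then raise the intercept so the line
# bounds every observation from above (a simple regional envelope).
log_a, log_q = np.log10(area), np.log10(peak)
b = np.polyfit(log_a, log_q, 1)[0]
log_c = np.max(log_q - b * log_a)          # shift intercept up to the binding station

def envelope_peak(catchment_area_km2):
    """Upper-bound peak discharge implied by the fitted envelope curve."""
    return 10 ** (log_c + b * np.log10(catchment_area_km2))

print(f"envelope: Q = {10 ** log_c:.2f} * A^{b:.2f}")
print(f"envelope peak for a 5000 km^2 catchment: {envelope_peak(5000):.0f} m^3/s")
```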
963.
In flood frequency analysis, a suitable probability distribution function is required to establish the flood magnitude–return period relationship. Goodness-of-fit (GOF) techniques are often employed to select a suitable distribution function in this context, but they have frequently been criticized for their inability to discriminate between statistical distributions for the same application. This paper investigates the potential utility of subsampling, a resampling technique, combined with a GOF test to select the best distribution for frequency analysis. The performance of the methodology is assessed by applying it to observed and simulated annual maximum (AM) discharge data series. Several AM series of different record lengths are used as case studies, and numerical analyses are carried out to assess performance in terms of sample size, subsample size and the statistical properties inherent in the AM data series. The proposed methodology is also compared with the standard Anderson–Darling (AD) test. It is found that the methodology is best suited to longer data series, and that performance improves when the subsample size is about half of the underlying data sample. The methodology also outperforms the standard AD test in effectively discriminating between distributions. Overall, the results indicate that the subsampling technique can be a promising tool for discriminating between distributions.
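A minimal sketch of one way such a subsampling-plus-GOF selection could be wired up, under assumptions not stated in the abstract: each candidate is fitted to the full record, repeatedly scored on random half-size subsamples, and ranked by how often it yields the smallest statistic. Because scipy's anderson() supports only a few distributions, a Cramér–von Mises statistic is used here as a stand-in for an Anderson–Darling-type measure; the AM series is synthetic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic annual-maximum discharge series standing in for an observed record (m^3/s)
am_series = stats.genextreme.rvs(c=-0.1, loc=800, scale=250, size=60, random_state=rng)

candidates = {
    "gumbel":   stats.gumbel_r,
    "gev":      stats.genextreme,
    "lognorm":  stats.lognorm,
    "pearson3": stats.pearson3,
}

def rank_by_subsampling(data, dists, n_sub=200, frac=0.5):
    """Fit each candidate once on the full record, then count how often it gives the
    smallest goodness-of-fit statistic on random subsamples drawn without replacement."""
    m = max(10, int(frac * len(data)))
    fitted = {name: dist.fit(data) for name, dist in dists.items()}
    wins = dict.fromkeys(dists, 0)
    for _ in range(n_sub):
        sub = rng.choice(data, size=m, replace=False)
        # Cramer-von Mises statistic as a stand-in for an Anderson-Darling-type measure
        scores = {name: stats.cramervonmises(sub, dists[name].cdf, args=params).statistic
                  for name, params in fitted.items()}
        wins[min(scores, key=scores.get)] += 1
    return wins

print(rank_by_subsampling(am_series, candidates))
```

The "count of winning subsamples" criterion is an illustrative scoring rule, not necessarily the one used in the paper.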
964.
We propose a stochastic methodology for assessing the risk of a large earthquake when a long time has elapsed since the last large seismic event. We state an approximate probability distribution for the occurrence time of the next large earthquake, given that the last large seismic event occurred a long time ago. We prove that, under reasonable conditions, this distribution is exponential with a rate that depends on the asymptotic slope of the cumulative intensity function of a nonhomogeneous Poisson process. Because it is not possible to obtain an empirical cumulative distribution function of the waiting time for the next large earthquake, an estimator of its cumulative distribution function based on existing data is derived. We conduct a simulation study to identify scenarios in which the proposed methodology performs well. Finally, a real-world data analysis is carried out to illustrate its potential applications, including a homogeneity test for the times between earthquakes.
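The key step can be sketched as follows, consistent with (but not a reproduction of) the authors' derivation: for a nonhomogeneous Poisson process the survival function of the waiting time from an elapsed time s is governed by the increment of the cumulative intensity, and an asymptotically linear cumulative intensity yields the exponential limit stated in the abstract.

```latex
% Sketch: waiting time T_s to the next large event, given no event up to a (large)
% elapsed time s, for a nonhomogeneous Poisson process with intensity \lambda(u)
% and cumulative intensity \Lambda(u) = \int_0^u \lambda(v)\,dv.
\[
  P(T_s > t) \;=\; \exp\!\bigl[-\bigl(\Lambda(s+t) - \Lambda(s)\bigr)\bigr].
\]
% If \lambda(u) \to \lambda_\infty as u \to \infty (so that \Lambda has asymptotic
% slope \lambda_\infty), then \Lambda(s+t) - \Lambda(s) \to \lambda_\infty\, t and
\[
  P(T_s > t) \;\xrightarrow[\;s \to \infty\;]{}\; e^{-\lambda_\infty t},
\]
% i.e. the waiting time is approximately exponential with rate \lambda_\infty.
```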
965.
Changing climate and precipitation patterns make the estimation of precipitation, which exhibits two-dimensional and sometimes chaotic behavior, more challenging. In recent decades, numerous data-driven methods have been developed and applied to estimate precipitation; however, these methods rely on one-dimensional approaches, lack generality, require neighboring stations, and have low sensitivity. This paper aims to implement the first generally applicable, highly sensitive two-dimensional data-driven model of precipitation. The model, named frequency-based imputation (FBI), relies on non-continuous monthly precipitation time series data. It requires no input parameters and no data preprocessing, and it provides multiple estimates (from the most to the least probable) of each missing data unit using the series itself. A total of 34,330 monthly total precipitation observations from 70 stations in 21 basins within Turkey were used to assess the method by removing and re-estimating observation series in annual increments. Comparisons with expectation maximization and multiple linear regression models illustrate that the FBI method is superior in estimating monthly precipitation. This paper also provides a link to the software code for the FBI method.
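The abstract does not spell out the FBI algorithm, so the following is only a hedged interpretation of a "frequency based" idea: estimate a missing month from the empirical frequency distribution of the same calendar month elsewhere in the series, returning candidate values ordered from the most to the least populated histogram bin. Function names, bin counts, and data are illustrative, not the published method.

```python
import numpy as np

def frequency_based_candidates(monthly, month_index, n_bins=8):
    """Illustrative frequency-based imputation: for a missing value in a given calendar
    month, return bin-centre estimates ordered from the most to the least populated
    histogram bin of that month's historical values (an interpretation of the idea,
    not the published FBI algorithm)."""
    same_month = monthly[month_index::12]
    same_month = same_month[~np.isnan(same_month)]
    counts, edges = np.histogram(same_month, bins=n_bins)
    centres = 0.5 * (edges[:-1] + edges[1:])
    order = np.argsort(counts)[::-1]                  # most probable bins first
    return centres[order[counts[order] > 0]]

# 20 hypothetical years of monthly totals (mm) with a gap in January of year 5
rng = np.random.default_rng(1)
series = rng.gamma(shape=2.0, scale=40.0, size=20 * 12)
series[4 * 12 + 0] = np.nan
print(frequency_based_candidates(series, month_index=0)[:3])   # top-3 candidate estimates
```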
966.
The highest seismic activity in Vietnam is observed in the northwest of the country, hence the practical significance of a more accurate assessment of the earthquake hazard for this area. Worldwide experience of seismicity, in particular the recent Tohoku mega-earthquake (March 11, 2011, Mw = 9.0, Japan), shows that instrumental and historical data alone are insufficient to reliably estimate earthquake hazard. This is all the more relevant for Vietnam, where the period of instrumental observation is short and historical evidence is nearly absent. We therefore attempted to construct maps of earthquake hazard based on known seismicity, using the available geological and geophysical data and the method of G.I. Reisner and his associates for classifying areas by seismic potential. Since the question of which geological and geophysical parameters should be used, and with what weights, remains unresolved, we developed a program package to estimate Mmax under different options for the use of geological and geophysical data. In this paper we discuss the first results and the promise held by this program package.
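The Reisner classification itself is not reproduced here (its choice of parameters and weights is exactly the open question the abstract mentions); the sketch below only illustrates the generic idea of scoring map cells with weighted, normalized geological and geophysical indicators and mapping the composite score to Mmax classes. Every indicator name, weight, and class boundary is hypothetical.

```python
import numpy as np

# Hypothetical normalized indicators on a small grid (scaled to 0..1):
# e.g. fault density, crustal thickness anomaly, heat flow, gravity gradient.
rng = np.random.default_rng(7)
indicators = {name: rng.random((4, 5)) for name in
              ["fault_density", "crust_thickness", "heat_flow", "gravity_gradient"]}
weights = {"fault_density": 0.4, "crust_thickness": 0.2,
           "heat_flow": 0.2, "gravity_gradient": 0.2}       # placeholder weights

score = sum(w * indicators[k] for k, w in weights.items())  # weighted composite score

# Map the composite score to Mmax classes (boundaries are illustrative only).
class_bounds = [0.35, 0.55, 0.75]        # score thresholds
mmax_classes = [5.0, 5.5, 6.0, 6.5]      # Mmax assigned to each class
mmax = np.array(mmax_classes)[np.digitize(score, class_bounds)]
print(mmax)
```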
967.
Chloride contamination of groundwater in urban areas due to deicing is a well-documented phenomenon in northern climates. The objective of this study was to evaluate the effects of permeable pavement on degraded urban groundwater. Although low impact development practices have been shown to improve stormwater quality, no infiltration practice has been found to prevent road-salt chlorides from entering groundwater. The few studies that have investigated chlorides beneath permeable asphalt have sampled directly beneath the pavement; no research has looked more broadly at surrounding groundwater conditions. Monitoring wells were installed upgradient and downgradient of an 860 m² permeable asphalt parking lot at the University of Connecticut (Storrs, Connecticut). Water level and specific conductance were measured continuously, and biweekly samples were analyzed for chloride; samples were also analyzed for sodium (Na), calcium (Ca), and magnesium (Mg). Analysis of variance indicated a significantly (p < 0.001) lower geometric mean Cl concentration downgradient (303.7 mg/L) than upgradient (1280 mg/L). Concentrations of Na, Ca, and Mg increased both upgradient and downgradient during the winter months compared with non-winter months, indicating that cation exchange likely occurred. Despite the frequent high chloride peaks in the winter months and the observed increases in these cations, monitoring revealed lower Cl concentrations downgradient than upgradient for most of the year. These results suggest that the use of permeable asphalt in impacted urban environments with high ambient chloride concentrations can benefit shallow groundwater quality, although the results may not be generalizable to areas with low ambient chloride concentrations.
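A minimal sketch of the kind of comparison reported: geometric means and a one-way ANOVA on log-transformed chloride concentrations for upgradient versus downgradient wells. The concentrations below are synthetic stand-ins generated around the reported geometric means, not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Hypothetical biweekly chloride concentrations (mg/L), lognormally distributed
upgradient   = rng.lognormal(mean=np.log(1280.0), sigma=0.6, size=26)
downgradient = rng.lognormal(mean=np.log(303.7),  sigma=0.6, size=26)

def geometric_mean(x):
    return float(np.exp(np.mean(np.log(x))))

# One-way ANOVA on log-transformed concentrations (equivalent to a t-test for two groups)
f_stat, p_value = stats.f_oneway(np.log(upgradient), np.log(downgradient))

print(f"geometric means: up = {geometric_mean(upgradient):.0f} mg/L, "
      f"down = {geometric_mean(downgradient):.0f} mg/L")
print(f"ANOVA on log(Cl): F = {f_stat:.1f}, p = {p_value:.2e}")
```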
968.
In meandering rivers cut into bedrock, erosion across a channel cross-section can be strongly asymmetric. At a meander apex, deep undercutting of the outer bank can produce a hanging cliff (which may drive hillslope failure), whereas the inner bank adjoins a slip-off slope that connects to the hillslope itself. Here we propose a physically based model for predicting channel planform migration and incision, point bar and slip-off slope formation, bedrock abrasion, the spatial distribution of alluvial cover, and the adaptation of channel width in a mixed bedrock-alluvial channel. We simplify the analysis by considering a numerical model of steady, uniform bend flow satisfying cyclic boundary conditions; thus the 'sediment supply', i.e. the total volume of alluvium in the system, is conserved. In the numerical simulations, the migration rate of the outer bank is a specified parameter. The simulations demonstrate the existence of an approximate state of dynamic equilibrium corresponding to a near-solution of permanent form, in which a bend of constant curvature, width, cross-sectional shape and alluvial cover distribution migrates diagonally downward at constant speed, leaving a bedrock equivalent of a point bar on the inside of the bend. Channel width is set internally by the processes of migration and incision. We find that equilibrium width increases with increasing sediment supply but is insensitive to the outer bank migration rate, whereas the slope of the bedrock point bar varies inversely with both the outer bank migration rate and the sediment supply. Although the migration rate of the outer bank is externally imposed here, we discuss a model modification that would allow lateral side-wall abrasion to be treated in a manner similar to bedrock incision. Copyright © 2016 John Wiley & Sons, Ltd.
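The full morphodynamic model (bend flow with cyclic boundary conditions, alluvial cover, abrasion, width adjustment) is far beyond a short sketch; the toy below only illustrates the kinematics of the near-permanent-form state described above, in which diagonal downward migration at a prescribed lateral rate combined with a vertical incision rate leaves a slip-off surface whose gradient is the ratio of the two rates, consistent with the reported inverse dependence on migration rate. Both rates are placeholders.

```python
import numpy as np

migration_rate = 0.05   # lateral outer-bank migration (m/yr), prescribed placeholder
incision_rate = 0.002   # vertical bedrock incision (m/yr), placeholder
years = np.arange(0, 2001, 100)

# Diagonal-downward migration of the channel: the locus swept by the inner bank
# forms the slip-off slope, whose gradient is incision_rate / migration_rate.
x_inner = migration_rate * years    # lateral position at which the surface is abandoned (m)
z_surface = -incision_rate * years  # bed elevation when that surface is abandoned (m)

slope = incision_rate / migration_rate
print(f"slip-off slope gradient ~ {slope:.3f} (steeper for slower outer-bank migration)")
print(np.column_stack([x_inner, z_surface])[:3])
```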
969.
Reservoirs are among the most important structures for water resources management and flood control. Much attention has been paid to the effects of a reservoir on the downstream area and to the differences between inflow floods and dam-site floods that arise from changes in upstream flow generation and concentration conditions after impoundment. These differences create an inconsistency between inflow quantiles and the reservoir design criteria derived from dam-site flood series, which poses a potential risk and must be evaluated quantitatively. In this study, flood frequency analysis (FFA) and flood control risk analysis (FCRA) are applied to a long reservoir inflow series derived from a multiple-input, single-output model and a copula-based inflow estimation model. The results of the FFA and FCRA are compared, and the implications for reservoir flood management are discussed. The Three Gorges Reservoir (TGR) in China is selected as a case study. Results show significant differences between the TGR inflow and dam-site floods, which change its flood control risk rates: the mean values of the TGR's annual maximum inflow peak discharge and 3-day flood volume are 5.58% and 3.85% higher than the dam-site values, while the annual maximum 7-day and 15-day flood volumes are 1.82% and 1.72% lower. The flood control risk rates of medium and small flood events increase, while those of extreme flood events decline. The results show that the TGR can satisfy its flood control task under the current hydrologic regime and offer a reference for better management of the TGR.
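A sketch of the generic FFA and risk-rate calculations the abstract refers to: fit a distribution to an annual-maximum series, read off a design quantile, and compute the probability of at least one exceedance over a project life. Pearson Type III is used here only as a common choice in Chinese design practice, and the two series are synthetic stand-ins for the inflow and dam-site records.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
# Hypothetical annual maximum peak-discharge series (m^3/s), stand-ins for inflow / dam-site data
inflow_am  = stats.pearson3.rvs(skew=1.0, loc=52000, scale=9000, size=120, random_state=rng)
damsite_am = stats.pearson3.rvs(skew=1.0, loc=50000, scale=9000, size=120, random_state=rng)

def design_quantile(am_series, return_period):
    """Fit a Pearson Type III distribution and return the T-year design flood."""
    params = stats.pearson3.fit(am_series)
    return stats.pearson3.ppf(1 - 1 / return_period, *params)

def flood_control_risk(design_T, project_life_years):
    """Probability of at least one exceedance of the T-year flood during the project life."""
    return 1 - (1 - 1 / design_T) ** project_life_years

T = 1000
print(f"{T}-yr design flood, inflow:   {design_quantile(inflow_am, T):,.0f} m^3/s")
print(f"{T}-yr design flood, dam site: {design_quantile(damsite_am, T):,.0f} m^3/s")
print(f"risk of exceeding the {T}-yr flood in 100 yr: {flood_control_risk(T, 100):.3f}")
```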