Full-text access type
Paid full text | 6373 articles |
Free | 312 articles |
Free (domestic) | 28 articles |
Subject classification
Surveying and mapping | 205 articles |
Atmospheric science | 473 articles |
Geophysics | 2269 articles |
Geology | 2149 articles |
Oceanography | 433 articles |
Astronomy | 789 articles |
Multidisciplinary | 41 articles |
Physical geography | 354 articles |
Publication year
2022 | 43 articles |
2021 | 117 articles |
2020 | 116 articles |
2019 | 103 articles |
2018 | 437 articles |
2017 | 332 articles |
2016 | 372 articles |
2015 | 266 articles |
2014 | 314 articles |
2013 | 393 articles |
2012 | 294 articles |
2011 | 342 articles |
2010 | 298 articles |
2009 | 318 articles |
2008 | 281 articles |
2007 | 224 articles |
2006 | 211 articles |
2005 | 162 articles |
2004 | 170 articles |
2003 | 147 articles |
2002 | 124 articles |
2001 | 110 articles |
2000 | 85 articles |
1999 | 68 articles |
1998 | 94 articles |
1997 | 66 articles |
1996 | 63 articles |
1995 | 51 articles |
1994 | 55 articles |
1993 | 47 articles |
1992 | 50 articles |
1991 | 41 articles |
1990 | 62 articles |
1989 | 43 articles |
1988 | 34 articles |
1987 | 38 articles |
1986 | 39 articles |
1985 | 32 articles |
1984 | 40 articles |
1983 | 35 articles |
1982 | 44 articles |
1981 | 43 articles |
1980 | 36 articles |
1979 | 32 articles |
1978 | 28 articles |
1977 | 30 articles |
1976 | 27 articles |
1975 | 27 articles |
1973 | 31 articles |
1971 | 40 articles |
6,713 results found (search time: 15 ms)
991.
Mario Gómez, M. Concepción Ausín, M. Carmen Domínguez 《Stochastic Environmental Research and Risk Assessment (SERRA)》2017,31(5):1107-1121
Modelling glacier discharge is an important issue in hydrology and climate research. Glaciers represent a fundamental water resource where melting of ice and snow contributes to runoff, and they are also studied as natural global-warming sensors. The GLACKMA association has implemented one of its Pilot Experimental Catchment areas on King George Island, Antarctica, which records the liquid discharge of the Collins Glacier. In this paper, we propose time-varying copula models for analyzing the relationship between air temperature and glacier discharge, which is clearly nonconstant and nonlinear through time. A seasonal copula model is defined in which both the marginal and copula parameters vary periodically over time following a seasonal dynamic. Full Bayesian inference is performed such that the marginal and copula parameters are estimated in a single step, in contrast with the usual two-step approach. Bayesian prediction and model selection are also carried out, so that Bayesian credible intervals can be obtained for the conditional glacier discharge given the temperature at any time point. The proposed methodology is illustrated using the GLACKMA data, which in addition contain a hydrological year of missing discharge observations that could not be measured accurately because of problems with the sounding equipment.
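The time-varying dependence idea can be illustrated with a Gaussian copula whose correlation follows an annual cycle. This is a minimal sketch, not the authors' Bayesian seasonal copula model: the functional form of `seasonal_rho` and its parameter values are invented for illustration.

```python
import math
import numpy as np

_erf = np.vectorize(math.erf)

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + _erf(np.asarray(x, float) / math.sqrt(2.0)))

def seasonal_rho(day, rho0=0.3, rho1=0.4):
    """Illustrative seasonally varying copula correlation (365-day period)."""
    return rho0 + rho1 * np.sin(2.0 * np.pi * np.asarray(day, float) / 365.0)

def sample_pairs(days, rng):
    """Draw (u, v) uniforms from a Gaussian copula whose correlation follows
    the seasonal cycle, e.g. linking temperature and discharge marginals."""
    rho = seasonal_rho(days)
    z1 = rng.standard_normal(rho.shape)
    z2 = rho * z1 + np.sqrt(1.0 - rho ** 2) * rng.standard_normal(rho.shape)
    return norm_cdf(z1), norm_cdf(z2)
```

Feeding the sampled uniforms through the inverse CDFs of fitted marginals would then produce synthetic temperature–discharge pairs with seasonally varying dependence.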
992.
Binquan Li, Zhongmin Liang, Yingqing He, Lin Hu, Weimin Zhao, Kumud Acharya 《Stochastic Environmental Research and Risk Assessment (SERRA)》2017,31(5):1045-1059
Parameter uncertainty in hydrologic modeling is crucial to flood simulation and forecasting. The Bayesian approach allows one to estimate parameters from prior expert knowledge as well as from observational data about model parameter values. This study assesses the performance of two popular uncertainty analysis (UA) techniques, generalized likelihood uncertainty estimation (GLUE) and a Bayesian method implemented with a Markov chain Monte Carlo sampling algorithm, in evaluating model parameter uncertainty in flood simulations. The two methods were applied to the semi-distributed topographic hydrologic model TOPMODEL, which includes five parameters, in a case study of a small humid catchment in southeastern China. The performance assessment was conducted with tools suited to probabilistic simulation of continuous variables such as streamflow: graphical tools and scalar metrics were used to test several attributes of simulation quality for selected flood events, namely deterministic accuracy and the accuracy of the 95% prediction probability uncertainty band (95PPU). Sensitivity analysis identified the parameters that most affect model output. The GLUE and Bayesian methods were then used to analyze the uncertainty of these sensitive parameters and to produce their posterior distributions, from which TOPMODEL simulations and the corresponding UA results were obtained. Results show that the form of the exponential decline in conductivity and the overland flow routing velocity were the sensitive parameters in our case; small changes in these two parameters lead to large differences in the simulated floods. For both UA techniques, most streamflow observations were bracketed by the 95PPU, with containing ratios above 80%.
In comparison, GLUE gave narrower prediction uncertainty bands than the Bayesian method. The mode estimates of the parameter posterior distributions yielded better deterministic performance than the 50th percentiles for both the GLUE and Bayesian analyses. In addition, simulations calibrated with the Rosenbrock optimization algorithm agreed better with the observations than the 50th-percentile hydrographs, though slightly worse than those from the mode estimates. The results clearly emphasize the importance of model uncertainty diagnostics in flood simulations.
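A minimal sketch of the GLUE procedure: score prior parameter draws with the Nash–Sutcliffe efficiency as the informal likelihood, keep behavioral sets, and form the likelihood-weighted 95PPU band. The 0.5 behavioral threshold and the toy one-parameter model in the test are illustrative assumptions, not the paper's TOPMODEL setup.

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency, used here as the informal GLUE likelihood."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def glue_95ppu(model, obs, prior_samples, threshold=0.5):
    """Keep 'behavioral' parameter sets (NSE >= threshold) and return the
    likelihood-weighted 2.5/97.5 % bounds of their simulations (95PPU)."""
    sims, weights = [], []
    for theta in prior_samples:
        sim = model(theta)
        score = nse(obs, sim)
        if score >= threshold:
            sims.append(sim)
            weights.append(score)
    sims, weights = np.asarray(sims), np.asarray(weights, float)
    weights /= weights.sum()
    lower, upper = [], []
    for t in range(sims.shape[1]):
        order = np.argsort(sims[:, t])
        cdf = np.cumsum(weights[order])
        lo = min(np.searchsorted(cdf, 0.025), len(cdf) - 1)
        hi = min(np.searchsorted(cdf, 0.975), len(cdf) - 1)
        lower.append(sims[order[lo], t])
        upper.append(sims[order[hi], t])
    return np.array(lower), np.array(upper)
```

The containing ratio reported in the abstract is then simply the fraction of observations falling inside the returned band.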
993.
Ke-Sheng Cheng, Yi-Ting Lien, Yii-Chen Wu, Yuan-Fong Su 《Stochastic Environmental Research and Risk Assessment (SERRA)》2017,31(5):1123-1146
Model performance for real-time flood forecasting has been evaluated using various criteria. Although the coefficient of efficiency (CE) is the most widely used, we demonstrate that a model achieving good efficiency may actually be inferior to the naïve (persistence) forecast if the flow series has a high lag-1 autocorrelation coefficient. We derive sample-dependent and AR-model-dependent asymptotic relationships between the coefficient of efficiency and the coefficient of persistence (CP), which form the basis of a proposed CE–CP coupled criterion for model performance evaluation. Considering flow persistence and model simplicity, the AR(2) model is suggested as the benchmark for performance evaluation of real-time flood forecasting models. We emphasize that performance evaluation using the proposed CE–CP coupled criterion should be carried out with respect to individual flood events: a single CE or CP value derived from a multi-event artifactual series by no means provides an overall multi-event evaluation and may actually disguise the real capability of the proposed model.
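The two criteria can be computed directly. The definitions below follow the standard forms the paper builds on: CE (Nash–Sutcliffe) benchmarks against the observed mean, while CP benchmarks against the persistence forecast (the previous observation), which is exactly why a high-autocorrelation series can give a flattering CE but a CP near zero.

```python
import numpy as np

def coefficient_of_efficiency(obs, sim):
    """CE (Nash-Sutcliffe): 1 minus error variance relative to the
    variance of the observations around their mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def coefficient_of_persistence(obs, sim):
    """CP: 1 minus error variance relative to the error of the naive
    persistence forecast obs[t-1]."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    num = np.sum((obs[1:] - sim[1:]) ** 2)
    den = np.sum((obs[1:] - obs[:-1]) ** 2)
    return 1.0 - num / den
```

By construction, the persistence forecast itself scores CP = 0, so any model worth using on a persistent flow series should achieve CP > 0, not merely a high CE.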
994.
Impact of sensor measurement error on sensor positioning in water quality monitoring networks (total citations: 1; self-citations: 1; citations by others: 0)
Seong-Hee Kim, Mustafa M. Aral, Yongsoon Eun, Jisu J. Park, Chuljin Park 《Stochastic Environmental Research and Risk Assessment (SERRA)》2017,31(3):743-756
This paper studies the impact of sensor measurement error on the design of a water quality monitoring network for a river system, and shows that robust sensor locations can be obtained when an optimization algorithm is combined with a statistical process control (SPC) method. Specifically, we develop a probabilistic model of sensor measurement error and embed it into a simulation model of the river system. An optimization algorithm then finds the sensor locations that minimize the expected time until spill detection, subject to a constraint on the probability of detecting a spill. The experimental results show that the optimal sensor locations are highly sensitive to the variability of the measurement error and that false alarm rates are often unacceptably high. An SPC method is useful for finding thresholds that keep the false alarm rate at or below a pre-specified target level, and an optimization algorithm combined with these thresholds finds a robust sensor network.
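The threshold-setting idea can be sketched as an empirical-quantile rule: calibrate the detection threshold on spill-free baseline readings so that false alarms occur at most at the target rate. This is a simplified stand-in for the paper's SPC method, not its actual procedure.

```python
import numpy as np

def alarm_threshold(baseline, target_far=0.01):
    """Set the spill-detection threshold at the (1 - target_far) empirical
    quantile of spill-free baseline readings, so the false alarm rate stays
    at or below the target level (simplified SPC-style calibration)."""
    return float(np.quantile(np.asarray(baseline, float), 1.0 - target_far))

def is_alarm(reading, threshold):
    """Flag a reading as a possible spill."""
    return reading > threshold
```

In a network design loop, the optimizer would evaluate candidate sensor placements with thresholds calibrated this way, so that detection-time estimates are not inflated by spurious alarms.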
995.
Daryl Lam, Chris Thompson, Jacky Croke 《Stochastic Environmental Research and Risk Assessment (SERRA)》2017,31(8):2011-2031
Extreme flood events have detrimental effects on society, the economy and the environment. Widespread flooding across South East Queensland in 2011 and 2013 resulted in the loss of lives and significant cost to the economy. In this region, flood risk planning and the use of traditional flood frequency analysis (FFA) to estimate both the magnitude and frequency of the 1-in-100-year flood are severely limited by short gauging station records: these records average 42 years in Eastern Australia, and many poorly represent extreme flood events. The major aim of this study is to test an alternative method of estimating flood frequency, the Probabilistic Regional Envelope Curve (PREC) approach, which integrates additional spatial information on extreme flood events. To better define and constrain a working definition of an extreme flood, an Australian envelope curve is also produced from the available gauging station data. Results indicate that the PREC method significantly changes estimates at the larger recurrence intervals (≥100 years) for gauges with either too few or too many extreme flood events. A decision-making process is provided to ascertain when this method is preferable for FFA.
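A plain (non-probabilistic) envelope curve, the deterministic construct underlying the PREC idea, can be sketched as a regression line in log-log space shifted up until it bounds every regional flood record. This is a simplification for illustration; the actual PREC approach additionally assigns recurrence intervals to the curve.

```python
import numpy as np

def envelope_curve(areas, peaks):
    """Fit log10(peak) = a + b*log10(area) by least squares, then raise the
    intercept until every observed record lies on or below the line."""
    logA = np.log10(np.asarray(areas, float))
    logQ = np.log10(np.asarray(peaks, float))
    b, a = np.polyfit(logA, logQ, 1)          # slope, intercept
    a_env = a + np.max(logQ - (a + b * logA)) # shift up to bound all records
    return a_env, b

def envelope_discharge(area, a_env, b):
    """Envelope peak discharge for a catchment of the given area."""
    return 10.0 ** (a_env + b * np.log10(area))
```

The envelope is pinned by the most extreme record relative to catchment size, which is why, as the abstract notes, gauges with unusually few or many extreme events can shift the large-recurrence-interval estimates substantially.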
996.
Raúl Fierro, Víctor Leiva 《Stochastic Environmental Research and Risk Assessment (SERRA)》2017,31(9):2327-2336
We propose a stochastic methodology for assessing the risk of a large earthquake when a long time has elapsed since the last large seismic event. We state an approximate probability distribution for the occurrence time of the next large earthquake, given that the last large event occurred a long time ago. We prove that, under reasonable conditions, this distribution is exponential with a rate depending on the asymptotic slope of the cumulative intensity function of a nonhomogeneous Poisson process. As it is not possible to obtain an empirical cumulative distribution function of the waiting time for the next large earthquake, an estimator of its cumulative distribution function based on existing data is derived. We conduct a simulation study to identify scenarios in which the proposed methodology performs well. Finally, a real-world data analysis is carried out to illustrate its potential applications, including a homogeneity test for the times between earthquakes.
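The exponential approximation lends itself to a direct sketch. Here the rate is crudely taken as the average event rate over the observed record, a stand-in for the asymptotic slope of the NHPP cumulative intensity; the paper's actual rate and estimator are more refined.

```python
import math

def asymptotic_rate(event_times):
    """Crude estimate of the asymptotic slope of the cumulative intensity:
    events per unit time over the observed record (illustrative only)."""
    times = sorted(event_times)
    return (len(times) - 1) / (times[-1] - times[0])

def prob_within(t, rate):
    """P(next large event within time t), under the exponential
    waiting-time approximation with the given rate."""
    return 1.0 - math.exp(-rate * t)
```

The memorylessness of the exponential is what makes the approximation natural when "a long time has elapsed": the conditional waiting-time distribution does not depend on how long one has already waited.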
997.
Fatih Dikbas 《Stochastic Environmental Research and Risk Assessment (SERRA)》2017,31(9):2415-2434
Changing climate and precipitation patterns make estimating precipitation, which exhibits two-dimensional and sometimes chaotic behavior, increasingly challenging. In recent decades, numerous data-driven methods have been developed and applied to estimate precipitation; however, these methods rely on one-dimensional approaches, lack generality, require neighboring stations and have low sensitivity. This paper implements the first generally applicable, highly sensitive two-dimensional data-driven model of precipitation. The model, named frequency-based imputation (FBI), operates on non-continuous monthly precipitation time series. It requires no determination of input parameters and no data preprocessing, and it provides multiple estimates (from the most to the least probable) of each missing data unit using the series itself. A total of 34,330 monthly total precipitation observations from 70 stations in 21 basins in Turkey were used to assess the method by removing and re-estimating observation series in annual increments. Comparisons with expectation-maximization and multiple linear regression models show that the FBI method is superior in estimating monthly precipitation. The paper also provides a link to the software code for the FBI method.
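The FBI algorithm itself is not specified in the abstract. As a loosely inspired sketch only, a frequency-based imputer can rank candidate values for a missing month by how often similar values occurred in that calendar month (histogram bin counts); every detail below is an illustrative assumption, not the published method.

```python
import numpy as np

def impute_candidates(series, months, target_month, bins=10):
    """Return candidate estimates for a missing value in `target_month`,
    ordered from the most to the least populated histogram bin of that
    calendar month's observed values (empty bins are dropped)."""
    series = np.asarray(series, float)
    months = np.asarray(months)
    vals = series[months == target_month]
    counts, edges = np.histogram(vals, bins=bins)
    mids = 0.5 * (edges[:-1] + edges[1:])   # bin centres as candidate values
    order = np.argsort(counts)[::-1]        # most frequent bin first
    order = order[counts[order] > 0]
    return mids[order]
```

Returning an ordered list of candidates mirrors the abstract's "from the most to the least probable" multiple-estimate behavior, without claiming fidelity to the actual FBI procedure.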
998.
Frontal dynamics boost primary production in the summer stratified Mediterranean sea (total citations: 1; self-citations: 1; citations by others: 0)
Antonio Olita, Arthur Capet, Mariona Claret, Amala Mahadevan, Pierre Marie Poulain, Alberto Ribotti, Simón Ruiz, Joaquín Tintoré, Antonio Tovar-Sánchez, Ananda Pascual 《Ocean Dynamics》2017,67(6):767-782
Bio-physical glider measurements from a unique process-oriented experiment in the Eastern Alboran Sea (AlborEx) allowed us to observe the distribution of the deep chlorophyll maximum (DCM) across an intense density front at a resolution (~400 m) suitable for investigating sub-mesoscale dynamics. This front, at the interface between Atlantic and Mediterranean waters, had a sharp density gradient (Δρ ~ 1 kg/m³ over ~10 km) and showed imprints of (sub-)mesoscale phenomena on tracer distributions. Specifically, the chlorophyll-a concentration within the DCM showed a disrupted pattern along isopycnal surfaces, with patches related to the stratification (buoyancy frequency) at depths between 30 and 60 m. To estimate the primary production (PP) rate within the sub-surface chlorophyll patches, we applied the Morel and André (J Geophys Res 96:685–698, 1991) bio-optical model using photosynthetically active radiation (PAR) from Argo profiles collected simultaneously with the glider data. The highest production was located over domed isopycnals on the fresh side of the front, suggesting that (sub-)mesoscale upwelling carries phytoplankton patches from less to more illuminated levels while simultaneously delivering nutrients. Integrated estimates of PP along the glider path (1.3 g C m⁻² d⁻¹) are two to four times larger than estimates from satellite-based algorithms, i.e., those derived from 8-day composite fields extracted over the glider track. Despite the differences in spatial and temporal sampling between instruments, the discrepancy is mainly due to the satellite's inability to measure the DCM patches responsible for the high production. The deepest chlorophyll patches (depth > 60 m) are almost unproductive and probably transported passively (subducted) from the upper, productive layers. Finally, the relationship between primary production and oxygen is also investigated.
The logarithm of the primary production in the DCM interior (chlorophyll (Chl) > 0.5 mg/m³) shows a negative linear relationship with apparent oxygen utilization, confirming that the high-chlorophyll patches are productive. The slope of this relationship differs among Atlantic waters, mixed interface waters and Mediterranean waters, suggesting differences in the planktonic communities (whether at the physiological, population or community level should be the object of further investigation) on the different sides of the front. In addition, the ratio of optical backscatter to Chl is high within the intermediate (mixed) waters, suggestive of large phytoplankton cells, and lower within the cores of the Atlantic and Mediterranean waters. These observations highlight the relevance of fronts in triggering primary production at the DCM level and in shaping the characteristic patchiness of the pelagic domain. This gains further relevance considering the inadequacy of optical satellite sensors for observing DCM concentrations at such fine scales.
999.
Of the many topographic features ubiquitous on the ocean floor, specifically seamounts, we focus on those with relatively shallow summits that can interact with wind-generated surface waves. Relatively long waves crossing the oceans (swells) and stormy seas, in particular, can affect the water column to a considerable depth and therefore interact with these deep-sea features. We quantify this interaction through numerical experiments with the SWAN wave model, in which a simply shaped seamount is exposed to waves of different lengths. The results show a strong interaction that leads to significant changes in the wave field, creating wake zones and regions of large wave amplification. This is then exemplified in a practical case in which we analyze the interaction of more realistic sea conditions with a very shallow rock in the Yellow Sea. Potentially important for navigation and erosion processes, these results are also indicative, mutatis mutandis, of possible interactions with emerged islands and sand banks in shelf seas.
1000.
Luis Alejandro Morales-Marín, Kwok Pan Chun, Howard Simon Wheater, Karl-Erich Lindenschmidt 《Hydrological Sciences Journal》2017,62(4):657-679
Nutrient loadings in many river catchments continue to increase owing to rapid expansion of agriculture, urban and industrial development, and population growth. Nutrient enrichment of water bodies has intensified eutrophication, which degrades water quality and ecosystem health. In this study, we carried out a trend analysis of total phosphorus and total nitrogen loads in the South Saskatchewan River (SSR) catchment using a novel approach to analysing nutrient time series. Seasonal trend analysis at each water quality station was performed to determine the relationships between annual flow regimes and nutrient loads in the catchment, in particular the influence of high spring runoff on nutrient export. Decadal analysis was also performed to determine the long-term relationships between nutrients and anthropogenic changes in the catchment. Although seasonal and historical variability of nutrient load trends was found to be determined mainly by changes in the streamflow regime, there is evidence that increases in nitrogen concentration can also be attributed to anthropogenic changes.
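The abstract does not name the trend test used; the Mann–Kendall test is the standard nonparametric choice for seasonal and decadal nutrient-load trend analysis, so a plain version is sketched here for illustration only.

```python
import itertools
import math

def mann_kendall(x):
    """Mann-Kendall trend test: S statistic and normal-approximation Z
    (no correction for ties; positive Z indicates an increasing trend)."""
    n = len(x)
    s = 0
    for i, j in itertools.combinations(range(n), 2):
        if x[j] > x[i]:
            s += 1
        elif x[j] < x[i]:
            s -= 1
    var = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var)
    elif s < 0:
        z = (s + 1) / math.sqrt(var)
    else:
        z = 0.0
    return s, z
```

For a seasonal analysis like the one described, the test would be applied separately to each season's subseries (the seasonal Kendall variant) before combining, so that annual flow-regime cycles do not masquerade as trends.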