941.
Bovine tuberculosis (TB) poses a serious threat to the agricultural industry in several countries; it involves potential interactions between wildlife and cattle and creates societal problems in terms of human-wildlife conflict. This study applies connectedness network analysis to the spatial and temporal dynamics of TB between cattle on farms and the European badger (Meles meles), using a large dataset generated by a calibrated agent-based model. Results showed that infected-network connectedness was lower in badgers than in cattle. The contribution of an infected individual to the mean distance of disease spread over time was considerably lower for badgers than for cattle: badgers mainly spread the disease locally, while cattle infected both locally and over longer distances. The majority of badger-induced infections occurred when individual badgers left their home sett, and this was positively correlated with badger population growth rates. Point pattern analysis indicated aggregation in the spatial pattern of TB prevalence in badger setts across all scales. The spatial distribution of farms that were not TB-free was aggregated at different scales than the spatial distribution of infected badgers and became random at larger scales. Spatial cross-correlation between infected badger setts and infected farms revealed that infected setts and farms generally do not coexist, except at a few scales. Temporal autocorrelation detected a two-year infection cycle for badgers, while infected cattle showed both within-year and longer cycles. Temporal cross-correlation indicated that infection cycles in badgers and cattle are negatively correlated. The implications of these results for understanding the dynamics of the disease are discussed.
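The two-year badger cycle reported above would emerge from a lag-autocorrelation analysis of an annual prevalence series. A minimal sketch with a synthetic period-two series (illustrative only, not the study's model output):

```python
import math

def autocorr(x, lag):
    """Sample autocorrelation of series x at a given lag."""
    n = len(x)
    m = sum(x) / n
    num = sum((x[i] - m) * (x[i + lag] - m) for i in range(n - lag))
    den = sum((v - m) ** 2 for v in x)
    return num / den

# Synthetic annual badger prevalence alternating high/low (period 2 years).
prevalence = [0.1 + 0.05 * math.cos(math.pi * t) for t in range(40)]

acf = {lag: autocorr(prevalence, lag) for lag in range(1, 6)}
best = max(acf, key=acf.get)  # lag with the strongest positive autocorrelation
```

On a series with a two-year period, the autocorrelation peaks at lag 2, mirroring the cycle detected for badgers.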
942.
This paper studies the impact of sensor measurement error on the design of a water quality monitoring network for a river system, and shows that robust sensor locations can be obtained when an optimization algorithm is combined with a statistical process control (SPC) method. Specifically, we develop a probabilistic model of sensor measurement error and embed it into a simulation model of a river system. An optimization algorithm is used to find the sensor locations that minimize the expected time until spill detection, subject to a constraint on the probability of detecting a spill. The experimental results show that the optimal sensor locations are highly sensitive to the variability of measurement error, and that false alarm rates are often unacceptably high. An SPC method is useful for finding thresholds that keep the false alarm rate at or below a pre-specified target level, and an optimization algorithm combined with these thresholds yields a robust sensor network.
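One simple way to obtain detection thresholds with a bounded false alarm rate, in the spirit of the SPC step described above, is to set the threshold at an empirical quantile of noise-only sensor readings. A sketch under that assumption (the paper's actual SPC procedure may differ):

```python
import random

def threshold_for_false_alarm(baseline, target_far):
    """Choose a detection threshold as the empirical (1 - target_far)
    quantile of noise-only readings, so a single reading exceeds it
    with probability <= target_far under no-spill conditions."""
    ordered = sorted(baseline)
    k = min(len(ordered) - 1, int((1.0 - target_far) * len(ordered)))
    return ordered[k]

random.seed(42)
# Noise-only readings: a concentration sensor with Gaussian measurement error.
baseline = [random.gauss(0.0, 1.0) for _ in range(10_000)]
thr = threshold_for_false_alarm(baseline, target_far=0.01)

# Empirical false alarm rate on fresh noise-only data.
fresh = [random.gauss(0.0, 1.0) for _ in range(10_000)]
far = sum(v > thr for v in fresh) / len(fresh)
```

Calibrating the threshold on noise-only data is what keeps the alarm rate near the target regardless of where the optimizer places the sensor.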
943.
We propose a scenario-based method for simulating and mapping the risk of surge floods for use by local authorities concerned with public safety and urban planning in coastal areas. Focusing on the triad of hazard, vulnerability, and adaptation capability, we estimate the comprehensive risk and display its spatial distribution using the raster calculation tool in ArcGIS. The methodology is introduced in detail via a case study of Yuhuan, an island county in Zhejiang Province, China, that is frequently affected by typhoon storm surges. First, we designed 24 typhoon scenarios and modeled the flood process in each scenario using the hydrodynamic module of MIKE 21. Second, flood depth and area were used for hazard assessment; an authorized indicator system of land-use categories and a survey of emergency shelters were used for vulnerability and adaptation-capability assessment, respectively; and a quantified model was used to assess the comprehensive risk. Lastly, we used the GIS raster calculation tool to map the risk of storm surges under the multiple typhoon scenarios. Our principal findings are as follows: (1) Seawalls are more likely to be overtopped or destroyed as typhoon intensity, and hence storm surge severity, increases. (2) Most residential areas with inadequate emergency shelters are highly vulnerable to flood events. (3) As projected in the risk mapping, if an exceptional typhoon with a central pressure of 915 or 925 hPa made landfall in Yuhuan, a wide range of areas would be flooded and at high risk. (4) Determining optimal strategies based on the identification of risk-inducing factors is the most effective way to promote safe and sustainable development in coastal cities.
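The raster overlay step can be illustrated as a cell-by-cell combination of normalized hazard, vulnerability, and adaptation layers. The additive form and the weights below are illustrative assumptions, not the paper's quantified model:

```python
import numpy as np

def comprehensive_risk(hazard, vulnerability, adaptation,
                       w_h=0.5, w_v=0.3, w_a=0.2):
    """Cell-by-cell risk index from normalized rasters (weights are
    illustrative). Adaptation capability reduces risk, so it enters
    with a negative sign."""
    return w_h * hazard + w_v * vulnerability - w_a * adaptation

# Toy 3x3 rasters normalized to [0, 1] (e.g. flood depth, land-use
# sensitivity, shelter coverage), standing in for ArcGIS raster layers.
hazard        = np.array([[0.9, 0.7, 0.2],
                          [0.8, 0.5, 0.1],
                          [0.6, 0.3, 0.0]])
vulnerability = np.array([[0.8, 0.9, 0.4],
                          [0.5, 0.6, 0.2],
                          [0.3, 0.2, 0.1]])
adaptation    = np.array([[0.1, 0.2, 0.9],
                          [0.3, 0.4, 0.8],
                          [0.5, 0.6, 0.7]])

risk = comprehensive_risk(hazard, vulnerability, adaptation)
high_risk_cells = int((risk > 0.5).sum())  # cells to flag on the risk map
```

In an actual GIS workflow, the same expression is evaluated in the raster calculator over full map layers rather than toy arrays.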
944.
A number of statistical downscaling methodologies have been introduced to bridge the gap in scale between the outputs of climate models and the climate information needed to assess potential impacts at local and regional scales. Four statistical downscaling methods [bias-correction/spatial disaggregation (BCSD), bias-correction/constructed analogue (BCCA), multivariate adaptive constructed analogs (MACA), and bias-correction/climate imprint (BCCI)] are applied to downscale the latest Climate Forecast System Reanalysis (CFSR) data to stations for precipitation, maximum temperature, and minimum temperature over South Korea. All methods are calibrated with observational station data for the 19 years from 1973 to 1991 and validated for the more recent 19-year period from 1992 to 2010. We construct a comprehensive suite of performance metrics to inter-compare the methods, comprising five criteria related to time series, distribution, multi-day persistence, extremes, and spatial structure. Based on these performance metrics, we employ the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) and apply 10,000 different weighting combinations to the criteria to identify a robust statistical downscaling method and the most important criteria. The results show that MACA and BCSD have comparable skill on the time-series criterion, and that BCSD outperforms the other methods on the distribution and extremes criteria. In addition, MACA and BCCA, which incorporate spatial patterns, show higher skill on the multi-day persistence criterion for temperature, while BCSD shows the highest skill for precipitation. On the spatial structure criterion, BCCA and MACA outperform BCSD and BCCI. From the TOPSIS analysis, we find that MACA is the most robust method for all variables in South Korea, with BCCA and BCSD ranking second for temperature and precipitation, respectively. We also find that the multi-day persistence and spatial structure criteria are crucial to ranking the skill of statistical downscaling methods.
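TOPSIS itself is a standard multi-criteria ranking procedure and can be sketched compactly. The skill scores below are hypothetical stand-ins for the five criteria, not values from the study:

```python
import numpy as np

def topsis(scores, weights, benefit=None):
    """Rank alternatives with TOPSIS. `scores` is (alternatives x criteria),
    higher is better for benefit criteria. Returns closeness to the ideal
    solution in [0, 1] (larger = better)."""
    X = np.asarray(scores, dtype=float)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    if benefit is None:
        benefit = np.ones(X.shape[1], dtype=bool)
    # Vector-normalize each criterion column, then apply the weights.
    V = w * X / np.linalg.norm(X, axis=0)
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)  # distance to ideal
    d_neg = np.linalg.norm(V - anti, axis=1)   # distance to anti-ideal
    return d_neg / (d_pos + d_neg)

# Hypothetical skill scores for the four downscaling methods on the paper's
# five criteria (time series, distribution, persistence, extremes, spatial).
methods = ["BCSD", "BCCA", "MACA", "BCCI"]
scores = [[0.8, 0.9, 0.7, 0.9, 0.5],   # BCSD
          [0.6, 0.7, 0.8, 0.6, 0.9],   # BCCA
          [0.8, 0.8, 0.9, 0.7, 0.9],   # MACA
          [0.7, 0.6, 0.6, 0.6, 0.6]]   # BCCI
closeness = topsis(scores, weights=[1, 1, 1, 1, 1])
best_method = methods[int(np.argmax(closeness))]
```

Re-running the ranking over many weighting combinations, as the paper does 10,000 times, amounts to calling `topsis` with different `weights` vectors and tallying which method ranks first.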
945.
We present a web application named Let-It-Rain that generates 1-h temporal resolution synthetic rainfall time series using the modified Bartlett–Lewis rectangular pulse (MBLRP) model, a type of Poisson stochastic rainfall generator. Let-It-Rain, accessible at http://www.LetItRain.info, adopts a web-based framework that combines ArcGIS Server on the server side, for parameter value dissemination, with JavaScript on the client side to implement the MBLRP model. This enables any desktop or mobile user with internet access and a web browser to obtain, with only a few mouse clicks, a synthetic rainfall time series at any location for which the parameter regionalization work has been completed (currently the contiguous United States and the Republic of Korea). Let-It-Rain shows satisfactory performance in reproducing the observed rainfall mean, variance, autocorrelation, and probability of zero rainfall at hourly through daily accumulation levels. It also shows reasonably good performance in reproducing watershed runoff depth and peak flow. We expect that Let-It-Rain can stimulate uncertainty analysis of hydrologic variables across the world.
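The validation statistics listed above (mean, variance, autocorrelation, probability of zero rainfall) are straightforward to compute from an hourly series. A minimal sketch with a synthetic series standing in for generator output:

```python
import random

def rainfall_stats(series):
    """Mean, variance, lag-1 autocorrelation, and probability of zero
    rainfall for an hourly depth series -- the four statistics the
    generator is validated against."""
    n = len(series)
    mean = sum(series) / n
    var = sum((x - mean) ** 2 for x in series) / n
    cov1 = sum((series[i] - mean) * (series[i + 1] - mean)
               for i in range(n - 1)) / n
    return {"mean": mean,
            "variance": var,
            "lag1_autocorr": cov1 / var if var > 0 else 0.0,
            "p_zero": sum(x == 0.0 for x in series) / n}

random.seed(1)
# Toy hourly series: dry about 90% of hours, exponential depths when wet.
hourly = [random.expovariate(0.5) if random.random() < 0.1 else 0.0
          for _ in range(24 * 365)]
stats = rainfall_stats(hourly)
```

Comparing these statistics between observed and synthetic series, at hourly through daily aggregation, is the kind of check the abstract reports.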
946.
Extreme flood events have detrimental effects on society, the economy, and the environment. Widespread flooding across South East Queensland in 2011 and 2013 resulted in the loss of lives and significant cost to the economy. In this region, flood risk planning and the use of traditional flood frequency analysis (FFA) to estimate both the magnitude and frequency of the 1-in-100-year flood are severely limited by short gauging station records: on average, these records span 42 years in eastern Australia, and many poorly represent extreme flood events. The main aim of this study is to test an alternative method of estimating flood frequency, the Probabilistic Regional Envelope Curve (PREC) approach, which integrates additional spatial information on extreme flood events. To better define and constrain a working definition of an extreme flood, an Australian envelope curve is also produced from the available gauging station data. Results indicate that the PREC method significantly changes the estimates at larger recurrence intervals (≥100 years) for gauges with either too few or too many extreme flood events. A decision-making process is provided to ascertain when this method is preferable for FFA.
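An envelope curve of the kind described above bounds all observed peaks as a function of catchment area, commonly in the power-law form Q = C·A^b. A sketch with hypothetical gauge records and an illustrative fixed exponent (not the study's fitted Australian curve):

```python
# Hypothetical gauge records: (catchment area km^2, largest observed peak m^3/s).
gauges = [(50, 420), (120, 900), (300, 1500), (800, 2600),
          (2000, 5200), (5000, 7800), (12000, 11000)]

def envelope_curve(records, exponent=0.5):
    """Envelope curve Q = C * A**exponent: with the exponent fixed
    (0.5 is an illustrative choice), C is set so the curve passes
    through the most extreme unit flood in the region."""
    C = max(q / a ** exponent for a, q in records)
    return lambda area: C * area ** exponent

env = envelope_curve(gauges)
# The envelope should bound every observed peak from above (up to rounding).
exceedance = max(q - env(a) for a, q in gauges)
```

The PREC approach goes further by attaching exceedance probabilities to such a regional curve, which is what lets it inform recurrence-interval estimates.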
947.
In flood frequency analysis, a suitable probability distribution function is required to establish the flood magnitude–return period relationship. Goodness-of-fit (GOF) techniques are often employed to select a suitable distribution function in this context, but they have often been criticized for their inability to discriminate between statistical distributions for the same application. This paper investigates the potential utility of subsampling, a resampling technique, combined with a GOF test to select the best distribution for frequency analysis. The performance of the methodology is assessed by applying it to observed and simulated annual maximum (AM) discharge data series, with several AM series of different record lengths used as case studies. Numerical analyses assess the performance in terms of sample size, subsample size, and the statistical properties inherent in the AM data series. The proposed methodology is also compared with the standard Anderson–Darling (AD) test. It is found that the methodology is best suited to longer data series, and that performance improves when the subsample size is about half of the underlying data sample. The methodology also outperforms the standard AD test in effectively discriminating between distributions. Overall, the results indicate that the subsampling technique can be a promising tool for discriminating between distributions.
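The core of the proposed procedure, scoring candidate distributions by a GOF statistic averaged over many half-length subsamples, can be sketched as follows. The Anderson–Darling statistic is computed directly from its definition, the candidate set is reduced to two families for brevity, and the AM data are synthetic:

```python
import math
import random

def anderson_darling(sample, cdf):
    """Anderson-Darling statistic A^2 of `sample` against a fully
    specified CDF (smaller = better fit)."""
    xs = sorted(sample)
    n = len(xs)
    s = 0.0
    for i, x in enumerate(xs, start=1):
        u = min(max(cdf(x), 1e-12), 1 - 1e-12)          # F(x_(i))
        v = min(max(cdf(xs[n - i]), 1e-12), 1 - 1e-12)  # F(x_(n+1-i))
        s += (2 * i - 1) * (math.log(u) + math.log(1 - v))
    return -n - s / n

def normal_cdf_of(sample):
    m = sum(sample) / len(sample)
    sd = (sum((x - m) ** 2 for x in sample) / len(sample)) ** 0.5
    return lambda x: 0.5 * (1 + math.erf((x - m) / (sd * 2 ** 0.5)))

def exponential_cdf_of(sample):
    rate = len(sample) / sum(sample)
    return lambda x: 1 - math.exp(-rate * max(x, 0.0))

random.seed(7)
am_series = [random.expovariate(1 / 500) for _ in range(60)]  # synthetic AM flows

# Subsampling: score each candidate by its mean A^2 over many random
# subsamples of about half the record, then pick the smaller score.
candidates = {"normal": normal_cdf_of, "exponential": exponential_cdf_of}
scores = {}
for name, fit in candidates.items():
    vals = []
    for _ in range(200):
        sub = random.sample(am_series, len(am_series) // 2)
        vals.append(anderson_darling(sub, fit(sub)))
    scores[name] = sum(vals) / len(vals)
best_fit = min(scores, key=scores.get)
```

Averaging the statistic across subsamples is what gives the method its extra discriminating power relative to a single AD test on the full record.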
948.
We propose a stochastic methodology for assessing the risk of a large earthquake when a long time has elapsed since the last large seismic event. We state an approximate probability distribution for the occurrence time of the next large earthquake, conditional on the last large seismic event having occurred a long time ago. We prove that, under reasonable conditions, this distribution is exponential with a rate given by the asymptotic slope of the cumulative intensity function of a nonhomogeneous Poisson process. As it is not possible to obtain an empirical cumulative distribution function of the waiting time for the next large earthquake, an estimator of its cumulative distribution function based on existing data is derived. We conduct a simulation study to identify scenarios in which the proposed methodology performs well. Finally, a real-world data analysis is carried out to illustrate its potential applications, including a homogeneity test for the times between earthquakes.
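Once the asymptotic rate is in hand, the limiting exponential law makes the hazard computation simple. A sketch with a hypothetical event catalog and a deliberately crude slope estimate (the paper derives the rate from the fitted nonhomogeneous Poisson model):

```python
import math

# Hypothetical catalog: occurrence times (years) of large earthquakes.
event_times = [3.2, 11.5, 19.8, 30.1, 37.4, 46.0, 54.9, 62.3, 71.8, 79.5]

def asymptotic_rate(times):
    """Crude estimate of the asymptotic slope of the cumulative
    intensity: event count over the observed span."""
    return len(times) / (times[-1] - times[0])

rate = asymptotic_rate(event_times)

def prob_event_within(t, lam):
    """Under the limiting exponential law, probability that the next
    large earthquake occurs within t years, given a long elapsed time."""
    return 1 - math.exp(-lam * t)

p10 = prob_event_within(10, rate)
```

The exponential form means the conditional hazard no longer depends on exactly how long ago the last event occurred, which is what makes the approximation usable when that elapsed time is large but poorly known.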
949.
Changing climate and precipitation patterns make the estimation of precipitation, which exhibits two-dimensional and sometimes chaotic behavior, more challenging. In recent decades, numerous data-driven methods have been developed and applied to estimate precipitation; however, these methods rely on one-dimensional approaches, lack generality, require neighboring stations, and have low sensitivity. This paper aims to implement the first generally applicable, highly sensitive, two-dimensional data-driven model of precipitation. The model, named frequency-based imputation (FBI), relies on non-continuous monthly precipitation time series data. It requires no input parameters and no data preprocessing, and it provides multiple estimates (from the most to the least probable) of each missing data unit using the series itself. A total of 34,330 monthly precipitation observations from 70 stations in 21 basins in Turkey were used to assess the method by removing and re-estimating observation series in annual increments. Comparisons with expectation-maximization and multiple linear regression models show that the FBI method is superior in estimating monthly precipitation. This paper also provides a link to the software code for the FBI method.
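The abstract does not spell out the FBI algorithm, but the general idea of frequency-based estimation, ranking candidate values for a missing month by their empirical frequency in that calendar month's own record, can be sketched as follows. The binning scheme and the example data are illustrative assumptions, not the paper's implementation:

```python
from collections import Counter

def impute_candidates(series_by_month, month, n_bins=5):
    """Sketch of frequency-based imputation: bin the historical values
    of the target calendar month and return bin midpoints ordered from
    most to least frequent (multiple estimates, most probable first)."""
    vals = series_by_month[month]
    lo, hi = min(vals), max(vals)
    width = (hi - lo) / n_bins or 1.0  # guard against constant series
    bins = Counter(min(int((v - lo) / width), n_bins - 1) for v in vals)
    ranked = [b for b, _ in bins.most_common()]
    return [lo + (b + 0.5) * width for b in ranked]

# Hypothetical January totals (mm) at one station over past years.
history = {1: [80, 95, 70, 88, 150, 92, 76, 85, 140, 84]}
estimates = impute_candidates(history, month=1)
most_probable = estimates[0]
```

Because only the station's own series is used, a scheme of this shape needs no neighboring stations, which is the property the abstract emphasizes.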
950.
The main objective of the LAgrangian Transport EXperiment (LATEX) project was to study the influence of coastal mesoscale and submesoscale physical processes on circulation dynamics, cross-shelf exchanges, and biogeochemistry over the western continental shelf of the Gulf of Lion, northwestern Mediterranean Sea. LATEX was a five-year multidisciplinary project based on the combined analysis of numerical model simulations and multi-platform field experiments. The model component included a ten-year realistic 3D numerical simulation with a 1 km horizontal resolution over the gulf, nested in a coarser 3 km resolution model. The in situ component involved four cruises, including a large-scale multidisciplinary campaign with two research vessels in 2010. This review concentrates on the physics results of LATEX, addressing three main subjects: (1) the investigation of mesoscale to submesoscale processes: the eddies are elliptic, baroclinic, and anticyclonic; the strong thermal and saline front is density-compensated; and their generation processes are studied; (2) the development of sampling strategies for their direct observation: LATEX implemented an adaptive-strategy Lagrangian tool, with reference software available on the web, to perform offshore campaigns in a Lagrangian framework; (3) the quantification of horizontal mixing and cross-shelf exchanges: lateral diffusivity coefficients, calculated in various ways including a novel technique, are in the range classically encountered at the associated scales, and cross-shelf fluxes were calculated after removing the near-inertial oscillation contribution. Further perspectives are discussed, especially the ongoing challenge of studying submesoscale features remotely and from in situ data.
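A classical estimator behind lateral diffusivity calculations of this kind (one of the standard approaches, not the novel technique mentioned above) relates the diffusivity to the growth rate of the variance of particle spread, K = ½·dσ²/dt. A sketch with synthetic, illustrative numbers rather than LATEX field values:

```python
# Synthetic variance of drifter positions (m^2) sampled daily over ten
# days, in a regime where the variance grows linearly in time.
times_s = [86400.0 * d for d in range(10)]
sigma2 = [2.0e6 + 8.0e2 * t for t in times_s]

def diffusivity(times, variances):
    """K = 0.5 * d(sigma^2)/dt, with the slope estimated by least squares."""
    n = len(times)
    mt = sum(times) / n
    mv = sum(variances) / n
    slope = (sum((t - mt) * (v - mv) for t, v in zip(times, variances))
             / sum((t - mt) ** 2 for t in times))
    return 0.5 * slope

K = diffusivity(times_s, sigma2)  # m^2/s
```

The resulting value, a few hundred m²/s here, sits in the range classically reported for mesoscale lateral mixing, consistent with the scales discussed in the review.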