81.
Modelling uncertainty can significantly affect structural seismic reliability assessment. However, the limit-state excursion due to this type of uncertainty may not be described by a Poisson process, as it lacks renewal properties with the occurrence of each earthquake event. Furthermore, considering uncertainties related to ground-motion representation by employing recorded ground motions together with modelling uncertainties is not a trivial task. Robust fragility assessment, proposed previously by the authors, employs the structural response to recorded ground motions as data in order to update prescribed seismic fragility models. Robust fragility can also be highly efficient for considering structural modelling uncertainties, by creating a dataset of one-to-one assignments of structural model realizations to as-recorded ground motions; this can reduce the computational effort by more than one order of magnitude. However, it should be kept in mind that the fragility concept itself rests on the underlying assumption of Poisson-type renewal. Using the concept of updated robust reliability, which considers both the uncertainty in ground-motion representation based on as-recorded ground motions and non-ergodic modelling uncertainties, the error introduced into structural reliability assessment by using the robust fragility is quantified. It is shown, through application to an existing RC frame, that this error is quite small when the product of the time interval and the standard deviation of the failure rate is small, and that it lies on the conservative side.
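The size of this error can be illustrated with a toy Monte-Carlo calculation (our own sketch, not the paper's RC-frame application; the lognormal rate model and all numbers are invented). When the failure rate ν is uncertain but fixed over the service life (non-ergodic), the exact failure probability is 1 − E[exp(−νt)], whereas a Poisson-style estimate 1 − exp(−E[ν]·t) ignores that uncertainty. By Jensen's inequality the latter is conservative, and the gap shrinks as t·σ_ν becomes small:

```python
import numpy as np

rng = np.random.default_rng(0)

def failure_probs(mean_rate, cov, t, n=200_000):
    """Exact vs. Poisson-style failure probability under a fixed-in-time
    (non-ergodic) lognormal failure rate with given mean and c.o.v."""
    sigma = np.sqrt(np.log(1.0 + cov**2))
    mu = np.log(mean_rate) - 0.5 * sigma**2
    nu = rng.lognormal(mu, sigma, n)          # sampled rate realizations
    exact = 1.0 - np.exp(-nu * t).mean()      # averages over rate uncertainty
    poisson = 1.0 - np.exp(-mean_rate * t)    # uses the mean rate only
    return exact, poisson

# mean rate 1e-3/yr, c.o.v. 0.5, 50-year interval: t*sigma_nu = 0.025 is small,
# so the two estimates nearly coincide, with the Poisson one slightly larger
exact, approx = failure_probs(mean_rate=1e-3, cov=0.5, t=50.0)
```

The design choice here is only to demonstrate the direction and order of magnitude of the error, which mirrors the paper's observation that the fragility-based result is conservative when t·σ of the failure rate is small.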
82.
Analysis of amplitude variation with offset is an essential step in reservoir characterization. For an accurate characterization, amplitudes obtained under an isotropic assumption of the reservoir must be corrected for anisotropic effects. The objective is seismic anisotropic amplitude correction in an effective medium, and, to this end, the values and signs of the anisotropic parameter differences (Δδ and Δε) across the reflection interfaces are needed. These parameters can be identified from seismic and well-log data. A new technique for anisotropic amplitude correction was developed to modify amplitude changes in seismic data in transversely isotropic media with a vertical axis of symmetry. The results show that characteristics of pre-stack seismic data, that is, the amplitude-variation-with-offset gradient, can potentially be related to the sign of the anisotropic parameter differences (Δδ and Δε) between the two layers of the reflection boundary. The proposed methodology is designed to attain a proper fit between modelled and observed amplitude-variation-with-offset responses, after anisotropic correction, for all possible lithofacies at the reservoir boundary. We first estimate the anisotropic parameters δ and ε away from the wells through Backus averaging of the elastic properties resulting from a first pass of isotropic pre-stack seismic inversion on input data with no amplitude correction. Next, we estimate the anisotropic parameter differences at reflection interfaces (the values and signs of Δδ and Δε). We then generate seismic angle-gather data after anisotropic amplitude correction using Rüger's equation for the P-P reflection coefficient. A second pass of isotropic pre-stack seismic inversion is then performed on the amplitude-corrected data, and elastic properties are estimated. The final outcome demonstrates how the introduced methodology reduces the uncertainty of elastic property prediction.
Pre-stack seismic inversion on amplitude-corrected seismic data yields more accurate elastic property predictions than can be obtained from non-corrected data. Moreover, a new anisotropy attribute (ν) is presented to improve lithology identification.
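The correction step can be sketched as follows (our own toy example, not the paper's workflow; the velocities, densities and the Δδ, Δε values are invented). One common linearized form of Rüger's VTI P-P reflection coefficient splits into an isotropic part and an anisotropic term driven by Δδ and Δε; subtracting the latter from observed amplitudes yields an isotropic-equivalent response:

```python
import numpy as np

def ruger_pp_vti(vp1, vs1, rho1, vp2, vs2, rho2, d_delta, d_eps, theta_deg):
    """Weak-contrast P-P reflection coefficient across a VTI interface,
    in one common linearized (Ruger-type) form; returns the isotropic
    part and the anisotropic term separately."""
    th = np.radians(theta_deg)
    vp, vs, rho = (vp1 + vp2) / 2, (vs1 + vs2) / 2, (rho1 + rho2) / 2
    dvp = vp2 - vp1
    z, dz = rho * vp, rho2 * vp2 - rho1 * vp1          # P-impedance
    g, dg = rho * vs**2, rho2 * vs2**2 - rho1 * vs1**2  # shear modulus
    iso = (0.5 * dz / z
           + 0.5 * (dvp / vp - (2 * vs / vp)**2 * dg / g) * np.sin(th)**2
           + 0.5 * (dvp / vp) * np.sin(th)**2 * np.tan(th)**2)
    aniso = (0.5 * d_delta * np.sin(th)**2
             + 0.5 * d_eps * np.sin(th)**2 * np.tan(th)**2)
    return iso, aniso

theta = np.array([0.0, 10.0, 20.0, 30.0])
iso, aniso = ruger_pp_vti(3000, 1500, 2.3, 3300, 1700, 2.4,
                          d_delta=0.05, d_eps=0.10, theta_deg=theta)
# amplitude correction: subtract `aniso` from the observed angle-dependent
# amplitudes before the second isotropic inversion pass
```

Note how the Δδ term controls the near-to-mid-angle gradient while the Δε term only matters at far angles, which is why the signs of Δδ and Δε can be diagnosed from the AVO gradient, as the abstract describes.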
83.
Developing a general framework to capture the complexities associated with the non-linear and adaptive behaviour of farmers facing water scarcity is a challenging problem. This paper integrates agent-based modelling (ABM) with a data-mining method to develop a hybrid socio-hydrological framework that provides future insights for policy-makers. Data on the farmers' main characteristics were collected through field surveys and interviews. Association rule mining was then employed to discover the main patterns representing the farmers' agricultural decisions. The discovered patterns were used as the behavioural rules in the ABM to simulate agricultural activities. The proposed framework was applied to explore the interactions between agricultural activities and the main river feeding Lake Urmia, Iran. The outcomes indicate that farmers' acquisitive traits and assets have significant impacts on their socio-hydrological interactions. The reported values of the efficiency criteria support the satisfactory performance of the proposed framework.
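The pipeline from survey data to agent behaviour can be sketched in a few lines (a minimal illustration with invented attribute and decision labels, not the paper's survey variables). Association rules of the form "attribute → decision" are mined from survey records by support and confidence, and the surviving rules become the agents' decision table:

```python
# hypothetical survey records: each set holds one farmer's attributes plus
# the observed decision (all labels invented for illustration)
surveys = [
    {"small_farm", "low_income", "reduce_area"},
    {"small_farm", "low_income", "reduce_area"},
    {"large_farm", "high_income", "drill_well"},
    {"small_farm", "high_income", "reduce_area"},
    {"large_farm", "low_income", "drill_well"},
]

DECISIONS = {"reduce_area", "drill_well"}

def mine_rules(records, min_conf=0.6):
    """Single-antecedent association rules 'attribute -> decision', kept when
    confidence = support(attr & dec) / support(attr) >= min_conf."""
    rules = {}
    attributes = {item for r in records for item in r} - DECISIONS
    for attr in attributes:
        support_attr = sum(attr in r for r in records)
        for dec in DECISIONS:
            both = sum(attr in r and dec in r for r in records)
            if support_attr and both / support_attr >= min_conf:
                rules[attr] = dec
    return rules

behaviour = mine_rules(surveys)
# inside the ABM, an agent tagged "small_farm" would then act according to
# behaviour["small_farm"] at each simulation step
```

Real applications would mine multi-antecedent rules (e.g. with an Apriori implementation) and resolve conflicting rules by confidence ranking; the sketch only shows how mined patterns become behavioural rules.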
84.
Matching pursuit belongs to the category of spectral decomposition approaches that use a pre-defined discrete wavelet dictionary to decompose a signal adaptively. Although free of windowing issues, matching pursuit demands a high computational cost, since extracting all the local structure of a signal requires a large dictionary: to find the best-matching wavelet, the whole space must be searched. To reduce the computational cost of greedy matching pursuit, two artificial-intelligence methods, (1) a quantum-inspired evolutionary algorithm and (2) particle swarm optimization, are introduced for two successive steps: (a) initial estimation and (b) optimization of the wavelet parameters. We call this algorithm quantum swarm evolutionary matching pursuit. It starts with a small colony of individuals, each of which is potentially a transformed time-frequency atom. To maximize the match between the candidate wavelets and the residual, the colony members are adjusted in an evolutionary way. In addition, quantum-computing concepts such as the quantum bit, quantum gates, and superposition of states are introduced into the method. The algorithm parameters, such as the social and cognitive learning factors, population size, and global migration period, are optimized using seismic signals. In applying matching pursuit to geophysical data, complex trace attributes are typically used for the initial estimation of wavelet parameters; in this study, however, it is shown that using complex trace attributes is sensitive to noise and yields a lower rate of convergence. The algorithm's performance on noisy signals with non-orthogonal dictionaries is investigated and compared with other methods such as orthogonal matching pursuit. The results illustrate that quantum swarm evolutionary matching pursuit has the least sensitivity to noise and the highest rate of convergence.
Finally, the algorithm is applied to both modelled seismograms and real data for the detection of low-frequency anomalies to validate the findings.
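The baseline that the evolutionary search accelerates is plain greedy matching pursuit, sketched below with an invented dictionary of windowed cosines (not the paper's atoms or data). At each step the atom with the largest inner product with the residual is selected and its projection subtracted; with a large dictionary this exhaustive `argmax` over all atoms is exactly the cost the quantum swarm search avoids:

```python
import numpy as np

def matching_pursuit(signal, dictionary, n_iter=10):
    """Greedy matching pursuit: repeatedly pick the unit-norm atom with the
    largest |inner product| with the residual and subtract its projection."""
    residual = signal.astype(float).copy()
    decomposition = []                      # (atom index, coefficient) pairs
    for _ in range(n_iter):
        corr = dictionary @ residual        # exhaustive dictionary scan
        k = int(np.argmax(np.abs(corr)))
        decomposition.append((k, corr[k]))
        residual -= corr[k] * dictionary[k]
    return decomposition, residual

# small non-orthogonal dictionary of Gaussian-windowed cosines (invented)
n = 256
t = np.arange(n)
atoms = []
for f in (0.02, 0.05, 0.10):
    for c in range(0, n, 32):
        a = np.exp(-0.5 * ((t - c) / 10.0)**2) * np.cos(2 * np.pi * f * (t - c))
        atoms.append(a / np.linalg.norm(a))
dictionary = np.array(atoms)

# a signal built from two known atoms is recovered almost exactly
signal = 3.0 * dictionary[5] + 1.5 * dictionary[20]
decomp, res = matching_pursuit(signal, dictionary, n_iter=5)
```

Each iteration scans every atom, so runtime grows linearly with dictionary size per extracted atom; the evolutionary/swarm variant replaces this scan with a guided search over continuous atom parameters.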
85.
The complex stream-bank profiles that form in alluvial channels and rivers after reaching equilibrium have been a popular research topic for geomorphologists and river engineers, and entropy theory has recently been applied to this problem with success. However, the existing methods restrict further application of the entropy parameter to determining the cross-sectional slope of the river banks. To overcome this limitation, we introduce a novel approach to deriving the governing equation, based on calculation of the entropy parameter (λ) and the transverse slope of the bank profile at threshold channel conditions. The effects of different hydraulic and geometric parameters on the variation of the entropy parameter are evaluated. Sensitivity analysis of the parameters affecting the entropy parameter shows that the most influential parameters on the λ-slope multiplier are the maximum slope of the bank profile and the dimensionless lateral distance of the river banks.
86.
87.
Geomagnetism and Aeronomy - In this study, a hypothesis is proposed about the possible effect of the geomagnetic field (GMF) on the charge structure of a thundercloud, based on the Lorentz force equation...
88.
One important tool for water resources management in arid and semi-arid areas is groundwater potential mapping. In this study, four data-mining models, K-nearest neighbour (KNN), linear discriminant analysis (LDA), multivariate adaptive regression splines (MARS), and quadratic discriminant analysis (QDA), were used to produce better and more accurate groundwater potential maps (GPMs). For this purpose, 14 groundwater-influencing factors were considered: altitude, slope angle, slope aspect, plan curvature, profile curvature, slope length, topographic wetness index (TWI), stream power index, distance from rivers, river density, distance from faults, fault density, land use, and lithology. Of the 842 springs in the study area, in the Khalkhal region of Iran, 70% (589 springs) were used for training and 30% (253 springs) as a validation dataset. The KNN, LDA, MARS, and QDA models were then applied in the R statistical software, and the results were mapped as GPMs. Finally, receiver operating characteristic (ROC) curves were used to evaluate the performance of the models. The areas under the ROC curves were 81.4%, 80.5%, 79.6%, and 79.2% for MARS, QDA, KNN, and LDA, respectively. It can therefore be concluded that the performances of KNN and LDA were acceptable and those of MARS and QDA were excellent. The results also showed high contributions from altitude, TWI, slope angle, and fault density, while plan curvature and land use were the least important factors.
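The evaluation loop in this kind of study (the paper uses R) can be sketched in Python with scikit-learn, using synthetic stand-ins for the 14 conditioning factors and the spring/non-spring labels; scikit-learn ships KNN, LDA, and QDA, while MARS needs a separate package and is omitted here:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                           QuadraticDiscriminantAnalysis)
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# synthetic stand-in: 842 "pixels", 14 conditioning factors, binary label
X, y = make_classification(n_samples=842, n_features=14, n_informative=8,
                           random_state=0)
# 70/30 train-validation split, mirroring the study design
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "KNN": KNeighborsClassifier(),
    "LDA": LinearDiscriminantAnalysis(),
    "QDA": QuadraticDiscriminantAnalysis(),
}
auc = {}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    # AUC of the ROC curve on held-out data, as in the paper's validation
    auc[name] = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
```

In a real GPM workflow, each row would be a raster cell with the factor values sampled at spring and non-spring locations, and the fitted probabilities would be mapped back onto the grid.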
89.
Methods for estimating the value of environmental goods are essential for economic planning and for moving toward development. In this paper, discrete payment-vehicle (dichotomous choice) methods, namely single-bounded and double-bounded dichotomous choice, are used to estimate the value of air pollution in Tehran and households' willingness to pay for improved air quality in four selected regions (Shahr-e-Ray, Shoosh, Haft-e-Tir and Tajrish), and the results of the two techniques are compared. The results showed that the total value of air-quality improvement was 2,398,657,500 thousand Rials per year under the double-bounded technique and 1,492,566,000 thousand Rials per year under the single-bounded technique (1 USD = 35,000 Rials), and that the weighted mean of each citizen's willingness to pay for improved air quality was 282,192 and 175,596 Rials per year, respectively. Considering the annual damage to health for every 1% increase in pollutants and the yearly cost of pollution reduction, which are 1,199,000,000 and 7,336,000,000 thousand Rials respectively, citizens' willingness to pay covers about 20% (single-bounded) and 30% (double-bounded) of the cost of pollution control, while 70% of the pollution is due to mobile sources. Citizens' low willingness to pay is attributed to their distrust of government policies and to their unawareness of the harmful effects of air pollution. In general, the cost of pollution estimated from the single-bounded dichotomous choice is closer to actual market conditions.
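The single-bounded estimator behind such numbers can be sketched as follows (a toy simulation with invented bids and a true mean WTP of 200 arbitrary units, not the paper's survey data). Each respondent answers yes/no to one bid; a logistic model Pr(yes) = 1/(1 + exp(−(α + β·bid))) is fitted, and the mean willingness to pay is recovered as −α/β:

```python
import numpy as np

rng = np.random.default_rng(1)

# hypothetical single-bounded survey: one bid per respondent, yes/no answer,
# simulated from a logit with true alpha=2, beta=-0.01 (mean WTP = 200)
bids = rng.choice([50.0, 100.0, 200.0, 300.0, 400.0], size=2000)
p_yes = 1.0 / (1.0 + np.exp(-(2.0 - 0.01 * bids)))
yes = rng.random(2000) < p_yes

# fit the logit by Newton's method on the log-likelihood
X = np.column_stack([np.ones_like(bids), bids])
w = np.zeros(2)
for _ in range(25):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    grad = X.T @ (yes - p)                   # score vector
    hess = -(X.T * (p * (1 - p))) @ X        # (negative-definite) Hessian
    w -= np.linalg.solve(hess, grad)

mean_wtp = -w[0] / w[1]                      # recovered mean WTP
```

The double-bounded variant adds a follow-up bid conditional on the first answer, which typically tightens the estimate; here only the simpler single-bounded likelihood is shown.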
90.
The present paper integrates a semi-automated object-based image analysis (OBIA) classification framework with a cellular automata-Markov model to study land-use/land-cover (LULC) change. Land-use maps of the Sarab plain in Iran for the years 2000, 2006, and 2014 were created from Landsat satellite data by applying an OBIA classification using the normalized difference vegetation index, salinity index, moisture stress index, soil-adjusted vegetation index, and elevation and slope indicators. The classifications yielded overall accuracies of 91%, 93%, and 94% for 2000, 2006, and 2014, respectively. Finally, using the transition matrix, the spatial distribution of land use was simulated for 2020. The results reveal that the area of orchards, irrigated agriculture, and dry-farm agriculture in the Sarab plain is increasing, while the amount of bare land is decreasing. These results are of great importance for regional authorities and decision makers in strategic land-use planning.
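The Markov half of a CA-Markov projection reduces to one matrix product (a toy sketch with invented class shares and transition probabilities, ignoring the cellular-automata spatial-allocation step and the exact calendar interval): the row-stochastic transition matrix estimated from two classified maps is applied to the latest class shares to project the next epoch:

```python
import numpy as np

# hypothetical area fractions in 2014: [orchard/irrigated, dry-farm, bare land]
state_2014 = np.array([0.30, 0.45, 0.25])

# hypothetical transition matrix estimated from the 2006 -> 2014 maps
# (rows: from-class, columns: to-class; each row sums to 1)
P = np.array([[0.90, 0.07, 0.03],
              [0.05, 0.90, 0.05],
              [0.10, 0.15, 0.75]])

# one Markov step projects the class shares forward
state_2020 = state_2014 @ P
```

With these invented numbers the projection reproduces the abstract's qualitative trend (cultivated classes grow, bare land shrinks); the CA component would then allocate those projected quantities to specific cells using suitability maps and neighbourhood rules.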