31.
Analysis of amplitude variation with offset (AVO) is an essential step in reservoir characterization. For an accurate characterization, amplitudes obtained under an isotropic assumption for the reservoir must be corrected for anisotropic effects. The objective is seismic anisotropic amplitude correction in an effective medium; to this end, the values and signs of the anisotropic parameter differences (Δδ and Δε) across the reflection interfaces are needed. These parameters can be identified from seismic and well-log data. A new technique for anisotropic amplitude correction was developed to modify amplitude changes in seismic data in transversely isotropic media with a vertical axis of symmetry (VTI). The results show that characteristics of pre-stack seismic data, namely the AVO gradient, can potentially be related to the sign of the anisotropic parameter differences (Δδ and Δε) between the two layers at the reflection boundary. The proposed methodology is designed to attain a proper fit between modelled and observed AVO responses, after anisotropic correction, for all possible lithofacies at the reservoir boundary. We first estimate the anisotropic parameters δ and ε away from the wells through Backus averaging of the elastic properties resulting from a first pass of isotropic pre-stack seismic inversion on input data with no amplitude correction. Next, we estimate the anisotropic parameter differences at the reflection interfaces (the values and signs of Δδ and Δε). We then generate seismic angle-gather data after anisotropic amplitude correction using Rüger's equation for the P-P reflection coefficient. A second pass of isotropic pre-stack seismic inversion is then performed on the amplitude-corrected data, and elastic properties are estimated. The final outcome demonstrates how the introduced methodology helps to reduce the uncertainty of elastic property prediction.
Pre-stack seismic inversion on amplitude-corrected seismic data yields more accurate elastic property predictions than can be obtained from non-corrected data. Moreover, a new anisotropy attribute (ν) is presented to improve lithology identification.
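The anisotropic correction hinges on Rüger's linearized P-P reflection coefficient for a VTI interface, in which Δδ scales the near-offset (sin²θ) gradient and Δε the far-offset (sin²θ tan²θ) term. A minimal sketch of that equation follows; the layer values used in the example are hypothetical:

```python
import math

def rugers_pp_vti(theta_deg, vp1, vs1, rho1, vp2, vs2, rho2, d_delta, d_eps):
    """Ruger's (1997) linearized P-P reflection coefficient across a
    VTI/VTI interface; d_delta and d_eps are the lower-minus-upper
    anisotropic parameter differences."""
    th = math.radians(theta_deg)
    vp, vs = 0.5 * (vp1 + vp2), 0.5 * (vs1 + vs2)
    dvp = vp2 - vp1
    z1, z2 = rho1 * vp1, rho2 * vp2              # P-impedances
    g1, g2 = rho1 * vs1 ** 2, rho2 * vs2 ** 2    # shear moduli
    sin2, tan2 = math.sin(th) ** 2, math.tan(th) ** 2
    return ((z2 - z1) / (z1 + z2)
            + 0.5 * (dvp / vp
                     - (2 * vs / vp) ** 2 * (g2 - g1) / (0.5 * (g1 + g2))
                     + d_delta) * sin2
            + 0.5 * (dvp / vp + d_eps) * sin2 * tan2)

# Hypothetical interface; compare the 30-degree response with and
# without a positive d_delta to see the gradient correction.
r_iso = rugers_pp_vti(30, 3000, 1500, 2.3, 3200, 1600, 2.4, 0.0, 0.0)
r_vti = rugers_pp_vti(30, 3000, 1500, 2.3, 3200, 1600, 2.4, 0.1, 0.0)
```

At normal incidence the expression reduces to the isotropic impedance contrast, so the anisotropic terms only modify the offset-dependent part of the response.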
32.
Developing a general framework that captures the complexities associated with the non-linear and adaptive behaviour of farmers facing water scarcity is a challenging problem. This paper integrates agent-based modelling (ABM) with a data-mining method to develop a hybrid socio-hydrological framework that provides future insights for policy-makers. Data on the farmers' main characteristics were collected through field surveys and interviews. Association rule mining was then employed to discover the main patterns representing the farmers' agricultural decisions, and the discovered patterns were used as the behavioural rules in the ABM to simulate agricultural activities. The proposed framework was applied to explore the interactions between agricultural activities and the main river feeding Lake Urmia, Iran. The outcomes indicate that farmers' acquisitive traits and belongings have significant impacts on their socio-hydrological interactions. The reported values of the efficiency criteria suggest satisfactory performance of the proposed framework.
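The survey-to-rules pipeline described above can be sketched as follows: compute support and confidence for candidate association rules over the survey records, then wire a high-confidence rule into an agent's step function. All trait and crop names below are hypothetical placeholders, not the paper's actual variables:

```python
def rule_stats(records, antecedent, consequent):
    """Support and confidence of the rule antecedent -> consequent over
    a list of item-sets (one set per surveyed farmer)."""
    n = len(records)
    a = sum(1 for r in records if antecedent <= r)
    both = sum(1 for r in records if (antecedent | consequent) <= r)
    return both / n, (both / a if a else 0.0)

# Hypothetical survey: each set lists a farmer's traits and decision.
surveys = [
    {"small_farm", "low_water", "plants_wheat"},
    {"small_farm", "low_water", "plants_wheat"},
    {"large_farm", "low_water", "plants_orchard"},
    {"small_farm", "high_water", "plants_rice"},
]

sup, conf = rule_stats(surveys, {"small_farm", "low_water"}, {"plants_wheat"})

def farmer_step(traits):
    """A mined high-confidence rule becomes an ABM behaviour rule."""
    if {"small_farm", "low_water"} <= traits and conf >= 0.8:
        return "plants_wheat"
    return "default_crop"
```

In a full model each agent would carry its own trait set and the rule base would be mined once, up front, from the whole survey.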
33.
Matching pursuit belongs to the category of spectral decomposition approaches that use a pre-defined discrete wavelet dictionary to decompose a signal adaptively. Although free of windowing issues, matching pursuit demands a high computational cost, since extracting all the local structure of a signal requires a large dictionary, and finding the best-matching wavelet requires searching the whole space. To reduce the computational cost of greedy matching pursuit, two artificial intelligence methods, (1) a quantum-inspired evolutionary algorithm and (2) particle swarm optimization, are introduced for two successive steps: (a) initial estimation and (b) optimization of the wavelet parameters. We call this algorithm quantum swarm evolutionary matching pursuit. It starts with a small population in which each individual is potentially a transformed form of a time-frequency atom. To maximize the match between candidate wavelets and the residual, the population members are adjusted in an evolutionary way. In addition, quantum computing concepts such as the quantum bit, quantum gate, and superposition of states are incorporated into the method. The algorithm parameters, such as the social and cognitive learning factors, population size, and global migration period, are optimized using seismic signals. In applying matching pursuit to geophysical data, complex trace attributes are typically used for the initial estimation of wavelet parameters; however, this study shows that using complex trace attributes is sensitive to noise and yields a lower rate of convergence. The algorithm's performance on noisy signals using non-orthogonal dictionaries is investigated and compared with other methods such as orthogonal matching pursuit. The results illustrate that quantum swarm evolutionary matching pursuit has the least sensitivity to noise and the highest rate of convergence.
Finally, the algorithm is applied to both modelled seismograms and real data for the detection of low-frequency anomalies to validate the findings.
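One iteration of dictionary-free matching pursuit can be sketched by letting a swarm search the continuous parameters of a Gabor atom against the current residual. The sketch below uses plain PSO (the paper's quantum-inspired operators are omitted for brevity); the parameter bounds and swarm settings are illustrative assumptions:

```python
import math, random

def gabor(t, u, f, s):
    """Real Gabor atom: Gaussian envelope at centre u (s), width s, frequency f (Hz)."""
    return math.exp(-((t - u) / s) ** 2) * math.cos(2 * math.pi * f * (t - u))

def correlate(signal, dt, u, f, s):
    """Normalized absolute inner product of the signal with the atom (u, f, s)."""
    atom = [gabor(i * dt, u, f, s) for i in range(len(signal))]
    norm = math.sqrt(sum(a * a for a in atom)) or 1.0
    return abs(sum(x * a for x, a in zip(signal, atom))) / norm

def pso_best_atom(signal, dt, n_particles=30, iters=60, seed=0):
    """PSO over (u, f, s): find the atom best matching the residual."""
    rng = random.Random(seed)
    bounds = [(0.0, len(signal) * dt), (5.0, 60.0), (0.01, 0.2)]
    pts = [[rng.uniform(a, b) for a, b in bounds] for _ in range(n_particles)]
    vel = [[0.0] * 3 for _ in range(n_particles)]
    pbest = [p[:] for p in pts]
    pval = [correlate(signal, dt, *p) for p in pts]
    g = max(range(n_particles), key=lambda i: pval[i])
    gbest, gval = pbest[g][:], pval[g]
    for _ in range(iters):
        for i, p in enumerate(pts):
            for d in range(3):
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * rng.random() * (pbest[i][d] - p[d])
                             + 1.5 * rng.random() * (gbest[d] - p[d]))
                p[d] = min(max(p[d] + vel[i][d], bounds[d][0]), bounds[d][1])
            v = correlate(signal, dt, *p)
            if v > pval[i]:
                pbest[i], pval[i] = p[:], v
                if v > gval:
                    gbest, gval = p[:], v
    return gbest

# Synthetic trace: a single 30 Hz atom centred at t = 0.5 s.
dt = 0.004
sig = [gabor(i * dt, 0.5, 30.0, 0.05) for i in range(256)]
u, f, s = pso_best_atom(sig, dt)
```

In the full algorithm the best atom's projection would be subtracted from the residual and the search repeated until the residual energy falls below a threshold.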
34.
The complex stream-bank profiles that form in alluvial channels and rivers after reaching equilibrium have long been a popular research topic among geomorphologists and river engineers. Entropy theory has recently been applied successfully to this problem. However, existing methods restrict the further application of the entropy parameter for determining the cross-sectional slope of river banks. To overcome this limitation, we introduce a novel approach to deriving an equation based on the calculation of the entropy parameter (λ) and the transverse slope of the bank profile under threshold channel conditions. The effects of different hydraulic and geometric parameters on the variation of the entropy parameter are evaluated. A sensitivity analysis of the parameters affecting the entropy parameter shows that the parameters most influencing the λ-slope multiplier are the maximum slope of the bank profile and the dimensionless lateral distance of the river banks.
35.
36.
Geomagnetism and Aeronomy - In this study, a hypothesis is proposed about the possible effect of the geomagnetic field (GMF) on the charge structure of a thundercloud, based on the Lorentz force equation...
37.
Helical piles are deep structural foundation elements that can be categorized as torque-driven piles and can be installed in marine settings without particular limitations. Different methods are used to predict the axial capacity of helical piles, such as static analysis, but these have limitations for this pile type under marine conditions. In situ testing methods, as a supplement to static analysis, have rarely been used for helical piles. In geotechnical engineering practice, the most common in situ tests applicable to coastal or offshore site investigation are the cone penetration test (CPT) and the piezocone penetration test (CPTu). The CPT is simple and repeatable and provides continuous records of the soil layers. In this paper, a data bank was compiled by collecting the results of static pile load tests on thirty-seven helical piles at ten different sites, including CPT or CPTu data. The axial capacities of the thirty-seven helical piles were predicted by direct CPT methods and by static analysis, and the accuracy of ten direct CPT methods in predicting the axial capacity of helical piles was investigated. Comparisons were made between the predicted values and the capacities measured in the pile load tests. The results indicated that recently developed methods such as NGI-05 (2005), ICP-05 (2005), and UWA-05 (2005) predicted the axial capacity of helical piles more accurately than older methods such as Meyerhof (1983), Schmertmann (1978), Dutch (1979), LCPC (1982), and Unicone (1997). However, more investigation is required to establish a better correlation between CPT data and the axial capacity of helical piles.
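A direct CPT method maps measured cone resistance q_c to unit shaft friction and unit base resistance and integrates them over the pile geometry. The sketch below uses generic LCPC-style coefficients purely for illustration; the actual coefficients of the compared methods (NGI-05, ICP-05, UWA-05, etc.) differ and are soil-type dependent, and the pile dimensions are hypothetical:

```python
import math

def axial_capacity_cpt(layers, tip_qc, shaft_area_per_m, tip_area,
                       alpha_s=100.0, k_c=0.4, f_cap=120.0):
    """Generic direct CPT estimate of axial pile capacity (kN).
    Unit shaft friction f_s = q_c / alpha_s, capped at f_cap (kPa);
    unit base resistance q_b = k_c * q_c at the tip.
    layers: list of (thickness_m, qc_kPa) along the shaft."""
    shaft = sum(t * min(qc / alpha_s, f_cap) * shaft_area_per_m
                for t, qc in layers)
    base = k_c * tip_qc * tip_area
    return shaft + base

# Hypothetical helical pile: 6 m of 89 mm shaft, 0.30 m diameter helix.
layers = [(2.0, 2000.0), (2.0, 4000.0), (2.0, 6000.0)]
Q = axial_capacity_cpt(layers, tip_qc=6000.0,
                       shaft_area_per_m=math.pi * 0.089,
                       tip_area=math.pi * 0.15 ** 2)
```

Comparing such predictions against static load-test capacities, as the paper does across ten methods, is then a matter of swapping in each method's own alpha_s, k_c, and capping rules.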
38.
Groundwater potential mapping is an important tool for water resources management in arid and semi-arid areas. In this study, four data-mining models, K-nearest neighbour (KNN), linear discriminant analysis (LDA), multivariate adaptive regression splines (MARS), and quadratic discriminant analysis (QDA), were used to produce more accurate groundwater potential maps (GPMs). For this purpose, 14 groundwater influence factors were considered: altitude, slope angle, slope aspect, plan curvature, profile curvature, slope length, topographic wetness index (TWI), stream power index, distance from rivers, river density, distance from faults, fault density, land use, and lithology. Of the 842 springs in the study area, in the Khalkhal region of Iran, 70% (589 springs) were used for training and 30% (253 springs) were used as a validation dataset. The KNN, LDA, MARS, and QDA models were then applied in the R statistical software, and the results were mapped as GPMs. Finally, receiver operating characteristic (ROC) curves were used to evaluate the performance of the models. The areas under the ROC curves were 81.4%, 80.5%, 79.6%, and 79.2% for MARS, QDA, KNN, and LDA, respectively. It can therefore be concluded that the performances of KNN and LDA were acceptable, while those of MARS and QDA were excellent. The results also indicated high contributions from altitude, TWI, slope angle, and fault density, whereas plan curvature and land use were the least important factors.
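Two of the building blocks here, a KNN potential score and the ROC area under the curve, can be sketched in a few lines. The study used R; this is an illustrative pure-Python equivalent on a toy two-factor space:

```python
def knn_predict(train_X, train_y, x, k=3):
    """Score = fraction of spring-positive cells among the k nearest
    training cells (Euclidean distance in factor space)."""
    order = sorted(range(len(train_X)),
                   key=lambda i: sum((a - b) ** 2
                                     for a, b in zip(train_X[i], x)))
    return sum(train_y[i] for i in order[:k]) / k

def roc_auc(scores, labels):
    """AUC via the rank-sum (Mann-Whitney) statistic."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy factor space: three dry cells near the origin, three spring cells.
cells = [[0, 0], [0, 1], [1, 0], [5, 5], [5, 6], [6, 5]]
springs = [0, 0, 0, 1, 1, 1]
scores = [knn_predict(cells, springs, c) for c in ([0.5, 0.5], [5.5, 5.5])]
auc = roc_auc(scores, [0, 1])
```

In the real study each cell carries 14 factor values instead of 2, and the AUC is computed over the 253 validation springs.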
39.
Geostatistical optimization in the design of infill boreholes is an important, cost-effective approach to increasing the accuracy of the estimated tonnage and grade of an ore deposit. In this research, a new approach is proposed to design optimal infill directional boreholes. The kriging estimation variance is taken as the objective function, and the number and properties of the optimal boreholes are estimated so as to minimize it. The optimization is implemented with the particle swarm optimization (PSO) algorithm. The range of spatial and directional properties of the new boreholes is determined from the primary information on the mineralization and the administrative constraints on drilling. The PSO algorithm is then applied iteratively; in each iteration, the change in the estimated kriging variance after drilling the new boreholes is determined and the properties of the new boreholes are updated. The iterative procedure continues until the minimum kriging variance is reached. The approach was applied to the Dalli Cu-Au porphyry deposit in Iran, and three new infill directional boreholes were designed, taking into account six earlier boreholes from the preliminary exploration stage. The new optimal boreholes were located where the least information from the preliminary exploration stage exists and the variance is highest. Two of the new boreholes are near-vertical (78°) and the third is inclined at a 55° dip. By drilling these three new boreholes, the estimated grade model could be upgraded by 20%. Simplicity, speed, and the ability to search for the required number and specifications of a group of directional boreholes in a 3D environment are the main advantages of the proposed approach.
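The quantity being minimized is the kriging variance, which depends only on sample geometry, not on the measured values. A minimal 2D sketch follows, using simple kriging with a spherical variogram and a grid search standing in for the paper's PSO over 3D directional boreholes; the sill, range, and sample locations are hypothetical:

```python
import math

def sph_gamma(h, sill=1.0, rng=300.0):
    """Spherical variogram gamma(h); covariance is C(h) = sill - gamma(h)."""
    if h >= rng:
        return sill
    r = h / rng
    return sill * (1.5 * r - 0.5 * r ** 3)

def solve(A, b):
    """Gaussian elimination with partial pivoting for small systems."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for j in range(c, n + 1):
                M[r][j] -= f * M[c][j]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][j] * x[j]
                              for j in range(r + 1, n))) / M[r][r]
    return x

def sk_variance(samples, pt, sill=1.0, rng=300.0):
    """Simple-kriging estimation variance at pt for given sample locations."""
    cov = lambda a, b: sill - sph_gamma(math.dist(a, b), sill, rng)
    C = [[cov(s, t) for t in samples] for s in samples]
    k = [cov(s, pt) for s in samples]
    w = solve(C, k)
    return sill - sum(wi * ki for wi, ki in zip(w, k))

# Pick the grid node with the highest kriging variance as the next
# infill location (the paper instead minimizes total variance over
# 3D directional boreholes with PSO).
samples = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0)]
grid = [(x, y) for x in range(0, 301, 50) for y in range(0, 301, 50)]
best = max(grid, key=lambda p: sk_variance(samples, p))
```

The variance is zero at existing samples and rises to the sill beyond the variogram range, so the search is naturally drawn to the least-informed part of the domain.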
40.
Quantitative analyses of groundwater flow and transport typically rely on a physically based model, which is inherently subject to error. Errors in model structure, parameters, and data lead to both random and systematic error, even in the output of a calibrated model. We develop complementary data-driven models (DDMs) to reduce the predictive error of physically based groundwater models. Two machine learning techniques, instance-based weighting and support vector regression, are used to build the DDMs. The approach is illustrated using two real-world case studies: the Republican River Compact Administration model and the Spokane Valley-Rathdrum Prairie model. The two groundwater models have different hydrogeologic settings, parameterizations, and calibration methods. In the first case study, cluster analysis is introduced for data preprocessing to make the DDMs more robust and computationally efficient. The DDMs reduce the root-mean-square error (RMSE) of the temporal, spatial, and spatiotemporal predictions of piezometric head by 82%, 60%, and 48%, respectively. In the second case study, the DDMs reduce the RMSE of the temporal prediction of piezometric head by 77%. It is further demonstrated that the effectiveness of the DDMs depends on the existence and extent of structure in the error of the physically based model.
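The idea of training a DDM on the physical model's residuals can be sketched with a k-nearest-neighbour error model standing in for the paper's instance-based weighting and support vector regression. The data below are synthetic: the "physical model" carries a systematic bias that grows with the input, which is exactly the kind of structured error a DDM can learn and remove:

```python
def knn_error_model(train_inputs, train_errors, x, k=3):
    """Predict the physical model's error at x from the k nearest
    training instances (stand-in for instance-based weighting / SVR)."""
    idx = sorted(range(len(train_inputs)),
                 key=lambda i: abs(train_inputs[i] - x))[:k]
    return sum(train_errors[i] for i in idx) / k

def rmse(a, b):
    return (sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)) ** 0.5

# Synthetic training data: observed head is 2.0*x; the calibrated
# physical model predicts 2.1*x (a bias growing with x).
inputs = [float(i) for i in range(20)]
observed = [2.0 * x for x in inputs]
modelled = [2.1 * x for x in inputs]
errors = [o - m for o, m in zip(observed, modelled)]

# At unseen points, correct the raw model output with the learned error.
test_x = [2.5, 7.5, 12.5]
truth = [2.0 * x for x in test_x]
raw = [2.1 * x for x in test_x]
corrected = [r + knn_error_model(inputs, errors, x)
             for r, x in zip(raw, test_x)]
```

If the model's error were purely random rather than structured, the learned correction would average to zero and the DDM would offer no improvement, which is the dependence the paper demonstrates.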