71.
A challenge when working with multivariate data in a geostatistical context is that the data are rarely Gaussian. Multivariate distributions may include nonlinear features, clustering, long tails, functional boundaries, spikes, and heteroskedasticity. Multivariate transformations account for such features so that they are reproduced in geostatistical models. Projection pursuit, as developed for high-dimensional data exploration, can also be used to transform a multivariate distribution into a multivariate Gaussian distribution with an identity covariance matrix. Its application within a geostatistical modeling context is called the projection pursuit multivariate transform (PPMT). An approach to incorporate exhaustive secondary variables in the PPMT is introduced. With this approach the PPMT can incorporate any number of secondary variables alongside any number of primary variables. A necessary alteration to make this numerically practical was the implementation of a continuous probability estimator, based on Bernstein polynomials, for the transformation that takes place in the projections. The stopping criteria were also updated to incorporate a bootstrap t test that compares data sampled from a multivariate Gaussian distribution with the data undergoing transformation.
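To make the projection step concrete, the following minimal sketch (Python; the node count m and the rescaling are illustrative assumptions, not the authors' implementation) builds a continuous Bernstein-polynomial CDF estimate and uses it to normal-score a one-dimensional projection, which is the transformation applied inside each PPMT iteration:

```python
import numpy as np
from scipy.stats import binom, norm

def bernstein_cdf(data, m=50):
    """Smooth CDF estimate via Bernstein polynomials.

    Returns a callable F(x); `data` is rescaled to [0, 1] first.
    """
    lo, hi = data.min(), data.max()
    u = (data - lo) / (hi - lo)          # rescale to the unit interval
    # Empirical CDF evaluated at the Bernstein nodes k/m
    ecdf_nodes = np.array([(u <= k / m).mean() for k in range(m + 1)])

    def F(x):
        x = np.clip((np.asarray(x) - lo) / (hi - lo), 0.0, 1.0)
        # Bernstein basis: Binomial(m, x) pmf at k, summed against ECDF values
        k = np.arange(m + 1)
        basis = binom.pmf(k[:, None], m, x.ravel()[None, :])
        return (ecdf_nodes[:, None] * basis).sum(axis=0).reshape(np.shape(x))

    return F

# One PPMT-style projection step: Gaussianize the projected data
proj = np.random.randn(500) ** 3                 # skewed 1-D projection (toy data)
F = bernstein_cdf(proj)
z = norm.ppf(np.clip(F(proj), 1e-6, 1 - 1e-6))   # normal-score transform
```

The smooth CDF avoids the ties and staircase artifacts of the raw empirical CDF, which is what makes the repeated projection transforms numerically practical.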
72.
Several risk factors associated with an increased likelihood of healthcare-associated Clostridium difficile infection (CDI) have been identified in the literature. These risk factors are mainly related to age, previous CDI, antimicrobial exposure, and prior hospitalization. No model is available in the published literature that can predict CDI incidence from healthcare administration data. Moreover, administrative data can be imprecise, which complicates the building of classical statistical models. Fuzzy set theory can deal with the imprecision inherent in such data. This research aimed to develop a model based on deterministic and fuzzy mathematical techniques for the prediction of hospital-associated CDI using explanatory variables controllable by hospitals and health authority administration. Retrospective data on CDI incidence and other administrative data obtained from 22 hospitals within a regional health authority in British Columbia were used to develop a decision tree (a deterministic technique) and a fuzzy synthetic evaluation model (a fuzzy technique). The decision tree model had a higher prediction accuracy than the fuzzy-based model. However, among the predictions on which the two models agreed, 72% were correct. This relationship was therefore used to combine their results to increase the precision and strength of evidence of the prediction. These models were further used to develop an Excel-based tool called C. difficile Infection Incidence Prediction in Hospitals (CDIIPH). The tool can be utilized by health authorities and hospitals to predict the magnitude of CDI incidence in the following quarter.
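The exact combination rule is not spelled out in the abstract; one simple reading — report a prediction only where the two models agree, since agreed predictions were correct 72% of the time — can be sketched as follows (Python; hypothetical interface and class labels):

```python
def combine_predictions(tree_preds, fuzzy_preds):
    """Keep only the cases where the decision-tree and fuzzy models agree.

    tree_preds, fuzzy_preds: per-hospital predicted CDI-incidence classes
    (e.g. 'low'/'high'). Returns (case index, agreed class) pairs; per the
    abstract, such agreed predictions carry the stronger evidence.
    """
    return [(i, t) for i, (t, f) in enumerate(zip(tree_preds, fuzzy_preds))
            if t == f]
```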
73.
Regional frequency analysis is an important tool for properly estimating hydrological characteristics at ungauged or partially gauged sites in order to prevent hydrological disasters. The delineation of homogeneous groups of sites is an important first step in transferring information and obtaining accurate quantile estimates at the target site. The Hosking–Wallis (HW) homogeneity test is usually used to test the homogeneity of the selected sites. Despite its usefulness and good power, it presents some drawbacks, including the subjective choice of a parametric distribution for the data and a poorly justified rejection threshold. The present paper addresses these drawbacks by integrating nonparametric procedures into the L-moment homogeneity test. To assess the rejection threshold, three resampling methods (permutation, bootstrap and Pólya resampling) are considered. Results indicate that the permutation and bootstrap methods perform better than the parametric Hosking–Wallis test in terms of power as well as computation time and procedural simplicity. A real-world case study shows that the nonparametric tests agree with the HW test concerning the homogeneity of the volume and the bivariate case, while they disagree for the peak case, for which the assumptions of the HW test are not well respected.
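To make the resampling idea concrete, here is a minimal sketch (Python; a simplified illustration, not the paper's code) of a permutation version of the HW procedure: the heterogeneity statistic V — the record-length-weighted dispersion of the site L-CVs — is compared with its distribution under random reassignment of the pooled records to sites, replacing the parametric kappa-distribution simulation:

```python
import numpy as np

def lcv(x):
    """Sample L-CV (t = l2/l1) from the first two L-moments."""
    x = np.sort(x)
    n = len(x)
    b0 = x.mean()
    b1 = ((np.arange(n) / (n - 1)) * x).mean()   # unbiased estimator of b1
    l1, l2 = b0, 2 * b1 - b0
    return l2 / l1

def V_statistic(sites):
    """Record-length-weighted dispersion of the site L-CVs (HW's V)."""
    n = np.array([len(s) for s in sites])
    t = np.array([lcv(s) for s in sites])
    t_bar = (n * t).sum() / n.sum()
    return np.sqrt((n * (t - t_bar) ** 2).sum() / n.sum())

def permutation_homogeneity_test(sites, n_perm=999, seed=0):
    """Nonparametric rejection threshold: pool all records, reassign them at
    random to sites of the original lengths, and compare the observed V with
    its permutation distribution."""
    rng = np.random.default_rng(seed)
    v_obs = V_statistic(sites)
    pooled = np.concatenate(sites)
    lengths = [len(s) for s in sites]
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(pooled)
        splits = np.split(perm, np.cumsum(lengths)[:-1])
        if V_statistic(splits) >= v_obs:
            count += 1
    p_value = (count + 1) / (n_perm + 1)
    return v_obs, p_value   # small p-value -> evidence of heterogeneity
```

No distribution needs to be fitted, which removes the subjective parametric choice the abstract criticizes; the bootstrap and Pólya variants differ only in how the pseudo-samples are drawn.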
74.
Large observed datasets are often non-stationary and/or depend on covariates, especially in the case of extreme hydrometeorological variables. This complicates estimation using classical hydrological frequency analysis. A number of non-stationary models have been developed that use linear or quadratic polynomial functions or B-splines to estimate the relationship between parameters and covariates. In this article, we propose regularized generalized extreme value models with B-splines (GEV-B-splines models) in a Bayesian framework to estimate quantiles. Regularization is based on penalties and aims to favour parsimonious models, especially in high-dimensional settings. The penalties are introduced in a Bayesian framework and the corresponding priors are detailed. Five penalties are considered for comparison purposes: the least absolute shrinkage and selection operator (Lasso), Ridge, and three smoothly clipped absolute deviation (SCAD) methods (SCAD1, SCAD2 and SCAD3). Markov chain Monte Carlo (MCMC) algorithms were developed for each model to estimate quantiles and their posterior distributions. These approaches are tested and illustrated using simulated data with different sample sizes. A first simulation was performed on polynomial B-spline functions in order to choose the most efficient model in terms of the relative mean bias (RMB) and relative mean error (RME) criteria. A second simulation was performed with the SCAD1 penalty for sinusoidal dependence to illustrate the flexibility of the proposed approach. Results show clearly that the regularized approaches lead to a significant reduction of the bias and the mean square error, especially for small sample sizes (n < 100). A case study was considered to model annual peak flows at the Fort-Kent catchment with total annual precipitation as covariate. Conditional quantile curves are given for the regularized and maximum likelihood methods.
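For reference, the standard SCAD penalty of Fan and Li (2001) — presumably the basis of the SCAD1–SCAD3 variants, whose exact forms are not given in the abstract — applied to the B-spline coefficients can be sketched as follows (Python; a = 3.7 is the conventional tuning choice):

```python
import numpy as np

def scad_penalty(beta, lam, a=3.7):
    """SCAD penalty (Fan & Li, 2001), elementwise on coefficients beta:
    linear (Lasso-like) up to lam, quadratic blend up to a*lam, constant
    beyond, so large coefficients are not over-shrunk."""
    b = np.abs(beta)
    return np.where(
        b <= lam,
        lam * b,
        np.where(
            b <= a * lam,
            (2 * a * lam * b - b**2 - lam**2) / (2 * (a - 1)),
            lam**2 * (a + 1) / 2,
        ),
    )

def penalized_neg_log_posterior(neg_log_lik, beta, lam):
    """In the Bayesian setting the penalty acts as a negative log-prior:
    the MCMC target is the likelihood times exp(-penalty)."""
    return neg_log_lik + scad_penalty(beta, lam).sum()
```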
75.
Seismic behavior of gravity dams has long been evaluated using a representative two-dimensional (2D) system. Although formulated for gravity dams built in wide canyons, the assumption is nevertheless utilized extensively for almost all concrete dams, owing to the established procedures as well as the expected computational costs of a three-dimensional model. However, a significant number of roller-compacted concrete dams, characterized as such systems, do not conform to the basic assumptions of these methods, violating the conditions on canyon dimensions and joint spacing/details. Based on the premise that the 2D modeling assumption is overstretched for practical purposes in a variety of settings, the purpose of this study is to critically evaluate the use of 2D modeling for the prediction of seismic demands on these systems. Using a rigorous soil–structure interaction approach, the difference between the two- and three-dimensional responses of gravity dams was first investigated in the frequency domain for a range of canyon widths and foundation-to-dam moduli ratios. Then, the time-domain differences between the crest displacements and the maximum principal stresses were obtained using 70 different ground motions in order to show the possible bias introduced into the analysis results by the modeling approach. The results of the study show that, even for relatively wide canyons, 2D analysis can lead to misleading predictions.
76.
This paper reports the findings of an experimental study on replaceable links for steel eccentrically braced frames (EBFs). A replaceable link detail based on splicing the directly connected braces and the beam outside the link is proposed. This detail eliminates the need for hydraulic jacks and flame-cutting operations during replacement. The performance of the proposed replaceable link was studied by conducting eight nearly full-scale EBF tests under quasi-static cyclic loading. The link length ratio, stiffening of the link, loading protocol, connection type, bolt pretension, gap size of the splice connections, and demand-to-capacity ratios of the members were the prime variables. The specimens primarily showed two types of failure modes: link web fracture and fracture of the flange at the link-to-brace connection. No failures were observed at the splice connections, indicating that the proposed replaceable link detail provides an excellent response. The inelastic rotation capacity provided by the replaceable links satisfied the requirements of the AISC Seismic Provisions for Structural Steel Buildings (AISC 341-10). The overstrength factor of the links exceeded 2.0, which is larger than the value assumed for EBF links by design provisions. The high level of overstrength resulted in brace buckling in one of the specimens, demonstrating the importance of the overstrength factor used for EBF links.
77.
The variogram is a key parameter for geostatistical estimation and simulation. Preferential sampling may bias the spatial structure and often leads to noisy and unreliable variograms. A novel technique is proposed to weight variogram pairs in order to compensate for preferential or clustered sampling. Weighting the variogram pairs by global kriging of the quadratic differences between the tail and head values gives each pair an appropriate weight, removes noise, and minimizes artifacts in the experimental variogram. Moreover, variogram uncertainty can be computed with this technique. The required covariance between the pairs entering the variogram calculation is a fourth-order covariance that must be calculated from second-order moments. This introduces some circularity into the calculation, whereby an initial variogram must be assumed before calculating how the pairs should be weighted for the experimental variogram. The methodology is assessed with synthetic and realistic examples. For the synthetic example, a comparison between the traditional and declustered variograms shows that the declustered variograms are better estimates of the true underlying variograms. The realistic example also shows that the declustered sample variogram is closer to the true variogram.
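The following sketch (Python; a simplified illustration under the assumption that pair weights are supplied externally) shows where the weights enter the experimental variogram. Deriving the weights themselves — the global kriging of the pair differences using the fourth-order covariance the abstract describes — is omitted:

```python
import numpy as np

def weighted_variogram(head, tail, weights=None):
    """Experimental variogram for one lag: a (possibly weighted) average of
    the half squared differences between tail and head values.

    With equal weights this recovers the traditional estimator; in the
    proposed method the weights would come from a global kriging of the
    pair differences, down-weighting pairs from clustered samples.
    """
    d = 0.5 * (np.asarray(tail) - np.asarray(head)) ** 2
    w = np.ones_like(d) if weights is None else np.asarray(weights, float)
    w = w / w.sum()                     # normalize weights to sum to one
    return (w * d).sum()
```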
78.
Many catalogues, agency reports, and research articles have been published on the seismicity of Turkey and its surroundings since the 1950s. Given the existing magnitude heterogeneity and erroneous information on epicentral locations, event dates, and times, however, these past published data are far from fulfilling the required standards. The paucity of a standardized format in the available catalogues has reinforced the need for a refined and updated catalogue for earthquake-related hazard and risk studies. In this study, ~37,000 earthquakes and their parametric data were evaluated using more than 41 published studies, and an integrated database was prepared in order to analyse all parameters acquired from the catalogues and references for each event. Within the scope of this study, the epicentral locations of M ≥ 5.0 events were first reappraised based on the updated Active Fault Map of Turkey. As a result, an improved catalogue of 12,674 events for the period 1900–2012 was recompiled for the region between 32–45°N and 23–48°E by analyzing in detail the accuracy of all seismological parameters available for each event. The events, all with M ≥ 4.0, are reported in several magnitude scales (e.g., moment magnitude, Mw; surface-wave magnitude, MS; body-wave magnitude, mb; local magnitude, ML; and duration magnitude, Md), and the maximum focal depth reaches 225 km. To provide homogeneous data, the improved catalogue was unified in terms of Mw. Fore- and aftershocks were also removed from the catalogue, and completeness analyses were performed both separately for the various tectonic sources and for the study region as a whole. The resulting homogeneous and declustered catalogue of 6573 events provides a reliable input for seismic hazard assessment studies of Turkey and its surrounding areas.
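The declustering method is not named in the abstract; a common window-based approach, sketched below with the widely used Gardner–Knopoff (1974) windows (Python; the study's actual windows and implementation may differ), flags smaller events inside the space-time window of a larger one as fore- or aftershocks:

```python
import numpy as np

def gk_windows(mw):
    """Gardner-Knopoff (1974) space (km) and time (days) windows as a
    function of magnitude -- one common parameterization, shown here
    for illustration only."""
    dist_km = 10 ** (0.1238 * mw + 0.983)
    time_days = np.where(
        mw >= 6.5,
        10 ** (0.032 * mw + 2.7389),
        10 ** (0.5409 * mw - 0.547),
    )
    return dist_km, time_days

def decluster(catalog):
    """Remove fore- and aftershocks: any event smaller than, and inside the
    space-time window of, another event is flagged as dependent.

    catalog: list of (time_days, x_km, y_km, mw).
    Quadratic sketch; real catalogues need a sorted/indexed implementation.
    """
    keep = [True] * len(catalog)
    for i, (t1, x1, y1, m1) in enumerate(catalog):
        d_win, t_win = gk_windows(m1)
        for j, (t2, x2, y2, m2) in enumerate(catalog):
            if j == i or m2 >= m1:
                continue
            if abs(t2 - t1) <= t_win and np.hypot(x2 - x1, y2 - y1) <= d_win:
                keep[j] = False          # j is dependent on the larger event i
    return [ev for ev, k in zip(catalog, keep) if k]
```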
79.
A review of the historical evolution of seismic hazard maps in Turkey is followed by a summary of the important aspects of the updated national probabilistic seismic hazard maps. Comparisons with the predecessor probabilistic seismic hazard maps, as well as the implications for the national design codes, conclude the paper.
80.
The effectiveness of ultrasonication, microwave technologies, and enzyme addition prior to anaerobic digestion is investigated using sludge samples taken from the secondary settling tank of a domestic wastewater treatment plant, with the aims of improving methane production, enhancing the dewaterability characteristics of the sludge, and reducing excess sludge. Microwave pre-treatment (1500 W, 10 min at 175 °C) results in better extra digester performance (compared to the control reactor) in terms of methane production (25 m³ ton⁻¹ suspended solids (SS)) than ultrasound (no improvement) and enzyme pre-treatment (11 m³ ton⁻¹ SS). While methane production is not improved by ultrasonication pre-treatment (15,000 kJ kg⁻¹ SS), a noticeable increase (19%) is observed in the case of microwave pre-treatment. Higher compactibility values are obtained after ultrasonication and microwave application compared to the control (i.e., from 7.1 to 8.7 and 9.2%, respectively) before anaerobic digestion. Although ultrasonication and microwave application decrease the dewaterability of the raw sludge (capillary suction time (CST) from 827 to 1364 and 2290 s, respectively), similar dewaterability results are obtained at the end of the anaerobic digestion process for all pre-treated sludge samples. An economic assessment shows that microwave pre-treatment results in a more than 10-fold lower net cost than the enzyme application.