Search results: 122 records in total (118 subscription articles, 4 free), published between 1977 and 2023, distributed by subject as follows: atmospheric sciences (4), geophysics (24), geology (49), oceanography (7), astronomy (29), and physical geography (9).
1.
An important aspect in mineral resource evaluation is the reduction of variance when post-processing the grade distributions defined on the support (volume) of the available data into distributions defined on the support of the proposed selective mining units. Although the volume-variance relationship is well understood for the estimation of global grade distributions, it is still an unsolved issue for local estimation studies based on non-parametric geostatistical methods, such as indicator kriging, for which the support correction is not inherent to the method. To clarify this relationship, the local change of support problem is examined in the scope of two parametric models (multi-Gaussian and discrete Gaussian models). It is shown that the variance reduction factor between point and block-support local distributions depends on the block being considered and is less than the global variance reduction factor. As a consequence, post-processing the local point-support grade distributions on the basis of the latter systematically understates the importance of the change of support at the local scale and makes selective mining appear more economically attractive than it really is. In the light of these results, a methodology is proposed to post-process the local point-support distributions obtained via non-parametric (indicator) methods into block-support distributions. An application to simulated data indicates that this methodology provides an accurate estimation at the block support when dealing with diffusion-type random fields.
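To make the volume-variance idea above concrete, the following sketch approximates a block-support variance by averaging a point-support covariance model over a discretized block and compares it with the point variance; their ratio is the global variance reduction factor mentioned in the abstract. This is a minimal illustration, not the authors' method: the spherical covariance model, its range, and the block size are arbitrary assumptions.

```python
import numpy as np

def spherical_cov(h, sill=1.0, a=30.0):
    """Spherical covariance model C(h) with sill and range a (assumed, for illustration)."""
    h = np.asarray(h, dtype=float)
    return np.where(h < a, sill * (1.0 - 1.5 * h / a + 0.5 * (h / a) ** 3), 0.0)

def block_variance(block_size, n_disc=5, sill=1.0, a=30.0):
    """Mean covariance over a square block discretized into n_disc x n_disc points (Krige's relationship)."""
    xs = (np.arange(n_disc) + 0.5) / n_disc * block_size
    pts = np.array([(x, y) for x in xs for y in xs])
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    return spherical_cov(d, sill, a).mean()

point_var = float(spherical_cov(0.0))    # C(0): point-support variance
block_var = float(block_variance(10.0))  # mean covariance within a 10 x 10 block
f = block_var / point_var                # global variance reduction factor
print(f"point variance {point_var:.3f}, block variance {block_var:.3f}, reduction factor {f:.3f}")
```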
2.
This paper presents a methodology for assessing local probability distributions by disjunctive kriging when the available data set contains imprecise measurements, such as noisy or soft information or interval constraints. The basic idea is to replace the set of imprecise data with a set of pseudo-hard data simulated from their posterior distribution; an iterative algorithm based on the Gibbs sampler is proposed to achieve this simulation step. The whole procedure is repeated many times, and the final result is the average of the disjunctive kriging estimates computed from each simulated data set. Because the kriging weights are data-independent, they need to be calculated only once, which enables fast computing. The simulation procedure requires encoding each datum as a pre-posterior distribution and assuming a Markov property to allow the updating of pre-posterior distributions into posterior ones. Although it suffers from some imperfections, disjunctive kriging turns out to be a much more flexible approach than conditional expectation, because of the vast class of models that allow its computation, namely isofactorial models.
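The pseudo-hard data simulation described above can be sketched in a few lines. The toy example below assumes a multi-Gaussian model with an exponential covariance and interval-constrained data, and resamples each imprecise datum from its conditional (simple kriging) distribution truncated to its interval; it is only an illustration of the Gibbs-sampler idea, not the paper's disjunctive-kriging implementation, and all locations, values, and intervals are made up.

```python
import numpy as np

rng = np.random.default_rng(0)

# 1-D data locations; hard data carry exact values, soft data only an interval (assumed example).
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
hard = {0: 0.3, 2: -0.5, 4: 1.1}          # index -> exact value
soft = {1: (0.0, 1.0), 3: (-2.0, 0.0)}    # index -> (lower, upper) interval constraint

cov = lambda h: np.exp(-np.abs(h) / 2.0)  # exponential covariance (assumption)
C = cov(x[:, None] - x[None, :])

z = np.zeros(len(x))
for i, v in hard.items():
    z[i] = v

def truncated_normal(mean, std, lo, hi):
    """Draw one value from N(mean, std^2) restricted to [lo, hi] by simple rejection."""
    while True:
        v = rng.normal(mean, std)
        if lo <= v <= hi:
            return v

for sweep in range(200):                  # Gibbs sweeps over the imprecise data
    for i, (lo, hi) in soft.items():
        others = [j for j in range(len(x)) if j != i]
        w = np.linalg.solve(C[np.ix_(others, others)], C[others, i])  # simple-kriging weights
        mean = w @ z[others]
        var = C[i, i] - w @ C[others, i]
        z[i] = truncated_normal(mean, np.sqrt(max(var, 1e-12)), lo, hi)

print("one set of pseudo-hard data:", np.round(z, 3))
```

In the full procedure, many such pseudo-hard data sets would be generated and the disjunctive kriging estimates averaged over them.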
3.
Histogram and variogram inference in the multigaussian model
Several iterative algorithms are proposed to improve histogram and variogram inference in the framework of the multigaussian model. The starting point is the variogram obtained after a traditional normal score transform. The next step consists of simulating many sets of Gaussian values with this variogram at the data locations, so that the ranking of the original values is honored. The expected Gaussian transformation and the expected variogram are computed by averaging over the simulated data sets. The variogram model is then updated and the procedure is repeated until convergence. Such an iterative algorithm can adapt to the case of tied data and despike the histogram. Two additional issues are also examined, relating to the modeling of the empirical transformation function and to the optimal weighting of pairs when computing the sample variogram.
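One ingredient of this iterative scheme, simulating Gaussian values at the data locations so that the ranking of the (possibly tied) original data is honored, can be sketched as follows. This is a simplified variant that simulates unconditionally and reorders the values with random tie-breaking; the paper's algorithm conditions the simulation more carefully, and the variogram update and convergence loop are omitted. The covariance model and data are made-up examples.

```python
import numpy as np

rng = np.random.default_rng(1)

# Original (possibly tied) data values at 1-D locations (assumed example).
x = np.linspace(0.0, 9.0, 10)
z = np.array([1.0, 2.0, 2.0, 2.0, 3.0, 5.0, 5.0, 7.0, 8.0, 9.0])  # note the ties

cov = lambda h: np.exp(-np.abs(h) / 3.0)  # current covariance model (assumption)
C = cov(x[:, None] - x[None, :])
L = np.linalg.cholesky(C + 1e-10 * np.eye(len(x)))

def rank_honoring_gaussian(z, L):
    """Simulate one Gaussian vector with covariance LL' whose ranking matches z (ties broken at random)."""
    y = L @ rng.standard_normal(len(z))
    order = np.lexsort((rng.random(len(z)), z))  # data order, with random tie-breaking
    out = np.empty_like(y)
    out[order] = np.sort(y)                      # smallest Gaussian value -> smallest datum, etc.
    return out

# Average the simulations to approximate the expected Gaussian transform of each datum.
sims = np.array([rank_honoring_gaussian(z, L) for _ in range(500)])
expected_transform = sims.mean(axis=0)
for zi, yi in zip(z, expected_transform):
    print(f"z = {zi:4.1f}  ->  expected normal score {yi:+.2f}")
```

The averaged values assign distinct normal scores to tied data, which is the despiking effect mentioned in the abstract.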
4.
This work deals with the geostatistical simulation of a family of stationary random field models with bivariate isofactorial distributions. Such models are defined as the sum of independent random fields with mosaic-type bivariate distributions and infinitely divisible univariate distributions. For practical applications, dead leaf tessellations are used, since they provide a wide range of models and allow conditioning the realizations on a set of data via an iterative procedure (simulated annealing). The model parameters can be determined by comparing the data variogram and madogram, and make it possible to control the spatial connectivity of the extreme values in the realizations. An application to a forest dataset is presented, in which a negative binomial model is used to characterize the distribution of coniferous trees over a wooded area.
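The fitting idea above rests on comparing the sample variogram and madogram of the data. The sketch below shows generic estimators of both statistics on synthetic scattered data; it is not the paper's code, and the data, lags, and tolerance are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic scattered data (assumed example): 2-D coordinates and values.
coords = rng.uniform(0, 100, size=(300, 2))
values = rng.gamma(shape=2.0, scale=1.0, size=300)

def variogram_madogram(coords, values, lags, tol):
    """Classical estimators: gamma(h) = mean of (z_i - z_j)^2 / 2, nu(h) = mean of |z_i - z_j| / 2."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    dz = values[:, None] - values[None, :]
    iu = np.triu_indices(len(values), k=1)   # each pair counted once
    d, dz = d[iu], dz[iu]
    gamma, nu = [], []
    for h in lags:
        sel = np.abs(d - h) <= tol
        gamma.append(0.5 * np.mean(dz[sel] ** 2))
        nu.append(0.5 * np.mean(np.abs(dz[sel])))
    return np.array(gamma), np.array(nu)

lags = np.arange(5.0, 55.0, 5.0)
gamma, nu = variogram_madogram(coords, values, lags, tol=2.5)
for h, g, m in zip(lags, gamma, nu):
    print(f"lag {h:4.1f}: variogram {g:.3f}, madogram {m:.3f}")
```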
5.
This work focuses on a random function model with gamma marginal and bivariate isofactorial distributions, which has been applied in mining geostatistics for estimating recoverable reserves by disjunctive kriging. The objective is to widen its use to conditional simulation and to further its application to the modeling of continuous attributes in geosciences. First, the main properties of the bivariate gamma isofactorial distributions are analyzed, with emphasis on the destructuring of the extreme values, the presence of a proportional effect (higher variability in high-valued areas), and the asymmetry in the spatial correlation of the indicator variables with respect to the median threshold. Then, we provide examples of stationary random functions with such bivariate distributions, for which the shape parameter of the marginal distribution is half an integer. These are defined as the sum of squared independent Gaussian random fields. An iterative algorithm based on the Gibbs sampler is proposed to perform the simulation conditional on a set of existing data. Such a ‘multivariate chi-square’ model generalizes the well-known multigaussian model and is more flexible, since it allows defining a shape parameter that controls the asymmetry of the marginal and bivariate distributions.
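The construction used here, a chi-square field obtained as the sum of squares of independent Gaussian random fields, is straightforward to illustrate. The minimal sketch below builds a 1-D example with an assumed exponential covariance and checks the gamma moments (shape k/2 and scale 2, i.e. mean k and variance 2k); the conditioning step based on the Gibbs sampler is not shown.

```python
import numpy as np

rng = np.random.default_rng(3)

n, k = 500, 3                                          # grid size and number of Gaussian fields (shape = k/2)
x = np.arange(n, dtype=float)
C = np.exp(-np.abs(x[:, None] - x[None, :]) / 20.0)    # exponential covariance (assumption)
L = np.linalg.cholesky(C + 1e-8 * np.eye(n))

# Sum of squares of k independent standard Gaussian fields -> chi-square field with k degrees of freedom,
# i.e. a gamma marginal with shape k/2 and scale 2.
fields = L @ rng.standard_normal((n, k))
chi2_field = (fields ** 2).sum(axis=1)

# Empirical moments are noisy because the field is spatially correlated, but should be close to theory.
print("empirical mean:", chi2_field.mean(), " (theoretical:", k, ")")
print("empirical var :", chi2_field.var(), " (theoretical:", 2 * k, ")")
```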
6.
Franchthi Cave, bordering Kiladha Bay in Greece, is a key archaeological site due to its long occupation time, from ~40,000 to ~5,000 years BP. To date, no clear evidence of Neolithic human dwellings in the cave has been found, supporting the assumption that Neolithic people may have built a village where Kiladha Bay now lies. During the Neolithic period/Early Holocene, wide areas of the bay were indeed exposed above sea level. Bathymetric and seismic data identified a terrace incised by a valley at ~1 to 2 m sediment depth. Eight sediment cores, up to 6.3 m long, were retrieved and analysed using petrophysical, sedimentological, geochemical, and chronostratigraphic methods. The longest core extends into the exposure surface, which consists of a layer of carbonate rubble in a finer matrix and represents weathering processes. Dated organic remains place this unit at ~8500 cal yr BP. It is overlain by stiff silty mud representing an estuarine environment. This mud is capped by reduced sediments with roots marking an exposure surface. A shell layer, dated to ~6300 cal yr BP, overlies this terrestrial sequence, reflecting the marine transgression. This layer occurs at 10.8 m below sea level, 7.7 m deeper than the global sea level at that time, suggesting tectonic subsidence in the area. It is overlain by finer-grained, carbonate-rich marine sediments. The top of the core shows traces of eutrophication, pebbles and marine shells, all likely a result of modern anthropogenic processes. These results are interpreted in the context of human occupation: the exposed surface contains pottery sherds, one dating to the Early to Middle Neolithic period, indicating that Neolithic people were present in this dynamic landscape, interacting with a migrating coastline. Even though the artefacts are isolated, future investigations of the submerged landscape off Franchthi Cave might lead to the discovery of a Neolithic village that eventually became buried under marine sediments.
7.
8.
9.
We present a study of the long-term evolution of the cloud of aerosols produced in the atmosphere of Jupiter by the impact of an object on 19 July 2009 (Sánchez-Lavega, A. et al. [2010]. Astrophys. J. 715, L155-L159). The work is based on images taken in visible continuum wavelengths during the 5 months from the impact to 31 December 2009, and in near-infrared deep hydrogen-methane absorption bands at 2.1-2.3 μm from 20 July 2009 to 28 May 2010. The impact cloud expanded zonally from ∼5000 km (19 July) to 225,000 km (29 October, about 180° in longitude), while remaining meridionally localized within a band from 53.5°S to 61.5°S planetographic latitude. During the first two months after its formation, the site showed a heterogeneous structure with embedded spots 500-1000 km in size. Later, the reflectivity of the debris field became more homogeneous due to clump mergers. The cloud was mainly dispersed in longitude by the dominant zonal winds and their meridional shear; during the initial stages, localized motions may have been induced by the thermal perturbation caused by the impact’s energy deposition. The tracking of individual spots within the impact cloud shows that the westward jet at 56.5°S latitude increases its eastward velocity with altitude above the tropopause by 5-10 m s−1. The corresponding vertical wind shear is low, about 1 m s−1 per scale height, in agreement with previous thermal wind estimations. We found evidence for discrete, localized meridional motions with speeds of 1-2 m s−1. Two numerical models are used to simulate the observed cloud dispersion. One is a pure advection of the aerosols by the winds and their shears. The other uses the EPIC code, a nonlinear calculation of the evolution of the potential vorticity field generated by a heat pulse that simulates the impact. Both models reproduce the observed global structure of the cloud and the dominant zonal dispersion of the aerosols, but not the details of the cloud morphology. The reflectivity of the impact cloud decreased exponentially with a characteristic timescale of 15 days; we can explain this behavior with a radiative transfer model of the cloud optical depth coupled to an advection model of the cloud dispersion by the wind shears. The expected sedimentation time in the stratosphere (altitude levels of 5-100 mbar) for the small aerosol particles forming the cloud is 45-200 days; thus, aerosols were removed vertically over the long term, following their zonal dispersion. No evidence of the cloud was detected 10 months after the impact.
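Two of the quantitative statements above, the roughly 15-day exponential fading of the cloud's reflectivity and the zonal stretching of the debris by meridionally sheared winds, can be illustrated with a back-of-the-envelope kinematic calculation. This is a toy sketch, not the advection or EPIC models used in the study; the shear value and band width are assumed round numbers chosen only for illustration.

```python
import numpy as np

TAU_DAYS = 15.0          # reported reflectivity e-folding time
SHEAR = 2.0e-6           # assumed meridional shear of the zonal wind, in s^-1 (illustrative value)
BAND_WIDTH_KM = 5000.0   # assumed meridional extent of the debris, km (illustrative value)

for t_days in (10, 30, 60, 100):
    t_s = t_days * 86400.0
    reflectivity = np.exp(-t_days / TAU_DAYS)       # relative to the initial cloud reflectivity
    zonal_stretch_km = SHEAR * t_s * BAND_WIDTH_KM  # differential zonal drift across the band
    print(f"day {t_days:3d}: relative reflectivity {reflectivity:.3f}, "
          f"shear-induced zonal spread ~{zonal_stretch_km:,.0f} km")
```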
10.
Numerical models constitute the most advanced physically based methods for modeling complex ground water systems. Spatial and/or temporal variability of aquifer parameters, boundary conditions, and initial conditions (for transient simulations) can be assigned across the numerical model domain. While this constitutes a powerful modeling advantage, it also presents the formidable challenge of overcoming parameter uncertainty, which, to date, has not been satisfactorily resolved and inevitably produces model prediction errors. In previous research, artificial neural networks (ANNs), developed with more accessible field data, have achieved excellent predictive accuracy over discrete stress periods at site-specific field locations in complex ground water systems. In an effort to combine the relative advantages of numerical models and ANNs, a new modeling paradigm is presented. The ANN models generate accurate predictions for a limited number of field locations. Appending them to a numerical model produces an overdetermined system of equations, which can be solved using a variety of mathematical techniques, potentially yielding more accurate numerical predictions. Mathematical theory and a simple two-dimensional example are presented to overview the relevant mathematical and modeling issues. Two of the three methods for solving the overdetermined system achieved an overall improvement in numerical model accuracy for various levels of synthetic ANN errors using relatively few constrained head values (i.e., cells), which, while promising, requires further research. This hybrid approach is not limited to ANN technology; it can be used with other approaches for improving numerical model predictions, such as regression or support vector machines (SVMs).
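The overdetermined-system idea above can be illustrated with a small linear-algebra example: a made-up 1-D finite-difference groundwater model is augmented with a few hypothetical ANN-derived head constraints and solved by least squares. This is a generic sketch, not the paper's code; least squares is only one possible solution technique, and all numbers (boundary heads, ANN values, weighting) are invented for the example.

```python
import numpy as np

# Made-up 1-D steady-state finite-difference groundwater model A h = b on 6 interior cells,
# with fixed heads of 10 and 2 at the two boundaries (illustrative numbers only).
n = 6
A = np.diag(np.full(n, -2.0)) + np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
b = np.zeros(n)
b[0] -= 10.0   # left boundary head moved to the right-hand side
b[-1] -= 2.0   # right boundary head moved to the right-hand side

h_numerical = np.linalg.solve(A, b)  # plain numerical-model solution

# Suppose an ANN provides trusted head predictions at two monitored cells (hypothetical values).
ann_cells = [1, 4]
ann_heads = np.array([8.9, 4.6])
weight = 5.0                         # relative confidence given to the ANN rows

# Append one constraint row per ANN prediction -> overdetermined system, solved by least squares.
rows = np.zeros((len(ann_cells), n))
rows[np.arange(len(ann_cells)), ann_cells] = 1.0
A_aug = np.vstack([A, weight * rows])
b_aug = np.concatenate([b, weight * ann_heads])
h_hybrid, *_ = np.linalg.lstsq(A_aug, b_aug, rcond=None)

print("numerical model heads:        ", np.round(h_numerical, 2))
print("hybrid (ANN-constrained) heads:", np.round(h_hybrid, 2))
```

The weighting of the appended rows controls how strongly the ANN predictions pull the solution away from the unconstrained model, which mirrors the trade-off the paper examines.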