Full text (subscription): 11
Free: 0
Subject: Geology: 11
By year: 2003 (1), 2001 (1), 2000 (2), 1994 (1), 1992 (2), 1991 (1), 1988 (1), 1985 (2)
11 results found (search time 390 ms).
1.
Compositional Geometry and Mass Conservation   Total citations: 1 (self-citations: 0, citations by others: 1)
A geometrical structure is imposed on compositional data by physical and chemical laws, principally mass conservation. Therefore, statistical or mathematical investigation of possible relations between data values and such laws must be consistent with this structure. This demands that geometrical concepts, such as points that specify both mass and composition in linear space, and lines in projective space that specify composition only, be clearly defined and consistent with mass conservation. Mass thus becomes the norm in composition space in place of the Euclidean norm of ordinary space. Coordinate transformations inconsistent with this geometry are accordingly unnatural and misleading. They are also unnecessary because correlation arising from the constant mass presents no unusual difficulty in the analysis of the underlying quadratic form.
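A minimal sketch (not from the paper; the function names and data are illustrative) of the geometric distinction the abstract draws: a sample point carries both mass and composition, the composition is the scale-invariant (projective) part, and the total mass plays the role of the norm in place of the Euclidean norm.

```python
import numpy as np

# Illustrative sketch: a sample is a vector of component masses. Its "mass
# norm" is the sum of the components (not the Euclidean length), and its
# composition is the projective part, invariant under rescaling of total mass.
def mass_norm(component_masses):
    """Total mass of the sample: the norm in composition space."""
    return float(np.sum(component_masses))

def composition(component_masses):
    """Scale-invariant composition: component masses over total mass."""
    m = np.asarray(component_masses, dtype=float)
    return m / mass_norm(m)

masses = np.array([30.0, 50.0, 20.0])   # masses of three components
comp = composition(masses)

# Mass conservation: the components sum exactly to the total mass.
assert np.isclose(mass_norm(masses), 100.0)
# Composition lies on a projective line: rescaling total mass leaves it fixed.
assert np.allclose(composition(2.5 * masses), comp)
```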
2.
Mathematical Geosciences -
3.
This paper describes a new method of analyzing the risk incurred when the outcome of a decision depends on interpolated values, for example, on the flow through an aquifer sparsely sampled for permeability or on the ratio of waste to ore in a mineral deposit sparsely sampled for grade. The method uses large families of interpolations constructed between sample values using adaptations of the well-known midpoint displacement method for generating pseudo-fractional Brownian motion trajectories. The parameters defining each family are chosen interactively by specialists to incorporate their expert knowledge. Each family, or ensemble, then defines a population of values for any global characteristic (functional) such as flow rate or waste ratio. The probabilities of various outcomes are estimated by counting them and calculating their ratios. For example, if 900 out of 1000 are acceptable, the chance of success is estimated to be 90%.
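The procedure above can be sketched in a few lines. This is a hypothetical minimal implementation, not the paper's code: the Hurst exponent, perturbation scale, cutoff, and the mean-grade functional are all illustrative stand-ins for the expert-chosen family parameters and global characteristic.

```python
import numpy as np

def midpoint_displacement(v0, v1, levels, hurst=0.5, sigma=1.0, rng=None):
    """One random interpolation between sample values v0 and v1, built by
    recursive midpoint displacement (pseudo-fractional Brownian motion).
    hurst and sigma stand in for the expert-chosen family parameters."""
    rng = np.random.default_rng() if rng is None else rng
    values = np.array([v0, v1], dtype=float)
    scale = sigma
    for _ in range(levels):
        mids = 0.5 * (values[:-1] + values[1:])   # deterministic midpoints
        scale *= 0.5 ** hurst                      # displacement shrinks with segment length
        mids += rng.normal(0.0, scale, size=mids.size)
        out = np.empty(values.size + mids.size)
        out[0::2] = values                         # endpoints and old points kept
        out[1::2] = mids                           # displaced midpoints interleaved
        values = out
    return values

# Risk analysis by counting: build an ensemble of interpolations between two
# sparse samples and count how often a global functional (here, mean grade
# along the trajectory) clears a hypothetical cutoff.
rng = np.random.default_rng(42)
ensemble = [midpoint_displacement(2.0, 4.0, levels=6, hurst=0.7,
                                  sigma=0.8, rng=rng) for _ in range(1000)]
cutoff = 2.5
acceptable = sum(traj.mean() > cutoff for traj in ensemble)
p_success = acceptable / len(ensemble)   # e.g. 900 acceptable out of 1000 -> 90%
```

Note that the sample values themselves are never displaced, so every member of the ensemble honors the data exactly, which is what makes the counting argument a statement about interpolation risk rather than measurement error.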
4.
5.
Conclusion In closing, I think it not inappropriate to make clear, both to the Reply's authors and to the geostatistical community at large, that my purpose in pressing these issues is not merely whimsical or frivolous harassment. My purpose is to urge the mining industry, as I have done before (1984), to insist upon reserve estimation methods that are soundly based on all applicable mathematical principles, undistorted by speculative interpretation, selection, or practice. My criticism of geostatistics is not so much of what its practitioners do, and of what I first did long ago (1959), but of the undisciplined and misleading things its pundits say about what they do.
6.
Stochastic process theory involves integrals of measurable functions over probability measure spaces. One of these is the ensemble space, Ω, whose members are sample functions on Euclidean space R^k; the other is R^k itself. What geostatisticians call the "theory of regionalized variables" is said to be based on stochastic theory. A recent paper in Mathematical Geology proclaims a distinction between "probabilistic" and "deterministic" geostatistics. The former is said to rely on "ensemble integrals" over Ω and the latter on "spatial integrals" over R^k. This study shows that the proposed distinction rests on an arbitrary choice between two estimators for the covariance of a stochastic process; neither is an ensemble integral, both are spatial integrals, and both are Kolmogorov inconsistent. The "deterministic" estimator is identical with that of classical bivariate least-squares regression, in which "spatial structure" is of no consequence. This study shows that both stochastic models are suboptimal approximations to the unique nonstationary classical statistical multivariate regression model generated by each sample pattern. The stochastic process model and its "spatial continuity measures" thus appear as questionable mathematical embellishments on suboptimal estimates; correspondence with geomorphic reality is tenuous, and estimates are biased and distorted. Various related misconceptions in the paper are also discussed.
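The "spatial integral" at issue can be made concrete. The sketch below (illustrative, not from the paper) shows the lag-h covariance estimator computed from a single realization by averaging over point pairs in R^1; at lag 0 it reduces to the ordinary sample variance of classical regression, which is the identification the abstract points to.

```python
import numpy as np

# Sketch of a spatial covariance estimator: it averages over point pairs at
# lag h WITHIN one sample function, never over an ensemble of realizations.
# Grid, lag, and data are illustrative.
def spatial_covariance(z, h):
    """Estimate C(h) from one 1-D realization z on a regular grid."""
    if h == 0:
        return np.var(z)                  # lag 0: the classical sample variance
    head, tail = z[:-h], z[h:]            # all pairs (z[i], z[i+h])
    return np.mean((head - head.mean()) * (tail - tail.mean()))

rng = np.random.default_rng(0)
z = np.cumsum(rng.normal(size=500))       # a single random-walk realization

c0 = spatial_covariance(z, 0)
c5 = spatial_covariance(z, 5)
# Only one realization is ever observed, so nothing here is an ensemble
# integral over Ω; both quantities are spatial averages over R^1.
```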
7.
8.
Fourier optics and an optical bench model are used to construct an ensemble of candidate functions representing variational patterns in an undersampled two-dimensional function g(x,y). The known sample function s(x,y) is the product of g(x,y) and a set of unit impulses on the sample point pattern p(x,y), which, from the optical point of view, is an aperture imposing strict mathematical limits on what the sample can tell about g(x,y). The laws of optics enforce much needed, and often lacking, conceptual discipline in reconstructing candidate variational patterns in g(x,y). The Fourier transform (FT) of s(x,y) is the convolution of the FTs of g(x,y) and p(x,y). If the convolution shows aliasing, or confounding of frequencies, undersampling is surely present and all reconstructions are indeterminate. Then information from outside s(x,y) is required, and it is easily expressed in frequency terms so that the principles of optical filtering and image reconstruction can be applied. In the application described and pictured, the FT of s(x,y) was filtered to eliminate unlikely or uninteresting high-frequency amplitude maxima. A menu of the 100 strongest remaining terms was taken as indicating the principal variational patterns in g(x,y). Subsets of 10 terms from the menu were chosen using stepwise regression. By so restricting the subset size, both the variance and the span of their inverse transforms were made consistent with those of the data. The amplitudes of the patterns being overdetermined, it was possible to estimate the phases also. The inverse transforms of 9 patterns so selected are regarded as ensembles of reconstructions, that is, as stochastic process models, from which estimates of the mean and other moments can be calculated. This paper was presented at Emerging Concepts, MGUS-87 Conference, Redwood City, California, 13-15 April 1987.
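The aliasing diagnostic at the heart of this abstract is easy to demonstrate in one dimension. The sketch below (illustrative frequencies and grid sizes, not the paper's data) samples a pattern below its Nyquist rate and shows the true frequency folding onto a spurious low one, exactly the confounding that makes reconstruction indeterminate without outside information.

```python
import numpy as np

# Sampling g at the impulses of p multiplies them in space, so their Fourier
# transforms convolve; undersampling folds (aliases) high frequencies onto
# low ones. All values here are illustrative.
n = 1024
x = np.arange(n)
g = np.sin(2 * np.pi * 200 * x / n)   # true pattern: 200 cycles per record

step = 4                               # keep every 4th point ...
s = g[::step]                          # ... i.e. 256 samples over the record

# The sample's Nyquist limit is 256 / 2 = 128 cycles, and 200 > 128, so the
# true frequency must alias: it folds to |256 - 200| = 56 cycles.
spectrum = np.abs(np.fft.rfft(s))
alias = int(np.argmax(spectrum))
assert alias == 56                     # 200-cycle pattern masquerades as 56
```

The same check in two dimensions (`np.fft.rfft2` on s(x,y)) is how one would decide, per the abstract, whether any reconstruction of g(x,y) can be trusted without frequency-domain constraints from outside the sample.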
9.
10.
Pseudo-fractal interpolation for risk analysis

Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号