Similar Literature
20 similar documents found.
1.
Summary  Size distributions of fragments of crushed rock in conveyor belts and of blasted rock in a muckpile obtained by sieving are compared with the size distributions obtained by digital image analysis of photographs of the same materials taken on-site. Several calculation methods are tested, based on the raw distribution of fragment areas and on the volume-transformed ones. The influence of the calibration of the system on the results and the performance of the system in a non-calibrated mode are evaluated. The capacity of some distributions (Rosin-Rammler, Swebrec and lognormal) to fit the data in the coarse region (where particles can be delineated, i.e. discriminated individually) and to extrapolate to the non-delineated fines (where particles cannot be outlined and their contour delineated) is assessed. The error between the sizes measured and the sizes of the reference distributions (determined by sieving) increases from the coarse to the fines region. The maximum error at a given size depends primarily on its value relative to the fines cut-off (FCO) of the image analysis. In general, at sizes greater than the FCO, where the system is able to delineate fragments reliably, both volume and surface-based, calibrated, calculations can determine the sizes with maximum error expectancy of about 30%. Below the FCO, only the calibrated, volume calculation maintains a maximum error of 30%, down to sizes of about one fourth the FCO, rapidly increasing for smaller sizes. Where the calibration is done based on data above the FCO, errors can be large below this point, in excess of 80% at sizes half the FCO. In the fines range (sizes smaller than 0.2 times the FCO) the maximum errors can be close to or greater than 100% for most of the calculations and function fittings. Of the distributions tested, all of them are acceptable at sizes above the FCO; below that, the Swebrec function seems to adapt better towards the fines than the Rosin-Rammler and lognormal. Correspondence: José A. Sanchidrián, Universidad Politécnica de Madrid, E.T.S.I. Minas, Rios Rosas 21, 28003 Madrid, Spain
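As a generic illustration of the curve-fitting step described above, the sketch below fits Rosin-Rammler, Swebrec and lognormal cumulative curves to a hypothetical set of sieve passing data with SciPy; the sizes, starting values and bounds are assumptions for the example, not the paper's data or calibration procedure.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Hypothetical sieve data: size (mm) and cumulative fraction passing.
size = np.array([2.0, 5.0, 10.0, 20.0, 40.0, 80.0, 160.0, 320.0])
passing = np.array([0.04, 0.09, 0.17, 0.30, 0.48, 0.70, 0.90, 0.99])

def rosin_rammler(x, xc, n):
    """Cumulative passing P(x) = 1 - exp(-(x/xc)^n)."""
    return 1.0 - np.exp(-(x / xc) ** n)

def swebrec(x, xmax, x50, b):
    """Basic three-parameter Swebrec function (valid for x <= xmax)."""
    return 1.0 / (1.0 + (np.log(xmax / x) / np.log(xmax / x50)) ** b)

def lognormal_cdf(x, mu, sigma):
    """Lognormal cumulative distribution of fragment size."""
    return norm.cdf((np.log(x) - mu) / sigma)

models = [
    ("Rosin-Rammler", rosin_rammler, (50.0, 1.0), ([1e-3, 0.1], [1e4, 10.0])),
    ("Swebrec", swebrec, (500.0, 45.0, 2.0), ([321.0, 1.0, 0.5], [1e4, 320.0, 10.0])),
    ("lognormal", lognormal_cdf, (np.log(45.0), 1.5), ([-5.0, 0.05], [10.0, 5.0])),
]
for name, f, p0, bounds in models:
    popt, _ = curve_fit(f, size, passing, p0=p0, bounds=bounds)
    rmse = np.sqrt(np.mean((f(size, *popt) - passing) ** 2))
    print(f"{name:>14}: parameters {np.round(popt, 3)}, RMSE {rmse:.4f}")
```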

2.
Owing to the cost of geotechnical site investigation and the limitations of test sites, the available test data are usually limited, and it is difficult to estimate the statistical characteristics of geotechnical parameters and the slope reliability accurately from such limited data. Bayesian methods can incorporate the limited site information to reduce the estimated uncertainty of the geotechnical parameters and thereby improve the assessed slope reliability. However, most existing Bayesian updating studies assume normal, lognormal or uniform prior distributions for the parameters and a multivariate normal likelihood function, and the reasonableness of this practice remains to be verified. This paper summarizes the parameter prior distributions and likelihood-function models commonly used in geotechnical Bayesian analysis and, taking an undrained clay slope as an example, applies an adaptive Bayesian updating method to systematically investigate how the prior distribution and the likelihood function affect the inferred posterior distributions of spatially variable slope parameters and the updated reliability. The results show that the prior distribution has some influence on both the posterior inference and the reliability updating; posteriors inferred with lognormal and extreme-value type I priors show relatively little dispersion. Slope reliability computed with a Beta prior is conservative and that computed with an extreme-value type I prior is unconservative, while a lognormal prior gives intermediate results. By comparison, the influence of the likelihood function is more pronounced: a likelihood constructed from a multivariate joint normal distribution reduces the estimated parameter uncertainty while producing results that agree better with the site information than other likelihood types. In addition, the autocorrelation between measurement errors at different locations assumed when constructing the likelihood function also affects the posterior failure probability of the slope.
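A compact illustration of this kind of Bayesian updating (not the adaptive algorithm or the spatially variable slope model of the paper) is sketched below: a lognormal prior on undrained shear strength is updated with a multivariate normal likelihood whose measurement errors are autocorrelated, using a plain Metropolis sampler. All numbers are hypothetical.

```python
import numpy as np
from scipy.stats import lognorm, multivariate_normal

rng = np.random.default_rng(0)

# Hypothetical prior on undrained shear strength su (kPa): lognormal with
# mean 40 kPa and coefficient of variation 0.3.
mean_su, cov_su = 40.0, 0.3
sigma_ln = np.sqrt(np.log(1.0 + cov_su**2))
mu_ln = np.log(mean_su) - 0.5 * sigma_ln**2
prior = lognorm(s=sigma_ln, scale=np.exp(mu_ln))

# Hypothetical site measurements of su (kPa) with correlated measurement errors.
obs = np.array([33.0, 36.0, 31.0])
err_sd = 4.0
rho = 0.5  # autocorrelation between measurement errors at different locations
cov = err_sd**2 * rho ** np.abs(np.subtract.outer(np.arange(3), np.arange(3)))

def log_posterior(su):
    if su <= 0.0:
        return -np.inf
    like = multivariate_normal.logpdf(obs, mean=np.full(3, su), cov=cov)
    return prior.logpdf(su) + like

# Plain Metropolis random-walk sampler.
samples, su_cur = [], mean_su
lp_cur = log_posterior(su_cur)
for _ in range(20000):
    su_prop = su_cur + rng.normal(scale=2.0)
    lp_prop = log_posterior(su_prop)
    if np.log(rng.uniform()) < lp_prop - lp_cur:
        su_cur, lp_cur = su_prop, lp_prop
    samples.append(su_cur)

post = np.array(samples[5000:])
print(f"prior mean {mean_su:.1f} kPa -> posterior mean {post.mean():.1f} kPa "
      f"(posterior sd {post.std():.1f} kPa)")
```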

3.
The physical mechanism behind the fragment size distribution of rock broken under explosive loading is analysed. The analysis shows that a lognormal distribution of fragment sizes is associated with multiple fracturing of the material. In a contained explosion this distribution describes the fragmentation near the charge centre, where the material is under a hydrostatic stress state, the strain rate is very high and failure occurs by multiple fracturing. The Rosin-Rammler distribution, in contrast, mainly describes the fragment sizes farther from the charge centre, where breakage is dominated by radial cracks driven by circumferential tension and single fracturing prevails.

4.
Characterization of Geochemical Distributions Using Multifractal Models
The use of multifractals in the applied sciences has proven useful in the characterization and modeling of complex phenomena. Multifractal theory has also been recently applied to the study and characterization of geochemical distributions, and its relation to spatial statistics clearly stated. The present paper proposes a two-dimensional multifractal model based on a trinomial multiplicative cascade as a proxy to some geochemical distribution. The equations for the generalized dimensions, mass exponent, coarse Lipschitz–Hölder exponent, and multifractal spectrum are derived. This model was tested with an example data set used for geochemical exploration of gold deposits in Northwest Portugal. The element used was arsenic because a large number of sample assays were below detection limit for gold. Arsenic, however, has a positive correlation with gold, and the two generations of arsenopyrite identified in the gold quartz veins are consistent with different mineralizing events, which gave rise to different gold grades. Performing the multifractal analysis has shown problems arising in the subdivision of the area with boxes of constant side length and in the uncertainty the edge effects produce in the experimental estimation of the mass exponent. However, it was possible to closely fit a multifractal spectrum to the data with enrichment factors in the range 2.4–2.6 and constant K1 = 1.3. Such parameters may give some information on the magnitude of the concentration efficiency and heterogeneity of the distribution of arsenic in the mineralized structures. In a simple test with estimated points using ordinary lognormal kriging, the fitted multifractal model showed the magnitude of smoothing in estimated data. Therefore, it is concluded that multifractal models may be useful in the stochastic simulation of geochemical distributions.
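A minimal sketch of the multifractal machinery mentioned above, reduced to one dimension: a deterministic multiplicative cascade with an assumed three-weight generator, the partition function, the mass exponents tau(q) and the generalized dimensions D_q. The paper's two-dimensional trinomial model, box-subdivision issues and edge-effect corrections are not reproduced.

```python
import numpy as np

# Deterministic 1-D trinomial multiplicative cascade with an assumed generator.
weights = np.array([0.5, 0.3, 0.2])   # cascade weights, sum to 1
levels = 9                            # 3**9 = 19683 cells at the finest scale
mass = np.array([1.0])
for _ in range(levels):
    mass = (mass[:, None] * weights[None, :]).ravel()

# Partition function chi(q, eps) = sum_i mu_i(eps)**q over boxes of size eps = 3**-k;
# the mass exponent tau(q) is the slope of log chi versus log eps.
qs = np.arange(-3.0, 3.5, 0.5)
box_levels = range(2, levels)
tau = []
for q in qs:
    log_chi, log_eps = [], []
    for k in box_levels:
        boxes = mass.reshape(3**k, -1).sum(axis=1)   # coarse-grain to 3**k boxes
        log_chi.append(np.log(np.sum(boxes**q)))
        log_eps.append(-k * np.log(3.0))
    tau.append(np.polyfit(log_eps, log_chi, 1)[0])
tau = np.array(tau)

# Generalized dimensions D_q = tau(q) / (q - 1); undefined at q = 1 (take the limit).
Dq = np.full_like(qs, np.nan)
ok = ~np.isclose(qs, 1.0)
Dq[ok] = tau[ok] / (qs[ok] - 1.0)
for q, d in zip(qs, Dq):
    label = "limit q->1" if np.isnan(d) else f"{d:.3f}"
    print(f"q = {q:+.1f}   D_q = {label}")
```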

5.
It is assumed that a rock on the lunar surface loses mass as a result of bombardment by hypervelocity meteoroids. The mass of rock and its fragments can be modeled as a nonincreasing stochastic process with independent increments. In the case of a self-similar, one-shot splitting law, Filippov's extension of Kolmogorov's results produces asymptotic mass densities (number densities) which can be of lognormal, fractional exponential (Rosin-Rammler), or inverse-power (Pareto) types. The results are extended in three directions. A new explicit formula for the number density is obtained in the case where the splitting law is a two-term polynomial. The effect of splitting laws and splitting rates which depend on randomly varying parameters, e.g., meteoroid mass and velocity, is considered. The average number density with respect to a distribution of initial rock masses and initial rock birthdays also is studied. The asymptotic average density for an inverse-power distribution of initial masses has the same shape as the unaveraged density, but a beta (β, 1) distribution of rock birthdays strongly alters the shape of the asymptotic number density. Research supported by Office of Naval Research, under contract NONR 4010(09) awarded to Department of Statistics, The Johns Hopkins University.

6.
Turbidite bed thickness distributions are often interpreted in terms of power laws, even when there are significant departures from a single straight line on a log–log exceedence probability plot. Alternatively, these distributions have been described by a lognormal mixture model. Statistical methods used to analyse and distinguish the two models (power law and lognormal mixture) are presented here. In addition, the shortcomings of some frequently applied techniques are discussed, using a new data set from the Tarcău Sandstone of the East Carpathians, Romania, and published data from the Marnoso‐Arenacea Formation of Italy. Log–log exceedence plots and least squares fitting by themselves are inappropriate tools for the analysis of bed thickness distributions; they must be accompanied by the assessment of other types of diagrams (cumulative probability, histogram of log‐transformed values, q–q plots) and the use of a measure of goodness‐of‐fit other than R2, such as the chi‐square or the Kolmogorov–Smirnov statistics. When interpreting data that do not follow a single straight line on a log–log exceedence plot, it is important to take into account that ‘segmented’ power laws are not simple mixtures of power law populations with arbitrary parameters. Although a simple model of flow confinement does result in segmented plots at the centre of a basin, the segmented shape of the exceedence curve breaks down as the sampling location moves away from the basin centre. The lognormal mixture model is a sedimentologically intuitive alternative to the power law distribution. The expectation–maximization algorithm can be used to estimate the parameters and thus to model lognormal bed thickness mixtures. Taking into account these observations, the bed thickness data from the Tarcău Sandstone are best described by a lognormal mixture model with two components. Compared with the Marnoso‐Arenacea Formation, in which bed thicknesses of thin beds have a larger variability than thicknesses of the thicker beds, the thinner‐bedded population of the Tarcău Sandstone has a lower variability than the thicker‐bedded population. Such differences might reflect contrasting depositional settings, such as the difference between channel levées and basin plains.
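The lognormal mixture fit by expectation-maximization can be reproduced generically by fitting a Gaussian mixture to log-transformed bed thicknesses, with goodness of fit checked by a Kolmogorov-Smirnov statistic rather than R2, as the abstract recommends. The thickness values below are synthetic, not the Tarcău or Marnoso-Arenacea data.

```python
import numpy as np
from scipy.stats import kstest, norm
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(2)

# Synthetic bed thicknesses (cm): a thin-bedded and a thick-bedded lognormal population.
thin = rng.lognormal(mean=np.log(5.0), sigma=0.4, size=400)
thick = rng.lognormal(mean=np.log(60.0), sigma=0.8, size=200)
thickness = np.concatenate([thin, thick])

# EM fit of a two-component Gaussian mixture to log thickness
# (equivalent to a lognormal mixture for the thicknesses themselves).
log_t = np.log(thickness).reshape(-1, 1)
gm = GaussianMixture(n_components=2, random_state=0).fit(log_t)
for w, m, v in zip(gm.weights_, gm.means_.ravel(), gm.covariances_.ravel()):
    print(f"component: weight {w:.2f}, median {np.exp(m):.1f} cm, sigma_log {np.sqrt(v):.2f}")

# Goodness of fit via the Kolmogorov-Smirnov statistic against the mixture CDF,
# instead of R^2 on a log-log exceedence plot.
def mixture_cdf(x):
    lx = np.log(x)
    return sum(w * norm.cdf(lx, m, np.sqrt(v))
               for w, m, v in zip(gm.weights_, gm.means_.ravel(), gm.covariances_.ravel()))

ks = kstest(thickness, mixture_cdf)
print(f"KS statistic {ks.statistic:.3f}, p-value {ks.pvalue:.3f}")
```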

7.
On unbiased backtransform of lognormal kriging estimates
Lognormal kriging is an estimation technique that was devised for handling highly skewed data distributions. This technique takes advantage of a logarithmic transformation that reduces the data variance. However, backtransformed lognormal kriging estimates are biased because the nonbias term is totally dependent on a semivariogram model. This paper proposes a new approach for backtransforming lognormal kriging estimates that not only presents none of the problems reported in the literature but also reproduces the sample histogram and, consequently, the sample mean.
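The bias discussed above arises from the nonlinearity of the exponential: exponentiating a kriged log estimate underestimates the mean, and the classical correction adds half of a model-dependent variance term in the exponent, which is why the result hinges on the semivariogram model. A minimal numeric illustration of that effect on synthetic lognormal values (this is not the backtransform proposed in the paper):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical lognormal attribute (e.g., a grade) with log-mean 1.0 and log-sd 0.8.
mu, sigma = 1.0, 0.8
z = rng.lognormal(mean=mu, sigma=sigma, size=100000)

y = np.log(z)
naive = np.exp(y.mean())                      # biased backtransform: exp of the log estimate
corrected = np.exp(y.mean() + y.var() / 2.0)  # lognormal correction exp(m + s^2/2)

print(f"true mean        {z.mean():.3f}")
print(f"naive exp(mean)  {naive:.3f}  (biased low)")
print(f"corrected        {corrected:.3f}")
```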

8.
Maximum likelihood estimation of joint size from trace length measurements
Summary Usually, rock joints are observed in outcrops and excavation walls only as traces. Under some assumptions about the shapes of the joints and the nature of their size distributions, the underlying joint size distribution can be estimated from trace length measurements. However, the interpretation of trace length distributions from line mapping data should be approached with caution. The data are always length-biased and furthermore, the semi-trace length, the trace length, and the underlying joint size may have different distributional forms. Semi-trace length distributions are monotonic decreasing functions not sensitive to changes in the real trace length distributions. Experimental full trace length distributions are shown to have lognormal distributions and to be insensitive to major changes in the underlying joint size distributions. Under the assumptions of joint convexity and circularity a parametric model for the three-dimensional distribution of joint sizes is developed. A maximum likelihood estimation of the distribution of joint diameters, which best reflects the observed joint trace data, and corrects simultaneously for joint censoring, truncation and size bias, is developed. The theory is illustrated with numerical examples using data collected from five field sites.
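A heavily simplified sketch of maximum likelihood estimation from trace lengths is given below: right-censored traces (cut off by an assumed mapping window) enter the likelihood through the survival function of a lognormal model. The paper's simultaneous corrections for size bias, truncation and the trace-length-to-diameter conversion are not attempted here.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import lognorm

rng = np.random.default_rng(4)

# Hypothetical trace lengths (m); traces longer than the mapping window are censored.
true_mu, true_sigma, window = np.log(2.0), 0.6, 4.0
full = rng.lognormal(mean=true_mu, sigma=true_sigma, size=300)
censored = full > window
lengths = np.minimum(full, window)

def neg_log_lik(params):
    mu, log_sigma = params
    sigma = np.exp(log_sigma)                       # keep sigma positive
    dist = lognorm(s=sigma, scale=np.exp(mu))
    ll = dist.logpdf(lengths[~censored]).sum()      # exact observations
    ll += dist.logsf(lengths[censored]).sum()       # right-censored observations
    return -ll

res = minimize(neg_log_lik, x0=[0.0, 0.0], method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(f"estimated median {np.exp(mu_hat):.2f} m, log-sd {sigma_hat:.2f} "
      f"(true {np.exp(true_mu):.2f} m, {true_sigma:.2f})")
```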

9.
TORE PÅSSE, Sedimentology, 1997, 44(6): 1011-1014
The grain size distribution within a unimodal sediment can be described as a lognormal distribution when the distribution is formed by only one process. However, most sediments are formed by more than one process giving polymodal sediments. Polymodal sediments have to be described as the sum of several normal distributions, one for each process involved within the formation. Grain size distributions are usually interpreted with the help of graphical methods. Interpretations of polymodal sediments require mathematical methods. In mathematical terms a unimodal sediment can be described as a hyperbolic tangent function (tanh) and a polymodal sediment can generally be described by the sum of two or three tanh-functions. The tanh-method is a tool for identifying and estimating the number of modes within a grain size distribution and helps interpret the processes involved within the formation of a deposit. The mathematical method can also be used to computerize sediment data, allowing storage with just a few numbers. Different samples can easily be compared and classified. Also, this method could be a valuable tool for calculations of various sediment parameters both in geotechnology and hydrogeology.
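A minimal sketch of the tanh-method on a synthetic bimodal cumulative curve: each mode is one weighted hyperbolic tangent term, and the weights, centres and spreads are recovered by least squares. The phi values and mode parameters are assumptions for the example.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical cumulative grain-size curve on the phi scale (fraction finer).
phi = np.linspace(-1.0, 8.0, 40)

def tanh_mode(phi, a, b, c):
    """One mode: weight a, centre b (phi units), spread c."""
    return a * 0.5 * (1.0 + np.tanh((phi - b) / c))

# Two-mode synthetic sample (e.g., a sandy and a silty population) plus a little noise.
observed = tanh_mode(phi, 0.6, 1.5, 0.8) + tanh_mode(phi, 0.4, 5.0, 1.2)
observed += np.random.default_rng(5).normal(scale=0.005, size=phi.size)

def two_tanh(phi, a1, b1, c1, b2, c2):
    """Bimodal model; the second weight is 1 - a1 so the curve ends at 1."""
    return tanh_mode(phi, a1, b1, c1) + tanh_mode(phi, 1.0 - a1, b2, c2)

p0 = (0.5, 1.0, 1.0, 5.5, 1.0)
popt, _ = curve_fit(two_tanh, phi, observed, p0=p0)
a1, b1, c1, b2, c2 = popt
print(f"mode 1: weight {a1:.2f}, centre {b1:.2f} phi, spread {c1:.2f}")
print(f"mode 2: weight {1 - a1:.2f}, centre {b2:.2f} phi, spread {c2:.2f}")
```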

10.
An effective method is developed for the fast simultaneous reconstruction of the horizontal and vertical resistivities and the bed-boundary depths of horizontally layered, transversely isotropic media from multicomponent induction logging data. First, a semi-analytical solution of the dyadic Green's function of the electromagnetic field is obtained from the electromagnetic perturbation equation, the relation between the conductivity function and the model parameters, and a mode-matching algorithm, which yields a fast algorithm for the Fréchet derivative matrix of the multicomponent induction logging response. On this basis, using normalization and singular value decomposition, an iterative procedure is constructed that simultaneously inverts for the horizontal and vertical resistivities of each layer and the bed-boundary depths, achieving an optimal fit between synthetic and input data. Numerical experiments show that the inversion algorithm produces satisfactory results.
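The SVD-based iteration can be illustrated generically with a truncated-SVD update of the linearized system at each step; in the toy sketch below a random linear operator stands in for the logging response, so its Fréchet derivative matrix is the operator itself. This is only a schematic of the regularized update, not the mode-matching forward model of the paper.

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy problem: a linear forward model G stands in for the logging response,
# so its Jacobian (Frechet derivative matrix) is G itself.
n_data, n_model = 30, 8
G = rng.normal(size=(n_data, n_model))
m_true = rng.normal(size=n_model)
d_obs = G @ m_true + rng.normal(scale=0.01, size=n_data)

def tsvd_step(J, residual, rel_cutoff=1e-3):
    """Model update from a truncated SVD of the Jacobian."""
    U, s, Vt = np.linalg.svd(J, full_matrices=False)
    keep = s > rel_cutoff * s[0]                 # drop small singular values
    return Vt[keep].T @ (U[:, keep].T @ residual / s[keep])

m = np.zeros(n_model)
for it in range(5):
    residual = d_obs - G @ m                      # data misfit
    m = m + tsvd_step(G, residual)                # Gauss-Newton / TSVD update
    print(f"iteration {it + 1}: rms misfit {np.sqrt(np.mean((d_obs - G @ m) ** 2)):.2e}")
```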

11.
To overcome the drawback that computing the standardized precipitation index (SPI) currently requires assuming a particular distribution in advance, the SPI is computed here from the maximum-entropy distribution. Taking the Dongjiang River basin as an example, four probability density functions (the maximum-entropy distribution, the Gamma distribution, the Weibull distribution and the lognormal distribution) are fitted to multi-year rainfall data at different time scales, and goodness of fit is checked with the AIC, Kolmogorov-Smirnov (K-S) and Anderson-Darling (A-D) tests; SPI values computed with the maximum-entropy distribution are then compared with those computed with the Gamma distribution. The results show that, relative to the other three distributions, the maximum-entropy probability density function better fits the 3-, 6- and 12-month rainfall at the 15 stations of the Dongjiang basin; under extreme drought (flood) conditions the SPI from the maximum-entropy distribution is smaller (larger) than that from the Gamma distribution, indicating greater sensitivity in identifying extreme drought (flood).
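For reference, the Gamma-based SPI against which the maximum-entropy version is compared is normally obtained by fitting a Gamma distribution to the accumulated precipitation and mapping the fitted cumulative probabilities to standard normal quantiles. The sketch below uses synthetic monthly rainfall and ignores the usual special handling of zero-precipitation totals.

```python
import numpy as np
from scipy.stats import gamma, norm

rng = np.random.default_rng(7)

# Synthetic monthly precipitation (mm) for 40 years.
monthly = rng.gamma(shape=2.0, scale=60.0, size=40 * 12)

# 3-month accumulation (moving sum), the basis of SPI-3.
k = 3
accum = np.convolve(monthly, np.ones(k), mode="valid")

# Fit a two-parameter Gamma distribution to the accumulations (location fixed at 0),
# then map cumulative probabilities to standard normal quantiles.
a, loc, scale = gamma.fit(accum, floc=0.0)
spi = norm.ppf(gamma.cdf(accum, a, loc=loc, scale=scale))

print(f"gamma shape {a:.2f}, scale {scale:.1f} mm")
print(f"SPI-3 range: {spi.min():.2f} to {spi.max():.2f}; "
      f"fraction below -1.5 (severe drought): {(spi < -1.5).mean():.3f}")
```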

12.
For national or global resource estimation of frequencies of metals, a lognormal distribution has commonly been recommended but not adequately tested. Tests of frequencies of Cu, Zn, Pb, Ag, and Au contents of 1,984 well-explored mineral deposits display a poor fit to the lognormal distribution. When the same metals plus Mo, Co, Nb2O3, and REE2O3 are grouped into 19 geologically defined deposit types, only eight of the 73 tests fail to be fit by lognormal distribution, and most of those failures are in two deposit types suggesting a problem with those types. Estimates of the mean and standard deviation of each of the metals in each of the deposit types are provided for modeling.

13.
For national or global resource estimation of frequencies of metals a lognormal distribution has sometimes been assumed but never adequately tested. Tests of frequencies of Cu, Zn, Pb, Ag, Au, Mo, Re, Ni, Co, Nb2O3, REE2O3, Cr2O3, Pt, Pd, Ir, Rh, and Ru contents in over 3000 well-explored mineral deposits display a poor fit to the lognormal distribution. Neither a lognormal distribution nor a power law is an adequate model of the metal contents across all deposits. When these metals are grouped into 28 geologically defined deposit types, only nine of the over 100 tests fail to be fit by the lognormal distribution, and most of those failures are in two deposit types suggesting problems with those types. Significant deviations from lognormal distributions of most metals when ignoring deposit types demonstrate that there is not a global lognormal or power law equation for these metals. Mean and standard deviation estimates of each metal within deposit types provide a basis for modeling undiscovered resources. When tracts of land permissive for specific deposit types are delineated, deposit density estimates and contained metal statistics can be used in Monte Carlo simulations to estimate total amounts of undiscovered metals with associated explicit uncertainties as demonstrated for undiscovered porphyry copper deposits in the Tibetan Plateau of China.
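The testing strategy summarized above can be mimicked generically: log-transform the grades within each deposit type and apply a normality test, then compare with the same test applied to the pooled data. The grades and deposit types below are synthetic, and Shapiro-Wilk is used here simply as a convenient normality test, not necessarily the test used by the authors.

```python
import numpy as np
from scipy.stats import shapiro

rng = np.random.default_rng(8)

# Synthetic Cu grades (%) for two hypothetical deposit types.
deposits = {
    "porphyry Cu": rng.lognormal(mean=np.log(0.45), sigma=0.35, size=120),
    "sediment-hosted Cu": rng.lognormal(mean=np.log(2.0), sigma=0.5, size=80),
}

# Pooled across types the log grades form a mixture and tend to fail the test;
# within a single type they are consistent with a lognormal model.
pooled = np.concatenate(list(deposits.values()))
stat, p = shapiro(np.log(pooled))
print(f"all deposits pooled: Shapiro-Wilk p on log grade = {p:.4f}")
for name, grades in deposits.items():
    stat, p = shapiro(np.log(grades))
    print(f"{name:>20}: n = {grades.size}, Shapiro-Wilk p on log grade = {p:.3f}")
```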

14.
Individual probability-density distributions for the masses of compact objects in 20 X-ray binary systems have been constructed. The mass distributions were modeled using Monte-Carlo simulations. The closeness of the components in systems with massive optical stars was taken into account using K corrections. The parameters of the resulting black-hole mass distributions were obtained using nonparametric statistical methods. The presence of a statistically significant mass gap in the range 3–5 M⊙ is confirmed. The currently observed probability-density distributions of the compact-object masses are stable against small amounts of data contamination.

15.
Based on the regional hydrogeology and the stratigraphy beneath the Los Alamos National Laboratory (LANL) site, New Mexico (USA), a site-scale groundwater model has been built with more than 20 stratified hydrofacies. A stepwise inverse method was developed to estimate permeabilities for these hydrofacies by coupling observation data from different sources and at various spatial scales including single-well test, multiple-well pumping test and regional aquifer monitoring data. Statistical analyses of outcrop permeability measurements and single-well test results were used to define the prior distributions of the parameters. These distributions were used to define the parameter initial values and the lower and upper bounds for inverse modeling. A number of inverse modeling steps were performed including the use of drawdown data from the pump tests at two wells (PM-2 and PM-4) separately, and a joint inversion coupling PM-2 and PM-4 pump test data and head data from regional aquifer monitoring. Parameter sensitivity coefficients for different data sets were computed to analyze if the model parameters can be estimated accurately with the data provided at different steps. The joint inversion offers a reasonable fit to all data sets. The uncertainty of estimated parameters for the hydrofacies is addressed with the parameter confidence intervals.

16.
Undiscovered oil and gas assessments are commonly reported as aggregate estimates of hydrocarbon volumes. Potential commercial value and discovery costs are, however, determined by accumulation size, so engineers, economists, decision makers, and sometimes policy analysts are most interested in projected discovery sizes. The lognormal and Pareto distributions have been used to model exploration target sizes. This note contrasts the outcomes of applying these alternative distributions to the play level assessments of the U.S. Geological Survey's 1995 National Oil and Gas Assessment. Using the same numbers of undiscovered accumulations and the same minimum, median, and maximum size estimates, substitution of the shifted truncated lognormal distribution for the shifted truncated Pareto distribution reduced assessed undiscovered oil by 16% and gas by 15%. Nearly all of the volume differences resulted because the lognormal had fewer larger fields relative to the Pareto. The lognormal also resulted in a smaller number of small fields relative to the Pareto. For the Permian Basin case study presented here, reserve addition costs were 20% higher with the lognormal size assumption.
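The sensitivity to the size-distribution assumption can be illustrated by drawing the same number of undiscovered accumulations from a truncated lognormal and from a truncated Pareto calibrated to the same minimum and median sizes and comparing aggregate volumes. All values below (field count, size limits, lognormal spread) are hypothetical, not the 1995 National Assessment inputs.

```python
import numpy as np
from scipy.stats import lognorm, pareto

rng = np.random.default_rng(9)

# Hypothetical play: 50 undiscovered accumulations between 1 and 500 MMBO,
# with a median size of 8 MMBO.
n, xmin, xmed, xmax = 50, 1.0, 8.0, 500.0

# Lognormal with its median at xmed (assumed log-sd 1.2); Pareto with scale xmin and
# shape chosen so that its median is also xmed (median = xmin * 2**(1/b)).
logn = lognorm(s=1.2, scale=xmed)
b = np.log(2.0) / np.log(xmed / xmin)
par = pareto(b, scale=xmin)

def truncated_sample(dist, size):
    """Inverse-CDF sampling restricted to [xmin, xmax]."""
    lo, hi = dist.cdf(xmin), dist.cdf(xmax)
    return dist.ppf(rng.uniform(lo, hi, size))

sizes_ln = truncated_sample(logn, n)
sizes_pa = truncated_sample(par, n)
print(f"lognormal: total {sizes_ln.sum():.0f} MMBO, largest {sizes_ln.max():.0f} MMBO")
print(f"Pareto:    total {sizes_pa.sum():.0f} MMBO, largest {sizes_pa.max():.0f} MMBO")
```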

17.
All variables of several large data sets from regional geochemical and environmental surveys were tested for a normal or lognormal data distribution. As a general rule, almost all variables (up to more than 50 analysed chemical elements per data set) show neither a normal nor a lognormal data distribution. Even when different transformation methods are used, more than 70 % of all variables in every single data set do not approach a normal distribution. Distributions are usually skewed, have outliers and originate from more than one process. When dealing with regional geochemical or environmental data, normal and/or lognormal distributions are an exception and not the rule. This observation has serious consequences for the further statistical treatment of geochemical and environmental data. The most widely used statistical methods are all based on the assumption that the studied data show a normal or lognormal distribution. Neglecting that geochemical and environmental data show neither a normal nor a lognormal distribution will lead to biased or faulty results when such techniques are used. Received: 21 June 1999 · Accepted: 14 August 1999

18.
The uncertainty and complexity of discontinuity trace length, orientation and spatial arrangement in rock masses make it difficult to study the overall geometry of structural planes. Exploiting the statistical similarity of the many randomly distributed discontinuities in a rock mass and describing the structural-plane parameters with statistically meaningful mathematical methods is the basis of three-dimensional network simulation of random discontinuities and provides a reliable basis for rock mass stability evaluation. In this study the slope faces of the Dongsheng quarry in the Jingyue Development Zone of Changchun were surveyed with digital close-range photogrammetry to obtain a large number of high-precision geometric parameters of the rock mass discontinuities, including trace length, orientation, spacing and aperture. A Kolmogorov-Smirnov (K-S) test program written in Matlab was used to test the probability distributions of the structural-plane parameters and to identify the distribution types they follow; SPSS was used to verify the correctness of the program, and an optimal goodness-of-fit measure was introduced to select the best-fitting probability model among several candidate distributions. The results show that trace length and spacing in the study area follow a lognormal distribution, while orientation follows a gamma distribution.

19.
Extreme value analysis provides a semiparametric method for analyzing the extreme long tails of skew distributions which may be observed when handling mining data. The estimation of important tail characteristics, such as the extreme value index, allows for a discrimination between competing distribution models. It measures the thickness of such long tailed distributions, if only a limited sample is available. This paper stresses the practical implementation of extreme value theory, which is used to discriminate a lognormal from a mixed lognormal distribution in a case study of size distributions for alluvial diamonds.
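The tail-thickness measure referred to above, the extreme value index, is often estimated semiparametrically with the Hill estimator computed from the k largest observations. The sketch below applies it to a synthetic lognormal sample and to a lognormal sample contaminated with a Pareto tail, a crude stand-in for the mixed-population question in the case study.

```python
import numpy as np

rng = np.random.default_rng(10)

def hill_estimator(x, k):
    """Hill estimate of the extreme value index from the k largest observations."""
    x = np.sort(x)[::-1]                      # descending order statistics
    return np.mean(np.log(x[:k]) - np.log(x[k]))

# Synthetic "stone size" samples: pure lognormal versus a lognormal contaminated
# with a heavy Pareto tail (a crude stand-in for a mixed population).
n = 5000
pure = rng.lognormal(mean=0.0, sigma=0.7, size=n)
mixed = np.concatenate([rng.lognormal(mean=0.0, sigma=0.7, size=int(0.95 * n)),
                        (rng.pareto(1.5, size=int(0.05 * n)) + 1.0) * 2.0])

for name, sample in [("lognormal", pure), ("lognormal + Pareto tail", mixed)]:
    est = [hill_estimator(sample, k) for k in (50, 100, 200)]
    print(f"{name:>24}: Hill index at k=50/100/200 -> "
          + ", ".join(f"{e:.2f}" for e in est))
```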

20.
The assumption of lognormal (parent) field size distributions has for a long time been applied to resource appraisal and evaluation of exploration strategy by the petroleum industry. However, frequency distributions estimated with observed data and used to justify this hypothesis are conditional. Examination of various observed field size distributions across basins and over time shows that such distributions should be regarded as the end result of an economic filtering process. Commercial discoveries depend on oil and gas prices and field development costs. Some new fields are eliminated due to location, depths, or water depths. This filtering process is called economic truncation. Economic truncation may occur when predictions of a discovery process are passed through an economic appraisal model. We demonstrate that (1) economic resource appraisals, (2) forecasts of levels of petroleum industry activity, and (3) expected benefits of developing and implementing cost reducing technology are sensitive to assumptions made about the nature of that portion of the (parent) field size distribution subject to economic truncation.
