Similar Documents
20 similar documents found.
1.
Infill-sampling design and the cost of classification errors (cited 2 times: 0 self-citations, 2 by others)
The criterion used to select infill sample locations should depend on the sampling objective. Minimizing the global estimation variance is the most widely used criterion and is suitable for many problems. However, when the objective of the sampling program is to partition an area of interest into zones of high values and zones of low values, minimizing the expected cost of classification errors is a more appropriate criterion. Unlike the global estimation variance, the cost of classification errors incorporates both the sample locations and the sample values into an objective infill-sampling design criterion.
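The classification-cost criterion can be sketched numerically. The following is a minimal illustration, not the paper's method: candidate infill sites are scored by the expected cost of misclassifying them as high or low, assuming Gaussian kriging errors; the threshold and the cost values are hypothetical.

```python
import math

def expected_misclass_cost(mean, sd, threshold, cost_fp=1.0, cost_fn=1.0):
    """Expected cost of classifying one location as 'high' or 'low'.

    The location is classed by comparing the kriged mean to the
    threshold; the error probability is the Gaussian tail mass on
    the other side of the threshold.
    """
    if sd <= 0.0:
        return 0.0
    z = (threshold - mean) / sd
    p_above = 0.5 * (1.0 - math.erf(z / math.sqrt(2.0)))  # P(true value > threshold)
    if mean >= threshold:                 # classified high; error if truly low
        return cost_fp * (1.0 - p_above)
    return cost_fn * p_above              # classified low; error if truly high

# Hypothetical candidate infill sites: (kriged mean, kriging std. dev.)
candidates = [(3.0, 0.5), (5.1, 1.2), (4.9, 0.3)]
threshold = 5.0
costs = [expected_misclass_cost(m, s, threshold) for m, s in candidates]
best = max(range(len(candidates)), key=costs.__getitem__)  # sample where cost is highest
```

Note how the criterion uses both the kriged value (distance to the threshold) and the kriging standard deviation, unlike the global estimation variance, which ignores the values.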

2.
Sampling design optimization for spatial functions (cited 4 times: 0 self-citations, 4 by others)
A new procedure is presented for minimizing the sampling requirements necessary to estimate a mappable spatial function at a specified level of accuracy. The technique is based on universal kriging, an estimation method within the theory of regionalized variables. Neither actual implementation of the sampling nor universal kriging estimation is necessary to make an optimal design. The average standard error and maximum standard error of estimation over the sampling domain are used as global indices of sampling efficiency. The procedure optimally selects those parameters controlling the magnitude of the indices, including the density and spatial pattern of the sample elements and the number of nearest sample elements used in the estimation. As an illustration, the network of observation wells used to monitor the water table in the Equus Beds of Kansas is analyzed and an improved sampling pattern suggested. This example demonstrates the practical utility of the procedure, which can be applied equally well to other spatial sampling problems, as the procedure is not limited by the nature of the spatial function.
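The key property exploited above is that kriging variances depend only on the sample configuration, not on the data values, so designs can be ranked before any sampling takes place. A minimal 1-D sketch (simple kriging with an assumed exponential covariance, not the paper's universal-kriging machinery) computes the two efficiency indices for two candidate patterns:

```python
import numpy as np

def sk_variance(x0, xs, sill=1.0, rng=10.0):
    """Simple-kriging variance at x0 for 1-D sample sites xs under an
    exponential covariance; no data values are needed."""
    cov = lambda h: sill * np.exp(-np.abs(h) / rng)
    K = cov(xs[:, None] - xs[None, :])
    k = cov(xs - x0)
    w = np.linalg.solve(K, k)
    return float(sill - w @ k)

grid = np.linspace(0.0, 100.0, 101)       # prediction domain
even = np.linspace(5.0, 95.0, 10)         # evenly dispersed sites
clumped = np.linspace(40.0, 60.0, 10)     # clustered sites

def se_indices(xs):
    """Average and maximum standard error of estimation over the domain."""
    se = np.sqrt([max(sk_variance(g, xs), 0.0) for g in grid])
    return float(se.mean()), float(se.max())

avg_even, max_even = se_indices(even)
avg_clump, max_clump = se_indices(clumped)
```

Under both indices the evenly dispersed pattern beats the clustered one, which is the kind of comparison the optimization above automates.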

3.
Sample schemes used in geostatistical surveys must be suitable for both variogram estimation and kriging. Previously, schemes have been optimized for one of these steps in isolation. Ordinary kriging generally requires the sampling locations to be evenly dispersed over the region. Variogram estimation requires a more irregular pattern of sampling locations, since comparisons must be made between measurements separated by all lags up to and beyond the range of spatial correlation. Previous studies have not considered how to combine these optimized schemes into a single survey, or how to decide what proportion of sampling effort should be devoted to variogram estimation and what proportion to kriging. An expression for the total error in a geostatistical survey, accounting for uncertainty due to both ordinary kriging and variogram uncertainty, is derived. In the same manner as the kriging variance, this expression is a function of the variogram but not of the sampled response data. If a particular variogram is assumed, the total error in a geostatistical survey may be estimated prior to sampling. We can therefore design an optimal sample scheme for the combined processes of variogram estimation and ordinary kriging by minimizing this expression. The minimization is achieved by spatial simulated annealing. The resulting sample schemes ensure that the region is fairly evenly covered but include some close pairs to analyse the spatial correlation over short distances. The form of these optimal sample schemes is sensitive to the assumed variogram. Therefore a Bayesian approach is adopted in which, rather than assuming a single variogram, we minimize the expected total error over a distribution of plausible variograms. This is computationally expensive, so a strategy is suggested to reduce the number of computations required.
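Spatial simulated annealing itself is simple to sketch. The toy below is an illustrative stand-in, not the paper's objective: it minimizes a coverage proxy (mean distance from each prediction node to the nearest sample) by perturbing one site at a time and accepting worse schemes with a temperature-controlled probability.

```python
import math
import random

def coverage_error(sites, grid):
    """Proxy objective: mean distance from each prediction node to its
    nearest sample site (stands in for the total survey error)."""
    return sum(min(math.dist(g, s) for s in sites) for g in grid) / len(grid)

def spatial_simulated_annealing(sites, grid, steps=1500, t0=0.2, seed=0):
    """Perturb one site at a time; accept uphill moves with probability
    exp(-increase / temperature) as the temperature is lowered."""
    rnd = random.Random(seed)
    cur = [list(s) for s in sites]
    f_cur = coverage_error(cur, grid)
    best, f_best = [s[:] for s in cur], f_cur
    for i in range(steps):
        t = t0 * (1.0 - i / steps) + 1e-9
        cand = [s[:] for s in cur]
        j = rnd.randrange(len(cand))
        for d in (0, 1):                      # jitter one site, clipped to the unit square
            cand[j][d] = min(1.0, max(0.0, cand[j][d] + rnd.gauss(0.0, 0.1)))
        f_cand = coverage_error(cand, grid)
        if f_cand < f_cur or rnd.random() < math.exp((f_cur - f_cand) / t):
            cur, f_cur = cand, f_cand
            if f_cur < f_best:
                best, f_best = [s[:] for s in cur], f_cur
    return best, f_best

grid = [(x / 9.0, y / 9.0) for x in range(10) for y in range(10)]
start = [[0.5, 0.5] for _ in range(5)]        # degenerate start: all sites stacked
scheme, err = spatial_simulated_annealing(start, grid)
```

In the paper the objective is the expected total error (kriging plus variogram uncertainty), which naturally trades even coverage against close pairs; only the objective function would change in this loop.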

4.
This study proposes an interactive sampling strategy for locating the hot-spot or maximum regions of an attribute of concern in a given survey area. In the proposed strategy, information analysis is performed based on ordinary kriging of the existing sample data to suggest a new batch of samples under the criterion of the highest information free energy. The information free energy (F) is a function of the information energy (U) and the information entropy (S) through F = U - TS, where T is the information temperature and is used to coordinate the contributions of U and S to F. The information energy is the value of the attribute of concern, and the information entropy is the transformed error variance of kriging, and therefore measures the evenness and density of coverage of samples over the area under survey. In early sampling batches, the information temperature is high and the information entropy dominates the information free energy, so samples are suggested to give an even and sufficiently dense coverage of the whole area under investigation. As samples accumulate, the information temperature decreases to enlarge the contribution of the information energy, and later samples are taken toward locations with high attribute values. Two examples demonstrate the efficiency and effectiveness of the proposed sampling strategy in locating hot-spot regions of various fields: (1) a heavy-metal contaminated site reproduced by modeling from 55 real field data; (2) a two-dimensional field simulated by the random phase volume (RPV) model. The results show that the proposed strategy is a robust interactive sampling procedure, able to locate hot-spot regions without compromising the overall profiling of the area under survey.
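The temperature-controlled trade-off can be sketched as follows. This is an illustrative sketch only: the particular entropy transform S = -s²/max(s²) (decreasing in the kriging variance, so that well-covered areas score as high-entropy) is an assumption, not the paper's exact formula.

```python
import numpy as np

def free_energy_scores(mu, s2, T):
    """Information free energy F = U - T*S for candidate sites.

    U: kriged attribute value (information energy).
    S: a decreasing transform of the kriging error variance, so dense
       coverage -> high entropy. The transform used here is an
       illustrative assumption.
    T: information temperature balancing exploration and exploitation.
    """
    U = np.asarray(mu, dtype=float)
    S = -np.asarray(s2, dtype=float) / max(float(np.max(s2)), 1e-12)
    return U - T * S

mu = np.array([2.0, 9.0, 4.0])   # hypothetical kriged values at candidates
s2 = np.array([4.0, 0.2, 1.0])   # hypothetical kriging variances
hot = int(np.argmax(free_energy_scores(mu, s2, T=100.0)))   # high T: coverage-driven
cold = int(np.argmax(free_energy_scores(mu, s2, T=0.1)))    # low T: value-driven
```

With a high temperature the poorly covered candidate (largest variance) wins; with a low temperature the high-valued candidate wins, mirroring the exploration-to-exploitation schedule described in the abstract.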

5.
When estimating the mean value of a variable, or the total amount of a resource, within a specified region, it is desirable to report an estimated standard error for the resulting estimate. If the sample sites are selected according to a probability sampling design, it is usually possible to construct an appropriate design-based standard error estimate. One exception is systematic sampling, for which no such standard error estimator exists. However, a slight modification of systematic sampling, termed 2-step tessellation stratified (2TS) sampling, does permit the estimation of design-based standard errors. This paper develops a design-based standard error estimator for 2TS sampling. It is shown that the Taylor series approximation to the variance of the sample mean under 2TS sampling may be expressed in terms of either a deterministic variogram or a deterministic covariance function. Variance estimation can then be approached through the estimation of a variogram or a covariance function. The resulting standard error estimators are compared to some more traditional variance estimators through a simulation study. The simulation results show that estimators based on the new approach may perform better than traditional variance estimators.
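The sampling scheme itself is easy to picture: tessellate the region, then randomize within each tile. The sketch below generates such a sample on the unit square; it illustrates the 2TS configuration only, not the paper's variance estimator, which is built on top of samples like these.

```python
import random

def tessellation_stratified_sample(n_x, n_y, seed=0):
    """Step 1: tessellate the unit square into n_x * n_y rectangular cells.
    Step 2: draw one uniform random point inside each cell.
    This randomized relaxation of systematic sampling is what makes a
    design-based standard error estimable."""
    rnd = random.Random(seed)
    return [((i + rnd.random()) / n_x, (j + rnd.random()) / n_y)
            for i in range(n_x) for j in range(n_y)]

pts = tessellation_stratified_sample(4, 4)
```

Each cell contributes exactly one point, so the sample keeps the even spread of a systematic grid while restoring the randomness a design-based estimator needs.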

6.
An Alternative Measure of the Reliability of Ordinary Kriging Estimates (cited 4 times: 0 self-citations, 4 by others)
This paper presents the interpolation variance as an alternative measure of the reliability of ordinary kriging estimates. Contrary to the traditional kriging variance, the interpolation variance is data-value dependent, variogram dependent, and a measure of local accuracy. Natural phenomena are not homogeneous; therefore, local variability as expressed through data values must be recognized for a correct assessment of uncertainty. The interpolation variance is simply the weighted average of the squared differences between the data values and the retained estimate. Ordinary kriging or simple kriging variances are the expected values of interpolation variances; therefore, these traditional homoscedastic estimation variances cannot properly measure local data dispersion. More precisely, the interpolation variance is an estimate of the local conditional variance when the ordinary kriging weights are interpreted as conditional probabilities associated with the n neighboring data. This interpretation is valid if, and only if, all ordinary kriging weights are positive or constrained to be such. Extensive tests illustrate that the interpolation variance is a useful alternative to the traditional kriging variance.
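The definition translates directly into code. A minimal sketch (with hypothetical weights and data) shows the key property: for the same weights, the interpolation variance reacts to the local spread of the data values, which the traditional kriging variance cannot do.

```python
def interpolation_variance(weights, values):
    """Interpolation variance: the weighted average of squared differences
    between the data values and the kriged estimate. Valid when all
    (ordinary-kriging) weights are non-negative and sum to one."""
    if any(w < 0.0 for w in weights):
        raise ValueError("weights must be non-negative")
    z_hat = sum(w * z for w, z in zip(weights, values))
    return z_hat, sum(w * (z - z_hat) ** 2 for w, z in zip(weights, values))

# Same hypothetical weights, two data configurations:
w = [0.4, 0.3, 0.3]
est_a, iv_a = interpolation_variance(w, [10.0, 10.2, 9.9])   # calm neighborhood
est_b, iv_b = interpolation_variance(w, [2.0, 25.0, 7.0])    # erratic neighborhood
```

Both neighborhoods would report the identical kriging variance (same configuration, same variogram), yet the second is clearly far less reliable; the interpolation variance captures exactly that.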

7.
Although several researchers have pointed out advantages and disadvantages of various soil sampling designs in the presence of spatial autocorrelation, a more detailed study is presented herein which examines the geometrical relationship of three sampling designs, namely the square, the equilateral triangle, and the regular hexagon. Each design has both advantages and disadvantages with respect to estimation of the semivariogram and its effect on the mean square error or variance of error. This research could be used to design optimal sampling strategies; it is based on the theory of regionalized variables, in which the intrinsic hypothesis is satisfied. Among the alternative designs, an equilateral triangle design gives the most reliable estimate of the semivariogram. It also gives the smallest maximum mean square error of point estimation of the concentration, compared with the other two designs for the same number of measurements, when the nugget effect is small relative to the variance. If the nugget effect is large (0.90σ² or more) and the linear sampling density exceeds 0.85r, where r is the range, the hexagonal design is best. This study computes and compares the maximum mean square error for each of these designs.
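The geometric intuition behind the small-nugget result can be sketched with the standard covering radii of the three lattices: at equal sampling density, the farthest any point can lie from its nearest sample is smallest for the triangular design. This is a geometry-only proxy for the maximum point-estimation error, not the paper's full mean-square-error computation.

```python
import math

def covering_radius(density, pattern):
    """Worst-case distance from any location to its nearest sample for an
    infinite lattice of the given density (samples per unit area)."""
    if pattern == "square":
        a = 1.0 / math.sqrt(density)                       # grid spacing
        return a / math.sqrt(2.0)                          # half-diagonal
    if pattern == "triangular":
        d = math.sqrt(2.0 / (math.sqrt(3.0) * density))    # triangle side
        return d / math.sqrt(3.0)                          # circumradius
    if pattern == "hexagonal":
        s = math.sqrt(4.0 / (3.0 * math.sqrt(3.0) * density))  # hexagon edge
        return s                                           # centre-to-vertex
    raise ValueError(pattern)

r_tri = covering_radius(1.0, "triangular")
r_sq = covering_radius(1.0, "square")
r_hex = covering_radius(1.0, "hexagonal")
```

At unit density the ordering is triangular < square < hexagonal, matching the abstract's conclusion for a small nugget; when the nugget dominates, proximity matters less and this ordering no longer decides the comparison.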

8.
A 1 km square regular grid system created on the Universal Transverse Mercator zone 54 projected coordinate system is used to work with volcanism-related data for the Sengan region. The following geologic variables were determined to be the most important for identifying volcanism: geothermal gradient, groundwater temperature, heat discharge, groundwater pH value, presence of volcanic rocks, and presence of hydrothermal alteration. Data available for each of these geologic variables were used to perform directional variogram modeling and kriging to estimate geologic variable vectors at each of the 23,949 centers of the chosen 1 km grid cells. Cluster analysis was performed on the 23,949 complete variable vectors to classify each 1 km cell center into one of five statistically homogeneous groups with respect to potential volcanism, spanning from lowest to highest possible volcanism with increasing group number. A discriminant analysis incorporating Bayes' theorem was performed to construct maps showing the probability of group membership for each of the volcanism groups. These maps compared well with the recorded locations of volcanism within the Sengan region. No volcanic data were found to exist in the group 1 region; the high-probability areas within group 1 are therefore candidates for regions of no volcanism. The entropy of classification is calculated to assess the uncertainty of the allocation of each 1 km cell center based on the calculated probabilities. The recorded volcanism data are also plotted on the entropy map to examine the uncertainty level of the estimations at the locations where volcanism exists. The volcanic data cell locations in the high-volcanism regions (groups 4 and 5) showed relatively low mapping estimation uncertainty. On the other hand, the volcanic data cell locations in the low-volcanism region (group 2) showed relatively high mapping estimation uncertainty, and those in the medium-volcanism region (group 3) showed moderate uncertainty. Areas of high uncertainty provide locations where additional site characterization resources can be spent most effectively. The new data collected can be added to the existing database to perform future regionalized mapping and reduce the uncertainty level of the existing estimations.
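The entropy of classification used above is the Shannon entropy of a cell's group-membership probabilities: zero when the allocation is certain, maximal when all groups are equally likely. A minimal sketch with hypothetical probability vectors:

```python
import math

def classification_entropy(probs):
    """Shannon entropy of one cell's group-membership probabilities:
    0 for a certain allocation, log(k) for a uniform one over k groups."""
    return -sum(p * math.log(p) for p in probs if p > 0.0)

certain = classification_entropy([0.97, 0.01, 0.01, 0.005, 0.005])  # confident cell
unsure = classification_entropy([0.2, 0.2, 0.2, 0.2, 0.2])          # ambiguous cell
```

Cells like the second one are where additional site characterization effort pays off most.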

9.
Site investigations that aim to characterize a soil profile sufficiently for foundation design typically consist of a combination of in situ and laboratory tests. The number of tests and/or soil samples is generally determined by the budget and time constraints placed upon the investigation. It is therefore necessary to plan the locations of such tests to provide the most suitable information for use in design; this is the sampling strategy. However, the spatial variability of soil properties increases the complexity of this exercise. Results presented in this paper identify the errors associated with using soil properties from a single sample location for a pad foundation designed for settlement. Sample locations are distributed around the site to identify the most appropriate sample location and the relative benefit of taking soil samples closer to the center of the proposed footing. The variability of the underlying soil profile is also shown to have a significant effect on the errors due to sampling location; such effects are expressed in terms of the statistical properties of the soil profile. The performance of several common settlement relationships used to design a foundation from the results of a single sample location is also examined.

10.
In geostatistics, factorial kriging is often proposed to filter noise. This filter is built from a linear model that is ideally suited to a Gaussian signal with additive independent noise. The robustness of the performance of factorial kriging is evaluated in less congenial situations. Three different types of noise are considered, all perturbing a lognormally distributed signal. The first noise model is independent of the signal. The second noise model is heteroscedastic; its variance depends on the signal, yet noise and signal are uncorrelated. The third noise model is both heteroscedastic and linearly correlated with the signal. In ideal conditions, with exhaustive sampling and additive independent noise, factorial kriging succeeds in reproducing the spatial patterns of high signal values. This score remains good in the presence of heteroscedastic noise variance, but falls quickly in the presence of noise-to-signal correlation as soon as the sample becomes sparser.
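The three noise models can be generated directly for experimentation. This is an illustrative construction (the 0.3 scale factor and the specific forms are assumptions, not the paper's parameters): model 2 scales independent shocks by the signal, so its variance grows with the signal while remaining uncorrelated with it; model 3 adds a linear term in the signal on top.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 20_000
signal = np.exp(rng.normal(0.0, 1.0, n))                 # lognormal signal

noise_indep = rng.normal(0.0, 1.0, n)                    # model 1: additive, independent
noise_heter = 0.3 * signal * rng.normal(0.0, 1.0, n)     # model 2: heteroscedastic, uncorrelated
noise_corr = 0.3 * (signal - signal.mean()) + noise_heter  # model 3: + linear correlation

corr = lambda a, b: float(np.corrcoef(a, b)[0, 1])
```

Checking `corr(noise_heter, signal)` confirms the perhaps surprising point in the abstract: variance that depends on the signal does not by itself create correlation with the signal.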

11.
Studies of site exploration, data assimilation, or geostatistical inversion measure parameter uncertainty in order to assess the optimality of a suggested scheme. This study reviews and discusses measures of parameter uncertainty in spatial estimation. Most measures originate from alphabetic criteria in optimal design and were transferred to geostatistical estimation. Further, rather intuitive, measures can be found in the geostatistical literature, and some new measures are suggested in this study. It is shown how these measures relate to the optimality alphabet and to relative entropy. Issues of physical and statistical significance are addressed wherever they arise. Computational feasibility and efficient ways to evaluate the above measures are discussed, and an illustrative synthetic case study is provided. A major conclusion is that the mean estimation variance and the averaged conditional integral scale are a powerful duo for characterizing conditional parameter uncertainty, with direct correspondence to the well-understood optimality alphabet. This study is based on cokriging generalized to uncertain means and trends, because it is the most general representative of linear spatial estimation within the Bayesian framework. Generalization to kriging and quasi-linear schemes is straightforward. Options for application to non-Gaussian and non-linear problems are discussed.
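Three of the classical alphabetic criteria are easy to state on a conditional covariance matrix: A-optimality looks at the mean variance (the trace), D-optimality at the log-determinant (the volume of the uncertainty ellipsoid), and E-optimality at the largest eigenvalue. A minimal sketch with hypothetical prior and posterior covariance matrices:

```python
import numpy as np

def alphabetic_criteria(C):
    """A, D, and E measures of a (conditional) parameter covariance matrix:
    A = mean variance, D = log-determinant, E = largest eigenvalue.
    Smaller is better for all three."""
    eig = np.linalg.eigvalsh(C)             # ascending eigenvalues
    return {"A": float(np.trace(C) / C.shape[0]),
            "D": float(np.sum(np.log(eig))),
            "E": float(eig[-1])}

prior = np.array([[1.0, 0.5],
                  [0.5, 1.0]])
posterior = np.array([[0.4, 0.1],           # hypothetical post-sampling covariance
                      [0.1, 0.3]])
m_prior = alphabetic_criteria(prior)
m_post = alphabetic_criteria(posterior)
```

A sampling scheme that shrinks all three measures reduces uncertainty in every alphabetic sense; the paper's point is that further measures (e.g., the averaged conditional integral scale) capture spatial structure these scalars miss.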

12.
In kriging problems with huge numbers of estimation points and measurements, computational power and storage capacity often place heavy limits on the maximum manageable problem size. In the past, a family of FFT-based algorithms for matrix operations has been developed. They allow extremely fast convolution, superposition, and inversion of covariance matrices under certain conditions. If adequately used in kriging problems, these algorithms lead to drastic speedups and reductions in storage requirements without changing the kriging estimator. However, they require second-order stationary covariance functions, estimation on regular grids, and measurements that also form a regular grid. In this study, we show how to alleviate these rather heavy and often unrealistic restrictions. Stationarity can be generalized to intrinsicity and beyond by decomposing kriging problems into the sum of a stationary problem and a formally decoupled regression task. We use universal kriging, because it covers arbitrary forms of unknown drift and all cases of generalized covariance functions. More generally still, we use an extension to uncertain rather than unknown drift coefficients. The sampling locations may now be irregular, but must form a subset of the estimation grid. Finally, we present asymptotically exact but fast approximations to the estimation variance, and point out applications to conditional simulation, cokriging, and sequential kriging. The drastic gain in computational and storage efficiency is demonstrated in test cases. High-resolution, data-rich fields such as rainfall interpolation from radar measurements, or seismic and other geophysical inversion, can especially benefit from these improvements.
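The core trick behind such FFT-based methods can be shown in one function: on a regular (here periodic) grid with a stationary covariance, the covariance matrix is circulant, so multiplying it by a vector reduces to an FFT, a pointwise product, and an inverse FFT in O(n log n), without ever building the dense matrix. A minimal 1-D sketch:

```python
import numpy as np

def circulant_cov_matvec(c_row, v):
    """Multiply the circulant covariance matrix whose (symmetric) first
    row is c_row by v, via the convolution theorem, in O(n log n)."""
    return np.real(np.fft.ifft(np.fft.fft(c_row) * np.fft.fft(v)))

n = 64
h = np.minimum(np.arange(n), n - np.arange(n))    # periodic lag distances
c_row = np.exp(-h / 8.0)                          # stationary exponential covariance
v = np.random.default_rng(1).normal(size=n)

fast = circulant_cov_matvec(c_row, v)
dense = np.array([[c_row[(j - i) % n] for j in range(n)]
                  for i in range(n)]) @ v          # O(n^2) reference
```

Embedding non-periodic grids in a larger periodic one (circulant embedding) extends the same idea to ordinary covariance matrices; the paper's contribution is relaxing the stationarity and regular-measurement-grid requirements around this kernel.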

13.
Geostatistical optimization in the design of infill boreholes is an important cost-effective approach to increasing the accuracy of the tonnage and grade of an ore deposit. In this research, a new approach is proposed to design optimum infill directional boreholes. In the proposed approach, the kriging estimation variance is taken as the objective function, and the number and properties of the optimum boreholes are estimated so as to minimize this objective function. The optimization procedure is implemented with the Particle Swarm Optimization (PSO) algorithm. The range of the spatial and directional properties of new boreholes is determined from the primary information on the mineralization and the administrative constraints of drilling. The PSO algorithm is then applied iteratively; in each iteration, the change in the estimated kriging variance after drilling the new boreholes is determined, and the properties of the new boreholes are updated. The iterative procedure continues until the minimum kriging variance is attained. The approach was applied to the Dalli Cu-Au porphyry deposit in Iran, and three new infill directional boreholes were designed, taking into account six earlier boreholes from the preliminary exploration stage. The new optimum boreholes were located where the least information from the preliminary exploration stage exists and the variance is highest. Two of the new boreholes are near-vertical (78°) and the third is inclined at a 55° dip. By drilling these three new boreholes, the estimated grade model could be upgraded by 20%. Simplicity, speed, and the ability to search for the required number and specifications of a group of directional boreholes in a 3D environment are the main advantages of the proposed approach.
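The PSO loop itself can be sketched in a few lines. The toy below is a stand-in, not the paper's method: it places a single 2-D collar location, and the objective is a coverage proxy (mean distance from each block to its nearest hole) rather than the kriging variance; only the objective function and the particle dimensionality would change for real 3-D directional boreholes.

```python
import math
import random

existing = [(0.1, 0.2), (0.2, 0.8), (0.9, 0.1)]          # hypothetical earlier boreholes
grid = [(x / 9.0, y / 9.0) for x in range(10) for y in range(10)]

def variance_proxy(bh):
    """Proxy for the average kriging variance after adding collar bh:
    mean distance from each block to its nearest hole."""
    holes = existing + [bh]
    return sum(min(math.dist(g, h) for h in holes) for g in grid) / len(grid)

def pso(f, steps=60, n=12, seed=3):
    """Minimal particle swarm over the unit square: inertia plus
    cognitive and social pulls toward personal and global bests."""
    rnd = random.Random(seed)
    pos = [[rnd.random(), rnd.random()] for _ in range(n)]
    vel = [[0.0, 0.0] for _ in range(n)]
    pbest = [p[:] for p in pos]
    pf = [f(tuple(p)) for p in pos]
    g = pbest[min(range(n), key=pf.__getitem__)][:]
    for _ in range(steps):
        for i in range(n):
            for d in range(2):
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * rnd.random() * (pbest[i][d] - pos[i][d])
                             + 1.5 * rnd.random() * (g[d] - pos[i][d]))
                pos[i][d] = min(1.0, max(0.0, pos[i][d] + vel[i][d]))
            fi = f(tuple(pos[i]))
            if fi < pf[i]:
                pbest[i], pf[i] = pos[i][:], fi
        g = pbest[min(range(n), key=pf.__getitem__)][:]
    return tuple(g), min(pf)

site, val = pso(variance_proxy)   # lands in the sparsely drilled area
```

As in the abstract, the optimizer is drawn to where the existing holes leave the largest information gap.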

14.
Typically, datasets originating from mining exploration sites and industrially polluted and hazardous waste sites are spatially correlated over the region under investigation. Ordinary kriging (OK) is a well-established geostatistical tool for predicting variables such as precious metal contents, biomass, species counts, and environmental pollutants at unsampled spatial locations, based on data collected from neighboring sampled locations at these sites. One of the assumptions required to perform OK is that the mean of the characteristic of concern is constant over the entire region under consideration (e.g., there is no spatial trend present in the contaminant distribution across the site). This assumption may be violated by datasets obtained from environmental applications. The occurrence of a spatial trend in a dataset collected from a polluted site is an indication of the presence of two or more statistical populations (strata) with significantly different mean concentrations. Use of OK in these situations can result in inaccurate kriging estimates with higher standard deviations (SDs), which, in turn, can lead to incorrect decisions regarding all subsequent environmental monitoring and remediation activities. A univariate and a multivariate approach are described to identify any spatial trend that may be present at the site. The trend is then removed by subtracting the respective means from the corresponding populations. The results of OK before and after trend removal are compared. Using a real dataset, it is shown that the SDs of the kriging estimates obtained after trend removal are uniformly smaller than the corresponding SDs of the estimates obtained without trend removal.
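The trend-removal step described above is simply mean subtraction per stratum. A minimal sketch with a hypothetical two-population dataset (the stratum labels would come from the paper's univariate or multivariate trend identification):

```python
import statistics

def remove_strata_means(values, labels):
    """Subtract each stratum's mean so the residuals satisfy the
    constant-mean assumption of ordinary kriging; kriging is then run
    on the residuals and the stratum mean added back afterwards."""
    means = {g: statistics.fmean(v for v, l in zip(values, labels) if l == g)
             for g in set(labels)}
    return [v - means[l] for v, l in zip(values, labels)], means

vals = [1.0, 2.0, 3.0, 11.0, 12.0, 13.0]   # two populations, very different means
labs = ["low"] * 3 + ["high"] * 3
resid, means = remove_strata_means(vals, labs)
```

The residuals have far less spread than the raw values, which is why the kriging SDs shrink after trend removal.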

15.
Ordinary kriging and non-linear geostatistical estimators are now well-accepted methods in mining grade control and mine reserve estimation. In kriging, the search volume or 'kriging neighbourhood' is defined by the user. The definition of the search space can have a significant impact on the outcome of the kriging estimate; in particular, a neighbourhood that is too restrictive can result in serious conditional bias. Kriging is commonly described as a 'minimum variance estimator', but this is only true when the neighbourhood is properly selected. Arbitrary decisions about the search space are highly risky. The criteria to consider when evaluating a particular kriging neighbourhood are the slope of the regression of the 'true' on the 'estimated' block grades, the number of negative kriging weights, and the kriging variance. The search radius is one of the most important parameters of the search volume, and is often determined on the basis of the range of influence of the variogram. In this paper the above-mentioned parameters are used to determine an optimal search radius.
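Two of the diagnostics named above (negative weights and the kriging variance) fall directly out of the ordinary-kriging system. A minimal 1-D sketch under an assumed exponential covariance, with hypothetical sample positions:

```python
import numpy as np

def ok_weights(xs, x0, sill=1.0, rng=5.0):
    """Solve the ordinary-kriging system [C 1; 1' 0][w; mu] = [c; 1]
    for 1-D sites xs and target x0; return the weights and the
    kriging variance sill - w.c - mu."""
    n = len(xs)
    cov = lambda h: sill * np.exp(-np.abs(h) / rng)
    A = np.ones((n + 1, n + 1))
    A[n, n] = 0.0
    A[:n, :n] = cov(xs[:, None] - xs[None, :])
    b = np.ones(n + 1)
    b[:n] = cov(xs - x0)
    sol = np.linalg.solve(A, b)
    w, mu = sol[:n], sol[n]
    return w, float(sill - w @ b[:n] - mu)

xs = np.array([0.0, 1.0, 2.0, 10.0])     # hypothetical neighbourhood
w, kv = ok_weights(xs, x0=1.2)
n_negative = int(np.sum(w < 0))          # one of the neighbourhood diagnostics
```

Re-running this while varying the search radius (i.e., which sites enter `xs`) and tracking the weight signs and the kriging variance is exactly the kind of neighbourhood audit the paper formalizes; the regression-slope criterion additionally needs the block covariances.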

16.
Two fundamentally different sources of randomness exist on which design and inference in spatial sampling can be based: (a) the variation that would occur on resampling the same spatial population with other sampling configurations generated by the same design, and (b) the variation occurring on sampling other populations, hypothetically generated by the same spatial model, using the same sampling configuration. The former leads to the design-based approach, which uses classical sampling theory; the latter leads to the model-based approach, which uses geostatistical theory. Failure to recognize these two sources of randomness causes misunderstanding about the dependence of variables and the role of randomization in sampling, unwarranted narrowing of the choice of sampling strategies to those that are model-based, and abuse in simulation experiments. This is exemplified by Barnes' publication on the required sample size for geologic site characterization by nonparametric tolerance intervals. A basic design-based strategy like Simple Random Sampling is shown to require smaller sample sizes than the model-based strategy advocated by Barnes. In addition, Simple Random Sampling is completely robust against model errors and less complicated.
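The Simple Random Sampling side of the comparison can be made concrete with the standard nonparametric tolerance-bound sample size (shown here as an illustration of the design-based calculation, not as the paper's full analysis): if the sample maximum is used as a one-sided upper bound, the smallest n covering a fraction `coverage` of the population with a given confidence satisfies 1 - coverage**n >= confidence.

```python
import math

def srs_tolerance_n(coverage, confidence):
    """Smallest Simple-Random-Sampling size n such that the sample
    maximum is a one-sided upper nonparametric tolerance bound covering
    `coverage` of the population with probability `confidence`."""
    return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

n_95_95 = srs_tolerance_n(0.95, 0.95)   # the classic "rule of 59"
n_90_95 = srs_tolerance_n(0.90, 0.95)
```

No variogram, no stationarity assumption, no model at all: the guarantee comes purely from the randomization, which is the robustness the abstract emphasizes.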


18.
The average kriging variance is a standard tool in optimizing the location of additional drill holes. However, this tool cannot distinguish between areas with different priorities. This limitation can be eliminated by using a weighted average kriging variance. This paper extends the optimal-location problem to three-dimensional cases, uses grade as a weight, and searches for optimum locations by simulated annealing, with the weighted average kriging variance as the objective function. The method is applied to a copper deposit. The results show that weighting the estimation variance by grade is effective only when the difference among the grades estimated for different blocks is considerable.
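The objective function is a one-liner. A minimal sketch with hypothetical block variances and grades, showing both the effect of grade weighting and the abstract's caveat that near-equal grades make the weighting irrelevant:

```python
def weighted_avg_kriging_variance(kvar, grades):
    """Objective for additional-drill-hole placement: each block's
    kriging variance weighted by its estimated grade, so high-grade
    areas receive priority."""
    return sum(k * g for k, g in zip(kvar, grades)) / sum(grades)

kvar = [0.8, 0.8, 0.2]                                        # block kriging variances
uniform = weighted_avg_kriging_variance(kvar, [1.0, 1.0, 1.0])   # equal grades: plain average
graded = weighted_avg_kriging_variance(kvar, [0.1, 0.1, 5.0])    # one rich, well-drilled block
near = weighted_avg_kriging_variance(kvar, [1.0, 1.0, 1.001])    # grades barely differ
```

With strongly contrasting grades the objective shifts toward the high-grade block; with nearly equal grades it collapses back to the plain average, matching the paper's finding.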

19.
A stationary specification of anisotropy does not always capture the complexities of a geologic site. In this situation, the anisotropy can be varied locally: the directions of continuity and the range of the variogram can change depending on location within the domain being modeled. Kriging equations have been developed to use a local anisotropy specification within kriging neighborhoods; however, this approach does not account for variation in anisotropy within the kriging neighborhood. This paper presents an algorithm to determine the optimum path between points, i.e., the path of highest covariance in the presence of locally varying anisotropy. Using optimum paths increases covariance, results in lower estimation variance, and leads to results that reflect important curvilinear structures. Although CPU-intensive, the complex curvilinear structures of the kriged maps are important for process evaluation. Examples highlight the ability of this methodology to reproduce complex features that could not be generated with traditional kriging.
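The optimum-path search is a shortest-path problem once local anisotropy is recast as a local step cost (low cost along directions of strong continuity, so the minimum-cost path plays the role of the highest-covariance path). A minimal grid sketch using Dijkstra's algorithm, with a hypothetical cost field containing a low-cost channel:

```python
import heapq
import math

def shortest_aniso_path(cost, start, goal):
    """Dijkstra over a grid of local step costs; edge cost between
    4-neighbours is the average of the two cell costs."""
    n, m = len(cost), len(cost[0])
    dist = {start: 0.0}
    pq = [(0.0, start)]
    while pq:
        d, (i, j) = heapq.heappop(pq)
        if (i, j) == goal:
            return d
        if d > dist.get((i, j), math.inf):
            continue                      # stale queue entry
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < n and 0 <= nj < m:
                nd = d + 0.5 * (cost[i][j] + cost[ni][nj])
                if nd < dist.get((ni, nj), math.inf):
                    dist[(ni, nj)] = nd
                    heapq.heappush(pq, (nd, (ni, nj)))
    return math.inf

# A low-cost channel along the top row and right column (strong continuity)
# makes the curvilinear route cheaper than cutting across the interior.
cost = [[1.0, 1.0, 1.0, 1.0],
        [9.0, 9.0, 9.0, 1.0],
        [9.0, 9.0, 9.0, 1.0]]
d = shortest_aniso_path(cost, (0, 0), (2, 3))
```

The returned path hugs the channel rather than the straight line, which is exactly the curvilinear behaviour the kriged maps inherit.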

20.
Mineral inventory determination consists of estimating the amount of mineral resources on a block-by-block basis and classifying individual blocks into categories of increasing geologic confidence. Such classification is a crucial issue for mining companies, investors, financial institutions, and authorities, but it remains subject to some confusion because of the wide variation in methodologies and the lack of standardized procedures. The first part of this paper considers some of the criteria used to classify resources in practice and their impact, through a sensitivity study using data from a Chilean porphyry copper deposit. Five classification criteria are compared and evaluated: search neighborhoods, absolute and relative kriging variances, and absolute and relative conditional simulation variances. It is shown that some classification criteria either favor or penalize the high-grade areas if the grade distribution presents a proportional effect. In the second part of the paper, conditional simulations are used to quantify the uncertainty in the overall mineral resources. This approach is promising for risk analysis and decision-making. Unlike linear kriging, simulations allow the inclusion of a cutoff grade in the calculation of the resources and also provide measures of their joint uncertainty over production volumes.
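The absolute-versus-relative disagreement under a proportional effect is easy to demonstrate. The sketch below is illustrative only: the cutoffs, the two-category scheme, and the coefficient-of-variation form of the relative criterion are assumptions, not the paper's calibrated values.

```python
def classify(block_grade, block_var, abs_cut=0.5, rel_cut=0.15):
    """Classify one block under two criteria: an absolute kriging-variance
    cutoff and a relative (coefficient-of-variation-like) cutoff."""
    absolute = "measured" if block_var <= abs_cut else "indicated"
    relative = "measured" if (block_var ** 0.5) / block_grade <= rel_cut else "indicated"
    return absolute, relative

# Proportional effect: variance grows with grade.
high = classify(block_grade=10.0, block_var=2.0)   # high grade, high absolute variance
low = classify(block_grade=1.0, block_var=0.3)     # low grade, low absolute variance
```

The absolute criterion downgrades the high-grade block while the relative criterion upgrades it, and vice versa for the low-grade block: the choice of criterion systematically favors or penalizes the high-grade areas, as the sensitivity study shows.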
