Similar Documents
1.
Estimation of Pearson’s correlation coefficient between two time series, in the evaluation of the influences of one time-dependent variable on another, is an often used statistical method in climate sciences. Data properties common to climate time series, namely non-normal distributional shape, serial correlation, and small data sizes, call for advanced, robust methods to estimate accurate confidence intervals to support the correlation point estimate. Bootstrap confidence intervals are estimated in the Fortran 90 program PearsonT (Mudelsee, Math Geol 35(6):651–665, 2003), where the main intention is to obtain accurate confidence intervals for correlation coefficients between two time series by taking the serial dependence of the data-generating process into account. However, Monte Carlo experiments show that the coverage accuracy of the confidence intervals for smaller data sizes can be substantially improved. In the present paper, the existing program is adapted into a new version, called PearsonT3, by calibrating the confidence interval to increase the coverage accuracy. Calibration is a bootstrap resampling technique that performs a second bootstrap loop (it resamples from the bootstrap resamples). Like the non-calibrated bootstrap confidence intervals, it offers robustness against the data distribution. Pairwise moving block bootstrap resampling is used to preserve the serial dependence of both time series. The calibration is applied to standard-error-based bootstrap Student's t confidence intervals. The performance of the calibrated confidence interval is examined with Monte Carlo simulations and compared with that of confidence intervals without calibration. The coverage accuracy is evidently better for the calibrated confidence intervals, with the coverage error already acceptably small (within a few percentage points) for data sizes as small as 20.
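The pairwise moving block bootstrap that PearsonT3 builds on can be sketched as follows. This is an illustrative stand-in, not the published Fortran 90 code: the function names are invented, the second (calibration) bootstrap loop is omitted for brevity, and a normal quantile replaces the Student's t critical value to stay numpy + stdlib only.

```python
import numpy as np
from statistics import NormalDist

def moving_block_indices(n, block_len, rng):
    """Indices for one pairwise moving block bootstrap resample of size n."""
    n_blocks = -(-n // block_len)  # ceiling division
    starts = rng.integers(0, n - block_len + 1, size=n_blocks)
    idx = np.concatenate([np.arange(s, s + block_len) for s in starts])
    return idx[:n]

def block_bootstrap_pearson_ci(x, y, block_len=4, n_boot=1000,
                               alpha=0.05, seed=0):
    """Standard-error-based bootstrap confidence interval for Pearson's r.

    Resampling (x_i, y_i) pairs in blocks preserves the serial dependence
    of both series within each block.
    """
    rng = np.random.default_rng(seed)
    x, y = np.asarray(x, float), np.asarray(y, float)
    r_hat = np.corrcoef(x, y)[0, 1]
    r_boot = np.empty(n_boot)
    for b in range(n_boot):
        idx = moving_block_indices(len(x), block_len, rng)
        r_boot[b] = np.corrcoef(x[idx], y[idx])[0, 1]
    se = r_boot.std(ddof=1)                    # bootstrap standard error
    z = NormalDist().inv_cdf(1.0 - alpha / 2.0)
    return r_hat - z * se, r_hat + z * se
```

Calibration would wrap a second resampling loop around this interval to tune its nominal level until the achieved coverage matches the target.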

2.
Assessment of the sampling variance of the experimental variogram is an important topic in geostatistics as it gives the uncertainty of the variogram estimates. This assessment, however, is repeatedly overlooked in most applications mainly, perhaps, because a general approach has not been implemented in the most commonly used software packages for variogram analysis. In this paper the authors propose a solution that can be implemented easily in a computer program, and which, subject to certain assumptions, is exact. These assumptions are not very restrictive: second-order stationarity (the process has a finite variance and the variogram has a sill) and, solely for the purpose of evaluating fourth-order moments, a Gaussian distribution for the random function. The approach described here gives the variance–covariance matrix of the experimental variogram, which takes into account not only the correlation among the experimental values but also the multiple use of data in the variogram computation. Among other applications, standard errors may be attached to the variogram estimates and the variance–covariance matrix may be used for fitting a theoretical model by weighted, or by generalized, least squares. Confidence regions that hold a given confidence level for all the variogram lag estimates simultaneously have been calculated using the Bonferroni method for rectangular intervals, and using the multivariate Gaussian assumption for K-dimensional elliptical intervals (where K is the number of experimental variogram estimates). A general approach for incorporating the uncertainty of the experimental variogram into the uncertainty of the variogram model parameters is also shown. A case study with rainfall data is used to illustrate the proposed approach.
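A minimal sketch of the Bonferroni construction for rectangular intervals holding simultaneously over K lags. The standard errors are taken as given here; in the paper they come from the derived variance–covariance matrix of the experimental variogram. Function names are illustrative.

```python
import numpy as np
from statistics import NormalDist

def experimental_variogram(z, max_lag):
    """Method-of-moments variogram of a regularly spaced 1-D transect."""
    z = np.asarray(z, dtype=float)
    return np.array([0.5 * np.mean((z[h:] - z[:-h]) ** 2)
                     for h in range(1, max_lag + 1)])

def bonferroni_intervals(gamma, se, conf=0.95):
    """Rectangular intervals holding *jointly* at level `conf` over K lags.

    The Bonferroni method splits the allowed error rate evenly across
    lags, so each per-lag interval uses level 1 - (1 - conf)/K.
    """
    gamma, se = np.asarray(gamma, float), np.asarray(se, float)
    K = len(gamma)
    z = NormalDist().inv_cdf(1.0 - (1.0 - conf) / (2.0 * K))
    return gamma - z * se, gamma + z * se
```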

3.
The conditional probabilities (CP) method implements a new procedure for the generation of transmissivity fields conditional to piezometric head data, capable of sampling non-multi-Gaussian random functions and of integrating soft and secondary information. The CP method combines the advantages of the self-calibrated (SC) method with probability fields to circumvent some of the drawbacks of the SC method, namely its difficulty in integrating soft and secondary information or in generating non-Gaussian fields. The SC method is based on the perturbation of a seed transmissivity field already conditional to transmissivity and secondary data, with the perturbation being a function of the transmissivity variogram. The CP method is also based on the perturbation of a seed field; however, the perturbation is made a function of the full transmissivity bivariate distribution and of the correlation with the secondary data. The two methods are applied to a sample of an exhaustive non-Gaussian data set of natural origin to demonstrate the value of a simulation method capable of modeling the spatial patterns of transmissivity variability beyond the variogram. A comparison of the probabilistic predictions of convective transport derived from a Monte Carlo exercise using both methods demonstrates the superiority of the CP method when the underlying spatial variability is non-Gaussian.

4.
The spatial distribution of cobalt-rich crust thicknesses on seamounts is partly controlled by water depth and slope gradients. Conventional distance–direction-based variograms have not effectively expressed the spatial self-correlation or anisotropy of the thicknesses of cobalt-rich crusts. To estimate resources in cobalt-rich crusts on seamounts using geostatistics, we constructed a new variogram model adapted to the spatial distribution of the thicknesses of the cobalt-rich crusts. In this model, we defined the data related to cobalt-rich crusts on seamounts as three-dimensional surface random variables, presented an experimental variogram process based on the distance–gradient or distance–“relative water depth,” and provided a theoretical variogram model that follows this process. This method was demonstrated by the spatial estimation of the thicknesses of cobalt-rich crusts on a seamount, and the results indicated that the new variogram model reflects the spatial self-correlation of the thicknesses of cobalt-rich crusts well. Substituted into the kriging equations, the new variogram model successfully estimated the spatial thickness distribution of these cobalt-rich crusts.

5.
This paper describes a geostatistical technique based on conditional simulations to assess confidence intervals of local estimates of lake pH values on the Canadian Shield. This geostatistical approach has been developed to deal with the estimation of phenomena with a spatial autocorrelation structure among observations. It uses the autocorrelation structure to derive minimum-variance unbiased estimates for points that have not been measured, or to estimate average values for new surfaces. A survey of lake water chemistry was conducted by the Ministère de l'Environnement du Québec between 1986 and 1990 to assess surface water quality and delineate the areas affected by acid precipitation on the southern Canadian Shield in Québec. The spatial structure of lake pH was modeled using two nested spherical variogram models, with ranges of 20 km and 250 km, accounting respectively for 20% and 55% of the spatial variation, plus a random component accounting for 25%. The pH data have been used to construct a number of geostatistical simulations that produce plausible realizations of a given random function model, while 'honoring' the experimental values (i.e., the real data points are among the simulated data), and that correspond to the same underlying variogram model. Post-processing of a large number of these simulations, which are equally likely to occur, enables the estimation of mean pH values, the proportion of affected lakes (lakes with pH≤5.5), and the potential error of these parameters within small regions (100 km×100 km). The method provides a procedure to establish whether acid rain control programs will succeed in reducing acidity in surface waters, allowing one to consider small areas with particular physiographic features rather than large drainage basins with several sources of heterogeneity. This judgment on the reduction of surface water acidity will be possible only if the amount of uncertainty in the estimation of mean pH is properly quantified.
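The post-processing step, estimating block parameters and their uncertainty from a set of equally likely realizations, can be sketched as below. The array layout and function name are assumptions for illustration, not the study's actual code.

```python
import numpy as np

def summarize_ph_realizations(sims, threshold=5.5):
    """Post-process equally likely conditional simulations of lake pH.

    `sims` has shape (n_realizations, n_lakes) for one block. Returns
    the estimated mean pH, the expected proportion of affected lakes
    (pH <= threshold), and across-realization standard deviations
    quantifying the uncertainty of both parameters.
    """
    sims = np.asarray(sims, dtype=float)
    mean_ph = sims.mean(axis=1)              # one block mean per realization
    prop = (sims <= threshold).mean(axis=1)  # affected fraction per realization
    return (mean_ph.mean(), mean_ph.std(ddof=1),
            prop.mean(), prop.std(ddof=1))
```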
Received: 3 March 1997 · Accepted: 16 November 1998

6.
Seismic measurements may be used in geostatistical techniques for estimation and simulation of petrophysical properties such as porosity. The good correlation between seismic and rock properties provides a basis for these techniques. Seismic data have a wide spatial coverage not available in log or core data. However, each seismic measurement has a characteristic response function determined by the source–receiver geometry and signal bandwidth. The image response of the seismic measurement gives a filtered version of the true velocity image. Therefore the seismic image cannot reflect exactly the true seismic velocity at all scales of spatial heterogeneity present in the Earth. The seismic response function can be approximated conveniently in the spatial spectral domain using the Born approximation. How the seismic image response affects the estimation of variograms and spatial scales, and its impact on geostatistical results, is the focus of this paper. Limitations of view angles and signal bandwidth not only smooth the seismic image, increasing the variogram range, but also can introduce anisotropic spatial structures into the image. The seismic data are enhanced by better characterizing and quantifying these attributes. As an exercise, examples of seismically assisted cokriging and cosimulation of porosity between wells are presented.

7.
The variogram is a critical input to geostatistical studies: (1) it is a tool to investigate and quantify the spatial variability of the phenomenon under study, and (2) most geostatistical estimation or simulation algorithms require an analytical variogram model, which they will reproduce with statistical fluctuations. In the construction of numerical models, the variogram reflects some of our understanding of the geometry and continuity of the variable, and can have a very important impact on predictions from such numerical models. The principles of variogram modeling are developed and illustrated with a number of practical examples. A three-dimensional interpretation of the variogram is necessary to fully describe geologic continuity. Directional continuity must be described simultaneously to be consistent with principles of geological deposition and for a legitimate measure of spatial variability for geostatistical modeling algorithms. Interpretation principles are discussed in detail. Variograms are modeled with particular functions for reasons of mathematical consistency. Used correctly, such variogram models account for the experimental data, geological interpretation, and analogue information. The steps in this essential data integration exercise are described in detail through the introduction of a rigorous methodology.
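As an example of a particular function used for mathematical consistency, the widely used spherical model can be sketched as below (the nugget-plus-structure parameterization is one common convention; the function name is illustrative).

```python
import numpy as np

def spherical_variogram(h, nugget, sill, a):
    """Spherical model: a licit (conditionally negative semidefinite)
    variogram function, so kriging and simulation remain consistent.

    Rises from the nugget at the origin to the sill at the range `a`,
    and stays flat beyond it.
    """
    h = np.asarray(h, dtype=float)
    inside = nugget + (sill - nugget) * (1.5 * h / a - 0.5 * (h / a) ** 3)
    g = np.where(h < a, inside, sill)
    return np.where(h == 0.0, 0.0, g)  # exact zero at the origin
```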

8.
The experimental variogram computed in the usual way by the method of moments and the Haar wavelet transform are similar in that they filter data and yield informative summaries that may be interpreted. The variogram filters out constant values; wavelets can filter variation at several spatial scales and thereby provide a richer repertoire for analysis and demand no assumptions other than that of finite variance. This paper compares the two functions, identifying that part of the Haar wavelet transform that gives it its advantages. It goes on to show that the generalized variogram of order k=1, 2, and 3 filters linear, quadratic, and cubic polynomials from the data, respectively, which correspond with more complex wavelets in Daubechies's family. The additional filter coefficients of the latter can reveal features of the data that are not evident in its usual form. Three examples in which data recorded at regular intervals on transects are analyzed illustrate the extended form of the variogram. The apparent periodicity of gilgais in Australia seems to be accentuated as filter coefficients are added, but otherwise the analysis provides no new insight. Analysis of hyperspectral data with a strong linear trend showed that the wavelet-based variograms filtered it out. Adding filter coefficients in the analysis of the topsoil across the Jurassic scarplands of England changed the upper bound of the variogram; it then resembled the within-class variogram computed by the method of moments. To elucidate these results, we simulated several series of data to represent a random process with values fluctuating about a mean, data with long-range linear trend, data with local trend, and data with stepped transitions. The results suggest that the wavelet variogram can filter out the effects of long-range trend, but not local trend, and of transitions from one class to another, as across boundaries.
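The basic correspondence between the Haar transform and the variogram can be sketched at level 1. Note that this level-1 filter removes only a constant, mirroring the ordinary variogram; the trend-filtering behavior discussed above requires the higher-order generalized variograms or longer Daubechies filters. Function names are illustrative.

```python
import numpy as np

def haar_details(z):
    """Level-1 Haar wavelet detail coefficients of an even-length series."""
    z = np.asarray(z, dtype=float)
    return (z[1::2] - z[0::2]) / np.sqrt(2.0)

def haar_semivariance(z):
    """Mean squared Haar detail: half the mean squared lag-1 increment
    over non-overlapping pairs. Like the variogram, it filters out any
    constant level, which is why the two yield comparable summaries."""
    d = haar_details(z)
    return float(np.mean(d ** 2))
```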

9.
Estimation of linear combinations is accomplished by using the observed (available) data. Accordingly, to require the negative of a modeled variogram function to be positive definite for all possible data combinations is unnecessary when only the observed data are used in estimation. The requirement that the negative of a variogram model be conditionally positive semidefinite is then relaxed to apply at the observed spatial locations only. In this setting a simple, yet crude, sufficient condition is developed to ensure that a variogram model will yield nonnegative variances for the available data. It is seen that the condition is independent of the dimensionality of the data and applies to both isotropic and anisotropic models. An example of the application of the condition is also presented. The condition is harder to satisfy as the amount of data increases and must be adjusted as the variogram changes to accommodate new data.
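The relaxed requirement can be illustrated numerically: check conditional negative semidefiniteness of the variogram matrix at the observed locations only, rather than for all possible configurations. This eigenvalue check is one generic way to test the condition; it is not the paper's specific sufficient condition.

```python
import numpy as np

def valid_on_observed(gamma_matrix, tol=1e-8):
    """Check that w' Gamma w <= 0 for every weight vector w with
    sum(w) = 0, at the observed locations only, so every estimation
    variance (-w' Gamma w) for the available data is nonnegative.
    """
    G = np.asarray(gamma_matrix, dtype=float)
    n = G.shape[0]
    # Columns e_i - e_{i+1} span the zero-sum weight subspace.
    B = np.zeros((n, n - 1))
    idx = np.arange(n - 1)
    B[idx, idx] = 1.0
    B[idx + 1, idx] = -1.0
    M = B.T @ G @ B
    M = 0.5 * (M + M.T)  # symmetrize before eigendecomposition
    return bool(np.linalg.eigvalsh(M).max() <= tol)
```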

10.
Kriging is today the most widely used method of estimating spatial data in fields such as the geosciences. To apply kriging methods, the data and variogram model parameters must be precise. To utilize imprecise (fuzzy) data and parameters, fuzzy kriging methods are used. Although it has been 30 years since different fuzzy kriging algorithms were proposed, their use has not become as common as that of other kriging methods (ordinary, simple, log, universal, etc.); the lack of comprehensive software that can perform the related calculations in a 3D space, based on the different fuzzy kriging algorithms, may be the main reason. This paper describes an open-source software toolbox (developed in Matlab) for running the different algorithms proposed for fuzzy kriging. Besides a short presentation of the fuzzy kriging method and an introduction to the functions provided by the FuzzyKrig toolbox, it presents three cases of application of the software under the conditions where: 1) the data are hard and the variogram model parameters are fuzzy, 2) the data are fuzzy and the variogram model parameters are hard, and 3) both the data and the variogram model parameters are fuzzy.

11.
This paper describes two new approaches that can be used to compute the two-dimensional experimental wavelet variogram. They are based on an extension of earlier work in one dimension. The methods are powerful 2D generalizations of the 1D variogram that use one- and two-dimensional filters to remove different types of trend present in the data and to provide information on the underlying variation simultaneously. In particular, the two-dimensional filtering method is effective in removing polynomial trend with filters having a simple structure. These methods are tested with simulated fields and microrelief data, and generate results similar to those of the ordinary method-of-moments variogram. Furthermore, from a filtering point of view, the variogram can be viewed in terms of a convolution of the data with a filter, which is computed fast in O(N log N) operations in the frequency domain. We can also generate images of the filtered data corresponding to the nugget effect, sill, and range of the variogram. This in turn provides additional tools to analyze the data further.
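The frequency-domain route can be illustrated in one dimension (the paper works in two): computing the experimental variogram in O(N log N) via an FFT-based autocorrelation. This is a generic sketch, not the authors' wavelet filters.

```python
import numpy as np

def variogram_fft(z, max_lag):
    """Experimental variogram of a regular 1-D series through the
    frequency domain, using the identity
    sum (z[t+h]-z[t])^2 = sum z[t]^2 + sum z[t+h]^2 - 2 sum z[t]*z[t+h],
    where the cross term is a linear autocorrelation obtained by FFT.
    """
    z = np.asarray(z, dtype=float)
    n = len(z)
    f = np.fft.rfft(z, 2 * n)                        # zero-pad: no wrap-around
    cross = np.fft.irfft(f * np.conj(f), 2 * n)[:n]  # cross[h] = sum z[t]*z[t+h]
    csq = np.concatenate(([0.0], np.cumsum(z ** 2)))
    h = np.arange(1, max_lag + 1)
    a = csq[n - h]        # sum of z[t]^2 for t = 0 .. n-1-h
    b = csq[n] - csq[h]   # sum of z[t]^2 for t = h .. n-1
    return 0.5 * (a + b - 2.0 * cross[h]) / (n - h)
```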

12.
In the context of robust statistics, the breakdown point of an estimator is an important feature of reliability. It measures the highest fraction of contamination in the data that an estimator can support before being destroyed. In geostatistics, variogram estimators are based on measurements taken at various spatial locations. The classical notion of breakdown point needs to be extended to a spatial one, depending on the construction of the most unfavorable configurations of perturbation. Explicit upper and lower bounds are available for the spatial breakdown point in the regular unidimensional case. The difficulties arising in the multidimensional case are illustrated with a simple example in ℝ², as well as some simulations on irregular grids. In order to study the global effects of perturbations on variogram estimators, further simulations are carried out on data located on a regular or irregular bidimensional grid. Results show that if variogram estimation is performed with a 50% classical breakdown point scale estimator, the number of initial data likely to be contaminated before destruction of the estimator is roughly 30% on average. Theoretical results confirm the previous statement on data in ℝᵈ, d ≥ 1.

13.
Geostatistical analysis of spatial random functions frequently uses sample variograms computed from increments of samples of a regionalized random variable. This paper addresses the theory of computing variograms not from increments but from spatial variances. The objective is to extract information about the point support space from the average or larger support data. The variance is understood as a parametric and second moment average feature of a population. However, it is well known that when the population is for a stationary random function, spatial variance within a region is a function of the size and geometry of the region and not a function of location. Spatial variance is conceptualized as an estimation variance between two physical regions or a region and itself. If such spatial variances can be measured within windows of several sizes, they allow the computation of the sample variogram. The approach is extended to covariances between attributes, which lead to the cross-variogram. The case of nonpoint sample support of the blocks or elements composing each window is also included. A numerical example illustrates the application of this conceptualization.
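The size-dependence of within-region spatial variance that the method exploits can be illustrated with a simple windowed-variance computation. This is a toy 1-D sketch of the support effect, not the paper's derivation of variograms from such variances.

```python
import numpy as np

def within_window_variance(z, w):
    """Average variance within non-overlapping windows of length w.

    For a spatially correlated process this 'spatial variance' grows
    with the window size and geometry, the dependence the paper uses
    to recover variogram information from window variances.
    """
    z = np.asarray(z, dtype=float)
    n = len(z) // w
    blocks = z[:n * w].reshape(n, w)
    return float(blocks.var(axis=1, ddof=0).mean())
```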

14.
The classical variogram estimator proposed by Matheron can be written as a quadratic form of the observations. When data have an elliptically contoured distribution with constant mean, the correlation between the classical variogram estimator at two different lags is a function of the spatial design matrix, the covariance matrix, and the kurtosis. Several specific cases are studied closely. A subclass of elliptically contoured distributions with a particular family of covariance matrices is shown to possess exactly the same correlation structure for the classical variogram estimator as the multivariate independent Gaussian distribution. The consequences on variogram fitting by generalized least squares are discussed.
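The opening statement, that Matheron's classical estimator is a quadratic form of the observations, can be made concrete on a regular 1-D transect. The matrix construction below is illustrative, not the paper's notation.

```python
import numpy as np

def design_matrix(n, h):
    """Spatial design matrix A(h): Matheron's classical estimator at
    lag h equals z' A(h) z / (2 * N(h)) on a regular 1-D transect,
    with N(h) = n - h pairs. Each increment contributes the rank-one
    term (e_t - e_{t+h})(e_t - e_{t+h})'.
    """
    A = np.zeros((n, n))
    for t in range(n - h):
        A[t, t] += 1.0
        A[t + h, t + h] += 1.0
        A[t, t + h] -= 1.0
        A[t + h, t] -= 1.0
    return A
```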

15.
    
Geostatistics provides a suite of methods, summarized as kriging, to analyze a finite data set to describe a continuous property of the Earth. Kriging methods consist of moving window optimum estimation techniques, which are based on a least-squares principle and use a spatial structure function, usually the variogram. Applications of kriging techniques have become increasingly widespread, with ordinary kriging and universal kriging being the most popular ones. The dependence of the final map or model on the input, however, is not generally understood. Herein we demonstrate how changes in the kriging parameters and the neighborhood search affect the cartographic result. Principles are illustrated through a glaciological study. The objective is to map ice thickness and subglacial topography of Storglaciären, Kebnekaise Massif, northern Sweden, from several sets of radio-echo soundings and hot water drillings. New maps are presented.

18.
The effect of outliers on estimates of the variogram depends on how they are distributed in space. The ‘spatial breakdown point’ is the largest proportion of observations which can be drawn from some arbitrary contaminating process without destroying a robust variogram estimator, when they are arranged in the most damaging spatial pattern. A numerical method is presented to find the spatial breakdown point for any sample array in two dimensions or more. It is shown by means of some examples that such a numerical approach is needed to determine the spatial breakdown point for two or more dimensions, even on a regular square sample grid, since previous conjectures about the spatial breakdown point in two dimensions do not hold. The ‘average spatial breakdown point’ has been used as a basis for practical guidelines on the intensity of contaminating processes that can be tolerated by robust variogram estimators. It is the largest proportion of contaminating observations in a data set such that the breakdown point of the variance estimator used to obtain point estimates of the variogram is not exceeded by the expected proportion of contaminated pairs of observations over any lag. In this paper the behaviour of the average spatial breakdown point is investigated for cases where the contaminating process is spatially dependent. It is shown that in two dimensions the average spatial breakdown point is 0.25. Finally, the ‘empirical spatial breakdown point’, a tool for the exploratory analysis of spatial data thought to contain outliers, is introduced and demonstrated using data on metal content in the soils of Sheffield, England. The empirical spatial breakdown point of a particular data set can be used to indicate whether the distribution of possible contaminants is likely to undermine a robust variogram estimator.

19.
20.
Is the ocean floor a fractal?   (Cited by: 1; self-citations: 0; by others: 1)
The topographic structure of the ocean bottom is investigated at different scales of resolution to answer the question: Can the seafloor be described as a fractal process? Methods from geostatistics, the theory of regionalized variables, are used to analyze the spatial structure of the ocean floor at different scales of resolution. The key to the analysis is the variogram criterion: Self-similarity of a stochastic process implies self-similarity of its variogram. The criterion is derived and proved here; it is also valid for special cases of self-affinity (in a sense adequate for topography). It has been proposed that seafloor topography can be simulated as a fractal (an object of Hausdorff dimension strictly larger than its topological dimension), having scaling properties (self-similarity or self-affinity). The objective of this study is to compare the implications of these concepts with observations of the seafloor. The analyses are based on SEABEAM bathymetric data from the East Pacific Rise at 13°N/104°W and at 9°N/104°W and use tracks that run both across the ridge crest and along the ridge flank. In the geostatistical evaluation, the data are considered as a stochastic process. The spatial continuity of this process is described by variograms that are calculated for different scales and directions. Application of the variogram criterion to scale-dependent variogram models yields the following results: Although the seafloor may be a fractal in the sense of the definition involving the Hausdorff dimension, it is not self-similar, nor self-affine (in the given sense). Mathematical models of scale-dependent spatial structures are presented, and their relationship to geologic processes such as ridge evolution, crust formation, and sedimentation is discussed.
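The variogram criterion can be sketched in one dimension: for a self-affine profile with Hurst exponent H, the variogram follows a single power law γ(h) ∝ h^(2H), and the profile's fractal dimension is D = 2 − H. The code below estimates D from the small-lag log-log slope; departures from one power law across scales, as found for the seafloor, argue against self-similarity. This is an illustrative sketch, not the SEABEAM analysis.

```python
import numpy as np

def variogram_1d(z, lags):
    """Method-of-moments variogram at the given integer lags."""
    z = np.asarray(z, dtype=float)
    return np.array([0.5 * np.mean((z[h:] - z[:-h]) ** 2) for h in lags])

def profile_fractal_dimension(z, lags):
    """Estimate D = 2 - H from the log-log slope of the variogram
    at small lags, assuming gamma(h) ~ h^(2H) holds there."""
    lags = np.asarray(list(lags), dtype=int)
    g = variogram_1d(z, lags)
    slope = np.polyfit(np.log(lags.astype(float)), np.log(g), 1)[0]
    H = slope / 2.0
    return 2.0 - H
```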


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号