Similar Articles
20 similar articles found
1.
Computer vision provides several tools for analyzing and simulating textures. The principles of these techniques are similar to those of multiple-point geostatistics, namely the reproduction of patterns and perceptual consistency in the results, thus ensuring the reproduction of long-range connectivity. The only difference between these techniques and geostatistical simulation accounting for multiple-point statistics is that conditioning is not an issue in computer vision. We present a solution to the problem of conditioning simulated fields while simultaneously honoring multiple-point (pattern) statistics. The proposal is based on a texture synthesis algorithm in which a fixed (causal) search pattern is used. Conditioning is achieved by adding a non-causal search neighborhood that modifies the conditional distribution from which the simulated category is drawn, depending on the conditioning information. Results show an excellent reproduction of the features of the training image while respecting the conditioning information. Some issues related to the data structure and to computational efficiency are discussed.
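A minimal sketch of the conditioning idea described above: the conditional distribution for the node being simulated is first built from training-image locations whose (causal) neighborhood matches the already simulated values, and is then modified by any conditioning data found in a non-causal search window. All array and function names, and the ad-hoc re-weighting of the distribution, are illustrative assumptions rather than the authors' algorithm.

```python
import numpy as np

def category_distribution(ti, sim, hard, node, causal, noncausal, n_cat):
    """Probability vector over categories for the node being simulated.

    ti        : 2D int array, training image (categories 0..n_cat-1)
    sim       : 2D int array of already simulated values (-1 = not yet simulated)
    hard      : 2D int array of conditioning data (-1 = no datum)
    node      : (i, j) location being simulated
    causal    : offsets (di, dj) of previously simulated neighbors
    noncausal : offsets (di, dj) searched for conditioning data
    """
    i, j = node
    counts = np.ones(n_cat)                                  # small prior avoids zero probabilities
    for ti_i in range(ti.shape[0]):                          # scan training image for causal matches
        for ti_j in range(ti.shape[1]):
            ok = True
            for di, dj in causal:
                si, sj = i + di, j + dj
                ki, kj = ti_i + di, ti_j + dj
                if not (0 <= si < sim.shape[0] and 0 <= sj < sim.shape[1]) or sim[si, sj] < 0:
                    continue                                  # neighbor not informed yet
                if not (0 <= ki < ti.shape[0] and 0 <= kj < ti.shape[1]) or ti[ki, kj] != sim[si, sj]:
                    ok = False
                    break
            if ok:
                counts[ti[ti_i, ti_j]] += 1
    prob = counts / counts.sum()
    # non-causal modification: categories seen as hard data nearby get boosted (ad hoc, illustration only)
    for di, dj in noncausal:
        si, sj = i + di, j + dj
        if 0 <= si < hard.shape[0] and 0 <= sj < hard.shape[1] and hard[si, sj] >= 0:
            prob[hard[si, sj]] *= 2.0
    return prob / prob.sum()
```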

2.
3.
Truncated plurigaussian (TPG) simulation is a flexible method for simulating rock types in deposits with complicated ordering structures. The truncation of a multivariate Gaussian distribution controls the proportions and ordering of rock types in the simulation while the variogram for each Gaussian variable controls rock type continuity. The determination of a truncation procedure for complicated geological environments is not trivial. A method for determining the truncation and fitting variograms applicable to any number of rock types and multivariate Gaussian distribution is developed here to address this problem. Multidimensional scaling is applied to place dissimilar categories far apart and similar categories close together. The multivariate space is then mapped using a Voronoi decomposition and rotated to optimize variogram reproduction. A case study simulating geologic layers at a large mineral deposit demonstrates the potential of this method and compares the results with sequential indicator simulation (SIS). Input proportion and transition probability reproduction with TPG is demonstrated to be better than SIS. Variogram reproduction is comparable for both techniques.
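The multidimensional scaling and Voronoi steps can be illustrated with a short sketch. The dissimilarity matrix, the use of scikit-learn's MDS, and the nearest-centroid truncation rule are assumptions for illustration, not the authors' implementation (which also rotates the decomposition to optimize variogram reproduction).

```python
import numpy as np
from sklearn.manifold import MDS

# hypothetical symmetric dissimilarity matrix between 4 rock types
D = np.array([[0.0, 0.2, 0.8, 0.9],
              [0.2, 0.0, 0.6, 0.8],
              [0.8, 0.6, 0.0, 0.3],
              [0.9, 0.8, 0.3, 0.0]])

# embed rock types so that similar categories end up close together
mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
centroids = mds.fit_transform(D)                 # one 2D coordinate per rock type

def truncate(gaussians):
    """Map a vector of Gaussian values to a rock type via the Voronoi (nearest-centroid) rule."""
    d = np.linalg.norm(centroids - np.asarray(gaussians), axis=1)
    return int(np.argmin(d))

# example: truncate two correlated Gaussian values into a rock type
print(truncate([0.1, -0.4]))
```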

4.
A new approach is described to allow conditioning to both hard data (HD) and soft data for a patch- and distance-based multiple-point geostatistical simulation. Multinomial logistic regression is used to quantify the link between HD and soft data. The soft data are converted by the logistic regression classifier into as many probability fields as there are categories. The local category proportions are used and compared to the average category probabilities within the patch. Conditioning to HD is obtained using alternative training images and by imposing large relative weights on the HD. Conditioning to soft data is obtained by measuring the probability–proportion patch distance. Both 2D and 3D cases are considered. Synthetic cases show that a stationary TI can generate non-stationary realizations that reproduce the HD, keep the texture indicated by the TI, and follow the trends identified in the probability maps obtained from soft data. A real case study, the Mallik methane-hydrate field, shows perfect reproduction of the HD while keeping a good reproduction of the TI texture and probability trends.
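A minimal sketch, with invented data, of the soft-data calibration step: a multinomial logistic regression is fitted between a soft attribute and the categories observed at hard-data locations, and then used to convert the soft data into one probability field per category. Variable names and the scikit-learn classifier are assumptions for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# hypothetical soft attribute (e.g., an impedance-like value) at 200 hard-data locations
soft_at_hd = rng.normal(size=(200, 1))
# hypothetical categories (3 classes) observed at those same locations
cats_at_hd = np.digitize(soft_at_hd[:, 0] + 0.3 * rng.normal(size=200), [-0.5, 0.5])

clf = LogisticRegression().fit(soft_at_hd, cats_at_hd)   # multinomial softmax for 3 classes

# soft data over a 50 x 50 simulation grid -> one probability field per category
soft_grid = rng.normal(size=(50 * 50, 1))
prob_fields = clf.predict_proba(soft_grid).reshape(50, 50, -1)
print(prob_fields.shape)                                  # (50, 50, 3)
```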

5.
This work deals with the geostatistical simulation of a family of stationary random field models with bivariate isofactorial distributions. Such models are defined as the sum of independent random fields with mosaic-type bivariate distributions and infinitely divisible univariate distributions. For practical applications, dead leaf tessellations are used, since they provide a wide range of models and allow conditioning the realizations to a set of data via an iterative procedure (simulated annealing). The model parameters can be determined by comparing the data variogram and madogram, and they make it possible to control the spatial connectivity of the extreme values in the realizations. An illustration with a forest dataset is presented, in which a negative binomial model is used to characterize the distribution of coniferous trees over a wooded area.

6.
Exploring a valid model for the variogram of an isotropic spatial process
The variogram is one of the most important tools in the assessment of spatial variability and a crucial parameter for kriging. It is widely known that an empirical estimator of the variogram cannot be used directly in some contexts because it lacks conditional negative semi-definiteness. Consequently, once the variogram is estimated, a valid family must be chosen to fit an appropriate model. Under isotropy, this selection is usually carried out by eye from the estimated variogram curve. In this paper, a statistical methodology is proposed to explore a valid model for the variogram. The test statistic is based on quadratic forms of smoothed random variables that capture the underlying spatial variation. The distribution of the test statistic is approximated by a shifted chi-square distribution. A simulation study is carried out to check the power and size of the test. Reference bands are calculated as a complementary graphical tool. An example from the literature is used to illustrate the methodologies presented.
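The abstract approximates the null distribution of a quadratic-form statistic by a shifted chi-square. A generic three-moment matching for Q = zᵀAz with z ~ N(0, S) is sketched below; this is a standard construction, not necessarily the authors' exact recipe, and the matrices used are placeholders.

```python
import numpy as np

def shifted_chi2_params(A, S):
    """Match a + b * chi2(df) to the first three cumulants of Q = z' A z, z ~ N(0, S)."""
    M = A @ S
    c1 = np.trace(M)            # first cumulant  E[Q]
    c2 = np.trace(M @ M)        # second cumulant / 2
    c3 = np.trace(M @ M @ M)    # third cumulant / 8
    b = c3 / c2
    df = c2 ** 3 / c3 ** 2
    a = c1 - c2 ** 2 / c3
    return a, b, df

# tiny example with an assumed smoothing matrix A and covariance S
A = 0.5 * np.eye(5)
S = np.eye(5)
print(shifted_chi2_params(A, S))    # here Q = 0.5 * chi2(5) exactly: (0.0, 0.5, 5.0)
```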

7.
Zhu Chengying, Gao Xiaoqi. Inland Earthquake (内陆地震), 2011, 25(2): 158–165
The differencing, subordinate-function, and variation methods in the precursor-anomaly analysis module of the MapSIS software were applied to identify anomalies in the water level of Well Xin-20. The results indicate that: (1) the anomalies detected by differencing, the subordinate function, and the variation method all take the form of high-value anomalies; (2) differencing and the variation method have a relatively high missed-detection rate for near-field Ms ≥ 5 earthquakes, although when an anomaly does appear it is usually followed by a corresponding earthquake, whereas the subordinate function shows an earthquake-correspondence rate of 66.7% for near-field Ms ≥ 5 events and 71.4% for far-field Ms ≥ 7 events...
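A minimal, hypothetical sketch of first-order differencing as an anomaly indicator for a daily water-level series, illustrating the general idea behind the differencing method mentioned above; it is not the MapSIS implementation, and the threshold rule and synthetic series are assumptions.

```python
import numpy as np

def differencing_anomalies(level, k=2.0):
    """Flag days whose first difference exceeds k standard deviations of the differenced series."""
    d = np.diff(level)
    threshold = k * np.std(d)
    return np.where(np.abs(d) > threshold)[0] + 1     # indices into the original series

rng = np.random.default_rng(0)
level = np.cumsum(rng.normal(0, 0.01, 365))           # synthetic daily water level (m)
level[200] += 0.2                                     # injected step-like anomaly
print(differencing_anomalies(level))
```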

8.
Good numerical modeling of groundwater flow and solute transport requires proper characterization of the formation properties. In this paper, we analyze the performance and important implementation details of a new approach for stochastic inverse modeling called inverse sequential simulation (iSS). This approach is capable of characterizing conductivity fields with heterogeneity patterns that are difficult to capture by standard multiGaussian-based inverse approaches. The method is based on the multivariate sequential simulation principle, but the covariances and cross-covariances used to compute the local conditional probability distributions by simple co-kriging are derived from an ensemble of conductivity and piezometric head fields, in a manner similar to how experimental covariances are computed in ensemble Kalman filtering. A sensitivity analysis is performed on a synthetic aquifer regarding the number of members in the ensemble of realizations, the number of conditioning data, the number of piezometers at which piezometric heads are observed, and the number of nodes retained within the search neighborhood when computing the local conditional probabilities. The results show the importance of having a sufficiently large number of all of these parameters for the algorithm to properly characterize hydraulic conductivity fields with clear non-multiGaussian features.
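A minimal sketch of the ensemble step described above: covariances and cross-covariances between log-conductivity at the grid nodes and the observed piezometric heads are estimated from an ensemble of realizations, in the same way experimental covariances are obtained in ensemble Kalman filtering. The array names, sizes, and the synthetic head–conductivity relation are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_ens, n_nodes, n_piezo = 200, 400, 10
logK = rng.normal(size=(n_ens, n_nodes))                 # ensemble of log-conductivity fields
heads = logK[:, :n_piezo] * 0.3 + rng.normal(scale=0.1, size=(n_ens, n_piezo))  # toy head responses

dK = logK - logK.mean(axis=0)                            # ensemble perturbations
dH = heads - heads.mean(axis=0)
# cross-covariance between every node's log-conductivity and every observed head
C_kh = dK.T @ dH / (n_ens - 1)                           # shape (n_nodes, n_piezo)
# covariance among the observed heads (enters the simple co-kriging system)
C_hh = dH.T @ dH / (n_ens - 1)                           # shape (n_piezo, n_piezo)
print(C_kh.shape, C_hh.shape)
```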

9.
Multiple-point geostatistical simulation is used to simulate the spatial structures of geological phenomena. In contrast to conventional two-point, variogram-based geostatistical methods, the multiple-point approach is capable of simulating the complex spatial patterns, shapes, and structures normally observed in geological media. A commonly used pattern-based multiple-point geostatistical simulation algorithm is FILTERSIM. In the conventional FILTERSIM algorithm, the patterns identified in training images are transformed into filter score space using fixed filters that depend neither on the training images nor on the characteristics of the patterns extracted from them. In this paper, we introduce two new methods, one for geostatistical simulation and another for conditioning the results. First, new filters are designed using principal component analysis so as to capture most of the structural information specific to the governing training images, resulting in the selection of closer patterns in filter score space. We then combine the adaptive filters with an overlap strategy along a raster path and an efficient conditioning method to develop a reservoir simulation algorithm with high accuracy and continuity, and we further combine it with image quilting to improve connectivity. The proposed method, which we call the random partitioning with adaptive filters simulation method, can be used for both continuous and discrete variables. Its results show a significant improvement in recovering the expected shapes and structural continuity in the final simulated realizations compared with the conventional FILTERSIM algorithm, and the algorithm is more than ten times faster than FILTERSIM because of the raster path and the small overlap, especially when image quilting is used.
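A minimal sketch, under assumed details, of designing adaptive filters with principal component analysis: patterns are extracted from a hypothetical training image, the leading principal components serve as filters, and each pattern's filter scores are its projections onto those components. The patch size, number of filters, and training image are illustrative choices, not the authors' settings.

```python
import numpy as np
from sklearn.decomposition import PCA

def extract_patterns(ti, size):
    """Slide a size x size template over the training image and flatten each pattern."""
    h, w = ti.shape
    pats = [ti[i:i + size, j:j + size].ravel()
            for i in range(h - size + 1) for j in range(w - size + 1)]
    return np.array(pats, dtype=float)

rng = np.random.default_rng(0)
ti = (rng.random((80, 80)) > 0.7).astype(float)        # hypothetical binary training image
patterns = extract_patterns(ti, size=9)

pca = PCA(n_components=6).fit(patterns)                # 6 data-adaptive filters
filters = pca.components_.reshape(6, 9, 9)             # each filter is a 9 x 9 template
scores = pca.transform(patterns)                       # filter-score space used for pattern search
print(filters.shape, scores.shape)
```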

10.
Multivariate simulation is an important longstanding problem in geostatistics. Fitting a model of coregionalization to many variables is intractable and often not permissible; however, the matrix of collocated correlation coefficients is often well informed. Performing a matrix simulation with an LU decomposition of the correlation matrix at each step of sequential simulation is implemented in some software. The target correlation matrix is not reproduced because of conditioning to local data and the particular variable ordering in the sequential/LU decomposition. A correction procedure is developed to calculate a modified correlation matrix that leads to reproduction of the target correlation matrix. The theoretical and practical aspects of this correction are developed.
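A minimal sketch of the LU-type step referred to above, written here with a Cholesky factor: collocated values of several variables are obtained by multiplying independent normal deviates by a lower-triangular factor of the target correlation matrix. The conditioning and variable-ordering effects that the paper corrects for are deliberately not shown, and the correlation matrix is a placeholder.

```python
import numpy as np

target_corr = np.array([[1.0, 0.7, 0.3],
                        [0.7, 1.0, 0.5],
                        [0.3, 0.5, 1.0]])
L = np.linalg.cholesky(target_corr)                    # lower-triangular factor of the correlation matrix

rng = np.random.default_rng(0)
n_nodes = 100_000
z = rng.standard_normal((n_nodes, 3)) @ L.T            # correlated standard normal deviates
print(np.corrcoef(z, rowvar=False).round(2))           # reproduces target_corr in the unconditional case
```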

11.
The variogram is a key parameter for geostatistical estimation and simulation. Preferential sampling may bias the spatial structure and often leads to noisy and unreliable variograms. A novel technique is proposed to weight variogram pairs in order to compensate for preferential or clustered sampling. Weighting the variogram pairs by global kriging of the quadratic differences between the tail and head values gives each pair an appropriate weight, removes noise, and minimizes artifacts in the experimental variogram. Moreover, variogram uncertainty can be computed with this technique. The required covariance between the pairs entering the variogram calculation is a fourth-order covariance that must be calculated from second-order moments. This introduces some circularity into the calculation, whereby an initial variogram must be assumed before calculating how the pairs should be weighted for the experimental variogram. The methodology is assessed with synthetic and realistic examples. For the synthetic example, a comparison between the traditional and declustered variograms shows that the declustered variograms are better estimates of the true underlying variograms. The realistic example also shows that the declustered sample variogram is closer to the true variogram.
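A minimal sketch of a pair-weighted experimental variogram: each pair contributes its squared difference with a weight, so that pairs from clustered samples can be down-weighted. The kriging-based derivation of the weights described above is not reproduced; the weights, data, and lag definition are assumptions for illustration.

```python
import numpy as np

def weighted_variogram(coords, values, weights, lags, tol):
    """Return the pair-weighted experimental semivariogram at the requested lag centers."""
    n = len(values)
    i, j = np.triu_indices(n, k=1)
    h = np.linalg.norm(coords[i] - coords[j], axis=1)       # pair separation distances
    sq = 0.5 * (values[i] - values[j]) ** 2                 # semivariance contribution of each pair
    w = weights[i] * weights[j]                             # one simple choice of pair weight
    gamma = []
    for lag in lags:
        mask = np.abs(h - lag) < tol
        gamma.append(np.sum(w[mask] * sq[mask]) / np.sum(w[mask]))
    return np.array(gamma)

rng = np.random.default_rng(0)
coords = rng.random((150, 2)) * 100
values = np.sin(coords[:, 0] / 15) + 0.2 * rng.normal(size=150)
weights = np.ones(150)                                      # e.g., declustering weights would go here
print(weighted_variogram(coords, values, weights, lags=np.arange(5, 50, 5), tol=2.5))
```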

12.
The sequential algorithm is widely used to simulate Gaussian random fields. However, a rigorous application of this algorithm is impractical and some simplifications are required; in particular, a moving neighborhood has to be defined. To examine the effect of this restriction on the quality of the realizations, a reference case is presented and several parameters are reviewed, mainly the histogram, variogram, and indicator variograms, as well as the ergodic fluctuations in the first- and second-order statistics. The study concludes that, even in a favorable case where the simulated domain is large with respect to the range of the model, the realizations may poorly reproduce the second-order statistics and be inconsistent with the stationarity and ergodicity assumptions. Practical tips such as the multiple-grid strategy do not overcome these impediments. Finally, extending the original algorithm by using ordinary kriging should be avoided, unless an intrinsic random function model is sought.

13.
A data assimilation method is developed to calibrate a heterogeneous hydraulic conductivity field by conditioning on transient pumping test data. The ensemble Kalman filter (EnKF) approach is used to update model parameters such as hydraulic conductivity and model variables such as hydraulic head using the available data. A synthetic two-dimensional flow case is used to assess the capability of the EnKF method to calibrate a heterogeneous conductivity field by assimilating transient flow data from observation wells under different hydraulic boundary conditions. The results indicate that the EnKF method significantly improves the estimation of the hydraulic conductivity field by assimilating continuous hydraulic head measurements, and that the hydraulic boundary condition significantly affects the simulation results. In our cases, after a few data assimilation steps, the assimilated conductivity field with four Neumann boundaries matches the real field well, whereas the assimilated conductivity field with mixed Dirichlet and Neumann boundaries does not. We found in our cases that the ensemble size should be 300 or larger for the numerical simulation. The number and locations of the observation wells also significantly affect the calibration of the hydraulic conductivity field.
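A minimal sketch of the EnKF analysis step used to update a stacked state of log-conductivities and heads with head observations; the matrix names follow common EnKF notation, the observation operator simply picks simulated heads at the wells, and all sizes and data are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_ens, n_state, n_obs = 300, 500, 8                    # state = log-K values and heads stacked
X = rng.normal(size=(n_state, n_ens))                  # forecast ensemble (one column per member)

H = np.zeros((n_obs, n_state))
H[np.arange(n_obs), np.arange(n_state - n_obs, n_state)] = 1.0   # observe the last n_obs state entries

d = rng.normal(size=n_obs)                             # observed heads at this assimilation step
R = 0.01 * np.eye(n_obs)                               # observation-error covariance

Xp = X - X.mean(axis=1, keepdims=True)                 # ensemble perturbations
P_HT = Xp @ (H @ Xp).T / (n_ens - 1)                   # P H^T estimated from the ensemble
HP_HT = (H @ Xp) @ (H @ Xp).T / (n_ens - 1)            # H P H^T estimated from the ensemble
K = P_HT @ np.linalg.inv(HP_HT + R)                    # Kalman gain

D = d[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, n_ens).T   # perturbed observations
Xa = X + K @ (D - H @ X)                               # updated (analysis) ensemble
print(Xa.shape)
```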

14.
Histogram and variogram inference in the multigaussian model
Several iterative algorithms are proposed to improve the histogram and variogram inference in the framework of the multigaussian model. The starting point is the variogram obtained after a traditional normal score transform. The subsequent step consists of simulating many sets of Gaussian values with this variogram at the data locations, so that the ranking of the original values is honored. The expected Gaussian transformation and the expected variogram are computed by averaging over the simulated datasets. The variogram model is then updated and the procedure is repeated until convergence. Such an iterative algorithm can adapt to the case of tied data and despike the histogram. Two additional issues are also examined: the modeling of the empirical transformation function and the optimal pair weighting when computing the sample variogram.
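A minimal sketch of the traditional normal score transform that the iterative procedure starts from: data are ranked and mapped to standard normal quantiles. The tie handling (despiking) that the paper addresses is reduced here to a random tie-break; the function name and example data are illustrative.

```python
import numpy as np
from scipy.stats import norm, rankdata

def normal_score_transform(values, rng=None):
    """Map data to standard normal quantiles via their (randomly tie-broken) ranks."""
    if rng is None:
        rng = np.random.default_rng(0)
    jitter = rng.normal(scale=1e-9, size=len(values))   # crude random despiking of ties
    ranks = rankdata(values + jitter)
    return norm.ppf((ranks - 0.5) / len(values))

data = np.array([1.2, 3.4, 3.4, 0.7, 5.1, 2.2])
print(normal_score_transform(data).round(3))
```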

15.
An evaluation of conditioning data for solute transport prediction
Scheibe TD, Chien YJ. Ground Water, 2003, 41(2): 128–141
The large and diverse body of subsurface characterization data generated at a field research site near Oyster, Virginia, provides a unique opportunity to test the impact of conditioning data of various types on predictions of flow and transport. Bromide breakthrough curves (BTCs) were measured during a forced-gradient local-scale injection experiment conducted in 1999. Observed BTCs are available at 140 sampling points in a three-dimensional array within the transport domain. A detailed three-dimensional numerical model is used to simulate breakthrough curves at the same locations as the observed BTCs under varying assumptions regarding the character of hydraulic conductivity spatial distributions, and variable amounts and types of conditioning data. We present comparative results of six cases ranging from simple (deterministic homogeneous models) to complex (stochastic indicator simulation conditioned to cross-borehole geophysical observations). Quantitative measures of model goodness-of-fit are presented. The results show that conditioning to a large number of small-scale measurements does not significantly improve model predictions, and may lead to biased or overly confident predictions. However, conditioning to geophysical interpretations with larger spatial support significantly improves the accuracy and precision of model predictions. In all cases, the effects of model error appear to be significant in relation to parameter uncertainty.

16.
Temporal and spatial rainfall patterns were analysed to describe the distribution of daily rainfall across a medium‐sized (379 km²) tropical catchment. Investigations were carried out to assess whether a climatological variogram model was appropriate for mapping rainfall, taking into consideration the changing rainfall characteristics through the wet season. Exploratory, frequency and moving average analyses of 30 years' daily precipitation data were used to describe the reliability and structure of the rainfall regime. Four phases in the wet season were distinguished, with the peak period (mid‐August to mid‐September) representing the wettest period. A low‐cost rain gauge network of 36 plastic gauges with overflow reservoirs was installed and monitored to obtain spatially distributed rainfall data. Geostatistical techniques were used to develop global and wet season phase climatological variograms. The unscaled climatological variograms were cross‐validated and compared using a range of rainfall events. Ordinary kriging was used as the interpolation method. The global climatological variogram performed better, and was used to optimize the number and location of rain gauges in the network. The research showed that although distinct wet season phases could be established based on the temporal analysis of daily rainfall characteristics, the interpolation of daily rainfall across a medium‐sized catchment based on spatial analysis was better served by using the global rather than the wet season phase climatological variogram model. Copyright © 2001 John Wiley & Sons, Ltd.

17.
A comparison of two stochastic inverse methods in a field-scale application
Inverse modeling is a useful tool in ground water flow modeling studies. The most frequent difficulties encountered when using this technique are the lack of conditioning information (e.g., heads and transmissivities), the uncertainty in available data, and the nonuniqueness of the solution. These problems can be addressed and quantified through a stochastic Monte Carlo approach. The aim of this work was to compare the applicability of two stochastic inverse modeling approaches in a field-scale application. The multi-scaling (MS) approach uses a downscaling parameterization procedure that is not based on geostatistics. The pilot point (PP) approach uses geostatistical random fields as initial transmissivity values and an experimental variogram to condition the calibration. The studied area (375 km²) is part of a regional aquifer, northwest of Montreal in the St. Lawrence lowlands (southern Québec). It is located in limestone, dolomite, and sandstone formations, and is mostly a fractured porous medium. The MS approach generated small errors on heads, but the calibrated transmissivity fields did not reproduce the variogram of observed transmissivities. The PP approach generated larger errors on heads but better reproduced the spatial structure of observed transmissivities. The PP approach was also less sensitive to uncertainty in head measurements. If reliable heads are available but no transmissivities are measured, the MS approach provides useful results. If reliable transmissivities with a well-inferred spatial structure are available, then the PP approach is a better alternative. This approach, however, must be used with caution if measured transmissivities are not reliable.

18.
Spatial prediction of river channel topography by kriging
Topographic information is fundamental to geomorphic inquiry, and spatial prediction of bed elevation from irregular survey data is an important component of many reach‐scale studies. Kriging is a geostatistical technique for obtaining these predictions along with measures of their reliability, and this paper outlines a specialized framework intended for application to river channels. Our modular approach includes an algorithm for transforming the coordinates of data and prediction locations to a channel‐centered coordinate system, several different methods of representing the trend component of topographic variation and search strategies that incorporate geomorphic information to determine which survey data are used to make a prediction at a specific location. For example, a relationship between curvature and the lateral position of maximum depth can be used to include cross‐sectional asymmetry in a two‐dimensional trend surface model, and topographic breaklines can be used to restrict which data are retained in a local neighborhood around each prediction location. Using survey data from a restored gravel‐bed river, we demonstrate how transformation to the channel‐centered coordinate system facilitates interpretation of the variogram, a statistical model of reach‐scale spatial structure used in kriging, and how the choice of a trend model affects the variogram of the residuals from that trend. Similarly, we show how decomposing kriging predictions into their trend and residual components can yield useful information on channel morphology. Cross‐validation analyses involving different data configurations and kriging variants indicate that kriging is quite robust and that survey density is the primary control on the accuracy of bed elevation predictions. The root mean‐square error of these predictions is directly proportional to the spacing between surveyed cross‐sections, even in a reconfigured channel with a relatively simple morphology; sophisticated methods of spatial prediction are no substitute for field data. Copyright © 2007 John Wiley & Sons, Ltd.
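A minimal sketch of a channel-centered coordinate transform of the kind described above: each survey point receives a downstream coordinate s (cumulative distance along the centerline) and a signed cross-stream coordinate n. Snapping to the nearest centerline vertex is a simplification for brevity; a production version would project onto centerline segments, and all names and data here are assumptions.

```python
import numpy as np

def channel_coordinates(points, centerline):
    """Return (s, n) coordinates of points relative to a polyline centerline."""
    seg = np.diff(centerline, axis=0)
    s_vertex = np.concatenate([[0.0], np.cumsum(np.linalg.norm(seg, axis=1))])
    s, n = [], []
    for p in points:
        k = np.argmin(np.linalg.norm(centerline - p, axis=1))   # nearest centerline vertex
        t = seg[min(k, len(seg) - 1)]
        t = t / np.linalg.norm(t)                               # local downstream direction
        offset = p - centerline[k]
        s.append(s_vertex[k] + offset @ t)                      # along-channel distance
        n.append(offset @ np.array([-t[1], t[0]]))              # signed cross-stream offset
    return np.array(s), np.array(n)

# hypothetical sinuous centerline and three survey points
centerline = np.column_stack([np.linspace(0, 100, 50), 5 * np.sin(np.linspace(0, 3, 50))])
survey = np.array([[10.0, 2.0], [50.0, -1.0], [90.0, 4.0]])
print(channel_coordinates(survey, centerline))
```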

19.
Modern methods of geostatistics make an essential contribution to Environmental Impact Assessment (EIA). These methods allow for spatial interpolation, forecasting, and risk assessment of the expected impact during and after mining projects by integrating different sources of data and information. Geostatistical estimation and simulation algorithms are designed to provide both a most likely forecast and information about the accuracy of the prediction. The representativeness of these measures depends strongly on the quality of the inferred model parameters, which are mainly defined by the parameters of the variogram or the covariance function. Available data may be sparse, trend affected, and of different types, making the inference of representative geostatistical model parameters difficult. This contribution introduces a new method for best fitting the geostatistical model parameters in the presence of a trend, which utilizes the empirical and theoretical differences between universal kriging and trend predictions. The method extends well-known cross-validation approaches in two respects. First, the model evaluation is not limited to sample data locations but is performed at any prediction location of the attribute in the domain. Second, it extends the single-point replacement measure used in cross-validation by using error curves. These allow rings of influence to be defined, representing errors resulting from separate variogram lags. By analyzing the different variogram lags, the fit of the complete covariance can be assessed and the influence of the individual model parameters separated. The use of the proposed method in an EIA context is illustrated in a case study on the prediction of mining-induced ground movements.

20.
Sand lenses at various spatial scales are recognized to add heterogeneity to glacial sediments. They have high hydraulic conductivities relative to the surrounding till matrix and may affect the advective transport of water and contaminants in clayey till settings. Sand lenses were investigated on till outcrops, producing binary images of geological cross‐sections that capture the size, shape and distribution of individual features. Sand lenses occur as elongated, anisotropic geobodies that vary in size and extent. Moreover, sand lenses show strongly non‐stationary patterns on the section images, which hampers subsequent simulation. Transition probability (TP) and multiple‐point statistics (MPS) were employed to simulate sand lens heterogeneity. We used one cross‐section to parameterize the spatial correlation and a second, parallel section as a reference: this allowed testing the quality of the simulations as a function of the amount of conditioning data under realistic conditions. The performance of the simulations was evaluated on the faithful reproduction of the specific geological structure caused by sand lenses. Multiple‐point statistics offer a better reproduction of sand lens geometry. However, two‐dimensional training images acquired by outcrop mapping are of limited use for generating three‐dimensional realizations with MPS. One can instead split the 3D domain into a set of slices in various directions that are sequentially simulated and reassembled into a 3D block. The identification of flow paths through a network of elongated sand lenses and their impact on the equivalent permeability in tills are essential for solute transport modeling in these low‐permeability sediments.
