Similar Documents
20 similar documents retrieved (search time: 31 ms)
1.
In porous aquifers, groundwater flow and solute transport strongly depend on the sedimentary facies distribution at fine scale, which determines the heterogeneity of the conductivity field; in particular, connected permeable sediments could form preferential flow paths. Therefore, properly defined statistics, e.g. total and intrinsic facies connectivity, should be correlated with transport features. In order to improve the assessment of the relevance of this relationship, some tests are conducted on two ensembles of equiprobable realizations, obtained with two different geostatistical simulation methods—sequential indicator simulation and multiple point simulation (MPS)—from the same dataset, which refers to an aquifer analogue of sediments deposited in a fluvial point-bar/channel association. The ensembles show different features; simulations with MPS are more structured and characterised by preferential flow paths. This is confirmed by the analysis of transport connectivities and by the interpretation of data from numerical experiments of conservative solute transport with single and dual domain models. The use of two ensembles permits (1) previous results obtained for single realizations to be consolidated on a firmer statistical basis and (2) principal component analysis to be applied to assess which quantities are statistically the most relevant for the relationship between connectivity indicators and flow and transport properties.

2.
Two methods for generating representative realizations from Gaussian and lognormal random field models are studied in this paper, with the term representative implying realizations efficiently spanning the range of possible attribute values corresponding to the multivariate (log)normal probability distribution. The first method, already established in the geostatistical literature, is multivariate Latin hypercube sampling, a form of stratified random sampling aiming at marginal stratification of simulated values for each variable involved under the constraint of reproducing a known covariance matrix. The second method, scarcely known in the geostatistical literature, is stratified likelihood sampling, in which representative realizations are generated by exploring in a systematic way the structure of the multivariate distribution function itself. The two sampling methods are employed for generating unconditional realizations of saturated hydraulic conductivity in a hydrogeological context via a synthetic case study involving physically-based simulation of flow and transport in a heterogeneous porous medium; their performance is evaluated for different sample sizes (number of realizations) in terms of the reproduction of ensemble statistics of hydraulic conductivity and solute concentration computed from a very large ensemble set generated via simple random sampling. The results show that both Latin hypercube and stratified likelihood sampling are more efficient than simple random sampling, in that overall they can reproduce to a similar extent statistics of the conductivity and concentration fields, yet with smaller sampling variability than simple random sampling.
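The marginal stratification idea behind Latin hypercube sampling can be sketched in a few lines. The snippet below is a minimal illustration, assuming a simple per-cell stratification of the standard normal; it deliberately ignores the covariance-reproduction constraint mentioned in the abstract, and all names and parameter values are hypothetical.

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(7)

def latin_hypercube_normal(n_real, n_cells, rng):
    """For each cell (column), split [0, 1] into n_real equal-probability
    strata, draw exactly one uniform value per stratum, shuffle the stratum
    order independently per cell, and map through the N(0,1) quantile."""
    strata = np.arange(n_real)[:, None]
    u = (strata + rng.random((n_real, n_cells))) / n_real  # one draw per stratum
    for j in range(n_cells):
        rng.shuffle(u[:, j])                               # decorrelate cells
    z = np.vectorize(NormalDist().inv_cdf)(u)              # N(0,1) scores
    return u, z

n_real, n_cells = 20, 5
u, z = latin_hypercube_normal(n_real, n_cells, rng)
K = np.exp(1.0 + 0.5 * z)  # lognormal conductivity; log-mean/log-std are made up
```

Each column receives exactly one draw per probability stratum, which is the property that reduces sampling variability relative to simple random sampling.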

3.
In earth and environmental sciences applications, uncertainty analysis regarding the outputs of models whose parameters are spatially varying (or spatially distributed) is often performed in a Monte Carlo framework. In this context, alternative realizations of the spatial distribution of model inputs, typically conditioned to reproduce attribute values at locations where measurements are obtained, are generated via geostatistical simulation using simple random (SR) sampling. The environmental model under consideration is then evaluated using each of these realizations as a plausible input, in order to construct a distribution of plausible model outputs for uncertainty analysis purposes. In hydrogeological investigations, for example, conditional simulations of saturated hydraulic conductivity are used as input to physically-based simulators of flow and transport to evaluate the associated uncertainty in the spatial distribution of solute concentration. Realistic uncertainty analysis via SR sampling, however, requires a large number of simulated attribute realizations for the model inputs in order to yield a representative distribution of model outputs; this often hinders the application of uncertainty analysis due to the computational expense of evaluating complex environmental models. Stratified sampling methods, including variants of Latin hypercube sampling, constitute more efficient sampling alternatives, often resulting in a more representative distribution of model outputs (e.g., solute concentration) with fewer model input realizations (e.g., hydraulic conductivity), thus reducing the computational cost of uncertainty analysis. The application of stratified and Latin hypercube sampling in a geostatistical simulation context, however, is not widespread, and, apart from a few exceptions, has been limited to the unconditional simulation case.
This paper proposes methodological modifications for adopting existing methods for stratified sampling (including Latin hypercube sampling), employed to date in an unconditional geostatistical simulation context, for the purpose of efficient conditional simulation of Gaussian random fields. The proposed conditional simulation methods are compared to traditional geostatistical simulation, based on SR sampling, in the context of a hydrogeological flow and transport model via a synthetic case study. The results indicate that stratified sampling methods (including Latin hypercube sampling) are more efficient than SR, overall reproducing to a similar extent statistics of the conductivity (and subsequently concentration) fields, yet with smaller sampling variability. These findings suggest that the proposed efficient conditional sampling methods could contribute to the wider application of uncertainty analysis in spatially distributed environmental models using geostatistical simulation.

4.
An efficient method to upscale hydraulic conductivity (K) from detailed three-dimensional geostatistical models of hydrofacies heterogeneity to a coarser model grid is presented. Geologic heterogeneity of an alluvial fan system was characterized using transition-probability-based geostatistical simulations of hydrofacies distributions. For comparison of different hydrofacies architectures, two alternative models with different hydrofacies structures and geometries and a multi-Gaussian model, all with the same mean and variance in K, were created. Upscaling was performed on five realizations of each of the geostatistical models using the arithmetic and harmonic means of the K-values within vertical grid columns. The effects of upscaling on model domain equivalent K were investigated by means of steady-state flow simulations. A logarithmic increase in model domain equivalent K with increasing upscaling was found for all fields. The shape of that upscaling function depended on the structure and geometry of the hydrofacies bodies. For different realizations of one geostatistical model, however, the upscaling function was the same. From the upscaling function a factor could be calculated to correct the upscaled K-fields for the local effects of upscaling.
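As a minimal sketch of the column-wise upscaling step (hypothetical values, NumPy only), the following computes the arithmetic and harmonic means of fine-scale K within coarse blocks; these are the classical bounds on equivalent conductivity for flow parallel and perpendicular to layering.

```python
import numpy as np

def upscale_column(K, factor):
    """Merge 'factor' fine cells into one coarse cell along a vertical
    column: the arithmetic mean bounds equivalent K for flow parallel to
    layering, the harmonic mean for flow perpendicular to it."""
    K = np.asarray(K, dtype=float).reshape(-1, factor)
    arithmetic = K.mean(axis=1)
    harmonic = factor / (1.0 / K).sum(axis=1)
    return arithmetic, harmonic

K_fine = np.array([1.0, 10.0, 1.0, 10.0])  # e.g. alternating fine/coarse facies
a, h = upscale_column(K_fine, 2)
# a -> [5.5, 5.5]; h -> [1.818..., 1.818...]; harmonic <= arithmetic always
```

The gap between the two means widens with contrast in K, which is one way the hydrofacies geometry feeds into the upscaling function described above.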

5.
Reservoir characterization needs the integration of various data through history matching, especially dynamic information such as production or 4D seismic data. Although reservoir heterogeneities are commonly generated using geostatistical models, random realizations cannot generally match observed dynamic data. To constrain model realizations to reproduce measured dynamic data, an optimization procedure may be applied in an attempt to minimize an objective function, which quantifies the mismatch between real and simulated data. Such assisted history matching methods require a parameterization of the geostatistical model to allow the updating of an initial model realization. However, there are only a few parameterization methods available to update geostatistical models in a way consistent with the underlying geostatistical properties. This paper presents a local domain parameterization technique that updates geostatistical realizations using assisted history matching. This technique allows us to locally change model realizations through the variation of geometrical domains whose geometry and size can be easily controlled and parameterized. This approach provides a new way to parameterize geostatistical realizations in order to improve history matching efficiency.

6.
Transmissivity and head data are sampled from an exhaustive synthetic reference field and used to predict the arrival positions and arrival times of a number of particles transported across the field, together with an uncertainty estimate. Different combinations of number of transmissivity data and number of head data used are considered in each one of a series of 64 Monte-Carlo analyses. In each analysis, 250 realizations of transmissivity fields conditioned to both transmissivity and head data are generated using a novel geostatistically based inverse method. Pooling the solutions of the flow and transport equations in all 250 realizations allows building conditional frequency distributions for particle arrival positions and arrival times. By comparing these frequency distributions, we can assess the incremental gain that additional head data provide. The main conclusion is that the first few head data dramatically improve the quality of transport predictions.

7.
A hierarchical scale-up framework is formulated to study the scaling characteristics of reservoir attributes and input dispersivities at the transport modeling scale, where heterogeneity distribution exhibits both non-stationarity (trend) and sub-scale variability. The proposed method is flexible to handle heterogeneities occurring at multiple scales, without any explicit assumption regarding the multivariate distribution of the heterogeneity. This paper extends our previous work by incorporating the effects of non-stationarity into the modeling workflow. Rock property at a given location is modeled as a random variable, which is decomposed into the sum of a trend (available on the same resolution of the transport modeling scale) and a residual component (defined at a much smaller scale). First, to scale up the residual component to the transport modeling scale, the corresponding volume variance is computed; by sampling numerous sets of “conditioning data” via bootstrapping and constructing multiple realizations of the residual components at the transport modeling scale, uncertainty due to this scale-up process is captured. Next, to compute the input dispersivity at the transport modeling scale, a flow-based technique is adopted: multiple geostatistical realizations of the same physical size as the transport modeling scale are generated to describe the spatial heterogeneity below the modeling scale. Each realization is subjected to particle-tracking simulation. Effective longitudinal and transverse dispersivities are estimated by minimizing the difference in effluent history for each realization and that of an equivalent average medium. Probability distributions of effective dispersivities are established by aggregating results from all realizations. The results demonstrate that large-scale non-stationarity and sub-scale variability both contribute to anomalous non-Fickian behavior.
In comparison with our previous work, which ignored large-scale non-stationarity, the non-Fickian characteristics observed in this study are dramatically more pronounced.

8.
In the analysis of petroleum reservoirs, one of the most challenging problems is to use inverse theory in the search for an optimal parameterization of the reservoir. Generally, scientists approach this problem by computing a sensitivity matrix and then perform a singular value decomposition in order to determine the number of degrees of freedom, i.e. the number of independent parameters necessary to specify the configuration of the system. Here we propose a complementary approach: it uses the concept of refinement indicators to select those degrees of freedom which have the greatest sensitivity to an objective function quantifying the mismatch between measured and simulated data. We apply this approach to the problem of data integration for petrophysical reservoir characterization, where geoscientists are currently working with multimillion cell geological models. Data integration may be performed by gradually deforming (by a linear combination) a set of these multimillion grid geostatistical realizations during the optimization process. The inversion parameters are then reduced to the number of coefficients of this linear combination. However, there are infinitely many geostatistical realizations to choose from, and evaluating many of them would not be efficient under operational constraints. Following our new approach, we are able through a single objective function evaluation to compute refinement indicators that indicate which realizations might improve the iterative geological model in a significant way. This computation is extremely fast as it implies a single gradient computation through the adjoint state approach and dot products. Using only the most sensitive realizations from a given set, we are able to solve the optimization problem more quickly. We applied this methodology to the integration of interference test data into 3D geostatistical models.

9.
Representing Spatial Uncertainty Using Distances and Kernels
Assessing uncertainty of a spatial phenomenon requires the analysis of a large number of parameters which must be processed by a transfer function. To capture the possibly wide range of uncertainty in the transfer function response, a large set of geostatistical model realizations needs to be processed. Stochastic spatial simulation can rapidly provide multiple, equally probable realizations. However, since the transfer function is often computationally demanding, only a small number of models can be evaluated in practice, and are usually selected through a ranking procedure. Traditional ranking techniques for selection of probabilistic ranges of response (P10, P50 and P90) are highly dependent on the static property used. In this paper, we propose to parameterize the spatial uncertainty represented by a large set of geostatistical realizations through a distance function measuring “dissimilarity” between any two geostatistical realizations. The distance function allows a mapping of the space of uncertainty. The distance can be tailored to the particular problem. The multi-dimensional space of uncertainty can be modeled using kernel techniques, such as kernel principal component analysis (KPCA) or kernel clustering. These tools allow for the selection of a subset of representative realizations containing similar properties to the larger set. Without losing accuracy, decisions and strategies can then be performed applying a transfer function on the subset without the need to exhaustively evaluate each realization. This method is applied to a synthetic oil reservoir, where spatial uncertainty of channel facies is modeled through multiple realizations generated using a multi-point geostatistical algorithm and several training images.
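A bare-bones version of this distance-based workflow might look as follows. This is a sketch under simplifying assumptions: the dissimilarity is a plain Euclidean distance on synthetic stand-in "realizations" rather than a flow-based measure, the mapping uses classical multidimensional scaling (equivalent to an eigendecomposition of the double-centered distance kernel), and group labels are assumed known instead of coming from kernel clustering.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in ensemble: 30 "realizations" (flattened 10x10 grids) drawn around
# three distinct scenarios; entirely synthetic example data.
centers = rng.normal(size=(3, 100))
ensemble = np.vstack([c + 0.1 * rng.normal(size=(10, 100)) for c in centers])

# 1. Pairwise dissimilarity (plain Euclidean here; in practice the distance
#    would be tailored to the transfer-function response of interest).
D = np.linalg.norm(ensemble[:, None, :] - ensemble[None, :, :], axis=2)

# 2. Classical MDS: eigendecomposition of the double-centered distance kernel
#    gives a low-dimensional map of the space of uncertainty.
n = len(D)
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ (D ** 2) @ J
w, V = np.linalg.eigh(B)                               # ascending eigenvalues
coords = V[:, -2:] * np.sqrt(np.maximum(w[-2:], 0.0))  # 2-D embedding

# 3. One representative (medoid) per group: smallest summed within-group
#    distance. Labels are known here; kernel clustering would supply them.
labels = np.repeat(np.arange(3), 10)
medoids = [int(np.argmin(D[labels == k][:, labels == k].sum(axis=1))) + 10 * k
           for k in range(3)]
```

The transfer function would then be evaluated only on the selected medoids rather than the full ensemble.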

10.
Sedimentological processes often result in complex three-dimensional subsurface heterogeneity of hydrogeological parameter values. Variogram-based stochastic approaches are often not able to describe heterogeneity in such complex geological environments. This work shows how multiple-point geostatistics can be applied in a realistic hydrogeological application to determine the impact of complex geological heterogeneity on groundwater flow and transport. The approach is applied to a real aquifer in Belgium that exhibits a complex sedimentary heterogeneity and anisotropy. A training image is constructed based on geological and hydrogeological field data. Multiple-point statistics are borrowed from this training image to simulate hydrofacies occurrence, while intrafacies permeability variability is simulated using conventional variogram-based geostatistical methods. The simulated hydraulic conductivity realizations are used as input to a groundwater flow and transport model to investigate the effect of small-scale sedimentary heterogeneity on contaminant plume migration. Results show that small-scale sedimentary heterogeneity has a significant effect on contaminant transport in the studied aquifer. The uncertainty in the spatial facies distribution and intrafacies hydraulic conductivity distribution results in a significant uncertainty in the calculated concentration distribution. Comparison with standard variogram-based techniques shows that multiple-point geostatistics allow better reproduction of irregularly shaped low-permeability clay drapes that influence solute transport.

11.
Geophysical techniques can help to bridge the inherent gap that exists with regard to spatial resolution and coverage for classical hydrological methods. This has led to the emergence of a new and rapidly growing research domain generally referred to as hydrogeophysics. Given the differing sensitivities of various geophysical techniques to hydrologically relevant parameters, their inherent trade-off between resolution and range, as well as the notoriously site-specific nature of petrophysical parameter relations, the fundamental usefulness of multi-method surveys for reducing uncertainties in data analysis and interpretation is widely accepted. A major challenge arising from such endeavors is the quantitative integration of the resulting vast and diverse database into a unified model of the probed subsurface region that is consistent with all available measurements. To this end, we present a novel approach toward hydrogeophysical data integration based on a Monte-Carlo-type conditional stochastic simulation method that we consider to be particularly suitable for high-resolution local-scale studies. Monte Carlo techniques are flexible and versatile, allowing a wide variety of data and constraints of differing resolution and hardness to be accounted for, and thus have the potential of providing, in a geostatistical sense, realistic models of the pertinent target parameter distributions. Compared to more conventional approaches, such as co-kriging or cluster analysis, our approach provides significant advancements in the way that larger-scale structural information contained in the hydrogeophysical data can be accounted for. After outlining the methodological background of our algorithm, we present the results of its application to the integration of porosity log and tomographic crosshole georadar data to generate stochastic realizations of the detailed local-scale porosity structure.
Our procedure is first tested on pertinent synthetic data and then applied to a field dataset collected at the Boise Hydrogeophysical Research Site. Finally, we compare the performance of our data integration approach to that of more conventional methods with regard to the prediction of flow and transport phenomena in highly heterogeneous media and discuss the implications arising.

12.
Geologic uncertainties and limited well data often render recovery forecasting a difficult undertaking in typical appraisal and early development settings. Recent advances in geologic modeling algorithms permit automation of the model generation process via macros and geostatistical tools. This allows rapid construction of multiple alternative geologic realizations. Despite the advances in geologic modeling, computation of the reservoir dynamic response via full-physics reservoir simulation remains a computationally expensive task. Therefore, only a few of the many probable realizations are simulated in practice. Experimental design techniques typically focus on a few discrete geologic realizations as they are inherently more suitable for continuous engineering parameters and can only crudely approximate the impact of geology. A flow-based pattern recognition algorithm (FPRA) has been developed for quantifying the forecast uncertainty as an alternative. The proposed algorithm relies on the rapid characterization of the geologic uncertainty space represented by an ensemble of sufficiently diverse static model realizations. FPRA characterizes the geologic uncertainty space by calculating connectivity distances, which quantify how different each individual realization is from all others in terms of recovery response. Fast streamline simulations are employed in evaluating these distances. By applying pattern recognition techniques to connectivity distances, a few representative realizations are identified within the model ensemble for full-physics simulation. In turn, the recovery factor probability distribution is derived from these intelligently selected simulation runs. Here, FPRA is tested on an example case where the objective is to accurately compute the recovery factor statistics as a function of geologic uncertainty in a channelized turbidite reservoir. 
Recovery factor cumulative distribution functions computed by FPRA compare well to those computed via exhaustive full-physics simulations.

13.
Modern geostatistical techniques allow the generation of high-resolution heterogeneous models of hydraulic conductivity containing millions to billions of cells. Selective upscaling is a numerical approach for the change of scale of fine-scale hydraulic conductivity models into coarser scale models that are suitable for numerical simulations of groundwater flow and mass transport. Selective upscaling uses an elastic gridding technique to selectively determine the geometry of the coarse grid by an iterative procedure. The geometry of the coarse grid is built so that the variances of flow velocities within the coarse blocks are minimum. Selective upscaling is able to handle complex geological formations and flow patterns, and provides a full hydraulic conductivity tensor for each block. Selective upscaling is applied to a cross-bedded formation in which the fine-scale hydraulic conductivities are full tensors with principal directions not parallel to the statistical anisotropy of their spatial distribution. Mass transport results from three coarse-scale models constructed by different upscaling techniques are compared to the fine-scale results for different flow conditions. Selective upscaling provides coarse grids in which mass transport simulation is in good agreement with the fine-scale simulations, and consistently superior to simulations on traditional regular (equal-sized) grids or elastic grids built without accounting for flow velocities.

14.
The heterogeneity of facies at the scale of individual lithological levels controls, at a macroscopic scale, water flow and contaminant transport in porous sediments. In particular, the presence of organized features, such as connected permeable levels, has a significant effect on travel times and dispersion. Here, the effects of facies heterogeneity on flow and transport are studied for three blocks, whose volume is of the order of a cubic meter, dug from alluvial sediments from the Ticino valley (Italy). Using the results of numerical tracer experiments on these domains, the longitudinal dispersion coefficient is computed with an Eulerian approach based on the fit of the breakthrough curves with the analytical solution of the convective-dispersive transport equation. Moreover, the dispersion tensor is computed with a Lagrangian approach from the second order moments of particle distributions. Three types of connectivity indicators are tested: (1) the connectivity function; (2) flow, transport and statistical connectivity; (3) original (intrinsic, normal and total) indicators of facies connectivity. The connectivity function provides the most complete information. Some of the transport and statistical connectivity indicators are correlated with dispersivity. The simultaneous analysis of the three indicators of facies connectivity emphasizes the fundamental geometrical features that control transport.
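As an illustration of one simple geometric connectivity indicator (not the specific indicators of this study), the sketch below computes, on a synthetic binary facies grid, the probability that two permeable cells a lag h apart belong to the same connected component; it assumes SciPy's `ndimage.label` for component labeling, and all grid parameters are made up.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(9)

facies = (rng.random((40, 40)) < 0.55).astype(int)  # 1 = permeable (synthetic)
labels, n_components = ndimage.label(facies)        # 4-connected components

def connectivity(labels, h):
    """Fraction of permeable cell pairs, separated by lag h along the rows,
    that fall in the same connected component."""
    a, b = labels[:, :-h], labels[:, h:]
    both = (a > 0) & (b > 0)
    return float(np.mean(a[both] == b[both])) if both.any() else 0.0

tau = [connectivity(labels, h) for h in (1, 5, 10, 20)]  # connectivity vs. lag
```

At lag 1 adjacent permeable cells are connected by construction, so the indicator starts at one and decays with lag; how fast it decays reflects the organization of the permeable bodies.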

15.
Seismic inverse modeling, which transforms appropriately processed geophysical data into the physical properties of the Earth, is an essential process for reservoir characterization. This paper proposes a work flow based on a Markov chain Monte Carlo method consistent with geology, well-logs, seismic data, and rock-physics information. It uses direct sampling as a multiple-point geostatistical method for generating realizations from the prior distribution, and Metropolis sampling with adaptive spatial resampling to perform an approximate sampling from the posterior distribution, conditioned to the geophysical data. Because it can assess important uncertainties, sampling is a more general approach than just finding the most likely model. However, since rejection sampling requires a large number of evaluations for generating the posterior distribution, it is inefficient and not suitable for reservoir modeling. Metropolis sampling is able to perform an equivalent sampling by forming a Markov chain. The iterative spatial resampling algorithm perturbs realizations of a spatially dependent variable, while preserving its spatial structure by conditioning to subset points. However, in most practical applications, when the subset conditioning points are selected at random, it can get stuck for a very long time in a non-optimal local minimum. In this paper it is demonstrated that adaptive subset sampling improves the efficiency of iterative spatial resampling. Depending on the acceptance/rejection criteria, it is possible to obtain a chain of geostatistical realizations aimed at characterizing the posterior distribution with Metropolis sampling. The validity and applicability of the proposed method are illustrated by results for seismic lithofacies inversion on the Stanford VI synthetic test sets.
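The Metropolis step with a resampling-style proposal can be sketched on a toy problem. Everything here is a stand-in: a 20-cell binary "facies" vector, an unstructured 50/50 prior with no spatial correlation, and a likelihood that compares the field mean to one "observed" datum in place of a geophysical forward model.

```python
import numpy as np

rng = np.random.default_rng(1)

d_obs, sigma = 0.7, 0.05       # one "observed" datum and its noise level

def log_likelihood(m):
    # stand-in forward model: the datum is simply the field mean
    return -0.5 * ((m.mean() - d_obs) / sigma) ** 2

def perturb(m, keep_frac, rng):
    """Resampling-style proposal: freeze a random subset of cells (the
    conditioning subset) and redraw the rest from the 50/50 prior."""
    keep = rng.random(m.size) < keep_frac
    cand = rng.integers(0, 2, m.size)
    cand[keep] = m[keep]
    return cand

m = rng.integers(0, 2, 20)     # initial binary "facies" vector
chain, accepted = [], 0
for _ in range(2000):
    cand = perturb(m, keep_frac=0.8, rng=rng)
    # symmetric proposal -> plain Metropolis acceptance
    if np.log(rng.random()) < log_likelihood(cand) - log_likelihood(m):
        m, accepted = cand, accepted + 1
    chain.append(m.mean())

posterior_mean = float(np.mean(chain[500:]))   # after burn-in
```

In the paper's setting, the proposal preserves spatial structure by regenerating the unfrozen cells with a multiple-point simulator conditioned to the kept subset, and adaptive selection of that subset is what improves mixing.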

16.
In many fields of the Earth Sciences, one is interested in the distribution of particle or void sizes within samples. Like many other geological attributes, size distributions exhibit spatial variability, and it is convenient to view them within a geostatistical framework, as regionalized functions or curves. Since they rarely conform to simple parametric models, size distributions are best characterized using their raw spectrum as determined experimentally in the form of a series of abundance measures corresponding to a series of discrete size classes. However, the number of classes may be large and the class abundances may be highly cross-correlated. In order to model the spatial variations of discretized size distributions using current geostatistical simulation methods, it is necessary to reduce the number of variables considered and to render them uncorrelated among one another. This is achieved using a principal components-based approach known as Min/Max Autocorrelation Factors (MAF). For a two-structure linear model of coregionalization, the approach has the attractive feature of producing orthogonal factors ranked in order of increasing spatial correlation. Factors consisting largely of noise and exhibiting pure nugget-effect correlation structures are isolated in the lower rankings, and these need not be simulated. The factors to be simulated are those capturing most of the spatial correlation in the data, and they are isolated in the highest rankings. Following a review of MAF theory, the approach is applied to the modeling of pore-size distributions in partially welded tuff. Results of the case study confirm the usefulness of the MAF approach for the simulation of large numbers of coregionalized variables.

17.
A new approach based on principal component analysis (PCA) for the representation of complex geological models in terms of a small number of parameters is presented. The basis matrix required by the method is constructed from a set of prior geological realizations generated using a geostatistical algorithm. Unlike standard PCA-based methods, in which the high-dimensional model is constructed from a (small) set of parameters by simply performing a multiplication using the basis matrix, in this method the mapping is formulated as an optimization problem. This enables the inclusion of bound constraints and regularization, which are shown to be useful for capturing highly connected geological features and binary/bimodal (rather than Gaussian) property distributions. The approach, referred to as optimization-based PCA (O-PCA), is applied here mainly for binary-facies systems, in which case the requisite optimization problem is separable and convex. The solution of the optimization problem, as well as the derivative of the model with respect to the parameters, is obtained analytically. It is shown that the O-PCA mapping can also be viewed as a post-processing of the standard PCA model. The O-PCA procedure is applied both to generate new (random) realizations and for gradient-based history matching. For the latter, two- and three-dimensional systems, involving channelized and deltaic-fan geological models, are considered. The O-PCA method is shown to perform very well for these history matching problems, and to provide models that capture the key sand–sand and sand–shale connectivities evident in the true model. Finally, the approach is extended to generate bimodal systems in which the properties of both facies are characterized by Gaussian distributions. MATLAB code with the O-PCA implementation, along with examples demonstrating its use, is provided online as Supplementary Materials.
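The standard PCA parameterization that O-PCA post-processes can be sketched as follows: build a basis from an ensemble of prior realizations via SVD, then map a short parameter vector to a model. The ensemble below is a hypothetical moving-average stand-in for a geostatistical simulation, and the O-PCA optimization step itself is only indicated in a comment.

```python
import numpy as np

rng = np.random.default_rng(3)

# Prior ensemble: 50 realizations of a 1-D spatially correlated Gaussian
# "property" (a moving-average field stands in for geostatistical simulation).
n_real, n_cells = 50, 60
white = rng.normal(size=(n_real, n_cells + 10))
prior = np.stack([np.convolve(w, np.ones(11) / 11, mode="valid") for w in white])

# PCA basis of the centered ensemble via SVD.
mean = prior.mean(axis=0)
U, s, Vt = np.linalg.svd(prior - mean, full_matrices=False)

# A new model is mean + basis @ xi for a short parameter vector xi.
# (O-PCA would replace this plain multiplication by a small regularized
# optimization to restore binary/bimodal character in the output.)
k = 8
basis = Vt[:k].T * (s[:k] / np.sqrt(n_real - 1))
xi = rng.normal(size=k)
new_model = mean + basis @ xi
```

History matching then operates on the k-dimensional vector xi rather than on the full grid, which is what makes gradient-based updating tractable.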

18.
Direct push (DP) technologies are typically used for cost-effective geotechnical characterization of unconsolidated soils and sediments. In more recent developments, DP technologies have been used for efficient hydraulic conductivity (K) characterization along vertical profiles with sampling resolutions of up to a few centimetres. To date, however, only a limited number of studies document high-resolution in situ DP data for three-dimensional conceptual hydrogeological model development and groundwater flow model parameterization. This study demonstrates how DP technologies improve the building of a conceptual hydrogeological model. We further evaluate the degree to which the DP-derived hydrogeological parameter K, measured across different spatial scales, improves performance of a regional groundwater flow model. The study area covers an area of ~60 km² with two overlying, mainly unconsolidated sand aquifers separated by a 5–7 m thick highly heterogeneous clay layer (in north-eastern Belgium). The hydrostratigraphy was obtained from an analysis of cored boreholes and about 265 cone penetration tests (CPTs). The hydrogeological parameter K was derived from a combined analysis of core and CPT data and also from hydraulic direct push tests. A total of 50 three-dimensional realizations of K were generated using a non-stationary multivariate geostatistical approach. To preserve the measured K values in the stochastic realizations, the groundwater model K realizations were conditioned on the borehole and direct push data. Optimization was performed to select the best performing model parameterization out of the 50 realizations. This model outperformed a previously developed reference model with homogeneous K fields for all hydrogeological layers.
Comparison of particle tracking simulations, based either on the optimal heterogeneous or reference homogeneous groundwater model flow fields, demonstrates the impact DP-derived subsurface heterogeneity in K can have on groundwater flow and solute transport. We demonstrated that DP technologies, especially when calibrated with site-specific data, provide high-resolution 3D subsurface data for building more reliable conceptual models and increasing groundwater flow model performance.

19.
Particle-tracking simulation offers a fast and robust alternative to conventional numerical discretization techniques for modeling solute transport in subsurface formations. A common challenge is that the modeling scale is typically much larger than the volume scale over which measurements of rock properties are made, and the scale-up of measurements has to be made accounting for the pattern of spatial heterogeneity exhibited at different scales. In this paper, a statistical scale-up procedure developed in our previous work is adopted to estimate coarse-scale (effective) transition time functions for transport modeling, while two significant improvements are proposed: considering the effects of non-stationarity (trend), as well as unresolved (residual) heterogeneity below the fine-scale model. Rock property is modeled as a multivariate random function, which is decomposed into the sum of a trend (which is defined at the same resolution of the transport modeling scale) and a residual (representing all heterogeneities below the transport modeling scale). To construct realizations of a given rock property at the transport modeling scale, multiple realizations of the residual components are sampled. Next, a flow-based technique is adopted to compute the effective transport parameters: firstly, it is assumed that additional unresolved heterogeneities occurring below the fine scale can be described by a probabilistic transit time distribution; secondly, multiple realizations of the rock property, with the same physical size as the transport modeling scale, are generated; thirdly, each realization is subjected to particle-tracking simulation; finally, probability distributions of effective transition time function are estimated by matching the corresponding effluent history for each realization with an equivalent medium consisting of averaged homogeneous rock properties and aggregating results from all realizations.
The proposed method is flexible in that it does not invoke any explicit assumption regarding the multivariate distribution of the heterogeneity.

20.
The Proportional Effect
Many regionalized variables encountered in geostatistical applications show a proportional effect, that is, greater variability in high valued areas. The proportional effect is a consequence of the univariate distribution of the variable being considered. This presents several challenges including inference of variograms and conditional distributions. The correlogram and relative variogram were devised to mitigate challenges with variography. Other techniques such as normalization, direct simulation, and indicators provide a means to characterize conditional distributions. A review of these tools is given along with several examples to provide an explanation of the proportional effect.
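The proportional effect is easy to exhibit numerically: for a lognormal regionalized variable, the local standard deviation tracks the local mean. The sketch below (synthetic 1-D data, made-up parameters) compares the two over non-overlapping windows.

```python
import numpy as np

rng = np.random.default_rng(5)

# Smooth Gaussian signal, exponentiated to give a lognormal regionalized
# variable (all parameters are illustration values only).
z = np.convolve(rng.normal(size=2000), np.ones(25) / 25, mode="valid")
y = np.exp(3.0 * z)

windows = y[:1900].reshape(-1, 50)        # non-overlapping local neighbourhoods
local_mean = windows.mean(axis=1)
local_std = windows.std(axis=1)

# Spearman rank correlation between local mean and local spread: strongly
# positive under a proportional effect.
rank_m = np.argsort(np.argsort(local_mean))
rank_s = np.argsort(np.argsort(local_std))
r = float(np.corrcoef(rank_m, rank_s)[0, 1])
```

It is exactly this mean-dependent spread that the relative variogram divides out, and that normalization (e.g. working with logs) removes before variography.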
