Similar Documents
20 similar documents found
1.
Sequential Gaussian simulation is one of the most widespread algorithms for simulating regionalized variables in the earth sciences. Simplicity and flexibility are the main reasons for its popularity, but its implementation relies heavily on a screen-effect approximation that allows users to work with a moving neighborhood instead of a unique neighborhood. Because of this, the size of the moving neighborhood, the number of conditioning data and the variogram range matter in the simulation process and should be chosen carefully. In this work, different synthetic and real case studies are presented to show the effect of the neighborhood size, the number of conditioning data and the variogram range on the simulation results, with respect to the reproduction of the model's first- and second-order parameters. Results indicate that, in both conditional and non-conditional simulation, using a neighborhood with fewer than 50 conditioning data may lead to an inaccurate reproduction of the model statistics, and some cases require more than 200 conditioning data. The results of the third example also show that when the variogram range is large compared to the simulation domain, detecting an inaccurate simulation setup becomes harder.
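As a concrete illustration of where the neighborhood size enters the algorithm, the following minimal sketch runs sequential Gaussian simulation on a 1-D grid with simple kriging and a moving neighborhood capped at n_max conditioning values (previously simulated nodes included). The grid, covariance model and data are illustrative and not taken from the paper.

```python
import numpy as np

def cov_exp(h, sill=1.0, a=10.0):
    """Exponential covariance with practical range of about 3*a."""
    return sill * np.exp(-np.abs(h) / a)

def sgs_1d(grid_x, cond_x, cond_z, n_max=50, a=10.0, seed=0):
    """Sequential Gaussian simulation on a 1-D grid with a moving
    neighborhood limited to the n_max closest conditioning values
    (original data plus previously simulated nodes)."""
    rng = np.random.default_rng(seed)
    known_x = list(cond_x)
    known_z = list(cond_z)
    sim = np.empty(len(grid_x))
    for i in rng.permutation(len(grid_x)):           # random visiting path
        x0 = grid_x[i]
        xs = np.asarray(known_x)
        zs = np.asarray(known_z)
        nb = np.argsort(np.abs(xs - x0))[:n_max]     # moving neighborhood
        xs, zs = xs[nb], zs[nb]
        C = cov_exp(xs[:, None] - xs[None, :], a=a)
        c0 = cov_exp(xs - x0, a=a)
        lam = np.linalg.solve(C, c0)                 # simple-kriging weights
        mean = lam @ zs                              # zero prior mean assumed
        var = max(cov_exp(0.0, a=a) - lam @ c0, 0.0)
        sim[i] = rng.normal(mean, np.sqrt(var))
        known_x.append(x0)                           # simulated node becomes data
        known_z.append(sim[i])
    return sim

# Example: 200-node grid, 10 conditioning data, neighborhood of 50 points
grid = np.arange(200.0)
data_x = np.array([12.3, 25.7, 41.2, 58.9, 77.4, 99.1, 120.6, 143.8, 167.2, 188.5])
data_z = np.random.default_rng(1).normal(size=10)
realization = sgs_1d(grid, data_x, data_z, n_max=50, a=10.0)
```

Re-running with a smaller n_max is the kind of experiment the paper describes: the realizations remain plausible, but the reproduction of the target covariance degrades as the neighborhood shrinks.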

2.
This paper addresses the problem of simulating multivariate random fields with stationary Gaussian increments in a d-dimensional Euclidean space. To this end, one considers a spectral turning-bands algorithm, in which the simulated field is a mixture of basic random fields made of weighted cosine waves associated with random frequencies and random phases. The weights depend on the spectral density of the direct and cross variogram matrices of the desired random field for the specified frequencies. The algorithm is applied to synthetic examples corresponding to different spatial correlation models. The properties of these models and of the algorithm are discussed, highlighting its computational efficiency, accuracy and versatility.
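A minimal univariate sketch of the cosine-wave construction, assuming a Gaussian covariance model whose spectral density is itself Gaussian; the paper's multivariate algorithm additionally weights the waves by the spectral density of the direct and cross variogram matrices.

```python
import numpy as np

def simulate_cosine_field(coords, scale=10.0, n_waves=1000, seed=0):
    """Random field built as a mixture of cosine waves with random
    frequencies and phases; the frequencies are drawn from the spectral
    density of the target Gaussian covariance exp(-|h|^2 / scale^2)."""
    rng = np.random.default_rng(seed)
    d = coords.shape[1]
    omega = rng.normal(scale=np.sqrt(2.0) / scale, size=(n_waves, d))
    phi = rng.uniform(0.0, 2.0 * np.pi, size=n_waves)
    # central-limit mixture: unit variance, covariance close to the target model
    return np.sqrt(2.0 / n_waves) * np.cos(coords @ omega.T + phi).sum(axis=1)

# Example: simulate on a 60 x 60 grid in two dimensions
xx, yy = np.meshgrid(np.arange(60.0), np.arange(60.0))
field = simulate_cosine_field(np.column_stack([xx.ravel(), yy.ravel()]))
field = field.reshape(60, 60)
```

Increasing n_waves improves both the Gaussianity of the field (by the central limit theorem) and the accuracy of the reproduced covariance.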

3.
After analyzing many studies of fluid flow theory in multi-porous media of low and extra-low permeability reservoirs and the numerical simulation of non-Darcy flow, we found that a negative flow rate occurs in the existing non-Darcy flow equation, which is unreasonable. We believe that the existing equation can only be regarded as a discriminant for distinguishing Darcy from non-Darcy flow, and cannot be taken as a governing equation for fluid flow in multi-porous media. Our analysis of the experimental results shows that the reported threshold pressure gradient (TPG) of low and extra-low permeability reservoirs is excessively high and does not correspond to fluid flow through multi-porous media under actual reservoir conditions. Therefore, we present a reasonable TPG ranging from 0.006 to 0.04 MPa/m for a well depth of 1500 m and an oil drainage distance of 500 m. The results of our study also indicate that the non-Darcy flow phenomenon will disappear when the TPG reaches a certain value. In addition, the TPG or non-Darcy flow in low and extra-low permeability reservoirs does not need to be considered in productivity prediction and reservoir numerical simulation. At present, the black oil model or a dual-porosity media model is suitable for simulating low and extra-low permeability reservoirs.
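For orientation, a threshold pressure gradient is commonly introduced into a Darcy-type flow law by subtracting it from the driving gradient and shutting flow off below the threshold. The sketch below is this generic textbook form with illustrative parameter values, not the specific equation examined (and criticized) in the paper.

```python
def tpg_flux(dpdx, k, mu, G):
    """Darcy-type flux [m/s] with a threshold pressure gradient G [Pa/m].

    No flow while |dp/dx| <= G; above the threshold only the excess
    gradient drives flow, so the flux magnitude is always non-negative.
    """
    excess = abs(dpdx) - G
    if excess <= 0.0:
        return 0.0
    sign = -1.0 if dpdx > 0.0 else 1.0      # flow from high to low pressure
    return sign * (k / mu) * excess

# Example: k = 1 mD, mu = 5 mPa.s, G = 0.02 MPa/m, applied gradient = 0.05 MPa/m
q = tpg_flux(dpdx=0.05e6, k=1e-15, mu=5e-3, G=0.02e6)
```

The paper's point is precisely that forms of this kind, if misused as a governing equation with an overstated G, can produce unphysical (negative or vanishing) rates under realistic reservoir gradients.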

4.
We analyze the impact of the choice of the variogram model adopted to characterize the spatial variability of natural log-transmissivity on the evaluation of the leading (statistical) moments of hydraulic heads and contaminant travel times and trajectories within mildly (randomly) heterogeneous two-dimensional porous systems. The study is motivated by the fact that in several practical situations the differences between various variogram types and a typical noisy sample variogram are small enough that one would often have a hard time deciding which of the tested models provides the best fit. Likewise, choosing amongst a set of seemingly likely variogram models estimated by means of geostatistical inverse models of flow equations can be difficult due to the lack of sensitivity of the available model discrimination criteria. We tackle the problem within the framework of numerical Monte Carlo simulations for mean uniform and radial flow scenarios. The effect of three commonly used isotropic variogram models, i.e., Gaussian, Exponential and Spherical, is analyzed. Our analysis clearly shows that (ensemble) mean values of the quantities of interest are not considerably influenced by the variogram shape for the range of parameters examined. Contrariwise, prediction variances of the quantities examined are significantly affected by the choice of the variogram model of the log-transmissivity field. The spatial distribution of the largest/lowest relative differences observed amongst the tested models depends on a combination of variogram shape and parameters and on the relative distance from internal sources and the outer domain boundary. Our findings suggest the need to develop robust techniques to discriminate amongst a set of seemingly equally likely alternative variogram models in order to provide reliable uncertainty estimates of state variables.
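For reference, the three isotropic variogram models compared in the study are usually written as follows (a minimal sketch with sill c and practical range a; range parameterizations differ between software packages).

```python
import numpy as np

def gaussian_vgm(h, c=1.0, a=10.0):
    """Gaussian variogram: parabolic near the origin (very smooth fields)."""
    return c * (1.0 - np.exp(-3.0 * (np.asarray(h, dtype=float) / a) ** 2))

def exponential_vgm(h, c=1.0, a=10.0):
    """Exponential variogram: linear near the origin, reaches the sill asymptotically."""
    return c * (1.0 - np.exp(-3.0 * np.asarray(h, dtype=float) / a))

def spherical_vgm(h, c=1.0, a=10.0):
    """Spherical variogram: reaches the sill exactly at the range a."""
    h = np.asarray(h, dtype=float)
    g = c * (1.5 * h / a - 0.5 * (h / a) ** 3)
    return np.where(h < a, g, c)
```

The key difference exploited in the paper is the behavior near the origin: the three models can look nearly identical against a noisy sample variogram, yet imply different small-scale smoothness and hence different prediction variances.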

5.
This paper introduces an extension of the traditional stationary linear coregionalization model to handle the lack of stationarity. Under the proposed model, coregionalization matrices are spatially dependent, and the basic univariate spatial dependence structures are non-stationary. A parameter estimation procedure for the proposed non-stationary linear coregionalization model is developed under the local stationarity framework. It is based on the method of moments and involves a matrix-valued local stationary variogram kernel estimator and a weighted local least squares method, combined with a kernel smoothing technique. Local parameter estimates are knitted together for prediction and simulation purposes. The proposed non-stationary multivariate spatial modeling approach is illustrated using two real bivariate data examples, and its prediction performance is compared with that of the classical stationary multivariate spatial modeling approach. According to several criteria, the prediction performance of the proposed non-stationary approach appears to be significantly better.

6.
Lithofacies and petrophysical reservoir parameters are key quantities for reservoir characterization, and seismic inversion is an important tool for reservoir characterization and for hydrocarbon exploration and development. Stochastic seismic inversion, which is usually based on geostatistical theory, can integrate information sources of different types and build reservoir models of relatively high resolution, and has therefore attracted wide attention. Among such methods, the probability perturbation method is an efficient iterative stochastic inversion strategy: it can take multiple constraints into account simultaneously and needs only a small number of iterations to reach an inversion result. Within this probability-perturbation optimization strategy, this paper couples multiple-point geostatistics with sequential Gaussian simulation and combines them with statistical rock physics to carry out the stochastic inversion. First, a series of equally probable lithofacies models is generated by multiple-point geostatistical simulation; after the initial lithofacies model is perturbed and updated, facies-controlled sequential Gaussian simulation is used to build multiple petrophysical parameter models. The corresponding elastic parameters are then computed with statistical rock physics theory. Finally, synthetic seismograms obtained by forward modeling are compared with the observed seismic data, and the probability perturbation method is iterated until an inversion result satisfying the prescribed error tolerance is obtained. Multiple-point geostatistics better characterizes the spatial features of the reservoir, and facies-controlled sequential Gaussian simulation effectively reflects the distribution of petrophysical parameters within the different lithofacies. The proposed method obtains high-resolution lithofacies and petrophysical inversion results simultaneously within a small number of iterations; model tests and a field-data application verify its feasibility and effectiveness.
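The probability perturbation idea can be illustrated on a toy 1-D binary facies problem: the local facies probability is blended between the current realization and the prior as p = (1 - r) i_current + r p_prior, a realization is drawn with a fixed random seed, and a one-dimensional search over r minimizes the data mismatch. Everything below (forward model, sizes, probabilities) is an invented stand-in; the paper's workflow uses multiple-point simulation, facies-controlled sequential Gaussian simulation and a statistical rock-physics forward model instead.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
prior_p = 0.4                                        # prior probability of facies 1
truth = (rng.random(n) < prior_p).astype(float)      # synthetic "true" facies

def forward(f):                                      # toy "seismic" forward model
    return np.convolve(f, np.ones(5) / 5.0, mode="same")

d_obs = forward(truth)                               # observed data

def draw(prob, seed):
    """Draw a facies realization from local probabilities with a fixed seed,
    so that the realization changes gradually with the perturbation parameter r."""
    u = np.random.default_rng(seed).random(len(prob))
    return (u < prob).astype(float)

def mismatch(f):
    return np.sum((forward(f) - d_obs) ** 2)

current = draw(np.full(n, prior_p), seed=1)          # initial facies realization
for it in range(20):                                 # outer iterations
    seed = 100 + it                                  # new random path per iteration
    best = (np.inf, None)
    for r in np.linspace(0.0, 1.0, 11):              # 1-D search over r
        prob = (1.0 - r) * current + r * prior_p     # perturbed probability field
        f = draw(prob, seed)
        if mismatch(f) < best[0]:
            best = (mismatch(f), f)
    current = best[1]                                # accept the best perturbation
```

Because r = 0 reproduces the current realization exactly, the data mismatch never increases from one outer iteration to the next, which is why relatively few iterations are needed.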

7.
Accurate estimation of aquifer parameters, especially in crystalline hard-rock areas, assumes special significance for the management of groundwater resources. The aquifer parameters are usually estimated through pumping tests carried out on water wells. While carrying out pumping tests at many sites can be costly and time consuming, the application of geophysical methods in combination with hydro-geochemical information proves to be a promising and cost-effective way to estimate aquifer parameters. Here a method to estimate aquifer parameters such as hydraulic conductivity, formation factor, porosity and transmissivity is presented, utilizing electrical conductivity values obtained from hydro-geochemical analysis of existing wells and the corresponding vertical electrical sounding (VES) points in Sindhudurg district, western Maharashtra, India. Further, prior to interpolating the distribution of aquifer parameters over the study area, variogram modelling was carried out using data-driven techniques: kriging, automatic relevance determination based Bayesian neural networks (ARD-BNN) and adaptive neuro-fuzzy inference systems (ANFIS). In total, four variogram model fitting techniques (spherical, exponential, ARD-BNN and ANFIS) were compared. According to the results obtained, the spherical variogram model for interpolating transmissivity, the ARD-BNN variogram model for interpolating porosity, the exponential variogram model for interpolating aquifer thickness and the ANFIS variogram model for interpolating hydraulic conductivity outperformed the remaining variogram models. Accordingly, accurate aquifer parameter maps of the study area were produced using the best variogram model for each parameter. The results suggest relatively high values of hydraulic conductivity, porosity and transmissivity at Parule, Mogarne, Kudal and Zarap, which would be useful for characterizing the aquifer system over western Maharashtra.

8.
The plurigaussian model is used in mining engineering, oil reservoir characterization, hydrology and environmental sciences to simulate the layout of geological domains in the subsurface, while reproducing their spatial continuity and dependence relationships. However, this model is well-established only in the stationary case, when the spatial distribution of the domains is homogeneous in space, and suffers from theoretical and practical impediments in the non-stationary case. To overcome these limitations, this paper proposes extending the model to the truncation of intrinsic random fields of order k with Gaussian generalized increments, which allows reproducing spatial trends in the distribution of the geological domains. Methodological tools and algorithms are presented to infer the model parameters and to construct realizations of the geological domains conditioned to existing data. The proposal is illustrated with the simulation of rock type domains in an ore deposit in order to demonstrate its applicability. Despite the limited number of conditioning data, the results show a remarkable agreement between the simulated domains and the lithological model interpreted by geologists, while the conventional stationary plurigaussian model turns out to be unsuccessful.

9.
Histogram and variogram inference in the multigaussian model
Several iterative algorithms are proposed to improve histogram and variogram inference in the framework of the multigaussian model. The starting point is the variogram obtained after a traditional normal score transform. The subsequent step consists in simulating many sets of Gaussian values with this variogram at the data locations, so that the ranking of the original values is honored. The expected Gaussian transformation and the expected variogram are computed by averaging over the simulated datasets. The variogram model is then updated and the procedure is repeated until convergence. Such an iterative algorithm can adapt to the case of tied data and despike the histogram. Two additional issues are also examined, related to the modeling of the empirical transformation function and to the optimal weighting of pairs when computing the sample variogram.
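A much-simplified sketch of the iterative loop, assuming 1-D data, an exponential model refitted only for its range, and rank-matching of unconditionally simulated Gaussian vectors as a stand-in for the conditional (Gibbs-type) simulation used in practice:

```python
import numpy as np

def expvario(x, y, lags, tol=0.5):
    """Experimental variogram of values y at 1-D locations x."""
    h = np.abs(x[:, None] - x[None, :])
    g = 0.5 * (y[:, None] - y[None, :]) ** 2
    return np.array([g[(h > l - tol) & (h <= l + tol)].mean() for l in lags])

def iterate_variogram(x, z, lags, a0=10.0, n_sets=100, n_iter=5, seed=0):
    """Simulate Gaussian vectors at the data locations with the current model,
    reorder them so they honor the ranking of the original data (ties broken at
    random, which despikes the histogram), average the experimental variograms
    over the sets, refit the range, and repeat."""
    rng = np.random.default_rng(seed)
    n, a = len(z), a0
    cand = np.linspace(2.0, 40.0, 39)                        # candidate ranges
    for _ in range(n_iter):
        C = np.exp(-np.abs(x[:, None] - x[None, :]) / a)     # exponential model
        L = np.linalg.cholesky(C + 1e-8 * np.eye(n))
        gbar = np.zeros(len(lags))
        for _ in range(n_sets):
            y = L @ rng.normal(size=n)                       # Gaussian vector
            order = np.lexsort((rng.random(n), z))           # data ranks, ties randomized
            ranked = np.empty(n)
            ranked[order] = np.sort(y)                       # honor the data ranking
            gbar += expvario(x, ranked, lags) / n_sets
        # refit the range to the averaged experimental variogram
        a = cand[np.argmin([np.sum((gbar - (1 - np.exp(-lags / c))) ** 2) for c in cand])]
    return a, gbar

# Example: 80 irregularly spaced data with many tied (spiked) values
x = np.sort(np.random.default_rng(1).uniform(0.0, 100.0, 80))
z = np.round(np.random.default_rng(2).normal(size=80))
a_fit, gamma_bar = iterate_variogram(x, z, lags=np.arange(1.0, 21.0))
```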

10.
Exploring a valid model for the variogram of an isotropic spatial process
The variogram is one of the most important tools in the assessment of spatial variability and a crucial parameter for kriging. It is widely known that an empirical variogram estimator cannot be used directly as the variogram model in many contexts because it generally lacks conditional negative semi-definiteness. Consequently, once the variogram is estimated, a valid family must be chosen to fit an appropriate model. Under isotropy, this selection is usually carried out by eye from inspection of the estimated variogram curve. In this paper, a statistical methodology is proposed to explore a valid model for the variogram. The test statistic is based on quadratic forms of smoothed random variables that capture the underlying spatial variation, and its distribution is approximated by a shifted chi-square distribution. A simulation study is carried out to check the power and size of the test. Reference bands are also calculated as a complementary graphical tool. An example from the literature is used to illustrate the proposed methodology.

11.
Some time ago, we described and implemented two methods of seismic data compression. In the first method a seismic trace is considered as the convolution of a distribution made up of the trace peak values with a Gaussian pseudo-pulse. The second method is performed through a truncation of the sequential (Walsh, Paley or Haar) spectrum of each trace. In this paper it is shown that neither method has adverse effects on quality when the compressed traces undergo conventional data processing, such as stacking and deconvolution.
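A minimal sketch of the second idea, truncating a sequency-type spectrum of each trace; here a Haar transform is used, and the trace and retained fraction are illustrative.

```python
import numpy as np

def haar_forward(x):
    """Orthonormal Haar transform of a signal whose length is a power of two."""
    out = np.asarray(x, dtype=float).copy()
    n = len(out)
    while n > 1:
        half = n // 2
        a = (out[0:n:2] + out[1:n:2]) / np.sqrt(2.0)   # pairwise averages
        d = (out[0:n:2] - out[1:n:2]) / np.sqrt(2.0)   # pairwise details
        out[:half], out[half:n] = a, d
        n = half
    return out

def haar_inverse(c):
    """Inverse of haar_forward."""
    out = np.asarray(c, dtype=float).copy()
    n = 2
    while n <= len(out):
        half = n // 2
        a, d = out[:half].copy(), out[half:n].copy()
        out[0:n:2] = (a + d) / np.sqrt(2.0)
        out[1:n:2] = (a - d) / np.sqrt(2.0)
        n *= 2
    return out

def compress_trace(trace, keep=0.25):
    """Keep only the largest Haar coefficients (a fraction `keep`) and invert."""
    c = haar_forward(trace)
    k = max(int(keep * len(c)), 1)
    thresh = np.sort(np.abs(c))[-k]
    c[np.abs(c) < thresh] = 0.0
    return haar_inverse(c)

# Example: compress a synthetic 256-sample trace to about 25 % of its coefficients
t = np.linspace(0.0, 1.0, 256)
trace = np.sin(40.0 * t) * np.exp(-3.0 * t)
reconstructed = compress_trace(trace, keep=0.25)
```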

12.
Computer vision provides several tools for analyzing and simulating textures. The principles of these techniques are similar to those in multiple-point geostatistics, namely, the reproduction of patterns and consistency in the results from a perceptual point of view, thus ensuring the reproduction of long range connectivity. The only difference between these techniques and geostatistical simulation accounting for multiple-point statistics is that conditioning is not an issue in computer vision. We present a solution to the problem of conditioning simulated fields while simultaneously honoring multiple-point (pattern) statistics. The proposal is based on a texture synthesis algorithm where a fixed search (causal) pattern is used. Conditioning is achieved by adding a non-causal search neighborhood that modifies the conditional distribution from which the simulated category is drawn, depending on the conditioning information. Results show an excellent reproduction of the features from the training image, while respecting the conditioning information. Some issues related to the data structure and to computational efficiency are discussed.

13.
The variogram is a key parameter for geostatistical estimation and simulation. Preferential sampling may bias the inferred spatial structure and often leads to noisy and unreliable variograms. A novel technique is proposed to weight variogram pairs in order to compensate for preferential or clustered sampling. Weighting the variogram pairs by global kriging of the quadratic differences between the tail and head values gives each pair an appropriate weight, removes noise and minimizes artifacts in the experimental variogram. Moreover, variogram uncertainty can be computed with this technique. The covariance required between the pairs entering the variogram calculation is a fourth-order covariance that must be calculated from second-order moments. This introduces some circularity, whereby an initial variogram must be assumed before calculating how the pairs should be weighted for the experimental variogram. The methodology is assessed with synthetic and realistic examples. For the synthetic example, a comparison between the traditional and declustered variograms shows that the declustered variograms are better estimates of the true underlying variograms. The realistic example also shows that the declustered sample variogram is closer to the true variogram.
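In its simplest form, pair weighting replaces the plain average of squared differences per lag by a weighted average. The sketch below uses products of per-datum declustering weights as stand-in pair weights; the paper instead obtains the weights by global kriging of the squared tail-head differences, which requires the fourth-order covariance mentioned above. With all weights equal, the function reduces to the traditional experimental variogram.

```python
import numpy as np

def weighted_variogram(coords, values, weights, lags, tol):
    """Experimental variogram in which pair (i, j) gets weight w_i * w_j.

    coords  : (n, 2) array of sample locations
    values  : (n,) array of data values
    weights : (n,) array of declustering weights
    """
    i, j = np.triu_indices(len(values), k=1)
    h = np.linalg.norm(coords[i] - coords[j], axis=1)      # pair separations
    sq = 0.5 * (values[i] - values[j]) ** 2                # semivariogram cloud
    w = weights[i] * weights[j]                            # pair weights
    gamma = np.full(len(lags), np.nan)
    for k, lag in enumerate(lags):
        m = (h > lag - tol) & (h <= lag + tol)
        if m.any():
            gamma[k] = np.sum(w[m] * sq[m]) / np.sum(w[m]) # weighted average per lag
    return gamma
```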

14.
The stochastic continuum (SC) representation is one common approach for simulating the effects of fracture heterogeneity in groundwater flow and transport models. These SC reservoir models are generally developed using geostatistical methods (e.g., kriging or sequential simulation) that rely on the model semivariogram to describe the spatial variability of each continuum. Although a number of strategies for sampling spatial distributions have been published in the literature, little attention has been paid to the optimization of sampling in resource- or access-limited environments. Here we present a strategy for estimating the minimum sample spacing needed to define the spatial distribution of fractures on a vertical outcrop of basalt, located in the Box Canyon, east Snake River Plain, Idaho. We used fracture maps of similar basalts from the published literature to test experimentally the effects of different sample spacings on the resulting semivariogram model. Our final field sampling strategy was based on the lowest sample density that reproduced the semivariogram of the exhaustively sampled fracture map. Application of the derived sampling strategy to an outcrop in our field area gave excellent results, and illustrates the utility of this type of sample optimization. The method will work for developing a sampling plan for any intensive property, provided prior information for a similar domain is available; for example, fracture maps or ortho-rectified photographs from analogous rock types could be used to plan for sampling of a fractured rock outcrop.

15.
16.
This study investigates the changing properties of drought events in the Weihe River basin, China, by modelling the multivariate joint distribution of drought duration, severity and peak using trivariate Gaussian and Student t copulas. Monthly precipitation at the Xi'an gauge is used to illustrate the meta-elliptical copula-based methodology in a single-station application. The Gaussian and Student t copulas are found to fit better than six other symmetric and asymmetric Archimedean copulas, and, checked by goodness-of-fit tests based on a modified bootstrap version of Rosenblatt's transformation, both are acceptable for modelling the multivariate joint distribution of the drought variables. The Gaussian copula, which fits best, is employed to construct the dependence structure of the positively associated drought variables and so obtain the multivariate joint and conditional probabilities of droughts. A Kendall's return period (KRP), introduced by Salvadori and De Michele (2010), is then adopted to assess the multivariate recurrence properties of drought events; its spatial distribution indicates that prolonged droughts are likely to occur with rather short recurrence intervals over the whole region, with drought conditions in the southeast more severe than in the northwest. The study contributes a preferable copula-based method for multivariate drought modelling, and its results can serve as a reference for regional drought defence and water resources management. Copyright © 2011 John Wiley & Sons, Ltd.
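A minimal sketch of how a trivariate Gaussian copula converts marginal non-exceedance probabilities of duration, severity and peak into joint probabilities. The correlation matrix and marginal probabilities below are illustrative; in the study the marginals are fitted to the drought series and the Kendall's return period additionally requires the Kendall distribution function of the copula.

```python
import numpy as np
from scipy.stats import norm, multivariate_normal

# Illustrative copula correlation matrix for (duration, severity, peak)
R = np.array([[1.0, 0.8, 0.6],
              [0.8, 1.0, 0.7],
              [0.6, 0.7, 1.0]])

def joint_nonexceedance(u, R):
    """P(U1 <= u1, U2 <= u2, U3 <= u3) under a Gaussian copula with correlation R."""
    z = norm.ppf(u)                                   # uniform margins -> Gaussian scores
    return multivariate_normal(mean=np.zeros(len(u)), cov=R).cdf(z)

# Illustrative marginal non-exceedance probabilities of a given drought event
u = np.array([0.90, 0.85, 0.80])
c = joint_nonexceedance(u, R)
p_or = 1.0 - c          # probability that at least one variable exceeds its threshold
T_or = 1.0 / p_or       # simple "OR" return period, in numbers of drought events
```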

17.
To study the propagation behavior and stress distribution of seismically generated stress waves in rock masses containing non-persistent (discontinuous) joints, numerical simulation is first used to analyze stress wave propagation across a through-going joint, and the results are compared with existing theoretical results to verify the accuracy and applicability of the numerical analysis. Stress wave propagation in rock masses with non-persistent joints is then simulated numerically; the horizontal distribution of the transmission coefficient and the influence of different degrees of joint persistence on wave propagation are analyzed, and a qualitative theoretical explanation is given based on the principle of wave diffraction. The results show that when a stress wave crosses non-persistent joints, transmission across the joints reduces the wave amplitude and causes attenuation, while diffraction around the rock bridges changes the wavefront from planar to curved and alters the propagation direction, thereby changing the horizontal distribution of the stress wave amplitude. The transmission coefficient across non-persistent joints is related to the rock bridge size Lr and the diffraction angle μ: when the diffraction angle is small, the transmission coefficient is controlled mainly by the rock bridge size Lr; when the diffraction angle is large, Lr and μ jointly govern stress wave propagation in the rock mass.

18.
Truncated pluri-Gaussian simulation (TPGS) is suitable for the simulation of categorical variables that show natural ordering, as the technique can account for transition probabilities. TPGS assumes that the categorical variables result from the truncation of underlying latent variables. In practice, only the categorical variables are observed, which turns the practical application of TPGS into a missing-data problem in which all latent variables are missing. Latent variables are required at the data locations in order to condition categorical realizations to the observed categorical data. The imputation of the missing latent variables at the data locations is often achieved either by assigning constant values or by spatially simulating latent variables subject to the categorical observations. Realizations of the latent variables can be used to condition all model realizations. Using a single realization or a constant value to condition all realizations amounts to assuming that the latent variables are known at the data locations, and this assumption affects the uncertainty near the data locations. This article investigates techniques for imputing latent variables in the TPGS framework and explores their impact on the uncertainty of the simulated categorical models and the possible effects on factors affecting decision making. It is shown that the use of a single realization of the latent variables leads to underestimation of uncertainty and overestimation of measured resources, while the use of constant values for the latent variables may lead to considerable over- or underestimation of measured resources. The results highlight the importance of multiple data imputation in the context of TPGS.
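A toy sketch of the latent-variable view: two latent Gaussian values are mapped to an ordered category through a truncation rule, and at a data location the missing latent pair can be imputed by drawing until the observed category is reproduced. The truncation rule and the rejection-sampling imputation below are invented for illustration; practical TPGS implementations use a Gibbs sampler that also honors the spatial correlation between data locations.

```python
import numpy as np

def truncation_rule(y1, y2, t1=0.0, t2=0.5):
    """Illustrative rule mapping two latent Gaussians to three ordered categories."""
    if y1 < t1:
        return 0                      # e.g. shale
    return 1 if y2 < t2 else 2        # e.g. silt / sand

def impute_latents(category, rng, max_tries=10000):
    """Draw latent pairs until they reproduce the observed category."""
    for _ in range(max_tries):
        y = rng.normal(size=2)
        if truncation_rule(*y) == category:
            return y
    raise RuntimeError("category has negligible probability under the rule")

rng = np.random.default_rng(0)
# Multiple imputations at one data location where category 2 was observed:
draws = np.array([impute_latents(2, rng) for _ in range(100)])
```

Keeping many such draws (rather than a single one or a constant) is what the article means by multiple data imputation: each model realization can then be conditioned to a different imputed latent set.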

19.
A comparative study of wind field simulation methods for long-span bridges
This paper applies the linear-filter-based ARMA model to the wind field simulation of long-span bridges and derives the formulas for simulating multivariate stationary random processes with an ARMA model in which the autoregressive (AR) order p and the moving-average (MA) order q differ. The ARMA approach and the harmonic superposition method, which is currently widely used for wind field simulation of long-span bridges, are both applied to the wind field simulation of an actual long-span cable-stayed bridge. The comparison yields several meaningful conclusions and confirms that the ARMA method can greatly improve the efficiency of wind field simulation while maintaining the simulation accuracy.
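A minimal univariate sketch of the linear-filter idea: an ARMA filter applied to Gaussian white noise yields a stationary process whose spectrum is shaped by the filter coefficients. The coefficients below are illustrative; the paper derives multivariate coefficients, with unequal AR order p and MA order q, from the target wind spectra and coherence.

```python
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(0)
n = 6000                                   # number of time samples
# Illustrative ARMA(2, 1): x[k] = 1.5 x[k-1] - 0.7 x[k-2] + e[k] + 0.4 e[k-1]
a = np.array([1.0, -1.5, 0.7])             # AR polynomial (lfilter convention)
b = np.array([1.0, 0.4])                   # MA polynomial
e = rng.normal(size=n)                     # Gaussian white noise input
u = lfilter(b, a, e)                       # simulated stationary fluctuation
u = u[1000:]                               # discard the filter warm-up transient
```

Once the filter coefficients are fixed, each new realization only costs one pass of the recursive filter over a white-noise sequence, which is the source of the efficiency gain over summing a large number of harmonics.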

20.
In a spatial property modeling context, the variables of interest often display complex nonlinear features. Techniques to incorporate these nonlinear features, such as multiple-point statistics or cumulants, are often complex, with input parameters that are difficult to infer. The methodology proposed in this paper uses a classical vector-based definition of locally varying anisotropy to characterize nonlinear features and incorporate locally varying anisotropy into numerical property models. The required input is an exhaustive field of anisotropy orientation and magnitude. The methodology consists of (1) using the shortest-path distance between locations to define the covariance between points in space, (2) multidimensional scaling of the domain to ensure positive definite kriging equations, and (3) estimation or simulation with kriging or sequential Gaussian simulation. The only additional parameter required when kriging or simulating with locally varying anisotropy is the number of dimensions to retain in the multidimensional scaling. The methodology is demonstrated on a CO2 emissions data set for the United States in 2002 and shows an improvement in cross-validation results as well as visual reproduction of the nonlinear features.
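A compact sketch of steps (1) and (2): shortest-path distances on a grid whose edge lengths reflect a locally varying field, followed by classical multidimensional scaling so that the embedded coordinates admit positive-definite kriging systems. The cost field, grid size and number of retained dimensions below are illustrative stand-ins for the anisotropy orientation and magnitude field used in the paper.

```python
import numpy as np
from scipy.sparse import lil_matrix
from scipy.sparse.csgraph import shortest_path

# Illustrative 20 x 20 grid with a locally varying "cost" field: distances are
# short where the cost is low and long where it is high.
nx = ny = 20
cost = 1.0 + np.fromfunction(lambda i, j: np.sin(i / 4.0) ** 2, (ny, nx))

def node(i, j):
    return i * nx + j

W = lil_matrix((nx * ny, nx * ny))
for i in range(ny):
    for j in range(nx):
        for di, dj in ((0, 1), (1, 0)):                    # 4-neighbour edges
            if i + di < ny and j + dj < nx:
                w = 0.5 * (cost[i, j] + cost[i + di, j + dj])
                W[node(i, j), node(i + di, j + dj)] = w
D = shortest_path(W.tocsr(), directed=False)               # step (1): graph distances

# Step (2): classical multidimensional scaling of the distance matrix
n = D.shape[0]
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ (D ** 2) @ J                                # double-centred matrix
vals, vecs = np.linalg.eigh(B)
order = np.argsort(vals)[::-1]
q = 10                                                     # dimensions to retain
coords = vecs[:, order[:q]] * np.sqrt(np.maximum(vals[order[:q]], 0.0))
# Euclidean distances between 'coords' approximate D and can be plugged into any
# positive-definite covariance for kriging or sequential Gaussian simulation (step 3).
```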
