Similar Articles (20 results)
1.
In the application of a conventional common-reflection-surface (CRS) stack, it is well known that only one optimum stacking operator is determined for each zero-offset sample to be simulated. As a result, conflicting dip situations are not taken into account and only the most prominent event contributes to any particular stack sample. In this paper, we name this phenomenon, caused by conflicting dips, the 'dip discrimination phenomenon'. It is unwelcome because it not only leads to the loss of weak reflections and of the tips of diffractions in the final zero-offset CRS stacked section but also degrades the quality of the subsequent migration. The common-reflection-surface stack with the output imaging scheme (CRS-OIS) is a novel technique that implements a CRS stack based on a unified Kirchhoff imaging approach. As far as conflicting dips are concerned, CRS-OIS is a better option than a conventional CRS stack, but we believe it can do more in this respect. In this paper, we propose a workflow to handle the dip discrimination phenomenon based on a cascaded implementation of prestack time migration, CRS-OIS and prestack time demigration. First, a common-offset prestack time migration is implemented. Then, CRS-OIS is applied to the time-migrated common-offset gather. Afterwards, a prestack time demigration reconstructs each unmigrated common-offset gather with its reflections greatly enhanced and its diffractions well preserved. Compared with existing techniques for conflicting dips, the technique presented here preserves most of the diffractions and properly accounts for reflections from all possible dips. More importantly, both the post-stack and prestack data sets are of much better quality after the presented scheme is applied. It serves as a promising alternative to other techniques, except that it cannot provide the typical CRS wavefield attributes. Numerical tests on a synthetic Marmousi data set and a real 2D marine data set demonstrate its effectiveness and robustness.

2.
Fluid depletion within a compacting reservoir can lead to significant stress and strain changes and potentially severe geomechanical issues, both inside and outside the reservoir. We extend previous research on time-lapse seismic interpretation by incorporating synthetic near-offset and full-offset common-midpoint reflection data, using anisotropic ray tracing to investigate uncertainties in time-lapse seismic observations. The time-lapse seismic simulations use dynamic elasticity models built from hydro-geomechanical simulation output and a stress-dependent rock physics model. The reservoir model is a conceptual two-fault graben reservoir in which we allow the fault fluid-flow transmissibility to vary from high to low, simulating non-compartmentalized and compartmentalized reservoirs, respectively. The results indicate that time-lapse seismic amplitude changes and travel-time shifts can be used to qualitatively identify reservoir compartmentalization. Due to the high repeatability and good quality of the time-lapse synthetic dataset, the estimated travel-time shifts and amplitude changes for near-offset data match the true model subsurface changes with minimal errors. A 1D velocity-strain relation was used to estimate the vertical velocity change at the reservoir bottom interface by applying zero-offset time shifts from both the near-offset and full-offset measurements. For near-offset data, the estimated P-wave velocity changes were within 10% of the true value. For full-offset data, however, time-lapse attributes are quantitatively reliable with standard time-lapse seismic methods only when an updated velocity model is used rather than the baseline model.
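The zero-offset time-shift inversion at the heart of such an analysis reduces, in 1D, to the standard relation dt/t = dz/z - dv/v; a minimal sketch of that relation (function names are ours, not from the paper):

```python
def fractional_time_shift(dz_over_z, dv_over_v):
    """Fractional zero-offset time shift of a 1D layer:
    dt/t = dz/z - dv/v (thickness change minus velocity change)."""
    return dz_over_z - dv_over_v

def velocity_change_from_shift(t0, dt, dz_over_z=0.0):
    """Invert the same relation for the fractional velocity change dv/v,
    given an observed time shift dt on an event at two-way time t0."""
    return dz_over_z - dt / t0

# Example: a -1 ms shift on a 1 s event with negligible strain implies a
# +0.1% velocity increase (a speed-up shortens the traveltime).
dvv = velocity_change_from_shift(t0=1.0, dt=-0.001)
```

In practice the strain term dz/z comes from the geomechanical model, which is why the estimate degrades when the velocity model used for the time shifts is wrong.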

3.
4.
Picking seismic reflection traveltimes is the first step in reflection traveltime tomography. This paper presents an interactive, multi-domain method for picking reflection traveltimes in the common-shot, common-receiver, common-midpoint and common-offset domains. By analysing the character of the seismic records in each domain, the most suitable domain is selected for picking reflection events, and manual and automatic picking are combined in an interactive environment to improve both the accuracy and the efficiency of the picks. Software for multi-domain display of seismic data and interactive multi-domain traveltime picking was implemented with Qt. Picking results for synthetic and field seismic records show that the software is flexible and convenient to operate and performs well on reflection traveltime picking for complex seismic data.  相似文献

5.
Decomposing seismic data into local slopes is the basic idea behind velocity-independent imaging. Using accurate moveout approximations enables computing moveout attributes, such as the normal-moveout velocity and nonhyperbolic parameters, as functions of zero-offset travel time. The mapping of moveout attributes is performed from the prestack seismic data domain into the time-migrated image domain. For a given moveout approximation, the different attributes have different accuracy, depending on the corresponding order of the travel-time derivative: the most accurate attribute is the zero-offset travel time, and the nonhyperbolic parameter has the worst accuracy, regardless of the moveout approximation. Typically, the mapping of moveout attributes is performed with a point-to-point procedure, whereas the generalized moveout approximation requires two point-to-point mappings. Testing the attribute mapping on different models shows that the accuracy of the mapped attributes is model dependent, whereas the generalized moveout approximation gives practically exact results.
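Under the hyperbolic moveout assumption, the attribute mapping has a simple closed form: a pre-stack sample (t, x) and its local slope p = dt/dx determine both the zero-offset time and the normal-moveout velocity. A sketch of this velocity-independent mapping (our notation; the higher-order nonhyperbolic attributes discussed in the abstract are omitted):

```python
import math

def attributes_from_slope(t, x, p):
    """Velocity-independent mapping: a pre-stack sample at time t, offset x,
    with local slope p = dt/dx, maps to zero-offset time t0 and NMO
    velocity v under hyperbolic moveout t^2 = t0^2 + x^2/v^2
    (so that p = x / (t * v^2))."""
    t0 = math.sqrt(t * (t - x * p))
    v = math.sqrt(x / (t * p))
    return t0, v

# Consistency check on a synthetic hyperbola (t0 = 1 s, v = 2000 m/s):
t0_true, v_true, x = 1.0, 2000.0, 1000.0
t = math.sqrt(t0_true ** 2 + (x / v_true) ** 2)
p = x / (t * v_true ** 2)       # analytic slope of the hyperbola at x
t0_est, v_est = attributes_from_slope(t, x, p)
```

Because the slope itself is measured from the data, no velocity picking is needed; this is why the zero-offset time (a first-derivative attribute) is more robust than the higher-derivative nonhyperbolic parameter.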

6.
In common-reflection-surface imaging, the reflection arrival-time field is parameterized by operators of higher dimension or order than in conventional methods. Using the common-reflection-surface approach locally in the unmigrated prestack data domain opens a potential for trace regularization and interpolation. In most data-interpolation methods based on local coherency estimation, a single operator is designed for a target sample and the output amplitude is defined as a weighted average along the operator. This approach may fail in the presence of interfering events or strong amplitude and phase variations. In this paper we introduce an alternative scheme in which no operator needs to be defined at the target sample itself. Instead, the amplitude at a target sample is constructed from multiple operators estimated at different positions, so that one operator may contribute to the construction of several target samples and, vice versa, a target sample may receive contributions from different operators. Operators are determined on a grid that can be sparser than the output grid, which dramatically decreases the computational cost. In addition, the use of multiple operators for a single target sample stabilizes the interpolation results and implicitly allows several contributions in the case of interfering events. Due to the considerable computational expense, common-reflection-surface interpolation is limited to subsets of the prestack data. We present the general workflow of a common-reflection-surface-based regularization/interpolation for 3D data volumes. This workflow has been applied to an OBC common-receiver volume and to binned common-offset subsets of a 3D marine data set. The impact of the common-reflection-surface regularization is demonstrated by means of a subsequent time migration. In comparison with the time migrations of the original and DMO-interpolated data, the results show particular improvement in the continuity of reflection events. This gain is confirmed by automatic picking of a horizon in the stacked time migrations.

7.
The common focal point (CFP) method and the common reflection surface (CRS) stack method are compared. The CRS method is a fast, highly automated procedure that provides high S/N ratio simulation of zero-offset (ZO) images by combining, per image point, the reflection energy of an arc segment that is tangential to the reflector. It uses smooth parametrized two-way stacking operators, based on a data-driven triplet of attributes in 2D (eight parameters in 3D). As a spin-off, the attributes can be used for several applications, such as the determination of the geometrical spreading factor, multiple prediction, and tomographic inversion into a smooth background velocity model. The CFP method aims at decomposing two-way seismic reflection data into two full-aperture one-way propagation operators. By applying an iterative updating procedure in a half-migrated domain, it provides non-smooth focusing operators for prestack imaging using only the energy from one focal point at the reflector. The data-driven operators inhibit all propagation effects of the overburden. The CFP method provides several spin-offs, amongst which is the CFP matrix related to one focal point, which displays the reflection amplitudes as measured at the surface for each source–receiver pair. The CFP matrix can be used to determine the specular reflection source–receiver pairs and the Fresnel zone at the surface for reflection in one single focal point. Other spin-offs are the prediction of internal multiples, the determination of reflectivity effects, velocity-independent redatuming and tomographic inversion to obtain a velocity–depth model. The CFP method is less fast and less automated than the CRS method. From a pointwise comparison of features it is concluded that one method is not a subset of the other, but that both methods can be regarded as being to some extent complementary.

8.
Stereotomography is a relatively new seismic reflection tomography method that can provide a reasonably accurate macro-velocity model for prestack depth migration. This paper studies its implementation, including the picking of slope and traveltime data, the construction and initialisation of the discrete velocity model, the determination of ray parameters, the computation of slopes, traveltimes and rays, and the solution of the inverse problem, and establishes a complete stereotomographic workflow. Using experiments on the Marmousi model, the main parameters that control a stereotomographic run, such as the initial velocity model, the amount of picked data, the discretisation grid size and the velocity smoothing weights, are tested and analysed, and the influence of each parameter on the inversion result is summarised to guide practical applications.  相似文献

9.
Conventional seismic data processing based on post-stack time migration has played an important role in coal exploration for decades. However, post-stack time migration often yields low-quality images in complex geological environments. In order to obtain high-quality images, we present a strategy that applies Kirchhoff prestack time migration (PSTM) to coal seismic data. In this paper, we describe the implementation of Kirchhoff PSTM for a 3D coal seam and derive the corresponding 3D Kirchhoff PSTM processing workflow. The processing sequence includes two major steps: 1) estimation of the 3D root-mean-square (RMS) velocity field; 2) Kirchhoff prestack time migration itself. During the construction of the 3D velocity model, the dip-moveout velocity serves as the initial migration velocity field, and 3D Kirchhoff PSTM is combined with continuous adjustment of the 3D RMS velocity field under the criterion of flattened common-reflection-point gathers. In comparison with post-stack time migration, the application of 3D Kirchhoff PSTM to coal seismic data produces better images of the coal-seam reflections.
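The kernel of a Kirchhoff prestack time migration is the double-square-root traveltime computed from the RMS velocity field; a minimal sketch (names are illustrative; xs and xr are horizontal distances from the image point to the source and receiver):

```python
import math

def dsr_traveltime(t0, xs, xr, v_rms):
    """Double-square-root diffraction traveltime used by Kirchhoff prestack
    time migration: the source and receiver legs are computed separately
    from the vertical two-way time t0 and the RMS velocity at the image
    point, then summed to give the time at which the input trace is read."""
    half = t0 / 2.0
    t_src = math.sqrt(half ** 2 + (xs / v_rms) ** 2)
    t_rec = math.sqrt(half ** 2 + (xr / v_rms) ** 2)
    return t_src + t_rec

# Zero offset collapses to t0; a symmetric split-spread pair adds moveout.
t_zero = dsr_traveltime(2.0, 0.0, 0.0, 2500.0)
t_off = dsr_traveltime(2.0, 1000.0, 1000.0, 2500.0)
```

Flattening of common-reflection-point gathers, used in the abstract as the velocity-update criterion, corresponds to this operator predicting the same t0 for all offsets once v_rms is correct.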

10.
Estimation of elastic properties of rock formations from surface seismic amplitude measurements remains a subject of interest for the exploration and development of hydrocarbon reservoirs. This paper develops a global inversion technique to estimate and appraise 1D distributions of compressional-wave velocity, shear-wave velocity and bulk density from normal-moveout-corrected PP prestack surface seismic amplitude measurements. Specific objectives are: (a) to evaluate the efficiency of the minimization algorithm; (b) to appraise the impact of various data misfit functions; and (c) to assess the effect of the degree and type of smoothness criterion enforced by the inversion. Numerical experiments show that very fast simulated annealing is the most efficient minimization technique among the alternative approaches considered for global inversion. It is also found that an adequate choice of data misfit function is necessary for a reliable and efficient match of noisy and sparse seismic amplitude measurements. Several procedures are considered to enforce smoothness of the estimated 1D distributions of elastic parameters, including predefined quadratic measures of length, flatness and roughness. Based on this general analysis of global inversion techniques, we introduce a new stochastic inversion algorithm that initializes the search for the minimum with constrained random distributions of elastic parameters and enforces predefined autocorrelation functions (semivariograms). This strategy readily lends itself to the assessment of model uncertainty. The new global inversion algorithm is successfully tested on noisy synthetic amplitude data and is computationally more efficient than the alternative global inversion procedures considered here. Moreover, we present a feasibility analysis of the resolution and uncertainty with which prestack seismic amplitude data can infer 1D distributions of elastic parameters measured with wireline logs in the deepwater Gulf of Mexico.
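Very fast simulated annealing, the minimizer the numerical experiments favour, draws moves from a temperature-dependent Cauchy-like distribution and cools rapidly; a self-contained 1D sketch (our parameter choices and cooling constants, not the paper's):

```python
import math, random

def vfsa_perturb(x, lo, hi, T, rng):
    """Ingber-style VFSA move: draw y in [-1, 1] from a temperature-
    dependent Cauchy-like density, then scale it to the model bounds."""
    u = rng.random()
    y = math.copysign(T * ((1.0 + 1.0 / T) ** abs(2.0 * u - 1.0) - 1.0),
                      u - 0.5)
    return min(hi, max(lo, x + y * (hi - lo)))

def vfsa_minimize(f, lo, hi, t0=1.0, c=0.01, n_iter=500, seed=0):
    """Minimise f on [lo, hi]; cooling T_k = t0*exp(-c*k) (1D model, D=1)."""
    rng = random.Random(seed)
    x = 0.5 * (lo + hi)
    fx = f(x)
    best, fbest = x, fx
    for k in range(n_iter):
        T = t0 * math.exp(-c * k)
        cand = vfsa_perturb(x, lo, hi, T, rng)
        fc = f(cand)
        # Metropolis acceptance: always take improvements, sometimes worse.
        if fc < fx or rng.random() < math.exp(-(fc - fx) / T):
            x, fx = cand, fc
        if fx < fbest:
            best, fbest = x, fx
    return best, fbest

best, fbest = vfsa_minimize(lambda m: (m - 3.0) ** 2, 0.0, 10.0)
```

At high temperature the moves span the whole search interval; as T falls, the same distribution concentrates near the current model, which is what makes the method efficient relative to classical annealing.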

11.
Most seismic processing algorithms consider the sea surface a flat reflector. However, marine seismic data are often acquired in weather conditions where this approximation is inaccurate. The distortion of the seismic wavelet introduced by a rough sea may influence, for example, deghosting results, as deghosting operators are typically recursive and sensitive to changes in the seismic signal. In this paper, we study the effect of sea-surface roughness on conventional (5–160 Hz) and ultra-high-resolution (200–3500 Hz) single-component towed-streamer data. To this end, we numerically simulate reflections from a rough sea surface using the Kirchhoff approximation. Our modelling demonstrates that, for the conventional seismic frequency band, sea roughness can distort the results of standard one-dimensional and two-dimensional deterministic deghosting. To mitigate this effect, we introduce regularisation and optimisation based on a minimum-energy criterion and show that this improves the processing output significantly. Analysis of ultra-high-resolution field data in conjunction with modelling shows that even a relatively calm sea state (e.g., a 15 cm wave height) introduces significant changes in the seismic signal in the ultra-high-frequency band. These changes in amplitude and arrival time may degrade deghosting results. Using the field dataset, we show how minimum-energy optimisation of the deghosting parameters improves the processing result.
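The regularised deterministic deghosting divide that such a scheme builds on can be sketched in a few lines (a flat-sea, 1D approximation with illustrative parameter names; the eps term stabilises the division near the ghost notches):

```python
import cmath

def deghost_spectrum(spectrum, freqs, depth, r=0.98, c=1500.0, eps=1e-3):
    """Stabilised 1D deterministic deghosting: divide each frequency sample
    by the ghost operator G(f) = 1 - r*exp(-2j*pi*f*tau), where
    tau = 2*depth/c is the two-way delay to the sea surface. The small eps
    regularises the division near the ghost notches, where |G| -> 1 - r."""
    tau = 2.0 * depth / c
    out = []
    for s, f in zip(spectrum, freqs):
        g = 1.0 - r * cmath.exp(-2j * cmath.pi * f * tau)
        out.append(s * g.conjugate() / (abs(g) ** 2 + eps))
    return out

# Example: for a 7.5 m receiver depth the ghost notches sit at 0 and 100 Hz;
# deghosting a ghosted flat spectrum at 50 Hz recovers a value close to 1.
g50 = 1.0 - 0.98 * cmath.exp(-2j * cmath.pi * 50.0 * 0.01)
recovered = deghost_spectrum([g50], [50.0], depth=7.5)[0]
```

A rough sea perturbs the effective depth and reflection coefficient trace by trace, which is why a fixed flat-sea operator like this one leaves residuals that the minimum-energy optimisation of the parameters is meant to reduce.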

12.
In this case study we consider the seismic processing of a challenging land data set from the Arabian Peninsula. It suffers from rough top-surface topography, a strongly varying weathering layer, and complex near-surface geology. We aim at establishing a new seismic imaging workflow, well-suited to these specific problems of land data processing. This workflow is based on the common-reflection-surface stack for topography, a generalized high-density velocity analysis and stacking process. It is applied in a non-interactive manner and provides an entire set of physically interpretable stacking parameters that include and complement the conventional stacking velocity. The implementation introduced combines two different approaches to topography handling to minimize the computational effort: after initial values of the stacking parameters are determined for a smoothly curved floating datum using conventional elevation statics, the final stack and also the related residual static correction are applied to the original prestack data, considering the true source and receiver elevations without the assumption of nearly vertical rays. Finally, we extrapolate all results to a chosen planar reference level using the stacking parameters. This redatuming procedure removes the influence of the rough measurement surface and provides standardized input for interpretation, tomographic velocity model determination, and post-stack depth migration. The methodology of the residual static correction employed and the details of its application to this data example are discussed in a separate paper in this issue. In view of the complex near-surface conditions, the imaging workflow that is conducted, i.e. stack – residual static correction – redatuming – tomographic inversion – prestack and post-stack depth migration, leads to a significant improvement in resolution, signal-to-noise ratio and reflector continuity.

13.
Liu Guochang and Li Chao, 《地球物理学报》 (Chinese Journal of Geophysics), 2020, 63(4): 1569–1584
The quality factor Q, which describes seismic attenuation, is important for seismic data processing and reservoir characterisation. In exploration seismology, Q is generally estimated from vertical seismic profile (VSP) data or surface seismic data. Because prestack surface seismic data involve complex ray paths and are affected by noise and tuning interference, accurately estimating Q from prestack data is relatively difficult. Starting from seismic ray propagation, and using the mapping between the local slope of an event and its ray parameter, this paper brings the spectra of waveforms along multiple rays simultaneously into the spectral-ratio method for joint inversion, and proposes a velocity-independent prestack Q-estimation method based on multi-ray joint inversion. The method avoids the influence of velocity on Q estimation through the local-slope attribute: the local slope carries the velocity information of wave propagation, and seismic reflections with the same local slope share the same ray parameter. The local slope of an event is an attribute of the data domain, whereas velocity is a model-domain parameter; using a data-domain attribute allows the joint inversion to be applied directly to the data, without a further conversion through velocity, which improves the accuracy of the Q estimate. The method also uses predictive mapping to map non-zero-offset reflection information to zero offset, so that Q values corresponding to zero-offset traveltimes are obtained. Synthetic and field examples verify the effectiveness of the method.  相似文献
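The core of the spectral-ratio method that this multi-ray scheme extends is a straight-line fit of the log spectral ratio against frequency; a minimal single-ray sketch (our names; dt is the traveltime difference between the two analysis windows):

```python
import math

def q_from_spectral_ratio(freqs, amp1, amp2, dt):
    """Spectral-ratio Q estimate: fit ln(A2/A1) = a + b*f by least squares
    over the usable band, then convert the slope via Q = -pi*dt/b, where
    dt is the traveltime difference between the two analysis windows."""
    y = [math.log(a2 / a1) for a1, a2 in zip(amp1, amp2)]
    n = len(freqs)
    fm = sum(freqs) / n
    ym = sum(y) / n
    b = (sum((f - fm) * (v - ym) for f, v in zip(freqs, y))
         / sum((f - fm) ** 2 for f in freqs))
    return -math.pi * dt / b

# Synthetic check: a flat source spectrum attenuated with Q = 80 over 0.5 s.
freqs = [10.0 + 2.0 * i for i in range(26)]                  # 10-60 Hz
amp1 = [1.0] * len(freqs)
amp2 = [math.exp(-math.pi * f * 0.5 / 80.0) for f in freqs]
q_est = q_from_spectral_ratio(freqs, amp1, amp2, dt=0.5)
```

The multi-ray idea in the paper amounts to fitting many such ratios jointly, with the local slope supplying each ray's parameter so that no velocity model enters the fit.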

14.
Seismic data acquired along rugged topographic surfaces present well-known problems for seismic imaging. In conventional processing, datum statics are approximated under the surface-consistent assumption that all seismic rays travel vertically in the top layer, so the datum static for each trace is a constant. Where this assumption does not apply, non-constant statics are required. The common-reflection-surface (CRS) stack for rugged surface topography can deal with this non-vertical static issue: it handles the surface elevation as a coordinate component and treats elevation variations in the sense of directional datuming. In this paper I apply the CRS stack method to a synthetic data set that simulates acquisition along an irregular surface topography. After the CRS stack, a simple algorithm based on the wavefield attributes redatums the CRS stack section to an arbitrarily chosen planar surface. The redatumed section simulates a stack section whose acquisition surface is the chosen planar surface.

15.
Optimization of sub-band coding method for seismic data compression
Seismic data volumes, which require huge transmission capacities and massive storage media, continue to increase rapidly due to acquisition of 3D and 4D multiple streamer surveys, multicomponent data sets, reprocessing of prestack seismic data, calculation of post-stack seismic data attributes, etc. We consider lossy compression as an important tool for efficient handling of large seismic data sets. We present a 2D lossy seismic data compression algorithm, based on sub-band coding, and we focus on adaptation and optimization of the method for common-offset gathers. The sub-band coding algorithm consists of five stages: first, a preprocessing phase using an automatic gain control to decrease the non-stationary behaviour of seismic data; second, a decorrelation stage using a uniform analysis filter bank to concentrate the energy of seismic data into a minimum number of sub-bands; third, an iterative classification algorithm, based on an estimation of variances of blocks of sub-band samples, to classify the sub-band samples into a fixed number of classes with approximately the same statistics; fourth, a quantization step using a uniform scalar quantizer, which gives an approximation of the sub-band samples to allow for high compression ratios; and fifth, an entropy coding stage using a fixed number of arithmetic encoders matched to the corresponding statistics of the classified and quantized sub-band samples to achieve compression. Decompression basically performs the opposite operations in reverse order. We compare the proposed algorithm with three other seismic data compression algorithms. The high performance of our optimized sub-band coding method is supported by objective and subjective results.
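The analysis/quantise/synthesis core of such a sub-band coder can be illustrated with the simplest possible two-band (Haar) filter bank and a uniform scalar quantiser (a toy stand-in for the uniform filter bank, classification and arithmetic coding described in the abstract):

```python
def haar_analysis(x):
    """One-level two-band analysis: average (low) and difference (high)
    sub-bands; a toy stand-in for the uniform analysis filter bank."""
    lo = [(x[2 * i] + x[2 * i + 1]) / 2.0 for i in range(len(x) // 2)]
    hi = [(x[2 * i] - x[2 * i + 1]) / 2.0 for i in range(len(x) // 2)]
    return lo, hi

def quantize(band, step):
    """Uniform scalar quantiser: samples -> integer code indices,
    which an entropy coder would then compress."""
    return [round(v / step) for v in band]

def dequantize(codes, step):
    return [i * step for i in codes]

def haar_synthesis(lo, hi):
    """Perfect-reconstruction synthesis of the two sub-bands."""
    out = []
    for a, d in zip(lo, hi):
        out.extend([a + d, a - d])
    return out

# Round trip: analysis -> quantise -> (entropy coding omitted) -> synthesis.
x = [1.0, 2.0, 3.0, 5.0]
lo, hi = haar_analysis(x)
recon = haar_synthesis(dequantize(quantize(lo, 0.25), 0.25),
                       dequantize(quantize(hi, 0.25), 0.25))
```

The compression ratio is controlled by the quantiser step: a coarser step shrinks the entropy of the code indices at the price of reconstruction error, which is the lossy trade-off the paper optimises per sub-band class.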

16.
In 2017, the Metal Earth multi-disciplinary exploration project acquired a total of 921 km of regional deep seismic reflection profiles and 184 km of high-resolution seismic reflection profiles in the Abitibi and Wabigoon greenstone belts of the Superior province of Canada. The Abitibi belt hosts several world-class mineral deposits, whereas the Wabigoon has sparse economic mineral deposits. Two high-resolution surveys in the Swayze area, a poorly endowed part of the western Abitibi greenstone belt, served as pioneer surveys with which to better understand the subsurface geology and to design a strategy for processing the other surveys. The Swayze seismic data were acquired with crooked survey geometries along roads. Designing an effective processing flow for these geometries and the complex geology required straight common-midpoint lines, along which both two-dimensional prestack dip-moveout correction and poststack migration were applied. The resulting seismic sections reveal steeply dipping and subhorizontal reflections, some of which correlate with folded surface rocks. An interpreted fault/deformation zone imaged in Swayze north would be a target for metal endowment if it extends the Porcupine–Destor structure. Because of the crooked-line geometry of the surveys, two-dimensional/three-dimensional prestack time migration and swath three-dimensional processing were also tested. The prestack time migration algorithm confirmed reflections at the interpreted base of the Abitibi greenstone belt. The swath three-dimensional images provided additional spatial detail about the geometries of some reflections, but had lower resolution and did not detect many reflectors observed in two dimensions. Geological contacts between felsic, mafic and ultramafic greenstone rock layers are thought to be the main cause of reflectivity in the Swayze area.

17.
Data interpolation is an important step in seismic data analysis because many processing tasks, such as multiple attenuation and migration, are based on regularly sampled seismic data. Failed interpolation may introduce artifacts and eventually lead to inaccurate final processing results. In this paper, we generalise seismic data interpolation as a basis-pursuit problem and propose an iterative framework for recovering missing data. The method is based on non-linear iteration and a sparse transform. A modified Bregman iteration is used to solve the constrained minimisation problem posed by compressed sensing; the new iterative strategy guarantees fast convergence by using a fixed threshold value. We also propose a generalised velocity-dependent formulation of the seislet transform as an effective sparse transform, in which the non-hyperbolic normal-moveout equation serves as a bridge between local slope patterns and moveout parameters in the common-midpoint domain; it reduces to the traditional velocity-dependent seislet when a special heterogeneity parameter is selected. The generalised velocity-dependent seislet transform predicts prestack reflection data in offset coordinates, which provides high compression of reflection events. The method was applied to synthetic and field data examples, and the results show that the generalised velocity-dependent seislet transform can reconstruct missing data with the help of the modified Bregman iteration, even for non-hyperbolic reflections under complex conditions such as vertically transverse isotropic (VTI) media or aliasing.
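The fixed-threshold iteration can be sketched with a plain Fourier transform standing in for the generalised seislet (a toy POCS-style version of the idea, not the paper's algorithm: threshold in the sparse domain, transform back, re-insert the observed samples):

```python
import cmath, math

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n)) for k in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * t / n)
                for k in range(n)).real / n for t in range(n)]

def pocs_interpolate(data, known, lam, n_iter=10):
    """Fixed-threshold iteration: zero spectral coefficients below the
    constant threshold lam, transform back, and re-insert the observed
    samples. Coherent events survive the threshold; the aliasing noise
    created by the missing traces does not, so the gaps fill in."""
    m = list(data)
    for _ in range(n_iter):
        X = [c if abs(c) >= lam else 0.0 for c in dft(m)]
        m = idft(X)
        for i, ok in enumerate(known):
            if ok:
                m[i] = data[i]
    return m

# Synthetic check: a single cosine with every 4th sample missing.
n = 32
true = [math.cos(2.0 * math.pi * 3.0 * t / n) for t in range(n)]
known = [t % 4 != 0 for t in range(n)]
data = [v if ok else 0.0 for v, ok in zip(true, known)]
recovered = pocs_interpolate(data, known, lam=8.0)
```

The seislet transform plays the same role as the DFT here but compresses curved reflection events far better, which is what lets the real method handle non-hyperbolic moveout and aliasing.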

18.
For 3-D shallow-water seismic surveys offshore Abu Dhabi, imaging the target reflectors requires high resolution, and characterisation and monitoring of hydrocarbon reservoirs by seismic amplitude-versus-offset techniques demand high pre-stack amplitude fidelity. In this region, however, it was still not clear how the survey parameters should be chosen to satisfy the required data quality. To answer this question, we applied the focal-beam method to survey evaluation and design. This subsurface- and target-oriented approach enables quantitative analysis of attributes such as the best achievable resolution and pre-stack amplitude fidelity at a fixed grid point in the subsurface for a given acquisition geometry at the surface, and it offers an efficient way to optimize the acquisition geometry for maximum resolution and minimum amplitude-versus-offset imprint. We applied it to several acquisition geometries in order to understand the effects of survey parameters such as the four spatial sampling intervals and the apertures of the template geometry. The results led to a good understanding of the relationship between the survey parameters and the resulting data quality, and to the identification of suitable survey parameters for reflection imaging and amplitude-versus-offset applications.

19.
We have developed a practical approach for updating the velocity of PS converted waves based on the inverse normal-moveout common-image-point gather obtained from prestack Kirchhoff time migration. We have integrated all the steps involved in updating the migration velocity model into an interactive tool and have applied this approach to a real seismic data set from the Alba Field in the North Sea. Based on experience in handling the real data, we discuss various practical aspects of updating the velocity model, including: what kind of initial velocity model should be used; which parameters in the velocity model should be updated; and how to update them. Application of prestack Kirchhoff time migration to the data set using the updated velocity model produces an improved image of the Alba Field.

20.
Deep seismic reflection profiling typically acquires data at large offsets in order to capture deep structure, and in the published literature such shot gathers have rarely been used to recover near-surface structure. To this end, this paper first tests a processing flow with forward modelling: the elastic wavefields of large-offset shots over a model with rugged topography are simulated with a finite-difference method; a new scheme is constructed that stacks the F-K spectra of surface-wave signals in the common-receiver domain; high-quality multi-mode surface-wave dispersion curves are extracted from the deep reflection data set; and joint inversion of the multi-mode curves yields the near-surface structure. On the basis of this flow, fundamental-mode and first-higher-mode Rayleigh-wave dispersion curves were derived from field shot gathers of the SinoProbe deep reflection profile crossing the Bangong Lake–Nujiang suture zone, and their joint inversion gave the near-surface S-wave velocity structure. The result agrees well with the P-wave velocity structure obtained from first-arrival traveltime inversion, offers higher resolution in the shallow near surface than the P-wave result, and is more consistent with existing geological knowledge. The processing flow presented here shows that the shallow near-surface S-wave velocity structure can also be obtained from deep seismic reflection shot gathers.  相似文献
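The phase-velocity scan underlying this kind of dispersion-curve extraction can be sketched with a simple phase-shift transform at a single frequency (a standard textbook construction, not the authors' F-K stacking code; all names are ours):

```python
import cmath, math

def phase_shift_dispersion(traces, offsets, dt, f, velocities):
    """Phase-shift dispersion scan at one frequency f: normalise each trace
    spectrum at the DFT bin nearest f, undo the propagation phase
    2*pi*f*x/c for each trial phase velocity c, and stack over offsets.
    The true phase velocity gives the largest coherent sum."""
    n = len(traces[0])
    k = round(f * n * dt)                       # DFT bin closest to f
    spectra = []
    for tr in traces:
        s = sum(tr[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n))
        spectra.append(s / abs(s))
    return [abs(sum(s * cmath.exp(2j * cmath.pi * f * x / c)
                    for s, x in zip(spectra, offsets)))
            for c in velocities]

# Synthetic check: a 10 Hz wave travelling at 300 m/s across 10 receivers.
f0, c0, n, dt = 10.0, 300.0, 100, 0.01
offsets = [10.0 * (i + 1) for i in range(10)]
traces = [[math.cos(2.0 * math.pi * f0 * (t * dt - x / c0)) for t in range(n)]
          for x in offsets]
power = phase_shift_dispersion(traces, offsets, dt, f0, [200.0, 300.0, 500.0])
```

Repeating the scan over many frequencies traces out the dispersion curves; in the multi-mode case each frequency can show several coherent maxima, one per mode, which is the information the joint inversion exploits.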
