Similar articles (20 results)
1.
In recent years, a variety of Marchenko methods for the attenuation of internal multiples have been developed. These methods have been extensively tested on two-dimensional synthetic data and applied to two-dimensional field data, but little is known about their behaviour on three-dimensional synthetic and field data. In particular, it is not known whether Marchenko methods are sufficiently robust for the sparse acquisition geometries found in practice. Therefore, we start by performing a series of synthetic tests to identify the key acquisition parameters and limitations that affect the result of three-dimensional Marchenko internal multiple prediction and subtraction using an adaptive double-focusing method. Based on these tests, we define an interpolation strategy and use it for the field data application. Starting from a wide-azimuth dense grid of sources and receivers, a series of decimation tests is performed until a narrow-azimuth streamer geometry remains. We evaluate the effect of the removal of sail lines, near offsets, far offsets and outer cables on the result of the adaptive double-focusing method. These tests show that our method is most sensitive to the limited aperture in the crossline direction and to the sail line spacing when applied to synthetic narrow-azimuth streamer data. The sail line spacing can be interpolated, but the aperture in the crossline direction is a limitation of the acquisition. Next, we apply the adaptive Marchenko double-focusing method to narrow-azimuth streamer field data from the Santos Basin, Brazil. Internal multiples are predicted and adaptively subtracted, thereby improving the geological interpretation of the target area.
These results imply that our adaptive double-focusing method is sufficiently robust for application to three-dimensional field data, although the key acquisition parameters and limitations will naturally differ in other geological settings and for other types of acquisition.

2.
The rough sea surface causes perturbations in the seismic data that can be significant for time-lapse studies. The perturbations arise because the reflection response of the non-flat sea perturbs the seismic wavelet. In order to remove these perturbations from the received seismic data, special deconvolution methods can be used, but these methods require, as input, the time-varying wave elevation above each hydrophone in the streamer. In addition, the vertical displacement of the streamer itself must be known at the position of each hydrophone and at all times. This information is not available in conventional seismic acquisition. However, it can be obtained from the hydrophone measurements, provided that the hydrophones are recorded individually (not grouped), that the recording bandwidth is extended down to 0.05 Hz and that data are recorded without gaps between the shot records. The sea-surface elevation, and also the wave-induced vertical displacement of the streamer, can be determined from the time-varying pressure that the sea waves cause in the hydrophone measurements. When this was done experimentally, using a single-sensor seismic streamer without a conventional low-cut filter, the wave-induced pressure variations were easily detected. The inversion of these experimental data gives results for the sea-surface elevation that are consistent with the weather and sea state at the time of acquisition. A high-tension approximation allows a simplified solution of the equations that does not require knowledge of the streamer tension. However, the best results at the tail end of the streamer are obtained using the general equation.
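The link between low-frequency hydrophone pressure and sea-surface elevation can be sketched with linear wave theory. The snippet below is a minimal first-order illustration, not the paper's method: it ignores the streamer's own vertical motion and tension (which the abstract's general equation accounts for), and all names and parameter values are my own assumptions.

```python
import numpy as np

RHO = 1025.0   # seawater density, kg/m^3 (assumed)
G = 9.81       # gravitational acceleration, m/s^2

def elevation_from_pressure(p_dyn, depth, wavenumber):
    """First-order (linear wave theory) estimate of sea-surface elevation
    from the wave-induced dynamic pressure recorded by a hydrophone towed
    at `depth` metres.  `p_dyn` is in Pa; `wavenumber` is the dominant
    swell wavenumber k in rad/m.  Linear theory gives
    p_dyn = rho * g * eta * exp(-k * depth), which we invert for eta."""
    return p_dyn / (RHO * G * np.exp(-wavenumber * depth))

# A 1 m swell (k ~ 0.04 rad/m, ~150 m wavelength) over a sensor at 8 m depth
eta_true, k, depth = 1.0, 0.04, 8.0
p = RHO * G * eta_true * np.exp(-k * depth)   # forward-model the pressure
eta_est = elevation_from_pressure(p, depth, k)
```

The depth-attenuation factor exp(-k·depth) is why the recording bandwidth must reach down to 0.05 Hz: only the long, low-frequency swell components produce measurable pressure at streamer depth.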

3.
In order to deconvolve the ghost response from marine seismic data, an estimate of the ghost operator is required. Typically, this estimate is made using a model of in-plane propagation, i.e., the ray path at the receiver falls in the vertical plane defined by the source and receiver locations. Unfortunately, this model breaks down when the source is in a crossline position relative to the receiver spread. In this situation, in-plane signals can only exist in a small region of the signal cone. In this paper, we use Bayes' theorem to model the posterior probability distribution functions for the vertical component of the ray vector, given the known source-receiver azimuth and the measured inline component of the ray vector. This provides a model for the ghost delay time based on the acquisition geometry and the dip of the wave in the plane of the streamer. The model is fairly robust with regard to the prior assumptions and is controlled by a single parameter related to the likelihood of in-plane propagation. The expected values of the resulting distributions are consistent with the deterministic in-plane model when the in-plane likelihood is high, but are valid everywhere in the signal cone. Relaxing the in-plane likelihood to a reasonable degree radically simplifies the shape of the expected-value surface, lending itself to use in deghosting algorithms. The model can also be extended to other plane-wave processing problems such as interpolation.

4.
This paper develops a wave-equation-based method for over/under streamer deghosting, deriving a frequency-wavenumber domain wave-equation formula for the extrapolation and merging of the over/under streamer wavefields. The Fourier-transform-based analytic wavefield extrapolation ensures the amplitude and phase consistency of the over and under cable data, eliminates the computational error for far-offset signals on long streamers, and is computationally efficient. The wave-equation merging of the over/under seismic wavefields effectively decouples the ghost interference, so that the over and under cable data can be used jointly to suppress the ghost. Tests on synthetic model data and field seismic data demonstrate the effectiveness of the method.

5.
A method for the interpolation of multicomponent streamer data based on the local directionality structure is presented. The derivative components are used to estimate a vector field that locally describes the direction of least variability. Given this vector field, the interpolation can be phrased in terms of the solution of a partial differential equation that describes how energy is transported between regions of missing data. The approach can be implemented efficiently using readily available routines for computer graphics. The method is robust to noise in the measurements, and particularly to the high levels of low-frequency noise present in the derivative components of multicomponent streamer data.

6.
As a special type of noise, ghosts strongly affect the waveform and bandwidth of the primaries, and ghost suppression is therefore an important factor in improving the resolution and fidelity of marine seismic data. Starting from Green's formula, this paper describes in detail a ghost suppression method based on Green's function theory, which performs data-driven deghosting without requiring any information about the subsurface medium and predicts the pressure and vertical-velocity wavefields according to a "double Dirichlet" boundary condition. A processing workflow for Green's-function-based ghost suppression is established. Numerical simulations and field data processing show that the method removes the ghost well while greatly broadening the bandwidth of the seismic data, in particular boosting the low-frequency energy, which benefits subsequent processing and interpretation.

7.
Three-dimensional receiver ghost attenuation (deghosting) of dual-sensor towed-streamer data is straightforward, in principle. In its simplest form, it requires applying a three-dimensional frequency-wavenumber filter to the vertical component of the particle motion data, to correct for the amplitude reduction on the vertical component of non-normal-incidence plane waves, before combining with the pressure data. More elaborate techniques apply three-dimensional filters to both components before summation, for example, for ghost wavelet dephasing and for the mitigation of noise of different strengths on the individual components in optimum deghosting. The problem with all these techniques is, of course, that it is usually impossible to transform the data into the crossline wavenumber domain because of aliasing. Hence, usually, a two-dimensional version of deghosting is applied to the data in the frequency-inline wavenumber domain. We investigate going down the "dimensionality ladder" one more step, to a one-dimensional weighted summation of the records of the collocated sensors, to create an approximate deghosting procedure. We specifically consider amplitude-balancing weights computed via a standard automatic gain control before summation, reminiscent of a diversity stack of the dual-sensor recordings. This technique is independent of the actual streamer depth and insensitive to variations in the sea-surface reflection coefficient. The automatic gain control weights serve two purposes: (i) to approximately correct for the geometric amplitude loss of the Z data and (ii) to mitigate noise strength variations on the two components. Here, Z denotes the vertical component of the velocity of particle motion, scaled by the seismic impedance of the near-sensor water volume. The weights are time-varying and can also be made frequency-band dependent, adapting better to frequency variations of the noise.
The investigated process is a very robust, almost fully hands-off, approximate three-dimensional deghosting step for dual-sensor data, requiring no spatial filtering and no explicit estimates of noise power. We argue that this technique performs well in terms of ghost attenuation (albeit not exact ghost removal) and of balancing the signal-to-noise ratio in the output data. For instances where full three-dimensional receiver deghosting is the final product, the proposed technique is appropriate for efficient quality control of the acquired data and for aiding the parameterisation of the subsequent deghosting processing.
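The AGC-weighted summation described above can be sketched in a few lines. This is a schematic reconstruction under stated assumptions (sliding-RMS gain, inverse-gain weights, renormalisation by the sum of weights), not the authors' exact implementation; all function names and the window length are mine.

```python
import numpy as np

def agc_gain(trace, win=250):
    """Sliding-window RMS amplitude used as an automatic gain control scalar."""
    kernel = np.ones(win) / win
    rms = np.sqrt(np.convolve(trace.astype(float) ** 2, kernel, mode="same"))
    return np.maximum(rms, 1e-12)          # guard against division by zero

def weighted_pz_sum(p, z, win=250):
    """Diversity-stack-style summation of collocated pressure (P) and
    impedance-scaled vertical particle-velocity (Z) traces: each trace is
    weighted by the inverse of its AGC gain, so the weaker or noisier
    component contributes less, and the sum is renormalised by the
    total weight."""
    wp, wz = 1.0 / agc_gain(p, win), 1.0 / agc_gain(z, win)
    return (wp * p + wz * z) / (wp + wz)
```

When P and Z carry identical traces, the weights are equal and the output reduces to the input itself; when one component is locally swamped by noise, its larger AGC gain automatically down-weights it, which is the diversity-stack behaviour the abstract describes.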

8.
CO2 has been injected into the Utsira Fm saline aquifer at the Sleipner field since 1996. In order to monitor the movement of the CO2 in the subsurface, the seventh seismic monitor survey was acquired in 2010 with dual-sensor streamers, which enabled optimal towing depths compared to previous surveys. Here we report both on the time-lapse observations and on the improved resolution compared to the conventional streamer surveys. This study shows that the CO2 is still contained in the subsurface, with no indications of leakage. The time-lapse repeatability of the dual-sensor streamer data versus conventional data is sufficient for interpreting the time-lapse effects of the CO2 at Sleipner, and the higher resolution of the 2010 survey has enabled a refinement of the interpretation of nine CO2-saturated layers, with improved thickness estimates of the layers. In particular, we have estimated the thickness of the uppermost CO2 layer based on an analysis of amplitude strength together with the time separation of the top and base of this layer, and found the maximum thickness to be 11 m. This refined interpretation gives a good baseline for future time-lapse surveys at the Sleipner CO2 injection site.

9.
In single-streamer acquisition, the use of acoustic transducers to constrain the receiver positions is not possible, implying the use of compass birds to gather information on the streamer location. The compasses are, however, sensitive to the declination of the local magnetic field of the earth, and local fluctuations not accounted for can degrade the accuracy of the reconstructed positions. In order to correct these small-scale fluctuations, we propose a simple deterministic method to calculate a spatial correction, applied to the compasses, that enhances the positioning accuracy. The local compass declination is calculated after a first reconstruction over the whole survey area. This method was applied successfully to navigation data from a 3D survey offshore Japan, and the positioning accuracy was improved to the level of the DGPS accuracy.

10.
This paper addresses two artefacts inherent to marine towed-streamer surveys: 1) ghost reflections and 2) too sparse a sampling in the crossline direction. A ghost reflection is generated when an upcoming reflection bounces off the sea surface back into the sensors and can, in principle, be removed by decomposing the measured wavefield into its up- and downgoing constituents. This process requires a dense sampling of the wavefield in both directions, along and perpendicular to the streamers. A dense sampling in the latter direction is, however, often impossible due to economic and operational constraints. Recent multi-component streamers have been designed to record the spatial gradients in addition to the pressure, which not only benefits the wavefield decomposition but also permits a lower-than-Nyquist sampling rate of the pressure. In this paper, wavefield reconstruction and deghosting are posed as a joint inverse problem. We present two approaches to establishing a system matrix that embeds both a deghosting and an interpolation operator: the first is derived with a ghost model, whereas the second is derived without one. Incorporating a ghost model allows an even lower sampling rate but relies on a more restrictive assumption about the sea surface.
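The idea of a single system matrix that embeds both a deghosting and an interpolation operator can be illustrated with a toy one-dimensional example. Everything here is an illustrative assumption of mine (trace length, ghost delay, reflection coefficient, and a low-order cosine basis standing in for the bandlimitation that regularises sub-Nyquist reconstruction); it is not the paper's operator.

```python
import numpy as np

n, delay, r = 64, 6, -0.95   # trace length, ghost delay (samples), sea-surface reflectivity

# Ghost operator: data = upgoing + r * (upgoing delayed by `delay` samples)
G = np.eye(n) + r * np.eye(n, k=-delay)

# Sampling operator: keep every third sample (a sparse-crossline analogue)
S = np.eye(n)[::3]

# Bandlimited signal model: a low-order cosine basis makes the problem well-posed
m = 12
t = np.arange(n)
F = np.cos(np.outer(t, np.arange(m)) * np.pi / n)

rng = np.random.default_rng(0)
up_true = F @ rng.standard_normal(m)      # true (bandlimited) upgoing wavefield
d = S @ (G @ up_true)                     # decimated, ghosted recording

# One system matrix chains sampling, ghost and signal model;
# solving it jointly deghosts and interpolates in a single step.
A = S @ G @ F
coef, *_ = np.linalg.lstsq(A, d, rcond=None)
up_est = F @ coef
```

Because the ghost model adds rows of independent information about the same bandlimited signal, the joint system tolerates a coarser sampling than interpolation alone, which mirrors the abstract's point about the ghost-model variant permitting a lower sampling rate.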

11.
When anomalous gravity gradient signals provide a large signal-to-noise ratio, airborne and marine surveys can be considered with wide line spacing. In these cases, spatial resolution and sampling requirements, rather than anomaly detectability, become the limiting factors for specifying the line spacing. This situation is analysed by generating known signals from a geological model and then sub-sampling them using a simulated airborne gravity gradient survey with a line spacing much wider than the characteristic anomaly size. The data are processed using an equivalent-source inversion, which is used subsequently to predict and grid the field in between the survey lines by means of forward calculations. Spatial and spectral error analysis is used to quantify the accuracy and resolution of the processed data, and the advantages of acquiring multiple gravity gradient components are demonstrated. With measurements of the full tensor along survey lines spaced at 4 × 4 km, it is shown that the vertical gravity gradient can be reconstructed accurately over a bandwidth of 2 km, with spatial root-mean-square errors of less than 30%. A real airborne full-tensor gravity gradient survey is presented to confirm the synthetic analysis in a practical situation.

12.
3D surface-related multiple elimination based on sparse inversion
3D surface-related multiple elimination is an important research topic in the preprocessing of marine seismic data. The wave-theory-based 3D surface-related multiple elimination method (3D SRME) is data-driven and can, in theory, effectively suppress the surface multiples in seismic data from complex structures. However, the method places high demands on the acquisition of the original data, which makes it difficult to apply widely in field data processing. Based on the concept of the multiple contribution gather, this paper introduces sparse inversion into surface multiple suppression: the crossline summation is replaced by a sparse inversion, so that 3D surface multiples can be predicted without large-scale crossline reconstruction, which effectively addresses the sparse crossline sampling of real 3D seismic data. Under the assumption that the inline multiple contribution gathers are parabolic, a phase correction based on the stationary-phase principle is applied to the predicted data, to ensure that the predicted 3D surface multiples are kinematically and dynamically consistent with multiples predicted from full 3D data. Tests on synthetic models and field data show that the proposed sparse-inversion-based 3D surface multiple elimination method can effectively suppress the surface multiples in 3D seismic data from complex media even when the crossline sampling is sparse, thereby improving the signal-to-noise ratio of marine seismic data and providing reliable preprocessed data for high-resolution seismic imaging.

13.
Seismic interferometry, a powerful tool for wavefield reconstruction, has become a hot research topic in exploration geophysics in recent years. However, interferometry often introduces spurious events that degrade the quality of the reconstructed wavefield. To analyse the origin of these spurious events and to improve the reconstruction quality, this paper uses stationary-phase analysis to examine in detail the influence of five factors on wavefield reconstruction: the dominant frequency of the wavelet, the number of shots, the receiver depth, the receiver spacing and the formation dip. Model results show that factors such as the source-receiver layout and the formation dip affect the reconstruction quality by shifting the positions of the stationary-phase points within the finite source-receiver aperture. Applying interferometry to the downgoing direct wave and the downgoing reflected wave of a vertical seismic profile (VSP) effectively reconstructs the downgoing reflections from steeply dipping reflectors, converting a conventional VSP into a single-well profile (SWP). Imaging directly with the reconstructed SWP wavefield not only extends the imaging range of a conventional VSP but also avoids the statics and near-surface velocity-modelling problems faced by conventional surface exploration, providing a new approach to imaging steep structures.

14.
Repeatability of seismic data plays a crucial role in time-lapse seismic analysis. Several factors can decrease the repeatability, such as positioning errors, varying tide, source variations, velocity changes in the water layer (marine data) and undesired effects of various processing steps. In this work, the complexity of the overburden structure is studied as an inherent parameter that can affect the repeatability. A multi-azimuth three-dimensional vertical-seismic-profiling data set with 10 000 shots is used to study the relationship between overburden structure and the repeatability of seismic data. In most repeatability studies, two data sets are compared, but here a single data set has been used, because a significant proportion of the 10 000 shots are so close to each other that a repeatability-versus-positioning-error analysis is possible. We find that the repeatability decreases by a factor of approximately 2 under an overburden lens. Furthermore, we find that the X- and Y-components have approximately the same sensitivity to positioning errors as the Z-component (for the same events) in this three-dimensional vertical-seismic-profiling experiment. This indicates that, in an area with complex overburden, positioning errors between monitor and base seismic surveys are significantly more critical than outside such an area. This study is based on a three-dimensional three-component vertical-seismic-profiling data set from a North Sea reservoir, and care should be taken when extrapolating these observations into a general four-dimensional framework.

15.
A suite of three tests was performed to characterize the signal fidelity of OBC 4C acquisition systems. The test methodology was to evaluate individual sensor stations by acquiring source lines that were parallel to the in-line and cross-line horizontal sensors, and source lines that were at 45° to them. This technique provides constant-azimuth gathers with a uniform offset range and removes issues associated with source array directivity. Characterization of the test data identified the frequency content of the geophone signals, and the correlation between the vertical and cross-line geophones, as the most sensitive indicators of signal infidelity. In the former case, stations with questionable signal fidelity exhibited a very reverberatory signal, most evident on the cross-line sensor. In the latter case, when normalized cross-correlation coefficients are computed in a moving window, the cross-line sensor and the vertical sensor are highly correlated, beginning several hundred milliseconds after the first arrivals. These characteristics can be exploited to identify stations with questionable signal fidelity programmatically. One means of identifying questionable stations is to compute the histogram of the instantaneous frequency: the frequency distributions from questionable stations are unambiguously distinguishable from those of stations that exhibit better signal fidelity. It was noted that signal fidelity spanned a range from acceptable to poor. To characterize the signal fidelity of an acquisition system adequately, the number of test samples must be statistically significant.
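The moving-window correlation attribute used above as a fidelity indicator can be sketched as follows. The window and step lengths are arbitrary choices of mine, not the values used in the tests.

```python
import numpy as np

def moving_ncc(a, b, win=200, step=100):
    """Normalised zero-lag cross-correlation of two traces in sliding
    windows -- a simple QC attribute: channels whose vertical and
    cross-line components are persistently highly correlated after the
    first arrivals can be flagged as having questionable fidelity."""
    coeffs = []
    for start in range(0, a.size - win + 1, step):
        wa = a[start:start + win] - a[start:start + win].mean()
        wb = b[start:start + win] - b[start:start + win].mean()
        denom = np.sqrt((wa @ wa) * (wb @ wb))
        coeffs.append((wa @ wb) / denom if denom > 0 else 0.0)
    return np.array(coeffs)
```

A station could then be flagged programmatically when, say, the median coefficient after the first-arrival window exceeds a threshold; the threshold itself would have to be calibrated against known good stations.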

16.
We built a five-component (5C) land seismic sensor that measures both the three-component (3C) particle acceleration and two vertical gradients of the horizontal wavefield through a pair of 3C microelectromechanical accelerometers. The sensor is a small cylindrical device planted vertically just below the earth's surface. We show that acquiring and processing 5C sensor data has the potential to replace conventional seismic acquisition with analogue geophone groups by single 5C sensors placed at the same station interval, when combined with a suitable aliased-ground-roll attenuation algorithm. The 5C sensor therefore allows for sparser, more efficient data acquisition. The accuracy of the 5C sensor wavefield gradients depends on the 3C accelerometers: their sensitivity, self-noise and separation. These sensor component specifications are derived from various modelling studies. The design principles of the 5C sensor are validated using test data from purpose-built prototypes. The final prototype was constructed with a pair of 3C accelerometers separated by 20 cm and with a self-noise of 35 ng/√Hz. Results from a two-dimensional seismic line show that the seismic image of 5C sensor data, with ground roll attenuated using the 5C sensor gradient data, was comparable to simulated analogue group data, the standard in the industry. This field example shows that ground roll aliased by up to a factor of three was attenuated. The 5C sensor also allows for correcting vertical-component accelerometer data for sensor tilt. It is shown that a vertical-component sensor misaligned with the vertical direction by 10° introduces an error in the seismic data of around -20 dB with respect to the seismic signal, which can be fully corrected.
Advances in sensor specifications and processing algorithms are expected to lead to even more effective ground roll attenuation, enabling a reduction in receiver density, and hence a smaller number of sensors to be deployed, thereby improving operational efficiency while maintaining image quality.
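The tilt correction mentioned in the abstract amounts to rotating the measured components back to the earth frame once the tilt angle is known (the 3C accelerometers can provide it via the gravity vector). A two-dimensional sketch with my own toy geometry and amplitudes, not the sensor's actual calibration:

```python
import numpy as np

def rot(theta_deg):
    """2-D rotation matrix."""
    t = np.radians(theta_deg)
    return np.array([[np.cos(t), -np.sin(t)],
                     [np.sin(t),  np.cos(t)]])

tilt = 10.0                             # degrees off vertical
true_vz_vh = np.array([1.0, 0.4])       # assumed (vertical, horizontal) ground motion

# A sensor tilted by `tilt` records the field projected onto its own axes
recorded = rot(-tilt) @ true_vz_vh

# Error on the raw "vertical" channel before correction, in dB relative to signal
err_db = 20 * np.log10(abs(recorded[0] - true_vz_vh[0]) / abs(true_vz_vh[0]))

# Tilt correction: rotate the recorded components back to the earth frame
corrected = rot(tilt) @ recorded
```

With these assumed amplitudes the raw-channel error comes out in the same tens-of-dB-down range as the roughly -20 dB figure quoted in the abstract (the exact value depends on the horizontal-to-vertical energy ratio), and the rotation recovers the true field exactly once the tilt is known.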

17.
During the time taken for seismic data to be acquired, reservoir pressure may fluctuate as a consequence of field production and operational procedures, and fluid fronts may move significantly. These variations prevent accurate quantitative measurement of the reservoir change using 4D seismic data. Modelling studies on the Norne field simulation model, using acquisition data from ocean-bottom seismometer and towed-streamer systems, indicate that the pre-stack intra-survey reservoir fluctuations are important and cannot be neglected. Similarly, the time-lapse seismic image in the post-stack domain does not represent a difference between two states of the reservoir at a unique base and monitor time, but is a mixed version of reality that depends on the sequence and timing of seismic shooting. The outcome is a lack of accuracy in the measurement of reservoir changes using the resulting processed and stacked 4D seismic data. Even for perfect spatial repeatability between surveys, a spatially variant noise floor is still anticipated to remain. For our particular North Sea acquisition data, we find that the towed-streamer data are more affected than the ocean-bottom seismometer data. We think that this may be typical for towed streamers, due to their restricted aperture compared to ocean-bottom seismometer acquisitions, even for a favourable time sequence of shooting and spatial repeatability. Importantly, the pressure signals on the near- and far-offset stacks commonly used in quantitative 4D seismic inversion are found to be inconsistent due to the acquisition timestamp. Saturation changes at the boundaries of fluid fronts appear to show a similar inconsistency across sub-stacks. We recommend that 4D data be shot in a consistent manner to optimize areal time coverage, and, additionally, that the timestamp of the acquisition be used to optimize pre-stack quantitative reservoir analysis.

18.
Elastic imaging from ocean-bottom cable (OBC) data can be challenging because it requires the prior estimation of both compressional-wave (P-wave) and shear-wave (S-wave) velocity fields. Seismic interferometry is an attractive technique for processing OBC data because it performs model-independent redatuming, retrieving 'pseudo-sources' at the positions of the receivers. The purpose of this study is to investigate multicomponent applications of interferometry for processing OBC data. This translates into using interferometry to retrieve pseudo-source data on the sea-bed, not only for multiple suppression but also for obtaining P-wave, converted P-to-S-wave (PS-wave) and possibly pure-mode S-wave data. We discuss scattering-based, elastic interferometry with synthetic and field OBC datasets. Conventional and scattering-based interferometry integrands computed from a synthetic are compared to show that the latter yields little anti-causal response. A four-component (4C) pseudo-source response retrieves pure-mode S-reflections as well as P- and PS-reflections. Pseudo-source responses observed in OBC data are related to P-wave conversions at the seabed rather than to true horizontal or vertical point forces. For a Gulf of Mexico OBC data set, the diagonal components of a nine-component pseudo-source response demonstrate that the P-wave to S-wave velocity ratio (VP/VS) at the sea-bed is an important factor in the conversion of P to S for obtaining the pure-mode S-wave reflections.

19.
The rough-sea reflection response varies (1) along the streamer, (2) from shot to shot and (3) with time along the seismic trace. The resulting error in seismic data can be important for time-lapse imaging. One potential way of reducing the rough-sea receiver error is to use conventional statistical deconvolution, but special care is needed in the choice of the design and application windows. The well-known deconvolution problem associated with the non-whiteness of the reflection series is exacerbated by the requirement of an unusually short design window, a requirement imposed by the non-stationary nature of the rough-sea receiver wavelet. For a synthetic rough-sea data set with a white 1D reflection series, the design window needs to be about 1000 ms long, with an application window about 400 ms long centred within it. Although such a short design window allows the deconvolution operator to follow the time variation of the rough-sea wavelet, it is likely to be too short to prevent the non-whiteness of the geology from corrupting the operator when it is used on real data. If finely spatially sampled traces are available from the streamer, the design window can be extended to neighbouring traces, making use of the spatial correlations of the rough-sea wavelet. For this 'wave-following' approach to be fruitful, the wind (and hence the dominant wave direction) needs to be roughly along the line of the streamer.
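A time-varying statistical deconvolution with a long design window and a shorter, centred application window can be sketched as below. This is a generic Wiener spiking-deconvolution skeleton under my own assumptions: a 4 ms sample interval (so 250 and 100 samples correspond to the 1000 ms and 400 ms windows mentioned above), and an arbitrary operator length and prewhitening level. It is not the authors' code.

```python
import numpy as np

def design_decon(trace, nop=40, prewhite=0.01):
    """Least-squares spiking-deconvolution operator designed from the
    autocorrelation of `trace` (the design window)."""
    ac = np.correlate(trace, trace, mode="full")[trace.size - 1:][:nop]
    ac[0] *= 1.0 + prewhite          # white-noise stabilisation
    # Toeplitz normal equations, solved for a spike at zero lag
    R = np.array([[ac[abs(i - j)] for j in range(nop)] for i in range(nop)])
    rhs = np.zeros(nop)
    rhs[0] = ac[0]
    return np.linalg.solve(R, rhs)

def windowed_decon(trace, design_len=250, apply_len=100, nop=40):
    """Time-varying deconvolution: each operator is designed on a long
    design window but applied only to a shorter window centred inside it,
    so the operator can track a non-stationary wavelet.  Samples outside
    the covered application windows are left at zero in this sketch."""
    out = np.zeros_like(trace, dtype=float)
    half = (design_len - apply_len) // 2
    for start in range(0, trace.size - design_len + 1, apply_len):
        op = design_decon(trace[start:start + design_len], nop)
        seg = np.convolve(trace, op)[start + half:start + half + apply_len]
        out[start + half:start + half + apply_len] = seg
    return out
```

Extending the design window to neighbouring traces, as the 'wave-following' approach suggests, would amount to summing the autocorrelations of several adjacent traces before solving the same normal equations.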

20.
4D seismic is widely used to remotely monitor fluid movement in subsurface reservoirs. This technique is especially effective offshore, where high survey repeatability can be achieved. It comes as no surprise that the first 4D seismic survey that successfully monitored a CO2 sequestration process was recorded offshore, in the Sleipner field, North Sea. In land projects, poor repeatability of the seismic data due to a low S/N ratio often obscures the time-lapse seismic signal; hence, for a successful onshore monitoring programme, improving seismic repeatability is essential. Stage 2 of the CO2CRC Otway project involves the injection of a small amount (around 15,000 tonnes) of a CO2/CH4 gas mixture into a saline aquifer at a depth of approximately 1.5 km. Previous studies at this site showed that seismic repeatability is relatively low due to variations in weather conditions, near-surface geology and farming activities. In order to improve time-lapse seismic monitoring capabilities, a permanent receiver array can be utilised to improve the signal-to-noise ratio and hence the repeatability. A small-scale trial of such an array was conducted at the Otway site in June 2012. A set of 25 geophones was installed in 3 m deep boreholes, in parallel with the same number of surface geophones. In addition, four geophones were placed into boreholes at depths of 1-12 m. In order to assess the gain in signal-to-noise ratio and repeatability, both active and passive seismic surveys were carried out. The surveys were conducted in relatively poor weather conditions, with rain, strong wind and thunderstorms. Even with such an amplified background noise level, we found that the noise level for the buried geophones was on average 20 dB lower than for the surface geophones. The levels of repeatability for the borehole geophones, estimated around the direct wave, the reflected wave and the ground roll, are twice as high as for the surface geophones.
Both borehole and surface geophones produce the best repeatability in the 30-90 Hz frequency range. The influence of burial depth on S/N ratio and repeatability shows that a significant improvement in repeatability is already reached at a depth of 3 m; the level of repeatability remains relatively constant between 3 and 12 m.
