Similar documents
A total of 20 similar documents were found.
1.
We propose a deblending workflow comprising rank-reduction filtering followed by a signal-enhancing process. This methodology preserves coherent subsurface reflections while removing incoherent and interference noise. In pseudo-deblended data, the blending noise appears as coherent events, whereas in any other data domain (i.e. common receiver, common midpoint and common offset) it appears incoherent and is regarded as an outlier. To perform signal deblending, a robust implementation of rank-reduction filtering, referred to as a joint sparse and low-rank approximation, is employed to eliminate the blending noise. Deblending via rank-reduction filtering gives a reasonable result with a sufficient signal-to-noise ratio. However, for land data acquired using unconstrained simultaneous shooting, rank-reduction-based deblending alone does not completely attenuate the interference noise, and a considerable amount of signal leakage is observed in the residual component, which can affect further data processing and analyses. In this study, we propose a deblending workflow consisting of a rank-reduction filter followed by post-processing steps comprising a nonlinear masking filter and a local orthogonalization weight application. Although each application leaves a few footprints of leaked signal energy, the proposed combined workflow restores the signal energy from the residual component, achieving a significant signal-to-noise ratio enhancement. These hierarchical schemes are applied to land simultaneous-shooting data sets and produce cleaner, more reliable deblended data ready for further data processing.
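
One building block named here, the local orthogonalization weight, can be sketched in a few lines. The following Python/NumPy snippet is only a minimal illustration of the idea, not the authors' implementation; the window size, regularization and function names are assumptions. It estimates smooth local weights from the residual and the deblended signal and moves the leaked energy back.

    import numpy as np
    from scipy.ndimage import uniform_filter

    def local_orthogonalization(signal, residual, win=(25, 9), eps=1e-10):
        """Return leakage-corrected signal and residual.

        signal, residual : 2D arrays (time x trace), e.g. a deblended gather
        and the part removed by the rank-reduction filter. The local weight
        w ~ <residual, signal> / <signal, signal>, smoothed over a window,
        measures how much signal leaked into the residual (illustrative sketch)."""
        num = uniform_filter(residual * signal, size=win)
        den = uniform_filter(signal * signal, size=win) + eps
        w = num / den
        leaked = w * signal              # signal energy hiding in the residual
        return signal + leaked, residual - leaked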

2.
In this paper, we compare denoising-based and inversion-based deblending methods using Stolt migration operators. We use the Stolt operator as a kernel to efficiently compute the apex-shifted hyperbolic Radon transform. Sparsity-promoting transforms, such as the Radon transform, can focus seismic data into a sparse model to separate signals, remove noise or interpolate missing traces. Therefore, Radon transforms are a suitable tool for either denoising-based or inversion-based deblending. Denoising-based deblending treats blending interference as random noise by sorting the data into new gathers, such as common-receiver gathers; in these gathers, the blending interference exhibits a random structure due to the randomization of the source firing times. Alternatively, inversion-based deblending treats the blending interference as signal, and the transform models this signal by incorporating the blending operator to formulate an inversion problem. We compare both methods using a robust inversion algorithm with sparse regularization. Results from synthetic and field data examples show that inversion-based deblending can produce more accurate signal separation for highly blended data.

3.
魏亚杰  张盼  许卓 《地球物理学报》2019,62(10):4000-4009
Compared with conventional seismic data acquisition, simultaneous-source (blended) acquisition greatly improves acquisition efficiency but introduces blending noise, which strongly degrades the accuracy of imaging results. For 2D blended data, the blending noise is usually suppressed by exploiting the fact that it appears incoherent in non-common-shot domains, thereby achieving source separation. Compared with 2D blended data, 3D blended data are characterized by a huge data volume, difficulty in constructing the blending operator, and stronger blending noise caused by the higher blending fold. To address these problems, this paper performs blended-data separation in the Radon domain with a sparsity-constrained inversion method, which yields more accurate separation results when the blending noise is strong. Using the GPS firing times of the sources, the separation result of the previous iteration is blended and pseudo-deblended within each common-receiver gather by means of a long (continuous) record, so that each common-receiver gather is blended and pseudo-deblended on its own; this avoids operating on the entire data set and does not require constructing a blending operator. Synthetic and field data examples verify the applicability of the method.
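
The long-record trick used per common-receiver gather can be illustrated with a short sketch. The code below is a hypothetical NumPy illustration (array shapes, argument names and the assumption that every shifted trace fits inside the long record are ours, not taken from the paper): it blends one common-receiver gather into a continuous trace using the firing times and then cuts it back into shot-aligned traces, i.e. a per-gather blend/pseudo-deblend pair.

    import numpy as np

    def blend_long_record(crg, fire_samples, nt_long):
        """Blend one common-receiver gather into a single continuous trace.
        crg : (nt, nshot) traces of one receiver; fire_samples : firing time of
        each shot in samples (assumed to satisfy t0 + nt <= nt_long)."""
        nt, nshot = crg.shape
        long_rec = np.zeros(nt_long)
        for i in range(nshot):
            t0 = fire_samples[i]
            long_rec[t0:t0 + nt] += crg[:, i]
        return long_rec

    def pseudo_deblend(long_rec, fire_samples, nt):
        """Cut the continuous record back into shot-aligned traces; each trace
        now contains its own shot plus interference from overlapping shots."""
        return np.stack([long_rec[t0:t0 + nt] for t0 in fire_samples], axis=1)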

4.
We introduce a concept of generalized blending and deblending, develop its models and, accordingly, establish a method of deblended-data reconstruction using these models. The generalized models can handle real situations by including random encoding in the generalized operators, both in the space and time domains and at both the source and receiver sides. We consider an iterative optimization scheme using a closed-loop approach with the generalized blending and deblending models, in which the former performs the forward modelling and the latter the inverse modelling in the closed loop. We applied our method to existing real data acquired in Abu Dhabi. The results show that our method succeeded in fully reconstructing deblended data even from fully generalized, and thus quite complicated, blended data. We discuss the effect of the complexity of the blending properties on the deblending performance. In addition, we discuss the applicability to time-lapse seismic monitoring, as the method ensures high repeatability of the surveys. In conclusion, we can acquire blended data and reconstruct deblended data without serious problems, while retaining the benefits of blended acquisition.
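
A generic closed-loop iteration of this kind can be sketched as follows. This is a minimal sketch under stated assumptions, not the authors' algorithm: the blending and pseudo-deblending operators are supplied by the caller, and a simple shrinking amplitude threshold stands in for the actual coherence-promoting constraint.

    import numpy as np

    def closed_loop_deblend(d_blend, blend, adjoint, n_iter=50, thresh0=0.9):
        """Closed-loop iterative deblending sketch.
        d_blend : blended (continuous) data; blend / adjoint : user-supplied
        forward (blending) and adjoint (pseudo-deblending) operators."""
        x = np.zeros_like(adjoint(d_blend))
        for k in range(n_iter):
            residual = d_blend - blend(x)          # data not yet explained
            update = adjoint(residual)             # map residual to shot domain
            x = x + update
            tau = thresh0 * (1.0 - k / n_iter) * np.max(np.abs(x))
            x = np.where(np.abs(x) > tau, x, 0.0)  # keep strong, coherent energy
        return x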

5.
With the development and spread of high-density and deep seismic exploration, the volume of acquired data has increased sharply. Conventional acquisition methods can no longer efficiently meet the needs of such large-data-volume exploration projects, and highly efficient seismic acquisition has become a necessity. Blended acquisition is a relatively new high-efficiency acquisition method developed in recent years. Based on a survey of the commonly used mixed-source and simultaneous-source methods, this paper summarizes and proposes a new concept of blended acquisition. A theoretical forward model is built from this concept to simulate the differences between blended and conventional acquisition and the influence of different parameters on the blending results, and relevant conclusions are drawn. A verification field test was carried out in the Langgu Sag in eastern North China. The Langgu Sag lies in the North China Plain, with a dense road network and dense villages, a large proportion of missing shots and very strong ambient noise, all of which strongly affect data quality; the field-test results are broadly consistent with the simulation results, with some differences. Unlike the regular blending patterns used previously, this study introduced arbitrary random blending with different source signatures in the test and obtained some new insights. The results show that blended acquisition can significantly improve production efficiency; the noise introduced by blending can be removed in different data domains; blended acquisition requires a suitable spatial interval; the quality of blended data is slightly lower than that of conventional acquisition, but this can be compensated by increasing acquisition density and production efficiency; and the choice of blending parameters must balance operational efficiency, noise level and data quality.
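
The kind of forward simulation described here, blending conventional records numerically to study the effect of acquisition parameters, can be sketched simply. The snippet below is a hypothetical illustration (pairwise blending, a uniform random dither and the array layout are our assumptions, not the paper's setup).

    import numpy as np

    def simulate_blended_records(shots, max_dither, seed=None):
        """Numerically blend conventional shot records in pairs with random
        time dithers, mimicking simultaneous shooting.
        shots : (nshot, nt, nrec) conventional records."""
        rng = np.random.default_rng(seed)
        nshot, nt, nrec = shots.shape
        blended = []
        for i in range(0, nshot - 1, 2):
            dither = rng.integers(1, max_dither)
            rec = np.zeros((nt + max_dither, nrec))
            rec[:nt] += shots[i]
            rec[dither:dither + nt] += shots[i + 1]   # second source fires later
            blended.append(rec)
        return np.stack(blended)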

6.
A marine source generates both a direct wavefield and a ghost wavefield. This is caused by the strong surface reflectivity, resulting in a blended source array, the blending process being natural. The two unblended response wavefields correspond to the real source at the actual location below the water level and to the ghost source at the mirrored location above the water level. As a consequence, deghosting becomes deblending ('echo-deblending') and can be carried out with a deblending algorithm. In this paper we present source deghosting by an iterative deblending algorithm that properly includes the angle dependence of the ghost: it represents a closed-loop, non-causal solution. The proposed echo-deblending algorithm is also applied to the detector deghosting problem. The detector cable may be slanted, and shot records may be generated by blended source arrays, the blending being created by simultaneous sources. Similar to surface-related multiple elimination, the method is independent of the complexity of the subsurface; only what happens at and near the surface is relevant. This means that the actual sea state may cause the reflection coefficient to become frequency dependent, and the water velocity may not be constant due to temporal and lateral variations in pressure, temperature and salinity. As a consequence, we propose that estimation of the actual ghost model should be part of the echo-deblending algorithm. This is particularly true for source deghosting, where the interaction of the source wavefield with the surface may be far from linear. The echo-deblending theory also shows how multi-level source acquisition and multi-level streamer acquisition can be numerically simulated from standard acquisition data. The simulated multi-level measurements increase the performance of the echo-deblending process. The output of the echo-deblending algorithm on the source side consists of two ghost-free records: one generated by the real source at the actual location below the water level and one generated by the ghost source at the mirrored location above the water level. If we apply our algorithm at the detector side as well, we end up with four ghost-free shot records. All these records are input to migration. Finally, we demonstrate that the proposed echo-deblending algorithm is robust to background noise.
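
The angle dependence of the ghost that the algorithm must honour is conveniently expressed in the frequency-wavenumber domain, where the recorded field is the sum of the real and mirrored sources. The sketch below computes the classical angle-dependent ghost filter; it illustrates the forward model only, not the iterative echo-deblending algorithm itself, and the parameter names and the idealized frequency-independent reflection coefficient r are our assumptions.

    import numpy as np

    def ghost_operator(nf, nk, dt, dx, depth, c=1500.0, r=-1.0):
        """Angle-dependent ghost filter G(f, kx) = 1 + r * exp(-2j * kz * depth),
        with kz = sqrt((w/c)^2 - kx^2): the mirrored source is delayed by the
        vertical two-way path 2 * depth * cos(theta) / c."""
        f = np.fft.rfftfreq(nf, dt)                 # temporal frequencies (Hz)
        kx = np.fft.fftfreq(nk, dx) * 2 * np.pi     # horizontal wavenumbers
        w = 2 * np.pi * f[:, None]
        kz2 = (w / c) ** 2 - kx[None, :] ** 2
        kz = np.sqrt(np.maximum(kz2, 0.0))          # keep propagating part only
        return 1.0 + r * np.exp(-2j * kz * depth)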

7.
Within the field of seismic data acquisition with active sources, the technique of acquiring simultaneous data, also known as blended data, offers operational advantages. The preferred processing of blended data starts with a deblending step, that is, the separation of the data acquired by the different sources, to produce data that mimic a conventional seismic acquisition and can be effectively processed by standard methods. Recently, deep-learning methods based on deep neural networks have been applied to the deblending task with promising results, in particular using an iterative approach. We propose an enhancement to deblending with an iterative deep neural network, whereby we modify the training stage of the network in order to achieve better performance through the iterations. We refer to the method that uses only the blended data as input as the general training method. Our new multi-data training method allows the deep neural network to be trained on a data set whose input patches are composed of blended data, noisy data with low-amplitude crosstalk noise, and unblended data, which improves the ability of the network to remove crosstalk noise and preserve weak signals. Based on such an extended training data set, the multi-data training method embedded in the iterative separation framework can produce different outputs at different iterations and converge to the best result in fewer iterations. Transfer learning can further improve the generalization and separation efficacy of our proposed method for deblending simultaneous-source data. Our method is tested on two synthetic and two field data sets to demonstrate its effectiveness and superiority in deblending simultaneous-source data compared with the general training method, a generic noise-attenuation network and low-rank matrix-factorization methods.
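
The iterative separation framework that wraps the trained network can be sketched generically. This is a minimal sketch under stated assumptions (the blending and pseudo-deblending operators and the trained denoiser are placeholders supplied by the caller); it shows the loop structure only, not the multi-data training itself.

    import numpy as np

    def iterative_dnn_deblend(d_blend, blend, pseudo_deblend, denoiser, n_iter=5):
        """At every pass the network removes crosstalk from the current
        pseudo-deblended estimate, while the data residual keeps the result
        consistent with the recorded blended data.
        denoiser : callable mapping a noisy gather to a crosstalk-free gather."""
        x = np.zeros_like(pseudo_deblend(d_blend))
        for _ in range(n_iter):
            residual = d_blend - blend(x)           # unexplained blended data
            x_noisy = x + pseudo_deblend(residual)  # crosstalk re-appears here
            x = denoiser(x_noisy)                   # network suppresses crosstalk
        return x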

8.
Hard-rock seismic exploration normally has to deal with rather complex geological environments. These environments are usually characterized by a large number of local heterogeneities (e.g., faults, fracture zones and steeply dipping interfaces). The seismic data from such environments often have a poor signal-to-noise ratio because of the complexity of hard-rock geology. To obtain reliable images of subsurface structures under such geological conditions, processing algorithms that are capable of handling seismic data with a low signal-to-noise ratio are required for reflection seismic exploration. In this paper, we describe a modification of the 3D Kirchhoff post-stack migration algorithm that utilizes coherency attributes obtained by a 3D diffraction-imaging algorithm to steer the main Kirchhoff summation. The application to a 3D synthetic model shows the stability of the presented steered migration in the presence of a high level of random noise. A test on a 3D seismic volume acquired at a mine site in Western Australia demonstrates the capability of the approach to image steep and sharp objects, such as fracture and fault zones, and lateral heterogeneities.
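
The steering idea, down-weighting low-coherency contributions inside the diffraction summation, can be shown on a toy 2D zero-offset example. The sketch below assumes a constant velocity and a precomputed coherency section; it is only an illustration of coherency-weighted Kirchhoff summation, not the authors' 3D implementation.

    import numpy as np

    def steered_kirchhoff_2d(data, coher, dt, dx, dz, v, nz):
        """Toy 2D post-stack Kirchhoff migration: every sample added along the
        diffraction curve is weighted by a coherency attribute in [0, 1].
        data, coher : (nt, nx) stacked section and its coherency section."""
        nt, nx = data.shape
        image = np.zeros((nz, nx))
        x = np.arange(nx) * dx
        cols = np.arange(nx)
        for iz in range(1, nz):
            z = iz * dz
            for ix in range(nx):
                # two-way zero-offset diffraction traveltime from point (x, z)
                t = 2.0 * np.sqrt(z ** 2 + (x - x[ix]) ** 2) / v
                it = np.rint(t / dt).astype(int)
                ok = it < nt
                image[iz, ix] = np.sum(coher[it[ok], cols[ok]] *
                                       data[it[ok], cols[ok]])
        return image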

9.
Blended acquisition along with efficient spatial sampling is capable of providing high-quality seismic data in a cost-effective and productive manner. While deblending and data reconstruction conventionally accompany this way of acquiring data, the recorded data can also be processed directly to estimate subsurface properties. We establish a workflow to design survey parameters that account for the source blending as well as the spatial sampling of sources and detectors. The proposed method involves an iterative scheme to derive the survey design leading to optimum reflectivity and velocity estimation via joint migration inversion. In the workflow, we extend the standard implementation of joint migration inversion to cope with data acquired in a blended fashion along with irregular detector and source geometries. This makes a direct estimation of reflectivity and velocity models feasible without the need for deblending or data reconstruction. During the iterations, the errors in the reflectivity and velocity estimates are used to update the survey parameters by integrating a genetic algorithm and a convolutional neural network. Bio-inspired operators enable the simultaneous update of the blending and sampling operators. To relate the choice of survey parameters to the performance of joint migration inversion, we utilize a convolutional neural network whose architecture discards suboptimal solutions among the newly generated ones and carries the optimal ones forward to the subsequent step, which improves the efficiency of the proposed approach. The resulting acquisition scenario yields a notable enhancement in both reflectivity and velocity estimation attributable to the choice of survey parameters.

10.
Multi-source seismic technology is an efficient seismic acquisition method that requires a group of blended seismic data to be separated into single-source seismic data for subsequent processing. The separation of blended seismic data is a linear inverse problem. According to the relationship between the number of recordings and the number of simultaneous sources in the acquisition system, this separation is divided into an even-determined or overdetermined linear inverse problem, which is relatively easy to solve, and an underdetermined linear inverse problem, which is difficult to solve. For the latter, this paper presents an optimization method that imposes a sparsity constraint on the wavefields to construct the objective function of the inversion, and the problem is solved using an iterative thresholding method. For the most extreme underdetermined separation problem, with a single recording and multiple sources, this paper presents a method of pseudo-deblending with random-noise filtering: approximate common-shot gathers are obtained through the pseudo-deblending process, and the random noise that appears when these approximate common-shot gathers are sorted into common-receiver gathers is eliminated by filtering. The proposed separation methods are applied to three types of numerically simulated data, namely noise-free data, data with random noise, and data with linear coherent noise, and satisfactory results are obtained. The noise-suppression performance of these methods is sufficient, particularly for single-recording blended seismic data, which verifies the effectiveness of the proposed methods.
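
A sparsity-constrained separation of this type is commonly solved with an iterative shrinkage-thresholding scheme. The snippet below is a generic ISTA sketch under stated assumptions (the forward operator, its adjoint, the step size and the regularization weight are placeholders); it is not the paper's specific solver.

    import numpy as np

    def soft(x, tau):
        """Soft-thresholding (proximal operator of the L1 norm)."""
        return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

    def ista_deblend(d, forward, adjoint, lam=0.05, step=1.0, n_iter=100):
        """Minimize ||d - forward(m)||^2 + lam * ||m||_1, where m collects the
        single-source wavefields in a sparsifying domain and forward() applies
        the inverse transform followed by the blending operator."""
        m = np.zeros_like(adjoint(d))
        for _ in range(n_iter):
            grad = adjoint(forward(m) - d)
            m = soft(m - step * grad, lam * step)
        return m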

11.
Simultaneous interpolation and denoising in the separation of blended data
In recent years, the emerging blended acquisition geometry has attracted wide interest from researchers and oil companies because it greatly improves acquisition efficiency. In practice, however, the acquisition quality of this special geometry is affected by many factors. First, the shot records acquired with this geometry are contaminated by interference from neighbouring shots; second, complex acquisition environments leave some traces missing from the seismic records; in addition, ambient site noise during acquisition inevitably introduces random noise. All of these degrade the acquisition quality. Although several researchers have studied these factors, they have analysed them separately rather than considering all interference factors together. Based on the principle of sparsity-constrained inversion, this paper integrates the separation of blended shot data, the interpolation of missing traces and the suppression of random noise, reducing the adverse influence of all three factors simultaneously in a single processing step, which improves the signal-to-noise ratio and greatly increases the processing efficiency of seismic data. The new method is validated on both synthetic and field data with satisfactory results.
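
The one-step formulation amounts to composing the blending operator with a trace-sampling mask into a single forward operator and inverting it with a sparse regularizer. The sketch below is a hypothetical illustration of that composition (operator and argument names are ours); the resulting callables could be handed to a generic sparse solver such as the ISTA loop sketched for entry 10.

    import numpy as np

    def make_joint_forward(blend, blend_adj, sample_mask):
        """Compose the physics into one operator: the unknown clean, fully
        sampled, unblended data are first blended and then decimated by the
        acquisition mask (1 = live trace, 0 = dead/missing trace). Random noise
        is whatever the sparse model does not explain, so inverting this single
        operator deblends, interpolates and denoises at once."""
        def forward(x):
            return sample_mask * blend(x)
        def adjoint(y):
            # the sampling mask is diagonal (its own adjoint); blend_adj is the
            # pseudo-deblending operator supplied by the caller
            return blend_adj(sample_mask * y)
        return forward, adjoint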

12.
13.
Simultaneous-source acquisition, also referred to as "blended acquisition", involves recording two or more shots simultaneously. It allows denser spatial sampling and can greatly speed up field data acquisition, and thus has the potential to improve seismic data quality and reduce acquisition cost. In order to achieve the goals of blended acquisition, a deblending procedure is necessary: it attenuates the interference and thereby improves the resolution of the pre-stack time-migration image. In this paper, we propose an efficient deblending method that applies frequency-varying median and mean filters to cross-spread azimuth-offset (XSPR-AO) gathers. The method can be used with variable window sizes according to the characteristics of the interference. Its effectiveness is validated by a field data example.
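
The frequency-varying idea, filtering each frequency band with its own spatial window, can be sketched as below. This is only an illustration under stated assumptions (band limits, window lengths and the use of a plain trace-direction median filter are ours), not the paper's filter design, and it omits the mean-filter stage and the XSPR-AO sorting.

    import numpy as np
    from scipy.signal import medfilt2d

    def freq_varying_median(gather, dt, bands, windows):
        """Median filter across traces whose window length changes with
        frequency band: wider windows at low frequencies, narrower at high
        frequencies where blending noise is spikier.
        gather : (nt, ntrace); bands : list of (f_lo, f_hi) in Hz;
        windows : list of odd spatial window lengths, one per band."""
        spec = np.fft.rfft(gather, axis=0)
        freqs = np.fft.rfftfreq(gather.shape[0], dt)
        out = np.zeros_like(gather)
        for (f_lo, f_hi), win in zip(bands, windows):
            mask = (freqs >= f_lo) & (freqs < f_hi)
            band = np.fft.irfft(np.where(mask[:, None], spec, 0),
                                n=gather.shape[0], axis=0)
            out += medfilt2d(band, kernel_size=(1, win))  # median across traces
        return out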

14.
The application of blended acquisition has drawn considerable attention owing to its ability to improve operational efficiency as well as data quality and health, safety and environment performance. Furthermore, acquiring fewer data contributes to the business case, while the desired data density is still realizable via subsequent data reconstruction. The use of fewer detectors and sources also minimizes operational risks in the field. Therefore, a combined implementation of these technologies can further enhance the value of a seismic survey. One way to encourage this is to minimize any imperfection in deblending and data reconstruction during processing. In addition, one may derive survey parameters that enable a further improvement in these processes, as introduced in this study. The proposed survey-design workflow iteratively performs the following steps to derive the survey parameters responsible for source blending as well as the spatial sampling of detectors and sources. The first step is the application of blending and sampling operators to unblended and well-sampled data. We then apply closed-loop deblending and data reconstruction. The residue of this step for a given design is evaluated and subsequently used by genetic algorithms to simultaneously update the survey parameters related to both blending and spatial sampling. The updated parameters are fed into the next iteration until they satisfy the given termination criteria. We also propose a repeated encoding sequence to form the parameter sequence in the genetic algorithms, making the size of the problem space manageable. The results of the proposed workflow are outlined using blended dispersed-source-array data for different scenarios representing acquisition in marine, transition-zone and land environments. Clear differences attributable solely to the parameter design are easily recognizable. Additionally, a comparison among different optimization schemes illustrates the ability of genetic algorithms, along with a repeated encoding sequence, to find better solutions within a computationally affordable time. The optimized parameters yield a notable enhancement in deblending and data-reconstruction quality and consequently provide optimal acquisition scenarios.
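
The repeated-encoding idea can be sketched with a minimal genetic algorithm. The code below is a hypothetical illustration (binary genes, random parent pairing, one-point crossover and the fitness interface are our assumptions, not the paper's parameterization): a short chromosome is tiled over the whole survey so that the search space stays small.

    import numpy as np

    def decode_repeated(chromosome, n_total):
        """Repeated encoding sequence: a short parameter sequence is tiled
        until it covers the whole survey."""
        reps = int(np.ceil(n_total / len(chromosome)))
        return np.tile(chromosome, reps)[:n_total]

    def genetic_search(fitness, n_genes, pop_size=30, n_gen=50, pm=0.1, seed=0):
        """Minimal GA with elitism, random parent pairing, one-point crossover
        and bit-flip mutation. fitness(chromosome) should return the
        deblending/reconstruction residue of the decoded survey (lower is better)."""
        rng = np.random.default_rng(seed)
        pop = rng.integers(0, 2, size=(pop_size, n_genes))
        for _ in range(n_gen):
            scores = np.array([fitness(ind) for ind in pop])
            new_pop = [pop[np.argmin(scores)]]            # keep the best design
            while len(new_pop) < pop_size:
                a, b = pop[rng.choice(pop_size, 2, replace=False)]
                cut = rng.integers(1, n_genes)
                child = np.concatenate([a[:cut], b[cut:]])
                flip = rng.random(n_genes) < pm
                child[flip] = 1 - child[flip]
                new_pop.append(child)
            pop = np.stack(new_pop)
        return pop[np.argmin([fitness(ind) for ind in pop])]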

15.
Exploiting the advantages of higher-order statistics, namely their ability to suppress Gaussian noise and their richer information content compared with the conventional autocorrelation function, and taking into account practical problems in seismic interpretation, this paper replaces the cross-correlation function with a fourth-order moment function in the coherence computation, improves the first-generation coherence-cube algorithm, and develops a coherence-cube algorithm based on higher-order statistics. The method is not only computationally fast but also strongly suppresses noise. Compared with the conventional coherence algorithm in practical applications, the proposed algorithm effectively highlights the high lateral continuity of strata, which benefits horizon tracking and fault interpretation and is particularly useful for structural interpretation of middle and deep layers.
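
The flavour of such an attribute can be sketched as follows. The specific fourth-order cross moment and its normalization used below are our assumptions for illustration only, not the paper's definition; higher-order cumulants of Gaussian noise vanish, which motivates this family of measures.

    import numpy as np

    def fourth_moment_coherence(cube, half_win=5):
        """Coherence attribute in which the similarity of the centre trace with
        its inline and crossline neighbours is measured with a normalized
        fourth-order cross moment instead of cross-correlation.
        cube : (nt, nx, ny) post-stack amplitude volume."""
        nt, nx, ny = cube.shape
        coh = np.zeros_like(cube)
        eps = 1e-12
        for ix in range(1, nx - 1):
            for iy in range(1, ny - 1):
                for it in range(half_win, nt - half_win):
                    w = slice(it - half_win, it + half_win + 1)
                    c = cube[w, ix, iy]
                    vals = []
                    for jx, jy in ((ix + 1, iy), (ix, iy + 1)):  # inline, xline
                        n = cube[w, jx, jy]
                        m4 = np.mean(c * c * n * n)
                        norm = np.sqrt(np.mean(c ** 4) * np.mean(n ** 4)) + eps
                        vals.append(m4 / norm)
                    coh[it, ix, iy] = np.sqrt(vals[0] * vals[1])
        return coh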

16.
During seismic data acquisition, the near-surface ghost (virtual-reflection) interface strongly affects the excitation of seismic waves, and it is also one of the important factors in determining the shot depth of an explosive source. This paper discusses effective methods for determining the distance between the explosive source and the ghost interface, analyses the influence of the ghost interface on seismic excitation under complex surface conditions, and explains how to use this information to choose the optimal shot-hole depth, so as to minimize the various kinds of secondary interference generated by the source and obtain an excitation wavelet with a good frequency response, in order to acquire high-quality seismic data.
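
The effect of the distance to the ghost interface is usually reasoned about through the classical ghost response. The sketch below evaluates that textbook model (a single delayed, reflected copy with coefficient r); it is not the paper's specific procedure, and the notch-based depth criterion in the second function is only one simple, commonly used rule of thumb.

    import numpy as np

    def ghost_spectrum(depth, v, freqs, r=-1.0):
        """Amplitude response |1 + r * exp(-2j*pi*f * 2*depth/v)| of the source
        ghost for a charge placed `depth` below the ghost interface, where the
        reflected wave arrives with delay 2*depth/v and coefficient r.
        Notches sit at f = n * v / (2 * depth)."""
        delay = 2.0 * depth / v
        return np.abs(1.0 + r * np.exp(-2j * np.pi * freqs * delay))

    def first_notch(depth, v):
        """First ghost-notch frequency; keeping it above the signal band is one
        simple criterion for choosing the shot depth below the interface."""
        return v / (2.0 * depth)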

17.
For data acquired with conventional acquisition techniques, surface multiples are usually considered as noise events that obscure the primaries. However, in this paper we demonstrate that, in the case of blended acquisition, meaning that different sources shoot in a time-overlapping fashion, multiples can be used to 'deblend' the seismic measurements. We utilize the recently introduced estimation of primaries by sparse inversion (EPSI) methodology, in which the primary impulse responses are considered to be the unknowns in a large-scale inversion process. With some modifications, the EPSI method can be used for blended seismic data. As output, this process gives unblended primary impulse responses with point sources and receivers at the surface, which can be used directly in traditional imaging schemes. It turns out that extra information is needed to improve the deblending of events that do not have much associated multiple energy in the data, such as steep events at large offsets. We demonstrate that this information can be brought in during acquisition and during processing. The methodology is illustrated on 2D synthetic data.

18.
The simulation of a zero-offset (ZO) stack section from multi-coverage reflection data is a standard imaging method in seismic processing. It significantly reduces the amount of data and increases the signal-to-noise ratio due to constructive interference of correlated events. Conventional imaging methods, e.g., normal moveout (NMO)/dip moveout (DMO)/stack or pre-stack migration, require a sufficiently accurate macro-velocity model to yield appropriate results, whereas the recently introduced common-reflection-surface stack does not depend on a macro-velocity model. For two-dimensional seismic acquisition, its stacking operator depends on three wavefield attributes and approximates the kinematic multi-coverage reflection response of curved interfaces in laterally inhomogeneous media. The common-reflection-surface stack moveout formula defines a stacking surface for each particular sample in the ZO section to be simulated. The stacking surfaces that fit best to actual events in the multi-coverage data set are determined by means of coherency analysis. In this way, we obtain a coherency section and a section of each of the three wavefield attributes defining the stacking operator. These wavefield attributes characterize the curved interfaces and, thus, can be used for a subsequent inversion. In this paper, we focus on an application to a real land data set acquired over a salt dome. We propose three separate one-parametric search and coherency analyses to determine initial common-reflection-surface stack parameters. Optionally, a subsequent optimization algorithm can be performed to refine these initial parameters. The simulated ZO section obtained by the common-reflection-surface stack is compared to the result of a conventional NMO/DMO/stack processing sequence. We observe an increased signal-to-noise ratio and an improved continuity along the events for our proposed method, without loss of lateral resolution.
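
The stacking surface referred to here is the commonly published hyperbolic 2D zero-offset CRS moveout in the three attributes (emergence angle and the two wavefront radii). The sketch below evaluates that standard approximation; the function and argument names are ours.

    import numpy as np

    def crs_traveltime(xm, h, x0, t0, v0, alpha, R_N, R_NIP):
        """Hyperbolic 2D zero-offset CRS moveout for midpoint xm and
        half-offset h around the central point x0, with near-surface velocity
        v0, emergence angle alpha and wavefront radii R_N and R_NIP:
        t^2 = [t0 + 2 sin(a)/v0 * (xm - x0)]^2
              + 2 t0 cos(a)^2 / v0 * [(xm - x0)^2 / R_N + h^2 / R_NIP]."""
        dx = xm - x0
        term1 = (t0 + 2.0 * np.sin(alpha) / v0 * dx) ** 2
        term2 = 2.0 * t0 * np.cos(alpha) ** 2 / v0 * (dx ** 2 / R_N + h ** 2 / R_NIP)
        return np.sqrt(np.maximum(term1 + term2, 0.0))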

19.
Marine seismic interference noise occurs when energy from nearby marine seismic source vessels is recorded during a seismic survey. Such noise tends to be well preserved over large distances and causes coherent artefacts in the recorded data. Over the years, the industry has developed various denoising techniques for seismic interference removal, but although well performing, they are still time-consuming in use. Machine-learning-based processing represents an alternative approach, which may significantly improve the computational efficiency. In the case of conventional images, autoencoders are frequently employed for denoising purposes. However, due to the special characteristics of seismic data as well as the noise, autoencoders failed in the case of marine seismic interference noise. We, therefore, propose the use of a customized U-Net design with element-wise summation as part of the skip-connection blocks to handle the vanishing gradient problem and to ensure information fusion between high- and low-level features. To secure a realistic study, only seismic field data were employed, including 25,000 training examples. The customized U-Net was found to perform well, leaving only minor residuals, except for the case when seismic interference noise comes from the side. We further demonstrate that such noise can be treated by slightly increasing the depth of our network. Although our customized U-Net does not outperform a standard commercial algorithm in quality, it can (after proper training) read and process one single shot gather in approximately 0.02 s. This is significantly faster than any existing industry denoising algorithm. In addition, the proposed network processes shot gathers in a sequential order, which is an advantage compared with industry algorithms that typically require a multi-shot input to break the coherency of the noise.
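
The distinguishing architectural choice, additive rather than concatenated skip connections, can be illustrated with a small PyTorch sketch. The depth, channel widths and block layout below are our assumptions, not the authors' network; the point is only that encoder and decoder feature maps must share channel counts so they can be summed element-wise.

    import torch
    import torch.nn as nn

    class ConvBlock(nn.Module):
        def __init__(self, c_in, c_out):
            super().__init__()
            self.body = nn.Sequential(
                nn.Conv2d(c_in, c_out, 3, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(c_out, c_out, 3, padding=1), nn.ReLU(inplace=True))
        def forward(self, x):
            return self.body(x)

    class AdditiveUNet(nn.Module):
        """Small U-Net whose skip connections use element-wise summation
        instead of channel concatenation (illustrative sketch)."""
        def __init__(self, ch=(16, 32, 64)):
            super().__init__()
            self.enc1, self.enc2 = ConvBlock(1, ch[0]), ConvBlock(ch[0], ch[1])
            self.bottom = ConvBlock(ch[1], ch[2])
            self.pool = nn.MaxPool2d(2)
            self.up2 = nn.ConvTranspose2d(ch[2], ch[1], 2, stride=2)
            self.dec2 = ConvBlock(ch[1], ch[1])
            self.up1 = nn.ConvTranspose2d(ch[1], ch[0], 2, stride=2)
            self.dec1 = ConvBlock(ch[0], ch[0])
            self.out = nn.Conv2d(ch[0], 1, 1)
        def forward(self, x):                     # x: (batch, 1, nt, nx), sides /4
            e1 = self.enc1(x)
            e2 = self.enc2(self.pool(e1))
            b = self.bottom(self.pool(e2))
            d2 = self.dec2(self.up2(b) + e2)      # element-wise summation skip
            d1 = self.dec1(self.up1(d2) + e1)
            return self.out(d1)                   # predicted noise-free gather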

20.
In studies on heavy oil, shale reservoirs, tight gas and enhanced geothermal systems, the use of surface passive seismic data to monitor microseismicity induced by fluid flow in the subsurface is becoming more common. However, in most studies the passive seismic records contain days or months of data, and manually analysing the data can be expensive and inaccurate. Moreover, in the presence of noise, detecting the arrival of weak microseismic events becomes challenging. Hence, an automated, accurate and computationally fast technique for event detection in passive seismic data is essential. The conventional automatic event-identification algorithm computes, for each trace, a running-window energy ratio of the short-term average to the long-term average of the passive seismic data. We show that, for the common case of a low signal-to-noise ratio in surface passive records, the conventional method is not sufficiently effective at event identification. Here, we extend the conventional algorithm with a technique based on the cross-correlation of the energy ratios computed by the conventional method, with which we can measure the similarity among the energy ratios computed at different traces. Our approach improves the detectability of events with a low signal-to-noise ratio that are not detectable with the conventional algorithm. It also has the advantage of identifying whether an event is common to all stations (a regional event) or to a limited number of stations (a local event). We provide examples of applying our technique to synthetic data and to a field surface passive data set recorded at a geothermal site.
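
The two ingredients, the classic STA/LTA energy ratio and the cross-correlation of those ratios across stations, can be sketched as follows. This is a minimal illustration under stated assumptions (station 0 as the reference trace, the lag range and the normalization are ours), not the paper's exact implementation.

    import numpy as np

    def sta_lta(trace, nsta, nlta, eps=1e-12):
        """Short-term / long-term average energy ratio for one trace, with
        both windows ending at the current sample."""
        energy = trace ** 2
        csum = np.concatenate(([0.0], np.cumsum(energy)))
        sta = (csum[nsta:] - csum[:-nsta]) / nsta
        lta = (csum[nlta:] - csum[:-nlta]) / nlta
        n = min(len(sta), len(lta))
        return sta[-n:] / (lta[-n:] + eps)

    def ratio_similarity(traces, nsta, nlta, max_lag):
        """Cross-correlate each station's STA/LTA ratio with that of a
        reference station and return the peak normalized correlation within
        +/- max_lag samples; events common to many stations give consistently
        high values, which separates regional from purely local detections."""
        ratios = [sta_lta(tr, nsta, nlta) for tr in traces]
        ref = ratios[0] - np.mean(ratios[0])
        peaks = []
        for r in ratios:
            r = r - np.mean(r)
            cc = np.correlate(r, ref, mode="full")
            cc = cc / (np.linalg.norm(r) * np.linalg.norm(ref) + 1e-12)
            mid = len(cc) // 2
            peaks.append(np.max(cc[mid - max_lag: mid + max_lag + 1]))
        return np.array(peaks)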

