Similar Literature
20 similar documents found (search time: 15 ms)
1.
During field data acquisition, missing seismic traces under spatially non-uniform sampling occur frequently, and high-accuracy data reconstruction is required so that subsequent processing is not affected. However, most conventional methods can only reconstruct missing traces on a uniform spatial grid and fail on non-uniformly sampled data. Building on the multiscale, multidirectional 2D curvelet transform, this paper first introduces the non-uniform fast Fourier transform to establish a regularized inversion operator linking uniform curvelet coefficients to the non-uniformly sampled data with missing traces. Under an L1-norm constraint, the linearized Bregman method is used to invert for the uniform curvelet coefficients, and a uniform fast discrete inverse curvelet transform then yields the result, forming a high-accuracy seismic data reconstruction method based on the non-uniform curvelet transform. The method can reconstruct non-uniformly sampled, aliased missing data, is robust to noise, and can regularize non-uniform-grid data onto any specified uniform sampling grid. Tests on synthetic and field data show that its reconstruction quality is far superior to that of the non-uniform Fourier transform method, and that it can effectively guide acquisition design and reconstruction in complex areas.
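Several abstracts in this list (items 1, 3, 5, and 9) rely on the linearized Bregman iteration for L1-constrained sparse inversion. A minimal sketch of the generic iteration, with an abstract sensing matrix `A` standing in for the NUFFT-plus-curvelet operator used in the paper (all names and parameter values are illustrative, not from the paper):

```python
import numpy as np

def soft_threshold(v, t):
    """Component-wise soft thresholding, the proximal operator of the L1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def linearized_bregman(A, b, mu, n_iter=3000):
    """Sparse recovery of x with A @ x ~= b.

    v accumulates *unthresholded* gradient steps (the feature that speeds up
    convergence compared with plain iterative soft thresholding); the sparse
    estimate is x = soft_threshold(v, mu).
    """
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # safe step for convergence
    v = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft_threshold(v, mu)
        v += step * (A.T @ (b - A @ x))
    return soft_threshold(v, mu)

# Toy compressed-sensing example: a 3-sparse vector from 40 random measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100)) / np.sqrt(40)
x_true = np.zeros(100)
x_true[[5, 37, 80]] = [2.0, -1.5, 3.0]
x_rec = linearized_bregman(A, A @ x_true, mu=15.0)
```

With `mu` large enough, the limit of this iteration coincides with the basis-pursuit solution (the exact-regularization property of linearized Bregman); in the papers above, `A` combines the sampling operator with an inverse curvelet or Fourier transform.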

2.
Least squares Fourier reconstruction is basically a solution to a discrete linear inverse problem that attempts to recover the Fourier spectrum of the seismic wavefield from irregularly sampled data along the spatial coordinates. The estimated Fourier coefficients are then used to reconstruct the data on a regular grid via a standard inverse Fourier transform (inverse discrete Fourier transform or inverse fast Fourier transform). Unfortunately, this kind of inverse problem is usually under-determined and ill-conditioned. For this reason, minimum-norm least squares Fourier reconstruction adopts a damped least squares inversion to retrieve a unique and stable solution. In this work, we show how the damping can introduce artefacts in the reconstructed 3D data. To quantitatively describe this issue, we introduce the concept of an "extended" model resolution matrix, and we formulate the reconstruction problem as an appraisal problem. Through simultaneous analysis of the extended model resolution matrix and of the noise term, we discuss the limits of minimum-norm Fourier reconstruction, assess the validity of the reconstructed data, and identify the possible bias introduced by the inversion process. The analysis also guides the parameterization of the forward problem so as to minimize the occurrence of unwanted artefacts. A simple synthetic example and real data from a 3D marine common shot gather are used to illustrate our approach and to show the results of minimum-norm Fourier reconstruction.

3.
Seismic field data are often irregularly or coarsely sampled in space due to acquisition limits. However, most conventional seismic processing and imaging algorithms require complete, regularly sampled data. We have developed a fast joint curvelet-domain seismic data reconstruction method using sparsity-promoting inversion based on compressive sensing. We seek a sparse representation of the incomplete seismic data in terms of curvelet coefficients and solve the sparsity-promoting problem with an iterative thresholding process to reconstruct the missing data. In conventional iterative thresholding algorithms, the updated result at each iteration is obtained by adding the gradient to the previous result and thresholding it. The algorithm is stable and accurate but always requires many iterations. The linearised Bregman method accelerates convergence by replacing the previous result with the result before thresholding, thus promoting the effective coefficients added to the result. It is faster than the conventional algorithm, but it can cause artefacts near the missing traces when reconstructing small-amplitude coefficients, because some coefficients in the unthresholded result wrongly represent the residual of the data. The key idea of the joint curvelet-domain reconstruction method is to use the previous results of both the conventional method and the linearised Bregman method, stabilising the reconstruction quality while accelerating the recovery. The acceleration rate is controlled through weighting, which adjusts the contributions of the acceleration term and the stability term. Aggressive acceleration can be applied when recovering comparatively small gaps, whereas mild acceleration is more appropriate when the incomplete data contain a large gap in high-amplitude events. Finally, a fast and stable recovery is carried out with this trade-off algorithm. Synthetic and field data tests verify that the joint curvelet-domain reconstruction method can effectively and quickly reconstruct seismic data with missing traces.

4.
Seismic data regularization is an important step in seismic signal processing, and compressed sensing, which has attracted wide attention in recent years, has been applied to it. Compressed sensing breaks the limit of the classical Shannon-Nyquist sampling theorem and can reconstruct complete data from a small amount of acquired seismic data. The quality of compressed-sensing-based regularization depends on three main factors: besides the sparse representation of the seismic signal in a transform domain and the L1-norm reconstruction algorithm, it depends strongly on the random sparse sampling scheme for seismic traces. Although discrete-uniform-distribution random sampling of 2D seismic data has been studied, designing new sparse sampling schemes remains necessary. In this paper, we propose a Bernoulli random sparse sampling scheme, in which traces are kept according to a Bernoulli distribution, together with its jittered form. Four random sparse sampling schemes and two transforms (the Fourier and curvelet transforms) are tested on 2D synthetic data, and the resulting incomplete data are reconstructed with the L1-norm spectral projected-gradient algorithm (SPGL1). Because different random seeds produce different restriction matrices R and hence different regularization quality, ten regularization experiments are run for each scheme and each sparse sampling factor, and the mean and standard deviation of the signal-to-noise ratio (SNR) are computed. The results show that the proposed schemes perform as well as or better than the existing discrete-uniform-distribution sampling schemes.
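The two sampling schemes this abstract compares can be sketched directly. A minimal illustration, assuming the jittered form means the standard one-trace-per-bin jittered scheme (function names are illustrative):

```python
import numpy as np

def bernoulli_mask(n_traces, keep_ratio, rng):
    """Bernoulli sampling: each trace is kept independently with
    probability keep_ratio."""
    return rng.random(n_traces) < keep_ratio

def jittered_mask(n_traces, keep_ratio, rng):
    """Jittered sampling: exactly one trace is picked uniformly at random
    inside each regular bin, which bounds the maximum gap between kept
    traces while remaining random."""
    bin_size = int(round(1.0 / keep_ratio))
    mask = np.zeros(n_traces, dtype=bool)
    for start in range(0, n_traces, bin_size):
        stop = min(start + bin_size, n_traces)
        mask[rng.integers(start, stop)] = True
    return mask

rng = np.random.default_rng(42)
m_bern = bernoulli_mask(200, 0.5, rng)   # ~100 traces kept, gaps unbounded
m_jit = jittered_mask(200, 0.5, rng)     # exactly 100 kept, gaps bounded
```

The bounded maximum gap of the jittered mask is what keeps the spectral leakage noise-like rather than coherent, which is why jittered schemes tend to behave better in sparsity-promoting reconstruction.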

5.
Curvelet-domain joint iterative seismic data reconstruction based on compressive sensing. Cited by: 8 (self: 7, other: 1)
Because of constraints in the field acquisition environment, complete and regular seismic data often cannot be acquired, so seismic data reconstruction has been widely studied to support subsequent processing and interpretation. Since the introduction of compressive sensing, several iterative thresholding methods based on it have appeared, such as CRSI (Curvelet Recovery by Sparsity-promoting Inversion) and the linearized Bregman method. CRSI exploits the sparsity of seismic waveforms in the curvelet domain and recovers high-SNR data through a steepest-descent-type iteration; the iteration is stable and convergent, but converges slowly. The main difference between the linearized Bregman method and CRSI is that, at each iteration, all the energy below the threshold in the previous result is retained in the current result, which speeds up convergence; however, noise accumulates in the reconstruction as the iterations proceed, so the final result has a low SNR. Combining the strengths of these two classical methods, this paper constructs a new joint iterative framework: at each iteration, the CRSI and Bregman updates are weighted and added to the current result simultaneously, accelerating convergence in the early iterations while avoiding noise contamination in the later ones. Tests on synthetic and field data show that the proposed method converges quickly and stably and yields high-SNR reconstructions.

6.
Sparse-domain interpolation of complex seismic wavefields based on Bregman iteration. Cited by: 2 (self: 1, other: 1)
In seismic exploration, field conditions often prevent the acquisition geometry from recording a complete wavefield, so seismic data interpolation is an important processing problem; under complex structural conditions in particular, missing prestack data seriously affect subsequent high-precision processing. Compressed sensing, which originated in image acquisition, combines sparse signal representation with the solution of combinatorial optimization problems and provides an effective framework for seismic interpolation. Its key issues for complex wavefields are an optimal sparse representation and a fast, accurate iterative solver. The seislet transform is a sparse multiscale transform designed specifically for seismic wavefields that compresses seismic events effectively, and the Bregman iteration is an effective solver within sparsity-based compressed sensing; with suitable threshold parameters, the two can be combined with wave-dynamics prediction and image-transform methods into a seismic interpolation scheme. This paper casts seismic interpolation as a constrained optimization problem, chooses the OC-seislet transform, which compresses complex wavefields effectively, applies the Bregman iteration to solve the mixed-norm inverse problem in the compressed-sensing framework, and proposes an H-curve criterion for selecting the fixed threshold in the Bregman iteration, achieving fast and accurate wavefield reconstruction. Results on a synthetic model and field data verify that Bregman sparse-domain interpolation with the H-curve criterion can effectively recover the missing information of complex wavefields.

7.
Seismic data contain random noise and are affected by irregular subsampling. At present, most reconstruction methods are carried out separately from noise suppression, and most are not ideal for noisy data. In this paper, we choose the multiscale, multidirectional 2D curvelet transform to perform simultaneous data reconstruction and noise suppression of 3D seismic data. We introduce the POCS algorithm, an exponentially decreasing square-root threshold, and a soft-threshold operator to interpolate the data at each time slice, together with a weighting strategy to reduce noise in the reconstructed data, yielding a 3D simultaneous reconstruction and noise-suppression method based on the curvelet transform. Compared with reconstruction followed by denoising and with the Fourier transform, the proposed method is more robust and effective. It has important implications for data acquisition in complex areas and for reconstructing missing traces.
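The POCS-with-decreasing-threshold loop this abstract describes can be sketched compactly. A minimal sketch, using the 2D FFT as a stand-in for the curvelet transform and a plain exponentially decreasing soft threshold (the paper's exact square-root threshold schedule and weighting strategy are not reproduced here):

```python
import numpy as np

def pocs_interpolate(data, mask, n_iter=200, p_max=0.99, p_min=1e-3):
    """POCS interpolation with an exponentially decreasing soft threshold.
    mask is True at observed traces; the FFT stands in for the curvelet
    transform used in the paper."""
    x = data * mask
    t_max = np.abs(np.fft.fft2(x)).max()
    for k in range(n_iter):
        c = np.fft.fft2(x)
        # threshold decays exponentially from p_max*t_max down to p_min*t_max
        t = t_max * p_max * (p_min / p_max) ** (k / (n_iter - 1))
        mag = np.abs(c)
        c *= np.maximum(mag - t, 0.0) / np.maximum(mag, 1e-12)  # soft threshold
        x = data * mask + np.real(np.fft.ifft2(c)) * ~mask      # reinsert observed data
    return x

# Two on-grid plane waves with a third of the traces (columns) removed.
rng = np.random.default_rng(1)
nt = nx = 64
t, xg = np.arange(nt)[:, None], np.arange(nx)[None, :]
d = np.sin(2 * np.pi * (3 * t / nt + 5 * xg / nx)) \
    + 0.5 * np.sin(2 * np.pi * (7 * t / nt - 2 * xg / nx))
mask = np.ones((nt, nx), dtype=bool)
mask[:, rng.choice(nx, nx // 3, replace=False)] = False
rec = pocs_interpolate(d, mask)
```

The reinsertion step is the "projection onto the data-consistency set"; the thresholding step is the projection onto (approximate) sparsity, and lowering the threshold gradually lets weaker events enter the model as the strong ones are resolved.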

8.
Data interpolation is an important step in seismic data analysis because many processing tasks, such as multiple attenuation and migration, are based on regularly sampled seismic data. Failed interpolations may introduce artifacts and eventually lead to inaccurate final processing results. In this paper, we generalise seismic data interpolation as a basis pursuit problem and propose an iterative framework for recovering missing data. The method is based on non-linear iteration and a sparse transform. A modified Bregman iteration is used to solve the constrained minimisation problem based on compressed sensing; the new iterative strategy guarantees fast convergence by using a fixed threshold value. We also propose a generalised velocity-dependent formulation of the seislet transform as an effective sparse transform, in which the non-hyperbolic normal moveout equation serves as a bridge between local slope patterns and moveout parameters in the common-midpoint domain. It reduces to the traditional velocity-dependent seislet if a special heterogeneity parameter is selected. The generalised velocity-dependent seislet transform predicts prestack reflection data in offset coordinates, which provides high compression of reflection events. The method was applied to synthetic and field data examples, and the results show that the generalised velocity-dependent seislet transform can reconstruct missing data with the help of the modified Bregman iteration, even for non-hyperbolic reflections under complex conditions such as vertical transverse isotropic (VTI) media or aliasing.

9.
刘洋, 张鹏, 刘财, 张雅晨. 《地球物理学报》2018, 61(4): 1400-1412
Because of acquisition-geometry and economic constraints, seismic data acquired by active-source surveys are always irregularly distributed in the spatial direction. However, many seismic processing technologies (e.g., multiple attenuation, migration, and time-lapse seismic) require data that are regularly sampled in space, so data interpolation is a key step in the processing flow; failed interpolation introduces artifacts that seriously affect subsequent steps. Iterative interpolation is currently the most widely used reconstruction strategy, but conventional iterative methods often cannot guarantee interpolation accuracy and converge slowly, and in the presence of random noise there is a large SNR difference between the interpolated and original traces. Developing fast and effective iterative interpolation methods is therefore of substantial industrial value. This paper casts seismic data interpolation as a basis-pursuit problem and, within the compressed-sensing framework, proposes a new nonlinear Bregman shaping iteration to solve the constrained minimization problem, together with two matched iteration-control criteria, reconstructing the missing data through an effective sparse transform. Tests on synthetic models and field data, and comparison with conventional iterative interpolation, show that the Bregman shaping iteration recovers missing seismic information contaminated by random noise more effectively.

10.
This paper describes an effective implementation of the inverse data-space multiple elimination method in the three-dimensional (3D) curvelet domain. The method separates the surface-related operator (A) and the primaries (P0) through seismic data matrix inversion. A 3D curvelet transform is introduced to represent the seismic data sparsely in the inverse data space; its multiscale and multidirectional analysis properties make the approach suitable for obtaining an accurate solution. The L1 norm is used to promote sparseness in the transform domain, and a high-fidelity separation of the operator (A) and the primaries (P0) is realized. The proposed method is applied to synthetic data from a model containing a salt structure. We compare the results with those of the traditional inverse data-space multiple elimination method and of two-dimensional surface-related multiple elimination (SRME). The findings fully demonstrate the superiority of the proposed method over the traditional inverse method; moreover, it protects the primary energy more effectively than the SRME method.

11.
Limited by field acquisition conditions, recorded seismic data volumes are usually irregular and miss some traces. Traditional single-trace resolution-enhancement methods cannot take lateral seismic information into account, so their results suffer from spatial-consistency problems. This paper therefore proposes processing irregular seismic data in the curvelet domain: the curvelet transform provides a sparse representation of the data, and the resolution-enhancement problem is recast as a sparsity-promoting inversion under an L1-norm constraint in the curvelet domain, yielding a regularized high-resolution seismic data volume. The method avoids the limitations of traditional single-trace approaches; while enhancing resolution, it also recovers missing traces and suppresses random noise, thereby improving the completeness of the data. Tests on a model and field data verify the correctness, effectiveness, and applicability of the method.

12.
3D digital core reconstruction based on the curvelet transform and the POCS method. Cited by: 1 (self: 1, other: 0)
As shale-gas exploration and development deepen, studying the 3D spatial distribution of shale fractures has become a necessary part of shale rock physics. Because of instrument limitations, however, shale slices are discontinuous in depth, and the mismatch between the minimum vertical imaging interval and the lateral resolution of the digital core degrades fracture characterization and the accuracy of digital rock-physics modeling. To better study the 3D distribution of fractures, this paper combines the sparse curvelet transform with the projection-onto-convex-sets (POCS) iterative algorithm to reconstruct 3D digital cores. First, a 3D volume obtained by X-ray scanning of sandstone is decimated slice by slice and then reconstructed with the proposed method; the result agrees well with the complete volume and is better than the existing method (SPGL1), verifying the effectiveness and superiority of the new approach. Second, nanoscale 2D shale slices obtained by focused-ion-beam scanning electron microscopy (FIB-SEM) are densified in depth, producing a 3D digital core whose minimum vertical imaging interval is essentially consistent with the lateral resolution; the depth discontinuity of the shale slices caused by instrument limitations is reduced, and the fracture distribution becomes clearer. The reconstructions of sandstone CT images and shale FIB-SEM data verify the effectiveness and superiority of the method.

13.
In seismic exploration, ground roll seriously contaminates the effective reflections from deep subsurface structures. The traditional curvelet transform cannot provide an adaptive basis function, so its denoising results are suboptimal. In this paper, we propose a ground roll attenuation method based on the empirical curvelet transform (ECT). Unlike the traditional curvelet transform, this method not only decomposes seismic data into multiscale and multidirectional components but also provides an adaptive filter bank according to the frequency content of the seismic data itself, so ground roll can be separated. However, as the frequency contents of the reflections and the ground roll are close, we apply singular value decomposition (SVD) in the curvelet domain to better differentiate the ground roll from the reflections. Synthetic and field seismic data examples show that the proposed ECT-based method suppresses ground roll better than the traditional curvelet method.

14.
Fourier reconstruction with sparse inversion. Cited by: 2 (self: 0, other: 2)
The problem of seismic data reconstruction is posed as an inverse problem in which the objective is to obtain the Fourier coefficients that synthesize the signal. Once the coefficients have been found, they are used to reconstruct the data on a uniformly spaced grid. A non-quadratic model weight function is included to stabilize the inversion and to provide the additional information required to interpolate through gaps. In the reconstruction of a non-uniformly sampled trace, an image, and a marine 3D VSP shot record, the method shows improved reconstruction in large gaps and is less sensitive to the spatial bandwidth used in the inversion than Fourier reconstruction without the non-quadratic model weight function.

15.
Cadzow filtering is currently considered one of the most effective approaches for seismic data reconstruction. The basic version first reorders each frequency slice of the seismic data (to be reconstructed) into a block Hankel/Toeplitz matrix and then applies a rank-reduction operator, that is, truncated singular value decomposition, to the Hankel/Toeplitz matrix. However, basic Cadzow filtering cannot recover regularly missing data (up-sampling) in the presence of strongly aliased energy, because the regularly missing data mix with the signal in the singular spectrum. To solve this problem, it has been proposed to precondition the reconstruction of high-frequency components using information from the low-frequency components, which are less aliased. In this paper, we further extend the de-aliased Cadzow filtering approach to reconstruct regularly sampled seismic traces from noisy observed data by modifying the reinsertion operation, in which the high-frequency components are projected onto the subspace spanned by several singular vectors of the low-frequency components. At each iteration, the filtered data are weighted with the original data as feedback; the weighting factor is related to the background noise level and changes with iteration. The feasibility of the proposed technique is validated with two-dimensional, three-dimensional, and five-dimensional synthetic data examples, as well as two-dimensional post-stack and three-dimensional pre-stack field data examples. The results demonstrate that the proposed technique can effectively interpolate regularly sampled data and is robust in noisy environments.
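The core step of basic Cadzow filtering on a single frequency slice can be sketched as: embed the slice in a Hankel matrix, truncate its SVD to the expected number of dipping events, and average anti-diagonals to return to a trace. A minimal single-pass denoising sketch (the paper's de-aliased, iterative multichannel version is much more involved):

```python
import numpy as np

def hankel(v, L):
    """Embed a 1D frequency slice v into an L x (len(v)-L+1) Hankel matrix."""
    K = len(v) - L + 1
    return np.array([v[i:i + K] for i in range(L)])

def rank_reduce(H, rank):
    """Truncated SVD: keep only the top `rank` singular components."""
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    return (U[:, :rank] * s[:rank]) @ Vt[:rank]

def antidiagonal_average(H):
    """Map a (filtered) Hankel matrix back to a 1D slice by averaging
    anti-diagonals."""
    L, K = H.shape
    out = np.zeros(L + K - 1, dtype=H.dtype)
    cnt = np.zeros(L + K - 1)
    for i in range(L):
        for j in range(K):
            out[i + j] += H[i, j]
            cnt[i + j] += 1
    return out / cnt

# One Cadzow pass on a noisy slice that is a sum of 2 complex exponentials
# (each linear event contributes one exponential per frequency slice).
n = 50
x = np.arange(n)
clean = np.exp(2j * np.pi * 0.05 * x) + 0.7 * np.exp(2j * np.pi * 0.12 * x)
rng = np.random.default_rng(3)
noisy = clean + 0.3 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
den = antidiagonal_average(rank_reduce(hankel(noisy, n // 2), rank=2))
```

Choosing `rank` equal to the number of plane-wave events is what makes the truncation separate signal from noise; with regularly decimated data the missing-trace pattern itself becomes low-rank, which is the aliasing problem the paper addresses.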

16.
Advances in Water Resources, 2007, 30(4): 1027-1045
Streamline methods have been shown to be effective for reservoir simulation. For a regular grid, it is common to use the semi-analytical Pollock's method to obtain streamlines and time-of-flight (TOF) coordinates. The usual way of handling irregular grids is a trilinear transformation of each grid cell to a unit cube together with a linear flux interpolation scaled by the Jacobian. The flux interpolation allows fast integration of streamlines but is inaccurate even for uniform flow. To improve tracing accuracy, we introduce a new interpolation method, which we call corner-velocity interpolation. Instead of interpolating the velocity field from discrete fluxes at cell edges, the new method interpolates directly from reconstructed point velocities given at the corner points of the grid. This allows reproduction of uniform flow and eliminates the influence of cell geometry on the velocity field. Using several numerical examples, we demonstrate that the new method is more accurate than the standard tracing methods.
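The key property claimed above, exact reproduction of uniform flow, follows directly from interpolating corner-point velocities with the cell's shape functions. A minimal 2D sketch of the bilinear building block (the paper works in 3D with trilinear shape functions and reconstructed corner velocities; names here are illustrative):

```python
import numpy as np

def corner_velocity_interp(v_corners, xi, eta):
    """Bilinear interpolation of point velocities given at the four corners
    of a cell, in reference coordinates (xi, eta) in [0, 1]^2.
    Corner order: v00=(0,0), v10=(1,0), v01=(0,1), v11=(1,1)."""
    v00, v10, v01, v11 = (np.asarray(v, dtype=float) for v in v_corners)
    return ((1 - xi) * (1 - eta) * v00 + xi * (1 - eta) * v10
            + (1 - xi) * eta * v01 + xi * eta * v11)

# Uniform flow is reproduced exactly: identical corner velocities give the
# same velocity everywhere inside the cell.
u = np.array([1.0, 0.5])
v_mid = corner_velocity_interp([u, u, u, u], 0.3, 0.7)
```

Because the shape functions sum to one at every (xi, eta), constant corner data are interpolated without error, which is exactly what edge-flux interpolation on distorted cells fails to do.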

17.
Table-driven fast Fourier transform for two-dimensional non-uniformly sampled data. Cited by: 3 (self: 2, other: 1)
熊登, 张剑锋. 《地球物理学报》2008, 51(6): 1860-1867
The non-uniform fast Fourier transform (NFFT) is mainly used to compute the spectra of non-uniformly sampled data quickly and to reconstruct such data; it is the core algorithm of spectrum-based reconstruction for non-uniformly sampled data. High speed and high accuracy are the prerequisites for its practical application. To improve the efficiency of the 2D NFFT, this paper adopts a table-driven approach: the support of the Gaussian convolution operator is changed from a rectangle to an ellipse to reduce the operation count, the exponential evaluations are replaced by multiplications via lookup tables to speed up computation, and the tables make the NFFT practical for seismic data processing. A spectrum-based NFFT reconstruction method for non-uniformly sampled seismic data is also given. Numerical examples verify the speed and accuracy of the proposed method and show reconstruction results for non-uniformly sampled seismic data.
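What the NFFT computes can be pinned down with a direct O(M·N) reference implementation; the table-driven NFFT approximates exactly these sums quickly. A minimal sketch (the gridding acceleration itself is described only in the comment, not implemented):

```python
import numpy as np

def ndft(x_nodes, c, n_freq):
    """Reference O(M*N) non-uniform DFT: f[k] = sum_j c_j * exp(-2*pi*i*k*x_j),
    with nodes x_j in [0, 1). The NFFT evaluates the same sums in
    O(N log N + M) by convolving the samples onto an oversampled regular grid
    with a Gaussian kernel (an elliptical-support, table-driven kernel in the
    paper above), applying an ordinary FFT, and dividing out the kernel's
    Fourier transform."""
    k = np.arange(n_freq)
    return np.exp(-2j * np.pi * np.outer(k, x_nodes)) @ c

# Sanity check: on regular nodes x_j = j/N, the NDFT reduces to the plain FFT.
rng = np.random.default_rng(7)
c = rng.standard_normal(16)
f = ndft(np.arange(16) / 16.0, c, 16)
```

A reference implementation like this is also the standard way to validate the accuracy of a fast table-driven variant: compare the two on small problems before trusting the fast path on large ones.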

18.
A new seismic interpolation and denoising method with a curvelet transform matching filter, employing the fast iterative shrinkage-thresholding algorithm (FISTA), is proposed. The approach treats the matching filter, seismic interpolation, and denoising as a single inverse problem solved by an iterative inversion algorithm. The curvelet transform is highly sparse and useful for separating signal from noise, so the matching problem can be solved accurately with FISTA. When the new method is applied to synthetic noisy data and data with missing traces, an optimal matching result is obtained: noise is greatly suppressed, missing seismic data are filled in by interpolation, and the waveforms are highly consistent. We then verified the method on real data, with satisfactory results. The results show that the method can reconstruct missing traces even at low signal-to-noise ratio (SNR). Because the three problems are solved simultaneously via FISTA, the method not only increases processing efficiency but also improves the SNR of the seismic data.
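FISTA itself is a generic accelerated proximal-gradient solver. A minimal sketch of the algorithm (Beck and Teboulle's formulation), with a random matrix standing in for the curvelet matching-filter operator of the paper (matrix, sparsity level, and parameter values are illustrative):

```python
import numpy as np

def fista(A, b, lam, n_iter=500):
    """FISTA for min_x 0.5*||A @ x - b||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2                  # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    y, t = x.copy(), 1.0
    for _ in range(n_iter):
        g = y - A.T @ (A @ y - b) / L              # gradient step at the momentum point
        x_new = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # shrinkage
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)  # momentum extrapolation
        x, t = x_new, t_new
    return x

# Toy test: recover a sparse spike train from random projections.
rng = np.random.default_rng(2)
A = rng.standard_normal((60, 150)) / np.sqrt(60)
x_true = np.zeros(150)
x_true[[10, 75, 140]] = [1.0, -2.0, 1.5]
x_rec = fista(A, A @ x_true, lam=0.01)
```

The momentum extrapolation is what upgrades plain iterative shrinkage-thresholding from O(1/k) to O(1/k^2) convergence in objective value, which is why the papers above favor it when many iterations would otherwise be needed.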

19.
In-house seismic processing benefits from as much field data as possible, whereas fast long-distance transmission of field data requires the data volume to be as small as possible. To resolve this conflict, data reconstruction based on the curvelet transform and compressed sensing is introduced into seismic processing to compress and reconstruct real incomplete field data. The results show that the curvelet transform has an advantage over the Fourier transform in compressed-sampling methods. When processing field data, however, the ground roll must be handled first, and the number of missing traces must be considered for a given curvelet element size. The final reconstructed data images have clear texture and natural continuity, verifying the practicality and effectiveness of the method.

20.
Oil and gas exploration is gradually shifting to deep and structurally complex areas, where the quality of seismic data restricts the effective application of conventional time-frequency analysis technology, especially at low signal-to-noise ratios. To address this problem, we propose a curvelet-based time-frequency analysis method that is suited to seismic data and takes its lateral variation into account. We first construct a kind of curvelet adapted to seismic data: by adjusting the rotation of the curvelet through time skewing, the scale parameter can be related directly to the frequency of the seismic data, so that curvelet coefficients at different scales reflect its time-frequency content. The curvelet coefficients representing the dominant azimuthal pattern are then converted to the time-frequency domain. Because the curvelet transform is a sparse representation of the signal, the screening of dominant coefficients masks most of the random noise, which makes the method suitable for low-SNR data. Synthetic and field data experiments demonstrate that the proposed method is a good approach for identifying weak signals within strong noise in the time-frequency domain.


Copyright©北京勤云科技发展有限公司  京ICP备09084417号