Similar Literature
20 similar documents found (search time: 46 ms)
1.
Passive microseismic data are commonly buried in noise, which presents a significant challenge for signal detection and recovery. For recordings from a surface sensor array where each trace contains a time‐delayed arrival from the event, we propose an autocorrelation‐based stacking method that designs a denoising filter from all the traces, as well as a multi‐channel detection scheme. This approach circumvents the issue of time aligning the traces prior to stacking because every trace's autocorrelation is centred at zero in the lag domain. The effect of white noise is concentrated near zero lag; thus, the filter design requires a predictable adjustment of the zero‐lag value. Truncation of the autocorrelation is employed to smooth the impulse response of the denoising filter. In order to extend the applicability of the algorithm, we also propose a noise prewhitening scheme that addresses cases with coloured noise. The simplicity and robustness of this method are validated with synthetic and real seismic traces.
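The key observations above — autocorrelations need no time alignment, white noise piles up at zero lag, and truncation smooths the derived filter — can be sketched as follows. This is a minimal illustration assuming numpy; the function and parameter names (and the simple damping/taper choices) are illustrative, not the paper's exact design.

```python
import numpy as np

def stacked_autocorrelation(traces, max_lag, zero_lag_damping=0.9):
    """Stack per-trace autocorrelations for denoising-filter design.

    Time-aligning the traces is unnecessary: every trace's autocorrelation
    is centred at zero in the lag domain regardless of the arrival delay.
    """
    n = traces.shape[1]
    acs = []
    for tr in traces:
        ac = np.correlate(tr, tr, mode="full")[n - 1 : n + max_lag]  # lags 0..max_lag
        acs.append(ac / ac[0])                                       # unit zero lag
    stack = np.mean(acs, axis=0)
    stack[0] *= zero_lag_damping                 # white noise concentrates at zero lag
    stack *= np.hanning(2 * max_lag + 1)[max_lag:]  # truncate/taper -> smooth response
    return stack
```

Because a time shift leaves an autocorrelation unchanged, traces with different arrival delays contribute coherently to the stack.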

2.
There has been limited success in determining critical thresholds of ground cover or soil characteristics that relate to significant changes in runoff or sediment production at the microscale (<1 m²), particularly in semi‐arid systems where management of ground cover is critical. Despite this lack of quantified thresholds, there is an increasing research focus on the two‐phase mosaic of vegetation patches and inter‐patches in semi‐arid systems. In order to quantify ground cover and soil related thresholds for runoff and sediment production, we used a data mining technique known as conditional inference tree analysis to determine statistically significant values of a range of measured variables that predicted average runoff, peak runoff, sediment concentration and sediment production at the microscale. On Chromic Luvisols across a range of vegetation states in semi‐arid south‐eastern Australia, large changes in runoff and sediment production were related to a hierarchy of different variables and thresholds, but the percentage of bare soil played a primary role in predicting runoff and sediment production in most instances. The identified thresholds match well with previous thresholds found in semi‐arid and temperate regions (including the approximate values of 30%, 50% and 70% total ground cover). The analysis presented here identified the critical role of soil surface roughness, particularly where total ground cover is sparse. The analysis also provided evidence that a two‐phase mosaic of patches and inter‐patches identified via rapid visual assessment could be further delineated into distinct groups of hydrological response, or a multi‐phase rather than a two‐phase system. The approach used here may aid in assessing scale‐dependent responses and address data non‐linearity in studies of semi‐arid hydrology. Copyright © 2012 John Wiley & Sons, Ltd.
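The core of the tree-based threshold detection is a search for split values of a predictor (here, percent bare soil) that best separate the response (runoff). The sketch below uses a single CART-style sum-of-squared-errors split as a simplified stand-in for the conditional inference trees used in the study; numpy is assumed and the names are illustrative.

```python
import numpy as np

def best_threshold(x, y):
    """Exhaustively search for the single split of predictor x that most
    reduces the sum of squared errors of response y — a one-split,
    CART-style stand-in for a conditional inference tree node."""
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    best, best_sse = None, np.inf
    for i in range(1, len(xs)):
        if xs[i] == xs[i - 1]:
            continue  # no usable split between tied predictor values
        left, right = ys[:i], ys[i:]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best_sse:
            best, best_sse = 0.5 * (xs[i - 1] + xs[i]), sse
    return best
```

A full conditional inference tree adds permutation-based significance testing and recursion, but the threshold values it reports come from splits of this kind.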

3.
Testing the ability of surface arrays to monitor microseismic activity
Recently there has been much interest in the use of data from surface arrays in conjunction with migration‐based processing methods for passive seismic monitoring. In this study we use an example of this kind of data recorded whilst 18 perforation shots, with a variety of positions and propellant amounts, were detonated in the subsurface. As the perforation shots provide signals with known source positions and origin times, the analysis of these data is an invaluable opportunity to test the accuracy and ability of surface arrays to detect and locate seismic sources in the subsurface. In all but one case the signals from the perforation shots are not visible in the raw or preprocessed data. However, clear source images are produced for 12 of the perforation shots showing that arrays of surface sensors are capable of imaging microseismic events, even when the signals are not visible in individual traces. We find that point source locations are typically within 45 m (laterally) of the true shot location; however, the depths are less well constrained (~150 m). We test the sensitivity of our imaging method to the signal‐to‐noise ratio in the data using signals embedded in realistic noise. We find that the position of the imaged shot location is quite insensitive to the level of added noise, the primary effect of increased noise being to defocus the source image. Given the migration approach, the array geometry and the nature of coherent noise during the experiment, signals embedded in noise with ratios ≥0.1 can be used to successfully image events. Furthermore, comparison of results from data and synthetic signals embedded in noise shows that, in this case, prestack corrections of traveltimes to account for near‐surface structure will not enhance event detectability. 
Although the perforation shots have a largely isotropic radiation pattern, the results presented here show the potential for the use of surface sensors in microseismic monitoring as a viable alternative to classical downhole methods.
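The migration-based imaging that makes invisible arrivals detectable amounts to stacking trace amplitudes along the predicted moveout for each candidate subsurface point. The toy below (numpy assumed, names illustrative) shows why the stack focuses at the true source even when no single trace shows the arrival; origin time is taken as zero here, whereas a field implementation also scans over it.

```python
import numpy as np

def backproject(traces, dt, receivers, grid, velocity):
    """Diffraction-stack source imaging: for every candidate point, stack
    absolute amplitudes along the predicted traveltime moveout. The image
    maximum marks the most likely source location."""
    image = np.zeros(len(grid))
    for k, pt in enumerate(grid):
        t = np.linalg.norm(receivers - pt, axis=1) / velocity  # straight-ray times
        idx = np.round(t / dt).astype(int)
        ok = idx < traces.shape[1]
        image[k] = np.abs(traces[np.flatnonzero(ok), idx[ok]]).sum()
    return image
```

Amplitudes add coherently only at the candidate point whose moveout matches the data, which is why added noise defocuses rather than mislocates the image.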

4.
For data acquired with conventional acquisition techniques, surface multiples are usually considered as noise events that obscure the primaries. However, in this paper we demonstrate that for the situation of blended acquisition, meaning that different sources are shooting in a time‐overlapping fashion, multiples can be used to ‘deblend’ the seismic measurements. We utilize the recently introduced estimation of primaries by sparse inversion (EPSI) methodology, in which the primary impulse responses are considered to be the unknowns in a large‐scale inversion process. With some modifications the estimation of primaries by sparse inversion method can be used for blended seismic data. As output this process gives unblended primary impulse responses with point sources and receivers at the surface, which can be used directly in traditional imaging schemes. It turns out that extra information is needed to improve on the deblending of events that do not have much associated multiple energy in the data, such as steep events at large offsets. We demonstrate that this information can be brought in during acquisition and during processing. The methodology is illustrated on 2D synthetic data.

5.
Time‐lapse seismic analysis is utilized in CO2 geosequestration to verify the CO2 containment within a reservoir. A major risk associated with geosequestration is a possible leakage of CO2 from the storage formation into overlaying formations. To mitigate this risk, the deployment of carbon capture and storage projects requires fast and reliable detection of relatively small volumes of CO2 outside the storage formation. To do this, it is necessary to predict typical seepage scenarios and improve subsurface seepage detection methods. In this work we present a technique for CO2 monitoring based on the detection of diffracted waves in time‐lapse seismic data. In the case of CO2 seepage, the migrating plume might form small secondary accumulations that would produce diffracted, rather than reflected waves. From time‐lapse data analysis, we are able to separate the diffracted waves from the predominant reflections in order to image the small CO2 plumes. To explore possibilities to detect relatively small amounts of CO2, we performed synthetic time‐lapse seismic modelling based on the Cooperative Research Centre for Greenhouse Gas Technologies (CO2CRC) Otway project data. The detection method is based on defining the CO2 location by measuring the coherency of the signal along diffraction offset‐traveltime curves. The technique is applied to a time‐lapse stacked section using a stacking velocity to construct offset‐traveltime curves. Given the amount of noise found in the surface seismic data, the predicted minimum detectable amount of CO2 is 1000–2000 tonnes. This method was also applied to real data obtained from a time‐lapse seismic physical model. The use of diffractions rather than reflections for monitoring small amounts of CO2 can enhance the capability of subsurface monitoring in CO2 geosequestration projects.
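Measuring coherency along diffraction traveltime curves can be sketched as a semblance computation on a stacked section. The code below assumes numpy and uses the standard post-stack diffraction hyperbola built from a stacking velocity; the function name and signature are illustrative, not the paper's implementation.

```python
import numpy as np

def diffraction_semblance(section, dt, trace0, t0, v, dx):
    """Semblance of a stacked section (traces x samples) along the
    diffraction curve t(x) = sqrt(t0^2 + 4 (x - x0)^2 / v^2); a high value
    flags a diffractor such as a small secondary CO2 accumulation."""
    nx, nt = section.shape
    x = (np.arange(nx) - trace0) * dx          # lateral distance from apex
    t = np.sqrt(t0**2 + 4.0 * x**2 / v**2)     # diffraction traveltimes
    idx = np.round(t / dt).astype(int)
    ok = idx < nt
    a = section[np.flatnonzero(ok), idx[ok]]   # amplitudes along the curve
    return float(a.sum() ** 2 / (len(a) * (a**2).sum() + 1e-12))
```

Scanning apex position and apex time for semblance maxima localises diffractors in the time-lapse difference section.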

6.
7.
Static shifts from near‐surface inhomogeneities very often represent the key problem in the processing of seismic data from arid regions. In this case study, the deep bottom fill of a wadi strongly degrades the image quality of a 2D seismic data set. The resulting static and dynamic problems are solved by both conventional and common‐reflection‐surface (CRS) processing. A straightforward approach derives conventional refraction statics from picked first breaks and then goes through several iterations of manual velocity picking and residual statics calculation. The surface‐induced static and dynamic inhomogeneities, however, are not completely solved by these conventional methods. In CRS processing, the local adaptation of the CRS stacking parameters results in very detailed dynamic corrections. They resolve the local inhomogeneities that were not detected by manual picking of stacking velocities and largely compensate for the surface‐induced deterioration in the stack. The subsequent CRS residual statics calculations benefit greatly from the large CRS stacking fold which increases the numbers of estimates for single static shifts. This improves the surface‐consistent averaging of static shifts and the convergence of the static solution which removes the remaining static shifts in the 2D seismic data. The large CRS stacking fold also increases the signal‐to‐noise ratio in the final CRS stack.

8.
We present an automatic method of processing microseismic data acquired at the surface by a star‐like array. The back‐projection approach allows successive determination of the hypocenter position of each event and of its focal mechanisms. One‐component vertical geophone groups and three‐component accelerometers are employed to monitor both P‐ and S‐waves. Hypocenter coordinates are determined in a grid by back‐projection stacking of the short‐time‐average‐to‐long‐time‐average ratio of absolute amplitudes at vertical components and polarization norm derived from horizontal components of the P‐ and S‐waves, respectively. To make the location process more efficient, calculation is started with a coarse grid and zoomed to the optimum hypocenter using an oct‐tree algorithm. The focal mechanism is then determined by stacking the vertical component seismograms corrected for the theoretical P‐wave polarity of the focal mechanism. The mechanism is resolved in the coordinate space of strike, dip, and rake angles. The method is tested on 34 selected events of a dataset of hydraulic fracture monitoring of a shale gas play in North America. It was found that, by including S‐waves, the vertical accuracy of locations improved by a factor of two and is equal to approximately the horizontal location error. A twofold enhancement of horizontal location accuracy is achieved if a denser array of geophone groups is used instead of the sparse array of three‐component seismometers. The determined focal mechanisms are similar to those obtained by other methods applied to the same dataset.
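The characteristic function being stacked here — the short-time-average to long-time-average ratio of absolute amplitudes — is a standard construction and can be sketched directly (numpy assumed; window lengths are illustrative):

```python
import numpy as np

def sta_lta(trace, nsta, nlta):
    """STA/LTA ratio of absolute amplitudes: short window (nsta samples)
    average divided by long window (nlta samples) average, both ending at
    the current sample. The ratio spikes at an emergent arrival."""
    a = np.abs(trace)
    c = np.concatenate(([0.0], np.cumsum(a)))   # prefix sums for O(1) windows
    r = np.zeros(len(a))
    for i in range(nlta - 1, len(a)):
        sta = (c[i + 1] - c[i + 1 - nsta]) / nsta
        lta = (c[i + 1] - c[i + 1 - nlta]) / nlta
        r[i] = sta / (lta + 1e-12)
    return r
```

Back-projecting this nonnegative ratio instead of raw amplitudes makes the grid stack insensitive to waveform polarity across the array.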

9.
In recent years, the use of wide source arrays in marine seismic surveys has been a topic of interest in the seismic industry. Although one motivation for wide arrays is to get more guns in a source array without increasing the in-line array dimension, wide arrays can also provide the benefit of suppressing side-scattered energy. Comparisons of common midpoint (CMP) stacks of data acquired offshore Washington and Alaska with wide and conventional-width source arrays, however, show only small and sometimes inconsistent differences. These data were acquired in areas where side-scattered energy is a problem. Comparisons of prestack data, however, show substantial differences between the wide and conventional source array data. The disparity between the stacked and prestack data is explained by analysing the effective suppression of back-scattered energy by CMP stacking. Energy reflected from scatterer positions broadside to a given CMP location has a lower stacking velocity than that of the primary reflection events. Thus, CMP stacking attenuates the side-scattered energy. In both survey areas the action of CMP stacking was so powerful in suppressing the broadside energy that the additional action of the wide array was inconsequential in the final stacked sections. In other areas, where the scattering velocity is comparable to the primary stacking velocity, wide arrays could provide considerable advantage. Even though CMP stacked data from wide and conventional-width arrays may appear similar, the reduced amount of side-scattered energy in wide-array prestack data may provide a benefit for data dependent processes such as predictive deconvolution and velocity analysis. However, wide arrays cannot be used indiscriminately because they can degrade cross-dipping primary events. They should be considered primarily as a special tool for attacking severe source-generated noise from back-scattered waves in areas where the action of CMP stacking is insufficient.

10.
Stream water quality can change substantively during diurnal cycles, discrete flow events, and seasonal time scales. In this study, we assessed event responses in surface water nutrient concentrations and biogeochemical parameters through the deployment of continuous water quality sensors from March to October 2011 in the East Fork Jemez River, located in northern New Mexico, USA. Events included two pre‐fire non‐monsoonal precipitation events in April, four post‐fire precipitation events in August and September (associated with monsoonal thunderstorms), and two post‐fire non‐monsoonal precipitation events in October. The six post‐fire events occurred after the Las Conchas wildfire burned a significant portion of the contributing watershed (36%) beginning in June 2011. Surface water nitrate (NO3‐N) concentrations increased by an average of 50% after pre‐fire and post‐fire non‐monsoonal precipitation events and were associated with small increases in turbidity (up to 15 NTU). Beginning 1 month after the start of the large regional wildfire, monsoonal precipitation events resulted in large multi‐day increases in dissolved NO3‐N (6 × background levels), dissolved phosphate (100 × background levels), specific conductance (5 × background levels), and turbidity (>100 × background levels). These periods also corresponded with substantial sags in dissolved oxygen (<4 mg l−1) and pH (<6.5). The short duration and rapid rates of change during many of these flow events, particularly following wildfire, highlight the importance of continuous water quality monitoring to quantify the timing and magnitude of event responses in streams and to examine large water quality excursions linked to catchment disturbance. Copyright © 2015 John Wiley & Sons, Ltd.

11.
In studies on heavy oil, shale reservoirs, tight gas and enhanced geothermal systems, the use of surface passive seismic data to monitor induced microseismicity due to the fluid flow in the subsurface is becoming more common. However, in most studies passive seismic records contain days and months of data and manually analysing the data can be expensive and inaccurate. Moreover, in the presence of noise, detecting the arrival of weak microseismic events becomes challenging. Hence, the use of an automated, accurate and computationally fast technique for event detection in passive seismic data is essential. The conventional automatic event identification algorithm computes a running‐window energy ratio of the short‐term average to the long‐term average of the passive seismic data for each trace. We show that for the common case of a low signal‐to‐noise ratio in surface passive records, the conventional method is not sufficiently effective at event identification. Here, we extend the conventional algorithm by introducing a technique that is based on the cross‐correlation of the energy ratios computed by the conventional method. With our technique we can measure the similarities amongst the computed energy ratios at different traces. Our approach is successful at improving the detectability of events with a low signal‐to‐noise ratio that are not detectable with the conventional algorithm. Also, our algorithm has the advantage to identify if an event is common to all stations (a regional event) or to a limited number of stations (a local event). We provide examples of applying our technique to synthetic data and a field surface passive data set recorded at a geothermal site.
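The extension described above — cross-correlating the per-trace energy-ratio curves to measure their similarity across the array — can be sketched as follows. This is a simplified illustration assuming numpy; the reference-curve choice (the array mean) and the function name are assumptions, not the paper's exact formulation.

```python
import numpy as np

def ratio_similarity(energy_ratios):
    """Similarity of per-trace STA/LTA energy-ratio curves, measured as the
    peak normalised cross-correlation of each curve with the array mean.
    Consistently high values suggest an event common to all stations
    (regional); high values on only a few traces suggest a local event."""
    ref = energy_ratios.mean(axis=0)
    sims = []
    for r in energy_ratios:
        cc = np.correlate(r - r.mean(), ref - ref.mean(), mode="full")
        norm = len(ref) * r.std() * ref.std()
        sims.append(cc.max() / (norm + 1e-12))
    return np.array(sims)
```

Because the comparison is made on energy ratios rather than raw traces, weak arrivals that barely register on any single trace can still produce mutually similar ratio curves.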

12.
We assess the ability of multivariate statistical analyses applied to event hydrograph parameters to characterize a catchment hydrological behaviour. Motivation for such an approach lies in the fact that streamflow records have yet to be exploited to their full potential towards hydrological interpretation and can be used to infer a catchment state of connectivity from a qualitative standpoint. We have therefore processed 96 event hydrographs from a small headwater temperate humid forested catchment using principal component analysis, variation partitioning and classification tree analysis. These techniques prove to be promising in discriminating contrasting types of hydrologic responses (e.g. low‐ vs high‐magnitude events, slow vs quick timing events), identifying the main hydro‐meteorological variables that control these responses and determining threshold values of the hydro‐meteorological variables leading to a switch between catchment response types. Copyright © 2010 John Wiley & Sons, Ltd.
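The first of the three techniques, principal component analysis of an events-by-parameters matrix, reduces the hydrograph descriptors to a few orthogonal components. A minimal numpy sketch (descriptor names and the standardise-then-SVD recipe are illustrative assumptions):

```python
import numpy as np

def pca_scores(X, n_components=2):
    """PCA of an events-by-parameters matrix of hydrograph descriptors
    (e.g. peak flow, lag time, runoff ratio): standardise each column,
    then project onto the leading right singular vectors."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    scores = Z @ Vt[:n_components].T               # event coordinates on PCs
    explained = (s**2 / (s**2).sum())[:n_components]  # variance fractions
    return scores, explained
```

Plotting the event scores on the first two components is what lets contrasting response types (low- vs high-magnitude, slow vs quick) separate visually.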

13.
Wave‐equation redatuming can be a very efficient method of overcoming the overburden imprint on the target area. Owing to the growing amount of 3D data, it is increasingly important to develop a feasible method for the redatuming of 3D prestack data. Common 3D acquisition designs produce relatively sparse data sets, which cannot be redatumed successfully by applying conventional wave‐equation redatuming. We propose a redatuming approach that can be used to perform wave‐equation redatuming of sparse 3D data. In this new approach, additional information about the medium velocity below the new datum is included, i.e. redatumed root‐mean‐square (RMS) velocities, which can be extracted from the input data set by conventional velocity analysis, are used. Inclusion of this additional information has the following implications: (i) it becomes possible to simplify the 4D redatuming integral into a 2D integral such that the number of traces needed to calculate one output time sample and the computational effort are both reduced; (ii) the information about the subsurface enables an infill of traces which are needed for the integral calculation but which are missing in the sparse input data set. Two tests applying this new approach to fully sampled 2D data show satisfactory results, implying that this method can certainly be used for the redatuming of sparse 3D data sets.

14.
Little Kickapoo Creek (LKC), a low‐gradient stream, mobilizes its streambed, fundamentally altering its near‐surface hyporheic zone, more frequently than do higher‐gradient mountain and karst streams. LKC streambed mobility was assessed through streambed surveys, sediment sampling, and theoretical calculations comparing basal shear stress (τb) with critical shear stress (τc). Baseflow τb is capable of entraining a d50 particle; bankfull flow could entrain a 51·2 mm particle. No particle that large occurs in the top 30 cm of the substrate, suggesting that the top 30 cm of the substrate is mobilized and redistributed during bankfull events. Bankfull events occur on average every 7·6 months; flows capable of entraining d50 and d85 particles occur on average every 0·85 and 2·1 months, respectively. Streambed surveys verify streambed mobility at conditions below bankfull. While higher gradient streams have higher potential energy than LKC, they achieve streambed‐mobilization thresholds less frequently. Heterogeneous sediment redistribution creates an environment where substrate hydraulic conductivity (K) varies over four orders of magnitude. The frequency and magnitude of the substrate entrainment has implications for hyporheic zone function in fluid, solute and thermal transport models, interpretations of hyporheic zone stability, and understanding of LKC's aquatic ecosystem. Copyright © 2008 John Wiley & Sons, Ltd.
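The τb-versus-τc comparison at the heart of this analysis is a short calculation: a depth-slope estimate of basal shear stress against a Shields-type critical shear stress for a given grain size. The sketch below is illustrative; the Shields parameter of 0.045 and the sediment density are commonly assumed values, not necessarily those used in the study.

```python
def entrainment_check(depth, slope, d, shields=0.045,
                      rho=1000.0, rho_s=2650.0, g=9.81):
    """Compare basal shear stress tau_b = rho * g * h * S (depth-slope
    product, Pa) with a Shields critical shear stress
    tau_c = theta_c * (rho_s - rho) * g * d for grain diameter d (m).
    Returns (tau_b, tau_c, mobilised)."""
    tau_b = rho * g * depth * slope           # flow's shear stress on the bed
    tau_c = shields * (rho_s - rho) * g * d   # stress needed to entrain size d
    return tau_b, tau_c, tau_b > tau_c
```

Running this across the observed flow-duration curve is what converts shear-stress thresholds into the recurrence intervals reported for d50 and d85 entrainment.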

15.
The effects of large floods on river morphology are variable and poorly understood. In this study, we apply multi‐temporal datasets collected with small unmanned aircraft systems (UASs) to analyze three‐dimensional morphodynamic changes associated with an extreme flood event that occurred from 19 to 23 June 2013 on the Elbow River, Alberta. We documented reach‐scale spatial patterns of erosion and deposition using high‐resolution (4–5 cm/pixel) orthoimagery and digital elevation models (DEMs) produced from photogrammetry. Significant bank erosion and channel widening occurred, with an average elevation change of −0.24 m. The channel pattern was reorganized and overall elevation variation increased as the channel adjusted to full mobilization of most of the bed surface sediments. To test the extent to which geomorphic changes can be predicted from initial conditions, we compared shear stresses from a two‐dimensional hydrodynamic model of peak discharge to critical shear stresses for bed surface sediment sizes. We found no relation between modeled normalized shear stresses and patterns of scour and fill, confirming the complex nature of sediment mobilization and flux in high‐magnitude events. However, comparing modeled peak flows through the pre‐ and post‐flood topography showed that the flood resulted in an adjustment that contributes to overall stability, with lower percentages of bed area below thresholds for full mobility in the post‐flood geomorphic configuration. Overall, this work highlights the potential of UAS‐based remote sensing for measuring three‐dimensional changes in fluvial settings and provides a detailed analysis of potential relationships between flood forces and geomorphic change. Copyright © 2015 John Wiley & Sons, Ltd.
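Quantifying erosion and deposition from repeat photogrammetric DEMs reduces to a DEM-of-difference with a noise threshold. A minimal numpy sketch, assuming co-registered grids; the 0.10 m minimum level of detection is illustrative, not the study's value:

```python
import numpy as np

def dem_of_difference(dem_pre, dem_post, min_lod=0.10):
    """DEM-of-difference from repeat surveys: per-cell elevation change
    (positive = deposition, negative = erosion), with changes below a
    minimum level of detection zeroed to suppress photogrammetric noise.
    Returns (dod, mean_change)."""
    dod = dem_post - dem_pre
    dod = np.where(np.abs(dod) < min_lod, 0.0, dod)
    return dod, float(dod.mean())
```

Summing the positive and negative cells separately (times cell area) gives the reach-scale deposition and erosion volumes behind figures such as the −0.24 m average change.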

16.
We apply a redatuming methodology, designed to handle rugged topography and the presence of high‐velocity layers near the acquisition surface, to a 2D land seismic data set acquired in Saudi Arabia. This methodology is based on a recently developed prestack operator, which we call the topographic datuming operator (TDO). The TDO, unlike static corrections, allows for the movement of reflections laterally with respect to their true locations, corresponding to the new datum level. Thus, it mitigates mispositioning of events and velocity bias introduced by the assumption of surface consistency and the time‐invariant time shifts brought about by static corrections. Using the shallow velocities estimated from refracted events, the TDO provides a superior continuity of reflections and better focusing than that obtained from conventional static corrections in most parts of the processed 2D line. The computational cost of applying the TDO is only slightly higher than static corrections. The marginal additional computational cost and the possibility of estimating, after TDO redatuming, stacking velocities that are not affected by a spurious positive bias, as in the case of static corrections, are further advantages of the proposed methodology. The likelihood of strong heterogeneities in the most complex part of the line limits the applicability of any approach based upon geometrical optics; however, the TDO produces results that are slightly better than those obtained from static corrections because of its ability to partially collapse diffractions generated in the near surface.

17.
I study the responses of two different triaxial induction tools to invaded dipping anisotropic formations. I show that the triaxial measurements have generally higher sensitivity to the radial invasion profile, compared to the conventional induction measurements. This enables accurate interpretation of both the anisotropic formation properties and the invasion parameters. Multi‐spacing and single‐spacing multi‐frequency triaxial induction tools can both be used for this purpose. Failure to take the invasion properties into account may lead to misinterpretation of the vertical formation resistivity. Symmetrization of the apparent conductivity matrix opens ways for a visual interpretation of triaxial induction logs for the formation and the invasion zone properties. This technique enables simpler and faster inversion algorithms. I study how the effect of a conductive annulus forming around the invasion zone couples with effects of the dipping anisotropy and the dipping boundaries and show when these effects are additive. Thus, a visual detection of log parts affected by a conductive annulus becomes possible. The key tool for interpretation in complex 3D scenarios is efficient modelling software. I use a 3D finite‐difference modelling approach to simulate responses of induction logging tools of the new generation. Its high efficiency enables simultaneous multi‐spacing and multi‐frequency computing of the tool responses to arbitrary 3D anisotropic formations that made the study possible.

18.
Static correction is a common step in a seismic data processing flowchart for land data. Here we propose a new algorithm for automatic short‐period static correction. The algorithm is based on the assumption that seismic events after short‐period static correction should be locally plane nearly everywhere. No other assumptions are made. Therefore the proposed method does not require a preliminary velocity analysis. The algorithm consists of two main parts: evaluation of second spatial differences of trajectories and subsequent regularized integration of these differences. The proposed method proves its robustness and shows results comparable with conventional residual static correction based on improving common‐midpoint stacking. In contrast to conventional residual statics, the proposed algorithm can estimate short‐period statics in complex cases where common‐midpoint stacking fails because of non‐hyperbolic events.
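The second step of the algorithm, regularized integration of second differences, can be posed as a small least-squares problem. The sketch below (numpy assumed) recovers per-station statics from given second spatial differences; the regularization weight is an illustrative choice, and the first step — estimating those differences from the data's local plane-wave deviations — is assumed already done.

```python
import numpy as np

def statics_from_second_differences(d2, eps=1e-6):
    """Regularised integration: recover statics s from measured second
    spatial differences d2 by minimising ||D2 s - d2||^2 + eps * ||s||^2,
    where D2 is the [1, -2, 1] second-difference operator. The eps term
    fixes the constant and linear trends that D2 cannot see."""
    n = len(d2) + 2
    D2 = np.zeros((n - 2, n))
    for i in range(n - 2):
        D2[i, i:i + 3] = (1.0, -2.0, 1.0)
    A = np.vstack([D2, np.sqrt(eps) * np.eye(n)])   # stacked LS system
    b = np.concatenate([d2, np.zeros(n)])
    s, *_ = np.linalg.lstsq(A, b, rcond=None)
    return s
```

Working with second differences makes the estimate surface-consistent up to an affine trend, which is exactly the component that does not distort local event slopes.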

19.
The method of common reflection surface (CRS) extends conventional stacking of seismic traces over offset to multidimensional stacking over offset‐midpoint surfaces. We propose a new form of the stacking surface, derived from the analytical solution for reflection traveltime from a hyperbolic reflector. Both analytical comparisons and numerical tests show that the new approximation can be significantly more accurate than the conventional CRS approximation at large offsets or at large midpoint separations while using essentially the same parameters.
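For reference, the conventional CRS stacking surface that the proposed form improves upon is the hyperbolic moveout in midpoint displacement and half-offset shown below; the parameter names are illustrative, and the paper's hyperbolic-reflector expression replaces this surface while keeping essentially the same parameters.

```python
import math

def crs_traveltime(t0, a, b, c, dm, h):
    """Conventional CRS moveout
        t(dm, h) = sqrt((t0 + a*dm)^2 + b*dm^2 + c*h^2),
    with dm the midpoint displacement from the central point and h the
    half-offset; a, b, c encode the emergence angle and the wavefront
    curvatures of the normal and normal-incidence-point waves."""
    return math.sqrt((t0 + a * dm) ** 2 + b * dm**2 + c * h**2)
```

At dm = 0 the surface reduces to the familiar NMO hyperbola t = sqrt(t0^2 + c*h^2) with c = 4/v^2.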

20.
In conventional seismic exploration, especially in marine seismic exploration, shot gathers with missing near‐offset traces are common. Interferometric interpolation methods are one of a range of different methods that have been developed to solve this problem. Interferometric interpolation methods differ from conventional interpolation methods as they utilise information from multiples in the interpolation process. In this study, we apply both conventional interferometric interpolation (shot domain) and multi‐domain interferometric interpolation (shot and receiver domain) to a synthetic and a real towed marine dataset from the Baltic Sea with the primary aim of improving the image of the seabed by extrapolation of a near‐offset gap. We utilise a matching filter after interferometric interpolation to partially mitigate artefacts and coherent noise associated with the far‐field approximation and a limited recording aperture size. The results show that an improved image of the seabed is obtained after performing interferometric interpolation. In most cases, the results from multi‐domain interferometric interpolation are similar to those from conventional interferometric interpolation. However, when the source–receiver aperture is limited, the multi‐domain method performs better. A quantitative analysis for assessing the performance of interferometric interpolation shows that multi‐domain interferometric interpolation typically performs better than conventional interferometric interpolation. We also benchmark the interpolated results generated by interferometric interpolation against those obtained using sparse recovery interpolation.
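The kernel operation behind interferometric interpolation is trace-to-trace cross-correlation: correlating two recordings of the same source collapses the shared source signature into energy at the inter-receiver traveltime. The toy below (numpy assumed, names illustrative) shows only that kernel; the paper's method builds virtual near-offset traces from sums of such correlations over sources, exploiting multiple energy, followed by a matching filter.

```python
import numpy as np

def interreceiver_lag(u_a, u_b, dt):
    """Cross-correlate two recordings of the same source and return the lag
    (in seconds) of the correlation peak, i.e. the apparent traveltime
    difference between the two receiver positions."""
    cc = np.correlate(u_b, u_a, mode="full")
    lag = int(cc.argmax()) - (len(u_a) - 1)   # shift from zero-lag centre
    return lag * dt
```

Stacking such correlations over many sources is what stabilises the estimate and lets multiple reflections fill the physical near-offset gap.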
