Similar Literature (20 results)
1.
3D seismic data are usually recorded and processed on rectangular grids, for which sampling requirements are generally derived from the usual 1D viewpoint. For a 3D data set, the band region (the region of the Fourier space in which the amplitude spectrum is not zero) can be approximated by a domain bounded by two cones. Given the particular shape of this band region, we can use the 3D sampling viewpoint, which leads to weaker sampling requirements than the 1D viewpoint; i.e. fewer sample points are needed to represent data with the same degree of accuracy. The 3D sampling viewpoint considers regular nonrectangular sampling grids. The recording and processing of 3D seismic data on a hexagonal sampling grid is explored. Acquiring 3D seismic data on a hexagonal sampling grid is an economically advantageous alternative because it requires 13.4% fewer sample points than a rectangular sampling grid. Hexagonal sampling thus offers savings in the storage and processing of 3D seismic data. A fast algorithm for 3D discrete spectrum evaluation and trace interpolation for a 3D seismic data set sampled on a hexagonal grid is presented and illustrated with synthetic examples. It is shown that with this algorithm hexagonal sampling offers approximately the same 13.4% saving in data storage and computational time for 3D phase-shift migration.
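The 13.4% figure follows directly from lattice geometry: the spectral replicas of a circularly band-limited slice pack more densely on a hexagonal lattice than on a square one, so fewer samples per unit area suffice. A minimal sketch of that calculation (the bandlimit value B is an arbitrary placeholder, not a value from the paper):

```python
import math

B = 30.0  # circular spatial bandlimit (cycles per km); hypothetical value

# Rectangular Nyquist sampling: dx = dy = 1/(2B),
# so sample density is (2B)^2 samples per unit area.
rect_density = (2 * B) ** 2

# Hexagonal sampling packs the circular spectral replicas on a
# hexagonal lattice; the classical result is that it needs only
# sqrt(3)/2 of the rectangular sample density.
hex_density = math.sqrt(3) / 2 * rect_density

savings = 1 - hex_density / rect_density
print(f"hexagonal saving: {savings:.1%}")  # -> 13.4%
```

The ratio is independent of B, which is why the 13.4% saving applies to any circularly (or conically, slice by slice) band-limited data set.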

2.
MAROS: a decision support system for optimizing monitoring plans
The Monitoring and Remediation Optimization System (MAROS), a decision-support software, was developed to assist in formulating cost-effective ground water long-term monitoring plans. MAROS optimizes an existing ground water monitoring program using both temporal and spatial data analyses to determine the general monitoring system category and the locations and frequency of sampling for future compliance monitoring at the site. The objective of the MAROS optimization is to minimize monitoring locations in the sampling network and reduce sampling frequency without significant loss of information, ensuring adequate future characterization of the contaminant plume. The interpretive trend analysis approach recommends the general monitoring system category for a site based on plume stability and site-specific hydrogeologic information. Plume stability is characterized using primary lines of evidence (i.e., Mann-Kendall analysis and linear regression analysis) based on concentration trends, and secondary lines of evidence based on modeling results and empirical data. The sampling optimization approach, consisting of a two-dimensional spatial sampling reduction method (Delaunay method) and a temporal sampling analysis method (Modified CES method), provides detailed sampling location and frequency results. The Delaunay method is designed to identify and eliminate redundant sampling locations without causing significant information loss in characterizing the plume. The Modified CES method determines the optimal sampling frequency for a sampling location based on the direction, magnitude, and uncertainty in its concentration trend. MAROS addresses a variety of ground water contaminants (fuels, solvents, and metals), allows import of various data formats, and is designed for continual modification of long-term monitoring plans as the plume or site conditions change over time.
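MAROS's primary line of evidence, the Mann-Kendall test, is standard enough to sketch. The following is a generic implementation of the S statistic and its normal approximation (no tie correction), not code from MAROS, and the concentration series is invented:

```python
import math

def mann_kendall(x):
    """Mann-Kendall trend test (no tie correction): returns the S
    statistic and the normal-approximation Z score.  Positive Z
    suggests an increasing trend, negative a decreasing one."""
    n = len(x)
    # S counts concordant minus discordant pairs over all i < j.
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# A steadily declining plume-concentration series (hypothetical data):
s, z = mann_kendall([10.2, 9.8, 9.1, 8.7, 8.0, 7.6, 7.1, 6.5])
print(s, z)  # S is strongly negative -> decreasing (stable/shrinking plume)
```

In a MAROS-style workflow, |Z| above a significance threshold (e.g. 1.96 for 95% confidence) would classify the trend as significant before the plume-stability category is assigned.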

3.
This paper derives the dispersion relation of the acoustic wave equation and analyses how the spatial grid size and the time sampling interval of the pseudospectral method affect numerical dispersion; numerical modelling experiments yield a method for selecting the optimal spatial parameters. The results show that the maximum spatial sampling interval for stable pseudospectral modelling should be chosen so that the middle wavelength (half the Nyquist frequency) is sampled with two points. In all dimensions, stability increases with the spatial sampling interval, but the interval should not vary too much; when it does, the dominant frequency of the source wavelet must be reduced accordingly to keep the spatial sampling reasonable. The spatial sampling interval must satisfy both the sampling theorem and the stability condition, of which the stability condition imposes the stricter requirement. The optimal (minimum numerical dispersion) spatial parameter choice for pseudospectral modelling is two sampling points per middle wavelength, corresponding to roughly six to seven points per dominant wavelength. These results provide guidance for choosing modelling parameters in pseudospectral numerical modelling of the acoustic wave equation.
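The six-to-seven-points-per-dominant-wavelength rule translates into a one-line grid-spacing formula. The sketch below only illustrates that rule; the velocity and frequency values are invented, not taken from the paper:

```python
def pseudospectral_dx(v_min, f_dominant, points_per_wavelength=6.5):
    """Grid spacing giving roughly 6-7 samples per dominant
    wavelength -- the low-dispersion choice reported in the abstract.
    v_min: slowest medium velocity (m/s); f_dominant: dominant
    frequency of the source wavelet (Hz)."""
    wavelength = v_min / f_dominant
    return wavelength / points_per_wavelength

# 1500 m/s water layer, 25 Hz wavelet (hypothetical values):
dx = pseudospectral_dx(1500.0, 25.0)
print(f"dx = {dx:.1f} m")  # 60 m dominant wavelength / 6.5 -> ~9.2 m
```

In practice the spacing chosen this way must still be checked against the time-step stability condition, which the abstract notes is the stricter of the two constraints.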

4.
The sampling error formalism of North and Nakamoto (1989) has been widely referenced in research papers on sampling with space-borne or ground-borne sensors. However, their formalism is found not only to underestimate the sampling error, especially for the raingauge-network case, but also to be inapplicable to the cases of a line of raingauges or of microwave attenuation measurements. In this paper, the sampling error formalism is revised and applied to the same sampling design and the same rainrate model as in North and Nakamoto (1989) for comparison. The sampling error estimated using the revised formula was found to be more than 50% higher than that of North and Nakamoto (1989). For the case of a line of raingauges, we found that the sampling error converges to a certain value, not to zero as in the North and Nakamoto formalism, as the number of gauges increases. The microwave attenuation measurement case, which is equivalent to a line of infinitely many raingauges, also gives non-zero sampling errors. Finally, combined sampling using both satellite and ground-borne sensors (e.g., a raingauge network, a line of raingauges, or microwave attenuation measurements) was reviewed to check design orthogonality, and the sampling errors for the combined satellite and raingauge-network case were estimated to see how they depend on the settings of the two different measurements.

5.
The human-mediated transfer of harmful organisms via shipping, specifically via ballast water transport, leading to the loss of biodiversity, alteration of ecosystems, negative impacts on human health and, in some regions, economic loss, has attracted considerable attention, especially in the last decade. Ballast water sampling is very important for biological invasion risk management. The complexity of ballast water sampling results both from the variety of organism diversity and behaviour and from ship design, including the availability of ballast water sampling points. Furthermore, ballast water sampling methodology is influenced by the objectives of the sampling study. In the course of research conducted in Slovenia, new sampling equipment for ships' ballast water was developed and tested. In this paper, new ballast water sampling methods and equipment, together with practical shipboard testing results, are presented.

6.
The authors have recently used several innovative sampling techniques for ground water monitoring at hazardous waste sites. Two of these techniques were used for the first time on the Biscayne Aquifer Superfund Project in Miami, Florida. This is the largest sampling program conducted so far under the U.S. Environmental Protection Agency (EPA) Superfund Program.
One sampling technique involved the use of the new ISCO Model 2600 submersible portable well sampling pump. A compressed air source forces water from the well into the pump casing and then delivers it to the surface (through a pulsating action). This pump was used in wells that could not be sampled with surface lift devices.
Another sampling technique involved the use of a Teflon manifold sampling device. The manifold is inserted into the top of the sampling bottle and a peristaltic pump creates a vacuum to draw the water sample from the well into the bottle. The major advantage of using this sampling technique for ground water monitoring at hazardous waste sites is the direct delivery of the water sample into the collection container. In this manner, the potential for contamination is reduced because, prior to delivery to the sample container, the sample contacts only the Teflon, which is well-known for its inert properties.
Quality assurance results from the Superfund project indicate that these sampling techniques are successful in reducing cross-contamination between monitoring wells. Analysis of field blanks using organic-free water in contact with these sampling devices did not show any concentration at or above the method detection limit for each priority pollutant.

7.
Flow-stratified sampling is a new method for sampling water quality constituents such as suspended sediment to estimate loads. As with selection-at-list-time (SALT) and time-stratified sampling, flow-stratified sampling is a statistical method requiring random sampling and yielding unbiased estimates of load and variance. It can be used to estimate event yields or to estimate mean concentrations in flow classes for detecting change over time or differences from water quality standards. Flow-stratified sampling is described and its variance compared with those of SALT and time-stratified sampling. Time-stratified sampling generally gives the smallest variance of the three methods for estimating storm yields. Flow-stratified sampling of individual storms may fail to produce estimates in some short-lived strata because they may have sample sizes of zero. SALT will tend to give small samples and relatively high variances for small storms. For longer and more complex hydrographs with numerous peaks, flow-stratified sampling gives the lowest variance, and the SALT variance is lower than that of time-stratified sampling unless the sample size is very large. A desirable feature of flow-stratified sampling is that the variance can be reduced after sampling by splitting strata, particularly high-flow strata that have been visited just once, and recalculating the total and variance. SALT has the potential to produce the lowest variance, but cannot be expected to do so with an auxiliary variable based on stage.
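The common machinery behind these estimators is classical stratified random sampling: draw a random sample within each stratum, scale the sample mean by the stratum size, and sum. A hedged sketch with invented per-interval loads (the paper's flow-stratified method defines strata by flow class and has its own allocation rules; this only shows the generic estimator):

```python
import random
random.seed(0)

def stratified_load(strata, n_per_stratum=2):
    """Stratified estimate of a total load and its variance.
    Each stratum is a list of per-interval loads; n are sampled at
    random without replacement and scaled by the stratum size.
    Textbook unbiased estimator, not the paper's implementation."""
    total, var = 0.0, 0.0
    for loads in strata:
        nh, n = len(loads), n_per_stratum
        sample = random.sample(loads, n)
        mean = sum(sample) / n
        s2 = sum((y - mean) ** 2 for y in sample) / (n - 1)
        total += nh * mean
        var += nh ** 2 * (1 - n / nh) * s2 / n  # finite-population correction
    return total, var

# Two strata of per-interval loads (hypothetical kg values):
est, var = stratified_load([[1.9, 2.1, 2.0, 2.2, 1.8],
                            [3.1, 2.9, 3.0, 3.2]])
print(round(est, 2), round(var, 3))
```

Splitting a stratum after sampling, as the abstract describes, amounts to recomputing this sum with the new, finer strata.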

8.
The ensemble Kalman filter (EnKF) performs well because the background-error covariance varies in time: it provides a dynamic estimate of the background error and represents its statistical character reasonably. However, the model ensemble the EnKF employs carries a high computational cost. In this study, two methods, referred to as the static and dynamic sampling methods, are proposed to retain good performance while reducing that cost. The ensemble adjustment Kalman filter (EAKF) is used in a global surface wave model to examine the performance of the EnKF. The 24-h interval differences of simulated significant wave height (SWH) within one year compose the static samples of the ensemble errors, and these errors are used to construct the ensemble states each time observations are available. The same method of updating the model states as in the EAKF is then applied to the ensemble states constructed by the static sampling method. The dynamic sampling method constructs the ensemble states in a similar way, but the period of simulated SWH moves with time; here, the 7 days before and after the observation time are used as this period. To examine the performance of the three schemes (EAKF, static, and dynamic sampling), observations from the satellite Jason-2 in 2014 are assimilated into a global wave model, and observations from the satellite Saral are used for validation. The results indicate that the EAKF performs best, while the static sampling method is the worst. The dynamic sampling method improves the assimilation dramatically compared with the static sampling method, and its overall performance is close to that of the EAKF; in low latitudes it even has a slight advantage over the EAKF. With either the dynamic or the static sampling method, only one model run is required, so the computational cost falls sharply. Given this performance, the dynamic sampling method can be treated as an effective alternative to the EnKF, reducing the computational cost while providing good data assimilation performance.

9.
Sampling frequency for monitoring the actual state of groundwater systems
Sampling frequency is a very important variable in the design of a groundwater monitoring network. Given that the objective of sampling is to monitor the actual state of groundwater systems, criteria for the determination of sampling frequency can be based on trend detectability, the accuracy of estimation of periodic fluctuations, and the accuracy of estimation of the mean values of the stationary component of the state variables (such as groundwater heads, temperature, and concentration of hydrochemical constituents). The criteria are applied to the determination of sampling frequency for monitoring groundwater levels around the Spannenburg pumping station. The analysis and verification of the sampling frequency indicate that the most appropriate sampling frequency is once a month.

10.
Air quality assessment studies incur high sampling or analysis costs. In these studies, the representation of a city by a single sampling point is still a serious problem, especially in the metropolises of developing countries, because of the absence of the equipment required for sampling at multiple locations. In this paper, many years of smoke measurements are used to determine the region that can represent the city, by drawing contours with the kriging method. The sampling site within this region is then selected on the basis of the criteria recommended by the EPA. In this way, the data taken from this single sampling point can be used to assess average city concentrations with lower sampling and analysis costs. This information is valuable for monitoring air quality and defining environmental policies, although the local distribution and the extreme concentration values over the city are not measurable.

11.
Stream sampling programmes for water quality estimation constitute a statistical survey of a correlated population. The properties of parameter and other estimates made from sample values from such programmes are set in the context of statistical sampling theory. It is shown that a model-based rather than a design-based approach to statistical analysis is usually appropriate. The influence of model structure and sampling design on the robustness and suitability of estimation procedures is investigated, and relationships with kriging are demonstrated. Methodology is discussed with reference to data from a UK sampling programme.
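Why the model-based view matters can be seen in one formula: under serial correlation the variance of the sample mean is no longer σ²/n. The sketch below uses a textbook AR(1) correlation model with invented parameter values; it illustrates the general point, not the paper's model:

```python
def var_of_mean_ar1(sigma2, rho, n):
    """Variance of the mean of n equally spaced observations from an
    AR(1)-correlated series.  Positive serial correlation rho inflates
    the variance above the independent-sample value sigma2/n, so a
    design-based analysis that assumes independence overstates
    precision.  Standard textbook formula."""
    correction = 1 + 2 * sum((1 - k / n) * rho ** k for k in range(1, n))
    return sigma2 / n * correction

iid = var_of_mean_ar1(1.0, 0.0, 52)   # 52 weekly samples, independent
corr = var_of_mean_ar1(1.0, 0.6, 52)  # same samples, lag-1 correlation 0.6
print(iid, corr)  # the correlated mean is roughly 4x less precise
```

A model-based analysis builds this correlation structure into both the estimator and its reported uncertainty, which is also the connection to kriging noted in the abstract.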

12.
The influence of purging on soil-gas concentrations of volatile organic compounds (VOCs), as affected by sampling tube inner diameter and sampling depth (i.e., system volume) for temporary probes in fine-grained soils, was evaluated at three different field sites. A macro-purge sampling system consisted of a standard, hollow, 3.2-cm outer diameter (OD) drive probe with a retractable sampling point attached to an appropriate length of 0.48-cm inner diameter (ID) Teflon® tubing. The macro-purge sampling system had a purge system volume of 24.5 mL at a 1-m depth. In contrast, the micro-purge sampling systems differed slightly between the field sites and consisted of a 1.27-cm OD drive rod with a 0.10-cm ID stainless steel tube, or a 3.2-cm OD drive rod with 0.0254-cm ID stainless steel tubing, giving purge system volumes of 1.2 and 7.05 mL at 1-m depths, respectively. At each site and location within a site, with a few exceptions, the same contaminants were identified in the same relative order of abundance, indicating that the same general soil atmosphere was sampled. However, marked differences in VOC concentrations were identified between the sampling systems, with micro-purge samples having up to 27 times greater concentrations than their corresponding macro-purge samples. The higher concentrations are the result of minimal disturbance of the ambient soil atmosphere during purging. The minimal disturbance caused by the micro-purge sampling system allowed the collection of a sample that is more representative of the soil atmosphere surrounding the sampling point; that is, a sample that does not contain atmosphere that has migrated from a distance through the geologic material, or from the surface, in response to the vacuum induced during purging.
It is thus recommended that when soil-gas sampling is conducted using temporary probes in fine-grained soils, the sampling system use the smallest practical ID soil-gas tubing and minimize purge volume, so as to obtain a soil-gas sample with minimal risk of leakage and to support proper decisions about the site based on more representative soil-gas concentrations.

13.
The River Frome was sampled at a sub-daily interval, with additional storm sampling, through an annual cycle. Samples were analysed for total phosphorus (TP), soluble reactive phosphorus (SRP), total oxidisable nitrogen (TON) and dissolved reactive silicon (Si). The resulting data set was artificially decimated to mimic sampling frequencies from 12 h to monthly. A monthly sampling interval produced errors in the estimated annual TP and SRP loads of up to 35% and 28% respectively, and the resulting data sets were insufficient to observe peaks in P concentration in response to storm events. Weekly sampling reduced the maximum percentage errors in the annual load estimates to 15.4% and 6.5%. TON and silicon concentrations were less variable with changing river flow, and monthly sampling was sufficient to predict annual load estimates to within 10%. However, to investigate within-river nutrient dynamics and behaviour, a weekly sampling interval is suggested as the minimum frequency for TON and Si studies, and daily sampling as a minimum requirement to adequately investigate phosphorus dynamics. The loss of nutrient-concentration signal with increasing sampling interval is presented. Hysteresis in the nutrient concentration/flow relationships for all 32 storm events during the study period was modelled, and seasonal patterns are discussed to infer nutrient sources and behaviour. The high-resolution monitoring in this study identified, for the first time, major peaks in phosphorus concentration in winter that coincided with sudden falls in air temperature and were associated with biofilm breakdown.
This study has shown that to understand complex catchment nutrient processes, accurately quantify nutrient exports from catchments, and observe changes in water quality resulting from nutrient mitigation efforts over time, it is vital that the newly emerging field-based automated sampler/analyser technologies begin to be deployed, to allow routine high-resolution monitoring of our rivers in the future.
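The decimation experiment is easy to reproduce in miniature: build a high-resolution concentration-flow series, compute the "true" load, then recompute it from every k-th sample. The series below is wholly synthetic (not River Frome data), so the error percentages only illustrate the mechanism, not the paper's 35%/28% figures:

```python
import math

# Synthetic daily flow with periodic storm spikes (illustrative only).
flow = [5 + 4 * math.sin(2 * math.pi * d / 365) + (8 if d % 30 == 0 else 0)
        for d in range(365)]
conc = [0.05 + 0.02 * f for f in flow]  # concentration rises with flow

true_load = sum(c * f for c, f in zip(conc, flow))  # reference annual load

def decimated_load(step):
    """Annual load estimated from every `step`-th daily sample, by
    scaling the mean sampled daily load to a full year."""
    sampled = [conc[d] * flow[d] for d in range(0, 365, step)]
    return 365 * sum(sampled) / len(sampled)

for step, label in [(1, "daily"), (7, "weekly"), (30, "monthly")]:
    err = 100 * (decimated_load(step) - true_load) / true_load
    print(f"{label:8s} sampling: {err:+.1f}% load error")
```

Because load is concentration times flow, sparse sampling that misses (or, as here, happens to coincide with) storm peaks biases the annual estimate, which is the effect the decimation study quantifies.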

14.
Using an HD-6 multifunctional automatic radon chamber together with an internationally recognised standard instrument, the AlphaGUARD PQ2000Pro radon monitor, calibration experiments were carried out on the three scintillation cells of an FD-125 radon-thorium analyser under two sampling modes, circulation sampling and negative-pressure sampling, and the K value and relative intrinsic error of each of the three cells were calculated. The results show that the K values obtained by circulation sampling are smaller than those obtained by negative-pressure sampling; the negative-pressure K values are larger and already approach the limit on scintillation-cell K values required by the Technical Specification for Seismic Hydrogeochemical Observation. The relative intrinsic error obtained by circulation sampling is ≤5%, meeting the specification, whereas that obtained by negative-pressure sampling exceeds 5% and fails to meet it. Circulation sampling is therefore the more suitable mode for radon-chamber calibration of scintillation-cell radon monitors used in earthquake monitoring.

15.
Soil-gas sampling and analysis is a common tool used in vapor intrusion assessments; however, sample collection becomes more difficult in fine-grained, low-permeability soils because of limitations on the flow rate that can be sustained during purging and sampling. This affects the time required to extract sufficient volume to satisfy purging and sampling requirements. The soil-gas probe tubing or pipe and sandpack around the probe screen should generally be purged prior to sampling. After purging, additional soil gas must be extracted for chemical analysis, which may include field screening, laboratory analysis, occasional duplicate samples, or analysis for more than one analytical method (e.g., volatile organic compounds and semivolatile organic compounds). At present, most regulatory guidance documents do not distinguish between soil-gas sampling methods that are appropriate for high- or low-permeability soils. This paper discusses permeability influences on soil-gas sample collection and reports data from a case study involving soil-gas sampling from silt and clay-rich soils with moderate to extremely low gas permeability to identify a sampling approach that yields reproducible samples with data quality appropriate for vapor intrusion investigations for a wide range of gas-permeability conditions.

16.
This paper presents a digital linear filter which maps composite resistivity transforms to apparent resistivities for any four-electrode array over a horizontally layered earth. A filter is provided for each of three sampling rates; the choice of filter will depend on resistivity contrasts and computational facilities. Two methods of filter design are compared. The Wiener-Hopf least-squares method is preferable for low sampling rate filters. The Fourier transform method is more successful in producing a filter with a high sampling rate which can handle resistivity contrasts of 100 000:1.

17.
Multiple theoretical sampling designs are studied to determine whether designs can be identified that provide for characterization of ground water quality in rural regions of developing nations. Sampling design in this work includes assessing sampling frequency, analytical methods, length of sampling period, and requirements of sampling personnel. The results answer a set of questions regarding whether innovative sampling designs can allow hydrogeologists to take advantage of a range of characterization technologies, sampling strategies, and available personnel to develop high-value water-quality data sets. Monte Carlo studies are used to assess different sampling strategies in the estimation of three parameters related to a hypothetical chemical observed in a ground water well: mean concentration (MeanC), maximum concentration (MaxC), and total mass load (TML). Five different scenarios are simulated. These scenarios are then subsampled using multiple simulated sampling instruments, time periods (ranging from 1 to 10 years), and sampling frequencies (ranging from weekly to semiannually to parameter-dependent). Results are analyzed via the statistics of the resulting estimates, including mean square error, bias, bias squared, and precision. The results suggest that a sampling strategy based on what may be considered lower-quality instruments, applied at high frequency, can be a powerful field research approach for estimating select parameters. This suggests the potential utility of combining lower-quality instruments and local populations to obtain high-frequency data sets in regions where regular monitoring by technicians is not practical.

18.
Reconstruction of seismic data is routinely used to improve the quality and resolution of seismic data from incomplete acquired seismic recordings. Curvelet-based Recovery by Sparsity-promoting Inversion, adapted from the recently developed theory of compressive sensing, is one such reconstruction, especially good for the recovery of undersampled seismic data. Like traditional Fourier-based methods, it performs best when used in conjunction with randomized subsampling, which converts aliases from the usual regular periodic subsampling into easy-to-eliminate noise. By virtue of its ability to control gap size, along with the random and irregular nature of its sampling pattern, jittered (sub)sampling is one proven method that has been used successfully to determine geophone positions along a seismic line. In this paper, we extend jittered sampling to two-dimensional acquisition design, a more difficult problem, with both underlying Cartesian and hexagonal grids. We also study what we term separable and non-separable two-dimensional jittered samplings. We find that hexagonal jittered sampling performs better than Cartesian jittered sampling, and that fully non-separable jittered sampling performs better than separable jittered sampling. Two other 2D randomized sampling methods, Poisson Disk sampling and Farthest Point sampling, both known to possess blue-noise spectra, are also shown to perform well.
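Jittered sampling itself is only a few lines: partition the line into equal cells and place one receiver per cell with a bounded random offset. The sketch below is the 1D building block (a separable 2D design simply applies it to x and y independently); the cell count and size are invented values:

```python
import random
random.seed(7)

def jittered_1d(n_cells, cell_size, jitter=0.5):
    """Jittered subsampling along a line: one receiver per cell,
    displaced uniformly by up to +/- jitter*cell_size from the cell
    centre.  The gap between neighbours is bounded by 2*cell_size,
    which is the 'controlled gap size' property that lets randomness
    turn aliases into removable noise."""
    return [(i + 0.5) * cell_size
            + random.uniform(-jitter, jitter) * cell_size
            for i in range(n_cells)]

xs = jittered_1d(10, 25.0)  # e.g. 10 geophones on a 250 m line
gaps = [b - a for a, b in zip(xs, xs[1:])]
print(max(gaps))  # never exceeds 2 * cell_size = 50 m
```

Non-separable 2D jittering, as studied in the paper, instead jitters positions within 2D cells directly rather than composing two independent 1D patterns.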

19.
A nitrate sensor was set up to measure the nitrate signal every 10 min in a stream draining a small agricultural catchment dominated by fertilized crops, over a 2-year study period (2006-2008) in the south-west of France. An in situ sampling protocol using an automatic sampler to monitor flood events was used to perform a point-to-point calibration of the sensor values. The nitrate concentration exhibits non-systematic concentration and dilution effects during flood events. We demonstrate, using the Nyquist-Shannon sampling theorem, that the calibrated nitrate sensor signal gathered at the outlet can be considered a continuous signal. The objectives of this study are to quantify the errors generated by a typical infrequent sampling protocol and to design an appropriate sampling strategy according to the sampling objectives. The nitrate concentration signal and flow data are numerically subsampled to simulate common sampling frequencies. The total fluxes calculated from the simulated samples are compared with the reference value computed from the continuous signal. Uncertainties increase as sampling intervals increase, and methods that do not use the continuous discharge record to compute nitrate fluxes bring larger uncertainty. The dispersion and bias computed for each sampling interval are used to evaluate the uncertainty during each hydrological period. Loads are strongly underestimated during flood periods when the high-concentration period is overlooked. Conversely, high sampling frequencies (from 3 h to 1 day) lead to a systematic overestimation (bias around 3%), because in that case the highest concentrations are overweighted by the interpolation of the concentration. The in situ sampling protocol generates less than 1% load estimation error and samples the highest concentration peaks.
We consider such newly emerging field technologies useful for assessing short-term variations of water quality parameters, minimizing the number of samples to be analysed, and assessing the quality state of the stream at any time. Copyright © 2012 John Wiley & Sons, Ltd.

20.
The benefits of three simple modifications to the design of a Birkbeck bedload slot-sampling system that has been operating continuously in Nahal Eshtemoa, Israel, since the early 1990s are demonstrated. The modifications include: a removable slot cover which delays the accumulation of sediment, allowing sampling at late stages of a flood and, in conjunction with other samplers, extending the period of sampling during a flood wave; a slot whose size is adjustable, so that the probability of sampling the largest clast sizes in transit as bedload can be increased post-installation, once knowledge of the bedload grain-size distribution is gained; and a sampler side-wall door that allows stratification and textural changes within the accumulated bedload to be identified, promoting intelligent sampling of the deposit for grain-size determination. Results from seven flash floods are presented and discussed, with recommendations for bedload monitoring, particularly in rivers where sediment flux is high and dynamic sediment records are inevitably short because of instrumental limitations. Copyright © 2006 John Wiley & Sons, Ltd.
