Similar Documents
20 similar documents found.
1.
A general method for estimating ground-water solute mass transfer rate parameters from field test data is presented. The method entails matching solute concentration and hydraulic head data collected during the recovery phase of a pumping test through application of a simulation-regression technique. Estimation of hydraulic conductivity and mass transfer rate parameter values is performed by fitting model simulations to the data. Parameter estimates are utilized to assess cleanup times for pump-and-treat aquifer remediation scenarios. Uncertainty in the cleanup time estimate is evaluated using statistical information obtained with the parameter estimation technique. Application of the method is demonstrated using a hypothetical ground-water flow and solute transport system. Simulations of field testing, parameter estimation, and remedial time frames are performed to evaluate the usefulness of the method. Sets of random noise that signify potential field and laboratory measurement errors are combined with the hypothetical data to provide rigorous testing of the method. Field tests are simulated using ranges of values for data noise, the mass transfer rate parameters, the test pumping rates, and the duration of recovery monitoring to evaluate their respective influence on parameter and cleanup time estimates. The demonstration indicates the method is capable of yielding accurate estimates of the solute mass transfer rate parameters. When the parameter values for the hypothetical system are well estimated, cleanup time predictions are shown to be more accurate than when calculated using the local equilibrium assumption.
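A minimal sketch of the simulation-regression idea: fit a first-order mass-transfer rebound model to noisy recovery-phase concentration data and read parameter uncertainty off the regression covariance. The exponential rebound model, the "true" parameter values, and the noise level are illustrative assumptions, not the paper's actual test problem.

```python
import numpy as np
from scipy.optimize import curve_fit

def rebound(t, alpha, c_im):
    """Mobile-zone concentration rebounding toward the immobile-zone
    concentration c_im at first-order mass-transfer rate alpha [1/d]."""
    c_m0 = 0.05  # concentration at the end of pumping (assumed known)
    return c_im + (c_m0 - c_im) * np.exp(-alpha * t)

rng = np.random.default_rng(0)
t_obs = np.linspace(0.0, 60.0, 25)                  # days of recovery monitoring
c_true = rebound(t_obs, alpha=0.08, c_im=1.0)       # synthetic "field" data
c_obs = c_true + rng.normal(0.0, 0.02, t_obs.size)  # measurement noise

popt, pcov = curve_fit(rebound, t_obs, c_obs, p0=[0.01, 0.5])
alpha_hat, c_im_hat = popt
alpha_sd = np.sqrt(pcov[0, 0])  # regression statistics -> parameter uncertainty
print(f"alpha = {alpha_hat:.3f} +/- {alpha_sd:.3f} 1/d, c_im = {c_im_hat:.2f}")
# Cleanup-time uncertainty follows by propagating pcov through the
# pump-and-treat model, e.g., t_cleanup ~ ln(c_im / MCL) / alpha.
```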

2.
Integrated watershed models can be used to calculate streamflow generation in snow‐dominated mountainous catchments. Parameterization of water flow is often complicated by the lack of information on subsurface hydraulic properties. In this study, bulk density optimization was used to determine hydraulic parameters for the upper and lower regolith in the GEOtop model. The methodology was tested in two small catchments in the Dry Creek Watershed in Idaho and the Libby Creek Watershed in Wyoming. Modelling efficiencies for profile‐average soil–water content for the two catchments were between 0.52 and 0.64. Modelling efficiencies for stream discharge (cumulative stream discharge) were 0.45 (0.91) and 0.54 (0.94) for the Idaho and Wyoming catchments, respectively. The calculated hydraulic properties suggest that lateral flow across the upper–lower regolith interface is an important driver of streamflow in both the Idaho and Wyoming watersheds. The overall calibration procedure is computationally efficient because only two bulk density values are optimized. The two‐parameter calibration procedure was complicated by uncertainty in hydraulic conductivity anisotropy. Different upper regolith hydraulic conductivity anisotropy factors had to be tested in order to describe streamflow in both catchments.
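A sketch of the two-parameter calibration idea: each regolith layer's hydraulic properties are derived from a single bulk-density value via a pedotransfer function, and the two bulk densities are optimized against observed soil-water content using the modelling efficiency (Nash-Sutcliffe) as the score. The pedotransfer relations and the surrogate "model" below are illustrative stand-ins for GEOtop, not its actual equations.

```python
import numpy as np
from scipy.optimize import minimize

def pedotransfer(rho_b):
    """Map bulk density [g/cm^3] to porosity and saturated conductivity [m/d]
    (hypothetical relations for demonstration only)."""
    porosity = 1.0 - rho_b / 2.65          # from particle density 2.65 g/cm^3
    k_sat = 10.0 ** (2.0 - 3.0 * rho_b)    # assumed log-linear decrease
    return porosity, k_sat

def simulate_theta(rho_upper, rho_lower, t):
    """Surrogate for a GEOtop run: profile-average water content over time."""
    por_u, k_u = pedotransfer(rho_upper)
    por_l, k_l = pedotransfer(rho_lower)
    drainage = np.exp(-t * (k_u + k_l) / 10.0)
    return 0.5 * (por_u + por_l) * (0.4 + 0.6 * drainage)

def nse(sim, obs):
    """Nash-Sutcliffe modelling efficiency used to score the fit."""
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

t = np.linspace(0.0, 30.0, 40)
theta_obs = (simulate_theta(1.35, 1.60, t)
             + np.random.default_rng(1).normal(0, 0.01, t.size))

res = minimize(lambda p: -nse(simulate_theta(p[0], p[1], t), theta_obs),
               x0=[1.2, 1.5], bounds=[(0.8, 1.9), (0.8, 1.9)], method="L-BFGS-B")
print("optimized bulk densities:", res.x.round(3), " NSE:", round(-res.fun, 3))
```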

3.
We examine the effect of uncertainty due to limited information on the remediation design of a contaminated aquifer using the pump and treat method. The hydraulic conductivity and contaminant concentration distributions for a fictitious contaminated aquifer are generated assuming a limited number of sampling locations. Stochastic optimization with multiple realizations is used to account for aquifer uncertainty. The optimization process involves a genetic algorithm (GA). As the number of realizations increases, a greater extraction rate and more wells are needed; the total cost increases, but the optimal remediation designs become more reliable. Stochastic optimization analysis also determines the locations for extraction wells, the variation in extraction rates as a function of the change of well locations, and the reliability of the optimal designs. The number of realizations (stack number) at which the design factors converge can be determined, allowing effective stochastic optimization with reduced computational resources. An increase in the variability of the conductivity distribution requires more extraction wells. Information about potential extraction wells can be used to prevent failure of the remediation task.
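A minimal sketch of stochastic optimization over multiple realizations: a genetic algorithm selects pumping rates for candidate wells so that a capture constraint holds across an ensemble of conductivity realizations. The cost model, the capture surrogate, and the GA settings are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
n_wells, n_real = 4, 20
K = rng.lognormal(mean=0.0, sigma=1.0, size=(n_real, n_wells))  # realizations

def cost(q):
    """Capital cost per active well plus operating cost per unit rate."""
    return 50.0 * np.sum(q > 1e-3) + 10.0 * q.sum()

def violations(q):
    """Fraction of realizations where total capture falls short (surrogate
    for a head/concentration constraint evaluated per realization)."""
    capture = (K * q).sum(axis=1)
    return np.mean(capture < 30.0)

def fitness(q):
    return cost(q) + 1e4 * violations(q)  # heavy penalty -> reliable designs

pop = rng.uniform(0.0, 10.0, size=(60, n_wells))
for gen in range(200):
    f = np.array([fitness(q) for q in pop])
    parents = pop[np.argsort(f)[:30]]                      # truncation selection
    kids = 0.5 * (parents + parents[rng.permutation(30)])  # arithmetic crossover
    kids += rng.normal(0.0, 0.3, kids.shape)               # mutation
    pop = np.vstack([parents, np.clip(kids, 0.0, 10.0)])

best = pop[np.argmin([fitness(q) for q in pop])]
print("optimal rates:", best.round(2), " cost:", round(cost(best), 1))
# Repeating this with increasing n_real shows the stack number at which
# the design (rates, number of active wells) stops changing, i.e., converges.
```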

4.
This study introduces Bayesian model averaging (BMA) to deal with model structure uncertainty in groundwater management decisions. A robust optimized policy should take into account model parameter uncertainty as well as uncertainty in imprecise model structure. Due to a limited amount of groundwater head data and hydraulic conductivity data, multiple simulation models are developed based on different head boundary condition values and semivariogram models of hydraulic conductivity. Instead of selecting the best simulation model, a variance-window-based BMA method is introduced to the management model to utilize all simulation models to predict chloride concentration. Given different semivariogram models, the spatially correlated hydraulic conductivity distributions are estimated by the generalized parameterization (GP) method that combines the Voronoi zones and the ordinary kriging (OK) estimates. The model weights of BMA are estimated by the Bayesian information criterion (BIC) and the variance window in the maximum likelihood estimation. The simulation models are then weighted to predict chloride concentrations within the constraints of the management model. The methodology is implemented to manage saltwater intrusion in the “1,500-foot” sand aquifer in the Baton Rouge area, Louisiana. The management model aims to obtain optimal joint operations of the hydraulic barrier system and the saltwater extraction system to mitigate saltwater intrusion. A genetic algorithm (GA) is used to obtain the optimal injection and extraction policies. Using the BMA predictions, higher injection rates and pumping rates are needed to cover more constraint violations, which do not occur if a single best model is used.
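A sketch of BIC-based model averaging: each simulation model (different boundary heads or semivariograms) gets a weight derived from its BIC, and predictions are combined with those weights. All residuals, parameter counts, and predictions are invented; the "variance window" is represented only as an assumed scaling factor on the BIC differences.

```python
import numpy as np

# Sum of squared residuals, number of data, and parameter counts for three
# hypothetical simulation models fitted to the same head/concentration data.
ssr = np.array([12.4, 10.9, 15.2])
n, k = 80, np.array([5, 7, 4])

bic = n * np.log(ssr / n) + k * np.log(n)       # Gaussian-error BIC
delta = bic - bic.min()
# A "variance window" scales delta before weighting; s = 1 recovers the
# standard BIC weights, larger s flattens them (assumed factor here).
s = 2.0
w = np.exp(-delta / (2.0 * s))
w /= w.sum()

pred = np.array([260.0, 240.0, 300.0])          # chloride predictions [mg/L]
print("weights:", w.round(3), " BMA prediction:", round(float(w @ pred), 1))

# BMA variance = weighted within-model variance + between-model spread.
var_within = np.array([15.0, 12.0, 20.0]) ** 2
var_bma = float(w @ (var_within + (pred - w @ pred) ** 2))
print("BMA predictive std:", round(np.sqrt(var_bma), 1), "mg/L")
```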

5.
Decision Analysis for Pump-and-Treat Design
The use of decision analysis (DA) has been proposed as a technique for selecting from among alternative designs for subsurface remediation. To assess the ability of DA to generate consistent decisions for the widely practiced pump-and-treat (PAT) strategy, 27 candidate PAT designs were compared for a case study site. The sensitivity of the alternative selection to various modeling assumptions was examined, including the complexity of the site-specific numerical models, the assumed degree of aquifer heterogeneity, the manner of defining failure, and the assumed cost of failure. The initial net-present-worth analysis resulted in the selection of one of two designs that included injection wells for effluent disposal and hydraulic control. However, when the injection wells were excluded from consideration, the selection from a diverse set of alternative PAT designs was highly sensitive to the particular modeling assumptions. In general, the practical usefulness of the DA approach is dependent on the ability to characterize the nature and probability of system failure.
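A sketch of the decision-analysis comparison: rank candidate PAT designs by expected net present worth, computed as capital plus discounted O&M plus probability of failure times discounted failure cost. All costs and probabilities below are invented; in the study, failure probabilities come from the site models under each set of assumptions.

```python
import numpy as np

def npw(capital, annual_om, p_fail, cost_fail, years=30, rate=0.05):
    """Expected net present worth of a design, in $k (illustrative)."""
    disc = 1.0 / (1.0 + rate) ** np.arange(1, years + 1)
    return capital + annual_om * disc.sum() + p_fail * cost_fail * disc[-1]

designs = {
    # name: (capital $k, O&M $k/yr, P(fail), failure cost $k)
    "3 extraction wells":         (400.0, 60.0, 0.25, 2000.0),
    "5 extraction wells":         (650.0, 85.0, 0.08, 2000.0),
    "4 extraction + 2 injection": (700.0, 80.0, 0.03, 2000.0),
}
for name, args in designs.items():
    print(f"{name:30s} expected NPW = {npw(*args):8.0f} $k")
# The ranking can flip when the assumed cost of failure or the definition of
# failure (hence P(fail)) changes, which is exactly the sensitivity the
# study probes.
```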

6.
Biostimulation is increasingly used to accelerate microbial remediation of recalcitrant groundwater contaminants. Effective application of biostimulation requires successful emplacement of amendment in the contaminant target zone. Verification of remediation performance requires postemplacement assessment and contaminant monitoring. Sampling‐based approaches are expensive and provide low‐density spatial and temporal information. Time‐lapse electrical resistivity tomography (ERT) is an effective geophysical method for determining temporal changes in subsurface electrical conductivity. Because remedial amendments and biostimulation‐related biogeochemical processes often change subsurface electrical conductivity, ERT can complement and enhance sampling‐based approaches for assessing emplacement and monitoring biostimulation‐based remediation. Field studies demonstrating the ability of time‐lapse ERT to monitor amendment emplacement and behavior were performed during a biostimulation remediation effort conducted at the Department of Defense Reutilization and Marketing Office (DRMO) Yard, in Brandywine, Maryland, United States. Geochemical fluid sampling was used to calibrate a petrophysical relation in order to predict groundwater indicators of amendment distribution. The petrophysical relations were field validated by comparing predictions to sequestered fluid sample results, thus demonstrating the potential of electrical geophysics for quantitative assessment of amendment‐related geochemical properties. Crosshole radar zero‐offset profile and borehole geophysical logging were also performed to augment the data set and validate interpretation. In addition to delineating amendment transport in the first 10 months after emplacement, the time‐lapse ERT results show later changes in bulk electrical properties interpreted as mineral precipitation. Results support the use of more cost‐effective surface‐based ERT in conjunction with limited field sampling to improve spatial and temporal monitoring of amendment emplacement and remediation performance.
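A sketch of calibrating a petrophysical relation: bulk electrical conductivity from ERT is regressed against co-located fluid-sample conductivity, then inverted to predict amendment-related fluid EC where only ERT is available. A linear Archie-type relation (sigma_bulk = sigma_fluid / F, with formation factor F and negligible surface conduction) is assumed; the data are synthetic, not from the DRMO site.

```python
import numpy as np

rng = np.random.default_rng(3)
sigma_fluid = np.array([0.05, 0.12, 0.30, 0.55, 0.90, 1.40])   # S/m, samples
F_true = 4.2                                                   # formation factor
sigma_bulk = sigma_fluid / F_true + rng.normal(0, 0.005, 6)    # co-located ERT

# Least-squares estimate of 1/F (regression slope through the origin).
slope = float(sigma_bulk @ sigma_fluid / (sigma_fluid @ sigma_fluid))
print(f"estimated formation factor F = {1.0 / slope:.2f}")

# Apply the calibrated relation to time-lapse ERT voxels near a well to
# track amendment arrival and, later, mineral-precipitation signatures.
sigma_bulk_new = np.array([0.02, 0.11, 0.19])   # ERT-derived values at t1..t3
print("predicted fluid EC [S/m]:", (sigma_bulk_new / slope).round(2))
```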

7.
Optimal cost pump-and-treat ground water remediation designs for containment of a contaminated aquifer are often developed using deterministic ground water models to predict ground water flow. Uncertainty in hydraulic conductivity fields used in these models results in remediation designs that are unreliable. The degree to which uncertainty contributes to the reliability of remediation designs as measured by the characterization of the uncertainty is shown to differ depending upon the geologic environments of the models. This conclusion is drawn from the optimal design costs for multiple deterministic models generated to represent the uncertainty of four distinct models with different geologic environments. A multiscenario approach that incorporates uncertainty into the remediation design, called the deterministic method for optimization subject to uncertainty (DMOU), is applied to these distinct models. It is found that the DMOU is a method for determining a remediation design subject to uncertainty that requires minimal postprocessing effort. Preprocessing, however, is required for the application of the DMOU to unique problems. In the ground water remediation design problems, the orientation of geologic facies with respect to the orientation of flow patterns, pumping well locations, and constraint locations are shown to affect the preprocessing, the solutions to the DMOU problems, and the computational efficiency of the DMOU approach. The results of the DMOU are compared to the results of a statistical analysis of the effects of the uncertainty on remediation designs. This comparison validates the efficacy of the DMOU and illustrates the computational advantages to using the DMOU over statistical measures.
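A sketch of the multiscenario idea behind DMOU: one design (here, pumping rates) is optimized while the containment constraint must hold in every deterministic model generated to represent conductivity uncertainty. The linear drawdown-response surrogate per scenario is an illustrative stand-in for full flow models.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)
n_wells, n_models = 3, 8
# Response of the gradient-control (compliance) point to each well, one row
# per deterministic conductivity scenario (hypothetical values).
A = rng.uniform(0.2, 1.0, size=(n_models, n_wells))

def total_cost(q):
    return q.sum()                      # operating cost ~ total pumping rate

def worst_constraint(q):
    """Inward gradient required at the compliance point in *all* scenarios:
    A @ q >= 1.0; return the worst (smallest) margin."""
    return (A @ q - 1.0).min()

res = minimize(total_cost, x0=np.full(n_wells, 2.0),
               constraints=[{"type": "ineq", "fun": worst_constraint}],
               bounds=[(0.0, 10.0)] * n_wells, method="SLSQP")
print("rates:", res.x.round(2), " cost:", round(float(res.fun), 2))
# Only the binding (worst-case) scenarios matter, so facies orientation
# relative to flow, well locations, and constraint locations controls which
# rows of A bind -- the preprocessing sensitivity the abstract describes.
```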

8.
Recent advancements in analytical solutions to quantify water and solute travel time distributions (TTDs) and the related StorAge Selection (SAS) functions synthesize catchment complexity into a simplified, lumped representation. Although these analytical approaches are efficient in application, they require rarely available long‐term and high‐frequency hydrochemical data for parameter estimation. Alternatively, integrated hydrologic models coupled to Lagrangian particle‐tracking approaches can directly simulate age under different catchment geometries and complexity, but at a greater computational expense. Here, we bridge the two approaches, using a physically based model to explore the uncertainty in the estimation of the SAS function shape. In particular, we study the influence of subsurface heterogeneity, interactions between distinct flow domains (i.e., the vadose zone and saturated groundwater), diversity of flow pathways, and recharge rate on the shape of TTDs and the SAS functions. We use an integrated hydrology model, ParFlow, linked with a particle‐tracking model, SLIM, to compute transient residence times (or ages) at every cell in the domain, facilitating a direct characterization of the SAS function. Steady‐state results reveal that the SAS function shape shows a wide range of variation with respect to the variability in the structure of subsurface heterogeneity. Ensembles of spatially correlated realizations of hydraulic conductivity indicate that the SAS functions in the saturated groundwater have an overall weak tendency toward sampling younger ages, whereas the vadose zone gives a strong preference for older ages. We further show that the influence of recharge rate on the TTD is tightly dependent on the variability of subsurface hydraulic conductivity.
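A sketch of deriving an SAS function directly from particle ages, as an integrated model plus particle tracking permits: for each discharge particle, find the rank (cumulative fraction) of its age within the storage-age distribution; the CDF of those ranks is the empirical SAS function. Synthetic exponential ages stand in for ParFlow/SLIM output.

```python
import numpy as np

rng = np.random.default_rng(5)
storage_ages = rng.exponential(scale=5.0, size=20000)   # ages in storage [yr]
outflow_ages = rng.exponential(scale=2.0, size=5000)    # ages in discharge [yr]

storage_sorted = np.sort(storage_ages)
# Normalized age-ranked storage P_S(a) for each discharge particle:
ranks = np.searchsorted(storage_sorted, outflow_ages) / storage_sorted.size

x = np.linspace(0.0, 1.0, 101)
omega_cdf = np.array([(ranks <= xi).mean() for xi in x])  # empirical Omega

# Omega above the 1:1 line indicates preference for young storage; here the
# outflow is younger than storage on average, so Omega(x) > x is expected.
print(f"Omega(0.25) = {omega_cdf[25]:.2f} (uniform sampling would give 0.25)")
```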

9.
This study investigates the impact of hydraulic conductivity uncertainty on the sustainable management of the aquifer of Lake Karla, Greece, using a stochastic optimization approach. The lack of surface water resources, in combination with the sharp increase in irrigation needs in the basin over the last 30 years, has led to an unprecedented degradation of the aquifer. In addition, the lack of data regarding hydraulic conductivity in a heterogeneous aquifer leads to hydrogeologic uncertainty. This uncertainty has to be taken into consideration when developing the optimization procedure in order to achieve the aquifer’s sustainable management. Multiple Monte Carlo realizations of this spatially-distributed parameter are generated and groundwater flow is simulated for each one of them. The main goal of the sustainable management of the ‘depleted’ aquifer of Lake Karla is two-fold: to determine the optimum volume of renewable groundwater that can be extracted, while, at the same time, restoring its water table to a historic high level. A stochastic optimization problem is therefore formulated, based on the application of the optimization method for each of the aquifer’s multiple stochastic realizations in a future period. In order to carry out this stochastic optimization procedure, a modelling system consisting of a series of interlinked models was developed. The results show that the proposed stochastic optimization framework can be a very useful tool for estimating the impact of hydraulic conductivity uncertainty on the management strategies of a depleted aquifer restoration. They also show that the optimization process is affected more by hydraulic conductivity uncertainty than the simulation process is.

10.
Groundwater models are critical decision support tools for water resources management and environmental remediation. However, limitations in site characterization data and conceptual models can adversely affect the reliability of groundwater models. Therefore, there is a strong need for continuous model uncertainty reduction. Ensemble filters have recently emerged as promising high-dimensional data assimilation techniques. Two general categories of ensemble filters exist in the literature: perturbation-based and deterministic. Deterministic ensemble filters have been extensively studied for their better performance and robustness in assimilating oceanographic and atmospheric data. In hydrogeology, while a number of previous studies demonstrated the usefulness of the perturbation-based ensemble Kalman filter (EnKF) for joint parameter and state estimation, there have been few systematic studies investigating the performance of deterministic ensemble filters. This paper presents a comparative study of four commonly used deterministic ensemble filters for sequentially estimating the hydraulic conductivity parameter in low- and moderately high-dimensional groundwater models. The performance of the filters is assessed on the basis of twin experiments in which the true hydraulic conductivity field is assumed known. The test results indicate that the deterministic ensemble Kalman filter (DEnKF) is the most robust filter and achieves the best performance at relatively small ensemble sizes. Deterministic ensemble filters often make use of covariance inflation and localization to stabilize filter performance. Sensitivity studies demonstrate the effects of covariance inflation, localization, observation density, and conditioning on filter performance.
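A sketch of one DEnKF analysis step (after Sakov and Oke, 2008): the ensemble mean is updated with the full Kalman gain while the anomalies are updated with half the gain, avoiding observation perturbation. Dimensions, the linear observation operator H, and the synthetic observations are illustrative.

```python
import numpy as np

rng = np.random.default_rng(6)
n_state, n_ens, n_obs = 50, 20, 5           # log-K cells, members, head obs
E = rng.normal(0.0, 1.0, (n_state, n_ens))  # forecast ensemble of log-K
H = np.zeros((n_obs, n_state))
H[np.arange(n_obs), np.arange(0, n_state, 10)] = 1.0   # obs every 10th cell
R = 0.01 * np.eye(n_obs)                    # observation-error covariance
y = rng.normal(0.5, 0.1, n_obs)             # observations

xm = E.mean(axis=1, keepdims=True)
A = (E - xm) / np.sqrt(n_ens - 1)           # scaled anomalies (A A^T = P_f)
S = H @ A                                   # observation-space anomalies
K = A @ S.T @ np.linalg.inv(S @ S.T + R)    # Kalman gain

xm_a = xm + K @ (y[:, None] - H @ xm)       # full-gain mean update
A_a = A - 0.5 * K @ S                       # half-gain anomaly update (DEnKF)
E_a = xm_a + A_a * np.sqrt(n_ens - 1)       # analysis ensemble
print("analysis ensemble shape:", E_a.shape)
# Covariance inflation would scale A_a; localization would taper K entries
# by distance -- the stabilizers whose effects the paper's sensitivity
# studies examine.
```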

11.
Groundwater model predictions are often uncertain due to inherent uncertainties in model input data. Monitored field data are commonly used to assess the performance of a model and reduce its prediction uncertainty. Given the high cost of data collection, it is imperative to identify the minimum number of required observation wells and to define the optimal locations of sampling points in space and depth. This study proposes a design methodology to optimize the number and location of additional observation wells that will effectively measure multiple hydrogeological parameters at different depths. For this purpose, we incorporated Bayesian model averaging and genetic algorithms into a linear data-worth analysis in order to conduct a three-dimensional location search for new sampling locations. We evaluated the methodology by applying it along a heterogeneous coastal aquifer with limited hydrogeological data that is experiencing salt water intrusion (SWI). The aim of the model was to identify the best locations for sampling head and salinity data, while reducing uncertainty when predicting multiple variables of SWI. The resulting optimal locations for new observation wells varied with the defined design constraints. The optimal design (OD) depended on the ratio of the start-up cost of the monitoring program and the installation cost of the first observation well. The proposed methodology can contribute toward reducing the uncertainties associated with predicting multiple variables in a groundwater system.
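A sketch of the linear data-worth calculation that drives such a well-placement search: with a linearized model, adding an observation reduces the prediction variance by an amount computable before the data are collected. The prior covariance, sensitivities, and noise level below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)
n_par = 30
C = np.diag(rng.uniform(0.5, 2.0, n_par))   # prior parameter covariance
g = rng.normal(0.0, 1.0, n_par)             # sensitivity of the SWI prediction
                                            # to the parameters (assumed)
var_prior = g @ C @ g                       # prior predictive variance

def worth(h, sigma2=0.05):
    """Variance reduction from one candidate observation with sensitivity
    row h and noise variance sigma2 (scalar Bayesian update)."""
    c_py = g @ C @ h                        # prediction-observation covariance
    return c_py ** 2 / (h @ C @ h + sigma2)

candidates = rng.normal(0.0, 1.0, (100, n_par))  # 100 candidate wells/depths
best = max(range(100), key=lambda i: worth(candidates[i]))
print("prior variance:", round(float(var_prior), 2),
      " best candidate:", best,
      " variance reduction:", round(float(worth(candidates[best])), 2))
# A GA would search combinations of candidates subject to drilling costs;
# BMA repeats the calculation per simulation model and averages the worth
# with the model weights.
```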

12.
This study investigates stochastic optimization of dense nonaqueous phase liquid (DNAPL) remediation design at Dover Air Force Base Area 5 using emulsified vegetable oil (EVO) injection. The Stochastic Cost Optimization Toolkit (SCOToolkit) is used for the study, which couples semianalytical DNAPL source depletion and transport models with parameter estimation, error propagation, and stochastic optimization modules that can consider multiple sources and remediation strategies. Model parameters are calibrated to field data, conditioned on prior estimates of the parameters and their uncertainty. Monte Carlo simulations are then performed to identify optimal remediation decisions that minimize the expected net present value (NPV) cleanup cost while maintaining concentrations at compliance wells below the maximum contaminant level (MCL). The results show that annual operating costs could be reduced by approximately 50% by implementing the identified optimal remediation strategy. We also show that recalibration and reoptimization after 50 years using additional monitoring data could lead to a further 60% reduction in annual operating cost and increase the reliability of the proposed remediation actions.
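A sketch of the expected-NPV objective used in this style of stochastic cost optimization: Monte Carlo parameter draws propagate to residual concentration, operating costs are discounted, and designs that violate the MCL are penalized. The source-depletion surrogate, the EVO "enhancement" factor, and all costs are invented for illustration; they are not SCOToolkit's models.

```python
import numpy as np

rng = np.random.default_rng(8)
n_mc = 2000
# Uncertain source-depletion rate [1/yr], e.g., from posterior calibration.
k_decay = rng.lognormal(np.log(0.10), 0.4, n_mc)

def expected_npv_cost(evo_years, rate=0.04, mcl=0.005, c0=0.5):
    """Expected discounted O&M cost [$k] of operating EVO injection for
    evo_years, plus a penalty when residual concentration at the compliance
    well exceeds the MCL after a 30-yr horizon (all values hypothetical)."""
    boost = 1.0 + 0.5 * evo_years                  # assumed EVO enhancement
    c_end = c0 * np.exp(-k_decay * boost * 30.0)   # residual concentration
    om = 80.0 * (1 - (1 + rate) ** -evo_years) / rate   # O&M annuity
    penalty = 5000.0 * (c_end > mcl)               # compliance violation cost
    return float((om + penalty).mean())

for yrs in (2, 5, 8, 12):
    print(f"EVO operation {yrs:2d} yr -> expected NPV cost "
          f"{expected_npv_cost(yrs):7.0f} $k")
# Recalibrating k_decay against new monitoring data narrows its spread,
# which typically shifts the optimum toward shorter, cheaper operation --
# the reoptimization gain the abstract reports.
```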

13.
Characterization of groundwater contaminant source using Bayesian method
Contaminant source identification in groundwater systems is critical for implementing remediation strategies, including gathering further samples for analysis as well as implementing and evaluating different remediation plans. Such problems are usually solved with the aid of groundwater modeling subject to considerable uncertainty, e.g., uncertainty in hydraulic conductivity, measurement variance, and model structure error. Monte Carlo simulation of the flow model propagates input uncertainty onto the model predictions of concentration measurements at monitoring sites, and a Bayesian approach provides the means to update the estimates. This paper presents an application of a dynamic framework coupled with a three-dimensional groundwater modeling scheme for identifying the source of groundwater contamination. Markov chain Monte Carlo (MCMC) is applied to infer the possible location and magnitude of the contamination source. Uncertainty in the heterogeneous hydraulic conductivity field is explicitly considered in evaluating the likelihood function. Unlike inverse-problem approaches that provide a single, possibly incorrect, solution, the MCMC algorithm provides probability distributions over the estimated parameters. Results from this algorithm offer a probabilistic inference of the location and concentration of the released contamination. The convergence analysis of MCMC demonstrates the effectiveness of the proposed algorithm. Further investigations to extend this study are also discussed.
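A sketch of MCMC source identification: a Metropolis sampler infers a source location and strength from noisy downgradient concentrations. The 1-D steady plume forward model and uniform priors are illustrative; in the paper the forward model is a 3-D groundwater simulation and conductivity realizations enter the likelihood.

```python
import numpy as np

rng = np.random.default_rng(9)
x_obs = np.array([20.0, 40.0, 60.0, 80.0])             # monitoring wells [m]

def forward(xs, m, lam=0.02):
    """Concentration at x_obs from a source of strength m at location xs,
    with first-order attenuation downgradient (zero upgradient)."""
    d = x_obs - xs
    return np.where(d > 0.0, m * np.exp(-lam * d), 0.0)

c_obs = forward(25.0, 10.0) + rng.normal(0.0, 0.2, 4)  # synthetic data

def log_post(theta):
    xs, m = theta
    if not (0.0 < xs < 100.0 and 0.0 < m < 50.0):      # uniform priors
        return -np.inf
    return -0.5 * np.sum((forward(xs, m) - c_obs) ** 2) / 0.2 ** 2

theta = np.array([50.0, 5.0])                          # initial guess
chain = []
for _ in range(20000):
    prop = theta + rng.normal(0.0, [2.0, 0.5])         # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop                                   # Metropolis acceptance
    chain.append(theta)
chain = np.array(chain[5000:])                         # discard burn-in
print("posterior mean (location, strength):", chain.mean(axis=0).round(2))
```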

14.
In geostatistical inverse modeling, hydrogeological parameters, such as hydraulic conductivity, are estimated as spatial fields. Upon discretization this results in several thousand (log-)hydraulic conductivity values to be estimated. Common inversion schemes rely on gradient-based parameter estimation methods which require the sensitivity of all measurements with respect to all parameters. Point-like measurements of steady-state concentration in aquifers are generally not well suited for gradient-based methods, because typical plumes exhibit only a very narrow fringe at which the concentration decreases from a maximal value to zero. Only there does the sensitivity of concentration with respect to hydraulic conductivity differ significantly from zero. Thus, if point-like measurements of steady-state concentration do not lie in this narrow fringe, their sensitivity with respect to hydraulic conductivity is zero. Observations of concentrations averaged over a larger control volume, by contrast, show a more regular sensitivity pattern. We thus suggest artificially increasing the sampling volume of steady-state concentration measurements for the evaluation of sensitivities in early stages of an iterative parameter estimation scheme. We present criteria for the extent of artificially increasing the sampling volume and for decreasing it when the simulation results converge to the measurements. By this procedure, we achieve high stability in geostatistical inversion of steady-state concentration measurements. The uncertainty of the estimated parameter fields is evaluated by generating conditional realizations.

15.
Ye Zhang, Ground Water, 2014, 52(3): 343–351
Modeling and calibration of natural aquifers with multiple scales of heterogeneity is a challenging task due to limited subsurface access. While computer modeling plays an essential role in aquifer studies, large uncertainty exists in developing a conceptual model of an aquifer and in calibrating the model for decision making. Due to uncertainties such as a lack of understanding of subsurface processes and a lack of techniques to parameterize the subsurface environment (including hydraulic conductivity, source/sink rate, and aquifer boundary conditions), existing aquifer models often suffer from nonuniqueness in calibration, leading to poor predictive capability. A robust calibration methodology is needed that can address the simultaneous estimation of aquifer parameters, sources/sinks, and boundary conditions. In this paper, we propose a multistage and multiscale approach that addresses subsurface heterogeneity at multiple scales, while reducing uncertainty in estimating the model parameters and model boundary conditions. The key to this approach lies in the appropriate development, verification, and synthesis of existing and new techniques of static and dynamic data integration. In particular, based on a given set of observation data, new inversion techniques can first be used to estimate aquifer large‐scale effective parameters and smoothed boundary conditions, based on which parameter and boundary condition estimation can be refined at increasing detail using standard or highly parameterized estimation techniques.

16.
Numerical modeling of groundwater-surface water interactions provides vital information necessary for determining the extent of nutrient transport, quantifying water budgets, and delineating zones of ecological support. The hydrologic data that drive these models are often collected at disparate scales and subsequently incorporated into numerical models through upscaling techniques such as piecewise constancy or geostatistical methods. However, these techniques either use basic interpolation methods, which often oversimplify the system of interest, or utilize complex statistical methods that are computationally expensive, time consuming, and generate complex subsurface configurations. We propose a bulk parameter termed “vertically integrated hydraulic conductivity” (KV), defined as the depth-integrated resistance to fluid flow sensed at the groundwater-surface water interface, as an alternative to hydraulic conductivity when investigating vertical fluxes across the groundwater-surface water interface. This bulk parameter replaces complex subsurface configurations in situations dominated by vertical fluxes and where heterogeneity is not of primary importance. To demonstrate the utility of KV, we extracted synthetic temperature time series data from a forward numerical model under a variety of scenarios and used those data to quantify vertical fluxes using the amplitude ratio method. These quantified vertical fluxes and the applied hydraulic head gradient were subsequently input into Darcy's Law and used to quantify KV. This KV was then directly compared to the equivalent hydraulic conductivity (KT) assuming an infinitely extending layer. Vertically integrated hydraulic conductivity allows for more accurate and robust flow modeling across the groundwater-surface water interface in instances where complex heterogeneities are not of primary concern.
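A sketch of the workflow around KV: extract the diurnal amplitude ratio from paired temperature records (shallow/deep), convert it to a vertical flux q, and recover KV from Darcy's law with the measured head gradient. The temperature signals are synthetic, and the Ar-to-flux mapping below is a placeholder calibration standing in for the full heat-transport (Hatch-type) amplitude-ratio solution.

```python
import numpy as np

t = np.arange(0.0, 10.0, 1.0 / 24.0)                 # 10 days, hourly [d]
f = 1.0                                              # diurnal frequency [1/d]
rng = np.random.default_rng(10)
T_shallow = (15 + 3.0 * np.sin(2 * np.pi * f * t)
             + rng.normal(0, 0.05, t.size))
T_deep = (15 + 1.2 * np.sin(2 * np.pi * f * (t - 0.15))
          + rng.normal(0, 0.05, t.size))

def diurnal_amplitude(T):
    """Amplitude at the diurnal frequency via a discrete Fourier projection
    (record spans an integer number of periods, so leakage is negligible)."""
    z = np.exp(-2j * np.pi * f * t)
    return 2.0 * abs((T - T.mean()) @ z) / t.size

Ar = diurnal_amplitude(T_deep) / diurnal_amplitude(T_shallow)
q = -0.8 * np.log(Ar)            # placeholder Ar -> flux calibration [m/d]

dh, dz = 0.12, 0.50              # head difference and sensor spacing [m]
K_V = q / (dh / dz)              # Darcy's law: q = K_V * (dh/dz)
print(f"Ar = {Ar:.2f}, q = {q:.3f} m/d, K_V = {K_V:.3f} m/d")
```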

17.
Hydrological modelling is an important tool for research, policy, and management, but uncertainty remains about the transferability of parameters from field observations made at small scales to models at the catchment scale and larger. This uncertainty compels the need to develop parameter relationships that are translatable across scales. In this study, we compare the changes to modelled processes as resolution is coarsened from 100‐m to 1‐km in a topographically complex, 255‐km2 Colorado River headwater catchment. We conducted a sensitivity analysis for hydraulic conductivity (K) and Manning's n parameters across four orders of magnitude. Results showed that K acts as a moderator between surface and subsurface contributions to streamflow, whereas n moderates the duration of high-intensity, infiltration‐excess flow. The parametric sensitivity analysis informed development of a new method to scale effective hydraulic conductivity across modelling resolutions in order to compensate for the loss of topographic gradients as resolution is coarsened. A similar mathematical relationship between n and lateral resolution changes was not found, possibly because n is also sensitive to time discretization. This research provides an approach to translate hydraulic conductivity parameters from a calibrated coarse model to higher resolutions where the number of simulations is limited by computational demand.

18.
The characterization of heterogeneity in hydraulic conductivity (K) is a major challenge for subsurface remediation projects. There are a number of field studies that compare the K estimates obtained using various techniques, but to our knowledge, no field‐based studies exist that compare the performance of estimated K heterogeneity fields or the associated characterization costs. In this paper, we compare the costs of characterizing the three‐dimensional K heterogeneity and its uncertainty estimates of a glaciofluvial aquifer‐aquitard sequence at a 15 m × 15 m × 18 m field site situated on the University of Waterloo campus. We compare geostatistical analysis of high resolution permeameter K data obtained from repacked core samples in five boreholes and hydraulic tomography analysis of four pumping tests consisting of up to 41 monitoring points per test. Aside from the comparison of costs, we also assess the performance of each method by predicting several pumping tests. Our analysis reveals that hydraulic tomography is somewhat more costly than the geostatistical analysis of high resolution permeameter K data due to the higher capital costs associated with the method. However, the equipment may be reused at other sites; hence these costs may be recovered over the life of the equipment. More significantly, hydraulic tomography is able to capture the most important features of the aquifer‐aquitard sequence, leading to more accurate predictions of independent pumping tests. This suggests that more robust remediation systems may be designed if site characterization is performed with hydraulic tomography.

19.
Efficient allocation of remediation resources is a critical need throughout the nation. Economic risk-cost-benefit analysis is an important tool for meeting this need. This paper provides site engineers, geologists, and managers with a conceptual understanding of economic risk-cost-benefit analysis and shows how it can be applied, even in situations where existing data are sparse or poor in quality. An example analysis is applied to the remediation of radioactive waste at Oak Ridge National Laboratory, in which the cost-effectiveness is compared for two remediation alternatives: containment of the waste or monitoring only. A data-worth analysis is also carried out to estimate the maximum justifiable exploration budget and the cost-effectiveness of two proposed data collection programs. Results indicate that the methodology has potential in making robust remediation decisions regarding certain types of questions.
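A sketch of the data-worth logic behind a maximum justifiable exploration budget: it is the drop in expected cost between deciding now with prior information and deciding after (perfect) data resolve the uncertain state. The probabilities and costs below are invented; the containment/monitoring framing mirrors the example in the abstract.

```python
# Prior probability that the waste is actually migrating (assumed).
p_leak = 0.3
cost = {                           # $M, outcome-dependent costs (assumed)
    ("contain", True): 8.0,  ("contain", False): 8.0,
    ("monitor", True): 25.0, ("monitor", False): 2.0,
}

def expected(action, p):
    """Expected cost of an action given probability p of migration."""
    return p * cost[(action, True)] + (1 - p) * cost[(action, False)]

# Decide now with prior information only:
prior_cost = min(expected(a, p_leak) for a in ("contain", "monitor"))

# Perfect information: learn the true state first, then choose per state.
perfect = (p_leak * min(cost[(a, True)] for a in ("contain", "monitor"))
           + (1 - p_leak) * min(cost[(a, False)] for a in ("contain", "monitor")))

print(f"expected cost deciding now: {prior_cost:.1f} $M")
print(f"max justifiable exploration budget: {prior_cost - perfect:.1f} $M")
# Real data programs deliver imperfect information, so their worth is
# bounded above by this expected value of perfect information.
```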

20.
Soil and groundwater contamination are often managed by establishing on‐site cleanup targets within the context of risk assessment or risk management measures. Decision‐makers rely on modeling tools to provide insight; however, it is recognized that all models are subject to uncertainty. This case study compares suggested remediation requirements using a site‐specific numerical model and a standardized analytical tool to evaluate the risk to a downgradient wetland receptor posed by on‐site chloride impacts. The base case model, calibrated to observed non‐pumping and pumping conditions, predicts a peak concentration well above regulatory criteria. Remediation scenarios are iteratively evaluated to determine a remediation design that adheres to practical site constraints, while minimizing the potential for risk to the downgradient receptor. A nonlinear uncertainty analysis is applied to each remediation scenario to stochastically evaluate the risk and find a solution that meets the site‐owner risk tolerance, which in this case required a risk‐averse solution. This approach, which couples nonlinear uncertainty analysis with a site‐specific numerical model, provides an enhanced level of knowledge to foster informed decision‐making (i.e., risk‐of‐success) and also increases stakeholder confidence in the remediation design.
