Similar Literature
Found 20 similar articles (search time: 31 ms)
1.
Categorical parameter distributions consisting of geologic facies with distinct properties, for example, high-permeability channels embedded in a low-permeability matrix, are common at contaminated sites. At these sites, low-permeability facies store solute mass, acting as secondary sources to higher-permeability facies, sustaining concentrations for decades while increasing risk and cleanup costs. Parameter estimation is difficult in such systems because the discontinuities in the parameter space hinder the inverse problem. This paper presents a novel approach based on Traveling Pilot Points (TRIPS) and an iterative ensemble smoother (IES) to solve the categorical inverse problem. Groundwater flow and solute transport in a hypothetical aquifer with a categorical parameter distribution are simulated using MODFLOW 6. Heads and concentrations are recorded at multiple monitoring locations. IES is used to generate posterior ensembles assuming a TRIPS prior and an approximate multi-Gaussian prior. The ensembles are used to predict solute concentrations and mass into the future. The evaluation also includes an assessment of how the number of measurements and the choice of the geological prior determine the characteristics of the posterior ensemble and the resulting predictions. The results indicate that IES was able to efficiently sample the posterior distribution and showed that even with an approximate geological prior, a high degree of parameterization and history matching could lead to parameter ensembles that can be useful for making certain types of predictions (heads, concentrations). However, the approximate geological prior was insufficient for predicting mass. The analysis demonstrates how decision-makers can quantify uncertainty and make informed decisions with an ensemble-based approach.  相似文献   
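The Kalman-type update at the heart of ensemble-smoother methods such as IES can be sketched in a few lines; the one-parameter linear forward model, ensemble size, and noise level below are illustrative assumptions, not the paper's MODFLOW 6 setup:

```python
import numpy as np

rng = np.random.default_rng(0)

def ies_update(params, sim_obs, obs, obs_err_std):
    """One Kalman-type ensemble-smoother update step.

    params:  (n_param, n_ens) prior parameter ensemble
    sim_obs: (n_obs, n_ens) simulated observations per member
    obs:     (n_obs,) measured data
    """
    n_ens = params.shape[1]
    dp = params - params.mean(axis=1, keepdims=True)
    dd = sim_obs - sim_obs.mean(axis=1, keepdims=True)
    c_pd = dp @ dd.T / (n_ens - 1)            # parameter-data cross-covariance
    c_dd = dd @ dd.T / (n_ens - 1)            # data covariance
    r = obs_err_std ** 2 * np.eye(len(obs))   # measurement error covariance
    gain = c_pd @ np.linalg.inv(c_dd + r)
    # perturbed observations, one draw per ensemble member
    obs_pert = obs[:, None] + obs_err_std * rng.standard_normal((len(obs), n_ens))
    return params + gain @ (obs_pert - sim_obs)

# toy linear forward model standing in for the flow/transport simulator
n_ens = 200
prior = rng.standard_normal((1, n_ens)) + 1.0   # prior log-permeability ensemble
forward = lambda p: 2.0 * p                     # hypothetical head response
obs = np.array([4.0])                           # one observed head
post = ies_update(prior, forward(prior), obs, obs_err_std=0.1)
```

The posterior ensemble mean moves from the prior mean near 1.0 toward the value of 2.0 that explains the observation; iterating this update while re-running the forward model gives the iterative scheme.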

2.
The oil spill from the Prestige tanker showed the importance of scientifically based protocols to minimize impacts on the environment. In this work, we describe a new forecasting system to predict oil spill trajectories and their potential impacts on the coastal zone. The system comprises three main interconnected modules that address different capabilities: (1) an operational circulation sub-system that includes nested models at different scales, data collection with near-real-time assimilation, and new tools for initialization or assimilation based on genetic algorithms and feature-oriented strategic sampling; (2) an oil spill coastal sub-system that allows simulation of the trajectories and fate of spilled oil together with evaluation of coastal zone vulnerability using environmental sensitivity indexes; (3) a risk management sub-system for decision support based on GIS technology. The system is applied to the Mediterranean Sea, where surface currents are highly variable in space and time, and interactions between local, sub-basin, and basin scales increase the non-linear interaction effects, which need to be adequately resolved at each of the intervening scales. Moreover, the Mediterranean Sea is a complex, reduced-scale ocean that represents a real scientific and technological challenge for operational oceanography, and particularly for oil spill response and search-and-rescue operations.

3.
4.
Integrating migration velocity analysis and full waveform inversion can help reduce the high non-linearity of the classic full waveform inversion objective function. The combination of inverting for the long and short wavelength components of the velocity model using a dual objective function that is sensitive to both components is still very expensive and has produced mixed results. We develop an approach in which both components are integrated to complement each other. Specifically, we utilize the image to generate reflections in our synthetic data only when the velocity model is not capable of producing such reflections. As a result, migration velocity analysis is active when we need it, and its influence is mitigated when the velocity model produces accurate reflections (possibly first for the low frequencies). This is achieved using a novel objective function that includes both objectives. Applications to a layered model and the Marmousi model demonstrate the main features of the approach.

5.
We introduce a new ensemble-based Kalman filter approach to assimilate GRACE satellite gravity data into the WaterGAP Global Hydrology Model. The approach (1) enables the use of the spatial resolution provided by GRACE by including the satellite observations as a gridded data product, (2) accounts for the complex spatial GRACE error correlation pattern by rigorous error propagation from the monthly GRACE solutions, and (3) allows us to integrate model parameter calibration and data assimilation within a unified framework. We investigate the formal contribution of GRACE observations to the Kalman filter update by analysis of the Kalman gain matrix. We then present first model runs, calibrated via data assimilation, for two different experiments: the first one assimilates GRACE basin averages of total water storage and the second one introduces gridded GRACE data at \(5^\circ\) resolution into the assimilation. We finally validate the assimilated model by running it in free mode (i.e., without adding any further GRACE information) for a period of 3 years following the assimilation phase and comparing the results to the GRACE observations available for this period.  相似文献   
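The "formal contribution of GRACE observations" analysed via the Kalman gain matrix can be illustrated with a two-compartment toy state; the covariance numbers and the basin-average observation operator below are invented for illustration, not the WaterGAP configuration:

```python
import numpy as np

def kalman_gain(P, H, R):
    """Formal Kalman gain K = P H^T (H P H^T + R)^-1."""
    S = H @ P @ H.T + R
    return P @ H.T @ np.linalg.inv(S)

# two storage compartments with an assumed prior error covariance P
P = np.array([[1.0, 0.3],
              [0.3, 0.5]])
# basin-average operator: a TWS observation sees the sum of compartments
H_basin = np.array([[1.0, 1.0]])
R = np.array([[0.2]])          # assumed GRACE observation error variance
K = kalman_gain(P, H_basin, R)
```

Inspecting `K` shows how a single basin-average residual is partitioned between the compartments in proportion to their prior uncertainties and correlations, which is exactly the kind of diagnostic the gain-matrix analysis provides.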

6.
In conjunction with the development of a domestically produced industrial CT scanner, we developed dedicated imaging software for industrial CT machines. The software includes modules for projection data analysis and processing, reconstruction algorithms, CT-number data analysis, image processing, and simulation.

7.
The recognition and assurance of the quality of ground water monitoring data are crucial to the correct assessment of the magnitude and extent of a ground water contamination problem. This article addresses an approach being developed to systematically evaluate the quality of a given set of ground water monitoring data collected during site investigation/ remedial action efforts. The system consists of a checklist of criteria, grouped into four major categories, which can be applied to laboratory or field measurements.
The first category, basis of measurement, considers whether the appropriate sampling, boring and/or analytical methods were chosen to obtain the measurement and the limitations of each method. Secondly, application of the method is assessed. This includes examination of the extent to which procedures were correctly performed, the use of quality control measures and calibration, and possible sources of error in the measurements. Third, evaluation of applied statistical methods is made, with consideration given to which statistics are meaningful in a given context and whether measurements are reproducible. The final category, corroborative information, considers whether independent data or other information are available that add credibility to the values measured.
In this approach, a "high quality" data value is defined as one in which accuracy is supported by meeting the preceding criteria. When accompanied by precision information, high quality data allow for defensible assessments and actions. This evaluation system is useful in developing monitoring programs and in guiding documentation of field and laboratory methods during data collection. It relies heavily on experienced judgment and can be catalyst for the beneficial exchange of knowledge and ideas among ground water professionals.  相似文献   

8.
Delineation of regional arid karstic aquifers: an integrative data approach
This research integrates data procedures for the delineation of regional ground water flow systems in arid karstic basins with sparse hydrogeologic data using surface topography data, geologic mapping, permeability data, chloride concentrations of ground water and precipitation, and measured discharge data. This integrative data analysis framework can be applied to evaluate arid karstic aquifer systems globally. The accurate delineation of ground water recharge areas in developing aquifer systems with sparse hydrogeologic data is essential for their effective long-term development and management. We illustrate the use of this approach in the Cuatrociénegas Basin (CCB) of Mexico. Aquifers are characterized using geographic information systems for ground water catchment delineation, an analytical model for interbasin flow evaluation, a chloride balance approach for recharge estimation, and a water budget for mapping contributing catchments over a large region. The test study area includes the CCB of Coahuila, Mexico, a UNESCO World Biosphere Reserve containing more than 500 springs that support ground water-dependent ecosystems with more than 70 endemic organisms and irrigated agriculture. We define recharge areas that contribute local and regional ground water discharge to springs and the regional flow system. Results show that the regional aquifer system follows a topographic gradient that during past pluvial periods may have linked the Río Nazas and the Río Aguanaval of the Sierra Madre Occidental to the Río Grande via the CCB and other large, currently dry, upgradient lakes.  相似文献   

9.
Application of altimetry data assimilation on mesoscale eddies simulation
Mesoscale eddies play an important role in the ocean circulation. To improve the simulation accuracy of mesoscale eddies, a three-dimensional variational (3DVAR) data assimilation system, the Ocean Variational Analysis System (OVALS), is coupled with a POM model to simulate mesoscale eddies in the Northwest Pacific Ocean. In this system, sea surface height anomaly (SSHA) data from satellite altimeters are assimilated and translated into pseudo temperature and salinity (T-S) profile data. These profile data are then taken as observations to be assimilated and to produce the three-dimensional analysis T-S field. According to the characteristics of mesoscale eddies, the most appropriate assimilation parameters are set up and tested in this system. A ten-year mesoscale eddy simulation and comparison experiment is conducted with two schemes: assimilation and non-assimilation. Comparison of the two schemes against observations shows that the simulation accuracy of the assimilation scheme is much better than that of the non-assimilation scheme, which verifies that the altimetry data assimilation method can dramatically improve the simulation accuracy of mesoscale eddies and indicates that it may be possible to use this system for forecasting mesoscale eddies in the future.
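A minimal sketch of the 3DVAR analysis step that systems such as OVALS build on, written in its equivalent gain form; the two-variable T-S state, covariances, and single pseudo-profile observation are illustrative assumptions:

```python
import numpy as np

def var3d_analysis(xb, B, y, H, R):
    """Minimiser of the 3DVAR cost
    J(x) = (x-xb)^T B^-1 (x-xb) + (y-Hx)^T R^-1 (y-Hx),
    computed via the equivalent gain form xa = xb + K (y - H xb)."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return xb + K @ (y - H @ xb)

# background T-S state and a pseudo-profile "observation" of temperature
xb = np.array([10.0, 35.0])   # hypothetical temperature (degC), salinity (psu)
B = np.diag([1.0, 0.04])      # assumed background error covariance
H = np.array([[1.0, 0.0]])    # observe temperature only
R = np.array([[0.25]])        # assumed observation error variance
xa = var3d_analysis(xb, B, y=np.array([11.0]), H=H, R=R)
```

The analysis pulls temperature part of the way toward the observation while, with a diagonal `B`, leaving salinity untouched; off-diagonal background covariances are what spread SSHA-derived information into both fields.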

10.
The effectiveness of an ensemble Kalman filter (EnKF) is assessed in the Selat Pauh of Singapore using observing system simulation experiments. Perfect model experiments are considered first; they examine the EnKF's ability to reduce initial perturbations when there are no errors other than those in the initial conditions. Current velocity at 15 observational sites from the true ocean is assimilated every hour into the false ocean. While the EnKF reduces the initial velocity error during the first few hours, it fails after one tidal cycle (approximately 12 h) due to the rapid convergence of the ensemble members. Subsequently, errors are introduced in the surface wind forcing. A random perturbation ε is applied independently to each ensemble member to maintain the ensemble spread. The assimilation results show that the success of the EnKF depends critically on the presence of ε, yet it is not sensitive to the magnitude of ε, at least in the range of weak to moderate perturbations. Although all experiments were conducted with the EnKF only, the results should be applicable in general to other ensemble-based data assimilation methods.
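The role of the forcing perturbation ε in preventing ensemble collapse can be reproduced with a toy scalar model; the damped dynamics, cycle count, and all magnitudes are assumptions for illustration, not the Selat Pauh configuration:

```python
import numpy as np

rng = np.random.default_rng(1)

def enkf_step(ens, obs, obs_std, forcing_pert=0.0):
    """Advance a toy scalar 'current velocity' ensemble one cycle:
    a damped model step with an optional independent forcing
    perturbation epsilon per member, then a stochastic EnKF analysis."""
    ens = 0.9 * ens + forcing_pert * rng.standard_normal(ens.size)
    var = ens.var(ddof=1)
    gain = var / (var + obs_std ** 2)
    obs_pert = obs + obs_std * rng.standard_normal(ens.size)
    return ens + gain * (obs_pert - ens)

truth = 1.0
ens_no_eps = rng.standard_normal(100) + 0.5
ens_eps = ens_no_eps.copy()
for _ in range(24):  # two "tidal cycles" of hourly assimilation
    ens_no_eps = enkf_step(ens_no_eps, truth, 0.1, forcing_pert=0.0)
    ens_eps = enkf_step(ens_eps, truth, 0.1, forcing_pert=0.2)
```

Without ε the spread shrinks cycle after cycle and the filter stops responding to data; with ε the spread is maintained, mirroring the paper's finding that the presence of the perturbation matters more than its exact magnitude.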

11.
An ensemble-based approach is developed to conduct optimal path planning in unsteady ocean currents under uncertainty. We focus our attention on two-dimensional steady and unsteady uncertain flows, and adopt a sampling methodology that is well suited to operational forecasts, where an ensemble of deterministic predictions is used to model and quantify uncertainty. In an operational setting, much about dynamics, topography, and forcing of the ocean environment is uncertain. To address this uncertainty, the flow field is parametrized using a finite number of independent canonical random variables with known densities, and the ensemble is generated by sampling these variables. For each of the resulting realizations of the uncertain current field, we predict the path that minimizes the travel time by solving a boundary value problem (BVP), based on the Pontryagin maximum principle. A family of backward-in-time trajectories starting at the end position is used to generate suitable initial values for the BVP solver. This allows us to examine and analyze the performance of the sampling strategy and to develop insight into extensions dealing with general circulation ocean models. In particular, the ensemble method enables us to perform a statistical analysis of travel times and consequently develop a path planning approach that accounts for these statistics. The proposed methodology is tested for a number of scenarios. We first validate our algorithms by reproducing simple canonical solutions, and then demonstrate our approach in more complex flow fields, including idealized, steady and unsteady double-gyre flows.  相似文献   
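The ensemble statistics of travel time that drive such a planner can be sketched with a one-dimensional example; the Gaussian current distribution, boat speed, and distance are illustrative stand-ins, and the real method solves a Pontryagin-based boundary value problem per flow realization rather than this closed-form crossing:

```python
import numpy as np

rng = np.random.default_rng(2)

# ensemble of uncertain along-track current speeds, parametrised by one
# canonical random variable (all numbers here are illustrative)
n_ens = 500
current = 0.5 + 0.2 * rng.standard_normal(n_ens)   # m/s

boat_speed = 2.0    # m/s through the water
distance = 1000.0   # m along the current axis

# travel time for each realization when moving with / against the current
t_with = distance / (boat_speed + current)
t_against = distance / (boat_speed - current)

# ensemble statistics of travel time inform a risk-aware route choice
mean_with, p95_with = t_with.mean(), np.percentile(t_with, 95)
```

Comparing means picks the faster route on average, while percentiles such as `p95_with` let the planner trade expected speed against the risk of a slow crossing.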

12.
Stochastic weather generators have evolved as tools for creating long time series of synthetic meteorological data at a site for risk assessments in hydrologic and agricultural applications. Recently, their use has been extended as downscaling tools for climate change impact assessments. Non‐parametric weather generators, which typically use a K‐nearest neighbour (K‐NN) resampling approach, require no statistical assumptions about probability distributions of variables and can be easily applied for multi‐site use. Two characteristics of traditional K‐NN models result from resampling daily values: (1) temporal correlation structure of daily temperatures may be lost, and (2) no values less than or exceeding historical observations can be simulated. Temporal correlation in simulated temperature data is important for hydrologic applications. Temperature is a major driver of many processes within the hydrologic cycle (for example, evaporation, snow melt, etc.) that may affect flood levels. As such, a new methodology for simulation of climate data using the K‐NN approach is presented (named KnnCAD Version 4). A block resampling scheme is introduced along with perturbation of the reshuffled daily temperature data to create 675 years of synthetic historical daily temperatures for the Upper Thames River basin in Ontario, Canada. The updated KnnCAD model is shown to adequately reproduce observed monthly temperature characteristics as well as temporal and spatial correlations while simulating reasonable values which can exceed the range of observations. Copyright © 2012 John Wiley & Sons, Ltd.  相似文献   
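A K-NN block resampling step with perturbation, in the spirit of (but far simpler than) KnnCAD Version 4, might look like the sketch below; the block length, number of neighbours, perturbation scale, and synthetic "historical" series are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

def knn_block_resample(hist, block_len, n_blocks, k=5, sigma=0.5):
    """Sketch of a K-NN block resampling weather generator: choose each
    new block from the k historical blocks whose predecessors best match
    the current state, then perturb the resampled values so simulations
    can exceed the historical extremes."""
    starts = np.arange(len(hist) - 2 * block_len)
    out = list(hist[:block_len])
    for _ in range(n_blocks):
        prev_mean = np.mean(out[-block_len:])
        cand_means = np.array([hist[s:s + block_len].mean() for s in starts])
        knn = np.argsort(np.abs(cand_means - prev_mean))[:k]  # k nearest
        s = starts[rng.choice(knn)]
        block = hist[s + block_len:s + 2 * block_len]          # successor block
        out.extend(block + sigma * rng.standard_normal(block_len))  # perturb
    return np.array(out)

# synthetic "historical" daily temperatures with a seasonal cycle
days = np.arange(365 * 3)
hist = 10 * np.sin(2 * np.pi * days / 365) + rng.standard_normal(days.size)
sim = knn_block_resample(hist, block_len=10, n_blocks=30)
```

Resampling whole blocks preserves day-to-day temperature correlation within each block, and the added perturbation is what allows simulated values outside the observed range.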

13.
A land data assimilation system (LDAS) can merge satellite observations (or retrievals) of land surface hydrological conditions, including soil moisture, snow, and terrestrial water storage (TWS), into a numerical model of land surface processes. In theory, the output from such a system is superior to estimates based on the observations or the model alone, thereby enhancing our ability to understand, monitor, and predict key elements of the terrestrial water cycle. In practice, however, satellite observations do not correspond directly to the water cycle variables of interest. The present paper addresses various aspects of this seeming mismatch using examples drawn from recent research with the ensemble-based NASA GEOS-5 LDAS. These aspects include (1) the assimilation of coarse-scale observations into higher-resolution land surface models, (2) the partitioning of satellite observations (such as TWS retrievals) into their constituent water cycle components, (3) the forward modeling of microwave brightness temperatures over land for radiance-based soil moisture and snow assimilation, and (4) the selection of the most relevant types of observations for the analysis of a specific water cycle variable that is not observed (such as root zone soil moisture). The solution to these challenges involves the careful construction of an observation operator that maps from the land surface model variables of interest to the space of the assimilated observations.  相似文献   
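Aspect (1), assimilating coarse-scale observations into a higher-resolution model, hinges on an observation operator that maps model states into observation space; the footprint-averaging sketch below is a deliberately simple illustration with made-up grid sizes, not the GEOS-5 operator:

```python
import numpy as np

def obs_operator_coarse(fine_states, cells_per_obs):
    """Sketch of an observation operator H that maps a fine-resolution
    model state to coarse satellite footprints by block averaging."""
    n = len(fine_states) // cells_per_obs
    return fine_states[:n * cells_per_obs].reshape(n, cells_per_obs).mean(axis=1)

# nine fine model cells aggregated into three coarse "satellite" footprints
fine = np.array([0.1, 0.2, 0.3, 0.3, 0.3, 0.3, 0.5, 0.6, 0.7])
coarse = obs_operator_coarse(fine, cells_per_obs=3)
```

In a real LDAS the operator may also involve radiative transfer (for brightness temperatures) or vertical aggregation (for TWS), but the structural role is the same: predictions are compared with observations in observation space, and the analysis increments flow back through the ensemble covariances.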

14.
15.
Multiphase dynamic data integration into high resolution subsurface models is an integral aspect of reservoir and groundwater management strategies and uncertainty assessment. Over the past two decades, advances in computing and the development and implementation of robust algorithms for automatic history matching have considerably reduced the time and effort associated with subsurface characterization and reduced the subjectivity associated with manual model calibration. However, reliable and accurate subsurface characterization continues to be challenging due to the large number of model unknowns to be estimated using a relatively small set of measurements. For ensemble-based methods in particular, the difficulties are compounded by the need for a large number of model replicates to estimate sample-based statistical measures, specifically the covariances and cross-covariances that directly impact the spread of information from the measurement locations to the model parameters. Statistical noise resulting from modest ensemble sizes can overwhelm and degrade the model updates, leading to geologically inconsistent subsurface models. In this work we propose to address the difficulties in the implementation of the ensemble Kalman filter (EnKF) for operational data integration problems. The methods described here use streamline-derived information to identify regions within the reservoir that will have a maximum impact on the dynamic response. This is achieved through spatial localization of the sample-based cross-covariance estimates between the measurements and the model unknowns using streamline trajectories. We illustrate the approach with a synthetic example and a large field study that demonstrate the difficulties with the traditional EnKF implementation. In both numerical experiments, it is shown that these challenges are addressed using flow-relevant conditioning of the cross-covariance matrix. By mitigating sampling error in the cross-covariance estimates, the proposed approach provides significant computational savings through the use of modest ensemble sizes and consequently offers the opportunity for use in large field-scale groundwater and reservoir characterization studies.
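Spatial localization of sample cross-covariances can be sketched generically as an element-wise (Schur) product with a weight field; in the sketch below a binary mask plays the role of the streamline-derived flow-relevant region, and the ensemble, mask, and sizes are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)

def localize(cross_cov, weights):
    """Schur (element-wise) product localization of a sample
    cross-covariance between model cells and a measurement."""
    return cross_cov * weights

# sample cross-covariance from a modest ensemble is noisy everywhere
n_cells, n_ens = 50, 20
true_signal = np.zeros(n_cells)
true_signal[:10] = 1.0                         # flow-relevant cells
ens = rng.standard_normal((n_cells, n_ens))    # parameter ensemble anomalies
obs_ens = true_signal @ ens + 0.1 * rng.standard_normal(n_ens)
dp = ens - ens.mean(axis=1, keepdims=True)
dd = obs_ens - obs_ens.mean()
c_pd = dp @ dd / (n_ens - 1)                   # noisy (n_cells,) estimate

# hypothetical streamline-derived mask: only cells the streamlines
# traverse are allowed to feel this measurement
weights = np.zeros(n_cells)
weights[:10] = 1.0
c_loc = localize(c_pd, weights)
```

Zeroing the cross-covariance outside the flow-relevant region removes the spurious long-range correlations that small ensembles produce, which is the mechanism behind the computational savings claimed above.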

16.
Time-lapse seismic data is useful for identifying fluid movement and pressure and saturation changes in a petroleum reservoir and for monitoring of CO2 injection. The focus of this paper is estimation of time-lapse changes with uncertainty quantification using full-waveform inversion. The purpose of also estimating the uncertainty in the inverted parameters is to be able to use the inverted seismic data quantitatively for updating reservoir models with ensemble-based methods. We perform Bayesian inversion of seismic waveform data in the frequency domain by combining an iterated extended Kalman filter with an explicit representation of the sensitivity matrix in terms of Green functions (acoustic approximation). Using this method, we test different strategies for inversion of the time-lapse seismic data with uncertainty. We compare the results from a sequential strategy (making a prior from the monitor survey using the inverted baseline survey) with a double difference strategy (inverting the difference between the monitor and baseline data). We apply the methods to a subset of the Marmousi2 P-velocity model. Both strategies performed well and relatively good estimates of the monitor velocities and the time-lapse differences were obtained. For the estimated time-lapse differences, the double difference strategy gave the lowest errors.  相似文献   
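The double-difference idea, that errors common to the baseline and monitor surveys cancel before inversion, can be shown with a toy linear forward model; the scalar operator and "footprint" noise below are illustrative assumptions, not the frequency-domain waveform solver used in the paper:

```python
import numpy as np

rng = np.random.default_rng(7)

# hypothetical 1-D velocity models for baseline and monitor surveys
v_base = np.array([2.0, 2.1, 2.2])
v_mon = np.array([2.0, 2.3, 2.2])   # time-lapse change in the middle layer

def forward(v, noise):
    """Toy linear 'seismic' forward model with survey-repeatable noise."""
    return 3.0 * v + noise

# the same acquisition footprint contaminates both surveys
footprint = 0.05 * rng.standard_normal(3)
d_base = forward(v_base, footprint)
d_mon = forward(v_mon, footprint)

# double-difference data: the correlated error cancels exactly
dd = d_mon - d_base
dv = dd / 3.0   # invert the (here trivial) linear operator
```

Because the repeatable noise subtracts out, inverting the data difference recovers the velocity change directly, which is consistent with the double-difference strategy giving the lowest time-lapse errors.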

17.
Hybrid simulation has been shown to be a cost-effective approach for assessing the seismic performance of structures. In hybrid simulation, critical parts of a structure are physically tested, while the remaining portions of the system are concurrently simulated computationally, typically using a finite element model. This combination is realized through a numerical time-integration scheme, which allows for investigation of full system-level responses of a structure in a cost-effective manner. However, conducting hybrid simulation of complex structures within large-scale testing facilities presents significant challenges. For example, the chosen modeling scheme may create numerical inaccuracies or even result in unstable simulations; the displacement and force capacity of the experimental system can be exceeded; and a hybrid test may be terminated due to poor communication between modules (e.g., loading controllers, data acquisition systems, simulation coordinator). These problems can cause the simulation to stop suddenly, and in some cases can even result in damage to the experimental specimens; the end result can be failure of the entire experiment. This study proposes a phased approach to hybrid simulation that can validate all of the hybrid simulation components and ensure the integrity of large-scale hybrid simulations. In this approach, a series of hybrid simulations employing numerical components and small-scale experimental components is examined to establish preparedness for the large-scale experiment. This validation program is incorporated into an existing, mature hybrid simulation framework, which is currently utilized in the Multi-Axial Full-Scale Sub-Structuring Testing and Simulation (MUST-SIM) facility of the George E. Brown Network for Earthquake Engineering Simulation (NEES) equipment site at the University of Illinois at Urbana-Champaign. A hybrid simulation of a four-span curved bridge is presented as an example, in which three piers are experimentally controlled in a total of 18 degrees of freedom (DOFs). This simulation illustrates the effectiveness of the phased approach presented in this paper.

18.
This paper presents an approach to identify structural parameters using the Least Mean Square (LMS) adaptive transversal filter. The method features easy computer simulation and fast data processing. The approach is effective even in the presence of input noise when it is applied in communications; however, this advantage is less pronounced when it is used in structural systems, since the signal bandwidth dealt with in communications is much wider than that in structural systems. With a low-pass filter added to the data acquisition system, the approach functions effectively for both linear and nonlinear structural systems. Numerical examples and experimental tests are included to demonstrate the technique for both linear and nonlinear systems.
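A minimal LMS adaptive transversal filter for system identification might look like the sketch below; the 3-tap FIR "structure", noiseless desired signal, and step size are illustrative assumptions rather than the paper's experimental setup:

```python
import numpy as np

rng = np.random.default_rng(5)

def lms_identify(x, d, n_taps, mu):
    """LMS adaptive transversal filter: adapt FIR weights w so that
    w . x[k] tracks the desired response d[k] (system identification)."""
    w = np.zeros(n_taps)
    for k in range(n_taps - 1, len(x)):
        xk = x[k - n_taps + 1:k + 1][::-1]   # [x[k], x[k-1], ..., most recent first]
        e = d[k] - w @ xk                    # a priori error
        w += mu * e * xk                     # stochastic gradient step
    return w

# unknown "structural" system: a hypothetical 3-tap FIR response
true_w = np.array([0.5, -0.3, 0.1])
x = rng.standard_normal(5000)                          # excitation signal
d = np.convolve(x, true_w, mode="full")[:len(x)]       # measured response
w = lms_identify(x, d, n_taps=3, mu=0.01)
```

With white excitation and a step size well inside the stability bound, the weights converge to the unknown system's taps; in a structural application, `x` and `d` would be low-pass-filtered excitation and response records.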

19.
The article presents an approach for creating a computationally efficient stochastic weather generator. In this work the method is tested by the stochastic simulation of sea level pressure over the sub-polar North Atlantic. The weather generator includes a hidden Markov model, which propagates regional circulation patterns identified by a self organising map analysis, conditioned on the state of large-scale interannual weather regimes. The remaining residual effects are propagated by a regression model with added noise components. The regression step is performed by one of two methods, a linear model or artificial neural networks and the performance of these two methods is assessed and compared. The resulting simulations express the range of the major regional patterns of atmospheric variability and typical time scales. The long term aims of this work are to provide ensembles of atmospheric data for applied regional studies and to develop tools applicable in down-scaling large-scale ocean and atmospheric simulations.  相似文献   
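The hidden-Markov backbone of such a generator can be sketched as sampling regime sequences from a transition matrix; the two-pattern matrix below is illustrative, and the full model additionally conditions transitions on large-scale regimes and adds a regression step with noise for the residuals:

```python
import numpy as np

rng = np.random.default_rng(6)

def sample_regimes(P, n_steps, start=0):
    """Sample a sequence of circulation-pattern states from a Markov
    chain with transition matrix P (a stand-in for the hidden Markov
    model propagating regional circulation patterns)."""
    states = [start]
    for _ in range(n_steps - 1):
        states.append(rng.choice(len(P), p=P[states[-1]]))
    return np.array(states)

# two illustrative regional patterns with persistent transitions
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
seq = sample_regimes(P, 5000)
```

Over long simulations the occupation frequencies approach the chain's stationary distribution (here 2/3 and 1/3), so the generator reproduces both the typical persistence time scales and the long-run frequency of each pattern.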

20.
Sehlke G, Jacobson J. Ground Water, 2005, 43(5): 722-730
System dynamics is a computer-aided approach to evaluating the interrelationships of different components and activities within complex systems. Recently, system dynamics models have been developed in areas such as policy design, biological and medical modeling, energy and the environmental analysis, and in various other areas in the natural and social sciences. The Idaho National Engineering and Environmental Laboratory, a multipurpose national laboratory managed by the Department of Energy, has developed a system dynamics model in order to evaluate its utility for modeling large complex hydrological systems. We modeled the Bear River basin, a transboundary basin that includes portions of Idaho, Utah, and Wyoming. We found that system dynamics modeling is very useful for integrating surface water and ground water data and for simulating the interactions between these sources within a given basin. In addition, we also found that system dynamics modeling is useful for integrating complex hydrologic data with other information (e.g., policy, regulatory, and management criteria) to produce a decision support system. Such decision support systems can allow managers and stakeholders to better visualize the key hydrologic elements and management constraints in the basin, which enables them to better understand the system via the simulation of multiple "what-if" scenarios. Although system dynamics models can be developed to conduct traditional hydraulic/hydrologic surface water or ground water modeling, we believe that their strength lies in their ability to quickly evaluate trends and cause-effect relationships in large-scale hydrological systems, for integrating disparate data, for incorporating output from traditional hydraulic/hydrologic models, and for integration of interdisciplinary data, information, and criteria to support better management decisions.  相似文献   
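A stock-and-flow model of the kind system dynamics tools express graphically can be sketched in a few lines of code; the single reservoir stock, inflow, demand, and policy rule below are hypothetical, not the Bear River basin model:

```python
def simulate_reservoir(years, inflow, demand, start_storage, capacity):
    """Minimal stock-and-flow sketch of a system dynamics water model:
    one storage stock, a constant inflow, and a policy-limited
    withdrawal (all quantities are hypothetical annual volumes)."""
    storage, shortages = start_storage, 0
    trace = []
    for _ in range(years):
        withdrawal = min(demand, storage + inflow)   # management constraint
        if withdrawal < demand:
            shortages += 1                           # record unmet demand
        storage = min(storage + inflow - withdrawal, capacity)  # stock update
        trace.append(storage)
    return trace, shortages

# "what-if" scenario: demand slightly above inflow draws the stock down
trace, shortages = simulate_reservoir(
    years=10, inflow=5.0, demand=6.0, start_storage=8.0, capacity=20.0)
```

Running alternative parameter sets (wetter inflows, demand caps, larger capacity) through the same loop is the code analogue of the multiple "what-if" scenarios described above.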
