This article presents a new public domain tool for generalized Lagrangian particle tracking in rivers. The approach can be applied with a variety of two- and three-dimensional flow solvers. Particle advection by the flow is incorporated using flow fields from the chosen solver, assuming particles follow the Reynolds-averaged flow, although some other simple passive and active particle behaviors are also treated. Turbulence effects are treated using a random walk algorithm with spatial step lengths randomly chosen from Gaussian distributions characterized by the diffusivity from the flow solver. Our work extends this concept to a general framework that is solver and coordinate system independent to allow easy comparisons between differing flow treatments. To better treat problems where detailed information is required in specific regions, the approach includes novel cloning and colligation algorithms which enhance local resolution at modest computational expense. We also provide tools for computing local concentrations and total exposure over a user-specified time interval. Several examples of predictions are provided to illustrate applications of the technique, including examination of the role of curvature-driven secondary flows, storage in lateral separation eddies, treatment of larval drift, treatment of fuel spill dispersion, river-floodplain connections, and sedimentation in floodplain ponds by tie channel connections. We also demonstrate that the model can reproduce analytically derived concentration profiles for simple diffusivities. These examples show that the Lagrangian particle tracking approach and the extensions proposed here are broadly applicable and viable for treating difficult river problems with multiple temporal and spatial scales. The examples also illustrate the utility of the cloning/colligation extensions and show how these can reduce the computational effort on problems where high local resolution is required.
Enhancement of the tools and even broader applicability can be achieved through the inclusion of multiple particle populations and particle–particle interactions.
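The advection-plus-random-walk scheme described above can be sketched in a few lines. This is a minimal illustration, not the published tool: the velocity and diffusivity callables, the particle counts, and the uniform-flow test values are all hypothetical stand-ins for interpolation from a chosen flow solver.

```python
import numpy as np

# Minimal sketch (not the published tool): particles advect with a
# Reynolds-averaged velocity field and take Gaussian random-walk steps
# whose scale is set by the local diffusivity D, sigma = sqrt(2*D*dt)
# per coordinate.
rng = np.random.default_rng(0)

def step_particles(xy, velocity, diffusivity, dt):
    """Advance particle positions xy, shape (N, 2), by one time step."""
    adv = velocity(xy) * dt                      # deterministic advection
    sigma = np.sqrt(2.0 * diffusivity(xy) * dt)  # random-walk step scale
    return xy + adv + rng.normal(size=xy.shape) * sigma[:, None]

# Smoke test: uniform 1 m/s flow in x with constant D = 0.01 m^2/s.
uniform_u = lambda p: np.column_stack([np.ones(len(p)), np.zeros(len(p))])
const_D = lambda p: np.full(len(p), 0.01)

xy = np.zeros((1000, 2))
for _ in range(100):
    xy = step_particles(xy, uniform_u, const_D, dt=0.1)
# After t = 10 s the cloud centre drifts to x ~ 10 m and each coordinate
# acquires variance ~ 2*D*t = 0.2 m^2, matching the analytical solution.
```

The constant-diffusivity check mirrors the paper's comparison against analytically derived concentration profiles; with spatially varying D a drift-correction term is normally added to keep the walk well-mixed.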
The efficiency with which communities capture limiting resources may be related to the number of species or functional types competing therein, either because species use different resources (the complementarity effect) or because species-rich communities are more likely to include species with extreme functional traits (a positive selection effect). We conducted two manipulative studies to separate the effects of functional richness and functional identity on the feeding efficiency (i.e. filtration rate) of suspension-feeding invertebrates growing on vertical surfaces. In addition, one experiment tested whether the density of organisms influences the effect of functional diversity. Monocultures and complete mixtures of functional types were fed a solution of microalgae of different sizes (6–40 μm). Experiments conducted at two locations, Helgoland and Plymouth, showed that functional identity had far larger effects on filtration rate than richness. Mixtures did not outperform the average monoculture or the best-performing monoculture, and this pattern was independent of density. The high efficiency of one of the functional types in consuming most microalgae could have minimised resource complementarity. The loss or gain of particular species may therefore have a stronger impact on the functioning of epibenthic communities than richness per se.
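The complementarity/selection distinction invoked above is conventionally quantified with the Loreau–Hector additive partition of the net biodiversity effect. The sketch below uses invented yields, not data from the Helgoland or Plymouth experiments, purely to show how the two effects separate.

```python
import numpy as np

# Hedged sketch of the Loreau-Hector additive partition: the net biodiversity
# effect splits into a complementarity term (mean deviation in relative yield)
# and a selection term (covariance of that deviation with monoculture yield).
# All numbers below are illustrative, not experimental data.
def partition(mono, mix_yield, expected_ry):
    """mono: monoculture yields M_i; mix_yield: observed per-species yields
    in mixture; expected_ry: expected relative yields (e.g. sown proportions)."""
    M = np.asarray(mono, float)
    d_ry = np.asarray(mix_yield, float) / M - np.asarray(expected_ry, float)
    n = len(M)
    complementarity = n * d_ry.mean() * M.mean()
    selection = n * np.cov(d_ry, M, bias=True)[0, 1]
    return complementarity + selection, complementarity, selection

# Two hypothetical functional types, 50:50 mixture.
net, ce, se = partition([10.0, 6.0], [6.0, 2.4], [0.5, 0.5])
# Here the mixture overyields by 0.4 entirely through selection (the
# high-yielding type dominates), with zero complementarity.
```

A near-zero complementarity term with a dominant selection term is exactly the pattern the abstract describes: one highly efficient functional type consuming most microalgae.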
Reconnaissance sampling of surface and subsurface sediment to a maximum depth of 80 m below the sea floor shows that typical values of 0.03 p.p.m. and anomalies of 0.2–1.3 p.p.m. mercury have been present in northeastern Bering Sea since Early Pliocene time. Values are highest in modern beach (maximum 1.3 and mean 0.22 p.p.m. Hg) and nearshore subsurface gravels (maximum 0.6 and mean 0.06 p.p.m. Hg) along the highly mineralized Seward Peninsula and in clayey silt rich in organic matter (maximum 0.16 and mean 0.10 p.p.m. Hg) throughout the region. Although gold mining may be partly responsible for high mercury levels in the modern beach near Nome, Alaska (maximum 0.45 p.p.m.), equally high or greater concentrations of mercury occur in buried Pleistocene sediments immediately offshore (maximum 0.6 p.p.m.) and in modern unpolluted beach sediments at Bluff (maximum 1.3 p.p.m.); this suggests that the contamination effects of mining may be no greater than natural concentration processes in the Seward Peninsula region. The mercury content of offshore surface sediment, even adjacent to mercury-rich beaches, corresponds to that of unpolluted marine and fresh-water sediment elsewhere. The normal values that prevail offshore may be attributable to entrapment of mercury-bearing heavy minerals on beaches near sources and/or dilution effects of offshore sedimentation. 
The few minor anomalies offshore occur in glacial drift derived from mercury source regions of Chukotka (Siberia) and Seward Peninsula; Pleistocene shoreline processes have reworked the drift to concentrate the heavy metals. The distribution pattern of mercury indicates that particulate mercury-bearing minerals have not been widely dispersed from onland deposits in quantities sufficient to increase mercury levels above normal in offshore sediments of Bering Sea; however, it shows that natural sedimentary processes can concentrate this mercury in beaches of the coastal zone where there already is concern because of potential pollution from man's activities.
Temperature and salinity data from the vicinity of Bermuda reveal large vertical displacements of the isopycnals, of over 100 m, close to the island. A model based on the steady flow of an inviscid, stratified ocean past a circularly symmetric island on a rotating plane gives good qualitative agreement. The effects of island slope and nonlinearities are accounted for in a perturbation procedure. In an anomalous area over the left slope of the island (looking downstream), large steps were observed in the temperature and salinity profiles. The theoretical flow is shown to have a minimum Richardson number in this region. In a quasi-empirical manner it is possible to compute a Richardson number profile from the observed density data. This procedure gives values very close to those needed for instability to be possible, suggesting that instabilities promote mixing and the development of the observed layers.
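The quasi-empirical Richardson number computation described above can be sketched from discrete profiles. The density and velocity profiles below are illustrative stand-ins, not the Bermuda observations.

```python
import numpy as np

# Hedged sketch: a gradient Richardson number profile
#   Ri(z) = N^2 / (dU/dz)^2,  N^2 = -(g/rho0) * drho/dz
# from discrete density and velocity profiles. Profiles are invented.
g, rho0 = 9.81, 1025.0                  # gravity (m/s^2), reference density (kg/m^3)

z = np.linspace(0.0, -500.0, 101)       # depth coordinate, positive upward (m)
rho = rho0 - 0.002 * z                  # linearly stratified water column
u = 0.5 * np.exp(z / 50.0)              # current shear decaying with depth (m/s)

N2 = -(g / rho0) * np.gradient(rho, z)  # buoyancy frequency squared (s^-2)
shear2 = np.gradient(u, z) ** 2         # squared vertical shear (s^-2)
Ri = N2 / np.maximum(shear2, 1e-12)     # guard against vanishing shear

# Shear instability becomes possible where Ri falls below the canonical 1/4,
# here only in the strongly sheared near-surface layer.
unstable = Ri < 0.25
```

With these made-up profiles Ri dips below 1/4 near the surface and grows large at depth, the qualitative pattern that, per the abstract, coincides with the observed temperature and salinity steps.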
The present work deals with the assessment of earthquake-induced displacement of base-restrained retaining walls (RWs). A detailed and rigorous finite element (FE) investigation has been carried out following shaking table experiments on a scaled-down RW model. The FE simulations were performed by conducting several nonlinear time history analyses on a two-dimensional (2D) plane-strain FE model of a prototype RW. The hardening and softening of the backfill have been simulated by calibrating the Mohr-Coulomb material model against triaxial test results. The role of different backfills in the seismic performance of base-restrained RWs has also been investigated. It was observed that cohesionless backfill has only a slight influence on the earthquake-induced displacement of base-restrained RWs. Amplification of horizontal acceleration in the backfill was observed, with no direct correlation with the applied earthquake excitation. The understanding and findings from the shaking table experiments and FE simulations have been used to develop an analytical model for estimating the earthquake-induced displacement of base-restrained RWs. The validity of the proposed analytical model has also been examined against the shaking table experiment and FE simulation results.
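A common baseline for earthquake-induced permanent displacement estimates of this kind is a rigid-block Newmark-type integration. The sketch below is illustrative only; it is not the analytical model proposed in the paper, and the ground-motion record and yield acceleration are invented.

```python
import numpy as np

# Hedged Newmark-type sliding-block sketch (one-sided slip): relative sliding
# accumulates whenever ground acceleration exceeds the yield acceleration a_y,
# and stops when the block's relative velocity returns to zero.
def newmark_displacement(t, a_ground, a_y):
    dt = t[1] - t[0]
    v, d = 0.0, 0.0
    for a in a_ground:
        if v > 0.0 or a > a_y:      # block sliding, or slip initiating
            v = max(v + (a - a_y) * dt, 0.0)  # relative velocity (m/s)
            d += v * dt                        # accumulated displacement (m)
    return d

t = np.arange(0.0, 10.0, 0.005)
a_ground = 3.0 * np.sin(2 * np.pi * 1.0 * t)    # synthetic 1 Hz, 3 m/s^2 record
d = newmark_displacement(t, a_ground, a_y=1.5)  # permanent displacement (m)
```

If the yield acceleration exceeds the peak ground acceleration, no slip accumulates at all; the paper's analytical model and FE simulations refine this picture with backfill hardening/softening and wall flexibility.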
An integrated electromagnetic (EM) and seismic geophysical study was performed to evaluate non-invasive approaches to estimate depth to shallow groundwater (i.e., < 5 m) in arid environments with elevated soil salinity, where the installation of piezometers would be limited or prohibited. Both methods were tested in two study areas, one serving as a control site with relatively simple hydrogeology and the other serving as the experimental site with complex hydrogeology. The control site is located near the shore of Utah Lake (Palmyra, Utah, USA) where groundwater is shallow and unconfined in relatively homogeneous lacustrine sediments. The experimental site is in Carson Slough, Nevada, USA near the Ash Meadows National Wildlife Refuge in Amargosa Valley. Carson Slough is underlain by valley fill, with variable shallow depths to water beneath an ephemeral braided stream system. The geophysical methods used include frequency domain electromagnetic induction with multiple antenna–receiver spacings. High-resolution P-wave seismic profiles using a short (0.305 m) geophone spacing for common depth-point reflection stacking and first arrival modeling were also acquired. Both methods were deployed over several profiles where shallow piezometer control was present. EM results at both sites show that water surfaces correspond with a drop in conductivity. This is due to elevated concentrations of evaporative salts in the vadose zone immediately above the water table. EM and seismic profiles at the Palmyra site accurately detected the depth to groundwater in monitoring wells, as well as interpolated depths between them. This demonstrates that an integrated approach is ideal for relatively homogeneous aquifers. On the other hand, interpreting the EM and seismic profiles at Carson Slough was challenging due to the laterally and vertically variable soil types, segmented perched water surfaces, and strong salinity variations. 
The high-resolution images and models provided by the geophysical profiles confirm the simple soil and hydrological structure at the Palmyra site as well as the laterally complex structure at Carson Slough. The integrated approach worked well for determining depth to water in the geologically simple site, but was less effective in the geologically complex site where multiple water tables appear to be present.
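For the first-arrival modeling mentioned above, the simplest case is a two-layer refraction estimate of depth to a faster (saturated) layer from the crossover distance. The velocities and crossover distance below are hypothetical, not values from the Palmyra or Carson Slough surveys.

```python
import math

# Back-of-envelope sketch of two-layer refraction depth estimation:
# dry vadose zone (v1) over water-saturated sediment (v2). All numbers
# are invented for illustration.
def refractor_depth(x_cross, v1, v2):
    """Depth to a faster refractor from the crossover distance x_cross,
    where direct-wave and head-wave first arrivals coincide."""
    return 0.5 * x_cross * math.sqrt((v2 - v1) / (v2 + v1))

v1 = 400.0      # unsaturated soil P-wave velocity (m/s)
v2 = 1600.0     # water-saturated sediment P-wave velocity (m/s)
x_cross = 12.0  # crossover distance read from a travel-time plot (m)
z = refractor_depth(x_cross, v1, v2)  # ~4.6 m, within the "shallow (<5 m)" range
```

This flat-layer formula is only defensible at a hydrogeologically simple site like Palmyra; the laterally variable, multiply perched water surfaces at Carson Slough are precisely the conditions under which it breaks down.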
Optimization of large-scale injection-based remedial systems requires engineering to intentionally capitalize on the biological, chemical, and physical mechanisms that occur within and between the zones of reagent application. These types of systems can be called hybrid designs, as they employ multiple processes to achieve remediation endpoints (Figure 1), resulting in optimized system performance and a reduction in the overall life-cycle cost. While all remedial applications incorporate these mechanisms to some extent, the importance of each of these processes is magnified in large-scale applications. This column discusses the dominant mechanisms responsible for mass reduction within both source and distal plume footprints, with a focus on the application of "Hybridized Design" for enhanced reductive dechlorination (ERD) systems. Figure 1. Diagram showing the hybrid design approach, which encompasses physical (sorption, advection, diffusion), chemical (mass flux, abiotic degradation), and biological (metabolic and cometabolic degradation) processes.
The South America VLF Network (SAVNET) was installed in April 2009 and is composed of eight tracking receivers spread over South America, in Brazil, Peru and Argentina, and on the Antarctic Peninsula. SAVNET monitors the properties of subionospherically propagating waves that reveal changes in the electrical properties of the diurnal ionospheric D-region or nocturnal E-region. In this paper, we show the ability of SAVNET diagnostics to monitor solar activity on short timescales related to ionization by solar flares. The sensitivity of flare detection as a function of the solar activity level is discussed. On longer timescales related to the solar cycle, SAVNET is also able to provide information on the solar Lyman-α radiation. Finally, we show that the VLF technique is well suited to searching for seismic-electromagnetic effects and to providing a genuine diagnostic of high-energy astrophysical phenomena.
Statistical analysis of extreme values is applied to wind data from National Centers for Environmental Prediction and National Center for Atmospheric Research reanalysis grid points over the ocean region bounded at 23°S and between 40°W and 42°W, toward the southern and southeastern Brazilian coast. The period of analysis runs from 1975 to 2006. The generalized extreme value and generalized Pareto distributions are employed for annual and daily maxima, respectively. The Pareto-Poisson point process characterization is also used to analyze peaks over threshold. Return levels for 10, 25, 50, and 100 years are calculated at each grid point. However, most of the reanalysis data fall within 1-10-year return periods, suggesting that hazardous wind speeds with low probability (return periods of 50-100 years) have rarely been measured in this period. Wide confidence intervals on these levels show that there is not enough information to make predictions with any degree of certainty for return periods over 100 years. Low extremal index (θ) values are found for excess wind speeds over a high threshold, indicating the occurrence of consecutive high peaks. In order to obtain realistic uncertainty information concerning inferences associated with threshold excesses, a declustering method is performed, which separates the excesses into clusters, thereby rendering the extreme values more independent.
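The annual-maxima branch of such an analysis can be sketched with a standard library fit. The synthetic wind-speed maxima below are invented, not reanalysis data, and the distribution parameters are arbitrary.

```python
import numpy as np
from scipy.stats import genextreme

# Illustrative sketch: fit a generalized extreme value (GEV) distribution to
# synthetic annual maximum wind speeds and compute return levels. Note that
# scipy's shape parameter c has the opposite sign to the usual xi convention.
rng = np.random.default_rng(42)
annual_max = genextreme.rvs(c=-0.1, loc=20.0, scale=2.5, size=32,
                            random_state=rng)

c, loc, scale = genextreme.fit(annual_max)

# The T-year return level is the quantile exceeded with probability 1/T
# in any one year.
levels = {T: genextreme.ppf(1.0 - 1.0 / T, c, loc=loc, scale=scale)
          for T in (10, 25, 50, 100)}
```

As the abstract notes, with only ~30 annual maxima the confidence intervals on the 50- and 100-year levels are wide; the peaks-over-threshold route (generalized Pareto with declustering) uses the data more efficiently but requires the excesses to be approximately independent.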